hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
501fd043edbc2b3dcd427fe15de135c71d207725 | 2,105 | py | Python | ToroboTakahashi/.ipynb_checkpoints/torobo_forward-checkpoint.py | jacknlliu/lyapunovearner | a5152050abc665705999a5ccb955c3a76df4a92f | [
"MIT"
] | 2 | 2020-03-29T15:35:03.000Z | 2021-08-24T03:22:10.000Z | ToroboTakahashi/torobo_forward.py | jacknlliu/lyapunovearner | a5152050abc665705999a5ccb955c3a76df4a92f | [
"MIT"
] | null | null | null | ToroboTakahashi/torobo_forward.py | jacknlliu/lyapunovearner | a5152050abc665705999a5ccb955c3a76df4a92f | [
"MIT"
] | 1 | 2020-11-10T00:36:06.000Z | 2020-11-10T00:36:06.000Z | import numpy as np
def forward_kinematics(theta):
    """Forward kinematics for a 7-joint arm.

    theta: sequence of 7 joint angles in degrees.
    Returns the 4x4 homogeneous transform of the end effector in the
    base frame (translation in the same units as d3/d5/d7).
    """
    (c1, c2, c3, c4, c5, c6, c7) = np.cos(np.radians(theta))
    (s1, s2, s3, s4, s5, s6, s7) = np.sin(np.radians(theta))
    d3 = 310.0  # link offsets between joints (presumably millimetres)
    d5 = 310.0
    d7 = 160.0
T = np.array([[
((((-s1*s3 + c1*c2*c3)*c4 - s2*s4*c1)*c5 + (-s1*c3 - s3*c1*c2)*s5)*c6 - ((-s1*s3 + c1*c2*c3)*s4 + s2*c1*c4)*s6)*c7 + (-((-s1*s3 + c1*c2*c3)*c4 - s2*s4*c1)*s5 + (-s1*c3 - s3*c1*c2)*c5)*s7,
-((((-s1*s3 + c1*c2*c3)*c4 - s2*s4*c1)*c5 + (-s1*c3 - s3*c1*c2)*s5)*c6 - ((-s1*s3 + c1*c2*c3)*s4 + s2*c1*c4)*s6)*s7 + (-((-s1*s3 + c1*c2*c3)*c4 - s2*s4*c1)*s5 + (-s1*c3 - s3*c1*c2)*c5)*c7,
(((-s1*s3 + c1*c2*c3)*c4 - s2*s4*c1)*c5 + (-s1*c3 - s3*c1*c2)*s5)*s6 + ((-s1*s3 + c1*c2*c3)*s4 + s2*c1*c4)*c6,
d3*s2*c1 - d5*(-(-s1*s3 + c1*c2*c3)*s4 - s2*c1*c4) - d7*(-(((-s1*s3 + c1*c2*c3)*c4 - s2*s4*c1)*c5 + (-s1*c3 - s3*c1*c2)*s5)*s6 - ((-s1*s3 + c1*c2*c3)*s4 + s2*c1*c4)*c6)
], [
((((s1*c2*c3 + s3*c1)*c4 - s1*s2*s4)*c5 + (-s1*s3*c2 + c1*c3)*s5)*c6 - ((s1*c2*c3 + s3*c1)*s4 + s1*s2*c4)*s6)*c7 + (-((s1*c2*c3 + s3*c1)*c4 - s1*s2*s4)*s5 + (-s1*s3*c2 + c1*c3)*c5)*s7,
-((((s1*c2*c3 + s3*c1)*c4 - s1*s2*s4)*c5 + (-s1*s3*c2 + c1*c3)*s5)*c6 - ((s1*c2*c3 + s3*c1)*s4 + s1*s2*c4)*s6)*s7 + (-((s1*c2*c3 + s3*c1)*c4 - s1*s2*s4)*s5 + (-s1*s3*c2 + c1*c3)*c5)*c7,
(((s1*c2*c3 + s3*c1)*c4 - s1*s2*s4)*c5 + (-s1*s3*c2 + c1*c3)*s5)*s6 + ((s1*c2*c3 + s3*c1)*s4 + s1*s2*c4)*c6,
d3*s1*s2 - d5*(-(s1*c2*c3 + s3*c1)*s4 - s1*s2*c4) - d7*(-(((s1*c2*c3 + s3*c1)*c4 - s1*s2*s4)*c5 + (-s1*s3*c2 + c1*c3)*s5)*s6 - ((s1*c2*c3 + s3*c1)*s4 + s1*s2*c4)*c6)
], [
(((-s2*c3*c4 - s4*c2)*c5 + s2*s3*s5)*c6 - (-s2*s4*c3 + c2*c4)*s6)*c7 + (-(-s2*c3*c4 - s4*c2)*s5 + s2*s3*c5)*s7,
-(((-s2*c3*c4 - s4*c2)*c5 + s2*s3*s5)*c6 - (-s2*s4*c3 + c2*c4)*s6)*s7 + (-(-s2*c3*c4 - s4*c2)*s5 + s2*s3*c5)*c7,
((-s2*c3*c4 - s4*c2)*c5 + s2*s3*s5)*s6 + (-s2*s4*c3 + c2*c4)*c6,
d3*c2 - d5*(s2*s4*c3 - c2*c4) - d7*(-((-s2*c3*c4 - s4*c2)*c5 + s2*s3*s5)*s6 - (-s2*s4*c3 + c2*c4)*c6)
], [
0.0, 0.0, 0.0, 1.0
]])
return T
| 72.586207 | 196 | 0.460808 | 465 | 2,105 | 2.083871 | 0.075269 | 0.115583 | 0.105263 | 0.090815 | 0.825593 | 0.809082 | 0.800826 | 0.800826 | 0.800826 | 0.726522 | 0 | 0.267303 | 0.2038 | 2,105 | 28 | 197 | 75.178571 | 0.310859 | 0 | 0 | 0.115385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038462 | false | 0 | 0.038462 | 0 | 0.115385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
503351df968adcfbead0ae1145c73133dbf72ab1 | 13,533 | py | Python | tests/test_main.py | fossabot/easy_sast | 1aa7dbbf340e3340fa2f70ec5bafb798294bfa7a | [
"BSD-3-Clause"
] | null | null | null | tests/test_main.py | fossabot/easy_sast | 1aa7dbbf340e3340fa2f70ec5bafb798294bfa7a | [
"BSD-3-Clause"
] | null | null | null | tests/test_main.py | fossabot/easy_sast | 1aa7dbbf340e3340fa2f70ec5bafb798294bfa7a | [
"BSD-3-Clause"
] | 1 | 2021-01-20T20:59:52.000Z | 2021-01-20T20:59:52.000Z | #!/usr/bin/env python3
# pylint: disable=too-many-public-methods
"""
Unit tests for main.py
"""
# built-ins
import copy
import logging
from argparse import Namespace
from unittest.mock import patch
from unittest import TestCase
from typing import Union
# custom
from tests import constants as test_constants
import main
from veracode.api import ResultsAPI, UploadAPI
# Setup a logger
logging.getLogger()
FORMAT = "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
logging.basicConfig(level="DEBUG", format=FORMAT)
logging.raiseExceptions = True
LOG = logging.getLogger(__name__)
def return_unmodified_api_object(
*, api: Union[ResultsAPI, UploadAPI], config: dict
): # pylint: disable=unused-argument
"""
A helper test function to help when mocking functions such as apply_config
Returns the provided api object
"""
return api
class TestMain(TestCase):
"""
Test main.py
"""
@patch("main.get_config")
@patch("main.apply_config", side_effect=return_unmodified_api_object)
@patch("main.submit_artifacts")
@patch("main.check_compliance")
def test_veracode_happy_path(
self,
mock_check_compliance,
mock_submit_artifacts,
mock_apply_config,
mock_get_config,
):
"""
Test the main happy path with defaults
"""
# For the linter, this is unused
mock_apply_config.return_value = None
# Actual test
mock_check_compliance.return_value = True
mock_submit_artifacts.return_value = True
mock_get_config.return_value = test_constants.CLEAN_EFFECTIVE_CONFIG
with patch(
"argparse.ArgumentParser.parse_args",
return_value=Namespace(
app_id=test_constants.VALID_UPLOAD_API["app_id"],
build_dir=test_constants.VALID_UPLOAD_API["build_dir"],
build_id=test_constants.VALID_UPLOAD_API["build_id"],
disable_auto_scan=not test_constants.VALID_UPLOAD_API["auto_scan"],
disable_scan_nonfatal_modules=not test_constants.VALID_UPLOAD_API[
"scan_all_nonfatal_top_level_modules"
],
loglevel=logging.WARNING,
api_key_id=test_constants.VALID_UPLOAD_API["api_key_id"],
api_key_secret=test_constants.VALID_UPLOAD_API["api_key_secret"],
),
):
self.assertIsNone(main.main())
@patch("main.get_config")
@patch("main.apply_config", side_effect=return_unmodified_api_object)
@patch("main.submit_artifacts")
@patch("main.check_compliance")
def test_veracode_unknown_config_step(
self,
mock_check_compliance,
mock_submit_artifacts,
mock_apply_config,
mock_get_config,
):
"""
Test main with an unknown config step
"""
# For the linter, this is unused
mock_apply_config.return_value = None
# Actual test
mock_check_compliance.return_value = True
mock_submit_artifacts.return_value = True
config = copy.deepcopy(test_constants.CLEAN_EFFECTIVE_CONFIG)
config["workflow"] = ["unknown", "check_compliance", "unknown"]
mock_get_config.return_value = config
with patch(
"argparse.ArgumentParser.parse_args",
return_value=Namespace(
app_id=test_constants.VALID_UPLOAD_API["app_id"],
build_dir=test_constants.VALID_UPLOAD_API["build_dir"],
build_id=test_constants.VALID_UPLOAD_API["build_id"],
disable_auto_scan=not test_constants.VALID_UPLOAD_API["auto_scan"],
disable_scan_nonfatal_modules=not test_constants.VALID_UPLOAD_API[
"scan_all_nonfatal_top_level_modules"
],
loglevel=logging.WARNING,
api_key_id=test_constants.VALID_UPLOAD_API["api_key_id"],
api_key_secret=test_constants.VALID_UPLOAD_API["api_key_secret"],
),
):
self.assertIsNone(main.main())
@patch("main.get_config")
@patch("main.apply_config", side_effect=return_unmodified_api_object)
@patch("main.submit_artifacts")
@patch("main.check_compliance")
def test_veracode_failed_submit_artifacts(
self,
mock_check_compliance,
mock_submit_artifacts,
mock_apply_config,
mock_get_config,
):
"""
Test main when submit_artifacts fails
"""
# For the linter, this is unused
mock_apply_config.return_value = None
# Actual test
mock_check_compliance.return_value = True
mock_submit_artifacts.return_value = False
mock_get_config.return_value = test_constants.CLEAN_EFFECTIVE_CONFIG
with patch(
"argparse.ArgumentParser.parse_args",
return_value=Namespace(
app_id=test_constants.VALID_UPLOAD_API["app_id"],
build_dir=test_constants.VALID_UPLOAD_API["build_dir"],
build_id=test_constants.VALID_UPLOAD_API["build_id"],
disable_auto_scan=not test_constants.VALID_UPLOAD_API["auto_scan"],
disable_scan_nonfatal_modules=not test_constants.VALID_UPLOAD_API[
"scan_all_nonfatal_top_level_modules"
],
loglevel=logging.WARNING,
api_key_id=test_constants.VALID_UPLOAD_API["api_key_id"],
api_key_secret=test_constants.VALID_UPLOAD_API["api_key_secret"],
),
):
with self.assertRaises(SystemExit) as contextmanager:
main.main()
self.assertEqual(contextmanager.exception.code, 1)
# pylint: disable=too-many-arguments
@patch("main.get_config")
@patch("main.apply_config", side_effect=return_unmodified_api_object)
@patch("main.configure_environment")
@patch("main.submit_artifacts")
@patch("main.check_compliance")
def test_veracode_failed_check_compliance(
self,
mock_check_compliance,
mock_submit_artifacts,
mock_configure_environment,
mock_apply_config,
mock_get_config,
):
"""
Test main when check_compliance fails
"""
# For the linter, this is unused
mock_apply_config.return_value = None
# Actual test
mock_check_compliance.return_value = False
mock_submit_artifacts.return_value = True
mock_get_config.return_value = test_constants.CLEAN_EFFECTIVE_CONFIG
mock_configure_environment.return_value = True
with patch(
"argparse.ArgumentParser.parse_args",
return_value=Namespace(
app_id=test_constants.VALID_UPLOAD_API["app_id"],
build_dir=test_constants.VALID_UPLOAD_API["build_dir"],
build_id=test_constants.VALID_UPLOAD_API["build_id"],
disable_auto_scan=not test_constants.VALID_UPLOAD_API["auto_scan"],
disable_scan_nonfatal_modules=not test_constants.VALID_UPLOAD_API[
"scan_all_nonfatal_top_level_modules"
],
loglevel=logging.WARNING,
api_key_id=test_constants.VALID_UPLOAD_API["api_key_id"],
api_key_secret=test_constants.VALID_UPLOAD_API["api_key_secret"],
),
):
with self.assertRaises(SystemExit) as contextmanager:
main.main()
self.assertEqual(contextmanager.exception.code, 1)
@patch("main.get_config")
@patch("main.apply_config", side_effect=return_unmodified_api_object)
@patch("main.submit_artifacts")
@patch("main.check_compliance")
def test_veracode_ignore_unknown_api(
self,
mock_check_compliance,
mock_submit_artifacts,
mock_apply_config,
mock_get_config,
):
"""
Test the main happy path with defaults
"""
# For the linter, this is unused
mock_apply_config.return_value = None
# Actual test
mock_check_compliance.return_value = True
mock_submit_artifacts.return_value = True
config = copy.deepcopy(test_constants.CLEAN_EFFECTIVE_CONFIG)
config["apis"].update({"unknown_api": {"something": "here"}})
mock_get_config.return_value = config
with patch(
"argparse.ArgumentParser.parse_args",
return_value=Namespace(
app_id=test_constants.VALID_UPLOAD_API["app_id"],
build_dir=test_constants.VALID_UPLOAD_API["build_dir"],
build_id=test_constants.VALID_UPLOAD_API["build_id"],
disable_auto_scan=not test_constants.VALID_UPLOAD_API["auto_scan"],
disable_scan_nonfatal_modules=not test_constants.VALID_UPLOAD_API[
"scan_all_nonfatal_top_level_modules"
],
loglevel=logging.WARNING,
api_key_id=test_constants.VALID_UPLOAD_API["api_key_id"],
api_key_secret=test_constants.VALID_UPLOAD_API["api_key_secret"],
),
):
self.assertIsNone(main.main())
@patch("main.get_config")
@patch("main.apply_config", side_effect=return_unmodified_api_object)
def test_veracode_main_get_config_value_error(
self,
mock_apply_config,
mock_get_config,
):
"""
Test main.py when get_config returns a ValueError
"""
# For the linter, this is unused
mock_apply_config.return_value = None
# Actual test
mock_get_config.side_effect = ValueError
with self.assertRaises(SystemExit) as contextmanager:
main.main()
self.assertEqual(contextmanager.exception.code, 1)
@patch("main.get_config")
@patch("main.apply_config", side_effect=TypeError)
@patch("main.configure_environment")
@patch("main.submit_artifacts")
@patch("main.check_compliance")
def test_veracode_main_apply_config_type_error(
self,
mock_check_compliance,
mock_submit_artifacts,
mock_configure_environment,
mock_apply_config,
mock_get_config,
):
"""
Test main.py when apply_config returns a TypeError
"""
# For the linter, this is unused
mock_apply_config.return_value = None
# Actual test
mock_check_compliance.return_value = True
mock_submit_artifacts.return_value = True
mock_get_config.return_value = test_constants.CLEAN_EFFECTIVE_CONFIG
mock_configure_environment.return_value = True
with self.assertRaises(SystemExit) as contextmanager:
main.main()
self.assertEqual(contextmanager.exception.code, 1)
@patch("main.get_config")
@patch("main.apply_config", side_effect=return_unmodified_api_object)
@patch("main.submit_artifacts")
@patch("main.check_compliance")
def test_veracode_main_no_sandbox_name(
self,
mock_check_compliance,
mock_submit_artifacts,
mock_apply_config,
mock_get_config,
):
"""
Test main with an effective config lacking a sandbox_name
"""
# For the linter, this is unused
mock_apply_config.return_value = None
# Actual test
mock_check_compliance.return_value = True
mock_submit_artifacts.return_value = True
config = copy.deepcopy(test_constants.CLEAN_EFFECTIVE_CONFIG)
del config["apis"]["sandbox"]["sandbox_name"]
mock_get_config.return_value = config
with patch(
"argparse.ArgumentParser.parse_args",
return_value=Namespace(
app_id=test_constants.VALID_UPLOAD_API["app_id"],
build_dir=test_constants.VALID_UPLOAD_API["build_dir"],
build_id=test_constants.VALID_UPLOAD_API["build_id"],
disable_auto_scan=not test_constants.VALID_UPLOAD_API["auto_scan"],
disable_scan_nonfatal_modules=not test_constants.VALID_UPLOAD_API[
"scan_all_nonfatal_top_level_modules"
],
loglevel=logging.WARNING,
api_key_id=test_constants.VALID_UPLOAD_API["api_key_id"],
api_key_secret=test_constants.VALID_UPLOAD_API["api_key_secret"],
),
):
self.assertIsNone(main.main())
# Test for UnboundLocalError
mock_apply_config.side_effect = UnboundLocalError
with patch(
"argparse.ArgumentParser.parse_args",
return_value=Namespace(
app_id=test_constants.VALID_UPLOAD_API["app_id"],
build_dir=test_constants.VALID_UPLOAD_API["build_dir"],
build_id=test_constants.VALID_UPLOAD_API["build_id"],
disable_auto_scan=not test_constants.VALID_UPLOAD_API["auto_scan"],
disable_scan_nonfatal_modules=not test_constants.VALID_UPLOAD_API[
"scan_all_nonfatal_top_level_modules"
],
loglevel=logging.WARNING,
api_key_id=test_constants.VALID_UPLOAD_API["api_key_id"],
api_key_secret=test_constants.VALID_UPLOAD_API["api_key_secret"],
),
):
with self.assertRaises(SystemExit) as contextmanager:
main.main()
self.assertEqual(contextmanager.exception.code, 1)
| 37.383978 | 83 | 0.650706 | 1,543 | 13,533 | 5.296176 | 0.095917 | 0.090675 | 0.10793 | 0.143906 | 0.853647 | 0.853647 | 0.853647 | 0.853647 | 0.853647 | 0.852668 | 0 | 0.000605 | 0.267199 | 13,533 | 361 | 84 | 37.487535 | 0.823435 | 0.075815 | 0 | 0.874074 | 0 | 0 | 0.132403 | 0.067629 | 0 | 0 | 0 | 0 | 0.051852 | 1 | 0.033333 | false | 0 | 0.033333 | 0 | 0.074074 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
acd4acc0a953c737d663669ece8f6697f56a5d22 | 130 | py | Python | perceiver/model/core/__init__.py | Borda/perceiver-io | 812db5e2edf6a740c4a039cb6159cfac2918839c | [
"Apache-2.0"
] | 1 | 2022-01-05T09:56:59.000Z | 2022-01-05T09:56:59.000Z | perceiver/model/core/__init__.py | Borda/perceiver-io | 812db5e2edf6a740c4a039cb6159cfac2918839c | [
"Apache-2.0"
] | null | null | null | perceiver/model/core/__init__.py | Borda/perceiver-io | 812db5e2edf6a740c4a039cb6159cfac2918839c | [
"Apache-2.0"
] | null | null | null | from perceiver.model.core.config import *
from perceiver.model.core.lightning import *
from perceiver.model.core.modules import *
| 32.5 | 44 | 0.815385 | 18 | 130 | 5.888889 | 0.444444 | 0.367925 | 0.509434 | 0.622642 | 0.528302 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.092308 | 130 | 3 | 45 | 43.333333 | 0.898305 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
acdb47845d3115cb27b1ed097f3e4ae4d7ed12ee | 29 | py | Python | python_snakes/somme_x_y.py | ljonata/ai_snakes_official | 0d12600be48de4ed657d2491e70d49933d4a9069 | [
"Apache-2.0"
] | null | null | null | python_snakes/somme_x_y.py | ljonata/ai_snakes_official | 0d12600be48de4ed657d2491e70d49933d4a9069 | [
"Apache-2.0"
] | null | null | null | python_snakes/somme_x_y.py | ljonata/ai_snakes_official | 0d12600be48de4ed657d2491e70d49933d4a9069 | [
"Apache-2.0"
] | null | null | null | def somme(x,y):
return x+y | 14.5 | 15 | 0.62069 | 7 | 29 | 2.571429 | 0.714286 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.206897 | 29 | 2 | 16 | 14.5 | 0.782609 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
acec5e8c923c319bfbe139efdd7d9018aa88441e | 87 | py | Python | sb3_contrib/tdqn/__init__.py | zappavignandrea/stable-baselines3-contrib | aca9c2871289f8961ee349befa374983108fae20 | [
"MIT"
] | null | null | null | sb3_contrib/tdqn/__init__.py | zappavignandrea/stable-baselines3-contrib | aca9c2871289f8961ee349befa374983108fae20 | [
"MIT"
] | null | null | null | sb3_contrib/tdqn/__init__.py | zappavignandrea/stable-baselines3-contrib | aca9c2871289f8961ee349befa374983108fae20 | [
"MIT"
] | null | null | null | from sb3_contrib.tdqn.policies import MlpPolicy
from sb3_contrib.tdqn.tdqn import TDQN
| 29 | 47 | 0.862069 | 14 | 87 | 5.214286 | 0.5 | 0.191781 | 0.383562 | 0.493151 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025316 | 0.091954 | 87 | 2 | 48 | 43.5 | 0.898734 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
a84d0b298405cad7aaba62d419753fb5f9c4c4c4 | 227 | py | Python | libsousou/hashers/__init__.py | sousouindustries/python-libsousou | e6d66bd15a550b164a92cc1228dad56edc84464a | [
"Apache-2.0"
] | null | null | null | libsousou/hashers/__init__.py | sousouindustries/python-libsousou | e6d66bd15a550b164a92cc1228dad56edc84464a | [
"Apache-2.0"
] | null | null | null | libsousou/hashers/__init__.py | sousouindustries/python-libsousou | e6d66bd15a550b164a92cc1228dad56edc84464a | [
"Apache-2.0"
] | null | null | null | from libsousou.hashers.base import check_password
from libsousou.hashers.base import make_password
from libsousou.hashers.pbkdf2 import PBKDF2PasswordHasherSHA256
from libsousou.hashers.pbkdf2 import PBKDF2PasswordHasherSHA512
| 45.4 | 63 | 0.894273 | 26 | 227 | 7.730769 | 0.423077 | 0.258706 | 0.39801 | 0.238806 | 0.616915 | 0 | 0 | 0 | 0 | 0 | 0 | 0.047393 | 0.070485 | 227 | 4 | 64 | 56.75 | 0.905213 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 9 |
a84ed21d82c2dca1b56407d49018cafa5d652121 | 26,330 | py | Python | test/test_TraceSetOptimizer.py | shanefeng123/agilkia | 0ac4e9dd29f9ab0026037f71d7f28d017e54949b | [
"MIT"
] | null | null | null | test/test_TraceSetOptimizer.py | shanefeng123/agilkia | 0ac4e9dd29f9ab0026037f71d7f28d017e54949b | [
"MIT"
] | null | null | null | test/test_TraceSetOptimizer.py | shanefeng123/agilkia | 0ac4e9dd29f9ab0026037f71d7f28d017e54949b | [
"MIT"
] | null | null | null | import random
import unittest
from pathlib import Path
import numpy as np
import pytest
import agilkia
class TestObjectiveFunctions(unittest.TestCase):
event1 = agilkia.Event("Order", {"Name": "Mark"}, {"Status": 0})
event1b = agilkia.Event("Order", {"Name": "Mark"}, {"Status": 2})
event2 = agilkia.Event("Skip", {"Size": 3, "Name": "Sue"}, {"Status": 1, "Error": "Too big"})
trace1 = agilkia.Trace([event1, event1b], meta_data={"freq": 0.6})
trace2 = agilkia.Trace([event2, event1], meta_data={"freq": 0.5})
trace3 = agilkia.Trace([event2, event1b], meta_data={"freq": 0.7})
trace4 = agilkia.Trace([event1, event2], meta_data={"freq": 0.8})
trace_set = agilkia.TraceSet([trace1, trace2, trace3])
objective_function = agilkia.ObjectiveFunction()
frequency_function = agilkia.FrequencyCoverage()
def test_objective_function(self):
solution = np.array([1, 1, 0])
self.objective_function.set_data(self.trace_set, 2)
self.assertEqual(0, self.objective_function.evaluate(solution))
def test_objective_function2(self):
with pytest.raises(ValueError):
agilkia.ObjectiveFunction(weight=-0.5)
def test_objective_function3(self):
with pytest.raises(ValueError):
self.objective_function.set_data([self.trace1, self.trace2], 1)
def test_objective_function4(self):
with pytest.raises(ValueError):
self.objective_function.set_data(self.trace_set, 0)
def test_objective_function5(self):
with pytest.raises(ValueError):
self.objective_function.set_data(self.trace_set, 4)
def test_objective_function6(self):
with pytest.raises(ValueError):
self.objective_function.set_data(self.trace_set, 2.5)
def test_frequency_objective_function(self):
solution = np.array([1, 1, 0])
self.frequency_function.set_data(self.trace_set, 2)
self.assertEqual((0.6 + 0.5) / (0.6 + 0.5 + 0.7), self.frequency_function.evaluate(solution))
def test_frequency_objective_function2(self):
solution = np.array([1, 1, 1])
self.frequency_function.set_data(self.trace_set, 2)
self.assertEqual((np.sum(solution) - 2) * -1, self.frequency_function.evaluate(solution))
def test_frequency_objective_function3(self):
trace1 = agilkia.Trace([self.event1, self.event1b])
trace2 = agilkia.Trace([self.event2, self.event1])
trace3 = agilkia.Trace([self.event2, self.event1b])
trace_set = agilkia.TraceSet([trace1, trace2, trace3])
with pytest.raises(ValueError):
self.frequency_function.set_data(trace_set, 2)
def test_action_status_objective_function(self):
solution = np.array([1, 0, 0])
objective_function = agilkia.EventCoverage(event_to_str=lambda ev: ev.action + "_" + str(ev.status))
objective_function.set_data(self.trace_set, 1)
self.assertEqual(len({"Order_0", "Order_2"}) / len({"Order_0", "Order_2", "Skip_1"}),
objective_function.evaluate(solution))
def test_action_status_objective_function2(self):
solution = np.array([1, 1, 1])
trace_set = agilkia.TraceSet([self.trace1, self.trace2, self.trace3])
objective_function = agilkia.EventCoverage(event_to_str=lambda ev: ev.action + "_" + str(ev.status))
objective_function.set_data(trace_set, 2)
self.assertEqual((np.sum(solution) - 2) * -1, objective_function.evaluate(solution))
def test_action_objective_function(self):
solution = np.array([1, 0, 0])
objective_function = agilkia.EventCoverage(event_to_str=lambda ev: ev.action)
objective_function.set_data(self.trace_set, 1)
self.assertEqual(len({"Order", "Order"}) / len({"Order", "Order", "Skip"}),
objective_function.evaluate(solution))
def test_action_objective_function2(self):
solution = np.array([1, 1, 1])
objective_function = agilkia.EventCoverage(event_to_str=lambda ev: ev.action)
objective_function.set_data(self.trace_set, 2)
self.assertEqual((np.sum(solution) - 2) * -1, objective_function.evaluate(solution))
def test_action_objective_function3(self):
solution = np.array([1, 1, 1])
objective_function = agilkia.EventCoverage()
objective_function.set_data(self.trace_set, 2)
self.assertEqual((np.sum(solution) - 2) * -1, objective_function.evaluate(solution))
def test_status_objective_function(self):
solution = np.array([1, 0, 0])
objective_function = agilkia.EventCoverage(event_to_str=lambda ev: str(ev.status))
objective_function.set_data(self.trace_set, 1)
self.assertEqual(len({"0", "2"}) / len({"0", "2", "1"}), objective_function.evaluate(solution))
def test_status_objective_function2(self):
solution = np.array([1, 1, 1])
objective_function = agilkia.EventCoverage(event_to_str=lambda ev: str(ev.status))
objective_function.set_data(self.trace_set, 2)
self.assertEqual((np.sum(solution) - 2) * -1, objective_function.evaluate(solution))
def test_action_pair_objective_function(self):
solution = np.array([1, 1, 0, 0])
trace_set = agilkia.TraceSet([self.trace1, self.trace2, self.trace3, self.trace4])
objective_function = agilkia.EventPairCoverage(event_to_str=lambda ev: ev.action)
objective_function.set_data(trace_set, 2)
self.assertEqual(
len({"Order_Order", "Skip_Order"}) / len({"Order_Order", "Skip_Order", "Skip_Order", "Order_Skip"}),
objective_function.evaluate(solution))
def test_action_pair_objective_function2(self):
solution = np.array([1, 1, 0, 0])
trace_set = agilkia.TraceSet([self.trace1, self.trace2, self.trace3, self.trace4])
objective_function = agilkia.EventPairCoverage()
objective_function.set_data(trace_set, 2)
self.assertEqual(
len({"Order_Order", "Skip_Order"}) / len({"Order_Order", "Skip_Order", "Skip_Order", "Order_Skip"}),
objective_function.evaluate(solution))
def test_action_pair_objective_function3(self):
solution = np.array([1, 1, 1])
objective_function = agilkia.EventPairCoverage(event_to_str=lambda ev: ev.action)
objective_function.set_data(self.trace_set, 2)
self.assertEqual((np.sum(solution) - 2) * -1, objective_function.evaluate(solution))
def test_status_pair_objective_function(self):
solution = np.array([1, 1, 0, 0])
trace_set = agilkia.TraceSet([self.trace1, self.trace2, self.trace3, self.trace4])
objective_function = agilkia.EventPairCoverage(event_to_str=lambda ev: str(ev.status))
objective_function.set_data(trace_set, 2)
self.assertEqual(len({"0_2", "1_0"}) / len({"0_2", "1_0", "1_2", "0_1"}), objective_function.evaluate(solution))
def test_status_pair_objective_function2(self):
solution = np.array([1, 1, 1])
objective_function = agilkia.EventPairCoverage(event_to_str=lambda ev: str(ev.status))
objective_function.set_data(self.trace_set, 2)
self.assertEqual((np.sum(solution) - 2) * -1, objective_function.evaluate(solution))
def test_action_status_pair_objective_function(self):
solution = np.array([1, 1, 0, 0])
trace_set = agilkia.TraceSet([self.trace1, self.trace2, self.trace3, self.trace4])
objective_function = agilkia.EventPairCoverage(event_to_str=lambda ev: ev.action + "_" + str(ev.status))
objective_function.set_data(trace_set, 2)
self.assertEqual(len({"Order_0_Order_2", "Skip_1_Order_0"}) / len(
{"Order_0_Order_2", "Skip_1_Order_0", "Skip_1_Order_2", "Order_0_Skip_1"}),
objective_function.evaluate(solution))
def test_action_status_pair_objective_function2(self):
solution = np.array([1, 1, 1])
objective_function = agilkia.EventPairCoverage(event_to_str=lambda ev: ev.action + "_" + str(ev.status))
objective_function.set_data(self.trace_set, 2)
self.assertEqual((np.sum(solution) - 2) * -1, objective_function.evaluate(solution))
def set_seed():
random.seed(3)
np.random.seed(3)
class TestTraceSetOptimizer(unittest.TestCase):
event1 = agilkia.Event("Order", {"Name": "Mark"}, {"Status": 0})
event1b = agilkia.Event("Order", {"Name": "Mark"}, {"Status": 2})
event2 = agilkia.Event("Skip", {"Size": 3, "Name": "Sue"}, {"Status": 1, "Error": "Too big"})
trace1 = agilkia.Trace([event1, event1b], meta_data={"freq": 0.6})
trace2 = agilkia.Trace([event2, event1], meta_data={"freq": 0.5})
trace3 = agilkia.Trace([event2, event1b], meta_data={"freq": 0.7})
trace4 = agilkia.Trace([event1, event2], meta_data={"freq": 0.8})
trace_set = agilkia.TraceSet([trace1, trace2, trace3, trace4])
def test_trace_set_optimizer(self):
objective_function = agilkia.FrequencyCoverage()
agilkia.TraceSetOptimizer(objective_function)
def test_trace_set_optimizer2(self):
with pytest.raises(ValueError):
agilkia.TraceSetOptimizer(lambda ev: ev.action)
def test_trace_set_optimizer3(self):
with pytest.raises(ValueError):
objective_function = agilkia.FrequencyCoverage()
traceset_optimizer = agilkia.TraceSetOptimizer(objective_function)
traceset_optimizer.set_data([self.trace1, self.trace2], 1)
def test_trace_set_optimizer4(self):
with pytest.raises(ValueError):
objective_function = agilkia.FrequencyCoverage()
traceset_optimizer = agilkia.TraceSetOptimizer(objective_function)
traceset_optimizer.set_data(self.trace_set, 0)
def test_trace_set_optimizer5(self):
with pytest.raises(ValueError):
objective_function = agilkia.FrequencyCoverage()
traceset_optimizer = agilkia.TraceSetOptimizer(objective_function)
traceset_optimizer.set_data(self.trace_set, 5)
def test_trace_set_optimizer6(self):
with pytest.raises(ValueError):
objective_function = agilkia.FrequencyCoverage()
traceset_optimizer = agilkia.TraceSetOptimizer(objective_function)
traceset_optimizer.set_data(self.trace_set, 2.5)
def test_greedy(self):
objective_functions = [agilkia.FrequencyCoverage(), agilkia.EventCoverage(event_to_str=lambda ev: ev.action)]
greedy_optimizer = agilkia.GreedyOptimizer(objective_functions)
greedy_optimizer.set_data(self.trace_set, 1)
selected_traces, best_objective_value = greedy_optimizer.optimize()
self.assertEqual([self.trace4], selected_traces.traces)
self.assertEqual(((3 / 3) * 0.5 + (0.8 / 2.6) * 0.5), best_objective_value)
def test_greedy2(self):
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventCoverage(event_to_str=lambda ev: ev.action + "_" + str(ev.status))]
greedy_optimizer = agilkia.GreedyOptimizer(objective_functions)
greedy_optimizer.set_data(self.trace_set, 1)
selected_traces, best_objective_value = greedy_optimizer.optimize()
self.assertEqual([self.trace4], selected_traces.traces)
self.assertEqual(((2 / 3) * 0.5 + (0.8 / 2.6) * 0.5), best_objective_value)
def test_greedy3(self):
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventCoverage(event_to_str=lambda ev: str(ev.status))]
greedy_optimizer = agilkia.GreedyOptimizer(objective_functions)
greedy_optimizer.set_data(self.trace_set, 1)
selected_traces, best_objective_value = greedy_optimizer.optimize()
self.assertEqual([self.trace4], selected_traces.traces)
self.assertEqual(((2 / 3) * 0.5 + (0.8 / 2.6) * 0.5), best_objective_value)
def test_greedy4(self):
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventPairCoverage(event_to_str=lambda ev: ev.action)]
greedy_optimizer = agilkia.GreedyOptimizer(objective_functions)
greedy_optimizer.set_data(self.trace_set, 2)
selected_traces, best_objective_value = greedy_optimizer.optimize()
self.assertEqual({self.trace4, self.trace3}, set(selected_traces.traces))
self.assertEqual(((2 / 3) * 0.5 + (1.5 / 2.6) * 0.5), best_objective_value)
def test_greedy5(self):
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventPairCoverage(event_to_str=lambda ev: ev.action + "_" + str(ev.status))]
greedy_optimizer = agilkia.GreedyOptimizer(objective_functions)
greedy_optimizer.set_data(self.trace_set, 2)
selected_traces, best_objective_value = greedy_optimizer.optimize()
self.assertEqual({self.trace4, self.trace3}, set(selected_traces.traces))
self.assertEqual(((2 / 4) * 0.5 + (1.5 / 2.6) * 0.5), best_objective_value)
def test_greedy6(self):
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventPairCoverage(event_to_str=lambda ev: str(ev.status))]
greedy_optimizer = agilkia.GreedyOptimizer(objective_functions)
greedy_optimizer.set_data(self.trace_set, 2)
selected_traces, best_objective_value = greedy_optimizer.optimize()
self.assertEqual({self.trace4, self.trace3}, set(selected_traces.traces))
self.assertEqual(((2 / 4) * 0.5 + (1.5 / 2.6) * 0.5), best_objective_value)
def test_pso(self):
objective_functions = [agilkia.FrequencyCoverage(), agilkia.EventCoverage(event_to_str=lambda ev: ev.action)]
pso_optimizer = agilkia.ParticleSwarmOptimizer(objective_functions)
pso_optimizer.set_data(self.trace_set, select=1)
selected_traces, best_objective_value = pso_optimizer.optimize()
self.assertEqual([self.trace4], selected_traces.traces)
self.assertEqual(((3 / 3) * 0.5 + (0.8 / 2.6) * 0.5), best_objective_value)
def test_pso2(self):
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventCoverage(event_to_str=lambda ev: ev.action + "_" + str(ev.status))]
pso_optimizer = agilkia.ParticleSwarmOptimizer(objective_functions)
pso_optimizer.set_data(self.trace_set, select=1)
selected_traces, best_objective_value = pso_optimizer.optimize()
self.assertEqual([self.trace4], selected_traces.traces)
self.assertEqual(((2 / 3) * 0.5 + (0.8 / 2.6) * 0.5), best_objective_value)
def test_pso3(self):
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventCoverage(event_to_str=lambda ev: str(ev.status))]
pso_optimizer = agilkia.ParticleSwarmOptimizer(objective_functions)
pso_optimizer.set_data(self.trace_set, select=1)
selected_traces, best_objective_value = pso_optimizer.optimize()
self.assertEqual([self.trace4], selected_traces.traces)
self.assertEqual(((2 / 3) * 0.5 + (0.8 / 2.6) * 0.5), best_objective_value)
def test_pso4(self):
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventPairCoverage(event_to_str=lambda ev: ev.action)]
pso_optimizer = agilkia.ParticleSwarmOptimizer(objective_functions)
pso_optimizer.set_data(self.trace_set, select=2)
selected_traces, best_objective_value = pso_optimizer.optimize()
self.assertEqual({self.trace4, self.trace3}, set(selected_traces.traces))
self.assertEqual(((2 / 3) * 0.5 + (1.5 / 2.6) * 0.5), best_objective_value)
def test_pso5(self):
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventPairCoverage(event_to_str=lambda ev: ev.action + "_" + str(ev.status))]
pso_optimizer = agilkia.ParticleSwarmOptimizer(objective_functions)
pso_optimizer.set_data(self.trace_set, select=2)
selected_traces, best_objective_value = pso_optimizer.optimize()
self.assertEqual({self.trace4, self.trace3}, set(selected_traces.traces))
self.assertEqual(((2 / 4) * 0.5 + (1.5 / 2.6) * 0.5), best_objective_value)
def test_pso6(self):
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventPairCoverage(event_to_str=lambda ev: str(ev.status))]
pso_optimizer = agilkia.ParticleSwarmOptimizer(objective_functions)
pso_optimizer.set_data(self.trace_set, select=2)
selected_traces, best_objective_value = pso_optimizer.optimize()
self.assertEqual({self.trace4, self.trace3}, set(selected_traces.traces))
self.assertEqual(((2 / 4) * 0.5 + (1.5 / 2.6) * 0.5), best_objective_value)
def test_pso7(self):
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventPairCoverage(event_to_str=lambda ev: str(ev.status))]
with pytest.raises(ValueError):
agilkia.ParticleSwarmOptimizer(objective_functions, num_of_particles=0)
def test_pso8(self):
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventPairCoverage(event_to_str=lambda ev: str(ev.status))]
with pytest.raises(ValueError):
agilkia.ParticleSwarmOptimizer(objective_functions, num_of_iterations=0)
def test_pso9(self):
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventPairCoverage(event_to_str=lambda ev: str(ev.status))]
with pytest.raises(ValueError):
agilkia.ParticleSwarmOptimizer(objective_functions, c1=0)
def test_pso10(self):
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventPairCoverage(event_to_str=lambda ev: str(ev.status))]
with pytest.raises(ValueError):
agilkia.ParticleSwarmOptimizer(objective_functions, c2=0)
def test_ga(self):
set_seed()
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventPairCoverage(event_to_str=lambda ev: ev.action)]
ga_optimizer = agilkia.GeneticOptimizer(objective_functions)
ga_optimizer.set_data(self.trace_set, select=2)
selected_traces, best_objective_value = ga_optimizer.optimize()
self.assertEqual({self.trace4, self.trace3}, set(selected_traces.traces))
self.assertEqual(((2 / 3) * 0.5 + (1.5 / 2.6) * 0.5), best_objective_value)
def test_ga2(self):
set_seed()
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventCoverage(event_to_str=lambda ev: ev.action + "_" + str(ev.status))]
ga_optimizer = agilkia.GeneticOptimizer(objective_functions)
ga_optimizer.set_data(self.trace_set, select=1)
selected_traces, best_objective_value = ga_optimizer.optimize()
self.assertEqual([self.trace4], selected_traces.traces)
self.assertEqual(((2 / 3) * 0.5 + (0.8 / 2.6) * 0.5), best_objective_value)
def test_ga3(self):
set_seed()
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventCoverage(event_to_str=lambda ev: str(ev.status))]
ga_optimizer = agilkia.GeneticOptimizer(objective_functions)
ga_optimizer.set_data(self.trace_set, select=1)
selected_traces, best_objective_value = ga_optimizer.optimize()
self.assertEqual([self.trace4], selected_traces.traces)
self.assertEqual(((2 / 3) * 0.5 + (0.8 / 2.6) * 0.5), best_objective_value)
def test_ga4(self):
set_seed()
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventPairCoverage(event_to_str=lambda ev: ev.action)]
ga_optimizer = agilkia.GeneticOptimizer(objective_functions)
ga_optimizer.set_data(self.trace_set, select=2)
selected_traces, best_objective_value = ga_optimizer.optimize()
self.assertEqual({self.trace4, self.trace3}, set(selected_traces.traces))
self.assertEqual(((2 / 3) * 0.5 + (1.5 / 2.6) * 0.5), best_objective_value)
def test_ga5(self):
set_seed()
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventPairCoverage(event_to_str=lambda ev: ev.action + "_" + str(ev.status))]
ga_optimizer = agilkia.GeneticOptimizer(objective_functions)
ga_optimizer.set_data(self.trace_set, select=2)
selected_traces, best_objective_value = ga_optimizer.optimize()
self.assertEqual({self.trace4, self.trace3}, set(selected_traces.traces))
self.assertEqual(((2 / 4) * 0.5 + (1.5 / 2.6) * 0.5), best_objective_value)
def test_ga6(self):
set_seed()
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventPairCoverage(event_to_str=lambda ev: str(ev.status))]
ga_optimizer = agilkia.GeneticOptimizer(objective_functions)
ga_optimizer.set_data(self.trace_set, select=2)
selected_traces, best_objective_value = ga_optimizer.optimize()
self.assertEqual({self.trace4, self.trace3}, set(selected_traces.traces))
self.assertEqual(((2 / 4) * 0.5 + (1.5 / 2.6) * 0.5), best_objective_value)
def test_ga7(self):
set_seed()
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventCoverage(event_to_str=lambda ev: str(ev.status))]
ga_optimizer = agilkia.GeneticOptimizer(objective_functions, crossover="single")
ga_optimizer.set_data(self.trace_set, select=1)
selected_traces, best_objective_value = ga_optimizer.optimize()
self.assertEqual([self.trace4], selected_traces.traces)
self.assertEqual(((2 / 3) * 0.5 + (0.8 / 2.6) * 0.5), best_objective_value)
def test_ga8(self):
set_seed()
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventCoverage(event_to_str=lambda ev: str(ev.status))]
with pytest.raises(ValueError):
agilkia.GeneticOptimizer(objective_functions, crossover="triple")
def test_ga9(self):
set_seed()
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventCoverage(event_to_str=lambda ev: str(ev.status))]
with pytest.raises(ValueError):
agilkia.GeneticOptimizer(objective_functions, num_of_iterations=0)
def test_ga10(self):
set_seed()
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventCoverage(event_to_str=lambda ev: str(ev.status))]
with pytest.raises(ValueError):
agilkia.GeneticOptimizer(objective_functions, num_of_chromosomes=0)
def test_ga11(self):
set_seed()
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventCoverage(event_to_str=lambda ev: str(ev.status))]
with pytest.raises(ValueError):
agilkia.GeneticOptimizer(objective_functions, prob_cross=-0.1)
def test_ga12(self):
set_seed()
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventCoverage(event_to_str=lambda ev: str(ev.status))]
with pytest.raises(ValueError):
agilkia.GeneticOptimizer(objective_functions, prob_mutate=-0.1)
def test_ga13(self):
set_seed()
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventCoverage(event_to_str=lambda ev: str(ev.status))]
with pytest.raises(ValueError):
agilkia.GeneticOptimizer(objective_functions, prob_cross=2)
def test_ga14(self):
set_seed()
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventCoverage(event_to_str=lambda ev: str(ev.status))]
with pytest.raises(ValueError):
agilkia.GeneticOptimizer(objective_functions, prob_cross=2.5)
def test_ga15(self):
set_seed()
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventCoverage(event_to_str=lambda ev: str(ev.status))]
with pytest.raises(ValueError):
agilkia.GeneticOptimizer(objective_functions, prob_mutate=2)
def test_ga16(self):
set_seed()
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventCoverage(event_to_str=lambda ev: str(ev.status))]
with pytest.raises(ValueError):
agilkia.GeneticOptimizer(objective_functions, prob_mutate=2.5)
class TestScanner(unittest.TestCase):
trace_set = agilkia.TraceSet.load_from_json(Path("./fixtures/scanner.json"))
objective_functions = [agilkia.FrequencyCoverage(),
agilkia.EventCoverage(event_to_str=lambda ev: ev.action + "_" + str(ev.status))]
greedy_optimizer = agilkia.GreedyOptimizer(objective_functions)
pso_optimizer = agilkia.ParticleSwarmOptimizer(objective_functions)
ga_optimizer = agilkia.GeneticOptimizer(objective_functions)
brute_force_results = [0.363, 0.455, 0.494, 0.532, 0.569, 0.606, 0.642, 0.679, 0.714]
def test_greedy(self):
for i in range(2, 11):
self.greedy_optimizer.set_data(self.trace_set, select=i)
selected_traces, best_objective_value = self.greedy_optimizer.optimize()
assert best_objective_value == pytest.approx(self.brute_force_results[i - 2], 0.01)
def test_pso(self):
for i in range(2, 11):
self.pso_optimizer.set_data(self.trace_set, select=i)
selected_traces, best_objective_value = self.pso_optimizer.optimize()
assert best_objective_value == pytest.approx(self.brute_force_results[i - 2], 0.01)
def test_ga(self):
for i in range(2, 11):
set_seed()
self.ga_optimizer.set_data(self.trace_set, select=i)
selected_traces, best_objective_value = self.ga_optimizer.optimize()
assert best_objective_value == pytest.approx(self.brute_force_results[i - 2], 0.01)
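For intuition, the greedy selection these tests exercise can be sketched as follows (a deliberately simplified, hypothetical version — not agilkia's actual `GreedyOptimizer` implementation):

```python
def greedy_select(traces, k, objective):
    # Repeatedly add the trace that most improves the objective, k times.
    selected = []
    remaining = list(traces)
    for _ in range(k):
        best = max(remaining, key=lambda t: objective(selected + [t]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy objective: number of distinct actions covered by the selection.
traces = [["Order", "Order"], ["Skip", "Order"], ["Order", "Skip"]]
cover = lambda sel: len({action for trace in sel for action in trace})
print(greedy_select(traces, 1, cover))  # -> [['Skip', 'Order']]
```

Greedy is fast but only approximates the brute-force optimum, which is why the assertions above compare against precomputed brute-force results with `pytest.approx`.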
#!/usr/bin/env python
# -*- coding:utf-8-*-
# author: Liukewei time:2020/1/21 QQ:422209303 e-mail:Liukeweiaway@hotmail.com
# ----------------------------------------------------------------------------
import predictType
def cuttingText(seq, num):
seq_split = []  # empty list
while seq != '':
seq_split.append(seq[0:num])
seq = seq[num:]
return seq_split
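`cuttingText` splits a sequence string into fixed-width chunks for 60-column FASTA-style display. A standalone copy demonstrating its behavior:

```python
def cutting_text(seq, num):
    # Standalone equivalent of cuttingText above.
    chunks = []
    while seq != '':
        chunks.append(seq[:num])   # take the next `num` characters
        seq = seq[num:]            # drop them from the front
    return chunks

print(cutting_text("AUGGCUAAC", 4))  # -> ['AUGG', 'CUAA', 'C']
print(cutting_text("", 4))           # -> []
```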
def readFile(fileName):
with open(fileName, 'r') as f:
file_data = f.readlines()
return file_data[0], file_data[1]
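The `preprocesser` function below repeats one pattern for every modification type: build a dash mask the length of the sequence, mark each predicted site (1-based positions) with its base, then wrap the marked bases in a colored `<font>` tag. A standalone sketch of that pattern (`highlight` is a hypothetical helper for illustration, not part of this module):

```python
def highlight(sequence, positions, base, color):
    mask = ['-'] * len(sequence)   # one dash per residue
    for pos in positions:
        mask[pos - 1] = base       # positions from predictType are 1-based
    # Wrap every marked base in a colored <font> tag for the HTML report.
    return ''.join(mask).replace(base, '<font color=%s>%s</font>' % (color, base))

print(highlight("AUGGA", [1, 5], 'A', '#1E90FF'))
# -> <font color=#1E90FF>A</font>---<font color=#1E90FF>A</font>
```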
def preprocesser(inputFile, outputFile, species, modification, threshold):
Fasta_name, sequence = readFile(inputFile)
sequence = sequence.upper()
htmlf = open('iRMRModel.html', 'r', encoding="utf-8")
outputFileR = 'results/' + outputFile
outputFileP = 'results/' + 'Possibility_' + outputFile
result_real = open(outputFileR, 'w', encoding="utf-8")
result_poosi = open(outputFileP, 'w', encoding="utf-8")
if species == 'Human':
if modification == 'm1A':
hg_line_M1A, hg_line_M1A_pro, hg_M1A_list = predictType.human_XG_iRNA_M1A(sequence, species, modification,
threshold)
seq_split = cuttingText(sequence, 60)
M1A_seq_ = ['-'] * len(sequence)
for i in hg_M1A_list:
M1A_seq_[i - 1] = 'A' # <font color=#1E90FF>A</font>
M1A_seq__split = cuttingText(''.join(M1A_seq_), 60)
for line in htmlf.readlines():
if '<li>>' not in line:
result_real.write(line.strip())
result_real.write('\n')
elif '<li>>N1 :GUGAUAUAACUCAGUGGCAGA</li>' in line:
result_real.write(Fasta_name)
result_real.write('<br>')
for i in range(len(seq_split)):
M1A_seq = M1A_seq__split[i].replace('A', '<font color=#1E90FF>A</font>')
result_real.write('    ')
result_real.write(''.join(seq_split[i]))
result_real.write('    ')
if i < len(seq_split) - 1:
result_real.write(str((i + 1) * 60))
result_real.write('<br>')
result_real.write('m1A:')
result_real.write(''.join(M1A_seq))
result_real.write('<br>')
result_poosi.write('m1A:')
result_poosi.write('<br>')
result_poosi.write(hg_line_M1A_pro)
elif modification == 'm6A':
hg_line_M6A, hg_line_M6A_pro, hg_M6A_list = predictType.human_XG_iRNA_M6A(sequence, species, modification,
threshold)
seq_split = cuttingText(sequence, 60)
M6A_seq_ = ['-'] * len(sequence)
for i in hg_M6A_list:
M6A_seq_[i - 1] = 'A' # <font color=#00FF00>A</font>
M6A_seq__split = cuttingText(''.join(M6A_seq_), 60)
for line in htmlf.readlines():
if '<li>>' not in line:
result_real.write(line.strip())
result_real.write('\n')
elif '<li>>N1 :GUGAUAUAACUCAGUGGCAGA</li>' in line:
result_real.write(Fasta_name)
result_real.write('<br>')
for i in range(len(seq_split)):
M6A_seq = M6A_seq__split[i].replace('A', '<font color=#00FF00>A</font>')
result_real.write('    ')
result_real.write(''.join(seq_split[i]))
result_real.write('    ')
if i < len(seq_split) - 1:
result_real.write(str((i + 1) * 60))
result_real.write('<br>')
result_real.write('m6A:')
result_real.write(''.join(M6A_seq))
result_real.write('<br>')
result_poosi.write('m6A:')
result_poosi.write('<br>')
result_poosi.write(hg_line_M6A_pro)
elif modification == 'm5C':
hg_line_M5C, hg_line_M5C_pro, hg_M5C_list = predictType.human_XG_iRNA_M5C(sequence, species, modification,
threshold)
seq_split = cuttingText(sequence, 60)
M5C_seq_ = ['-'] * len(sequence)
for i in hg_M5C_list:
M5C_seq_[i - 1] = 'C' # <font color=#FF7F50>C</font>
M5C_seq__split = cuttingText(''.join(M5C_seq_), 60)
for line in htmlf.readlines():
if '<li>>' not in line:
result_real.write(line.strip())
result_real.write('\n')
elif '<li>>N1 :GUGAUAUAACUCAGUGGCAGA</li>' in line:
result_real.write(Fasta_name)
result_real.write('<br>')
for i in range(len(seq_split)):
M5C_seq = M5C_seq__split[i].replace('C', '<font color=#FF7F50>C</font>')
result_real.write('    ')
result_real.write(''.join(seq_split[i]))
result_real.write('    ')
if i < len(seq_split) - 1:
result_real.write(str((i + 1) * 60))
result_real.write('<br>')
result_real.write('m5C:')
result_real.write(''.join(M5C_seq))
result_real.write('<br>')
result_poosi.write('m5C:')
result_poosi.write('<br>')
result_poosi.write(hg_line_M5C_pro)
elif modification == 'pseudouridine':
hg_line_pse, hg_line_pse_pro, hg_pse_list = predictType.human_XG_iRNA_pse(sequence, species, modification,
threshold)
seq_split = cuttingText(sequence, 60)
pse_seq_ = ['-'] * len(sequence)
for i in hg_pse_list:
pse_seq_[i - 1] = 'U' # <font color=#DC143C>U</font>
pse_seq__split = cuttingText(''.join(pse_seq_), 60)
for line in htmlf.readlines():
if '<li>>' not in line:
result_real.write(line.strip())
result_real.write('\n')
elif '<li>>N1 :GUGAUAUAACUCAGUGGCAGA</li>' in line:
result_real.write(Fasta_name)
result_real.write('<br>')
for i in range(len(seq_split)):
pse_seq = pse_seq__split[i].replace('U', '<font color=#DC143C>U</font>')
result_real.write('    ')
result_real.write(''.join(seq_split[i]))
result_real.write('    ')
if i < len(seq_split) - 1:
result_real.write(str((i + 1) * 60))
result_real.write('<br>')
result_real.write('Pse:')
result_real.write(''.join(pse_seq))
result_real.write('<br>')
result_poosi.write('Pse:')
result_poosi.write('<br>')
result_poosi.write(hg_line_pse_pro)
elif modification == 'A-to-I':
hg_line_AI, hg_line_AI_pro, hg_AI_list = predictType.human_XG_iRNA_AI(sequence, species, modification,
threshold)
seq_split = cuttingText(sequence, 60)
AI_seq_ = ['-'] * len(sequence)
for i in hg_AI_list:
AI_seq_[i - 1] = 'A' # <font color=#9932CC>A</font>
AI_seq__split = cuttingText(''.join(AI_seq_), 60)
for line in htmlf.readlines():
if '<li>>' not in line:
result_real.write(line.strip())
result_real.write('\n')
elif '<li>>N1 :GUGAUAUAACUCAGUGGCAGA</li>' in line:
result_real.write(Fasta_name)
result_real.write('<br>')
for i in range(len(seq_split)):
AI_seq = AI_seq__split[i].replace('A', '<font color=#9932CC>A</font>')
result_real.write('    ')
result_real.write(''.join(seq_split[i]))
result_real.write('    ')
if i < len(seq_split) - 1:
result_real.write(str((i + 1) * 60))
result_real.write('<br>')
result_real.write('A-I:')
result_real.write(''.join(AI_seq))
result_real.write('<br>')
result_poosi.write('A-I:')
result_poosi.write('<br>')
result_poosi.write(hg_line_AI_pro)
elif modification == 'all':
modification = 'pseudouridine'
hg_line_Pse, hg_line_Pse_pro, hg_Pse_list = predictType.human_XG_iRNA_pse(sequence, species, modification,
threshold)
modification = 'A-to-I'
hg_line_AI, hg_line_AI_pro, hg_AI_list = predictType.human_XG_iRNA_AI(sequence, species, modification,
threshold)
modification = 'm5C'
hg_line_M5C, hg_line_M5C_pro, hg_M5C_list = predictType.human_XG_iRNA_M5C(sequence, species, modification,
threshold)
modification = 'm6A'
hg_line_M6A, hg_line_M6A_pro, hg_M6A_list = predictType.human_XG_iRNA_M6A(sequence, species, modification,
threshold)
modification = 'm1A'
hg_line_M1A, hg_line_M1A_pro, hg_M1A_list = predictType.human_XG_iRNA_M1A(sequence, species, modification,
threshold)
seq_split = cuttingText(sequence, 60)
pse_seq_ = ['-'] * len(sequence)
AI_seq_ = ['-'] * len(sequence)
M5C_seq_ = ['-'] * len(sequence)
M6A_seq_ = ['-'] * len(sequence)
M1A_seq_ = ['-'] * len(sequence)
for i in hg_Pse_list:
pse_seq_[i - 1] = 'U' # <font color=#DC143C>U</font>
pse_seq__split = cuttingText(''.join(pse_seq_), 60)
for i in hg_AI_list:
AI_seq_[i - 1] = 'A' # <font color=#9932CC>A</font>
AI_seq__split = cuttingText(''.join(AI_seq_), 60)
for i in hg_M5C_list:
M5C_seq_[i - 1] = 'C' # <font color=#FF7F50>C</font>
M5C_seq__split = cuttingText(''.join(M5C_seq_), 60)
for i in hg_M6A_list:
M6A_seq_[i - 1] = 'A' # <font color=#00FF00>A</font>
M6A_seq__split = cuttingText(''.join(M6A_seq_), 60)
for i in hg_M1A_list:
M1A_seq_[i - 1] = 'A' # <font color=#1E90FF>A</font>
M1A_seq__split = cuttingText(''.join(M1A_seq_), 60)
for line in htmlf.readlines():
if '<li>>' not in line:
result_real.write(line.strip())
result_real.write('\n')
elif '<li>>N1 :GUGAUAUAACUCAGUGGCAGA</li>' in line:
result_real.write(Fasta_name)
result_real.write('<br>')
for i in range(len(seq_split)):
pse_seq = pse_seq__split[i].replace('U', '<font color=#DC143C>U</font>')
AI_seq = AI_seq__split[i].replace('A', '<font color=#9932CC>A</font>')
M5C_seq = M5C_seq__split[i].replace('C', '<font color=#FF7F50>C</font>')
M6A_seq = M6A_seq__split[i].replace('A', '<font color=#00FF00>A</font>')
M1A_seq = M1A_seq__split[i].replace('A', '<font color=#1E90FF>A</font>')
result_real.write('    ')
result_real.write(''.join(seq_split[i]))
result_real.write('    ')
if i < len(seq_split) - 1:
result_real.write(str((i + 1) * 60))
result_real.write('<br>')
result_real.write('PSE:')
result_real.write(''.join(pse_seq))
result_real.write('<br>')
result_real.write('A-I:')
result_real.write(''.join(AI_seq))
result_real.write('<br>')
result_real.write('m6A:')
result_real.write(''.join(M6A_seq))
result_real.write('<br>')
result_real.write('m1A:')
result_real.write(''.join(M1A_seq))
result_real.write('<br>')
result_real.write('m5C:')
result_real.write(''.join(M5C_seq))
result_real.write('<br>')
result_poosi.write('PSE:')
result_poosi.write(hg_line_Pse_pro)
result_poosi.write('<br>')
result_poosi.write('AI:')
result_poosi.write(hg_line_AI_pro)
result_poosi.write('<br>')
result_poosi.write('m6A:')
result_poosi.write(hg_line_M6A_pro)
result_poosi.write('<br>')
result_poosi.write('m1A:')
result_poosi.write(hg_line_M1A_pro)
result_poosi.write('<br>')
result_poosi.write('m5C:')
result_poosi.write(hg_line_M5C_pro)
if species == 'Yeast': # --- Yeast -----------------------------------------------------
if modification == 'm1A':
sc_line_M1A, sc_line_M1A_pro, sc_M1A_list = predictType.sc_XG_iRNA_M1A(sequence, species, modification,
threshold)
seq_split = cuttingText(sequence, 60)
M1A_seq_ = ['-'] * len(sequence)
for i in sc_M1A_list:
M1A_seq_[i - 1] = 'A' # <font color=#1E90FF>A</font>
M1A_seq__split = cuttingText(''.join(M1A_seq_), 60)
for line in htmlf.readlines():
if '<li>>' not in line:
result_real.write(line.strip())
result_real.write('\n')
elif '<li>>N1 :GUGAUAUAACUCAGUGGCAGA</li>' in line:
result_real.write(Fasta_name)
result_real.write('<br>')
for i in range(len(seq_split)):
M1A_seq = M1A_seq__split[i].replace('A', '<font color=#1E90FF>A</font>')
result_real.write('    ')
result_real.write(''.join(seq_split[i]))
result_real.write('    ')
if i < len(seq_split) - 1:
result_real.write(str((i + 1) * 60))
result_real.write('<br>')
result_real.write('m1A:')
result_real.write(''.join(M1A_seq))
result_real.write('<br>')
result_poosi.write('m1A:')
result_poosi.write(sc_line_M1A_pro)
elif modification == 'm6A':
sc_line_M6A, sc_line_M6A_pro, sc_M6A_list = predictType.sc_XG_iRNA_M6A(sequence, species, modification,
threshold)
seq_split = cuttingText(sequence, 60)
M6A_seq_ = ['-'] * len(sequence)
for i in sc_M6A_list:
M6A_seq_[i - 1] = 'A' # <font color=#00FF00>A</font>
M6A_seq__split = cuttingText(''.join(M6A_seq_), 60)
for line in htmlf.readlines():
if '<li>>' not in line:
result_real.write(line.strip())
result_real.write('\n')
elif '<li>>N1 :GUGAUAUAACUCAGUGGCAGA</li>' in line:
result_real.write(Fasta_name)
result_real.write('<br>')
for i in range(len(seq_split)):
M6A_seq = M6A_seq__split[i].replace('A', '<font color=#00FF00>A</font>')
result_real.write('    ')
result_real.write(''.join(seq_split[i]))
result_real.write('    ')
if i < len(seq_split) - 1:
result_real.write(str((i + 1) * 60))
result_real.write('<br>')
result_real.write('m6A:')
result_real.write(''.join(M6A_seq))
result_real.write('<br>')
result_poosi.write('m6A:')
result_poosi.write(sc_line_M6A_pro)
elif modification == 'm5C':
sc_line_M5C, sc_line_M5C_pro, sc_M5C_list = predictType.sc_XG_iRNA_M5C(sequence, species, modification,
threshold)
seq_split = cuttingText(sequence, 60)
M5C_seq_ = ['-'] * len(sequence)
for i in sc_M5C_list:
M5C_seq_[i - 1] = 'C' # <font color=#FF7F50>C</font>
M5C_seq__split = cuttingText(''.join(M5C_seq_), 60)
for line in htmlf.readlines():
if '<li>>' not in line:
result_real.write(line.strip())
result_real.write('\n')
elif '<li>>N1 :GUGAUAUAACUCAGUGGCAGA</li>' in line:
result_real.write(Fasta_name)
result_real.write('<br>')
for i in range(len(seq_split)):
M5C_seq = M5C_seq__split[i].replace('C', '<font color=#FF7F50>C</font>')
result_real.write('    ')
result_real.write(''.join(seq_split[i]))
result_real.write('    ')
if i < len(seq_split) - 1:
result_real.write(str((i + 1) * 60))
result_real.write('<br>')
result_real.write('m5C:')
result_real.write(''.join(M5C_seq))
result_real.write('<br>')
result_poosi.write('m5C:')
result_poosi.write(sc_line_M5C_pro)
elif modification == 'pseudouridine':
sc_line_pse, sc_line_pse_pro, sc_pse_list = predictType.sc_XG_iRNA_pse(sequence, species, modification,
threshold)
seq_split = cuttingText(sequence, 60)
pse_seq_ = ['-'] * len(sequence)
for i in sc_pse_list:
pse_seq_[i - 1] = 'U' # <font color=#DC143C>U</font>
pse_seq__split = cuttingText(''.join(pse_seq_), 60)
for line in htmlf.readlines():
if '<li>>' not in line:
result_real.write(line.strip())
result_real.write('\n')
elif '<li>>N1 :GUGAUAUAACUCAGUGGCAGA</li>' in line:
result_real.write(Fasta_name)
result_real.write('<br>')
for i in range(len(seq_split)):
pse_seq = pse_seq__split[i].replace('U', '<font color=#DC143C>U</font>')
result_real.write('    ')
result_real.write(''.join(seq_split[i]))
result_real.write('    ')
if i < len(seq_split) - 1:
result_real.write(str((i + 1) * 60))
result_real.write('<br>')
result_real.write('PSE:')
result_real.write(''.join(pse_seq))
result_real.write('<br>')
result_poosi.write('PSE:')
result_poosi.write(sc_line_pse_pro)
elif modification == 'all':
modification = 'pseudouridine'
sc_line_Pse, sc_line_Pse_pro, sc_Pse_list = predictType.sc_XG_iRNA_pse(sequence, species, modification,
threshold)
modification = 'm5C'
sc_line_M5C, sc_line_M5C_pro, sc_M5C_list = predictType.sc_XG_iRNA_M5C(sequence, species, modification,
threshold)
modification = 'm6A'
sc_line_M6A, sc_line_M6A_pro, sc_M6A_list = predictType.sc_XG_iRNA_M6A(sequence, species, modification,
threshold)
modification = 'm1A'
sc_line_M1A, sc_line_M1A_pro, sc_M1A_list = predictType.sc_XG_iRNA_M1A(sequence, species, modification,
threshold)
seq_split = cuttingText(sequence, 60)
pse_seq_ = ['-'] * len(sequence)
M5C_seq_ = ['-'] * len(sequence)
M6A_seq_ = ['-'] * len(sequence)
M1A_seq_ = ['-'] * len(sequence)
for i in sc_Pse_list:
pse_seq_[i - 1] = 'U' # <font color=#DC143C>U</font>
        pse_seq__split = cuttingText(''.join(pse_seq_), 60)
        for i in sc_M5C_list:
            M5C_seq_[i - 1] = 'C'  # <font color=#FF7F50>C</font>
        M5C_seq__split = cuttingText(''.join(M5C_seq_), 60)
        for i in sc_M6A_list:
            M6A_seq_[i - 1] = 'A'  # <font color=#00FF00>A</font>
        M6A_seq__split = cuttingText(''.join(M6A_seq_), 60)
        for i in sc_M1A_list:
            M1A_seq_[i - 1] = 'A'  # <font color=#1E90FF>A</font>
        M1A_seq__split = cuttingText(''.join(M1A_seq_), 60)
        for line in htmlf.readlines():
            if '<li>>' not in line:
                result_real.write(line.strip())
                result_real.write('\n')
            elif '<li>>N1 :GUGAUAUAACUCAGUGGCAGA</li>' in line:
                result_real.write(Fasta_name)
                result_real.write('<br>')
                for i in range(len(seq_split)):
                    pse_seq = pse_seq__split[i].replace('U', '<font color=#DC143C>U</font>')
                    M5C_seq = M5C_seq__split[i].replace('C', '<font color=#FF7F50>C</font>')
                    M6A_seq = M6A_seq__split[i].replace('A', '<font color=#00FF00>A</font>')
                    M1A_seq = M1A_seq__split[i].replace('A', '<font color=#1E90FF>A</font>')
                    result_real.write('    ')
                    result_real.write(''.join(seq_split[i]))
                    result_real.write('    ')
                    if i < len(seq_split) - 1:
                        result_real.write(str((i + 1) * 60))
                    result_real.write('<br>')
                    result_real.write('PSE:')
                    result_real.write(''.join(pse_seq))
                    result_real.write('<br>')
                    result_real.write('m6A:')
                    result_real.write(''.join(M6A_seq))
                    result_real.write('<br>')
                    result_real.write('m1A:')
                    result_real.write(''.join(M1A_seq))
                    result_real.write('<br>')
                    result_real.write('m5C:')
                    result_real.write(''.join(M5C_seq))
                    result_real.write('<br>')
                result_poosi.write('PSE:')
                result_poosi.write(sc_line_Pse_pro)
                result_poosi.write('<br>')
                result_poosi.write('m6A:')
                result_poosi.write(sc_line_M6A_pro)
                result_poosi.write('<br>')
                result_poosi.write('m1A:')
                result_poosi.write(sc_line_M1A_pro)
                result_poosi.write('<br>')
                result_poosi.write('m5C:')
                result_poosi.write(sc_line_M5C_pro)
if species == 'Mouse':  # Mouse
    if modification == 'm1A':
        mm_line_M1A, mm_line_M1A_pro, mm_M1A_list = predictType.mm_XG_iRNA_M1A(sequence, species, modification,
                                                                               threshold)
        seq_split = cuttingText(sequence, 60)
        M1A_seq_ = ['-'] * len(sequence)
        for i in mm_M1A_list:
            M1A_seq_[i - 1] = 'A'  # <font color=#1E90FF>A</font>
        M1A_seq__split = cuttingText(''.join(M1A_seq_), 60)
        for line in htmlf.readlines():
            if '<li>>' not in line:
                result_real.write(line.strip())
                result_real.write('\n')
            elif '<li>>N1 :GUGAUAUAACUCAGUGGCAGA</li>' in line:
                result_real.write(Fasta_name)
                result_real.write('<br>')
                for i in range(len(seq_split)):
                    M1A_seq = M1A_seq__split[i].replace('A', '<font color=#1E90FF>A</font>')
                    result_real.write('    ')
                    result_real.write(''.join(seq_split[i]))
                    result_real.write('    ')
                    if i < len(seq_split) - 1:
                        result_real.write(str((i + 1) * 60))
                    result_real.write('<br>')
                    result_real.write('m1A:')
                    result_real.write(''.join(M1A_seq))
                    result_real.write('<br>')
                result_poosi.write('m1A:')
                result_poosi.write(mm_line_M1A_pro)
    elif modification == 'm5C':
        mm_line_M5C, mm_line_M5C_pro, mm_M5C_list = predictType.mm_XG_iRNA_M5C(sequence, species, modification,
                                                                               threshold)
        seq_split = cuttingText(sequence, 60)
        M5C_seq_ = ['-'] * len(sequence)
        for i in mm_M5C_list:
            M5C_seq_[i - 1] = 'C'  # <font color=#FF7F50>C</font>
        M5C_seq__split = cuttingText(''.join(M5C_seq_), 60)
        for line in htmlf.readlines():
            if '<li>>' not in line:
                result_real.write(line.strip())
                result_real.write('\n')
            elif '<li>>N1 :GUGAUAUAACUCAGUGGCAGA</li>' in line:
                result_real.write(Fasta_name)
                result_real.write('<br>')
                for i in range(len(seq_split)):
                    M5C_seq = M5C_seq__split[i].replace('C', '<font color=#FF7F50>C</font>')
                    result_real.write('    ')
                    result_real.write(''.join(seq_split[i]))
                    result_real.write('    ')
                    if i < len(seq_split) - 1:
                        result_real.write(str((i + 1) * 60))
                    result_real.write('<br>')
                    result_real.write('m5C:')
                    result_real.write(''.join(M5C_seq))
                    result_real.write('<br>')
                result_poosi.write('m5C:')
                result_poosi.write(mm_line_M5C_pro)
    elif modification == 'pseudouridine':
        mm_line_pse, mm_line_pse_pro, mm_pse_list = predictType.mm_XG_iRNA_pse(sequence, species, modification,
                                                                               threshold)
        seq_split = cuttingText(sequence, 60)
        pse_seq_ = ['-'] * len(sequence)
        for i in mm_pse_list:
            pse_seq_[i - 1] = 'U'  # <font color=#DC143C>U</font>
        pse_seq__split = cuttingText(''.join(pse_seq_), 60)
        for line in htmlf.readlines():
            if '<li>>' not in line:
                result_real.write(line.strip())
                result_real.write('\n')
            elif '<li>>N1 :GUGAUAUAACUCAGUGGCAGA</li>' in line:
                result_real.write(Fasta_name)
                result_real.write('<br>')
                for i in range(len(seq_split)):
                    pse_seq = pse_seq__split[i].replace('U', '<font color=#DC143C>U</font>')
                    result_real.write('    ')
                    result_real.write(''.join(seq_split[i]))
                    result_real.write('    ')
                    if i < len(seq_split) - 1:
                        result_real.write(str((i + 1) * 60))
                    result_real.write('<br>')
                    result_real.write('PSE:')
                    result_real.write(''.join(pse_seq))
                    result_real.write('<br>')
                result_poosi.write('PSE:')
                result_poosi.write(mm_line_pse_pro)
    elif modification == 'm6A':
        mm_line_M6A, mm_line_M6A_pro, mm_M6A_list = predictType.mm_XG_iRNA_M6A(sequence, species, modification,
                                                                               threshold)
        seq_split = cuttingText(sequence, 60)
        M6A_seq_ = ['-'] * len(sequence)
        for i in mm_M6A_list:
            M6A_seq_[i - 1] = 'A'  # <font color=#00FF00>A</font>
        M6A_seq__split = cuttingText(''.join(M6A_seq_), 60)
        for line in htmlf.readlines():
            if '<li>>' not in line:
                result_real.write(line.strip())
                result_real.write('\n')
            elif '<li>>N1 :GUGAUAUAACUCAGUGGCAGA</li>' in line:
                result_real.write(Fasta_name)
                result_real.write('<br>')
                for i in range(len(seq_split)):
                    M6A_seq = M6A_seq__split[i].replace('A', '<font color=#00FF00>A</font>')
                    result_real.write('    ')
                    result_real.write(''.join(seq_split[i]))
                    result_real.write('    ')
                    if i < len(seq_split) - 1:
                        result_real.write(str((i + 1) * 60))
                    result_real.write('<br>')
                    result_real.write('m6A:')
                    result_real.write(''.join(M6A_seq))
                    result_real.write('<br>')
                result_poosi.write('m6A:')
                result_poosi.write(mm_line_M6A_pro)
    elif modification == 'all':
        modification = 'pseudouridine'
        mm_line_Pse, mm_line_Pse_pro, mm_Pse_list = predictType.mm_XG_iRNA_pse(sequence, species, modification,
                                                                               threshold)
        modification = 'm5C'
        mm_line_M5C, mm_line_M5C_pro, mm_M5C_list = predictType.mm_XG_iRNA_M5C(sequence, species, modification,
                                                                               threshold)
        modification = 'm1A'
        mm_line_M1A, mm_line_M1A_pro, mm_M1A_list = predictType.mm_XG_iRNA_M1A(sequence, species, modification,
                                                                               threshold)
        modification = 'm6A'
        mm_line_M6A, mm_line_M6A_pro, mm_M6A_list = predictType.mm_XG_iRNA_M6A(sequence, species, modification,
                                                                               threshold)
        seq_split = cuttingText(sequence, 60)
        pse_seq_ = ['-'] * len(sequence)
        M5C_seq_ = ['-'] * len(sequence)
        M1A_seq_ = ['-'] * len(sequence)
        M6A_seq_ = ['-'] * len(sequence)
        for i in mm_Pse_list:
            pse_seq_[i - 1] = 'U'  # <font color=#DC143C>U</font>
        pse_seq__split = cuttingText(''.join(pse_seq_), 60)
        for i in mm_M5C_list:
            M5C_seq_[i - 1] = 'C'  # <font color=#FF7F50>C</font>
        M5C_seq__split = cuttingText(''.join(M5C_seq_), 60)
        for i in mm_M1A_list:
            M1A_seq_[i - 1] = 'A'  # <font color=#1E90FF>A</font>
        M1A_seq__split = cuttingText(''.join(M1A_seq_), 60)
        for i in mm_M6A_list:
            M6A_seq_[i - 1] = 'A'  # <font color=#00FF00>A</font>
        M6A_seq__split = cuttingText(''.join(M6A_seq_), 60)
        for line in htmlf.readlines():
            if '<li>>' not in line:
                result_real.write(line.strip())
                result_real.write('\n')
            elif '<li>>N1 :GUGAUAUAACUCAGUGGCAGA</li>' in line:
                result_real.write(Fasta_name)
                result_real.write('<br>')
                for i in range(len(seq_split)):
                    pse_seq = pse_seq__split[i].replace('U', '<font color=#DC143C>U</font>')
                    M5C_seq = M5C_seq__split[i].replace('C', '<font color=#FF7F50>C</font>')
                    M1A_seq = M1A_seq__split[i].replace('A', '<font color=#1E90FF>A</font>')
                    M6A_seq = M6A_seq__split[i].replace('A', '<font color=#00FF00>A</font>')
                    result_real.write('    ')
                    result_real.write(''.join(seq_split[i]))
                    result_real.write('    ')
                    if i < len(seq_split) - 1:
                        result_real.write(str((i + 1) * 60))
                    result_real.write('<br>')
                    result_real.write('PSE:')
                    result_real.write(''.join(pse_seq))
                    result_real.write('<br>')
                    result_real.write('m6A:')
                    result_real.write(''.join(M6A_seq))
                    result_real.write('<br>')
                    result_real.write('m1A:')
                    result_real.write(''.join(M1A_seq))
                    result_real.write('<br>')
                    result_real.write('m5C:')
                    result_real.write(''.join(M5C_seq))
                    result_real.write('<br>')
                result_poosi.write('PSE:')
                result_poosi.write(mm_line_Pse_pro)
                result_poosi.write('<br>')
                result_poosi.write('m6A:')
                result_poosi.write(mm_line_M6A_pro)
                result_poosi.write('<br>')
                result_poosi.write('m1A:')
                result_poosi.write(mm_line_M1A_pro)
                result_poosi.write('<br>')
                result_poosi.write('m5C:')
                result_poosi.write(mm_line_M5C_pro)
htmlf.close()
result_real.close()
result_poosi.close()
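# `cuttingText` is imported from elsewhere in this project and is not shown above.
# A minimal stand-in (hypothetical name, assumed behavior) that splits a sequence
# into fixed-width 60-character display lines could look like:

```python
def cutting_text(seq, width=60):
    # Split `seq` into consecutive chunks of at most `width` characters,
    # matching how the report writer emits 60-character sequence rows.
    return [seq[i:i + width] for i in range(0, len(seq), width)]

cutting_text("AUGCAUGCAUGC", 5)  # ['AUGCA', 'UGCAU', 'GC']
```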
# --- examples/trials/chao_trial/SK_Manual.py (repo: chyan0411/nni, license: MIT) ---
#!/usr/bin/env python
# coding: utf-8
# In[1]:
import multiprocessing
multiprocessing.set_start_method('forkserver')
import itertools
import numpy as np
import pandas as pd
from numpy import mean
from numpy import std
import matplotlib.pyplot as plt
from sklearn.datasets import make_hastie_10_2
from sklearn.ensemble import GradientBoostingClassifier,RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score
from sklearn.metrics import confusion_matrix,accuracy_score
from sklearn.model_selection import cross_val_score
from sklearn.model_selection import RepeatedStratifiedKFold
from sklearn import metrics
get_ipython().run_line_magic('matplotlib', 'inline')
# In[30]:
FILE_PATH = 'Verdika_remove_unnecessary_done_v2.csv'
data = pd.read_csv(FILE_PATH)
data.head(5)
X = data.drop(['Swapped'], axis = 1)
y = data.Swapped
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.20, random_state=42)
model = GradientBoostingClassifier(n_estimators=100, learning_rate=0.05, max_depth=5, random_state=0, verbose = 1)
cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=3, random_state=1)
n_scores = cross_val_score(model, X_train, y_train, scoring='roc_auc_ovr_weighted', cv=cv, n_jobs=-1, error_score='raise')
print('AUC: %.3f (%.3f)' % (mean(n_scores), std(n_scores)))
# fit the model on the training set
model.fit(X_train, y_train)
from sklearn.metrics import classification_report, precision_score, recall_score, f1_score, precision_recall_fscore_support
y_preds = model.predict(X_test)
print('Precision : %.3f'%precision_score(y_test, y_preds))
print('Recall : %.3f'%recall_score(y_test, y_preds))
print('F1-Score : %.3f'%f1_score(y_test, y_preds))
print('\nPrecision Recall F1-Score Support Per Class : \n',precision_recall_fscore_support(y_test, y_preds))
print('\nClassification Report : ')
print(classification_report(y_test, y_preds))
metrics.roc_auc_score(y_test, y_preds, average='weighted')
# In[3]:
FILE_PATH = 'Verdika_remove_unnecessary_done_v2.csv'
data = pd.read_csv(FILE_PATH)
data.head(5)
X = data.drop(['Swapped'], axis = 1)
y = data.Swapped
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.20, random_state=42)
model = RandomForestClassifier(n_estimators=10,
                               max_depth=5,
                               random_state=0,
                               verbose=1,
                               min_samples_leaf=10)
# random forest for classification in scikit-learn
cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=3, random_state=1)
n_scores = cross_val_score(model, X_train, y_train, scoring='roc_auc_ovr_weighted', cv=cv, n_jobs=-1, error_score='raise')
print('AUC: %.3f (%.3f)' % (mean(n_scores), std(n_scores)))
# fit the model on the training set
model.fit(X_train, y_train)
from sklearn.metrics import classification_report, precision_score, recall_score, f1_score, precision_recall_fscore_support
y_preds = model.predict(X_test)
print('Precision : %.3f'%precision_score(y_test, y_preds))
print('Recall : %.3f'%recall_score(y_test, y_preds))
print('F1-Score : %.3f'%f1_score(y_test, y_preds))
print('\nPrecision Recall F1-Score Support Per Class : \n',precision_recall_fscore_support(y_test, y_preds))
print('\nClassification Report : ')
print(classification_report(y_test, y_preds))
metrics.roc_auc_score(y_test, y_preds, average='weighted')
# In[4]:
from sklearn.metrics import balanced_accuracy_score
print('Balanced Accuracy : ',balanced_accuracy_score(y_test, y_preds))
# In[6]:
FILE_PATH = 'Verdika_remove_unnecessary_done_v2.csv'
data = pd.read_csv(FILE_PATH)
data.head(5)
X = data.drop(['Swapped'], axis = 1)
y = data.Swapped
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.20, random_state=42)
model = RandomForestClassifier(n_estimators=10,
                               max_depth=5,
                               random_state=0,
                               verbose=1,
                               min_samples_leaf=10)
# random forest for classification in scikit-learn
cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=1, random_state=1)
n_scores = cross_val_score(model, X_train, y_train, scoring='roc_auc_ovr_weighted', cv=cv, n_jobs=-1, error_score='raise')
print('AUC: %.3f (%.3f)' % (mean(n_scores), std(n_scores)))
# fit the model on the training set
model.fit(X_train, y_train)
from sklearn.metrics import classification_report, precision_score, recall_score, f1_score, precision_recall_fscore_support
y_preds = model.predict(X_test)
print('Precision : %.3f'%precision_score(y_test, y_preds))
print('Recall : %.3f'%recall_score(y_test, y_preds))
print('F1-Score : %.3f'%f1_score(y_test, y_preds))
print('\nPrecision Recall F1-Score Support Per Class : \n',precision_recall_fscore_support(y_test, y_preds))
print('\nClassification Report : ')
print(classification_report(y_test, y_preds))
metrics.roc_auc_score(y_test, y_preds, average='weighted')
# In[7]:
from xgboost import XGBClassifier
FILE_PATH = 'Verdika_remove_unnecessary_done_v2.csv'
data = pd.read_csv(FILE_PATH)
data.head(5)
X = data.drop(['Swapped'], axis = 1)
y = data.Swapped
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.20, random_state=42)
model = XGBClassifier(n_estimators=10,
                      max_depth=5,
                      random_state=0,
                      verbose=1,
                      learning_rate=0.01,
                      n_jobs=-1)
# XGBoost for classification via the scikit-learn-compatible wrapper
cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=1, random_state=1)
n_scores = cross_val_score(model, X_train, y_train, scoring='roc_auc_ovr_weighted', cv=cv, error_score='raise')
print('AUC: %.3f (%.3f)' % (mean(n_scores), std(n_scores)))
# fit the model on the training set
model.fit(X_train, y_train)
from sklearn.metrics import classification_report, precision_score, recall_score, f1_score, precision_recall_fscore_support
y_preds = model.predict(X_test)
print('Precision : %.3f'%precision_score(y_test, y_preds))
print('Recall : %.3f'%recall_score(y_test, y_preds))
print('F1-Score : %.3f'%f1_score(y_test, y_preds))
print('\nPrecision Recall F1-Score Support Per Class : \n',precision_recall_fscore_support(y_test, y_preds))
print('\nClassification Report : ')
print(classification_report(y_test, y_preds))
metrics.roc_auc_score(y_test, y_preds, average='weighted')
# --- netbox/dcim/migrations/0138_extend_tag_support.py (repo: cybarox/netbox, license: Apache-2.0) ---
# Generated by Django 3.2.8 on 2021-10-21 14:50
from django.db import migrations
import taggit.managers
class Migration(migrations.Migration):

    dependencies = [
        ('extras', '0062_clear_secrets_changelog'),
        ('dcim', '0137_relax_uniqueness_constraints'),
    ]

    operations = [
        migrations.AddField(
            model_name='devicerole',
            name='tags',
            field=taggit.managers.TaggableManager(through='extras.TaggedItem', to='extras.Tag'),
        ),
        migrations.AddField(
            model_name='location',
            name='tags',
            field=taggit.managers.TaggableManager(through='extras.TaggedItem', to='extras.Tag'),
        ),
        migrations.AddField(
            model_name='manufacturer',
            name='tags',
            field=taggit.managers.TaggableManager(through='extras.TaggedItem', to='extras.Tag'),
        ),
        migrations.AddField(
            model_name='platform',
            name='tags',
            field=taggit.managers.TaggableManager(through='extras.TaggedItem', to='extras.Tag'),
        ),
        migrations.AddField(
            model_name='rackrole',
            name='tags',
            field=taggit.managers.TaggableManager(through='extras.TaggedItem', to='extras.Tag'),
        ),
        migrations.AddField(
            model_name='region',
            name='tags',
            field=taggit.managers.TaggableManager(through='extras.TaggedItem', to='extras.Tag'),
        ),
        migrations.AddField(
            model_name='sitegroup',
            name='tags',
            field=taggit.managers.TaggableManager(through='extras.TaggedItem', to='extras.Tag'),
        ),
    ]
# --- schematic/schemas/df_parser.py (repo: nf-osi/schematic, license: MIT) ---
import os
import string
import re
import io
import requests
import logging
from typing import Any, Dict, Optional, Text # allows specifying explicit variable types
import pandas as pd
import numpy as np
from schematic.schemas.explorer import SchemaExplorer
from schematic import LOADER
logger = logging.getLogger(__name__)
"""
Utility for converting csv file containing a data model definition schema (see scRNA-seq.csv for an example) into schema.org schema.
"""
# required headers for schema; may or may not abstract further; for now hardcode
required_headers = set(["Attribute", "Description", "Valid Values", "DependsOn", "Required", "Parent", "Properties", "DependsOn Component", "Source", "Validation Rules"])
def get_class(se: SchemaExplorer, class_display_name: str, description: str = None, subclass_of: list = None, requires_dependencies: list = None, requires_range: list = None, requires_components: list = None, required: bool = None, validation_rules: list = None) -> dict:
    """Constructs a new schema.org compliant class given a set of schema object attributes

    Args:
        se: a schema explorer object allowing the traversal and modification of a schema graph
        class_display_name: human readable label for the schema object/attribute: key characteristic X of the assay, related protocol, or downstream data that we want to record as metadata feature
        description: definition or a reference containing the definition of attribute X. Preferably provide a source ontology link or code in addition to the definition.
        subclass_of: *schema* label of this attribute/object's parent node in the schema
        requires_dependencies: important characteristics, if any, of attribute X that need to be recorded as metadata features given attribute X is specified. These characteristics are attributes themselves and need to pre-exist in the schema as such
        requires_range: a set/range of values that this attribute can be assigned to. This domain is stored in the rangeIncludes property of this object.
        requires_components: a set of associated components/categories that this object/entity requires for its full specification; each component is a high level ontology class in which entities/objects are categorized/componentized and it is an entity on its own that needs to exist in the schema.
        required: indicates if this attribute is required or optional in a schema
        validation_rules: a list of validation rules defined for this class (e.g. defining what is a valid object of this class)

    Returns: a json schema.org object
    """

    class_name = se.get_class_label_from_display_name(class_display_name)

    # setup biothings object template with mandatory elements
    class_attributes = {
        '@id': 'bts:' + class_name,
        '@type': 'rdfs:Class',
        'rdfs:comment': description if description and not pd.isnull(description) else "TBD",
        'rdfs:label': class_name,
        'schema:isPartOf': {'@id': 'http://schema.biothings.io'}
    }

    # determine parent class of element and add subclass relationship to schema - required by biothings
    # if no subclass is provided, set a default to schema.org Thing
    if subclass_of:
        if len(subclass_of) == 1 and pd.isnull(subclass_of[0]):
            parent = {'rdfs:subClassOf': [{'@id': 'schema:Thing'}]}
        else:
            parent = {'rdfs:subClassOf': [{'@id': 'bts:' + se.get_class_label_from_display_name(sub)} for sub in subclass_of]}
    else:
        parent = {'rdfs:subClassOf': [{'@id': 'schema:Thing'}]}
    class_attributes.update(parent)

    # add optional attribute specifying attributes/objects that are required for the specification of this object
    # useful for specifying annotation requirements, for example
    if requires_dependencies:
        requirement = {'sms:requiresDependency': [{'@id': 'bts:' + dep} for dep in requires_dependencies]}
        class_attributes.update(requirement)

    # add optional attribute specifying the possible values this object can be set to; can be other objects, including primitives
    if requires_range:
        value_constraint = {'schema:rangeIncludes': [{'@id': 'bts:' + se.get_class_label_from_display_name(val)} for val in requires_range]}
        class_attributes.update(value_constraint)

    # add optional attribute specifying validation patterns associated with this object (e.g. precise definition of the object range)
    if validation_rules:
        class_attributes.update({'sms:validationRules': validation_rules})
    else:
        class_attributes.update({'sms:validationRules': []})

    # add optional attribute specifying the required components (i.e. high level ontology classes in which entities/objects are categorized/componentized)
    # that are required for the specification of this object
    if requires_components:
        requirement = {'sms:requiresComponent': [{'@id': 'bts:' + c} for c in requires_components]}
        class_attributes.update(requirement)

    if required:
        class_attributes.update({'sms:required': 'sms:true'})
    else:
        class_attributes.update({'sms:required': 'sms:false'})

    # ensure display name does not contain leading/trailing white spaces
    class_attributes.update({'sms:displayName': class_display_name.strip()})

    return class_attributes
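# For orientation, a minimal sketch of the JSON-LD dict get_class assembles for a
# hypothetical attribute "Example Attribute" with no parent, dependencies, range,
# components, or validation rules, and required left unset (the class label shown
# here assumes the display name is simply compacted to PascalCase):

```python
example_class = {
    '@id': 'bts:ExampleAttribute',
    '@type': 'rdfs:Class',
    'rdfs:comment': 'TBD',
    'rdfs:label': 'ExampleAttribute',
    'schema:isPartOf': {'@id': 'http://schema.biothings.io'},
    'rdfs:subClassOf': [{'@id': 'schema:Thing'}],  # default parent when none is given
    'sms:validationRules': [],                     # no rules supplied
    'sms:required': 'sms:false',                   # required was falsy
    'sms:displayName': 'Example Attribute',
}
```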
def get_property(se: SchemaExplorer, property_display_name: str, property_class_name: str, description: str = None, requires_range: list = None, requires_dependencies: list = None, required: bool = None, validation_rules: str = None) -> dict:
    """Constructs a new schema.org compliant property of an existing schema.org object/class; note that the property itself is a schema.org object class.

    Args:
        se: a schema explorer object allowing the traversal and modification of a schema graph
        property_display_name: human readable label for the schema object/attribute: key characteristic X of the assay, related protocol, or downstream data that we want to record as metadata feature
        property_class_name: *schema* label of the class/object that this is a property of
        description: definition or a reference containing the definition of attribute X. Preferably provide a source ontology link or code in addition to the definition.
        requires_range: what is the set/domain of values that this attribute can be assigned to; currently only used to specify primitive types. TODO: extend to reg exp patterns
        requires_dependencies: important characteristics, if any, of property X that need to be recorded as metadata features given property X is specified. These characteristics are attributes themselves and need to pre-exist in the schema as such
        validation_rules: a list of validation rules defined for this class (e.g. defining what is a valid object of this property)

    Returns: a json schema.org property object
    """

    property_name = se.get_property_label_from_display_name(property_display_name)

    property_attributes = {
        '@id': 'bts:' + property_name,
        '@type': 'rdf:Property',
        'rdfs:comment': description if description and not pd.isnull(description) else "TBD",
        'rdfs:label': property_name,
        'sms:displayName': property_display_name,
        'schema:domainIncludes': {'@id': 'bts:' + se.get_class_label_from_display_name(property_class_name)},
        'schema:isPartOf': {'@id': 'http://schema.biothings.io'},
    }

    if requires_range:
        value_constraint = {'schema:rangeIncludes': [{'@id': 'bts:' + se.get_class_label_from_display_name(val)} for val in requires_range]}
        property_attributes.update(value_constraint)

    if requires_dependencies:
        requirement = {'sms:requiresDependency': [{'@id': 'bts:' + dep} for dep in requires_dependencies]}
        property_attributes.update(requirement)

    # add optional attribute specifying validation patterns associated with this object (e.g. precise definition of the object range)
    if validation_rules:
        property_attributes.update({'sms:validationRules': validation_rules})
    else:
        property_attributes.update({'sms:validationRules': []})

    if required:
        property_attributes.update({'sms:required': 'sms:true'})
    else:
        property_attributes.update({'sms:required': 'sms:false'})

    # 'http://schema.org/domainIncludes': {'@id': 'bts:' + property_class_name},
    # 'http://schema.org/rangeIncludes': {'@id': 'schema:' + allowed_values},

    # ensure display name does not contain leading/trailing white spaces
    property_attributes.update({'sms:displayName': property_display_name.strip()})

    return property_attributes
def attribute_exists(se: SchemaExplorer, attribute_label: str) -> bool:
    """Check if a given attribute exists already in schema

    Args:
        se: a schema explorer object allowing the traversal and modification of a schema graph
        attribute_label: a schema label for the attribute to check

    Returns:
        True/False indicating if attribute exists or not
    """
    schema_graph = se.get_nx_schema()

    if attribute_label in schema_graph.nodes:
        return True

    return False
def check_schema_definition(schema_definition: pd.DataFrame) -> bool:
    """Checks if a schema definition data frame contains the right required headers.

    See schema definition guide for more details
    TODO: post and link schema definition guide

    Args:
        schema_definition: a pandas dataframe containing schema definition; see example here: https://docs.google.com/spreadsheets/d/1J2brhqO4kpeHIkNytzlqrdIiRanXDr6KD2hqjOTC9hs/edit#gid=0

    Raises: Exception
    """
    if required_headers.issubset(set(list(schema_definition.columns))):
        return
    elif (
        "Requires" in list(schema_definition.columns) or
        "Requires Component" in list(schema_definition.columns)
    ):
        raise ValueError(
            "The input CSV schema file contains the 'Requires' and/or the 'Requires "
            "Component' column headers. These columns were renamed to 'DependsOn' and "
            "'DependsOn Component', respectively. Switch to the new column names."
        )
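# The header check above boils down to a set-subset test on the CSV's columns;
# a stdlib-only sketch of the same logic (no pandas, hypothetical column list):

```python
required = {"Attribute", "Description", "Valid Values", "DependsOn", "Required",
            "Parent", "Properties", "DependsOn Component", "Source", "Validation Rules"}

# Extra columns beyond the required set (e.g. "Notes") are tolerated.
csv_columns = ["Attribute", "Description", "Valid Values", "DependsOn", "Required",
               "Parent", "Properties", "DependsOn Component", "Source",
               "Validation Rules", "Notes"]

ok = required.issubset(set(csv_columns))
```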
def create_schema_classes(schema_extension: pd.DataFrame, se: SchemaExplorer) -> SchemaExplorer:
"""Creates classes for all attributes and adds them to the schema
Args:
schema_extension: a pandas dataframe containing schema definition; see example here: https://docs.google.com/spreadsheets/d/1J2brhqO4kpeHIkNytzlqrdIiRanXDr6KD2hqjOTC9hs/edit#gid=0
se: a schema explorer object allowing the traversal and modification of a schema graph
base_schema_path: a path to a json-ld file containing an existing schema
Returns:
An updated schema explorer object
"""
try:
check_schema_definition(schema_extension)
logger.debug("Schema definition csv ready for processing!")
except:
raise ValueError(f"Schema extension headers: {set(list(schema_extension.columns))} "
f"do not match required schema headers: {required_headers}")
# get attributes from Attribute column
attributes = schema_extension[list(required_headers)].to_dict("records")
# get all properties across all attributes from Properties column
props = set(schema_extension[["Properties"]].dropna().values.flatten())
# clean properties strings
all_properties = []
for prop in props:
all_properties += [p.strip() for p in prop.split(",")]
# get both attributes and their properties (if any)
properties = schema_extension[["Attribute", "Properties"]].to_dict("records")
# property to class map
prop_2_class = {}
for record in properties:
if not pd.isnull(record["Properties"]):
props = record["Properties"].strip().split(",")
for p in props:
prop_2_class[p.strip()] = record["Attribute"]
logger.debug("Adding attributes")
for attribute in attributes:
required = None
if not pd.isnull(attribute["Required"]):
required = attribute["Required"]
if not attribute["Attribute"] in all_properties:
display_name = attribute["Attribute"]
subclass_of = None
if not pd.isnull(attribute["Parent"]):
subclass_of = [parent for parent in attribute["Parent"].strip().split(",")]
new_class = get_class(se, display_name,
description = attribute["Description"],
subclass_of = subclass_of,
required = required
)
se.update_class(new_class)
"""
print(se.get_nx_schema().nodes[new_class["rdfs:label"]])
# check if attribute doesn't already exist and add it
if not attribute_exists(se, new_class["rdfs:label"]):
se.update_class(new_class)
else:
print("ATTRIBUTE EXISTS")
print(new_class)
"""
else:
display_name = attribute["Attribute"]
new_property = get_property(se, display_name,
prop_2_class[display_name],
description = attribute["Description"],
required = required
)
# check if attribute doesn't already exist and add it
if not attribute_exists(se, new_property["rdfs:label"]):
se.update_property(new_property)
logger.debug("Done adding attributes")
#TODO check if schema already contains property - may require property context in csv schema definition
logger.debug("Adding and editing properties")
for prop in properties:
if not pd.isnull(prop["Properties"]): # a class may have or not have properties
for p in prop["Properties"].strip().split(","): # a class may have multiple properties
attribute = prop["Attribute"]
# check if property is already present as attribute under attributes column
# TODO: adjust logic below to compactify code
p = p.strip()
if p in list(schema_extension["Attribute"]):
description = schema_extension.loc[schema_extension["Attribute"] == p]["Description"].values[0]
property_info = se.explore_property(se.get_property_label_from_display_name(p))
range_values = property_info["range"] if "range" in property_info else None
requires_dependencies = property_info["dependencies"] if "dependencies" in property_info else None
required = property_info["required"] if "required" in property_info else None
new_property = get_property(se, p,
property_info["domain"],
description = description,
requires_range = range_values,
requires_dependencies = requires_dependencies,
required = required
)
se.edit_property(new_property)
else:
description = None
new_property = get_property(se, p,
attribute,
description = description
)
se.update_property(new_property)
logger.debug("Done adding properties")
# set range values and dependency requirements for each attribute
# if not already added, add each attribute in required values and dependencies to the schema extension
logger.debug("Editing attributes and properties to add requirements and value ranges")
for attribute in attributes:
# TODO: refactor processing of multi-valued cells in columns and corresponding schema updates; it would compactify code below if class and property are encapsulated as objects inheriting from a common attribute parent object
# get values in range for this attribute, if any are specified
range_values = attribute["Valid Values"]
if not pd.isnull(range_values):
logger.debug("Adding value range for " + attribute["Attribute"])
# prepare the range values list and split based on appropriate delimiter
# if the string "range_values" starts with double quotes, then extract all "valid values" within double quotes
range_values_list = []
if range_values[0] == "\"":
range_values_list = re.findall(r'"([^"]*)"', range_values)
else:
range_values_list = range_values.strip().split(",")
for val in range_values_list:
# check if value is in attributes column; add it as a class if not
if not val.strip() in list(schema_extension["Attribute"]):
#determine parent class of the new value class
# if this attribute is not a property, set it as a parent class
if not attribute["Attribute"] in all_properties:
parent = attribute["Attribute"]
else:
# this attribute is a property, set the parent to the domain class of this attribute
parent = se.get_class_by_property(attribute["Attribute"])
if not parent:
raise ValueError(f"Listed valid value: {val}, for attribute: {attribute['Attribute']} "
"must have a class parent. The extension could not be added to the schema.")
new_class = get_class(se, val,
description = None,
subclass_of = [parent]
)
# check if attribute doesn't already exist and add it
if not attribute_exists(se, new_class["rdfs:label"]):
se.update_class(new_class)
#update rangeIncludes of attribute
# if attribute is not a property, then assume it is a class
if not attribute["Attribute"] in all_properties:
logger.debug(attribute["Attribute"])
class_info = se.explore_class(se.get_class_label_from_display_name(attribute["Attribute"]))
class_info["range"].append(se.get_class_label_from_display_name(val))
class_range_edit = get_class(se, attribute["Attribute"],
description = attribute["Description"],
subclass_of = [attribute["Parent"]],
requires_dependencies = class_info["dependencies"],
requires_range = class_info["range"],
required = class_info["required"],
validation_rules = class_info["validation_rules"]
)
se.edit_class(class_range_edit)
else:
# the attribute is a property
property_info = se.explore_property(se.get_property_label_from_display_name(attribute["Attribute"]))
property_info["range"].append(se.get_class_label_from_display_name(val))
property_range_edit = get_property(se, attribute["Attribute"],
property_info["domain"],
description = property_info["description"],
requires_dependencies = property_info["dependencies"],
requires_range = property_info["range"],
required = property_info["required"],
validation_rules = property_info["validation_rules"]
)
se.edit_property(property_range_edit)
logger.debug(val + " added to value range.")
logger.debug("<<< Done adding value range for " + attribute["Attribute"])
# get validation rules for this attribute, if any are specified
validation_rules = attribute["Validation Rules"]
if not pd.isnull(validation_rules):
logger.debug(">>> Adding validation rules for " + attribute["Attribute"])
# TODO: make validation rules delimiter configurable parameter
validation_rules = [val_rule.strip() for val_rule in validation_rules.strip().split("::")]
#update validation rules of attribute
# if attribute is not a property, then assume it is a class
if not attribute["Attribute"] in all_properties:
logger.debug(attribute["Attribute"])
class_info = se.explore_class(se.get_class_label_from_display_name(attribute["Attribute"]))
class_info["validation_rules"] = validation_rules
class_val_rule_edit = get_class(se, attribute["Attribute"],
description = attribute["Description"],
subclass_of = [attribute["Parent"]],
requires_dependencies = class_info["dependencies"],
requires_range = class_info["range"],
required = class_info["required"],
validation_rules = class_info["validation_rules"]
)
se.edit_class(class_val_rule_edit)
else:
# the attribute is a property
property_info = se.explore_property(se.get_property_label_from_display_name(attribute["Attribute"]))
property_info["validation_rules"] = validation_rules
property_val_rule_edit = get_property(se, attribute["Attribute"],
property_info["domain"],
description = property_info["description"],
requires_dependencies = property_info["dependencies"],
requires_range = property_info["range"],
required = property_info["required"],
validation_rules = property_info["validation_rules"]
)
se.edit_property(property_val_rule_edit)
# get dependencies for this attribute, if any are specified
requires_dependencies = attribute["DependsOn"]
if not pd.isnull(requires_dependencies):
logger.debug(">>> Adding dependencies for " + attribute["Attribute"])
for dep in requires_dependencies.strip().split(","):
# check if dependency is a property or not
dep = dep.strip()
dep_is_property = dep in all_properties
dep_label = ""
# set dependency label based on kind of dependency: class or property
if dep_is_property:
dep_label = se.get_property_label_from_display_name(dep)
else:
dep_label = se.get_class_label_from_display_name(dep)
# check if dependency is in attributes column; add it to the list if not
if not dep.strip() in list(schema_extension["Attribute"]):
# if dependency is a property create a new property; else create a new class
if not dep_is_property:
# if this attribute is not a property, set it as a parent class
if not attribute["Attribute"] in all_properties:
parent = attribute["Attribute"]
else:
# this attribute is a property, set the parent to the domain class of this attribute
parent = se.get_class_by_property(attribute["Attribute"])
if not parent:
raise ValueError(f"Listed required dependency: {dep}, for attribute: {attribute['Attribute']} "
"must have a class parent. The extension could not be added to the schema.")
new_class = get_class(se, dep,
description = None,
subclass_of = [parent]
)
#se.update_class(new_class)
# check if attribute doesn't already exist and add it
if not attribute_exists(se, new_class["rdfs:label"]):
se.update_class(new_class)
else:
if not attribute["Attribute"] in all_properties:
domain_attribute = attribute["Attribute"]
else:
# this attribute is a property, set the domain of this property to the domain class of the attribute
domain_attribute = se.get_class_by_property(attribute["Attribute"])
if not domain_attribute:
raise ValueError(f"Listed required dependency: {dep}, must have a class parent. "
"The extension could not be added to the schema.")
description = None
new_property = get_property(se, dep,
domain_attribute,
description = description
)
# check if attribute doesn't already exist and add it
if not attribute_exists(se, new_property["rdfs:label"]):
se.update_property(new_property)
# update required dependencies of attribute
# if attribute is not a property then assume it is a class
if not attribute["Attribute"] in all_properties:
class_info = se.explore_class(se.get_class_label_from_display_name(attribute["Attribute"]))
class_info["dependencies"].append(dep_label)
class_dependencies_edit = get_class(se, attribute["Attribute"],
description = attribute["Description"],
subclass_of = [attribute["Parent"]],
requires_dependencies = class_info["dependencies"],
requires_range = class_info["range"],
required = class_info["required"],
validation_rules = class_info["validation_rules"]
)
se.edit_class(class_dependencies_edit)
else:
# the attribute is a property, so update it as a property
property_info = se.explore_property(se.get_property_label_from_display_name(attribute["Attribute"]))
property_info["dependencies"].append(dep_label)
property_dependencies_edit = get_property(se, attribute["Attribute"],
property_info["domain"],
description = property_info["description"],
requires_dependencies = property_info["dependencies"],
requires_range = property_info["range"],
required = property_info["required"],
validation_rules = property_info["validation_rules"]
)
se.edit_property(property_dependencies_edit)
logger.debug(dep + " added to dependencies.")
#TODO check for cycles in attribute dependencies schema subgraph
logger.debug("<<< Done adding dependencies for " + attribute["Attribute"])
# check if the attribute requires any components
if not pd.isnull(attribute["DependsOn Component"]):
component_dependencies = attribute["DependsOn Component"]
else:
continue
logger.debug(">>> Adding component dependencies for " + attribute["Attribute"])
# iterate over potentially multiple dependency components
for comp_dep in component_dependencies.strip().split(","):
# check if a component is already defined as an attribute; if not define it in the schema
if not comp_dep.strip() in list(schema_extension["Attribute"]):
#component is not in csv schema so try adding it as a class with a parent Thing
new_class = get_class(se, comp_dep,
description = None
)
# check if attribute doesn't already exist in schema.org schema and add it
# (component may not be in csv schema, but could be in the base schema we are extending)
if not attribute_exists(se, new_class["rdfs:label"]):
se.update_class(new_class)
#update this attribute requirements to include component
class_info = se.explore_class(se.get_class_label_from_display_name(attribute["Attribute"]))
class_info["component_dependencies"].append(se.get_class_label_from_display_name(comp_dep))
class_component_dependencies_edit = get_class(se, attribute["Attribute"],
description = class_info["description"],
subclass_of = class_info["subClassOf"],
requires_dependencies = class_info["dependencies"],
requires_range = class_info["range"],
validation_rules = class_info["validation_rules"],
requires_components = class_info["component_dependencies"]
)
se.edit_class(class_component_dependencies_edit)
#TODO check for cycles in component dependencies schema subgraph
logger.debug("<<< Done adding component dependencies for " + attribute["Attribute"])
logger.info("Done adding requirements and value ranges to attributes")
return se
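The "Validation Rules" cell is split on a hard-coded `::` delimiter (a TODO above notes this should become configurable). As an illustrative sketch — this helper is not part of the codebase — the parsing step in isolation:

```python
def parse_validation_rules(cell: str) -> list:
    # Split a spreadsheet cell on the "::" delimiter and trim whitespace,
    # mirroring the parsing performed inside the function above.
    return [rule.strip() for rule in cell.strip().split("::")]
```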
def create_nx_schema_objects(schema_extension: pd.DataFrame, se: SchemaExplorer) -> SchemaExplorer:
"""Creates classes for all attributes and adds them to the schema.
Args:
schema_extension: a pandas dataframe containing schema definition; see example here: https://docs.google.com/spreadsheets/d/1J2brhqO4kpeHIkNytzlqrdIiRanXDr6KD2hqjOTC9hs/edit#gid=0
se: a schema explorer object allowing the traversal and modification of a schema graph
Returns:
An updated schema explorer object
"""
try:
check_schema_definition(schema_extension)
logger.debug("Schema definition csv ready for processing!")
except:
raise ValueError(f"Schema extension headers: {set(list(schema_extension.columns))} "
f"do not match required schema headers: {required_headers}")
rel_dict = {
"rdfs:subClassOf": {
"parentOf": "in"
},
"schema:domainIncludes": {
"domainValue": "in"
},
"sms:requiresDependency": {
"requiresDependency": "out"
},
"sms:requiresComponent": {
"requiresComponent": "out"
},
"schema:rangeIncludes": {
"rangeValue": "out"
}
}
# get attributes from Attribute column
attributes = schema_extension[list(required_headers)].to_dict("records")
# get all properties across all attributes from Properties column
props = set(schema_extension[["Properties"]].dropna().values.flatten())
# clean properties strings
all_properties = []
for prop in props:
all_properties += [p.strip() for p in prop.split(",")]
# get both attributes and their properties (if any)
properties = schema_extension[["Attribute", "Properties"]].to_dict("records")
# property to class map
prop_2_class = {}
for record in properties:
if not pd.isnull(record["Properties"]):
props = record["Properties"].strip().split(",")
for p in props:
prop_2_class[p.strip()] = record["Attribute"]
logger.debug("Adding attributes")
for attribute in attributes:
required = None
if not pd.isnull(attribute["Required"]):
required = attribute["Required"]
if not attribute["Attribute"] in all_properties:
display_name = attribute["Attribute"]
subclass_of = None
if not pd.isnull(attribute["Parent"]):
subclass_of = [parent for parent in attribute["Parent"].strip().split(",")]
new_class = get_class(se, display_name,
description = attribute["Description"],
subclass_of = subclass_of,
required = required
)
se.add_schema_object_nx(new_class, **rel_dict)
"""
print(se.get_nx_schema().nodes[new_class["rdfs:label"]])
# check if attribute doesn't already exist and add it
if not attribute_exists(se, new_class["rdfs:label"]):
se.add_schema_object_nx(new_class, **rel_dict)
else:
print("ATTRIBUTE EXISTS")
print(new_class)
"""
else:
display_name = attribute["Attribute"]
new_property = get_property(se, display_name,
prop_2_class[display_name],
description = attribute["Description"],
required = required
)
# check if attribute doesn't already exist and add it
if not attribute_exists(se, new_property["rdfs:label"]):
se.add_schema_object_nx(new_property, **rel_dict)
logger.debug("Done adding attributes")
#TODO check if schema already contains property - may require property context in csv schema definition
logger.debug("Adding and editing properties")
for prop in properties:
if not pd.isnull(prop["Properties"]): # a class may have or not have properties
for p in prop["Properties"].strip().split(","): # a class may have multiple properties
attribute = prop["Attribute"]
# check if property is already present as attribute under attributes column
# TODO: adjust logic below to compactify code
p = p.strip()
if p in list(schema_extension["Attribute"]):
description = schema_extension.loc[schema_extension["Attribute"] == p]["Description"].values[0]
property_info = se.explore_property(se.get_property_label_from_display_name(p))
range_values = property_info["range"] if "range" in property_info else None
requires_dependencies = property_info["dependencies"] if "dependencies" in property_info else None
required = property_info["required"] if "required" in property_info else None
new_property = get_property(se, p,
property_info["domain"],
description = description,
requires_range = range_values,
requires_dependencies = requires_dependencies,
required = required
)
se.edit_schema_object_nx(new_property)
else:
description = None
new_property = get_property(se, p,
attribute,
description = description
)
se.add_schema_object_nx(new_property, **rel_dict)
logger.debug("Done adding properties")
# # set range values and dependency requirements for each attribute
# # if not already added, add each attribute in required values and dependencies to the schema extension
# print("Editing attributes and properties to add requirements and value ranges")
# print("====================================================================================")
for attribute in attributes:
# TODO: refactor processing of multi-valued cells in columns and corresponding schema updates; it would compactify code below if class and property are encapsulated as objects inheriting from a common attribute parent object
# get values in range for this attribute, if any are specified
range_values = attribute["Valid Values"]
if not pd.isnull(range_values):
# prepare the range values list and split based on appropriate delimiter
# if the string "range_values" starts with double quotes, then extract all "valid values" within double quotes
range_values_list = []
if range_values[0] == "\"":
range_values_list = re.findall(r'"([^"]*)"', range_values)
else:
range_values_list = range_values.strip().split(",")
for val in range_values_list:
# check if value is in attributes column; add it as a class if not
if not val.strip() in list(schema_extension["Attribute"]):
#determine parent class of the new value class
# if this attribute is not a property, set it as a parent class
if not attribute["Attribute"] in all_properties:
parent = attribute["Attribute"]
else:
# this attribute is a property, set the parent to the domain class of this attribute
parent = se.get_class_by_property(attribute["Attribute"])
if not parent:
raise ValueError(f"Listed valid value: {val}, for attribute: {attribute['Attribute']} "
"must have a class parent. The extension could not be added to the schema.")
new_class = get_class(se, val,
description = None,
subclass_of = [parent]
)
# check if attribute doesn't already exist and add it
if not attribute_exists(se, new_class["rdfs:label"]):
se.add_schema_object_nx(new_class, **rel_dict)
#update rangeIncludes of attribute
# if attribute is not a property, then assume it is a class
if not attribute["Attribute"] in all_properties:
class_info = se.explore_class(se.get_class_label_from_display_name(attribute["Attribute"]))
class_info["range"].append(se.get_class_label_from_display_name(val))
class_range_edit = get_class(se, attribute["Attribute"],
description = attribute["Description"],
subclass_of = [attribute["Parent"]],
requires_dependencies = class_info["dependencies"],
requires_range = class_info["range"],
required = class_info["required"],
validation_rules = class_info["validation_rules"]
)
se.edit_schema_object_nx(class_range_edit)
else:
# the attribute is a property
property_info = se.explore_property(se.get_property_label_from_display_name(attribute["Attribute"]))
property_info["range"].append(se.get_class_label_from_display_name(val))
property_range_edit = get_property(se, attribute["Attribute"],
property_info["domain"],
description = property_info["description"],
requires_dependencies = property_info["dependencies"],
requires_range = property_info["range"],
required = property_info["required"],
validation_rules = property_info["validation_rules"]
)
se.edit_schema_object_nx(property_range_edit)
logger.debug(val + " added to value range")
# get validation rules for this attribute, if any are specified
validation_rules = attribute["Validation Rules"]
if not pd.isnull(validation_rules):
# TODO: make validation rules delimiter configurable parameter
validation_rules = [val_rule.strip() for val_rule in validation_rules.strip().split("::")]
#update validation rules of attribute
# if attribute is not a property, then assume it is a class
if not attribute["Attribute"] in all_properties:
class_info = se.explore_class(se.get_class_label_from_display_name(attribute["Attribute"]))
class_info["validation_rules"] = validation_rules
class_val_rule_edit = get_class(se, attribute["Attribute"],
description = attribute["Description"],
subclass_of = [attribute["Parent"]],
requires_dependencies = class_info["dependencies"],
requires_range = class_info["range"],
required = class_info["required"],
validation_rules = class_info["validation_rules"]
)
se.edit_schema_object_nx(class_val_rule_edit)
else:
# the attribute is a property
property_info = se.explore_property(se.get_property_label_from_display_name(attribute["Attribute"]))
property_info["validation_rules"] = validation_rules
property_val_rule_edit = get_property(se, attribute["Attribute"],
property_info["domain"],
description = property_info["description"],
requires_dependencies = property_info["dependencies"],
requires_range = property_info["range"],
required = property_info["required"],
validation_rules = property_info["validation_rules"]
)
se.edit_schema_object_nx(property_val_rule_edit)
logger.debug(attribute["Attribute"] + ": validation rules added")
# get dependencies for this attribute, if any are specified
requires_dependencies = attribute["DependsOn"]
if not pd.isnull(requires_dependencies):
for dep in requires_dependencies.strip().split(","):
# check if dependency is a property or not
dep = dep.strip()
dep_is_property = dep in all_properties
dep_label = ""
# set dependency label based on kind of dependency: class or property
if dep_is_property:
dep_label = se.get_property_label_from_display_name(dep)
else:
dep_label = se.get_class_label_from_display_name(dep)
# check if dependency is in attributes column; add it to the list if not
if not dep.strip() in list(schema_extension["Attribute"]):
# if dependency is a property create a new property; else create a new class
if not dep_is_property:
# if this attribute is not a property, set it as a parent class
if not attribute["Attribute"] in all_properties:
parent = attribute["Attribute"]
else:
# this attribute is a property, set the parent to the domain class of this attribute
parent = se.get_class_by_property(attribute["Attribute"])
if not parent:
raise ValueError(f"Listed required dependency: {dep}, for attribute: {attribute['Attribute']} "
"must have a class parent. The extension could not be added to the schema.")
new_class = get_class(se, dep,
description = None,
subclass_of = [parent]
)
#se.add_schema_object_nx(new_class, **rel_dict)
# check if attribute doesn't already exist and add it
if not attribute_exists(se, new_class["rdfs:label"]):
se.add_schema_object_nx(new_class, **rel_dict)
else:
if not attribute["Attribute"] in all_properties:
domain_attribute = attribute["Attribute"]
else:
# this attribute is a property, set the domain of this property to the domain class of the attribute
domain_attribute = se.get_class_by_property(attribute["Attribute"])
if not domain_attribute:
raise ValueError(f"Listed required dependency: {dep}, must have a class parent. "
"The extension could not be added to the schema.")
description = None
new_property = get_property(se, dep,
domain_attribute,
description = description
)
# check if attribute doesn't already exist and add it
if not attribute_exists(se, new_property["rdfs:label"]):
se.add_schema_object_nx(new_property, **rel_dict)
# update required dependencies of attribute
# if attribute is not a property then assume it is a class
if not attribute["Attribute"] in all_properties:
class_info = se.explore_class(se.get_class_label_from_display_name(attribute["Attribute"]))
class_info["dependencies"].append(dep_label)
class_dependencies_edit = get_class(se, attribute["Attribute"],
description = attribute["Description"],
subclass_of = [attribute["Parent"]],
requires_dependencies = class_info["dependencies"],
requires_range = class_info["range"],
required = class_info["required"],
validation_rules = class_info["validation_rules"]
)
se.edit_schema_object_nx(class_dependencies_edit)
else:
# the attribute is a property, so update it as a property
property_info = se.explore_property(se.get_property_label_from_display_name(attribute["Attribute"]))
property_info["dependencies"].append(dep_label)
property_dependencies_edit = get_property(se, attribute["Attribute"],
property_info["domain"],
description = property_info["description"],
requires_dependencies = property_info["dependencies"],
requires_range = property_info["range"],
required = property_info["required"],
validation_rules = property_info["validation_rules"]
)
se.edit_schema_object_nx(property_dependencies_edit)
logger.debug(dep + " added to dependencies")
#TODO check for cycles in attribute dependencies schema subgraph
# check if the attribute requires any components
if not pd.isnull(attribute["DependsOn Component"]):
component_dependencies = attribute["DependsOn Component"]
else:
continue
# iterate over potentially multiple dependency components
for comp_dep in component_dependencies.strip().split(","):
# check if a component is already defined as an attribute; if not define it in the schema
if not comp_dep.strip() in list(schema_extension["Attribute"]):
#component is not in csv schema so try adding it as a class with a parent Thing
new_class = get_class(se, comp_dep,
description = None
)
# check if attribute doesn't already exist in schema.org schema and add it
# (component may not be in csv schema, but could be in the base schema we are extending)
if not attribute_exists(se, new_class["rdfs:label"]):
se.add_schema_object_nx(new_class, **rel_dict)
#update this attribute requirements to include component
class_info = se.explore_class(se.get_class_label_from_display_name(attribute["Attribute"]))
class_info["component_dependencies"].append(se.get_class_label_from_display_name(comp_dep))
class_component_dependencies_edit = get_class(se, attribute["Attribute"],
description = class_info["description"],
subclass_of = class_info["subClassOf"],
requires_dependencies = class_info["dependencies"],
requires_range = class_info["range"],
validation_rules = class_info["validation_rules"],
requires_components = class_info["component_dependencies"]
)
se.edit_schema_object_nx(class_component_dependencies_edit)
logger.debug(comp_dep + " added to dependencies")
#TODO check for cycles in component dependencies schema subgraph
logger.info("Done adding requirements and value ranges to attributes")
return se
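The property-to-class map built near the top of this function can be illustrated in isolation. The rows below are made-up examples (not from any real schema): each attribute row may carry a comma-separated "Properties" list, and every property is mapped back to the attribute (class) that declares it.

```python
import pandas as pd

# Hypothetical attribute records; "Properties" may be None/NaN or a
# comma-separated list of property display names.
records = [
    {"Attribute": "Patient", "Properties": "age, sex"},
    {"Attribute": "Sample", "Properties": None},
]

# Map each property back to its owning attribute, mirroring the
# prop_2_class construction in the function above.
prop_2_class = {}
for record in records:
    if not pd.isnull(record["Properties"]):
        for p in record["Properties"].strip().split(","):
            prop_2_class[p.strip()] = record["Attribute"]
```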
def _get_base_schema_path(base_schema: str = None) -> str:
"""Evaluate path to base schema.
Args:
base_schema: Path to base data model. BioThings data model is loaded by default.
Returns:
base_schema_path: Path to base schema based on provided argument.
"""
biothings_schema_path = LOADER.filename('data_models/biothings.model.jsonld')
base_schema_path = biothings_schema_path if base_schema is None else base_schema
return base_schema_path
def _convert_csv_to_data_model(schema_csv: str,
base_schema: str = None) -> SchemaExplorer:
"""Convert the provided CSV schema specification into a data model in JSON-LD format.
Args:
schema_csv: Path to CSV file containing data to be translated to a
JSON-LD data model. Can be a path to a local CSV or a URL.
base_schema: Path to an existing base data model; the BioThings model is loaded by default.
Returns:
base_se: SchemaExplorer object which has updated properties
(base_se.schema and base_se.schema_nx).
"""
# create data model from provided RFC
rfc_df = pd.read_csv(schema_csv)
# instantiate schema explorer
base_se = SchemaExplorer()
# determine base schema path
base_schema_path = _get_base_schema_path(base_schema)
# load base schema (BioThings)
base_se.load_schema(base_schema_path)
# call parser code that converts a dataframe of the RFC
# specs. into a JSON-LD data model
base_se = create_nx_schema_objects(rfc_df, base_se)
return base_se
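Both CSV-to-model functions above share a convention for the "Valid Values" cell: if the cell starts with a double quote, every quoted value is extracted (allowing values that contain commas); otherwise the cell is comma-split. A self-contained sketch of that convention (this helper is illustrative, not part of the codebase):

```python
import re

def parse_valid_values(cell: str) -> list:
    # Quoted form, e.g. '"a, b","c"' -- values may contain commas.
    if cell.startswith('"'):
        return re.findall(r'"([^"]*)"', cell)
    # Plain form: a simple comma-delimited list.
    return [v.strip() for v in cell.strip().split(",")]
```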
# File: DaisyXMusic/services/queues/__init__.py
# Repo: skypar/NITIN_VC__MUSIC (license: Unlicense)
from DaisyXMusic.services.queues.queues import clear
from DaisyXMusic.services.queues.queues import get
from DaisyXMusic.services.queues.queues import is_empty
from DaisyXMusic.services.queues.queues import put
from DaisyXMusic.services.queues.queues import task_done
__all__ = ["clear", "get", "is_empty", "put", "task_done"]
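This `__init__.py` re-exports five queue helpers and pins the public surface with `__all__`; the real implementations live in `DaisyXMusic.services.queues.queues`. As a rough illustration only — the class name and semantics below are assumptions, not the actual DaisyXMusic implementation — a FIFO exposing the same verbs might look like this:

```python
from collections import deque

class SimpleChatQueue:
    """Illustrative FIFO with the same verbs as the exported helpers."""
    def __init__(self):
        self._items = deque()
    def put(self, item):
        self._items.append(item)
    def get(self):
        return self._items.popleft()
    def is_empty(self):
        return not self._items
    def task_done(self):
        pass  # no-op in this sketch; a real queue may track completion
    def clear(self):
        self._items.clear()
```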
# File: tests/api/test_api_future.py
# Repo: doccccu/rqalpha (license: Apache-2.0)
#!/usr/bin/env python
# encoding: utf-8
import inspect
def test_buy_open():
from rqalpha.api import buy_open, subscribe, get_order, ORDER_STATUS, POSITION_EFFECT, SIDE
def init(context):
context.f1 = 'P88'
context.amount = 1
# context.margin_rate = 10
subscribe(context.f1)
context.order_count = 0
context.order = None
def handle_bar(context, bar_dict):
order_id = buy_open(context.f1, 1)
order = get_order(order_id)
assert order.order_book_id == context.f1
assert order.quantity == 1
assert order.status == ORDER_STATUS.ACTIVE
assert order.unfilled_quantity == 1
assert order.unfilled_quantity + order.filled_quantity == order.quantity
assert order.side == SIDE.BUY
assert order.position_effect == POSITION_EFFECT.OPEN
test_buy_open_code_new = "".join(inspect.getsourcelines(test_buy_open)[0])
def test_sell_open():
from rqalpha.api import sell_open, subscribe, get_order, ORDER_STATUS, POSITION_EFFECT, SIDE
def init(context):
context.f1 = 'P88'
context.amount = 1
# context.margin_rate = 10
subscribe(context.f1)
context.order_count = 0
context.order = None
def handle_bar(context, bar_dict):
order_id = sell_open(context.f1, 1)
order = get_order(order_id)
assert order.order_book_id == context.f1
assert order.quantity == 1
assert order.status == ORDER_STATUS.ACTIVE
assert order.unfilled_quantity == 1
assert order.unfilled_quantity + order.filled_quantity == order.quantity
assert order.side == SIDE.SELL
assert order.position_effect == POSITION_EFFECT.OPEN
test_sell_open_code_new = "".join(inspect.getsourcelines(test_sell_open)[0])
def test_buy_close():
from rqalpha.api import buy_close, subscribe, get_order, ORDER_STATUS, POSITION_EFFECT, SIDE
def init(context):
context.f1 = 'P88'
context.amount = 1
# context.margin_rate = 10
subscribe(context.f1)
context.order_count = 0
context.order = None
def handle_bar(context, bar_dict):
order_id = buy_close(context.f1, 1)
order = get_order(order_id)
assert order.order_book_id == context.f1
assert order.quantity == 1
assert order.status == ORDER_STATUS.ACTIVE
assert order.unfilled_quantity == 1
assert order.unfilled_quantity + order.filled_quantity == order.quantity
assert order.side == SIDE.BUY
assert order.position_effect == POSITION_EFFECT.CLOSE
test_buy_close_code_new = "".join(inspect.getsourcelines(test_buy_close)[0])
def test_sell_close():
from rqalpha.api import sell_close, subscribe, get_order, ORDER_STATUS, POSITION_EFFECT, SIDE
def init(context):
context.f1 = 'P88'
context.amount = 1
# context.margin_rate = 10
subscribe(context.f1)
context.order_count = 0
context.order = None
def handle_bar(context, bar_dict):
order_id = sell_close(context.f1, 1)
order = get_order(order_id)
assert order.order_book_id == context.f1
assert order.quantity == 1
assert order.status == ORDER_STATUS.ACTIVE
assert order.unfilled_quantity == 1
assert order.unfilled_quantity + order.filled_quantity == order.quantity
assert order.side == SIDE.SELL
assert order.position_effect == POSITION_EFFECT.CLOSE
test_sell_close_code_new = "".join(inspect.getsourcelines(test_sell_close)[0])
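Each test above is also captured as a source string via `inspect.getsourcelines`, presumably so the strategy code can be handed to rqalpha's runner as text. The capture step in isolation, using a hypothetical function:

```python
import inspect

def sample_handler():
    return 42

# getsourcelines() returns (source_lines, starting_lineno); joining the
# lines reproduces the function's source text -- the same trick the tests
# above use to ship strategy code as a string.
sample_code = "".join(inspect.getsourcelines(sample_handler)[0])
```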
# File: tests/wsgi_middleware/test_query_execution.py
# Repo: commandtab/ariadne (license: BSD-3-Clause)
# pylint: disable=too-many-arguments
from ariadne.constants import HTTP_STATUS_200_OK, HTTP_STATUS_400_BAD_REQUEST
operation_name = "SayHello"
variables = {"name": "Bob"}
complex_query = """
query SayHello($name: String!) {
hello(name: $name)
}
"""

def test_query_is_executed_for_post_json_request(
    middleware,
    start_response,
    graphql_query_request_factory,
    graphql_response_headers,
    assert_json_response_equals_snapshot,
):
    request = graphql_query_request_factory(query="{ status }")
    result = middleware(request, start_response)
    start_response.assert_called_once_with(HTTP_STATUS_200_OK, graphql_response_headers)
    assert_json_response_equals_snapshot(result)


def test_complex_query_is_executed_for_post_json_request(
    middleware,
    start_response,
    graphql_query_request_factory,
    graphql_response_headers,
    assert_json_response_equals_snapshot,
):
    request = graphql_query_request_factory(
        query=complex_query, variables=variables, operationName=operation_name
    )
    result = middleware(request, start_response)
    start_response.assert_called_once_with(HTTP_STATUS_200_OK, graphql_response_headers)
    assert_json_response_equals_snapshot(result)


def test_complex_query_without_operation_name_executes_successfully(
    middleware,
    start_response,
    graphql_query_request_factory,
    graphql_response_headers,
    assert_json_response_equals_snapshot,
):
    request = graphql_query_request_factory(query=complex_query, variables=variables)
    result = middleware(request, start_response)
    start_response.assert_called_once_with(HTTP_STATUS_200_OK, graphql_response_headers)
    assert_json_response_equals_snapshot(result)


def test_attempt_execute_complex_query_without_variables_returns_error_json(
    middleware,
    start_response,
    graphql_query_request_factory,
    graphql_response_headers,
    assert_json_response_equals_snapshot,
):
    request = graphql_query_request_factory(
        query=complex_query, operationName=operation_name
    )
    result = middleware(request, start_response)
    start_response.assert_called_once_with(HTTP_STATUS_200_OK, graphql_response_headers)
    assert_json_response_equals_snapshot(result)


def test_attempt_execute_query_without_query_entry_returns_error_json(
    middleware,
    start_response,
    graphql_query_request_factory,
    graphql_response_headers,
    assert_json_response_equals_snapshot,
):
    request = graphql_query_request_factory(variables=variables)
    result = middleware(request, start_response)
    start_response.assert_called_once_with(
        HTTP_STATUS_400_BAD_REQUEST, graphql_response_headers
    )
    assert_json_response_equals_snapshot(result)


def test_attempt_execute_query_with_non_string_query_returns_error_json(
    middleware,
    start_response,
    graphql_query_request_factory,
    graphql_response_headers,
    assert_json_response_equals_snapshot,
):
    request = graphql_query_request_factory(query={"test": "error"})
    result = middleware(request, start_response)
    start_response.assert_called_once_with(
        HTTP_STATUS_400_BAD_REQUEST, graphql_response_headers
    )
    assert_json_response_equals_snapshot(result)


def test_attempt_execute_query_with_invalid_variables_returns_error_json(
    middleware,
    start_response,
    graphql_query_request_factory,
    graphql_response_headers,
    assert_json_response_equals_snapshot,
):
    request = graphql_query_request_factory(query=complex_query, variables="invalid")
    result = middleware(request, start_response)
    start_response.assert_called_once_with(
        HTTP_STATUS_400_BAD_REQUEST, graphql_response_headers
    )
    assert_json_response_equals_snapshot(result)


def test_attempt_execute_query_with_invalid_operation_name_string_returns_error_json(
    middleware,
    start_response,
    graphql_query_request_factory,
    graphql_response_headers,
    assert_json_response_equals_snapshot,
):
    request = graphql_query_request_factory(
        query=complex_query, variables=variables, operationName="otherOperation"
    )
    result = middleware(request, start_response)
    start_response.assert_called_once_with(HTTP_STATUS_200_OK, graphql_response_headers)
    assert_json_response_equals_snapshot(result)


def test_attempt_execute_query_with_invalid_operation_name_type_returns_error_json(
    middleware,
    start_response,
    graphql_query_request_factory,
    graphql_response_headers,
    assert_json_response_equals_snapshot,
):
    request = graphql_query_request_factory(
        query=complex_query, variables=variables, operationName=[1, 2, 3]
    )
    result = middleware(request, start_response)
    start_response.assert_called_once_with(
        HTTP_STATUS_400_BAD_REQUEST, graphql_response_headers
    )
    assert_json_response_equals_snapshot(result)
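The tests above all drive a WSGI callable with a request object and a `start_response` callable, then inspect the status passed to `start_response` and the returned body. A minimal sketch of that calling convention, using a hypothetical stub app rather than ariadne's actual middleware:

```python
import io
import json

# Hypothetical GraphQL-ish WSGI app (NOT ariadne's real middleware): it reads
# the JSON body from environ["wsgi.input"], rejects a non-string "query" with
# 400, and otherwise answers 200 with a JSON payload.
def graphql_stub_app(environ, start_response):
    length = int(environ.get("CONTENT_LENGTH", 0))
    data = json.loads(environ["wsgi.input"].read(length))
    headers = [("Content-Type", "application/json")]
    if not isinstance(data.get("query"), str):
        start_response("400 Bad Request", headers)
        return [json.dumps({"errors": [{"message": "query must be a string"}]}).encode()]
    start_response("200 OK", headers)
    return [json.dumps({"data": {"status": True}}).encode()]

# Drive it the way the tests do: fake environ in, capture the status out.
captured = {}

def start_response(status, headers):
    captured["status"] = status

body = b'{"query": "{ status }"}'
environ = {"wsgi.input": io.BytesIO(body), "CONTENT_LENGTH": str(len(body))}
result = graphql_stub_app(environ, start_response)
```

In the real tests the `middleware`, request factory, and snapshot assertion are pytest fixtures; the shape of the call is the same.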
# utipy/string/__init__.py (utipy, MIT)
from .letter_strings import letter_strings
# release/stubs.min/System/__init___parts/_AppDomain.py (ironpython-stubs, MIT)
class _AppDomain:
""" Exposes the public members of the System.AppDomain class to unmanaged code. """
def ZZZ(self):
"""hardcoded/mock instance of the class"""
return _AppDomain()
instance=ZZZ()
"""hardcoded/returns an instance of the class"""
def AppendPrivatePath(self,path):
"""
AppendPrivatePath(self: _AppDomain,path: str)
Provides COM objects with version-independent access to the System.AppDomain.AppendPrivatePath(System.String) method.
path: The name of the directory to be appended to the private path.
"""
pass
def ClearPrivatePath(self):
"""
ClearPrivatePath(self: _AppDomain)
Provides COM objects with version-independent access to the System.AppDomain.ClearPrivatePath method.
"""
pass
def ClearShadowCopyPath(self):
"""
ClearShadowCopyPath(self: _AppDomain)
Provides COM objects with version-independent access to the System.AppDomain.ClearShadowCopyPath method.
"""
pass
def CreateInstance(self,assemblyName,typeName,*__args):
"""
CreateInstance(self: _AppDomain,assemblyName: str,typeName: str) -> ObjectHandle
Provides COM objects with version-independent access to the System.AppDomain.CreateInstance(System.String,System.String) method.
assemblyName: The display name of the assembly. See System.Reflection.Assembly.FullName.
typeName: The fully qualified name of the requested type,including the namespace but not the assembly,as returned by the System.Type.FullName property.
Returns: An object that is a wrapper for the new instance specified by typeName. The return value needs to be unwrapped to access the real object.
CreateInstance(self: _AppDomain,assemblyName: str,typeName: str,activationAttributes: Array[object]) -> ObjectHandle
Provides COM objects with version-independent access to the System.AppDomain.CreateInstance(System.String,System.String,System.Object[]) method overload.
assemblyName: The display name of the assembly. See System.Reflection.Assembly.FullName.
typeName: The fully qualified name of the requested type,including the namespace but not the assembly,as returned by the System.Type.FullName property.
activationAttributes: An array of one or more attributes that can participate in activation. Typically,an array that contains a single System.Runtime.Remoting.Activation.UrlAttribute object.
The System.Runtime.Remoting.Activation.UrlAttribute specifies the URL that is required to activate a remote object.
Returns: An object that is a wrapper for the new instance specified by typeName. The return value needs to be unwrapped to access the real object.
CreateInstance(self: _AppDomain,assemblyName: str,typeName: str,ignoreCase: bool,bindingAttr: BindingFlags,binder: Binder,args: Array[object],culture: CultureInfo,activationAttributes: Array[object],securityAttributes: Evidence) -> ObjectHandle
Provides COM objects with version-independent access to the
System.AppDomain.CreateInstance(System.String,System.String,System.Boolean,System.Reflection.BindingFlags,System.Reflection.Binder,System.Object[],System.Globalization.Cultu
reInfo,System.Object[],System.Security.Policy.Evidence) method overload.
assemblyName: The display name of the assembly. See System.Reflection.Assembly.FullName.
typeName: The fully qualified name of the requested type,including the namespace but not the assembly,as returned by the System.Type.FullName property.
ignoreCase: A Boolean value specifying whether to perform a case-sensitive search or not.
bindingAttr: A combination of zero or more bit flags that affect the search for the typeName constructor. If bindingAttr is zero,a case-sensitive search for public constructors is
conducted.
binder: An object that enables the binding,coercion of argument types,invocation of members,and retrieval of System.Reflection.MemberInfo objects using reflection. If binder is
null,the default binder is used.
args: The arguments to pass to the constructor. This array of arguments must match in number,order,and type the parameters of the constructor to invoke. If the default
constructor is preferred,args must be an empty array or null.
culture: Culture-specific information that governs the coercion of args to the formal types declared for the typeName constructor. If culture is null,the
System.Globalization.CultureInfo for the current thread is used.
activationAttributes: An array of one or more attributes that can participate in activation. Typically,an array that contains a single System.Runtime.Remoting.Activation.UrlAttribute object.
The System.Runtime.Remoting.Activation.UrlAttribute specifies the URL that is required to activate a remote object.
securityAttributes: Information used to authorize creation of typeName.
Returns: An object that is a wrapper for the new instance specified by typeName. The return value needs to be unwrapped to access the real object.
"""
pass
def CreateInstanceFrom(self,assemblyFile,typeName,*__args):
"""
CreateInstanceFrom(self: _AppDomain,assemblyFile: str,typeName: str) -> ObjectHandle
Provides COM objects with version-independent access to the System.AppDomain.CreateInstanceFrom(System.String,System.String) method overload.
assemblyFile: The name,including the path,of a file that contains an assembly that defines the requested type. The assembly is loaded using the
System.Reflection.Assembly.LoadFrom(System.String) method.
typeName: The fully qualified name of the requested type,including the namespace but not the assembly,as returned by the System.Type.FullName property.
Returns: An object that is a wrapper for the new instance,or null if typeName is not found. The return value needs to be unwrapped to access the real object.
CreateInstanceFrom(self: _AppDomain,assemblyFile: str,typeName: str,activationAttributes: Array[object]) -> ObjectHandle
Provides COM objects with version-independent access to the System.AppDomain.CreateInstanceFrom(System.String,System.String,System.Object[]) method overload.
assemblyFile: The name,including the path,of a file that contains an assembly that defines the requested type. The assembly is loaded using the
System.Reflection.Assembly.LoadFrom(System.String) method.
typeName: The fully qualified name of the requested type,including the namespace but not the assembly,as returned by the System.Type.FullName property.
activationAttributes: An array of one or more attributes that can participate in activation. Typically,an array that contains a single System.Runtime.Remoting.Activation.UrlAttribute object.
The System.Runtime.Remoting.Activation.UrlAttribute specifies the URL that is required to activate a remote object.
Returns: An object that is a wrapper for the new instance,or null if typeName is not found. The return value needs to be unwrapped to access the real object.
CreateInstanceFrom(self: _AppDomain,assemblyFile: str,typeName: str,ignoreCase: bool,bindingAttr: BindingFlags,binder: Binder,args: Array[object],culture: CultureInfo,activationAttributes: Array[object],securityAttributes: Evidence) -> ObjectHandle
Provides COM objects with version-independent access to the
System.AppDomain.CreateInstanceFrom(System.String,System.String,System.Boolean,System.Reflection.BindingFlags,System.Reflection.Binder,System.Object[],System.Globalization.C
ultureInfo,System.Object[],System.Security.Policy.Evidence) method overload.
assemblyFile: The name,including the path,of a file that contains an assembly that defines the requested type. The assembly is loaded using the
System.Reflection.Assembly.LoadFrom(System.String) method.
typeName: The fully qualified name of the requested type,including the namespace but not the assembly,as returned by the System.Type.FullName property.
ignoreCase: A Boolean value specifying whether to perform a case-sensitive search or not.
bindingAttr: A combination of zero or more bit flags that affect the search for the typeName constructor. If bindingAttr is zero,a case-sensitive search for public constructors is
conducted.
binder: An object that enables the binding,coercion of argument types,invocation of members,and retrieval of System.Reflection.MemberInfo objects through reflection. If binder
is null,the default binder is used.
args: The arguments to pass to the constructor. This array of arguments must match in number,order,and type the parameters of the constructor to invoke. If the default
constructor is preferred,args must be an empty array or null.
culture: Culture-specific information that governs the coercion of args to the formal types declared for the typeName constructor. If culture is null,the
System.Globalization.CultureInfo for the current thread is used.
activationAttributes: An array of one or more attributes that can participate in activation. Typically,an array that contains a single System.Runtime.Remoting.Activation.UrlAttribute object.
The System.Runtime.Remoting.Activation.UrlAttribute specifies the URL that is required to activate a remote object.
securityAttributes: Information used to authorize creation of typeName.
Returns: An object that is a wrapper for the new instance,or null if typeName is not found. The return value needs to be unwrapped to access the real object.
"""
pass
def DefineDynamicAssembly(self,name,access,*__args):
"""
DefineDynamicAssembly(self: _AppDomain,name: AssemblyName,access: AssemblyBuilderAccess) -> AssemblyBuilder
Provides COM objects with version-independent access to the
System.AppDomain.DefineDynamicAssembly(System.Reflection.AssemblyName,System.Reflection.Emit.AssemblyBuilderAccess) method overload.
name: The unique identity of the dynamic assembly.
access: The access mode for the dynamic assembly.
Returns: Represents the dynamic assembly created.
DefineDynamicAssembly(self: _AppDomain,name: AssemblyName,access: AssemblyBuilderAccess,dir: str) -> AssemblyBuilder
Provides COM objects with version-independent access to the
System.AppDomain.DefineDynamicAssembly(System.Reflection.AssemblyName,System.Reflection.Emit.AssemblyBuilderAccess,System.String) method overload.
name: The unique identity of the dynamic assembly.
access: The mode in which the dynamic assembly will be accessed.
dir: The name of the directory where the assembly will be saved. If dir is null,the directory defaults to the current directory.
Returns: Represents the dynamic assembly created.
DefineDynamicAssembly(self: _AppDomain,name: AssemblyName,access: AssemblyBuilderAccess,evidence: Evidence) -> AssemblyBuilder
Provides COM objects with version-independent access to the
System.AppDomain.DefineDynamicAssembly(System.Reflection.AssemblyName,System.Reflection.Emit.AssemblyBuilderAccess,System.Security.Policy.Evidence) method overload.
name: The unique identity of the dynamic assembly.
access: The mode in which the dynamic assembly will be accessed.
evidence: The evidence supplied for the dynamic assembly. The evidence is used unaltered as the final set of evidence used for policy resolution.
Returns: Represents the dynamic assembly created.
DefineDynamicAssembly(self: _AppDomain,name: AssemblyName,access: AssemblyBuilderAccess,requiredPermissions: PermissionSet,optionalPermissions: PermissionSet,refusedPermissions: PermissionSet) -> AssemblyBuilder
Provides COM objects with version-independent access to the
System.AppDomain.DefineDynamicAssembly(System.Reflection.AssemblyName,System.Reflection.Emit.AssemblyBuilderAccess,System.Security.PermissionSet,System.Security.PermissionSe
t,System.Security.PermissionSet) method overload.
name: The unique identity of the dynamic assembly.
access: The mode in which the dynamic assembly will be accessed.
requiredPermissions: The required permissions request.
optionalPermissions: The optional permissions request.
refusedPermissions: The refused permissions request.
Returns: Represents the dynamic assembly created.
DefineDynamicAssembly(self: _AppDomain,name: AssemblyName,access: AssemblyBuilderAccess,dir: str,evidence: Evidence) -> AssemblyBuilder
Provides COM objects with version-independent access to the
System.AppDomain.DefineDynamicAssembly(System.Reflection.AssemblyName,System.Reflection.Emit.AssemblyBuilderAccess,System.String,System.Security.Policy.Evidence) method
overload.
name: The unique identity of the dynamic assembly.
access: The mode in which the dynamic assembly will be accessed.
dir: The name of the directory where the assembly will be saved. If dir is null,the directory defaults to the current directory.
evidence: The evidence supplied for the dynamic assembly. The evidence is used unaltered as the final set of evidence used for policy resolution.
Returns: Represents the dynamic assembly created.
DefineDynamicAssembly(self: _AppDomain,name: AssemblyName,access: AssemblyBuilderAccess,dir: str,requiredPermissions: PermissionSet,optionalPermissions: PermissionSet,refusedPermissions: PermissionSet) -> AssemblyBuilder
Provides COM objects with version-independent access to the
System.AppDomain.DefineDynamicAssembly(System.Reflection.AssemblyName,System.Reflection.Emit.AssemblyBuilderAccess,System.String,System.Security.PermissionSet,System.Securit
y.PermissionSet,System.Security.PermissionSet) method overload.
name: The unique identity of the dynamic assembly.
access: The mode in which the dynamic assembly will be accessed.
dir: The name of the directory where the assembly will be saved. If dir is null,the directory defaults to the current directory.
requiredPermissions: The required permissions request.
optionalPermissions: The optional permissions request.
refusedPermissions: The refused permissions request.
Returns: Represents the dynamic assembly created.
DefineDynamicAssembly(self: _AppDomain,name: AssemblyName,access: AssemblyBuilderAccess,evidence: Evidence,requiredPermissions: PermissionSet,optionalPermissions: PermissionSet,refusedPermissions: PermissionSet) -> AssemblyBuilder
Provides COM objects with version-independent access to the
System.AppDomain.DefineDynamicAssembly(System.Reflection.AssemblyName,System.Reflection.Emit.AssemblyBuilderAccess,System.Security.Policy.Evidence,System.Security.Permission
Set,System.Security.PermissionSet,System.Security.PermissionSet) method overload.
name: The unique identity of the dynamic assembly.
access: The mode in which the dynamic assembly will be accessed.
evidence: The evidence supplied for the dynamic assembly. The evidence is used unaltered as the final set of evidence used for policy resolution.
requiredPermissions: The required permissions request.
optionalPermissions: The optional permissions request.
refusedPermissions: The refused permissions request.
Returns: Represents the dynamic assembly created.
DefineDynamicAssembly(self: _AppDomain,name: AssemblyName,access: AssemblyBuilderAccess,dir: str,evidence: Evidence,requiredPermissions: PermissionSet,optionalPermissions: PermissionSet,refusedPermissions: PermissionSet) -> AssemblyBuilder
Provides COM objects with version-independent access to the
System.AppDomain.DefineDynamicAssembly(System.Reflection.AssemblyName,System.Reflection.Emit.AssemblyBuilderAccess,System.String,System.Security.Policy.Evidence,System.Secur
ity.PermissionSet,System.Security.PermissionSet,System.Security.PermissionSet) method overload.
name: The unique identity of the dynamic assembly.
access: The mode in which the dynamic assembly will be accessed.
dir: The name of the directory where the assembly will be saved. If dir is null,the directory defaults to the current directory.
evidence: The evidence supplied for the dynamic assembly. The evidence is used unaltered as the final set of evidence used for policy resolution.
requiredPermissions: The required permissions request.
optionalPermissions: The optional permissions request.
refusedPermissions: The refused permissions request.
Returns: Represents the dynamic assembly created.
DefineDynamicAssembly(self: _AppDomain,name: AssemblyName,access: AssemblyBuilderAccess,dir: str,evidence: Evidence,requiredPermissions: PermissionSet,optionalPermissions: PermissionSet,refusedPermissions: PermissionSet,isSynchronized: bool) -> AssemblyBuilder
Provides COM objects with version-independent access to the
System.AppDomain.DefineDynamicAssembly(System.Reflection.AssemblyName,System.Reflection.Emit.AssemblyBuilderAccess,System.String,System.Security.Policy.Evidence,System.Secur
ity.PermissionSet,System.Security.PermissionSet,System.Security.PermissionSet,System.Boolean) method overload.
name: The unique identity of the dynamic assembly.
access: The mode in which the dynamic assembly will be accessed.
dir: The name of the directory where the dynamic assembly will be saved. If dir is null,the directory defaults to the current directory.
evidence: The evidence supplied for the dynamic assembly. The evidence is used unaltered as the final set of evidence used for policy resolution.
requiredPermissions: The required permissions request.
optionalPermissions: The optional permissions request.
refusedPermissions: The refused permissions request.
isSynchronized: true to synchronize the creation of modules,types,and members in the dynamic assembly; otherwise,false.
Returns: Represents the dynamic assembly created.
"""
pass
def DoCallBack(self,theDelegate):
"""
DoCallBack(self: _AppDomain,theDelegate: CrossAppDomainDelegate)
Provides COM objects with version-independent access to the System.AppDomain.DoCallBack(System.CrossAppDomainDelegate) method.
theDelegate: A delegate that specifies a method to call.
"""
pass
def Equals(self,other):
"""
Equals(self: _AppDomain,other: object) -> bool
Provides COM objects with version-independent access to the inherited System.Object.Equals(System.Object) method.
other: The System.Object to compare to the current System.Object.
Returns: true if the specified System.Object is equal to the current System.Object; otherwise,false.
"""
pass
def ExecuteAssembly(self,assemblyFile,assemblySecurity=None,args=None):
"""
ExecuteAssembly(self: _AppDomain,assemblyFile: str,assemblySecurity: Evidence) -> int
Provides COM objects with version-independent access to the System.AppDomain.ExecuteAssembly(System.String,System.Security.Policy.Evidence) method overload.
assemblyFile: The name of the file that contains the assembly to execute.
assemblySecurity: Evidence for loading the assembly.
Returns: The value returned by the entry point of the assembly.
ExecuteAssembly(self: _AppDomain,assemblyFile: str) -> int
Provides COM objects with version-independent access to the System.AppDomain.ExecuteAssembly(System.String) method overload.
assemblyFile: The name of the file that contains the assembly to execute.
Returns: The value returned by the entry point of the assembly.
ExecuteAssembly(self: _AppDomain,assemblyFile: str,assemblySecurity: Evidence,args: Array[str]) -> int
Provides COM objects with version-independent access to the System.AppDomain.ExecuteAssembly(System.String,System.Security.Policy.Evidence,System.String[]) method overload.
assemblyFile: The name of the file that contains the assembly to execute.
assemblySecurity: The supplied evidence for the assembly.
args: The arguments to the entry point of the assembly.
Returns: The value returned by the entry point of the assembly.
"""
pass
def GetAssemblies(self):
"""
GetAssemblies(self: _AppDomain) -> Array[Assembly]
Provides COM objects with version-independent access to the System.AppDomain.GetAssemblies method.
Returns: An array of assemblies in this application domain.
"""
pass
def GetData(self,name):
"""
GetData(self: _AppDomain,name: str) -> object
Provides COM objects with version-independent access to the System.AppDomain.GetData(System.String) method.
name: The name of a predefined application domain property,or the name of an application domain property you have defined.
Returns: The value of the name property.
"""
pass
def GetHashCode(self):
"""
GetHashCode(self: _AppDomain) -> int
Provides COM objects with version-independent access to the inherited System.Object.GetHashCode method.
Returns: A hash code for the current System.Object.
"""
pass
def GetIDsOfNames(self,riid,rgszNames,cNames,lcid,rgDispId):
"""
GetIDsOfNames(self: _AppDomain,riid: Guid,rgszNames: IntPtr,cNames: UInt32,lcid: UInt32,rgDispId: IntPtr) -> Guid
Maps a set of names to a corresponding set of dispatch identifiers.
riid: Reserved for future use. Must be IID_NULL.
rgszNames: Passed-in array of names to be mapped.
cNames: Count of the names to be mapped.
lcid: The locale context in which to interpret the names.
rgDispId: Caller-allocated array which receives the IDs corresponding to the names.
"""
pass
def GetLifetimeService(self):
"""
GetLifetimeService(self: _AppDomain) -> object
Provides COM objects with version-independent access to the inherited System.MarshalByRefObject.GetLifetimeService method.
Returns: An object of type System.Runtime.Remoting.Lifetime.ILease used to control the lifetime policy for this instance.
"""
pass
def GetType(self):
"""
GetType(self: _AppDomain) -> Type
Provides COM objects with version-independent access to the System.AppDomain.GetType method.
Returns: A System.Type representing the type of the current instance.
"""
pass
def GetTypeInfo(self,iTInfo,lcid,ppTInfo):
"""
GetTypeInfo(self: _AppDomain,iTInfo: UInt32,lcid: UInt32,ppTInfo: IntPtr)
Retrieves the type information for an object,which can then be used to get the type information for an interface.
iTInfo: The type information to return.
lcid: The locale identifier for the type information.
ppTInfo: Receives a pointer to the requested type information object.
"""
pass
def GetTypeInfoCount(self,pcTInfo):
"""
GetTypeInfoCount(self: _AppDomain) -> UInt32
Retrieves the number of type information interfaces that an object provides (either 0 or 1).
"""
pass
def InitializeLifetimeService(self):
"""
InitializeLifetimeService(self: _AppDomain) -> object
Provides COM objects with version-independent access to the System.AppDomain.InitializeLifetimeService method.
Returns: Always null.
"""
pass
def Invoke(self,dispIdMember,riid,lcid,wFlags,pDispParams,pVarResult,pExcepInfo,puArgErr):
"""
Invoke(self: _AppDomain,dispIdMember: UInt32,riid: Guid,lcid: UInt32,wFlags: Int16,pDispParams: IntPtr,pVarResult: IntPtr,pExcepInfo: IntPtr,puArgErr: IntPtr) -> Guid
Provides access to properties and methods exposed by an object.
dispIdMember: Identifies the member.
riid: Reserved for future use. Must be IID_NULL.
lcid: The locale context in which to interpret arguments.
wFlags: Flags describing the context of the call.
pDispParams: Pointer to a structure containing an array of arguments,an array of argument DISPIDs for named arguments,and counts for the number of elements in the arrays.
pVarResult: Pointer to the location where the result is to be stored.
pExcepInfo: Pointer to a structure that contains exception information.
puArgErr: The index of the first argument that has an error.
"""
pass
def Load(self,*__args):
"""
Load(self: _AppDomain,assemblyRef: AssemblyName) -> Assembly
Provides COM objects with version-independent access to the System.AppDomain.Load(System.Reflection.AssemblyName) method overload.
assemblyRef: An object that describes the assembly to load.
Returns: The loaded assembly.
Load(self: _AppDomain,assemblyString: str) -> Assembly
Provides COM objects with version-independent access to the System.AppDomain.Load(System.String) method overload.
assemblyString: The display name of the assembly. See System.Reflection.Assembly.FullName.
Returns: The loaded assembly.
Load(self: _AppDomain,rawAssembly: Array[Byte]) -> Assembly
Provides COM objects with version-independent access to the System.AppDomain.Load(System.Byte[]) method overload.
rawAssembly: An array of type byte that is a COFF-based image containing an emitted assembly.
Returns: The loaded assembly.
Load(self: _AppDomain,rawAssembly: Array[Byte],rawSymbolStore: Array[Byte]) -> Assembly
Provides COM objects with version-independent access to the System.AppDomain.Load(System.Byte[],System.Byte[]) method overload.
rawAssembly: An array of type byte that is a COFF-based image containing an emitted assembly.
rawSymbolStore: An array of type byte containing the raw bytes representing the symbols for the assembly.
Returns: The loaded assembly.
Load(self: _AppDomain,rawAssembly: Array[Byte],rawSymbolStore: Array[Byte],securityEvidence: Evidence) -> Assembly
Provides COM objects with version-independent access to the System.AppDomain.Load(System.Byte[],System.Byte[],System.Security.Policy.Evidence) method overload.
rawAssembly: An array of type byte that is a COFF-based image containing an emitted assembly.
rawSymbolStore: An array of type byte containing the raw bytes representing the symbols for the assembly.
securityEvidence: Evidence for loading the assembly.
Returns: The loaded assembly.
Load(self: _AppDomain,assemblyRef: AssemblyName,assemblySecurity: Evidence) -> Assembly
Provides COM objects with version-independent access to the System.AppDomain.Load(System.Reflection.AssemblyName,System.Security.Policy.Evidence) method overload.
assemblyRef: An object that describes the assembly to load.
assemblySecurity: Evidence for loading the assembly.
Returns: The loaded assembly.
Load(self: _AppDomain,assemblyString: str,assemblySecurity: Evidence) -> Assembly
Provides COM objects with version-independent access to the System.AppDomain.Load(System.String,System.Security.Policy.Evidence) method overload.
assemblyString: The display name of the assembly. See System.Reflection.Assembly.FullName.
assemblySecurity: Evidence for loading the assembly.
Returns: The loaded assembly.
"""
pass
def SetAppDomainPolicy(self,domainPolicy):
"""
SetAppDomainPolicy(self: _AppDomain,domainPolicy: PolicyLevel)
Provides COM objects with version-independent access to the System.AppDomain.SetAppDomainPolicy(System.Security.Policy.PolicyLevel) method.
domainPolicy: The security policy level.
"""
pass
def SetCachePath(self,s):
"""
SetCachePath(self: _AppDomain,s: str)
Provides COM objects with version-independent access to the System.AppDomain.SetCachePath(System.String) method.
s: The fully qualified path to the shadow copy location.
"""
pass
def SetData(self,name,data):
"""
SetData(self: _AppDomain,name: str,data: object)
Provides COM objects with version-independent access to the System.AppDomain.SetData(System.String,System.Object) method.
name: The name of a user-defined application domain property to create or change.
data: The value of the property.
"""
pass
def SetPrincipalPolicy(self,policy):
"""
SetPrincipalPolicy(self: _AppDomain,policy: PrincipalPolicy)
Provides COM objects with version-independent access to the System.AppDomain.SetPrincipalPolicy(System.Security.Principal.PrincipalPolicy) method.
policy: One of the System.Security.Principal.PrincipalPolicy values that specifies the type of the principal object to attach to threads.
"""
pass
def SetShadowCopyPath(self,s):
"""
SetShadowCopyPath(self: _AppDomain,s: str)
Provides COM objects with version-independent access to the System.AppDomain.SetShadowCopyPath(System.String) method.
s: A list of directory names, where each name is separated by a semicolon.
"""
pass
def SetThreadPrincipal(self,principal):
"""
SetThreadPrincipal(self: _AppDomain,principal: IPrincipal)
Provides COM objects with version-independent access to the System.AppDomain.SetThreadPrincipal(System.Security.Principal.IPrincipal) method.
principal: The principal object to attach to threads.
"""
pass
def ToString(self):
"""
ToString(self: _AppDomain) -> str
Provides COM objects with version-independent access to the System.AppDomain.ToString method.
Returns: A string formed by concatenating the literal string "Name:", the friendly name of the application domain, and either string representations of the context policies or the string "There are no context policies."
"""
pass
def __eq__(self,*args):
""" x.__eq__(y) <==> x==y """
pass
def __init__(self,*args):
        """ x.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
    def __str__(self,*args):
        """ x.__str__() <==> str(x) """
        pass
BaseDirectory=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Provides COM objects with version-independent access to the System.AppDomain.BaseDirectory property.
Get: BaseDirectory(self: _AppDomain) -> str
"""
DynamicDirectory=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Provides COM objects with version-independent access to the System.AppDomain.DynamicDirectory property.
Get: DynamicDirectory(self: _AppDomain) -> str
"""
Evidence=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Provides COM objects with version-independent access to the System.AppDomain.Evidence property.
Get: Evidence(self: _AppDomain) -> Evidence
"""
FriendlyName=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Provides COM objects with version-independent access to the System.AppDomain.FriendlyName property.
Get: FriendlyName(self: _AppDomain) -> str
"""
RelativeSearchPath=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Provides COM objects with version-independent access to the System.AppDomain.RelativeSearchPath property.
Get: RelativeSearchPath(self: _AppDomain) -> str
"""
ShadowCopyFiles=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Provides COM objects with version-independent access to the System.AppDomain.ShadowCopyFiles property.
Get: ShadowCopyFiles(self: _AppDomain) -> bool
"""
AssemblyLoad=None
AssemblyResolve=None
DomainUnload=None
ProcessExit=None
ResourceResolve=None
TypeResolve=None
UnhandledException=None
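# Usage sketch (assumption: an IronPython / Python.NET host where the
# System assembly is loadable -- this stub only describes the COM
# interface, it does not implement it). SetData and GetData round-trip a
# user-defined property on the current application domain:
#
#     from System import AppDomain
#     domain = AppDomain.CurrentDomain
#     domain.SetData("answer", 42)
#     assert domain.GetData("answer") == 42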
# tests/logic/test_ParityGenerator8.py -- jamesjiang52/Bitwise (MIT License)
import bitwise as bw
class TestParityGenerator8:
def test_ParityGenerator8(self):
input_1 = bw.wire.Wire()
input_2 = bw.wire.Wire()
input_3 = bw.wire.Wire()
input_4 = bw.wire.Wire()
input_5 = bw.wire.Wire()
input_6 = bw.wire.Wire()
input_7 = bw.wire.Wire()
input_8 = bw.wire.Wire()
parity_bit = bw.wire.Wire()
input_bus = bw.wire.Bus8(
input_1,
input_2,
input_3,
input_4,
input_5,
input_6,
input_7,
input_8
)
a = bw.logic.ParityGenerator8(input_bus, parity_bit)
        wires = [input_1, input_2, input_3, input_4,
                 input_5, input_6, input_7, input_8]

        # The exhaustive checks walk a run of consecutive 1s of every
        # length (0 through 8) across the bus; the expected parity bit is
        # simply the mod-2 count of set inputs.
        for num_set in range(9):
            for start in range(9 - num_set):
                for i, wire in enumerate(wires):
                    wire.value = 1 if start <= i < start + num_set else 0
                assert parity_bit.value == num_set % 2
print(a.__doc__)
print(a)
a(
input_bus=(0, 1, 1, 1, 1, 1, 1, 1),
parity_bit=None
)
assert parity_bit.value == 1
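# Reference model (sketch; the name expected_parity is not part of the
# bitwise package): an even-parity generator's output is just the mod-2
# sum, i.e. the XOR, of its inputs, which is handy for spot-checking
# vectors outside the wire model.
def expected_parity(bits):
    """Return the parity bit an 8-input generator should emit."""
    return sum(bits) % 2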
# auth/tests/unit/routers/test_register.py -- wattlecloud/foundation-server (Apache-2.0)
import base64
from datetime import datetime
import json
from uuid import uuid4
from fastapi import status
from jwt.exceptions import ExpiredSignatureError, PyJWTError
from wattle.core.const import Task
from wattle.core.models.db.user import User
from wattle.core.models.py.auth import VerifyEmailModel, VerifyModel
from wattle.core.models.py.user import UserCreateModel
from wattle.auth.worker import worker
from wattle.core.exceptions import CoreError
from wattle.core.const import CoreErrorType
from pydantic.main import ValidationError
def test_register_request(test_client, mock_session, csrftoken, mocker):
mocker.patch.object(User, "create")
user = User(
id=uuid4(),
email="test@test.com",
verified=False,
first_name="first",
last_name="last",
created_at=datetime.now(),
updated_at=datetime.now(),
)
User.create.return_value = user
mock_session.return_value = "session"
mocker.patch.object(worker, "send_task")
request_data = UserCreateModel(
first_name="first",
last_name="last",
password="password",
confirm_password="password",
email=user.email,
)
response = test_client.post(
"/register",
json=request_data.dict(),
headers={"x-csrftoken": csrftoken},
cookies={"csrftoken": csrftoken},
)
assert response.status_code == status.HTTP_204_NO_CONTENT
User.create.assert_called_with("session", request_data)
worker.send_task.assert_called_with(Task.Auth.EMAIL_VERIFY, args=["test@test.com"])
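# Every request in this module follows the same double-submit CSRF
# pattern: the token travels both as the "x-csrftoken" header and the
# "csrftoken" cookie, and the server checks that the two match. A small
# helper (hypothetical name, sketch only -- the tests keep the explicit
# form) makes the pattern visible in one place:
def post_with_csrf(client, url, payload, token):
    """POST JSON with the CSRF token in both header and cookie."""
    return client.post(
        url,
        json=payload,
        headers={"x-csrftoken": token},
        cookies={"csrftoken": token},
    )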
def test_register_request_password_mismatch(test_client, mock_session, csrftoken, mocker):
mocker.patch.object(User, "create")
user = User(
id=uuid4(),
email="test@test.com",
verified=False,
first_name="first",
last_name="last",
created_at=datetime.now(),
updated_at=datetime.now(),
)
User.create.return_value = user
mock_session.return_value = "session"
mocker.patch.object(worker, "send_task")
response = test_client.post(
"/register",
json={
"first_name": "first",
"last_name": "last",
"password": "password",
"confirm_password": "wrong_password",
"email": user.email,
},
headers={"x-csrftoken": csrftoken},
cookies={"csrftoken": csrftoken},
)
assert response.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY
User.create.assert_not_called()
worker.send_task.assert_not_called()
def test_register_request_email_conflict(test_client, mock_session, csrftoken, mocker):
mocker.patch.object(User, "create")
user = User(
id=uuid4(),
email="test@test.com",
verified=False,
first_name="first",
last_name="last",
created_at=datetime.now(),
updated_at=datetime.now(),
)
User.create.side_effect = CoreError(error_type=CoreErrorType.EMAIL_CONFLICT)
mock_session.return_value = "session"
mocker.patch.object(worker, "send_task")
request_data = UserCreateModel(
first_name="first",
last_name="last",
password="password",
confirm_password="password",
email=user.email,
)
response = test_client.post(
"/register",
json=request_data.dict(),
headers={"x-csrftoken": csrftoken},
cookies={"csrftoken": csrftoken},
)
assert response.status_code == status.HTTP_204_NO_CONTENT
User.create.assert_called_with("session", request_data)
worker.send_task.assert_not_called()
def test_verify_email_request(test_client, mock_session, csrftoken, mocker):
mocker.patch.object(User, "get_by_email")
user = User(id=uuid4(), email="test@test.com", verified=False)
User.get_by_email.return_value = user
mock_session.return_value = "session"
mocker.patch.object(worker, "send_task")
request_data = VerifyEmailModel(email=user.email)
response = test_client.post(
"/verify/email",
json=request_data.dict(),
headers={"x-csrftoken": csrftoken},
cookies={"csrftoken": csrftoken},
)
assert response.status_code == status.HTTP_204_NO_CONTENT
User.get_by_email.assert_called_with("session", request_data.email)
worker.send_task.assert_called_with(Task.Auth.EMAIL_VERIFY, args=[user.email])
def test_verify_email_not_exist_request(test_client, mock_session, csrftoken, mocker):
mocker.patch.object(User, "get_by_email")
User.get_by_email.return_value = None
mock_session.return_value = "session"
mocker.patch.object(worker, "send_task")
request_data = VerifyEmailModel(email="test@test.com")
response = test_client.post(
"/verify/email",
json=request_data.dict(),
headers={"x-csrftoken": csrftoken},
cookies={"csrftoken": csrftoken},
)
assert response.status_code == status.HTTP_204_NO_CONTENT
User.get_by_email.assert_called_with("session", request_data.email)
worker.send_task.assert_not_called()
def test_verify_email_already_verified_request(
test_client, mock_session, csrftoken, mocker
):
mocker.patch.object(User, "get_by_email")
user = User(id=uuid4(), email="test@test.com", verified=True)
User.get_by_email.return_value = user
mock_session.return_value = "session"
mocker.patch.object(worker, "send_task")
request_data = VerifyEmailModel(email=user.email)
response = test_client.post(
"/verify/email",
json=request_data.dict(),
headers={"x-csrftoken": csrftoken},
cookies={"csrftoken": csrftoken},
)
assert response.status_code == status.HTTP_204_NO_CONTENT
User.get_by_email.assert_called_with("session", request_data.email)
worker.send_task.assert_not_called()
def test_verify_request(test_client, mock_session, csrftoken, mocker):
mocker.patch.object(User, "get_by_id")
mocker.patch.object(User, "verify")
user_id = uuid4()
mocker.patch(
"wattle.auth.routers.register.decode", return_value={"sub": str(user_id)}
)
mock_session.return_value = "session"
user = User(id=user_id, email="test@test.com", verified=True)
mocker.patch.object(user, "verify")
User.get_by_id.return_value = user
response = test_client.post(
"/verify",
json=VerifyModel(token="token").dict(),
headers={"x-csrftoken": csrftoken},
cookies={"csrftoken": csrftoken},
)
assert response.status_code == status.HTTP_204_NO_CONTENT
User.get_by_id.assert_called_with("session", user.id)
user.verify.assert_called()
def test_verify_expired_token_request(test_client, mock_session, csrftoken, mocker):
mocker.patch.object(User, "get_by_id")
mocker.patch.object(User, "verify")
user_id = uuid4()
mocker.patch(
"wattle.auth.routers.register.decode", side_effect=ExpiredSignatureError
)
mock_session.return_value = "session"
user = User(id=user_id, email="test@test.com", verified=True)
mocker.patch.object(user, "verify")
response = test_client.post(
"/verify",
json=VerifyModel(token="token").dict(),
headers={"x-csrftoken": csrftoken},
cookies={"csrftoken": csrftoken},
)
assert response.status_code == status.HTTP_401_UNAUTHORIZED
User.get_by_id.assert_not_called()
user.verify.assert_not_called()
def test_verify_invalid_credentials_jwt_request(
test_client, mock_session, csrftoken, mocker
):
mocker.patch.object(User, "get_by_id")
mocker.patch.object(User, "verify")
user_id = uuid4()
mocker.patch("wattle.auth.routers.register.decode", side_effect=PyJWTError)
mock_session.return_value = "session"
user = User(id=user_id, email="test@test.com", verified=True)
mocker.patch.object(user, "verify")
response = test_client.post(
"/verify",
json=VerifyModel(token="token").dict(),
headers={"x-csrftoken": csrftoken},
cookies={"csrftoken": csrftoken},
)
assert response.status_code == status.HTTP_401_UNAUTHORIZED
User.get_by_id.assert_not_called()
user.verify.assert_not_called()
def test_verify_no_user_request(test_client, mock_session, csrftoken, mocker):
mocker.patch.object(User, "get_by_id")
mocker.patch.object(User, "verify")
user_id = uuid4()
mocker.patch(
"wattle.auth.routers.register.decode", return_value={"sub": str(user_id)}
)
mock_session.return_value = "session"
user = None
User.get_by_id.return_value = user
response = test_client.post(
"/verify",
json=VerifyModel(token="token").dict(),
headers={"x-csrftoken": csrftoken},
cookies={"csrftoken": csrftoken},
)
assert response.status_code == status.HTTP_401_UNAUTHORIZED
User.get_by_id.assert_called_with("session", user_id)
# Concrete/concrete_beam_classes.py -- hotmailbox/Structural-Engineering (BSD-3-Clause)
'''
BSD 3-Clause License
Copyright (c) 2019, Donald N. Bockoven III
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
* Neither the name of the copyright holder nor the names of its
contributors may be used to endorse or promote products derived from
this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
'''
from __future__ import division
import math
from sys import version_info
class reinforcement:
def __init__(self,fy_ksi):
self.bar = {0:[0,0], 3:[0.375,0.11], 4: [0.5,0.2], 5: [0.625,0.31], 6: [0.75,0.44], 7: [0.875,0.6], 8: [1,0.79], 9: [1.128,1], 10: [1.27,1.27], 11: [1.41,1.56], 14: [1.693,2.25], 18: [2.257,4]}
self.fy_ksi = float(fy_ksi)
self.fy_psi = self.fy_ksi * 1000
self.Es_ksi = 29000
self.Es_psi = self.Es_ksi * 1000
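# The bar table above hard-codes each standard size's [nominal diameter
# (in), area (in^2)]. As a spot check, the tabulated areas track the
# circular area pi*d^2/4 to within about 0.005 in^2 -- e.g. a #8 bar
# (d = 1.0 in) gives 0.785 against the listed 0.79. (Helper name is
# illustrative, not part of the original module.)
def _nominal_bar_area_in2(diameter_in):
    """Circular area implied by a bar's nominal diameter."""
    return math.pi * diameter_in ** 2 / 4.0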
class rect_beam:
def __init__(self, b_in, h_in, f_prime_c_psi,density_pcf):
self.b_in = float(b_in)
self.h_in = float(h_in)
self.f_prime_c_psi = float(f_prime_c_psi)
self.Ig_in4 = (self.b_in * self.h_in**3)/12
if 90 <= density_pcf <= 160:
self.Ec_psi = density_pcf**1.5 * 33 * self.f_prime_c_psi**(1/2.0)
        else:
            raise ValueError('Density of Concrete must be between 90 and 160 PCF')
self.weight_plf = (self.b_in*self.h_in) * (1/144) * density_pcf
self.weight_klf = self.weight_plf/1000.0
self.fr_psi = 7.5 * self.f_prime_c_psi**(1/2.0)
self.Mcrack_inlbs = (self.fr_psi*self.Ig_in4)/(self.h_in/2)
self.Mcrack_ftlbs = self.Mcrack_inlbs/12.0
self.Mcrack_ftkips = self.Mcrack_ftlbs/1000.0
if self.f_prime_c_psi <= 4000:
self.beta1 = 0.85
elif self.f_prime_c_psi <= 8000:
self.beta1 = 0.85 - ((0.05*(self.f_prime_c_psi-4000))/1000)
else:
self.beta1 = 0.65
#section vertex coordinates to make plotting easier
self.section_x_coords_in = [0,self.b_in,self.b_in,0,0]
self.section_y_coords_in = [0,0,self.h_in,self.h_in,0]
def as_min(self,d_in,fy_psi):
return max((3 * self.f_prime_c_psi**(1/2.0) * self.b_in * d_in)/fy_psi, (200.0 * self.b_in * d_in)/fy_psi)
def max_bars_layer(self,flexural_bar, cover_in, shear_bar, aggregate_size_in):
self.first_interior_bar_in = cover_in + shear_bar[0] + 2*shear_bar[0]
min_spacing = max(1,flexural_bar[0],1.33*aggregate_size_in)+flexural_bar[0]
num_interior_bars = 1 + ((self.b_in - 2*self.first_interior_bar_in)/min_spacing)
return math.floor(num_interior_bars)
def bar_spacing(self,flexural_bar, aggregate_size_in):
return max(1,flexural_bar[0],1.33*aggregate_size_in)+flexural_bar[0]
def min_bars_bottom_layer(self,reinf_bar,cover_in, shear_bar, fy_psi):
fs = (2/3.0)*fy_psi
cc = cover_in + shear_bar[0]
first_interior_bar_in = cover_in + shear_bar[0] + 2*shear_bar[0]
        # ACI 318-08 Eq. 10-4 crack-control spacing, including the 12*(40,000/fs) cap
        max_spacing = min((15*(40000/fs)) - 2.5*cc, 12*(40000/fs))
        num_interior_bars = 1 + ((self.b_in - 2*first_interior_bar_in)/max_spacing)
        return math.ceil(num_interior_bars)
def flexural_bottom_bars_automatic_by_layers(self,flexural_bars_array,flexural_bar_count_array,cover_in,shear_bar):
bottom_bars_as = []
bottom_bars_d = []
bottom_bars_cg = 0
i=0
for bars in flexural_bar_count_array:
bottom_bars_as.append(bars*flexural_bars_array[i][1])
if i==0:
bottom_bars_d.append(self.h_in - (cover_in + shear_bar[0] + (flexural_bars_array[i][0] * 0.5)))
else:
bottom_bars_d.append(bottom_bars_d[i-1] - (1 + flexural_bars_array[i-1][0] * 0.5 + flexural_bars_array[i][0] * 0.5))
i+=1
total_as = sum(bottom_bars_as)
total_as_d = 0
i=0
for i in range(len(bottom_bars_as)):
total_as_d = total_as_d + (bottom_bars_as[i]*bottom_bars_d[i])
bottom_bars_cg = total_as_d/total_as
return bottom_bars_as, bottom_bars_d, bottom_bars_cg
def flexural_top_bars_automatic_by_layers(self,flexural_bars_array,flexural_bar_count_array,cover_in,shear_bar):
top_bars_as = []
top_bars_d = []
i=0
for bars in flexural_bar_count_array:
top_bars_as.append(bars*flexural_bars_array[i][1])
if i==0:
top_bars_d.append((cover_in + shear_bar[0] + (flexural_bars_array[i][0] * 0.5)))
else:
top_bars_d.append(top_bars_d[i-1] + (1 + flexural_bars_array[i-1][0] * 0.5 + flexural_bars_array[i][0] * 0.5))
i+=1
total_as = sum(top_bars_as)
total_as_d = 0
i=0
for i in range(len(top_bars_as)):
total_as_d = total_as_d + (top_bars_as[i]*top_bars_d[i])
return top_bars_as, top_bars_d
def strain_compatibility_steel(self,bars_as_array,bars_d_array,c_in,fy_psi,Es_psi):
steel_strain = []
steel_stress = []
steel_tension_force_layer_lbs = []
steel_tension_force_lbs = 0
i=0
for i in range(len(bars_as_array)):
steel_strain.append(0.003*((bars_d_array[i]/c_in)-1))
fs = steel_strain[i] * Es_psi
if fs < -1*fy_psi or fs > fy_psi:
steel_stress.append((fs/abs(fs))*fy_psi)
steel_tension_force_layer_lbs.append(((fs/abs(fs))*fy_psi)*bars_as_array[i])
else:
steel_stress.append(fs)
steel_tension_force_layer_lbs.append(fs*bars_as_array[i])
steel_tension_force_lbs = sum(steel_tension_force_layer_lbs)
return steel_strain,steel_stress,steel_tension_force_layer_lbs,steel_tension_force_lbs
def strain_compatibility_concrete(self,c_in):
a = self.beta1 * c_in
compression_block_in2 = self.b_in * a
concrete_compression_force_lbs = 0.85 * self.f_prime_c_psi * compression_block_in2
compression_block_cg_top = a/2
return concrete_compression_force_lbs, compression_block_cg_top
def find_pna(self,bars_as_array,bars_d_array,bars_cg,fy_psi,Es_psi):
a=0
b=bars_d_array[0]
c=0
pna = 0
loop_max = 10000
tol = 0.0000000000001
loop = 0
while loop<loop_max:
c = (a+b)/2
strain_c, stress_c, layer_c, tension_c = self.strain_compatibility_steel(bars_as_array,bars_d_array,c,fy_psi,Es_psi)
compression_c, compression_cg = self.strain_compatibility_concrete(c)
if compression_c == tension_c or (b-a)/2 <= tol:
pna = c
loop = loop_max
elif compression_c > tension_c:
b = c
else:
a = c
loop+=1
return pna
def moment_capacity_inlbs(self,bars_as_array,bars_d_array,bars_cg,c_in,fy_psi,Es_psi):
a = self.beta1 * c_in
compression_block_in2 = self.b_in * a
concrete_compression_force_lbs = 0.85 * self.f_prime_c_psi * compression_block_in2
steel_strain = []
steel_stress = []
steel_tension_force_layer_lbs = []
steel_moment_component_inlbs = 0
i=0
for i in range(len(bars_as_array)):
steel_strain.append(0.003*((bars_d_array[i]/c_in)-1))
fs = steel_strain[i] * Es_psi
if fs < -1*fy_psi or fs > fy_psi:
steel_stress.append((fs/abs(fs))*fy_psi)
steel_tension_force_layer_lbs.append(((fs/abs(fs))*fy_psi)*bars_as_array[i])
steel_moment_component_inlbs = steel_moment_component_inlbs + (steel_tension_force_layer_lbs[i]*(bars_d_array[i]-bars_cg))
else:
steel_stress.append(fs)
steel_tension_force_layer_lbs.append(fs*bars_as_array[i])
steel_moment_component_inlbs = steel_moment_component_inlbs + (steel_tension_force_layer_lbs[i]*(bars_d_array[i]-bars_cg))
if max(steel_strain)<0.004:
return 'Section not Tension Controlled'
else:
if max(steel_strain)>=0.005:
phi = 0.9
else:
phi = 0.65 + ((max(steel_strain)-0.002)*(250/3))
concrete_moment_arm_in = bars_cg - (a/2)
concrete_moment_component_inlbs = concrete_compression_force_lbs * concrete_moment_arm_in
nominal_moment_inlbs = concrete_moment_component_inlbs + steel_moment_component_inlbs
#print phi,concrete_moment_component_inlbs,steel_moment_component_inlbs,concrete_moment_arm_in
#print concrete_compression_force_lbs, steel_tension_force_layer_lbs
return phi,nominal_moment_inlbs,phi*nominal_moment_inlbs
    def concrete_shear_capacity_lbs(self,bars_cg):
        phi = 0.75
        # ACI 318 caps sqrt(f'c) at 100 psi for shear strength calculations
        sqrt_fc_psi = min(self.f_prime_c_psi**(1/2.0), 100)
        vc = 2*sqrt_fc_psi*self.b_in*bars_cg
        phivc = phi*vc
        vsmax = 8*sqrt_fc_psi*self.b_in*bars_cg
        phivsmax = phi*vsmax
        vnmax = vc + vsmax
        phivnmax = phi*vnmax
        return phi, vc, phivc, vsmax, vnmax, phivnmax
def cracked_moment_of_inertia_in4(self,bars_as_array,bars_d_array,Es_psi):
n = Es_psi/self.Ec_psi
a=0
b=max(bars_d_array)
c=0
mna = 0
na = 0
loop_max = 10000
tol = 0.0000000000001
loop = 0
while loop<loop_max:
c = (a+b)/2
mna = self.b_in*c*(c/2)
i=0
for i in range(len(bars_as_array)):
if c < bars_d_array[i]:
mna = mna - (n*bars_as_array[i]*(bars_d_array[i]-c))
else:
mna = mna + ((n-1)*bars_as_array[i]*(c-bars_d_array[i]))
if mna == 0 or (b-a)/2 <= tol:
na = c
loop = loop_max
elif mna > 1:
b = c
else:
a = c
loop+=1
#print na
i_crack_in4 = (self.b_in*na**3)/3
for i in range(len(bars_as_array)):
if na < bars_d_array[i]:
i_crack_in4 = i_crack_in4 + (n*bars_as_array[i]*(bars_d_array[i]-na)**2)
else:
i_crack_in4 = i_crack_in4 + ((n-1)*bars_as_array[i]*(na-bars_d_array[i])**2)
return i_crack_in4
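# Hand check of the tension-controlled flexural capacity that
# moment_capacity_inlbs computes, using the closed-form singly
# reinforced result Mn = As*fy*(d - a/2) with a = As*fy/(0.85*f'c*b).
# Defaults are illustrative only (assumed, not taken from this module):
# a 12x24 in beam with 3 #8 bottom bars (As = 2.37 in^2, d = 21.5 in),
# f'c = 4000 psi, Grade 60 steel -- giving phi*Mn of roughly 211 ft-kips.
def _example_singly_reinforced_phiMn_ftkips(As=2.37, fy=60000.0,
                                            fc=4000.0, b=12.0, d=21.5):
    a = (As * fy) / (0.85 * fc * b)       # Whitney stress block depth, in
    Mn_inlbs = As * fy * (d - a / 2.0)    # nominal moment, in-lbs
    return 0.9 * Mn_inlbs / 12000.0       # phi = 0.9, tension controlled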
class t_beam:
def __init__(self, b_in, h_in, h_slab_in, beam_span_ft, slab_span_left_ft, slab_span_right_ft, bf_in, hf_in, f_prime_c_psi,density_pcf):
self.bw_in = float(b_in)
self.h_in = float(h_in)
self.f_prime_c_psi = float(f_prime_c_psi)
#calculation of effective flange width per ACI 318-08 section 8.12
#Section 8.12.3 - for slab on one side only
if float(slab_span_left_ft) == 0 and float(slab_span_right_ft) != 0:
self.bf_left_in = 0
self.bf_right_in = min((1/12.0)*float(beam_span_ft)*12,6*float(h_slab_in),(1/2.0)*float(slab_span_right_ft)*12)
self.bf_in = self.bw_in + self.bf_right_in
self.bf_status = 'OK'
self.hf_in = float(h_slab_in)
elif float(slab_span_right_ft) == 0 and float(slab_span_left_ft) != 0:
self.bf_left_in = min((1/12.0)*float(beam_span_ft)*12,6*float(h_slab_in),(1/2.0)*float(slab_span_left_ft)*12)
self.bf_right_in = 0
self.bf_in = self.bw_in + self.bf_left_in
self.bf_status = 'OK'
self.hf_in = float(h_slab_in)
#Section 8.12.4 - Assumes user wants to use their own flange, expects bf_in and hf_in to be non-zero
elif float(slab_span_left_ft) == 0 and float(slab_span_right_ft) == 0 and float(beam_span_ft) ==0:
if float(bf_in) <= 0 and float(hf_in) <=0:
self.bf_status = 'OK, rectangular section with no flange'
self.bf_in = self.bw_in
self.hf_in = 0
self.bf_left_in = 0
self.bf_right_in = self.bf_left_in
else:
if float(hf_in) < (1/2.0)*self.bw_in:
self.bf_status = 'NG see ACI 318-08 section 8.12.4, Hf revised to 0.5*Bw'
self.hf_in = (1/2.0)*self.bw_in
if float(bf_in) > 4*self.bw_in:
self.bf_status = self.bf_status + '\nNG see ACI 318-08 section 8.12.4, Bf revised to 4*Bw'
self.bf_in = 4*self.bw_in
self.bf_left_in = (self.bf_in - self.bw_in)/2
self.bf_right_in = self.bf_left_in
else:
self.bf_in = float(bf_in)
self.bf_left_in = (self.bf_in - self.bw_in)/2
self.bf_right_in = self.bf_left_in
self.bf_status = self.bf_status + '\nBf OK'
else:
if float(bf_in) > 4*self.bw_in:
self.bf_status = 'NG see ACI 318-08 section 8.12.4, Bf revised to 4*Bw'
self.bf_in = 4*self.bw_in
self.hf_in = float(hf_in)
self.bf_left_in = (self.bf_in - self.bw_in)/2
self.bf_right_in = self.bf_left_in
else:
self.bf_in = float(bf_in)
self.hf_in = float(hf_in)
self.bf_left_in = (self.bf_in - self.bw_in)/2
self.bf_right_in = self.bf_left_in
self.bf_status = 'OK'
#Section 8.12.2 - Slab on both sides
else:
self.bf_max_in = (1/4.0)*float(beam_span_ft)*12
self.bf_max_left_in = (self.bf_max_in - self.bw_in)/2
self.bf_max_right_in = self.bf_max_left_in
self.bf_left_in = min(8*float(h_slab_in),(1/2.0)*float(slab_span_left_ft)*12,self.bf_max_left_in)
self.bf_right_in = min(8*float(h_slab_in),(1/2.0)*float(slab_span_right_ft)*12,self.bf_max_right_in)
self.bf_in = self.bf_left_in + self.bw_in + self.bf_right_in
self.hf_in = float(h_slab_in)
self.bf_status = 'OK'
self.hw_in = self.h_in - self.hf_in
#Calculate Igross neglecting steel
self.Af_in2 = self.bf_in * self.hf_in
self.If_in4 = (self.bf_in * self.hf_in**3)/12
self.Aw_in2 = self.bw_in * self.hw_in
self.Iw_in4 = (self.bw_in * self.hw_in**3)/12
self.cgy_in = (self.Af_in2*((self.hf_in/2.0)+self.hw_in) + self.Aw_in2*(self.hw_in/2.0))/(self.Af_in2+self.Aw_in2)
self.df_in = abs(((self.hf_in/2.0)+self.hw_in)-self.cgy_in)
self.dw_in = abs((self.hw_in/2.0)-self.cgy_in)
self.Ig_in4 = self.If_in4 + (self.Af_in2*self.df_in**2) + self.Iw_in4 + (self.Aw_in2*self.dw_in**2)
#section vertex coordinates to make plotting easier
self.section_x_coords_in = [0,self.bw_in,self.bw_in,self.bf_right_in+self.bw_in,self.bf_right_in+self.bw_in,-1*self.bf_left_in,-1*self.bf_left_in,0,0]
self.section_y_coords_in = [0,0,self.hw_in,self.hw_in,self.h_in,self.h_in,self.hw_in,self.hw_in,0]
if 90 <= density_pcf <= 160:
self.Ec_psi = density_pcf**1.5 * 33 * self.f_prime_c_psi**(1/2.0)
else:
self.Ec_psi = 'Density of Concrete must be between 90 and 160 PCF'
self.weight_plf = (self.Af_in2+self.Aw_in2) * (1/144) * density_pcf
self.weight_klf = self.weight_plf/1000.0
self.fr_psi = 7.5 * self.f_prime_c_psi**(1/2.0)
self.Mcrack_inlbs = (self.fr_psi*self.Ig_in4)/(self.cgy_in)
self.Mcrack_ftlbs = self.Mcrack_inlbs/12.0
self.Mcrack_ftkips = self.Mcrack_ftlbs/1000.0
if self.f_prime_c_psi <= 4000:
self.beta1 = 0.85
elif self.f_prime_c_psi <= 8000:
self.beta1 = 0.85 - ((0.05*(self.f_prime_c_psi-4000))/1000)
else:
self.beta1 = 0.65
def as_min(self,d_in,fy_psi):
return max((3 * self.f_prime_c_psi**(1/2.0) * self.bw_in * d_in)/fy_psi, (200.0 * self.bw_in * d_in)/fy_psi)
def max_bars_layer(self,flexural_bar, cover_in, shear_bar, aggregate_size_in):
self.first_interior_bar_in = cover_in + shear_bar[0] + 2*shear_bar[0]
min_spacing = max(1,flexural_bar[0],1.33*aggregate_size_in)+flexural_bar[0]
num_interior_bars = 1 + ((self.bw_in - 2*self.first_interior_bar_in)/min_spacing)
return math.floor(num_interior_bars)
def min_bars_bottom_layer(self,reinf_bar,cover_in, shear_bar, fy_psi):
fs = (2/3.0)*fy_psi
cc = cover_in + shear_bar[0]
first_interior_bar_in = cover_in + shear_bar[0] + 2*shear_bar[0]
min_spacing = (15*(40000/fs)) - 2.5*cc
num_interior_bars = 1 + ((self.bw_in - 2*first_interior_bar_in)/min_spacing)
return math.ceil(num_interior_bars)
def flexural_bottom_bars_automatic_by_layers(self,flexural_bars_array,flexural_bar_count_array,cover_in,shear_bar):
bottom_bars_as = []
bottom_bars_d = []
bottom_bars_cg = 0
i=0
for bars in flexural_bar_count_array:
bottom_bars_as.append(bars*flexural_bars_array[i][1])
if i==0:
bottom_bars_d.append(self.h_in - (cover_in + shear_bar[0] + (flexural_bars_array[i][0] * 0.5)))
else:
bottom_bars_d.append(bottom_bars_d[i-1] - (1 + flexural_bars_array[i-1][0] * 0.5 + flexural_bars_array[i][0] * 0.5))
i+=1
total_as = sum(bottom_bars_as)
total_as_d = 0
i=0
for i in range(len(bottom_bars_as)):
total_as_d = total_as_d + (bottom_bars_as[i]*bottom_bars_d[i])
bottom_bars_cg = total_as_d/total_as
return bottom_bars_as, bottom_bars_d, bottom_bars_cg
def flexural_top_bars_automatic_by_layers(self,flexural_bars_array,flexural_bar_count_array,cover_in,shear_bar):
top_bars_as = []
top_bars_d = []
i=0
for bars in flexural_bar_count_array:
top_bars_as.append(bars*flexural_bars_array[i][1])
if i==0:
top_bars_d.append((cover_in + shear_bar[0] + (flexural_bars_array[i][0] * 0.5)))
else:
top_bars_d.append(top_bars_d[i-1] + (1 + flexural_bars_array[i-1][0] * 0.5 + flexural_bars_array[i][0] * 0.5))
i+=1
total_as = sum(top_bars_as)
total_as_d = 0
i=0
for i in range(len(top_bars_as)):
total_as_d = total_as_d + (top_bars_as[i]*top_bars_d[i])
return top_bars_as, top_bars_d
def strain_compatibility_steel(self,bars_as_array,bars_d_array,c_in,fy_psi,Es_psi):
steel_strain = []
steel_stress = []
steel_tension_force_layer_lbs = []
steel_tension_force_lbs = 0
i=0
for i in range(len(bars_as_array)):
steel_strain.append(0.003*((bars_d_array[i]/c_in)-1))
fs = steel_strain[i] * Es_psi
if fs < -1*fy_psi or fs > fy_psi:
steel_stress.append((fs/abs(fs))*fy_psi)
steel_tension_force_layer_lbs.append(((fs/abs(fs))*fy_psi)*bars_as_array[i])
else:
steel_stress.append(fs)
steel_tension_force_layer_lbs.append(fs*bars_as_array[i])
steel_tension_force_lbs = sum(steel_tension_force_layer_lbs)
return steel_strain,steel_stress,steel_tension_force_layer_lbs,steel_tension_force_lbs
def strain_compatibility_concrete(self,c_in):
a = self.beta1 * c_in
if a <= self.hf_in or self.hf_in == 0:
compression_block_in2 = self.bf_in * a
compression_block_cg_top = a/2
else:
compression_block_in2 = (self.bf_in * self.hf_in) + (self.bw_in*(a-self.hf_in))
compression_block_cg_top = (self.Af_in2*(self.hf_in/2.0) + (self.bw_in*(a-self.hf_in))*(((a-self.hf_in)/2.0)+self.hf_in))/compression_block_in2
concrete_compression_force_lbs = 0.85 * self.f_prime_c_psi * compression_block_in2
return concrete_compression_force_lbs, compression_block_cg_top
def find_pna(self,bars_as_array,bars_d_array,bars_cg,fy_psi,Es_psi):
a=0
b=max(bars_d_array)
c=0
pna = 0
loop_max = 10000
tol = 0.0000000000001
loop = 0
while loop<loop_max:
c = (a+b)/2
strain_c, stress_c, layer_c, tension_c = self.strain_compatibility_steel(bars_as_array,bars_d_array,c,fy_psi,Es_psi)
compression_c, compression_cg_top = self.strain_compatibility_concrete(c)
if compression_c == tension_c or (b-a)/2 <= tol:
pna = c
loop = loop_max
elif compression_c > tension_c:
b = c
else:
a = c
loop+=1
print(tension_c)
print(pna)
return pna
def moment_capacity_inlbs(self,bars_as_array,bars_d_array,bars_cg,c_in,fy_psi,Es_psi):
self.x_strain = []
self.y_strain = []
a = self.beta1 * c_in
if a <= self.hf_in or self.hf_in == 0:
compression_block_in2 = self.bf_in * a
compression_block_cg_top = a/2
else:
compression_block_in2 = (self.bf_in * self.hf_in) + (self.bw_in*(a-self.hf_in))
compression_block_cg_top = (self.Af_in2*(self.hf_in/2.0) + (self.bw_in*(a-self.hf_in))*(((a-self.hf_in)/2.0)+self.hf_in))/compression_block_in2
concrete_compression_force_lbs = 0.85 * self.f_prime_c_psi * compression_block_in2
self.x_strain.append(0)
self.y_strain.append(self.h_in)
self.x_strain.append(-.003)
self.y_strain.append(self.h_in)
steel_strain = []
steel_stress = []
steel_tension_force_layer_lbs = []
steel_moment_component_inlbs = 0
i=0
for i in range(len(bars_as_array)):
steel_strain.append(0.003*((bars_d_array[i]/c_in)-1))
self.x_strain.append(steel_strain[i])
self.y_strain.append(self.h_in - bars_d_array[i])
fs = steel_strain[i] * Es_psi
if fs < -1*fy_psi or fs > fy_psi:
steel_stress.append((fs/abs(fs))*fy_psi)
steel_tension_force_layer_lbs.append(((fs/abs(fs))*fy_psi)*bars_as_array[i])
steel_moment_component_inlbs = steel_moment_component_inlbs + (steel_tension_force_layer_lbs[i]*(bars_d_array[i]-bars_cg))
else:
steel_stress.append(fs)
steel_tension_force_layer_lbs.append(fs*bars_as_array[i])
steel_moment_component_inlbs = steel_moment_component_inlbs + (steel_tension_force_layer_lbs[i]*(bars_d_array[i]-bars_cg))
self.x_strain.append(0)
self.y_strain.append(self.h_in - max(bars_d_array))
if max(steel_strain)<0.004:
return 'Section not Tension Controlled'
else:
if max(steel_strain)>=0.005:
phi = 0.9
else:
phi = 0.65 + ((max(steel_strain)-0.002)*(250/3))
concrete_moment_arm_in = bars_cg - compression_block_cg_top
concrete_moment_component_inlbs = concrete_compression_force_lbs * concrete_moment_arm_in
nominal_moment_inlbs = concrete_moment_component_inlbs + steel_moment_component_inlbs
return phi,nominal_moment_inlbs,phi*nominal_moment_inlbs
def concrete_shear_capacity_lbs(self,bars_cg, shear_bars_fy_psi, shear_bar_area_in2):
phi = 0.75
#ACI 318-08 11.1.2 sqrt(f'c) limited to 100 psi
if self.f_prime_c_psi > 10000:
vc_fprimec_psi = 10000
else:
vc_fprimec_psi = self.f_prime_c_psi
#ACI 318-08 11.2.1 equation 11-3
vc = 2*vc_fprimec_psi**(1/2.0)*self.bw_in*bars_cg
phivc = phi*vc
#ACI 318-08 11.4.2 limits shear bars to 60 ksi
if shear_bars_fy_psi > 60000:
fyt_psi = 60000
else:
fyt_psi = shear_bars_fy_psi
#ACI 318-08 11.4.5.1 - assumes vertical stirrups
self.max_shear_spacing_in = min(bars_cg/2.0,24)
#ACI 318-08 11.4.6.3 - Av,min/s
self.av_min_s = min(0.75*vc_fprimec_psi**(1/2.0)*self.bw_in*(1.0/fyt_psi),50*self.bw_in*(1.0/fyt_psi))
#ACI 318-08 11.4.7.2 equation 11-15 - assumes vertical stirrups
self.s_Vs_2_legs = 2*shear_bar_area_in2*fyt_psi*bars_cg
self.s_Vs_4_legs = 4*shear_bar_area_in2*fyt_psi*bars_cg
self.s_Vs_6_legs = 6*shear_bar_area_in2*fyt_psi*bars_cg
#ACI 318-08 11.4.7.9
vsmax = 8*vc_fprimec_psi**(1/2.0)*self.bw_in*bars_cg
phivsmax = phi*vsmax
vnmax = vc + vsmax
phivnmax = phi*vnmax
return phi, vc, phivc, vsmax, vnmax, phivnmax
def concrete_threshold_torsion_inlbs(self):
phi = 0.75
#ACI 318-08 11.1.2 sqrt(f'c) limited to 100 psi
if self.f_prime_c_psi > 10000:
vc_fprimec_psi = 10000
else:
vc_fprimec_psi = self.f_prime_c_psi
#Calculation of Acp and Pcp
rect_Acp_in2 = self.bw_in*self.h_in
rect_Pcp_in = (2.0*self.bw_in)+(2.0*self.h_in)
area_over_perim_rect = (rect_Acp_in2**2.0) / rect_Pcp_in
#ACI 318-08 11.5.1.1 - T beam consraints per 13.2.4
if self.bf_left_in == 0 or self.bf_right_in == 0:
if self.bf_left_in == 0 and self.bf_right_in == 0:
self.Acp_in2 = rect_Acp_in2
self.Pcp_in = rect_Pcp_in
self.Torsion_threshold = vc_fprimec_psi**(1/2.0) * area_over_perim_rect
self.AoP_status = 'Rectangular Section used for Acp and Pcp'
elif self.bf_left_in ==0:
Torsion_bf_right_in = min(self.bf_right_in, self.hw_in, 4*self.hf_in)
Acp_in2 = (self.bw_in*self.hw_in) + ((self.bw_in+Torsion_bf_right_in)*self.hf_in)
Pcp_in = self.bw_in + self.hw_in + Torsion_bf_right_in + self.hf_in + Torsion_bf_right_in + self.bw_in + self.h_in
#Check Acp^2/Pcp of T Beam not less than for Rectangular beam
area_over_perim_t = (Acp_in2**2.0)/Pcp_in
if area_over_perim_rect > area_over_perim_t:
self.Acp_in2 = rect_Acp_in2
self.Pcp_in = rect_Pcp_in
self.Torsion_threshold = vc_fprimec_psi**(1/2.0) * area_over_perim_rect
self.AoP_status = 'Rectangular Section used for Acp and Pcp, ACI 318-08 11.5.1.1'
else:
self.Acp_in2 = Acp_in2
self.Pcp_in = Pcp_in
self.Torsion_threshold = vc_fprimec_psi**(1/2.0) * area_over_perim_t
self.AoP_status = 'T Section used for Acp and Pcp, Section may have been altered per ACI 318-08 11.5.1.1 and 13.2.4'
elif self.bf_right_in ==0:
Torsion_bf_left_in = min(self.bf_left_in, self.hw_in, 4*self.hf_in)
Acp_in2 = (self.bw_in*self.hw_in) + ((self.bw_in+Torsion_bf_left_in)*self.hf_in)
Pcp_in = self.bw_in + self.hw_in + Torsion_bf_left_in + self.hf_in + Torsion_bf_left_in + self.bw_in + self.h_in
#Check Acp^2/Pcp of T Beam not less than for Rectangular beam
area_over_perim_t = (Acp_in2**2.0)/Pcp_in
if area_over_perim_rect > area_over_perim_t:
self.Acp_in2 = rect_Acp_in2
self.Pcp_in = rect_Pcp_in
self.Torsion_threshold = vc_fprimec_psi**(1/2.0) * area_over_perim_rect
self.AoP_status = 'Rectangular Section used for Acp and Pcp, ACI 318-08 11.5.1.1'
else:
self.Acp_in2 = Acp_in2
self.Pcp_in = Pcp_in
self.Torsion_threshold = vc_fprimec_psi**(1/2.0) * area_over_perim_t
self.AoP_status = 'T Section used for Acp and Pcp, Section may have been altered per ACI 318-08 11.5.1.1 and 13.2.4'
else:
Torsion_bf_right_in = min(self.bf_right_in, self.hw_in, 4*self.hf_in)
Torsion_bf_left_in = min(self.bf_left_in, self.hw_in, 4*self.hf_in)
Acp_in2 = (self.bw_in*self.hw_in) + ((self.bw_in+Torsion_bf_left_in+Torsion_bf_right_in)*self.hf_in)
Pcp_in = self.bw_in + self.hw_in + Torsion_bf_right_in + self.hf_in + Torsion_bf_right_in + self.bw_in + Torsion_bf_left_in + self.hf_in + Torsion_bf_left_in + self.hw_in
#Check Acp^2/Pcp of T Beam not less than for Rectangular beam
area_over_perim_t = (Acp_in2**2.0)/Pcp_in
if area_over_perim_rect > area_over_perim_t:
self.Acp_in2 = rect_Acp_in2
self.Pcp_in = rect_Pcp_in
self.Torsion_threshold = vc_fprimec_psi**(1/2.0) * area_over_perim_rect
self.AoP_status = 'Rectangular Section used for Acp and Pcp, ACI 318-08 11.5.1.1'
else:
self.Acp_in2 = Acp_in2
self.Pcp_in = Pcp_in
self.Torsion_threshold = vc_fprimec_psi**(1/2.0) * area_over_perim_t
self.AoP_status = 'T Section used for Acp and Pcp, Section may have been altered per ACI 318-08 11.5.1.1 and 13.2.4'
return phi, self.Acp_in2, self.Pcp_in, self.Torsion_threshold, phi*self.Torsion_threshold, self.AoP_status
def cracked_moment_of_inertia_in4(self,bars_as_array,bars_d_array,Es_psi):
self.cmi_c_x = []
self.cmi_c_y = []
self.cmi_s_x = []
self.cmi_s_y = []
self.cmi_s_text = []
self.cmi_c_text = []
n = Es_psi/self.Ec_psi
a=0
b=max(bars_d_array)
c=0
mna = 0
self.crackna = 0
loop_max = 10000
tol = 0.0000000000001
loop = 0
while loop<loop_max:
c = (a+b)/2
if c <= self.hf_in:
compression_block_in2 = self.bf_in * c
compression_block_cg_top = c/2
else:
compression_block_in2 = (self.bf_in * self.hf_in) + (self.bw_in*(c-self.hf_in))
compression_block_cg_top = (self.Af_in2*(self.hf_in/2.0) + (self.bw_in*(c-self.hf_in))*(((c-self.hf_in)/2.0)+self.hf_in))/compression_block_in2
mna = compression_block_in2 * compression_block_cg_top
i=0
for i in range(len(bars_as_array)):
if c < bars_d_array[i]:
mna = mna - (n*bars_as_array[i]*(bars_d_array[i]-c))
else:
mna = mna + ((n-1)*bars_as_array[i]*(c-bars_d_array[i]))
if mna == 0 or (b-a)/2 <= tol:
self.crackna = c
loop = loop_max
elif mna > 1:
b = c
else:
a = c
loop+=1
if self.crackna <= self.hf_in:
i_crack_in4 = (self.bf_in*self.crackna**3)/3
self.cmi_c_x.append([0-self.bf_left_in,0+self.bw_in+self.bf_right_in,0+self.bw_in+self.bf_right_in,0-self.bf_left_in,0-self.bf_left_in])
self.cmi_c_y.append([self.h_in - self.crackna,self.h_in - self.crackna,self.h_in,self.h_in,self.h_in - self.crackna])
self.cmi_c_text.append(['Ac = {0:.3f} in2, Ec = {1:.2f} psi, Es = {2:0.2f} psi, n = Es/Ec = {3:0.4f}'.format(compression_block_in2, self.Ec_psi, Es_psi, n),0-self.bf_left_in,self.h_in+0.25])
else:
i_crack_in4 = (self.bw_in*(self.crackna-self.hf_in)**3)/3 + (self.bf_in*self.hf_in**3/3) + (self.bf_in*self.hf_in)*(self.crackna-self.hf_in)**2
self.cmi_c_x.append([0-self.bf_left_in,0-self.bf_left_in,0,0,self.bw_in,self.bw_in,self.bw_in+self.bf_right_in,self.bw_in+self.bf_right_in,0-self.bf_left_in])
self.cmi_c_y.append([self.h_in,self.hw_in,self.hw_in,self.h_in-self.crackna,self.h_in-self.crackna,self.hw_in,self.hw_in,self.h_in,self.h_in])
self.cmi_c_text.append(['Ac = {0:.3f} in2, Ec = {1:.2f} psi, Es = {2:0.2f} psi, n = Es/Ec = {3:0.4f}'.format(compression_block_in2, self.Ec_psi, Es_psi, n),0-self.bf_left_in,self.h_in+0.25])
for i in range(len(bars_as_array)):
if self.crackna < bars_d_array[i]:
i_crack_in4 = i_crack_in4 + (n*bars_as_array[i]*(bars_d_array[i]-self.crackna)**2)
self.cmi_s_x.append([(self.bw_in/2.0)-(n*bars_as_array[i]/2.0),(self.bw_in/2.0)+(n*bars_as_array[i]/2.0),(self.bw_in/2.0)+(n*bars_as_array[i]/2.0),(self.bw_in/2.0)-(n*bars_as_array[i]/2.0),(self.bw_in/2.0)-(n*bars_as_array[i]/2.0)])
self.cmi_s_y.append([self.h_in - (bars_d_array[i]-0.5),self.h_in - (bars_d_array[i]-0.5),self.h_in - (bars_d_array[i]+0.5),self.h_in - (bars_d_array[i]+0.5),self.h_in - (bars_d_array[i]-0.5)])
self.cmi_s_text.append(['nAs = {0:.3f} in2'.format(n*bars_as_array[i]),((self.bw_in/2.0)+(n*bars_as_array[i]/2.0))+0.25,self.h_in - (bars_d_array[i]+0.25)])
else:
i_crack_in4 = i_crack_in4 + ((n-1)*bars_as_array[i]*(self.crackna-bars_d_array[i])**2)
self.cmi_s_x.append([(self.bw_in/2.0)-(n*bars_as_array[i]/2.0),(self.bw_in/2.0)+(n*bars_as_array[i]/2.0),(self.bw_in/2.0)+(n*bars_as_array[i]/2.0),(self.bw_in/2.0)-(n*bars_as_array[i]/2.0),(self.bw_in/2.0)-(n*bars_as_array[i]/2.0)])
self.cmi_s_y.append([self.h_in - (bars_d_array[i]-0.5),self.h_in - (bars_d_array[i]-0.5),self.h_in - (bars_d_array[i]+0.5),self.h_in - (bars_d_array[i]+0.5),self.h_in - (bars_d_array[i]-0.5)])
self.cmi_s_x.append([(self.bw_in/2.0)-(bars_as_array[i]/2.0),(self.bw_in/2.0)+(bars_as_array[i]/2.0),(self.bw_in/2.0)+(bars_as_array[i]/2.0),(self.bw_in/2.0)-(bars_as_array[i]/2.0),(self.bw_in/2.0)-(bars_as_array[i]/2.0)])
self.cmi_s_y.append([self.h_in - (bars_d_array[i]-0.5),self.h_in - (bars_d_array[i]-0.5),self.h_in - (bars_d_array[i]+0.5),self.h_in - (bars_d_array[i]+0.5),self.h_in - (bars_d_array[i]-0.5)])
self.cmi_s_text.append(['(n-1)As = {0:.3f} in2'.format((n-1)*bars_as_array[i]),((self.bw_in/2.0)+(n*bars_as_array[i]/2.0))+0.25,self.h_in - (bars_d_array[i]+0.25)])
return i_crack_in4, self.crackna
# cover_in = 1.44
# aggregate_size_in = 1.0
# bm = t_beam(12,24,6,10,0,0,48,6,5000,145)
# shear_bars = reinforcement(60)
# shear_bar = shear_bars.bar[4]
# flexural_bars = reinforcement(60)
# flexural_bottom_bars = [flexural_bars.bar[9]]
# flexural_bottom_bars_per_layer = [3]
# flexural_top_bars = [flexural_bars.bar[6]]
# flexural_top_bars_per_layer = [3]
# print bm.max_bars_layer(flexural_bottom_bars[0], cover_in, shear_bar, aggregate_size_in)
# print bm.min_bars_bottom_layer(flexural_bottom_bars[0], cover_in, shear_bar, flexural_bars.fy_psi)
# bars_as,bars_d,tension_bars_cg = bm.flexural_bottom_bars_automatic_by_layers(flexural_bottom_bars,flexural_bottom_bars_per_layer,cover_in,shear_bar)
# top_bars_as,top_bars_d = bm.flexural_top_bars_automatic_by_layers(flexural_top_bars,flexural_top_bars_per_layer,cover_in,shear_bar)
# print bars_as,bars_d,tension_bars_cg
# print top_bars_as, top_bars_d
# flexural_bars_as = bars_as + top_bars_as
# flexural_bars_d = bars_d + top_bars_d
# total_as = sum(flexural_bars_as)
# total_as_d = 0
# for i in range(len(flexural_bars_as)):
# total_as_d = total_as_d + (flexural_bars_as[i]*flexural_bars_d[i])
# flexural_bars_cg = total_as_d/total_as
# minas = bm.as_min(tension_bars_cg,flexural_bars.fy_psi)
# c_in = bm.find_pna(flexural_bars_as,flexural_bars_d,flexural_bars_cg,flexural_bars.fy_psi,flexural_bars.Es_psi)
# print c_in
# print bm.strain_compatibility_steel(flexural_bars_as,flexural_bars_d,c_in,flexural_bars.fy_psi,flexural_bars.Es_psi)
# print bm.moment_capacity_inlbs(flexural_bars_as,flexural_bars_d,flexural_bars_cg,c_in,flexural_bars.fy_psi,flexural_bars.Es_psi)
# print bm.concrete_shear_capacity_lbs(tension_bars_cg)
# print bm.cracked_moment_of_inertia_in4(flexural_bars_as,flexural_bars_d,flexural_bars.Es_psi)
# print bm.Ig_in4
# polls/views.py (repo: whoww/PlaylistServer, license: MIT)
from django.shortcuts import render, render_to_response
from django.http import HttpResponse
def index(request):
return HttpResponse("Hello cats!!")
def sound(request):
return render_to_response("index2.html")
# test/test_temperature_scaling.py (repo: Argonne-National-Laboratory/pyoptmat, license: MIT)
import unittest
import numpy as np
import torch
from pyoptmat import temperature
class TestConstantParameter(unittest.TestCase):
def test_value(self):
pshould = torch.tensor([1.0,2.0])
obj = temperature.ConstantParameter(pshould)
pval = obj(torch.tensor(1.0))
self.assertTrue(np.allclose(pshould.numpy(), pval.numpy()))
class TestPolynomialScaling(unittest.TestCase):
def test_value(self):
coefs = torch.tensor([1.1,2.5,3.0])
x = torch.ones((100,))*1.51
obj = temperature.PolynomialScaling(coefs)
y1 = obj.value(x)
y2 = np.polyval(coefs.numpy(), x)
self.assertTrue(np.allclose(y1.numpy(), y2))
def test_value_batch(self):
coefs = torch.tensor([[1.1]*100,[2.5]*100,[3.0]*100])
x = torch.ones((100,))*1.51
obj = temperature.PolynomialScaling(coefs)
y1 = obj.value(x)
y2 = np.polyval(coefs.numpy(), x)
self.assertTrue(np.allclose(y1.numpy(), y2))
def test_value_constant(self):
coefs = torch.tensor([2.51])
x = torch.ones((100,))*1.51
obj = temperature.PolynomialScaling(coefs)
y1 = obj.value(x)
y2 = np.polyval(coefs.numpy(), x)
self.assertTrue(np.allclose(y1.numpy(), y2))
class TestKMRateSensitivityScaling(unittest.TestCase):
def test_value(self):
A = -8.679
mu = temperature.PolynomialScaling(
torch.tensor([-1.34689305e-02,-5.18806776e+00,7.86708330e+04]))
b = 2.474e-7
k = 1.38064e-20
Ts = torch.linspace(25,950.0,50)+273.15
obj = temperature.KMRateSensitivityScaling(A, mu, b, k)
v1 = obj.value(Ts)
mu_values = np.array([mu.value(T).numpy() for T in Ts])
v2 = -mu_values*b**3.0/(k*Ts*A)
self.assertTrue(np.allclose(v1.numpy(), v2))
def test_value_batch(self):
A = torch.linspace(-8.679-1,-8.679+1, 50)
mu = temperature.PolynomialScaling(
torch.tensor([-1.34689305e-02,-5.18806776e+00,7.86708330e+04]))
b = 2.474e-7
k = 1.38064e-20
Ts = torch.linspace(25,950.0,50)+273.15
obj = temperature.KMRateSensitivityScaling(A, mu, b, k)
v1 = obj.value(Ts)
mu_values = np.array([mu.value(T).numpy() for T in Ts])
v2 = -mu_values*b**3.0/(k*Ts*A.numpy())
self.assertTrue(np.allclose(v1.numpy(), v2))
class TestKMViscosityScaling(unittest.TestCase):
def test_value(self):
A = -8.679
B = -0.744
mu = temperature.PolynomialScaling(
torch.tensor([-1.34689305e-02,-5.18806776e+00,7.86708330e+04]))
b = 2.474e-7
k = 1.38064e-20
eps0 = 1e10
Ts = torch.linspace(25,950.0,50)+273.15
mu_values = np.array([mu.value(T).numpy() for T in Ts])
obj = temperature.KMViscosityScaling(A, torch.tensor(B), mu, eps0, b, k)
v1 = obj.value(Ts)
v2 = np.exp(B)*mu_values*eps0**(k*Ts.numpy()*A/(mu_values*b**3.0))
self.assertTrue(np.allclose(v1, v2))
def test_value_batch(self):
A = torch.linspace(-8.679-1,-8.679+1, 50)
B = torch.linspace(-0.744,-0.80, 50)
mu = temperature.PolynomialScaling(
torch.tensor([-1.34689305e-02,-5.18806776e+00,7.86708330e+04]))
b = 2.474e-7
k = 1.38064e-20
eps0 = 1e10
Ts = torch.linspace(25,950.0,50)+273.15
mu_values = np.array([mu.value(T).numpy() for T in Ts])
obj = temperature.KMViscosityScaling(A, B, mu, eps0, b, k)
v1 = obj.value(Ts)
v2 = np.exp(B.numpy())*mu_values*eps0**(k*Ts.numpy()*A.numpy()/(mu_values*b**3.0))
self.assertTrue(np.allclose(v1, v2))
| 28.541667 | 86 | 0.63854 | 538 | 3,425 | 4.024164 | 0.14684 | 0.050808 | 0.044342 | 0.088684 | 0.824018 | 0.810162 | 0.762125 | 0.713164 | 0.713164 | 0.678984 | 0 | 0.135029 | 0.184818 | 3,425 | 119 | 87 | 28.781513 | 0.640401 | 0 | 0 | 0.72093 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.093023 | 1 | 0.093023 | false | 0 | 0.046512 | 0 | 0.186047 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
#!/usr/bin/env python
# ebe_scripts/extract_event_plane_correlations.py (repo: LipeiDu/hadronic_afterburner_toolkit, license: MIT)
from os import path
from numpy import loadtxt
collision_system = "PbPb2760"
centrality_list = ['0-5', '5-10', '10-20', '20-30', '30-40', '40-50']
centrality_mid = [2.5, 7.5, 15, 25, 35, 45]
folder_basename = "%s_IPG_C%s_Tsw_145"
file_name = "charged_hadron_event_plane_correlation_ATLAS.dat"
output_file = open("%s_charged_hadron_event_plane_correlation_ATLAS.dat"
% collision_system, "w")
# write the header
output_file.write("# centrality 4(24) 4(24)_err 6(23) 6(23)_err "
"6(26) 6(26)_err 6(36) 6(36)err (235) (235)_err "
"(246) (246)_err (234) (234)_err\n")
for i, centrality_text in enumerate(centrality_list):
folder_name = folder_basename % (collision_system, centrality_text)
data = open(path.join('.', folder_name, file_name), "r")
data.readline()
output_file.write("%g " % (centrality_mid[i]))
for line in data.readlines():
value = float(line.split(" ")[2])
value_err = float(line.split(" ")[4].split("\n")[0])
output_file.write("%.6e %.6e " % (value, value_err))
data.close()
output_file.write("\n")
output_file.close()
file_name = "charged_hadron_event_plane_correlation_ALICE.dat"
output_file = open("%s_charged_hadron_event_plane_correlation_ALICE.dat"
% collision_system, "w")
# write the header
output_file.write("# centrality 4(24) 4(24)_err 6(23) 6(23)_err "
"6(26) 6(26)_err 6(36) 6(36)err (235) (235)_err "
"(246) (246)_err (234) (234)_err\n")
for i, centrality_text in enumerate(centrality_list):
folder_name = folder_basename % (collision_system, centrality_text)
data = open(path.join('.', folder_name, file_name), "r")
data.readline()
output_file.write("%g " % (centrality_mid[i]))
for line in data.readlines():
value = float(line.split(" ")[2])
value_err = float(line.split(" ")[4].split("\n")[0])
output_file.write("%.6e %.6e " % (value, value_err))
data.close()
output_file.write("\n")
output_file.close()
# src/tests/unit/fixtures/endpoint_standard/mock_policy.py
# (repo: kphsugntuedutw/carbon-black-cloud-sdk-python, license: MIT)
"""Mock responses for policy queries."""
POLICY_GET_RESP = {
"policyInfo": {
"priorityLevel": "LOW",
"latestRevision": 1598397737423,
"id": 30241,
"systemPolicy": False,
"version": 2,
"policy": {
"avSettings": {
"features": [
{
"enabled": True,
"name": "SIGNATURE_UPDATE"
},
{
"enabled": True,
"name": "ONACCESS_SCAN"
},
{
"enabled": False,
"name": "ONDEMAND_SCAN"
}
],
"updateServers": {
"servers": [
{
"flags": 0,
"regId": None,
"server": [
"http://updates2.cdc.carbonblack.io/update2"
]
}
],
"serversForOffSiteDevices": [
"http://updates2.cdc.carbonblack.io/update2"
]
},
"apc": {
"maxFileSize": 4,
"maxExeDelay": 45,
"riskLevel": 4,
"enabled": False
},
"onAccessScan": {
"profile": "AGGRESSIVE"
},
"onDemandScan": {
"profile": "NORMAL",
"scanCdDvd": "AUTOSCAN",
"scanUsb": "AUTOSCAN",
"schedule": {
"days": None,
"rangeHours": 0,
"startHour": 0,
"recoveryScanIfMissed": True
}
},
"signatureUpdate": {
"schedule": {
"intervalHours": 2,
"fullIntervalHours": 0,
"initialRandomDelayHours": 1
}
}
},
"sensorSettings": [
{
"name": "ALLOW_UNINSTALL",
"value": "True"
},
{
"name": "ALLOW_UPLOADS",
"value": "True"
},
{
"name": "SHOW_UI",
"value": "True"
},
{
"name": "ENABLE_THREAT_SHARING",
"value": "True"
},
{
"name": "QUARANTINE_DEVICE",
"value": "False"
},
{
"name": "LOGGING_LEVEL",
"value": "False"
},
{
"name": "QUARANTINE_DEVICE_MESSAGE",
"value": "Your device has been quarantined. Please contact your administrator."
},
{
"name": "SET_SENSOR_MODE",
"value": "0"
},
{
"name": "SENSOR_RESET",
"value": "0"
},
{
"name": "BACKGROUND_SCAN",
"value": "True"
},
{
"name": "POLICY_ACTION_OVERRIDE",
"value": "True"
},
{
"name": "HELP_MESSAGE",
"value": ""
},
{
"name": "SCAN_NETWORK_DRIVE",
"value": "False"
},
{
"name": "BYPASS_AFTER_LOGIN_MINS",
"value": "0"
},
{
"name": "BYPASS_AFTER_RESTART_MINS",
"value": "0"
},
{
"name": "SHOW_FULL_UI",
"value": "False"
},
{
"name": "SCAN_EXECUTE_ON_NETWORK_DRIVE",
"value": "False"
},
{
"name": "DELAY_EXECUTE",
"value": "True"
},
{
"name": "PRESERVE_SYSTEM_MEMORY_SCAN",
"value": "False"
},
{
"name": "HASH_MD5",
"value": "False"
},
{
"name": "SCAN_LARGE_FILE_READ",
"value": "False"
},
{
"name": "SECURITY_CENTER_OPT",
"value": "True"
},
{
"name": "CB_LIVE_RESPONSE",
"value": "True"
},
{
"name": "UNINSTALL_CODE",
"value": "True"
},
{
"name": "UBS_OPT_IN",
"value": "True"
},
{
"name": "ALLOW_EXPEDITED_SCAN",
"value": "True"
}
],
"directoryActionRules": [],
"rules": [
{
"id": 1,
"required": True,
"operation": "RANSOM",
"application": {
"value": "COMPANY_BLACK_LIST",
"type": "REPUTATION"
},
"action": "TERMINATE"
}
],
"knownBadHashAutoDeleteDelayMs": None,
"id": -1
},
"name": "Lyon_test",
"description": ""
},
"success": True,
"message": "Success"
}
POLICY_GET_WITH_NEW_RULE_RESP = {
"policyInfo": {
"priorityLevel": "LOW",
"id": 30241,
"systemPolicy": False,
"latestRevision": 1598468096918,
"version": 2,
"policy": {
"avSettings": {
"features": [
{
"enabled": True,
"name": "SIGNATURE_UPDATE"
},
{
"enabled": True,
"name": "ONACCESS_SCAN"
},
{
"enabled": False,
"name": "ONDEMAND_SCAN"
}
],
"updateServers": {
"servers": [
{
"flags": 0,
"regId": None,
"server": [
"http://updates2.cdc.carbonblack.io/update2"
]
}
],
"serversForOffSiteDevices": [
"http://updates2.cdc.carbonblack.io/update2"
]
},
"apc": {
"maxFileSize": 4,
"maxExeDelay": 45,
"riskLevel": 4,
"enabled": False
},
"onAccessScan": {
"profile": "AGGRESSIVE"
},
"onDemandScan": {
"profile": "NORMAL",
"scanCdDvd": "AUTOSCAN",
"scanUsb": "AUTOSCAN",
"schedule": {
"days": None,
"rangeHours": 0,
"startHour": 0,
"recoveryScanIfMissed": True
}
},
"signatureUpdate": {
"schedule": {
"intervalHours": 2,
"initialRandomDelayHours": 1,
"fullIntervalHours": 0
}
}
},
"sensorSettings": [
{
"name": "ALLOW_UNINSTALL",
"value": "True"
},
{
"name": "ALLOW_UPLOADS",
"value": "True"
},
{
"name": "SHOW_UI",
"value": "True"
},
{
"name": "ENABLE_THREAT_SHARING",
"value": "True"
},
{
"name": "QUARANTINE_DEVICE",
"value": "False"
},
{
"name": "LOGGING_LEVEL",
"value": "False"
},
{
"name": "QUARANTINE_DEVICE_MESSAGE",
"value": "Your device has been quarantined. Please contact your administrator."
},
{
"name": "SET_SENSOR_MODE",
"value": "0"
},
{
"name": "SENSOR_RESET",
"value": "0"
},
{
"name": "BACKGROUND_SCAN",
"value": "True"
},
{
"name": "POLICY_ACTION_OVERRIDE",
"value": "True"
},
{
"name": "HELP_MESSAGE",
"value": ""
},
{
"name": "SCAN_NETWORK_DRIVE",
"value": "False"
},
{
"name": "BYPASS_AFTER_LOGIN_MINS",
"value": "0"
},
{
"name": "BYPASS_AFTER_RESTART_MINS",
"value": "0"
},
{
"name": "SHOW_FULL_UI",
"value": "False"
},
{
"name": "SCAN_EXECUTE_ON_NETWORK_DRIVE",
"value": "False"
},
{
"name": "DELAY_EXECUTE",
"value": "True"
},
{
"name": "PRESERVE_SYSTEM_MEMORY_SCAN",
"value": "False"
},
{
"name": "HASH_MD5",
"value": "False"
},
{
"name": "SCAN_LARGE_FILE_READ",
"value": "False"
},
{
"name": "SECURITY_CENTER_OPT",
"value": "True"
},
{
"name": "CB_LIVE_RESPONSE",
"value": "True"
},
{
"name": "UNINSTALL_CODE",
"value": "True"
},
{
"name": "UBS_OPT_IN",
"value": "True"
},
{
"name": "ALLOW_EXPEDITED_SCAN",
"value": "True"
}
],
"directoryActionRules": [],
"rules": [
{
"id": 1,
"required": True,
"operation": "RANSOM",
"application": {
"type": "REPUTATION",
"value": "COMPANY_BLACK_LIST"
},
"action": "TERMINATE"
},
{
"id": 21,
"required": True,
"operation": "RUN",
"application": {
"type": "NAME_PATH",
"value": "my_path_test"
},
"action": "DENY"
},
{
"id": 22,
"required": True,
"operation": "RUN",
"application": {
"type": "NAME_PATH",
"value": "my_path_test_2"
},
"action": "DENY"
}
],
"knownBadHashAutoDeleteDelayMs": None,
"id": -1
},
"name": "Lyon_test",
"description": ""
},
"success": True,
"message": "Success"
}
POLICY_GET_WITH_MODIFIED_RULE_RESP = {
"policyInfo": {
"priorityLevel": "LOW",
"id": 30241,
"systemPolicy": False,
"latestRevision": 1598468096918,
"version": 2,
"policy": {
"avSettings": {
"features": [
{
"enabled": True,
"name": "SIGNATURE_UPDATE"
},
{
"enabled": True,
"name": "ONACCESS_SCAN"
},
{
"enabled": False,
"name": "ONDEMAND_SCAN"
}
],
"updateServers": {
"servers": [
{
"flags": 0,
"regId": None,
"server": [
"http://updates2.cdc.carbonblack.io/update2"
]
}
],
"serversForOffSiteDevices": [
"http://updates2.cdc.carbonblack.io/update2"
]
},
"apc": {
"maxFileSize": 4,
"maxExeDelay": 45,
"riskLevel": 4,
"enabled": False
},
"onAccessScan": {
"profile": "AGGRESSIVE"
},
"onDemandScan": {
"profile": "NORMAL",
"scanCdDvd": "AUTOSCAN",
"scanUsb": "AUTOSCAN",
"schedule": {
"days": None,
"rangeHours": 0,
"startHour": 0,
"recoveryScanIfMissed": True
}
},
"signatureUpdate": {
"schedule": {
"intervalHours": 2,
"initialRandomDelayHours": 1,
"fullIntervalHours": 0
}
}
},
"sensorSettings": [
{
"name": "ALLOW_UNINSTALL",
"value": "True"
},
{
"name": "ALLOW_UPLOADS",
"value": "True"
},
{
"name": "SHOW_UI",
"value": "True"
},
{
"name": "ENABLE_THREAT_SHARING",
"value": "True"
},
{
"name": "QUARANTINE_DEVICE",
"value": "False"
},
{
"name": "LOGGING_LEVEL",
"value": "False"
},
{
"name": "QUARANTINE_DEVICE_MESSAGE",
"value": "Your device has been quarantined. Please contact your administrator."
},
{
"name": "SET_SENSOR_MODE",
"value": "0"
},
{
"name": "SENSOR_RESET",
"value": "0"
},
{
"name": "BACKGROUND_SCAN",
"value": "True"
},
{
"name": "POLICY_ACTION_OVERRIDE",
"value": "True"
},
{
"name": "HELP_MESSAGE",
"value": ""
},
{
"name": "SCAN_NETWORK_DRIVE",
"value": "False"
},
{
"name": "BYPASS_AFTER_LOGIN_MINS",
"value": "0"
},
{
"name": "BYPASS_AFTER_RESTART_MINS",
"value": "0"
},
{
"name": "SHOW_FULL_UI",
"value": "False"
},
{
"name": "SCAN_EXECUTE_ON_NETWORK_DRIVE",
"value": "False"
},
{
"name": "DELAY_EXECUTE",
"value": "True"
},
{
"name": "PRESERVE_SYSTEM_MEMORY_SCAN",
"value": "False"
},
{
"name": "HASH_MD5",
"value": "False"
},
{
"name": "SCAN_LARGE_FILE_READ",
"value": "False"
},
{
"name": "SECURITY_CENTER_OPT",
"value": "True"
},
{
"name": "CB_LIVE_RESPONSE",
"value": "True"
},
{
"name": "UNINSTALL_CODE",
"value": "True"
},
{
"name": "UBS_OPT_IN",
"value": "True"
},
{
"name": "ALLOW_EXPEDITED_SCAN",
"value": "True"
}
],
"directoryActionRules": [],
"rules": [
{
"id": 1,
"required": True,
"operation": "RANSOM",
"application": {
"type": "REPUTATION",
"value": "COMPANY_BLACK_LIST"
},
"action": "TERMINATE"
},
{
"id": 22,
"required": True,
"operation": "RUN",
"application": {
"type": "NAME_PATH",
"value": "new_test_path"
},
"action": "IGNORE"
}
],
"knownBadHashAutoDeleteDelayMs": None,
"id": -1
},
"name": "Lyon_test",
"description": ""
},
"success": True,
"message": "Success"
}
POLICY_GET_WITH_DELETED_RULE_RESP = {
"policyInfo": {
"priorityLevel": "LOW",
"latestRevision": 1598397737423,
"id": 30241,
"systemPolicy": False,
"version": 2,
"policy": {
"avSettings": {
"features": [
{
"enabled": True,
"name": "SIGNATURE_UPDATE"
},
{
"enabled": True,
"name": "ONACCESS_SCAN"
},
{
"enabled": False,
"name": "ONDEMAND_SCAN"
}
],
"updateServers": {
"servers": [
{
"flags": 0,
"regId": None,
"server": [
"http://updates2.cdc.carbonblack.io/update2"
]
}
],
"serversForOffSiteDevices": [
"http://updates2.cdc.carbonblack.io/update2"
]
},
"apc": {
"maxFileSize": 4,
"maxExeDelay": 45,
"riskLevel": 4,
"enabled": False
},
"onAccessScan": {
"profile": "AGGRESSIVE"
},
"onDemandScan": {
"profile": "NORMAL",
"scanCdDvd": "AUTOSCAN",
"scanUsb": "AUTOSCAN",
"schedule": {
"days": None,
"rangeHours": 0,
"startHour": 0,
"recoveryScanIfMissed": True
}
},
"signatureUpdate": {
"schedule": {
"intervalHours": 2,
"fullIntervalHours": 0,
"initialRandomDelayHours": 1
}
}
},
"sensorSettings": [
{
"name": "ALLOW_UNINSTALL",
"value": "True"
},
{
"name": "ALLOW_UPLOADS",
"value": "True"
},
{
"name": "SHOW_UI",
"value": "True"
},
{
"name": "ENABLE_THREAT_SHARING",
"value": "True"
},
{
"name": "QUARANTINE_DEVICE",
"value": "False"
},
{
"name": "LOGGING_LEVEL",
"value": "False"
},
{
"name": "QUARANTINE_DEVICE_MESSAGE",
"value": "Your device has been quarantined. Please contact your administrator."
},
{
"name": "SET_SENSOR_MODE",
"value": "0"
},
{
"name": "SENSOR_RESET",
"value": "0"
},
{
"name": "BACKGROUND_SCAN",
"value": "True"
},
{
"name": "POLICY_ACTION_OVERRIDE",
"value": "True"
},
{
"name": "HELP_MESSAGE",
"value": ""
},
{
"name": "SCAN_NETWORK_DRIVE",
"value": "False"
},
{
"name": "BYPASS_AFTER_LOGIN_MINS",
"value": "0"
},
{
"name": "BYPASS_AFTER_RESTART_MINS",
"value": "0"
},
{
"name": "SHOW_FULL_UI",
"value": "False"
},
{
"name": "SCAN_EXECUTE_ON_NETWORK_DRIVE",
"value": "False"
},
{
"name": "DELAY_EXECUTE",
"value": "True"
},
{
"name": "PRESERVE_SYSTEM_MEMORY_SCAN",
"value": "False"
},
{
"name": "HASH_MD5",
"value": "False"
},
{
"name": "SCAN_LARGE_FILE_READ",
"value": "False"
},
{
"name": "SECURITY_CENTER_OPT",
"value": "True"
},
{
"name": "CB_LIVE_RESPONSE",
"value": "True"
},
{
"name": "UNINSTALL_CODE",
"value": "True"
},
{
"name": "UBS_OPT_IN",
"value": "True"
},
{
"name": "ALLOW_EXPEDITED_SCAN",
"value": "True"
}
],
"directoryActionRules": [],
"rules": [
{
"id": 1,
"required": True,
"operation": "RANSOM",
"application": {
"value": "COMPANY_BLACK_LIST",
"type": "REPUTATION"
},
"action": "TERMINATE"
}
],
"knownBadHashAutoDeleteDelayMs": None,
"id": -1
},
"name": "Lyon_test",
"description": ""
},
"success": True,
"message": "Success"
}
POLICY_POST_RULE_RESP = {
"ruleId": 21,
"success": True,
"message": "Success"
}
POLICY_MODIFY_RULE_RESP = {
"success": True,
"message": "Success"
}
POLICY_DELETE_RULE_RESP = {
"success": True,
"message": "Success"
}
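The policy GET fixtures above nest the rule list several levels deep; a small helper makes it easier to assert on them. The helper names here are hypothetical (they are not part of the test suite); the key paths follow the fixtures above.

```python
def get_policy_rules(response):
    """Return the list of rule dicts nested inside a policy GET response."""
    return response["policyInfo"]["policy"]["rules"]

def rule_ids(response):
    """Collect the ids of all rules defined in a policy GET response."""
    return [rule["id"] for rule in get_policy_rules(response)]

# Minimal fixture mirroring the shape of POLICY_GET_WITH_NEW_RULE_RESP:
sample = {
    "policyInfo": {
        "policy": {
            "rules": [
                {"id": 1, "operation": "RANSOM", "action": "TERMINATE"},
                {"id": 21, "operation": "RUN", "action": "DENY"},
                {"id": 22, "operation": "RUN", "action": "DENY"},
            ]
        }
    }
}

print(rule_ids(sample))  # -> [1, 21, 22]
```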
# -*- coding: utf-8 -*-
# File: tests/test_auto_merge_pull_request.py
# Repo: messense/badwolf (commit c693785af101f68505769cd712bbf13e37423587), license: MIT, 94 stars
import json
import unittest.mock as mock
from flask import url_for
import badwolf.bitbucket as bitbucket
def test_auto_merge_not_enabled(app, test_client):
app.config['AUTO_MERGE_ENABLED'] = False
payload = json.dumps({
'repository': {
'full_name': 'deepanalyzer/badwolf',
},
'pullrequest': {
'id': 1,
'title': 'PR 1',
'description': 'This is PR1',
},
})
with mock.patch.object(bitbucket.PullRequest, 'get') as pr_get:
test_client.post(
url_for('webhook.webhook_push'),
data=payload,
headers={
'User-Agent': 'Bitbucket-Webhooks/2.0',
'X-Event-Key': 'pullrequest:approved',
}
)
pr_get.assert_not_called()
app.config['AUTO_MERGE_ENABLED'] = True
def test_auto_merge_failure_pr_get_error(app, test_client):
payload = json.dumps({
'repository': {
'full_name': 'deepanalyzer/badwolf',
},
'pullrequest': {
'id': 1,
'title': 'PR 1',
'description': 'This is PR1',
},
})
    with mock.patch.object(bitbucket.PullRequest, 'get') as pr_get, \
            mock.patch.object(bitbucket.BuildStatus, 'get') as status_get:
        pr_get.side_effect = bitbucket.BitbucketAPIError(404, 'not found', 'PR not found')
        test_client.post(
            url_for('webhook.webhook_push'),
            data=payload,
            headers={
                'User-Agent': 'Bitbucket-Webhooks/2.0',
                'X-Event-Key': 'pullrequest:approved',
            }
        )
        # With PullRequest.get failing, the handler should never query the build status.
        status_get.assert_not_called()
def test_auto_merge_skip_merge_skip_in_title(test_client):
payload = json.dumps({
'repository': {
'full_name': 'deepanalyzer/badwolf',
},
'pullrequest': {
'id': 1,
'title': 'PR 1 [merge skip]',
'description': 'PR 1',
},
})
with mock.patch.object(bitbucket.PullRequest, 'get') as pr_get:
test_client.post(
url_for('webhook.webhook_push'),
data=payload,
headers={
'User-Agent': 'Bitbucket-Webhooks/2.0',
'X-Event-Key': 'pullrequest:approved',
}
)
pr_get.assert_not_called()
def test_auto_merge_skip_merge_skip_in_description(test_client):
payload = json.dumps({
'repository': {
'full_name': 'deepanalyzer/badwolf',
},
'pullrequest': {
'id': 1,
'title': 'PR 1',
'description': 'This is PR1. merge skip',
},
})
with mock.patch.object(bitbucket.PullRequest, 'get') as pr_get:
test_client.post(
url_for('webhook.webhook_push'),
data=payload,
headers={
'User-Agent': 'Bitbucket-Webhooks/2.0',
'X-Event-Key': 'pullrequest:approved',
}
)
pr_get.assert_not_called()
def test_auto_merge_skip_wip_in_title(test_client):
payload = json.dumps({
'repository': {
'full_name': 'deepanalyzer/badwolf',
},
'pullrequest': {
'id': 1,
'title': '[wip] PR 1',
'description': 'PR 1',
},
})
with mock.patch.object(bitbucket.PullRequest, 'get') as pr_get:
test_client.post(
url_for('webhook.webhook_push'),
data=payload,
headers={
'User-Agent': 'Bitbucket-Webhooks/2.0',
'X-Event-Key': 'pullrequest:approved',
}
)
pr_get.assert_not_called()
def test_auto_merge_skip_wip_in_description(test_client):
payload = json.dumps({
'repository': {
'full_name': 'deepanalyzer/badwolf',
},
'pullrequest': {
'id': 1,
'title': 'PR 1',
'description': 'This is PR1. WIP',
},
})
with mock.patch.object(bitbucket.PullRequest, 'get') as pr_get:
test_client.post(
url_for('webhook.webhook_push'),
data=payload,
headers={
'User-Agent': 'Bitbucket-Webhooks/2.0',
'X-Event-Key': 'pullrequest:approved',
}
)
pr_get.assert_not_called()
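The four skip tests above all exercise the same string check on the pull request title and description. badwolf's real check lives in its webhook handler; a minimal sketch of the logic these tests imply (function and pattern names are hypothetical) looks like:

```python
import re

# Markers that opt a pull request out of auto merge, matched case-insensitively.
SKIP_RE = re.compile(r"merge\s+skip|\bwip\b", re.IGNORECASE)

def should_skip_auto_merge(title, description):
    """Return True when the PR title or description contains a skip marker."""
    return bool(SKIP_RE.search(title) or SKIP_RE.search(description))

print(should_skip_auto_merge("PR 1 [merge skip]", "PR 1"))   # True
print(should_skip_auto_merge("PR 1", "This is PR1"))         # False
```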
def test_auto_merge_skip_pr_not_in_open_state(test_client):
payload = json.dumps({
'repository': {
'full_name': 'deepanalyzer/badwolf',
},
'pullrequest': {
'id': 1,
'title': 'PR 1',
'description': 'This is PR1',
},
})
    with mock.patch.object(bitbucket.PullRequest, 'get') as pr_get, \
            mock.patch.object(bitbucket.BuildStatus, 'get') as status_get:
        pr_get.return_value = {
            'state': 'CLOSED',
        }
        test_client.post(
            url_for('webhook.webhook_push'),
            data=payload,
            headers={
                'User-Agent': 'Bitbucket-Webhooks/2.0',
                'X-Event-Key': 'pullrequest:approved',
            }
        )
        # A closed pull request should never reach the build status check.
        status_get.assert_not_called()
def test_auto_merge_skip_not_enough_approval(test_client):
payload = json.dumps({
'repository': {
'full_name': 'deepanalyzer/badwolf',
},
'pullrequest': {
'id': 1,
'title': 'PR 1',
'description': 'This is PR1',
},
})
    with mock.patch.object(bitbucket.PullRequest, 'get') as pr_get, \
            mock.patch.object(bitbucket.BuildStatus, 'get') as status_get:
        pr_get.return_value = {
            'state': 'OPEN',
            'participants': [
                {'approved': False},
                {'approved': True},
                {'approved': False},
            ]
        }
        test_client.post(
            url_for('webhook.webhook_push'),
            data=payload,
            headers={
                'User-Agent': 'Bitbucket-Webhooks/2.0',
                'X-Event-Key': 'pullrequest:approved',
            }
        )
        # Without enough approvals the handler should never query the build status.
        status_get.assert_not_called()
def test_auto_merge_success(test_client):
payload = json.dumps({
'repository': {
'full_name': 'deepanalyzer/badwolf',
},
'pullrequest': {
'id': 1,
'title': 'PR 1',
'description': 'This is PR1',
},
})
with mock.patch.object(bitbucket.PullRequest, 'get') as pr_get, \
mock.patch.object(bitbucket.PullRequest, 'merge') as pr_merge, \
mock.patch.object(bitbucket.BuildStatus, 'get') as status_get:
pr_get.return_value = {
'state': 'OPEN',
'participants': [
{'approved': True},
{'approved': True},
{'approved': True},
],
'source': {
'repository': {'full_name': 'deepanalyzer/badwolf'},
'commit': {'hash': '0000000'},
},
}
status_get.return_value = {
'state': 'SUCCESSFUL',
}
pr_merge.return_value = None
test_client.post(
url_for('webhook.webhook_push'),
data=payload,
headers={
'User-Agent': 'Bitbucket-Webhooks/2.0',
'X-Event-Key': 'pullrequest:approved',
}
)
assert status_get.called
assert pr_merge.called
def test_auto_merge_call_error(test_client):
payload = json.dumps({
'repository': {
'full_name': 'deepanalyzer/badwolf',
},
'pullrequest': {
'id': 1,
'title': 'PR 1',
'description': 'This is PR1',
},
})
with mock.patch.object(bitbucket.PullRequest, 'get') as pr_get, \
mock.patch.object(bitbucket.PullRequest, 'merge') as pr_merge, \
mock.patch.object(bitbucket.BuildStatus, 'get') as status_get:
pr_get.return_value = {
'state': 'OPEN',
'participants': [
{'approved': True},
{'approved': True},
{'approved': True},
],
'source': {
'repository': {'full_name': 'deepanalyzer/badwolf'},
'commit': {'hash': '0000000'},
},
}
status_get.return_value = {
'state': 'SUCCESSFUL',
}
pr_merge.side_effect = bitbucket.BitbucketAPIError(401, 'access denied', 'access denied')
test_client.post(
url_for('webhook.webhook_push'),
data=payload,
headers={
'User-Agent': 'Bitbucket-Webhooks/2.0',
'X-Event-Key': 'pullrequest:approved',
}
)
assert status_get.called
assert pr_merge.called
# File: Exercicios de Python/Curso em video/Mundo1/Ex2.py
# Repo: AbraaoLeonardo/EstudandoPython (commit f4378eb08d4a8d1353372e181b1974d037d6ac9c), license: MIT
# My code
nome = str(input('Digite o seu nome: '))
print("É um prazer te conhecer: ", nome)
# Guanabara's version
#nome = str(input('Digite o seu nome: '))
#print("É um prazer te conhecer {}".format(nome))
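Both variants above print the same greeting; since Python 3.6 an f-string expresses it more directly. The name below is a stand-in so the snippet runs without calling `input()`:

```python
# Stand-in value instead of input(), so the example is non-interactive.
nome = "Guanabara"

# Same greeting as the two variants above, written as an f-string:
mensagem = f"É um prazer te conhecer: {nome}"
print(mensagem)  # -> É um prazer te conhecer: Guanabara
```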
# File: excelPDS.py
# Repo: iangelmx/scrapperFB (commit f168569edb5d5d7b493ab803668be2236aafaf42), license: MIT
from libs.pydocsxl import DocumentoExcel
archivoExcel = DocumentoExcel("ejemplo3.xlsx")
archivoExcel.abreArchivo([
    {'id': 'Resumen', 'name': 'Resumen'},
    {'id': 'ProxLib', 'name': 'Próximas Liberaciones'},
    {'id': 'Nuevos', 'name': 'Nuevos'},
    {'id': 'MejorasRel', 'name': 'Mejoras Relevantes'},
    {'id': 'MejorasCal', 'name': 'Mejoras Calidad'},
    {'id': 'MejorasCos', 'name': 'Mejoras Costos'},
    {'id': 'Corporativos', 'name': 'Corporativos'},
    {'id': 'Regulatorios', 'name': 'Regulatorios'},
    {'id': 'Liberados', 'name': 'Liberados'},
    {'id': 'Terminados', 'name': 'Terminados'},
])
# STYLES
titulo = {
'fuente': { 'nombre':"Tahoma", 'color':'0070C2'},
'bold' : True,
'italic': False,
'fill' : { 'colorHex':'ffffff', 'tipo':'solid'},
'size' : 12,
'border': {'tipo':"borders.BORDER_THIN", 'color':'ffffff'},
'alignment':{'horiz':"left",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
subtitulo = {
'fuente': { 'nombre':"Tahoma", 'color':'0070C2'},
'bold' : False,
'italic': False,
'fill' : { 'colorHex':'ffffff', 'tipo':'solid'},
'size' : 12,
'alignment':{'horiz':"left",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
subtitulo2 = {
'fuente': { 'nombre':"Tahoma", 'color':'000000'},
'bold' : True,
'italic': False,
'fill' : { 'colorHex':'D3D3D3', 'tipo':'solid'},
'size' : 11,
'border': {'tipo':"borders.BORDER_THIN", 'color':'ffffff'},
'alignment':{'horiz':"center",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
subtitulo3 = {
'fuente': { 'nombre':"Tahoma", 'color':'000000'},
'bold' : True,
'italic': False,
'fill' : { 'colorHex':'ffffff', 'tipo':'solid'},
'size' : 14,
'alignment':{'horiz':"center",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
tituloAzulBlanco = {
'fuente': { 'nombre':"Tahoma", 'color':'ffffff'},
'bold' : True,
'italic': False,
'fill' : { 'colorHex':'00286C', 'tipo':'solid'},
'size' : 11,
'alignment':{'horiz':"center",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
tituloAzulBlancoBorde = {
'fuente': { 'nombre':"Tahoma", 'color':'ffffff'},
'bold' : True,
'italic': False,
'fill' : { 'colorHex':'00286C', 'tipo':'solid'},
'size' : 10,
'border': {'tipo':"borders.BORDER_THIN", 'color':'ffffff'},
'alignment':{'horiz':"center",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
tituloAzulClaroBlanco = {
'fuente': { 'nombre':"Tahoma", 'color':'ffffff'},
'bold' : True,
'italic': False,
'fill' : { 'colorHex':'0070C2', 'tipo':'solid'},
'size' : 9,
'alignment':{'horiz':"center",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
tituloAzulClaroBlancoBorde = {
'fuente': { 'nombre':"Tahoma", 'color':'ffffff'},
'bold' : True,
'italic': False,
'fill' : { 'colorHex':'0070C2', 'tipo':'solid'},
'size' : 9,
'border': {'tipo':"borders.BORDER_THIN", 'color':'ffffff'},
'alignment':{'horiz':"center",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
tituloRojo = {
'fuente': { 'nombre':"Tahoma", 'color':'ffffff'},
'bold' : True,
'italic': False,
'fill' : { 'colorHex':'D70002', 'tipo':'solid'},
'size' : 9,
'alignment':{'horiz':"center",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
cuerpo = {
'fuente': { 'nombre':"Tahoma", 'color':'000000'},
'bold' : False,
'italic': False,
'fill' : { 'colorHex':'ffffff', 'tipo':'solid'},
'size' : 9,
'border': {'tipo':"borders.BORDER_THIN", 'color':'000000'},
'alignment':{'horiz':"center",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
cuerpoBold = {
'fuente': { 'nombre':"Tahoma", 'color':'000000'},
'bold' : True,
'italic': False,
'fill' : { 'colorHex':'ffffff', 'tipo':'solid'},
'size' : 9,
'border': {'tipo':"borders.BORDER_THIN", 'color':'000000'},
'alignment':{'horiz':"center",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
cuerpoIzq = {
'fuente': { 'nombre':"Tahoma", 'color':'000000'},
'bold' : False,
'italic': False,
'fill' : { 'colorHex':'ffffff', 'tipo':'solid'},
'size' : 9,
'border': {'tipo':"borders.BORDER_THIN", 'color':'000000'},
'alignment':{'horiz':"left",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
cuerpoVertical = {
'fuente': { 'nombre':"Tahoma", 'color':'000000'},
'bold' : False,
'italic': False,
'fill' : { 'colorHex':'E7E7E7', 'tipo':'solid'},
'size' : 9,
'border': {'tipo':"borders.BORDER_THIN", 'color':'000000'},
'alignment':{'horiz':"center",'vert':"center", 'rotation':90,'wrap':True, 'shrink':False,'indent':0}
}
cuerpoNoVertical = {
'fuente': { 'nombre':"Tahoma", 'color':'000000'},
'bold' : False,
'italic': False,
'fill' : { 'colorHex':'E7E7E7', 'tipo':'solid'},
'size' : 9,
'border': {'tipo':"borders.BORDER_THIN", 'color':'000000'},
'alignment':{'horiz':"center",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
cuerpoVerde = {
'fuente': { 'nombre':"Tahoma", 'color':'ffffff'},
'bold' : False,
'italic': False,
'fill' : { 'colorHex':'00AE02', 'tipo':'solid'},
'size' : 9,
'border': {'tipo':"borders.BORDER_THIN", 'color':'000000'},
'alignment':{'horiz':"center",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
cuerpoVerdeClaro = {
'fuente': { 'nombre':"Tahoma", 'color':'000000'},
'bold' : False,
'italic': False,
'fill' : { 'colorHex':'90FA96', 'tipo':'solid'},
'size' : 9,
'border': {'tipo':"borders.BORDER_THIN", 'color':'000000'},
'alignment':{'horiz':"center",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
cuerpoVerdeBold = {
'fuente': { 'nombre':"Tahoma", 'color':'ffffff'},
'bold' : True,
'italic': False,
'fill' : { 'colorHex':'00AE02', 'tipo':'solid'},
'size' : 9,
'border': {'tipo':"borders.BORDER_THIN", 'color':'000000'},
'alignment':{'horiz':"center",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
cuerpoVerdeClaroBold = {
'fuente': { 'nombre':"Tahoma", 'color':'000000'},
'bold' : True,
'italic': False,
'fill' : { 'colorHex':'90FA96', 'tipo':'solid'},
'size' : 9,
'border': {'tipo':"borders.BORDER_THIN", 'color':'000000'},
'alignment':{'horiz':"center",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
rellenoGris = {
'fill' : { 'colorHex':'696969', 'tipo':'solid'}
}
cuerpoRellenoGris = {
'fuente': { 'nombre':"Tahoma", 'color':'ffffff'},
'bold' : True,
'italic': False,
'fill' : { 'colorHex':'696969', 'tipo':'solid'},
'size' : 10,
'border': {'tipo':"borders.BORDER_THIN", 'color':'ffffff'},
'alignment':{'horiz':"center",'vert':"center", 'rotation':0,'wrap':True, 'shrink':False,'indent':0}
}
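The style dicts above repeat most keys from one style to the next; `rellenoGris`, for instance, only changes the fill. A small helper (hypothetical, not part of libs.pydocsxl) can derive variants from a base style instead of copying it wholesale:

```python
from copy import deepcopy

def estilo_derivado(base, **cambios):
    """Return a deep copy of a style dict with selected top-level keys replaced."""
    nuevo = deepcopy(base)
    nuevo.update(cambios)
    return nuevo

# cuerpoBold differs from cuerpo only in the 'bold' flag (keys trimmed for brevity):
cuerpo_base = {
    'fuente': {'nombre': "Tahoma", 'color': '000000'},
    'bold': False,
    'size': 9,
}
cuerpo_negrita = estilo_derivado(cuerpo_base, bold=True)

print(cuerpo_negrita['bold'])  # -> True
print(cuerpo_base['bold'])     # -> False (the base style is untouched)
```

Because `deepcopy` also duplicates the nested dicts, mutating a derived style never leaks into the base style.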
# JSON payload as a Python dict
jsonRoadmap = {
'Resumen':{
'EnProceso':[
{
'MERCADO': 'Empresarial',
'Nuevos':5,
'MejorasIng':2,
'MejorasCal':9,
'MejorasCos':10,
'TOTAL':24
},
{
'MERCADO': 'PyME',
'Nuevos':15,
'MejorasIng':3,
'MejorasCal':17,
'MejorasCos':1,
'TOTAL':28
},
{
'MERCADO': 'Residencial',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
},
{
'MERCADO': 'DT y AR',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
},
{
'MERCADO': 'Finanzas',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
},
{
'MERCADO': 'Recursos Humanos',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
},
{
'MERCADO': 'Regulatorio',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
},
{
'MERCADO': 'TOTAL',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
}
],
'Liberados':[
{
'MERCADO': 'Empresarial2',
'Nuevos':15,
'MejorasIng':3,
'MejorasCal':23,
'MejorasCos':1,
'TOTAL':34
},
{
'MERCADO': 'PyME2',
'Nuevos':7,
'MejorasIng':2,
'MejorasCal':7,
'MejorasCos':3,
'TOTAL':19
},
{
'MERCADO': 'Residencial',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
},
{
'MERCADO': 'DT y AR',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
},
{
'MERCADO': 'Finanzas',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
},
{
'MERCADO': 'Recursos Humanos',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
},
{
'MERCADO': 'Regulatorio',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
},
{
'MERCADO': 'TOTAL',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
}
],
'TOTAL':[
{
'MERCADO': 'Empresarial3',
'Nuevos':15,
'MejorasIng':3,
'MejorasCal':23,
'MejorasCos':1,
'TOTAL':34
},
{
'MERCADO': 'PyME3',
'Nuevos':7,
'MejorasIng':2,
'MejorasCal':7,
'MejorasCos':3,
'TOTAL':19
},
{
'MERCADO': 'Residencial',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
},
{
'MERCADO': 'DT y AR',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
},
{
'MERCADO': 'Finanzas',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
},
{
'MERCADO': 'Recursos Humanos',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
},
{
'MERCADO': 'Regulatorio',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
},
{
'MERCADO': 'TOTAL',
'Nuevos':45,
'MejorasIng':13,
'MejorasCal':31,
'MejorasCos':19,
'TOTAL':127
}
],
'Info':'ROADMAP 2018'
},
'ProxLib':{
'Datos':[
{
'MERCADO':'Empresarial',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
],
'Info':
{
'TABLA':'STATUS DE LAS PRÓXIMAS LIBERACIONES AL 10/09/2018',
'TOTAL':9
}
},
'Nuevos':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], # end of Nuevos data
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Nuevos',
'TOTAL':12
}
}, # end of Nuevos
'MejorasRel':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], # end of MejorasRel data
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Nuevos',
'TOTAL':12
}
}, # end of MejorasRel
'MejorasCal':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], # end of Datos
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Nuevos',
'TOTAL':38
}
}, # end of MejorasCal
'MejorasCos':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], # end of Datos
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Nuevos',
'TOTAL':38
}
}, # end of MejorasCos
'Corporativos':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], #fin datos
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Nuevos',
'TOTAL':12
}
}, #fin nuevos
'Regulatorios':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], #fin datos
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Regulatorios',
'TOTAL':12
}
}, #fin regulatorios
'Liberados':{
'Nuevos':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], #fin datos de nuevos
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Nuevos',
'TOTAL':12
}
}, #fin nuevos
'MejorasRel':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], #fin datos
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Nuevos',
'TOTAL':12
}
}, #fin nuevos
'MejorasCal':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], #fin datos
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Nuevos',
'TOTAL':12
}
}, #fin mejorascal
'MejorasCos':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], #fin datos
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Nuevos',
'TOTAL':12
}
}, #fin mejorascos
'Corporativos':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], #fin datos
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Corporativos',
'TOTAL':38
}
}, #fin corporativos
'Regulatorios':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], #fin datos
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Regulatorios',
'TOTAL':38
}
}, #fin regulatorios
'Info':'Liberados 179'
}, #fin liberados
'Terminados':{
'Nuevos':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], # end of Nuevos data
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Nuevos',
'TOTAL':12
}
}, # end of Nuevos
'MejorasRel':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], # end of data
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Nuevos',
'TOTAL':12
}
}, # end of MejorasRel
'MejorasCal':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], # end of data
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Nuevos',
'TOTAL':12
}
}, # end of MejorasCal
'MejorasCos':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], #fin datos
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Nuevos',
'TOTAL':12
}
}, #fin nuevos
'Corporativos':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], #fin datos
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Corporativos',
'TOTAL':12
}
}, #fin corporativos
'Regulatorios':{
'Datos':[
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'4°',
'SOLUCION':'Grabación Telmex',
'TIPO':'Ingresos',
'VALOR':'38.80',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Empresarial',
'Q':'1° 2019',
'SOLUCION':'Nube SAP HEC',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'10/10/2018',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'PYME-2',
'Q':'4°',
'SOLUCION':'Comunicaciones Unificadas Avanzadas - Microsoft',
'TIPO':'Ingresos',
'VALOR':'38',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
},
{
'MERCADO':'Residencial',
'Q':'4°',
'SOLUCION':'INFINITUM + PREMIUM - (Promociones)',
'TIPO':'Ingresos',
'VALOR':'38.00',
'LIDER':'Oscar Arreola',
'STATUS':'Diseño',
'LIBERACION':'Por Definir',
'DEPENDENCIA':'- Inicio de prueba de concepto CUAD Microsoft en laboratorio. | Mario López/ Jaime Alfaro | 10/09/2018'
}
], # end of Datos
'Info':
{
'TABLA':'ROADMAP 2018: Proyectos Nuevos',
'TOTAL':12
}
},
'Info':'Liberados 179'
} # end of Terminados
}
#### RESUMEN --------------------------¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡----------------------¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡
archivoExcel.mergeCeldas('Resumen',1,1,1,4)
archivoExcel.mergeCeldas('Resumen',1,5,1,6)
archivoExcel.mergeCeldas('Resumen',2,1,2,5)
archivoExcel.mergeCeldas('Resumen',3,1,3,6)
#ROWS
archivoExcel.dimensionaRows('Resumen',1,25)
archivoExcel.dimensionaRows('Resumen',2,25)
archivoExcel.dimensionaRows('Resumen',3,25)
#COLS
archivoExcel.dimensionaCols('Resumen','A',25)
archivoExcel.dimensionaCols('Resumen','B',20)
archivoExcel.dimensionaCols('Resumen','C',20)
archivoExcel.dimensionaCols('Resumen','D',20)
archivoExcel.dimensionaCols('Resumen','E',20)
archivoExcel.dimensionaCols('Resumen','F',20)
# LOGOS
archivoExcel.addImageToCell('Resumen','tmx_logo.png','E1')
# TITLE
archivoExcel.escribeEnHoja('Resumen', 'A', 1, 'Dirección de Desarrollo Tecnológico', traceback=False, formato=titulo)
archivoExcel.escribeEnHoja('Resumen', 'A', 2, 'Subdirección de Desarrollo de Soluciones', traceback=False, formato=subtitulo)
archivoExcel.escribeEnHoja('Resumen', 'E', 1, '', traceback=False, formato=subtitulo)
archivoExcel.escribeEnHoja('Resumen', 'F', 2, '', traceback=False, formato=subtitulo)
archivoExcel.escribeEnHoja('Resumen', 'A', 3, jsonRoadmap['Resumen']['Info'], traceback=False, formato=subtitulo3)
#CUERPO
n = 4
tabls = ['EnProceso','Liberados','TOTAL']
try:
for tabl in tabls:
#for j in range(1,len(jsonRoadmap['Resumen'])):
# blank spacer row
archivoExcel.mergeCeldas('Resumen',n,1,n,6)
archivoExcel.escribeEnHoja('Resumen', 'A', n, '', traceback=False, formato=subtitulo)
# section title row: En Proceso / Liberados / Total
archivoExcel.mergeCeldas('Resumen',n+1,1,n+1,6)
archivoExcel.dimensionaRows('Resumen',n+1,20)
# header formatting (merges)
archivoExcel.mergeCeldas('Resumen',n+2,3,n+2,5)
archivoExcel.mergeCeldas('Resumen',n+2,1,n+3,1)
archivoExcel.mergeCeldas('Resumen',n+2,2,n+3,2)
archivoExcel.mergeCeldas('Resumen',n+2,6,n+3,6)
#Headers
archivoExcel.escribeEnHoja('Resumen', 'A', n+2, 'Cliente', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja('Resumen', 'B', n+2, 'Nuevos', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja('Resumen', 'C', n+2, 'Mejoras', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja('Resumen', 'F', n+2, 'Total', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja('Resumen', 'C', n+3, 'Ingresos', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja('Resumen', 'D', n+3, 'Calidad', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja('Resumen', 'E', n+3, 'Costos', traceback=False, formato=tituloAzulClaroBlancoBorde)
# Write blanks so the borders of the merged header cells do not look crooked
archivoExcel.escribeEnHoja('Resumen', 'A', n+3, '', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja('Resumen', 'B', n+3, '', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja('Resumen', 'D', n+2, '', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja('Resumen', 'E', n+2, '', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja('Resumen', 'F', n+3, '', traceback=False, formato=tituloAzulClaroBlancoBorde)
d = 4
for j in range(0,len(jsonRoadmap['Resumen'][tabl])):
if jsonRoadmap['Resumen'][tabl][j]['MERCADO'] == 'TOTAL':
archivoExcel.escribeEnHoja('Resumen', 'A', d+n, jsonRoadmap['Resumen'][tabl][j]['MERCADO'], traceback=False, formato=cuerpoRellenoGris)
archivoExcel.escribeEnHoja('Resumen', 'B', d+n, jsonRoadmap['Resumen'][tabl][j]['Nuevos'], traceback=False, formato=cuerpoRellenoGris)
archivoExcel.escribeEnHoja('Resumen', 'C', d+n, jsonRoadmap['Resumen'][tabl][j]['MejorasIng'], traceback=False, formato=cuerpoRellenoGris)
archivoExcel.escribeEnHoja('Resumen', 'D', d+n, jsonRoadmap['Resumen'][tabl][j]['MejorasCal'], traceback=False, formato=cuerpoRellenoGris)
archivoExcel.escribeEnHoja('Resumen', 'E', d+n, jsonRoadmap['Resumen'][tabl][j]['MejorasCos'], traceback=False, formato=cuerpoRellenoGris)
archivoExcel.escribeEnHoja('Resumen', 'F', d+n, jsonRoadmap['Resumen'][tabl][j]['TOTAL'], traceback=False, formato=cuerpoRellenoGris)
archivoExcel.dimensionaRows('Resumen', d+n, 18)
else:
archivoExcel.escribeEnHoja('Resumen', 'A', d+n, jsonRoadmap['Resumen'][tabl][j]['MERCADO'], traceback=False, formato=cuerpoNoVertical)
archivoExcel.escribeEnHoja('Resumen', 'B', d+n, jsonRoadmap['Resumen'][tabl][j]['Nuevos'], traceback=False, formato=cuerpo)
archivoExcel.escribeEnHoja('Resumen', 'C', d+n, jsonRoadmap['Resumen'][tabl][j]['MejorasIng'], traceback=False, formato=cuerpo)
archivoExcel.escribeEnHoja('Resumen', 'D', d+n, jsonRoadmap['Resumen'][tabl][j]['MejorasCal'], traceback=False, formato=cuerpo)
archivoExcel.escribeEnHoja('Resumen', 'E', d+n, jsonRoadmap['Resumen'][tabl][j]['MejorasCos'], traceback=False, formato=cuerpo)
archivoExcel.escribeEnHoja('Resumen', 'F', d+n, jsonRoadmap['Resumen'][tabl][j]['TOTAL'], traceback=False, formato=cuerpo)
d+=1
n+= 12
archivoExcel.escribeEnHoja('Resumen', 'A', 5, 'EN PROCESO', traceback=False, formato=tituloAzulBlanco)
archivoExcel.dimensionaRows('Resumen',4,12)
archivoExcel.dimensionaRows('Resumen',5,28)
archivoExcel.escribeEnHoja('Resumen', 'A', 17, 'LIBERADOS', traceback=False, formato=tituloAzulBlanco)
archivoExcel.dimensionaRows('Resumen',16,24)
archivoExcel.dimensionaRows('Resumen',17,28)
archivoExcel.escribeEnHoja('Resumen', 'A', 29, 'TOTAL', traceback=False, formato=tituloAzulBlanco)
archivoExcel.dimensionaRows('Resumen',28,24)
archivoExcel.dimensionaRows('Resumen',29,28)
except Exception:
print('Resumen has no data; n=' + str(n))
#### PROX LIBERACIONES--------------------------¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡----------------------¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡
archivoExcel.mergeCeldas('ProxLib',1,1,1,7)
archivoExcel.mergeCeldas('ProxLib',2,1,2,7)
archivoExcel.mergeCeldas('ProxLib',3,1,3,7)
#Rows' height
archivoExcel.dimensionaRows('ProxLib',1,25)
archivoExcel.dimensionaRows('ProxLib',2,25)
archivoExcel.dimensionaRows('ProxLib',3,25)
archivoExcel.dimensionaRows('ProxLib',4,25)
#Cols' width
archivoExcel.dimensionaCols('ProxLib','A',13)
archivoExcel.dimensionaCols('ProxLib','B',42)
archivoExcel.dimensionaCols('ProxLib','C',13)
archivoExcel.dimensionaCols('ProxLib','D',13)
archivoExcel.dimensionaCols('ProxLib','E',13)
archivoExcel.dimensionaCols('ProxLib','F',13)
archivoExcel.dimensionaCols('ProxLib','G',13)
archivoExcel.dimensionaCols('ProxLib','H',56)
# LOGOS
archivoExcel.addImageToCell('ProxLib','tmx_logo.png','H1')
# TITLE
archivoExcel.escribeEnHoja('ProxLib', 'A', 1, 'Dirección de Desarrollo Tecnológico', traceback=False, formato=titulo)
archivoExcel.escribeEnHoja('ProxLib', 'A', 2, 'Subdirección de Desarrollo de Soluciones', traceback=False, formato=subtitulo)
archivoExcel.escribeEnHoja('ProxLib', 'H', 1, '', traceback=False, formato=subtitulo)
archivoExcel.escribeEnHoja('ProxLib', 'H', 2, '', traceback=False, formato=subtitulo)
# HEADERS
archivoExcel.escribeEnHoja('ProxLib', 'A', 4, '', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja('ProxLib', 'B', 4, 'SOLUCIÓN', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja('ProxLib', 'C', 4, 'TIPO', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja('ProxLib', 'D', 4, 'VALOR (MDP a 3 años)', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja('ProxLib', 'E', 4, 'LÍDER DE PRODUCTO', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja('ProxLib', 'F', 4, 'STATUS', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja('ProxLib', 'G', 4, 'LIBERACIÓN', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja('ProxLib', 'H', 4, 'DEPENDENCIA CRÍTICA', traceback=False, formato=tituloRojo)
# CUERPO
# Declare vars
l1=[]
l2=[]
cl=''
mk = ''
x = 5
y = 0
# In case one of the workbook's sheets has no data (while testing)
try:
archivoExcel.escribeEnHoja('ProxLib', 'A', 3, jsonRoadmap['ProxLib']['Info']['TABLA'], traceback=False, formato=tituloAzulBlancoBorde)
# To put a white border on every column of the merged table-title cells
archivoExcel.escribeEnHoja('ProxLib', 'B', 3, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja('ProxLib', 'C', 3, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja('ProxLib', 'D', 3, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja('ProxLib', 'E', 3, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja('ProxLib', 'F', 3, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja('ProxLib', 'G', 3, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja('ProxLib', 'H', 3, 'TOTAL = '+str(jsonRoadmap['ProxLib']['Info']['TOTAL']), traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja('ProxLib', 'H', len(jsonRoadmap['ProxLib']['Datos'])+5, 'TOTAL = '+str(jsonRoadmap['ProxLib']['Info']['TOTAL']), traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.mergeCeldas('ProxLib',len(jsonRoadmap['ProxLib']['Datos'])+5,1,len(jsonRoadmap['ProxLib']['Datos'])+5,7)
archivoExcel.dimensionaRows('ProxLib',len(jsonRoadmap['ProxLib']['Datos'])+5,20)
archivoExcel.escribeEnHoja('ProxLib', 'A', len(jsonRoadmap['ProxLib']['Datos'])+5, '', traceback=False, formato=rellenoGris)
for j in range(0,len(jsonRoadmap['ProxLib']['Datos'])):
archivoExcel.escribeEnHoja('ProxLib', 'A', j+5, jsonRoadmap['ProxLib']['Datos'][j]['MERCADO'], traceback=False, formato=cuerpoNoVertical)
archivoExcel.escribeEnHoja('ProxLib', 'B', j+5, jsonRoadmap['ProxLib']['Datos'][j]['SOLUCION'], traceback=False, formato=cuerpoBold)
archivoExcel.escribeEnHoja('ProxLib', 'C', j+5, jsonRoadmap['ProxLib']['Datos'][j]['TIPO'], traceback=False, formato=cuerpo)
# TODO: switch this cell to a currency format in Excel. There is no direct helper; the closest is self.__excelFile[ hoja ].number_format = u'"$ "#,###.00'
archivoExcel.escribeEnHoja('ProxLib', 'D', j+5, jsonRoadmap['ProxLib']['Datos'][j]['VALOR'], traceback=False, formato=cuerpo)
archivoExcel.escribeEnHoja('ProxLib', 'E', j+5, jsonRoadmap['ProxLib']['Datos'][j]['LIDER'], traceback=False, formato=cuerpo)
archivoExcel.escribeEnHoja('ProxLib', 'F', j+5, jsonRoadmap['ProxLib']['Datos'][j]['STATUS'], traceback=False, formato=cuerpo)
# bold when the date is no longer "Por Definir"
if '/' in jsonRoadmap['ProxLib']['Datos'][j]['LIBERACION']:
archivoExcel.escribeEnHoja('ProxLib', 'G', j+5, jsonRoadmap['ProxLib']['Datos'][j]['LIBERACION'], traceback=False, formato=cuerpoBold)
else:
archivoExcel.escribeEnHoja('ProxLib', 'G', j+5, jsonRoadmap['ProxLib']['Datos'][j]['LIBERACION'], traceback=False, formato=cuerpo)
archivoExcel.escribeEnHoja('ProxLib', 'H', j+5, jsonRoadmap['ProxLib']['Datos'][j]['DEPENDENCIA'], traceback=False, formato=cuerpoIzq)
except Exception:
print('Prox. Liberaciones has no data')
# loop through the workbook sheets that share the same structure-----------------------¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡¡----------------------¡¡¡¡¡¡¡¡¡¡¡¡¡¡
tabs = ['Nuevos', 'MejorasRel','MejorasCal', 'MejorasCos', 'Corporativos', 'Regulatorios']
# For the tables on the Liberados and Terminados sheets
iniLib = 4
iniTer = 4
# iterate over the workbook's sheets
for tab in tabs:
#CELL FORMAT
#Merge
archivoExcel.mergeCeldas(tab,1,1,1,8)
archivoExcel.mergeCeldas(tab,2,1,2,8)
archivoExcel.mergeCeldas(tab,3,1,3,8)
#Rows' height
archivoExcel.dimensionaRows(tab,1,25)
archivoExcel.dimensionaRows(tab,2,25)
archivoExcel.dimensionaRows(tab,3,25)
archivoExcel.dimensionaRows(tab,4,25)
#Cols' width
archivoExcel.dimensionaCols(tab,'A',5)
archivoExcel.dimensionaCols(tab,'B',5)
archivoExcel.dimensionaCols(tab,'C',42)
archivoExcel.dimensionaCols(tab,'D',13)
archivoExcel.dimensionaCols(tab,'E',13)
archivoExcel.dimensionaCols(tab,'F',13)
archivoExcel.dimensionaCols(tab,'G',13)
archivoExcel.dimensionaCols(tab,'H',13)
archivoExcel.dimensionaCols(tab,'I',56)
# LOGOS
archivoExcel.addImageToCell(tab,'tmx_logo.png','I1')
# TITLE
archivoExcel.escribeEnHoja(tab, 'A', 1, 'Dirección de Desarrollo Tecnológico', traceback=False, formato=titulo)
archivoExcel.escribeEnHoja(tab, 'A', 2, 'Subdirección de Desarrollo de Soluciones', traceback=False, formato=subtitulo)
archivoExcel.escribeEnHoja(tab, 'I', 1, '', traceback=False, formato=subtitulo)
archivoExcel.escribeEnHoja(tab, 'I', 2, '', traceback=False, formato=subtitulo)
# HEADERS
archivoExcel.escribeEnHoja(tab, 'A', 4, '', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja(tab, 'B', 4, 'Q', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja(tab, 'C', 4, 'SOLUCIÓN', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja(tab, 'D', 4, 'TIPO', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja(tab, 'E', 4, 'VALOR (MDP a 3 años)', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja(tab, 'F', 4, 'LÍDER DE PRODUCTO', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja(tab, 'G', 4, 'STATUS', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja(tab, 'H', 4, 'LIBERACIÓN', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja(tab, 'I', 4, 'DEPENDENCIA CRÍTICA', traceback=False, formato=tituloRojo)
# CUERPO
# Declare vars
l1=[]
l2=[]
mk = ''
x = 5
y = 0
l0=[]
x1 = 5
y1 = 0
# In case one of the workbook's sheets has no data (while testing)
try:
archivoExcel.escribeEnHoja(tab, 'A', 3, jsonRoadmap[tab]['Info']['TABLA'], traceback=False, formato=tituloAzulBlancoBorde)
# To put a white border on every column of the merged table-title cells
archivoExcel.escribeEnHoja(tab, 'B', 3, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja(tab, 'C', 3, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja(tab, 'D', 3, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja(tab, 'E', 3, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja(tab, 'F', 3, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja(tab, 'G', 3, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja(tab, 'H', 3, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja(tab, 'I', 3, 'TOTAL = '+str(jsonRoadmap[tab]['Info']['TOTAL']), traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja(tab, 'I', len(jsonRoadmap[tab]['Datos'])+5, 'TOTAL = '+str(jsonRoadmap[tab]['Info']['TOTAL']), traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.mergeCeldas(tab,len(jsonRoadmap[tab]['Datos'])+5,1,len(jsonRoadmap[tab]['Datos'])+5,8)
archivoExcel.dimensionaRows(tab,len(jsonRoadmap[tab]['Datos'])+5,20)
archivoExcel.escribeEnHoja(tab, 'A', len(jsonRoadmap[tab]['Datos'])+5, '', traceback=False, formato=rellenoGris)
for j in range(0,len(jsonRoadmap[tab]['Datos'])):
# FOR MERGING CELLS (COL A)
if mk != jsonRoadmap[tab]['Datos'][j]['MERCADO']:
l1.append(jsonRoadmap[tab]['Datos'][j]['MERCADO'])
l0.append('')
# FOR MERGING CELLS (COL A)
l2.append(jsonRoadmap[tab]['Datos'][j]['MERCADO'])
mk = jsonRoadmap[tab]['Datos'][j]['MERCADO']
# FOR MERGING CELLS (COL B)
l0.append(jsonRoadmap[tab]['Datos'][j]['Q'])
# FOR MERGING CELLS (COL A)
for ele in l1:
if l2.count(ele) > 1:
y = x+l2.count(ele)-1
archivoExcel.mergeCeldas(tab,x,1,y,1)
x+=l2.count(ele)-1
x+=1
# FOR MERGING CELLS (COL B)
for ele in range(0,len(l0)):
if l0[ele] != '' and l0[ele] == l0[ele-1]:
if y1 == 0:
y1 = x1
x1+=1
if ele == len(l0)-1:
archivoExcel.mergeCeldas(tab,y1-1,2,x1-1,2)
elif l0[ele] != '' and l0[ele] != l0[ele-1]:
if y1 != 0:
archivoExcel.mergeCeldas(tab,y1-1,2,x1-1,2)
x1+=1
y1 = 0
# If this is not done at the end (after the merges), borders do not render properly on merged cells
for j in range(0,len(jsonRoadmap[tab]['Datos'])):
archivoExcel.escribeEnHoja(tab, 'A', j+5, jsonRoadmap[tab]['Datos'][j]['MERCADO'], traceback=False, formato=cuerpoVertical)
archivoExcel.escribeEnHoja(tab, 'B', j+5, jsonRoadmap[tab]['Datos'][j]['Q'], traceback=False, formato=cuerpo)
archivoExcel.escribeEnHoja(tab, 'C', j+5, jsonRoadmap[tab]['Datos'][j]['SOLUCION'], traceback=False, formato=cuerpoBold)
archivoExcel.escribeEnHoja(tab, 'D', j+5, jsonRoadmap[tab]['Datos'][j]['TIPO'], traceback=False, formato=cuerpo)
# TODO: switch this cell to a currency format in Excel. There is no direct helper; the closest is self.__excelFile[ hoja ].number_format = u'"$ "#,###.00'
archivoExcel.escribeEnHoja(tab, 'E', j+5, jsonRoadmap[tab]['Datos'][j]['VALOR'], traceback=False, formato=cuerpo)
archivoExcel.escribeEnHoja(tab, 'F', j+5, jsonRoadmap[tab]['Datos'][j]['LIDER'], traceback=False, formato=cuerpo)
archivoExcel.escribeEnHoja(tab, 'G', j+5, jsonRoadmap[tab]['Datos'][j]['STATUS'], traceback=False, formato=cuerpo)
# bold when the date is no longer "Por Definir"
if '/' in jsonRoadmap[tab]['Datos'][j]['LIBERACION']:
archivoExcel.escribeEnHoja(tab, 'H', j+5, jsonRoadmap[tab]['Datos'][j]['LIBERACION'], traceback=False, formato=cuerpoBold)
else:
archivoExcel.escribeEnHoja(tab, 'H', j+5, jsonRoadmap[tab]['Datos'][j]['LIBERACION'], traceback=False, formato=cuerpo)
archivoExcel.escribeEnHoja(tab, 'I', j+5, jsonRoadmap[tab]['Datos'][j]['DEPENDENCIA'], traceback=False, formato=cuerpoIzq)
# FOR THE LIBERADOS AND TERMINADOS SHEETS
#CELL FORMAT
#Merge
tabs2 = ['Liberados','Terminados']
for tab2 in tabs2:
if tab2 == 'Liberados':
ini = iniLib
elif tab2 == 'Terminados':
ini = iniTer
l3=[]
l4=[]
mkLibTer = ''
x2 = 2
y2 = 0
l00=[]
x3 = 2
y3 = 0
archivoExcel.mergeCeldas(tab2,1,1,1,8)
archivoExcel.mergeCeldas(tab2,2,1,2,8)
archivoExcel.mergeCeldas(tab2,3,1,3,8)
#Rows' height
archivoExcel.dimensionaRows(tab2,1,25)
archivoExcel.dimensionaRows(tab2,2,25)
archivoExcel.dimensionaRows(tab2,3,25)
#Cols' width
archivoExcel.dimensionaCols(tab2,'A',5)
archivoExcel.dimensionaCols(tab2,'B',5)
archivoExcel.dimensionaCols(tab2,'C',42)
archivoExcel.dimensionaCols(tab2,'D',13)
archivoExcel.dimensionaCols(tab2,'E',13)
archivoExcel.dimensionaCols(tab2,'F',13)
archivoExcel.dimensionaCols(tab2,'G',13)
archivoExcel.dimensionaCols(tab2,'H',13)
archivoExcel.dimensionaCols(tab2,'I',56)
# LOGOS
archivoExcel.addImageToCell(tab2,'tmx_logo.png','I1')
try:
# TITLE
archivoExcel.escribeEnHoja(tab2, 'A', 1, 'Dirección de Desarrollo Tecnológico', traceback=False, formato=titulo)
archivoExcel.escribeEnHoja(tab2, 'A', 2, 'Subdirección de Desarrollo de Soluciones', traceback=False, formato=subtitulo)
archivoExcel.escribeEnHoja(tab2, 'I', 1, '', traceback=False, formato=subtitulo)
archivoExcel.escribeEnHoja(tab2, 'I', 2, '', traceback=False, formato=subtitulo)
archivoExcel.escribeEnHoja(tab2, 'I', 3, 'TOTAL: '+str(jsonRoadmap[tab2]['Info']), traceback=False, formato=subtitulo2)
archivoExcel.dimensionaRows(tab2,ini,25)
archivoExcel.dimensionaRows(tab2,ini+1,25)
archivoExcel.mergeCeldas(tab2,ini,1,ini,8)
archivoExcel.escribeEnHoja(tab2, 'A', ini, jsonRoadmap[tab2][tab]['Info']['TABLA'], traceback=False, formato=tituloAzulBlancoBorde)
# To put a white border on every column of the merged table-title cells
archivoExcel.escribeEnHoja(tab2, 'B', ini, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja(tab2, 'C', ini, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja(tab2, 'D', ini, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja(tab2, 'E', ini, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja(tab2, 'F', ini, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja(tab2, 'G', ini, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja(tab2, 'H', ini, '', traceback=False, formato=tituloAzulBlancoBorde)
archivoExcel.escribeEnHoja(tab2, 'I', ini, 'TOTAL = '+str(jsonRoadmap[tab2][tab]['Info']['TOTAL']), traceback=False, formato=tituloAzulBlancoBorde)
# HEADERS
archivoExcel.escribeEnHoja(tab2, 'A', ini+1, '', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja(tab2, 'B', ini+1, 'Q', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja(tab2, 'C', ini+1, 'SOLUCIÓN', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja(tab2, 'D', ini+1, 'TIPO', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja(tab2, 'E', ini+1, 'VALOR (MDP a 3 años)', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja(tab2, 'F', ini+1, 'LÍDER DE PRODUCTO', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja(tab2, 'G', ini+1, 'STATUS', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja(tab2, 'H', ini+1, 'LIBERACIÓN', traceback=False, formato=tituloAzulClaroBlancoBorde)
archivoExcel.escribeEnHoja(tab2, 'I', ini+1, 'DEPENDENCIA CRÍTICA', traceback=False, formato=tituloRojo)
# walk the rows of each of the tables on the Liberados sheet
for j in range(0,len(jsonRoadmap[tab2][tab]['Datos'])):
# For merging Col A
if mkLibTer != jsonRoadmap[tab2][tab]['Datos'][j]['MERCADO']:
l3.append(jsonRoadmap[tab2][tab]['Datos'][j]['MERCADO'])
# FOR MERGING CELLS (COL B)
l00.append('')
l4.append(jsonRoadmap[tab2][tab]['Datos'][j]['MERCADO'])
mkLibTer = jsonRoadmap[tab2][tab]['Datos'][j]['MERCADO']
# FOR MERGING CELLS (COL B)
l00.append(jsonRoadmap[tab2][tab]['Datos'][j]['Q'])
# FOR MERGING CELLS (COL A)
for ele in l3:
if l4.count(ele) > 1:
y2 = x2+l4.count(ele)-1
archivoExcel.mergeCeldas(tab2,x2+ini,1,y2+ini,1)
#print(y2+ini)
x2+=l4.count(ele)-1
x2+=1
# FOR MERGING CELLS (COL B)
for ele in range(0,len(l00)):
# ele is a repeated value (within the same Mercado)
if l00[ele] != '' and l00[ele] == l00[ele-1]:
# record the starting row (y3) of a run of repeated values
if y3 == 0:
y3 = x3
x3+=1
# if this is the last ele and it turns out to be a repeat (needs to merge)
if ele == len(l00)-1:
archivoExcel.mergeCeldas(tab2,y3+ini-1,2,x3+ini-1,2)
# ele differs from the previous row's value
elif l00[ele] != '' and l00[ele] != l00[ele-1]:
# if ele's value had been repeating on the rows before this one
if y3 != 0:
archivoExcel.mergeCeldas(tab2,y3+ini-1,2,x3+ini-1,2)
x3+=1
# reset y3
y3 = 0
for j in range(0,len(jsonRoadmap[tab2][tab]['Datos'])):
archivoExcel.escribeEnHoja(tab2, 'A', ini+2+j, jsonRoadmap[tab2][tab]['Datos'][j]['MERCADO'], traceback=False, formato=cuerpoVertical)
archivoExcel.escribeEnHoja(tab2, 'B', ini+2+j, jsonRoadmap[tab2][tab]['Datos'][j]['Q'], traceback=False, formato=cuerpo)
archivoExcel.escribeEnHoja(tab2, 'C', ini+2+j, jsonRoadmap[tab2][tab]['Datos'][j]['SOLUCION'], traceback=False, formato=cuerpoBold)
archivoExcel.escribeEnHoja(tab2, 'D', ini+2+j, jsonRoadmap[tab2][tab]['Datos'][j]['TIPO'], traceback=False, formato=cuerpo)
# TODO: switch this cell to a currency format in Excel. There is no direct helper; the closest is self.__excelFile[ hoja ].number_format = u'"$ "#,###.00'
archivoExcel.escribeEnHoja(tab2, 'E', ini+2+j, jsonRoadmap[tab2][tab]['Datos'][j]['VALOR'], traceback=False, formato=cuerpo)
archivoExcel.escribeEnHoja(tab2, 'F', ini+2+j, jsonRoadmap[tab2][tab]['Datos'][j]['LIDER'], traceback=False, formato=cuerpo)
if tab2 == 'Liberados':
archivoExcel.escribeEnHoja(tab2, 'G', ini+2+j, jsonRoadmap[tab2][tab]['Datos'][j]['STATUS'], traceback=False, formato=cuerpoVerdeClaro)
if '/' in jsonRoadmap[tab2][tab]['Datos'][j]['LIBERACION']:
archivoExcel.escribeEnHoja(tab2, 'H', ini+2+j, jsonRoadmap[tab2][tab]['Datos'][j]['LIBERACION'], traceback=False, formato=cuerpoVerdeClaroBold)
else:
archivoExcel.escribeEnHoja(tab2, 'H', ini+2+j, jsonRoadmap[tab2][tab]['Datos'][j]['LIBERACION'], traceback=False, formato=cuerpoVerdeClaro)
elif tab2 == 'Terminados':
archivoExcel.escribeEnHoja(tab2, 'G', ini+2+j, jsonRoadmap[tab2][tab]['Datos'][j]['STATUS'], traceback=False, formato=cuerpoVerde)
if '/' in jsonRoadmap[tab2][tab]['Datos'][j]['LIBERACION']:
archivoExcel.escribeEnHoja(tab2, 'H', ini+2+j, jsonRoadmap[tab2][tab]['Datos'][j]['LIBERACION'], traceback=False, formato=cuerpoVerdeBold)
else:
archivoExcel.escribeEnHoja(tab2, 'H', ini+2+j, jsonRoadmap[tab2][tab]['Datos'][j]['LIBERACION'], traceback=False, formato=cuerpoVerde)
archivoExcel.escribeEnHoja(tab2, 'I', ini+2+j, jsonRoadmap[tab2][tab]['Datos'][j]['DEPENDENCIA'], traceback=False, formato=cuerpoIzq)
# advance the counter that marks the start of each table
ini+=len(jsonRoadmap[tab2][tab]['Datos'])+4
# footers for each table
archivoExcel.dimensionaRows(tab2,ini-1,25)
archivoExcel.dimensionaRows(tab2,ini-2,20)
archivoExcel.mergeCeldas(tab2,ini-2,1,ini-2,8)
archivoExcel.escribeEnHoja(tab2, 'A', ini-2, '', traceback=False, formato=rellenoGris)
archivoExcel.escribeEnHoja(tab2, 'I', ini-2, 'TOTAL = '+str(jsonRoadmap[tab2][tab]['Info']['TOTAL']), traceback=False, formato=tituloAzulBlancoBorde)
if tab2 == 'Liberados':
iniLib = ini
elif tab2 == 'Terminados':
iniTer = ini
except Exception:
print(str(tab2)+' (tab2) has no data')
except Exception:
print(str(tab)+' has no data')
archivoExcel.guardaArchivo()
print("Done...")
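The column-A merge loops above (the `l1`/`l2` bookkeeping) compute, for each run of equal `MERCADO` values, the row span to pass to `mergeCeldas`. The same spans can be derived more directly with `itertools.groupby`; a minimal standalone sketch (the function name and row offset are illustrative, not from the original script):

```python
from itertools import groupby

def merge_spans(values, first_row=5):
    """Return (start_row, end_row) spans for runs of equal values
    longer than one cell, i.e. the ranges worth merging."""
    spans = []
    row = first_row
    for _, group in groupby(values):
        n = len(list(group))
        if n > 1:
            spans.append((row, row + n - 1))
        row += n
    return spans

print(merge_spans(['Residencial'] * 3 + ['Negocios'] + ['Corporativo'] * 2))
# → [(5, 7), (9, 10)]
```

Single-item runs produce no span, matching the `l2.count(ele) > 1` guard in the original loop.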
50c38d11c107b0bdcb3ea62ef959eeb27ce47805 | 1,577 | py | Python | bin/ehelps.py | ddierschow/cab984 | 108ec0e4358690719ca2dabe0149021cfc7acd93 | ["MIT"]
#!/usr/local/bin/python
import os, sys
import mbdata
jsfun = '''<SCRIPT LANGUAGE="JavaScript">
<!--
function pick(symbol) {
if (window.opener && !window.opener.closed)
window.opener.document.%s.value = symbol;
window.close();
}
// -->
</SCRIPT>
'''
def countries_help():
if len(sys.argv) > 1:
print jsfun % sys.argv[1]
print "<table>"
cols = 5
rows = ((len(mbdata.countries) - 1) / cols) + 1
while rows:
print "<tr>"
for col in range(0, cols):
cc = mbdata.countries.pop(0)
if len(sys.argv) > 1:
print '''<td><a href="javascript:pick('%s')">%s</a></td>''' % cc
else:
print '''<td>%s</td>''' % cc[1]
if not mbdata.countries:
break
print "</tr>"
rows -= 1
print "</table>"
def makes_help():
    # Same picker layout as countries_help; note it also iterates mbdata.countries
    if len(sys.argv) > 1:
        print jsfun % sys.argv[1]
    print "<table>"
    cols = 5
    rows = ((len(mbdata.countries) - 1) / cols) + 1
    while rows:
        print "<tr>"
        for col in range(0, cols):
            cc = mbdata.countries.pop(0)
            if len(sys.argv) > 1:
                print '''<td><a href="javascript:pick('%s')">%s</a></td>''' % cc
            else:
                print '''<td>%s</td>''' % cc[1]
            if not mbdata.countries:
                break
        print "</tr>"
        rows -= 1
    print "</table>"
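Both help tables above lay a flat list out `cols` items per `<tr>` by popping entries and counting rows. The same chunking can be expressed as a small reusable helper — a sketch under the assumption that each entry renders as one `<td>` (names are illustrative):

```python
def chunk_rows(items, cols):
    """Split a flat list into sublists of at most `cols` items,
    one sublist per table row."""
    return [items[i:i + cols] for i in range(0, len(items), cols)]

def render_table(items, cols=5):
    """Render the chunks as HTML table rows, one <td> per item."""
    rows = []
    for row in chunk_rows(items, cols):
        rows.append('<tr>%s</tr>' % ''.join('<td>%s</td>' % c for c in row))
    return '\n'.join(rows)

print(render_table(['US', 'UK', 'DE'], cols=2))
# → <tr><td>US</td><td>UK</td></tr>
#   <tr><td>DE</td></tr>
```

Slicing avoids mutating the source list, unlike the `pop(0)` approach in the script.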
50dcb0d388675f27c4807eb530edb0e388fab663 | 140 | py | Python | AIDEMY/Intro_1_1_5.py | vox256/Codes | c408ef0fbc25af46dacef93b3496985feb98dd5c | ["MIT"]
# 3 + 5
print (3 + 5)
# 3 - 5
print (3 - 5)
# 3 × 5
print (3 * 5)
# 3 ÷ 5
print (3 / 5)
# remainder of 3 divided by 5
print (3 % 5)
# 3 to the power of 5
print (3 ** 5)
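The snippet above covers `+`, `-`, `*`, `/`, `%`, and `**`; the related floor-division operator and `divmod` round out Python's arithmetic set — a small supplement, not part of the original lesson:

```python
print(3 // 5)        # floor division: 0
print(divmod(3, 5))  # quotient and remainder together: (0, 3)
```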
ba08a134164960e15812830fff2f342dcd9fcb20 | 17,246 | py | Python | portraitseg/configurations.py | MatthewKleinsmith/portrait-seg | 0dcdd5952c6d10aa103c4997556559173d922687 | ["MIT"] | 50 | 2018-03-05T10:55:48.000Z | 2021-02-26T16:37:25.000Z
from collections import OrderedDict
from torch.optim import SGD, RMSprop
from torch.nn.functional import cross_entropy
from portraitseg.utils import (cross_entropy2d)
"""Using a dictionary like this is a form of hand-tuning, a slow approach.
Consider using an automatic search method like random search, where your job is
to specify ranges and scales rather than specific hyperparameter values.
"""
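As the note above suggests, random search replaces a hand-written table like this with sampling from ranges and scales. A minimal sketch of drawing configurations — the ranges below are illustrative, not tuned values from this project:

```python
import random

def sample_config(rng=random):
    """Draw one hyperparameter configuration; learning rates are
    sampled on a log scale, momentum linearly."""
    return dict(
        lr=10 ** rng.uniform(-14, -4),      # log-uniform over 1e-14 .. 1e-4
        momentum=rng.uniform(0.0, 0.99),
        weight_decay=rng.choice([0, 0.0005]),
    )

random.seed(0)
configs = [sample_config() for _ in range(3)]
print(configs)
```

Each drawn config could then be passed as keyword arguments to the optimizer, the same way the entries of this dictionary are consumed.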
# TODO: Prepend this dictionary to the Postgres configurations table
configurations = {
1: OrderedDict(
data_aug=None,
lr=1e-10,
momentum=0,
weight_decay=0,
lr_bias=2e-14, # ## oops
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=100000,
loss_fn_kwargs={"size_average": False},
optimizer_kwards={},
update_interval=1,
shuffle_every_epoch=True),
2: OrderedDict(
data_aug=None,
lr=1e-10,
momentum=0.99,
weight_decay=0.0005,
lr_bias=2e-10,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=100000,
loss_fn_kwargs={"size_average": False},
update_interval=1, # 10 in authors' code
shuffle_every_epoch=True),
3: OrderedDict(
data_aug=None,
lr=1e-14,
momentum=0.99,
weight_decay=0.0005,
lr_bias=2e-14,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=100000,
loss_fn_kwargs={"size_average": False},
update_interval=1,
shuffle_every_epoch=True),
4: OrderedDict(
data_aug=None,
lr=1e-14,
momentum=0.99,
weight_decay=0.0005,
lr_bias=2e-14,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d, # Replace with CrossEntropyLoss2d
max_iter=100000,
loss_fn_kwargs={"size_average": False},
update_interval=1,
shuffle_every_epoch=True),
5: OrderedDict(
data_aug=None,
lr=1e-4,  # color channels divided by 255; training diverged to NaN
momentum=0.99,
weight_decay=0.0005,
lr_bias=2e-4,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
update_interval=10,
shuffle_every_epoch=True),
6: OrderedDict(
data_aug=1, # first data aug, see notes
lr=1e-4,
momentum=0.99,
weight_decay=0.0005,
lr_bias=2e-4,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
update_interval=10,
shuffle_every_epoch=True),
7: OrderedDict(
data_aug=None,
lr=1e-4,  # color channels divided by 255; training diverged to NaN
momentum=0.99,
weight_decay=0.0005,
lr_bias=2e-4,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
update_interval=10,
shuffle_every_epoch=True),
8: OrderedDict(
data_aug=None,
lr=1e-10,
momentum=0,
weight_decay=0,
lr_bias=2e-10,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=10,
shuffle_every_epoch=True),
9: OrderedDict(
data_aug=None,
dropout=0, # ############
lr=1e-10,
momentum=0,
weight_decay=0,
lr_bias=2e-14,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=100000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
10: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-10,
momentum=0,
weight_decay=0,
lr_bias=2e-10, # ############
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=100000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
11: OrderedDict(
data_aug=None,
dropout=0.5,
lr=1e-10,
momentum=0,
weight_decay=0,
lr_bias=2e-10,
weight_decay_bias=0,
optimizer=RMSprop, # ############
loss_fn=cross_entropy2d,
max_iter=100000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
# http://pytorch.org/docs/master/optim.html#torch.optim.RMSprop
12: OrderedDict(
data_aug=None,
dropout=0.5,
lr=1e-2,
momentum=0,
weight_decay=0,
lr_bias=2e-2,
weight_decay_bias=0,
optimizer=RMSprop, # ############
loss_fn=cross_entropy2d,
max_iter=100000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
13: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-4, # ############
momentum=0,
weight_decay=0,
lr_bias=2e-10,
weight_decay_bias=0,
optimizer=RMSprop,
loss_fn=cross_entropy2d,
max_iter=100000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
14: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-4,
momentum=0,
weight_decay=0,
lr_bias=1e-4, # ############
weight_decay_bias=0,
optimizer=RMSprop,
loss_fn=cross_entropy2d,
max_iter=200000, # ############
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
15: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-4, # ############
momentum=0,
weight_decay=0,
lr_bias=2e-14,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
16: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-7, # ############
momentum=0,
weight_decay=0,
lr_bias=2e-14,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
17: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-9, # ############
momentum=0,
weight_decay=0,
lr_bias=2e-14,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
18: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-8, # ############
momentum=0,
weight_decay=0,
lr_bias=2e-14,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
19: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-8, # ############
momentum=0.1,
weight_decay=0,
lr_bias=2e-14,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
20: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-8, # ############
momentum=0.2,
weight_decay=0,
lr_bias=2e-14,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
21: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-8, # ############
momentum=0.2,
weight_decay=0,
lr_bias=2e-14,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
22: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-8, # ############
momentum=0.4,
weight_decay=0,
lr_bias=2e-14,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
23: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-8, # ############
momentum=0.5,
weight_decay=0,
lr_bias=2e-14,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
24: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-8, # ############
momentum=0.6,
weight_decay=0,
lr_bias=2e-14,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
25: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-8,
momentum=0.7,
weight_decay=0,
lr_bias=2e-14,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
26: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-10, # ############
momentum=0.1, # ############
weight_decay=0,
lr_bias=2e-14,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
27: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-10,
momentum=0.9, # ############
weight_decay=0,
lr_bias=2e-14,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
28: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-10,
momentum=0.2, # ############
weight_decay=0,
lr_bias=1e-10,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
29: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-10,
momentum=0.5, # ############
weight_decay=0,
lr_bias=1e-10,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
30: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-10,
momentum=0.4, # ############
weight_decay=0,
lr_bias=1e-10,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
31: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-11, # ############
momentum=0.5, # ############
weight_decay=0,
lr_bias=1e-10,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
32: OrderedDict(
data_aug=None,
dropout=0,
lr=1e-7, # ############
momentum=0.5, # ############
weight_decay=0,
lr_bias=1e-10,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True,
divide_by_255=True),
33: OrderedDict( # config 29 with mirroring
data_aug=1, # mirroring
dropout=0,
lr=1e-10,
momentum=0.5,
weight_decay=0,
lr_bias=1e-10,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
34: OrderedDict( # config 29 with random crop
data_aug=2, # random crop
dropout=0,
lr=1e-10,
momentum=0.5,
weight_decay=0,
lr_bias=1e-10,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
35: OrderedDict( # config 29 with random crop and mirroring
data_aug=3, # random crop and mirroring
dropout=0,
lr=1e-10,
momentum=0.5,
weight_decay=0,
lr_bias=1e-10,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=500000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
36: OrderedDict( # config 35 with dropout
data_aug=3,
dropout=0.5,
lr=1e-10,
momentum=0.5,
weight_decay=0,
lr_bias=1e-10,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=500000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
37: OrderedDict( # config 29 with dropout
data_aug=None,
dropout=0.5,
lr=1e-10,
momentum=0.5, # ############
weight_decay=0,
lr_bias=1e-10,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy2d,
max_iter=500000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
38: OrderedDict( # config 35 with loss averaging, and a new CE function
data_aug=3, # random crop and mirroring
dropout=0,
lr=1e-10,
momentum=0.5,
weight_decay=0,
lr_bias=1e-10,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy, # Trying a new cross_entropy function
max_iter=500000,
loss_fn_kwargs=None, # Enable averaging (remove the disable setting)
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True),
39: OrderedDict( # config 33 with PyTorch's cross-entropy
data_aug=1, # mirroring
dropout=0,
lr=1e-10,
momentum=0.5,
weight_decay=0,
lr_bias=1e-10,
weight_decay_bias=0,
optimizer=SGD,
loss_fn=cross_entropy,
max_iter=200000,
loss_fn_kwargs={"size_average": False},
optimizer_kwargs={},
update_interval=1,
shuffle_every_epoch=True,
divide_by_255=False),
}
def get_config(config_id):
config = configurations[config_id]
if 'dropout' not in config:
config['dropout'] = 0.5
return config
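A self-contained sketch of what `get_config` does with the dropout default; the two entries below stand in for the full table above:

```python
from collections import OrderedDict

configurations = {
    1: OrderedDict(lr=1e-10),             # no explicit dropout (like ids 1-8)
    9: OrderedDict(lr=1e-10, dropout=0),  # explicit dropout (like ids 9+)
}

def get_config(config_id):
    config = configurations[config_id]
    if "dropout" not in config:
        config["dropout"] = 0.5  # default matches the original FCN setup
    return config

print(get_config(1)["dropout"])  # 0.5, filled in as the default
print(get_config(9)["dropout"])  # 0, kept as specified
```

Note that the default is written back into the shared `configurations` entry, so it persists across calls; copying the dict first would avoid that side effect.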
| 28.887772 | 79 | 0.558042 | 2,039 | 17,246 | 4.431584 | 0.081412 | 0.094954 | 0.051793 | 0.069057 | 0.885237 | 0.884683 | 0.874281 | 0.871735 | 0.860558 | 0.846724 | 0 | 0.072411 | 0.325757 | 17,246 | 596 | 80 | 28.936242 | 0.704678 | 0.041053 | 0 | 0.893836 | 0 | 0 | 0.029532 | 0 | 0 | 0 | 0 | 0.001678 | 0 | 1 | 0.001712 | false | 0 | 0.006849 | 0 | 0.010274 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ba08b49a089c397c526d450943c3888cd975b985 | 17,800 | py | Python | tests/test_simulate.py | lvayssac/bioptim | 526abff72a8a1b2cb84ccc40c6067b7a18f537e3 | [
"MIT"
] | null | null | null | tests/test_simulate.py | lvayssac/bioptim | 526abff72a8a1b2cb84ccc40c6067b7a18f537e3 | [
"MIT"
] | null | null | null | tests/test_simulate.py | lvayssac/bioptim | 526abff72a8a1b2cb84ccc40c6067b7a18f537e3 | [
"MIT"
] | null | null | null | import pytest
import numpy as np
from bioptim import Shooting
from .utils import TestUtils
def test_merge_phases_one_phase():
# Load pendulum
bioptim_folder = TestUtils.bioptim_folder()
pendulum = TestUtils.load_module(bioptim_folder + "/examples/getting_started/pendulum.py")
ocp = pendulum.prepare_ocp(
biorbd_model_path=bioptim_folder + "/examples/getting_started/pendulum.bioMod",
final_time=2,
n_shooting=10,
)
sol = ocp.solve()
sol_merged = sol.merge_phases()
for key in sol.states:
np.testing.assert_almost_equal(sol_merged.states[key], sol.states[key])
for key in sol.controls:
np.testing.assert_almost_equal(sol_merged.controls[key], sol.controls[key])
def test_merge_phases_multi_phase():
# Load multiphase cube example
bioptim_folder = TestUtils.bioptim_folder()
cube = TestUtils.load_module(bioptim_folder + "/examples/getting_started/example_multiphase.py")
ocp = cube.prepare_ocp(
biorbd_model_path=bioptim_folder + "/examples/getting_started/cube.bioMod",
)
sol = ocp.solve()
sol_merged = sol.merge_phases()
for key in sol.states[0]:
expected = np.concatenate([s[key][:, :-1] for s in sol.states], axis=1)
expected = np.concatenate((expected, sol.states[-1][key][:, -1][:, np.newaxis]), axis=1)
np.testing.assert_almost_equal(sol_merged.states[key], expected)
for key in sol.controls[0]:
expected = np.concatenate([s[key][:, :-1] for s in sol.controls], axis=1)
expected = np.concatenate((expected, sol.controls[-1][key][:, -1][:, np.newaxis]), axis=1)
np.testing.assert_almost_equal(sol_merged.controls[key], expected)
def test_interpolate():
# Load pendulum
bioptim_folder = TestUtils.bioptim_folder()
pendulum = TestUtils.load_module(bioptim_folder + "/examples/getting_started/pendulum.py")
n_shooting = 10
ocp = pendulum.prepare_ocp(
biorbd_model_path=bioptim_folder + "/examples/getting_started/pendulum.bioMod",
final_time=2,
n_shooting=n_shooting,
)
sol = ocp.solve()
n_frames = 100
sol_interp = sol.interpolate(n_frames)
sol_interp_list = sol.interpolate([n_frames])
shapes = (4, 2, 2)
for i, key in enumerate(sol.states):
np.testing.assert_almost_equal(sol_interp.states[key][:, [0, -1]], sol.states[key][:, [0, -1]])
np.testing.assert_almost_equal(sol_interp_list.states[key][:, [0, -1]], sol.states[key][:, [0, -1]])
assert sol_interp.states[key].shape == (shapes[i], n_frames)
assert sol_interp_list.states[key].shape == (shapes[i], n_frames)
assert sol.states[key].shape == (shapes[i], n_shooting + 1)
with pytest.raises(
RuntimeError,
match="There is no controls in the solution. This may happen in previously "
"integrated and interpolated structure",
):
_ = sol_interp.controls
with pytest.raises(
ValueError,
match="n_frames should either be a int to merge_phases phases or a "
"list of int of the number of phases dimension",
):
sol.interpolate([n_frames, n_frames])
def test_interpolate_multiphases():
# Load multiphase cube example
bioptim_folder = TestUtils.bioptim_folder()
cube = TestUtils.load_module(bioptim_folder + "/examples/getting_started/example_multiphase.py")
ocp = cube.prepare_ocp(
biorbd_model_path=bioptim_folder + "/examples/getting_started/cube.bioMod",
)
sol = ocp.solve()
n_frames = 100
n_shooting = [20, 30, 20]
sol_interp = sol.interpolate([n_frames, n_frames, n_frames])
shapes = (6, 3, 3)
for i, key in enumerate(sol.states[0]):
np.testing.assert_almost_equal(sol_interp.states[i][key][:, [0, -1]], sol.states[i][key][:, [0, -1]])
assert sol_interp.states[i][key].shape == (shapes[i], n_frames)
assert sol.states[i][key].shape == (shapes[i], n_shooting[i] + 1)
with pytest.raises(
RuntimeError,
match="There is no controls in the solution. This may happen in previously "
"integrated and interpolated structure",
):
_ = sol_interp.controls
with pytest.raises(
ValueError,
match="n_frames should either be a int to merge_phases phases or a "
"list of int of the number of phases dimension",
):
sol.interpolate([n_frames, n_frames])
def test_interpolate_multiphases_merge_phase():
# Load multiphase cube example
bioptim_folder = TestUtils.bioptim_folder()
cube = TestUtils.load_module(bioptim_folder + "/examples/getting_started/example_multiphase.py")
ocp = cube.prepare_ocp(
biorbd_model_path=bioptim_folder + "/examples/getting_started/cube.bioMod",
)
sol = ocp.solve()
n_frames = 100
n_shooting = [20, 30, 20]
sol_interp = sol.interpolate(n_frames)
shapes = (6, 3, 3)
for i, key in enumerate(sol.states[0]):
expected = np.array([sol.states[0][key][:, 0], sol.states[-1][key][:, -1]]).T
np.testing.assert_almost_equal(sol_interp.states[key][:, [0, -1]], expected)
assert sol_interp.states[key].shape == (shapes[i], n_frames)
assert sol.states[i][key].shape == (shapes[i], n_shooting[i] + 1)
with pytest.raises(
RuntimeError,
match="There is no controls in the solution. This may happen in previously "
"integrated and interpolated structure",
):
_ = sol_interp.controls
def test_integrate():
# Load pendulum
bioptim_folder = TestUtils.bioptim_folder()
pendulum = TestUtils.load_module(bioptim_folder + "/examples/getting_started/pendulum.py")
n_shooting = 10
ocp = pendulum.prepare_ocp(
biorbd_model_path=bioptim_folder + "/examples/getting_started/pendulum.bioMod",
final_time=2,
n_shooting=n_shooting,
)
sol = ocp.solve()
sol_integrated = sol.integrate(shooting_type=Shooting.MULTIPLE, keepdims=False)
shapes = (4, 2, 2)
for i, key in enumerate(sol.states):
np.testing.assert_almost_equal(sol_integrated.states[key][:, [0, -1]], sol.states[key][:, [0, -1]])
assert sol_integrated.states[key].shape == (shapes[i], n_shooting * 5 + 1)
assert sol.states[key].shape == (shapes[i], n_shooting + 1)
with pytest.raises(
RuntimeError,
match="There is no controls in the solution. This may happen in previously "
"integrated and interpolated structure",
):
_ = sol_integrated.controls
def test_integrate_keepdims():
# Load pendulum
bioptim_folder = TestUtils.bioptim_folder()
pendulum = TestUtils.load_module(bioptim_folder + "/examples/getting_started/pendulum.py")
n_shooting = 10
ocp = pendulum.prepare_ocp(
biorbd_model_path=bioptim_folder + "/examples/getting_started/pendulum.bioMod",
final_time=2,
n_shooting=n_shooting,
)
sol = ocp.solve()
with pytest.raises(
ValueError,
match="Shooting.MULTIPLE and keepdims=True cannot be used simultaneously since it would do nothing",
):
_ = sol.integrate(shooting_type=Shooting.MULTIPLE, keepdims=True)
@pytest.mark.parametrize("keepdims", [False, True])
def test_integrate_single_shoot(keepdims):
# Load pendulum
bioptim_folder = TestUtils.bioptim_folder()
pendulum = TestUtils.load_module(bioptim_folder + "/examples/getting_started/pendulum.py")
n_shooting = 10
ocp = pendulum.prepare_ocp(
biorbd_model_path=bioptim_folder + "/examples/getting_started/pendulum.bioMod",
final_time=2,
n_shooting=n_shooting,
)
sol = ocp.solve()
sol_integrated = sol.integrate(keepdims=keepdims)
shapes = (4, 2, 2)
for i, key in enumerate(sol.states):
np.testing.assert_almost_equal(sol_integrated.states[key][:, [0, -1]], sol.states[key][:, [0, -1]])
if keepdims:
np.testing.assert_almost_equal(sol_integrated.states[key], sol.states[key])
assert sol_integrated.states[key].shape == (shapes[i], n_shooting + 1)
else:
assert sol_integrated.states[key].shape == (shapes[i], n_shooting * 5 + 1)
assert sol.states[key].shape == (shapes[i], n_shooting + 1)
with pytest.raises(
RuntimeError,
match="There is no controls in the solution. This may happen in previously "
"integrated and interpolated structure",
):
_ = sol_integrated.controls
@pytest.mark.parametrize("shooting", [Shooting.SINGLE_CONTINUOUS, Shooting.MULTIPLE, Shooting.SINGLE])
@pytest.mark.parametrize("merge", [False, True])
def test_integrate_non_continuous(shooting, merge):
# Load pendulum
bioptim_folder = TestUtils.bioptim_folder()
pendulum = TestUtils.load_module(bioptim_folder + "/examples/getting_started/pendulum.py")
n_shooting = 10
ocp = pendulum.prepare_ocp(
biorbd_model_path=bioptim_folder + "/examples/getting_started/pendulum.bioMod",
final_time=2,
n_shooting=n_shooting,
)
sol = ocp.solve()
if shooting == Shooting.MULTIPLE:
with pytest.raises(
ValueError,
match="Shooting.MULTIPLE and keepdims=True cannot be used simultaneously since it would do nothing",
):
_ = sol.integrate(shooting_type=shooting, continuous=False, keepdims=True)
else:
with pytest.raises(
ValueError,
match="continuous=False and keepdims=True cannot be used simultaneously since "
"it would necessarily change the dimension",
):
_ = sol.integrate(shooting_type=shooting, continuous=False, keepdims=True)
sol_integrated = sol.integrate(shooting_type=shooting, continuous=False, merge_phases=merge, keepdims=False)
shapes = (4, 2, 2)
for i, key in enumerate(sol.states):
np.testing.assert_almost_equal(sol_integrated.states[key][:, [0, -1]], sol.states[key][:, [0, -1]])
np.testing.assert_almost_equal(sol_integrated.states[key][:, [0, -2]], sol.states[key][:, [0, -1]])
assert sol_integrated.states[key].shape == (shapes[i], n_shooting * (5 + 1) + 1)
assert sol.states[key].shape == (shapes[i], n_shooting + 1)
with pytest.raises(
RuntimeError,
match="There is no controls in the solution. This may happen in previously "
"integrated and interpolated structure",
):
_ = sol_integrated.controls
@pytest.mark.parametrize("shooting", [Shooting.SINGLE_CONTINUOUS, Shooting.MULTIPLE, Shooting.SINGLE])
@pytest.mark.parametrize("keepdims", [True, False])
def test_integrate_multiphase(shooting, keepdims):
# Load multiphase cube example
bioptim_folder = TestUtils.bioptim_folder()
cube = TestUtils.load_module(bioptim_folder + "/examples/getting_started/example_multiphase.py")
ocp = cube.prepare_ocp(
biorbd_model_path=bioptim_folder + "/examples/getting_started/cube.bioMod",
)
sol = ocp.solve()
n_shooting = [20, 30, 20]
if shooting == Shooting.MULTIPLE and keepdims:
with pytest.raises(
ValueError,
match="Shooting.MULTIPLE and keepdims=True cannot be used simultaneously since it would do nothing",
):
_ = sol.integrate(shooting_type=shooting, continuous=False, keepdims=keepdims)
return
sol_integrated = sol.integrate(shooting_type=shooting, keepdims=keepdims)
shapes = (6, 3, 3)
for i in range(len(sol_integrated.states)):
for k, key in enumerate(sol.states[i]):
np.testing.assert_almost_equal(sol_integrated.states[i][key][:, [0, -1]], sol.states[i][key][:, [0, -1]])
if keepdims:
np.testing.assert_almost_equal(sol_integrated.states[i][key], sol.states[i][key])
assert sol_integrated.states[i][key].shape == (shapes[k], n_shooting[i] + 1)
else:
assert sol_integrated.states[i][key].shape == (shapes[k], n_shooting[i] * 5 + 1)
assert sol.states[i][key].shape == (shapes[k], n_shooting[i] + 1)
with pytest.raises(
RuntimeError,
match="There is no controls in the solution. This may happen in previously "
"integrated and interpolated structure",
):
_ = sol_integrated.controls
@pytest.mark.parametrize("shooting", [Shooting.SINGLE_CONTINUOUS, Shooting.MULTIPLE, Shooting.SINGLE])
@pytest.mark.parametrize("keepdims", [True, False])
def test_integrate_multiphase_merged(shooting, keepdims):
# Load multiphase cube example
bioptim_folder = TestUtils.bioptim_folder()
cube = TestUtils.load_module(bioptim_folder + "/examples/getting_started/example_multiphase.py")
ocp = cube.prepare_ocp(
biorbd_model_path=bioptim_folder + "/examples/getting_started/cube.bioMod",
)
sol = ocp.solve()
if shooting == Shooting.MULTIPLE and keepdims:
with pytest.raises(
ValueError,
match="Shooting.MULTIPLE and keepdims=True cannot be used simultaneously since it would do nothing",
):
_ = sol.integrate(shooting_type=shooting, continuous=False, keepdims=keepdims)
return
n_shooting = [20, 30, 20]
sol_integrated = sol.integrate(shooting_type=shooting, merge_phases=True, keepdims=keepdims)
shapes = (6, 3, 3)
for k, key in enumerate(sol.states[0]):
expected = np.array([sol.states[0][key][:, 0], sol.states[-1][key][:, -1]]).T
np.testing.assert_almost_equal(sol_integrated.states[key][:, [0, -1]], expected)
if keepdims:
# The interpolation prevents from comparing all points
expected = np.concatenate((sol.states[0][key][:, 0:1], sol.states[-1][key][:, -1][:, np.newaxis]), axis=1)
np.testing.assert_almost_equal(sol_integrated.states[key][:, [0, -1]], expected)
assert sol_integrated.states[key].shape == (shapes[k], sum(n_shooting) + 1)
else:
assert sol_integrated.states[key].shape == (shapes[k], sum(n_shooting) * 5 + 1)
for i in range(len(sol_integrated.states)):
for k, key in enumerate(sol.states[i]):
assert sol.states[i][key].shape == (shapes[k], n_shooting[i] + 1)
with pytest.raises(
RuntimeError,
match="There is no controls in the solution. This may happen in previously "
"integrated and interpolated structure",
):
_ = sol_integrated.controls
@pytest.mark.parametrize("shooting", [Shooting.SINGLE_CONTINUOUS, Shooting.MULTIPLE, Shooting.SINGLE])
def test_integrate_multiphase_non_continuous(shooting):
# Load multiphase cube example
bioptim_folder = TestUtils.bioptim_folder()
cube = TestUtils.load_module(bioptim_folder + "/examples/getting_started/example_multiphase.py")
ocp = cube.prepare_ocp(
biorbd_model_path=bioptim_folder + "/examples/getting_started/cube.bioMod",
)
sol = ocp.solve()
n_shooting = [20, 30, 20]
sol_integrated = sol.integrate(shooting_type=shooting, continuous=False, keepdims=False)
shapes = (6, 3, 3)
for i in range(len(sol_integrated.states)):
for k, key in enumerate(sol.states[i]):
np.testing.assert_almost_equal(sol_integrated.states[i][key][:, [0, -1]], sol.states[i][key][:, [0, -1]])
np.testing.assert_almost_equal(sol_integrated.states[i][key][:, [0, -2]], sol.states[i][key][:, [0, -1]])
assert sol_integrated.states[i][key].shape == (shapes[k], n_shooting[i] * (5 + 1) + 1)
assert sol.states[i][key].shape == (shapes[k], n_shooting[i] + 1)
with pytest.raises(
RuntimeError,
match="There is no controls in the solution. This may happen in previously "
"integrated and interpolated structure",
):
_ = sol_integrated.controls
@pytest.mark.parametrize("shooting", [Shooting.SINGLE_CONTINUOUS, Shooting.MULTIPLE, Shooting.SINGLE])
def test_integrate_multiphase_merged_non_continuous(shooting):
# Load multiphase cube example
bioptim_folder = TestUtils.bioptim_folder()
cube = TestUtils.load_module(bioptim_folder + "/examples/getting_started/example_multiphase.py")
ocp = cube.prepare_ocp(
biorbd_model_path=bioptim_folder + "/examples/getting_started/cube.bioMod",
)
sol = ocp.solve()
if shooting == Shooting.MULTIPLE:
with pytest.raises(
ValueError,
match="Shooting.MULTIPLE and keepdims=True cannot be used simultaneously since it would do nothing",
):
_ = sol.integrate(shooting_type=shooting, continuous=False, keepdims=True)
else:
with pytest.raises(
ValueError,
match="continuous=False and keepdims=True cannot be used simultaneously since "
"it would necessarily change the dimension",
):
_ = sol.integrate(shooting_type=shooting, continuous=False, keepdims=True)
n_shooting = [20, 30, 20]
sol_integrated = sol.integrate(shooting_type=shooting, merge_phases=True, continuous=False, keepdims=False)
shapes = (6, 3, 3)
for k, key in enumerate(sol.states[0]):
expected = np.array([sol.states[0][key][:, 0], sol.states[-1][key][:, -1]]).T
np.testing.assert_almost_equal(sol_integrated.states[key][:, [0, -1]], expected)
np.testing.assert_almost_equal(sol_integrated.states[key][:, [0, -2]], expected)
assert sol_integrated.states[key].shape == (shapes[k], sum(n_shooting) * (5 + 1) + 1)
for i in range(len(sol_integrated.states)):
for k, key in enumerate(sol.states[i]):
assert sol.states[i][key].shape == (shapes[k], n_shooting[i] + 1)
with pytest.raises(
RuntimeError,
match="There is no controls in the solution. This may happen in previously "
"integrated and interpolated structure",
):
_ = sol_integrated.controls
| 38.949672 | 118 | 0.665 | 2,263 | 17,800 | 5.057888 | 0.060981 | 0.05906 | 0.044819 | 0.063603 | 0.961908 | 0.946357 | 0.941901 | 0.922768 | 0.917176 | 0.897169 | 0 | 0.014722 | 0.210056 | 17,800 | 456 | 119 | 39.035088 | 0.799303 | 0.013146 | 0 | 0.7851 | 0 | 0 | 0.174577 | 0.060168 | 0 | 0 | 0 | 0 | 0.12894 | 1 | 0.037249 | false | 0 | 0.011461 | 0 | 0.054441 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e8713aac69a21382600f0c42ac959b009f2c6014 | 38 | py | Python | References/Geovana Neves/TCC_Geovana_Neves_GitHub/SUAVE_modifications/SUAVE-feature-constant_throttle_EAS/trunk/SUAVE/Analyses/Mission/Vary_Cruise/__init__.py | Vinicius-Tanigawa/Undergraduate-Research-Project | e92372f07882484b127d7affe305eeec2238b8a9 | [
"MIT"
] | null | null | null | References/Geovana Neves/TCC_Geovana_Neves_GitHub/SUAVE_modifications/SUAVE-feature-constant_throttle_EAS/trunk/SUAVE/Analyses/Mission/Vary_Cruise/__init__.py | Vinicius-Tanigawa/Undergraduate-Research-Project | e92372f07882484b127d7affe305eeec2238b8a9 | [
"MIT"
] | null | null | null | References/Geovana Neves/TCC_Geovana_Neves_GitHub/SUAVE_modifications/SUAVE-feature-constant_throttle_EAS/trunk/SUAVE/Analyses/Mission/Vary_Cruise/__init__.py | Vinicius-Tanigawa/Undergraduate-Research-Project | e92372f07882484b127d7affe305eeec2238b8a9 | [
"MIT"
] | null | null | null |
from Given_Weight import Given_Weight | 19 | 37 | 0.894737 | 6 | 38 | 5.333333 | 0.666667 | 0.6875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 38 | 2 | 37 | 19 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
e875469efb9af078c45fda7d0a27d17a1f6312f5 | 1,074 | py | Python | dnikma_integrity_checker/helpers/banner.py | dbdness/dnikma-thesis-2021 | 2b1b0223cace22e02ef241191d0c25066a187a7a | [
"BSD-3-Clause"
] | 1 | 2021-06-01T01:31:00.000Z | 2021-06-01T01:31:00.000Z | dnikma_integrity_checker/helpers/banner.py | dbdness/dnikma-thesis-2021 | 2b1b0223cace22e02ef241191d0c25066a187a7a | [
"BSD-3-Clause"
] | 31 | 2021-05-21T15:39:42.000Z | 2021-06-29T11:29:54.000Z | dnikma_integrity_checker/helpers/banner.py | DynnamiK/dnikma-thesis-2021-v2 | d47f206735b4fc32051c95c065cd83a1e28a7d9b | [
"BSD-3-Clause"
] | 1 | 2021-06-01T01:30:45.000Z | 2021-06-01T01:30:45.000Z | """
A custom ASCII banner for the tool.
Generated with https://manytools.org/hacker-tools/ascii-banner/
"""
banner = """
_ _ _
__| |_ _ (_) |___ __ __ _
/ _` | ' \| | / / ' \/ _` |
\__,_|_||_|_|_\_\_|_|_\__,_|
__ __ ___ ___ _ ___ _ _ _ ___ _ _
| \/ |_ _/ __|/ _ \| | |_ _|_ _| |_ ___ __ _ _ _(_) |_ _ _ / __| |_ ___ __| |_____ _ _
| |\/| | || \__ \ (_) | |__ | || ' \ _/ -_) _` | '_| | _| || | | (__| ' \/ -_) _| / / -_) '_|
|_| |_|\_, |___/\__\_\____| |___|_||_\__\___\__, |_| |_|\__|\_, | \___|_||_\___\__|_\_\___|_|
|__/ |___/ |__/
"""
| 63.176471 | 98 | 0.22905 | 17 | 1,074 | 5.058824 | 0.823529 | 0.255814 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.612663 | 1,074 | 16 | 99 | 67.125 | 0.206731 | 0.092179 | 0 | 0 | 1 | 0.363636 | 0.981386 | 0.056877 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e87ecdff69685833ca5454d842413497ff06700f | 25,179 | py | Python | sdk/python/pulumi_rancher2/cluster_alert_group.py | pulumi/pulumi-rancher2 | 7a98af8cf598b711084a7f46c0fe71b43ed7a8ac | [
"ECL-2.0",
"Apache-2.0"
] | 3 | 2020-03-23T15:59:11.000Z | 2021-01-29T00:37:32.000Z | sdk/python/pulumi_rancher2/cluster_alert_group.py | pulumi/pulumi-rancher2 | 7a98af8cf598b711084a7f46c0fe71b43ed7a8ac | [
"ECL-2.0",
"Apache-2.0"
] | 76 | 2020-01-16T20:00:25.000Z | 2022-03-31T20:30:08.000Z | sdk/python/pulumi_rancher2/cluster_alert_group.py | pulumi/pulumi-rancher2 | 7a98af8cf598b711084a7f46c0fe71b43ed7a8ac | [
"ECL-2.0",
"Apache-2.0"
] | 2 | 2020-03-27T17:39:59.000Z | 2020-11-24T23:09:24.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
from . import outputs
from ._inputs import *
__all__ = ['ClusterAlertGroupArgs', 'ClusterAlertGroup']
@pulumi.input_type
class ClusterAlertGroupArgs:
def __init__(__self__, *,
cluster_id: pulumi.Input[str],
annotations: Optional[pulumi.Input[Mapping[str, Any]]] = None,
description: Optional[pulumi.Input[str]] = None,
group_interval_seconds: Optional[pulumi.Input[int]] = None,
group_wait_seconds: Optional[pulumi.Input[int]] = None,
labels: Optional[pulumi.Input[Mapping[str, Any]]] = None,
name: Optional[pulumi.Input[str]] = None,
recipients: Optional[pulumi.Input[Sequence[pulumi.Input['ClusterAlertGroupRecipientArgs']]]] = None,
repeat_interval_seconds: Optional[pulumi.Input[int]] = None):
"""
The set of arguments for constructing a ClusterAlertGroup resource.
:param pulumi.Input[str] cluster_id: The cluster ID where the cluster alert group is created (string)
:param pulumi.Input[Mapping[str, Any]] annotations: The cluster alert group annotations (map)
:param pulumi.Input[str] description: The cluster alert group description (string)
:param pulumi.Input[int] group_interval_seconds: The cluster alert group interval seconds. Default: `180` (int)
:param pulumi.Input[int] group_wait_seconds: The cluster alert group wait seconds. Default: `180` (int)
:param pulumi.Input[Mapping[str, Any]] labels: The cluster alert group labels (map)
:param pulumi.Input[str] name: The cluster alert group name (string)
:param pulumi.Input[Sequence[pulumi.Input['ClusterAlertGroupRecipientArgs']]] recipients: The cluster alert group recipients (list)
:param pulumi.Input[int] repeat_interval_seconds: The cluster alert group repeat interval seconds. Default: `3600` (int)
"""
pulumi.set(__self__, "cluster_id", cluster_id)
if annotations is not None:
pulumi.set(__self__, "annotations", annotations)
if description is not None:
pulumi.set(__self__, "description", description)
if group_interval_seconds is not None:
pulumi.set(__self__, "group_interval_seconds", group_interval_seconds)
if group_wait_seconds is not None:
pulumi.set(__self__, "group_wait_seconds", group_wait_seconds)
if labels is not None:
pulumi.set(__self__, "labels", labels)
if name is not None:
pulumi.set(__self__, "name", name)
if recipients is not None:
pulumi.set(__self__, "recipients", recipients)
if repeat_interval_seconds is not None:
pulumi.set(__self__, "repeat_interval_seconds", repeat_interval_seconds)
@property
@pulumi.getter(name="clusterId")
def cluster_id(self) -> pulumi.Input[str]:
"""
The cluster ID where the cluster alert group is created (string)
"""
return pulumi.get(self, "cluster_id")
@cluster_id.setter
def cluster_id(self, value: pulumi.Input[str]):
pulumi.set(self, "cluster_id", value)
@property
@pulumi.getter
def annotations(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
The cluster alert group annotations (map)
"""
return pulumi.get(self, "annotations")
@annotations.setter
def annotations(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "annotations", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
The cluster alert group description (string)
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter(name="groupIntervalSeconds")
def group_interval_seconds(self) -> Optional[pulumi.Input[int]]:
"""
The cluster alert group interval seconds. Default: `180` (int)
"""
return pulumi.get(self, "group_interval_seconds")
@group_interval_seconds.setter
def group_interval_seconds(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "group_interval_seconds", value)
@property
@pulumi.getter(name="groupWaitSeconds")
def group_wait_seconds(self) -> Optional[pulumi.Input[int]]:
"""
The cluster alert group wait seconds. Default: `180` (int)
"""
return pulumi.get(self, "group_wait_seconds")
@group_wait_seconds.setter
def group_wait_seconds(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "group_wait_seconds", value)
@property
@pulumi.getter
def labels(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
The cluster alert group labels (map)
"""
return pulumi.get(self, "labels")
@labels.setter
def labels(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "labels", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The cluster alert group name (string)
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def recipients(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ClusterAlertGroupRecipientArgs']]]]:
"""
The cluster alert group recipients (list)
"""
return pulumi.get(self, "recipients")
@recipients.setter
def recipients(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ClusterAlertGroupRecipientArgs']]]]):
pulumi.set(self, "recipients", value)
@property
@pulumi.getter(name="repeatIntervalSeconds")
def repeat_interval_seconds(self) -> Optional[pulumi.Input[int]]:
"""
The cluster alert group repeat interval seconds. Default: `3600` (int)
"""
return pulumi.get(self, "repeat_interval_seconds")
@repeat_interval_seconds.setter
def repeat_interval_seconds(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "repeat_interval_seconds", value)
@pulumi.input_type
class _ClusterAlertGroupState:
def __init__(__self__, *,
annotations: Optional[pulumi.Input[Mapping[str, Any]]] = None,
cluster_id: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
group_interval_seconds: Optional[pulumi.Input[int]] = None,
group_wait_seconds: Optional[pulumi.Input[int]] = None,
labels: Optional[pulumi.Input[Mapping[str, Any]]] = None,
name: Optional[pulumi.Input[str]] = None,
recipients: Optional[pulumi.Input[Sequence[pulumi.Input['ClusterAlertGroupRecipientArgs']]]] = None,
repeat_interval_seconds: Optional[pulumi.Input[int]] = None):
"""
Input properties used for looking up and filtering ClusterAlertGroup resources.
:param pulumi.Input[Mapping[str, Any]] annotations: The cluster alert group annotations (map)
:param pulumi.Input[str] cluster_id: The cluster ID where the cluster alert group is created (string)
:param pulumi.Input[str] description: The cluster alert group description (string)
:param pulumi.Input[int] group_interval_seconds: The cluster alert group interval seconds. Default: `180` (int)
:param pulumi.Input[int] group_wait_seconds: The cluster alert group wait seconds. Default: `180` (int)
:param pulumi.Input[Mapping[str, Any]] labels: The cluster alert group labels (map)
:param pulumi.Input[str] name: The cluster alert group name (string)
:param pulumi.Input[Sequence[pulumi.Input['ClusterAlertGroupRecipientArgs']]] recipients: The cluster alert group recipients (list)
:param pulumi.Input[int] repeat_interval_seconds: The cluster alert group repeat interval seconds. Default: `3600` (int)
"""
if annotations is not None:
pulumi.set(__self__, "annotations", annotations)
if cluster_id is not None:
pulumi.set(__self__, "cluster_id", cluster_id)
if description is not None:
pulumi.set(__self__, "description", description)
if group_interval_seconds is not None:
pulumi.set(__self__, "group_interval_seconds", group_interval_seconds)
if group_wait_seconds is not None:
pulumi.set(__self__, "group_wait_seconds", group_wait_seconds)
if labels is not None:
pulumi.set(__self__, "labels", labels)
if name is not None:
pulumi.set(__self__, "name", name)
if recipients is not None:
pulumi.set(__self__, "recipients", recipients)
if repeat_interval_seconds is not None:
pulumi.set(__self__, "repeat_interval_seconds", repeat_interval_seconds)
@property
@pulumi.getter
def annotations(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
The cluster alert group annotations (map)
"""
return pulumi.get(self, "annotations")
@annotations.setter
def annotations(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "annotations", value)
@property
@pulumi.getter(name="clusterId")
def cluster_id(self) -> Optional[pulumi.Input[str]]:
"""
The cluster ID where the cluster alert group is created (string)
"""
return pulumi.get(self, "cluster_id")
@cluster_id.setter
def cluster_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cluster_id", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
The cluster alert group description (string)
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter(name="groupIntervalSeconds")
def group_interval_seconds(self) -> Optional[pulumi.Input[int]]:
"""
The cluster alert group interval seconds. Default: `180` (int)
"""
return pulumi.get(self, "group_interval_seconds")
@group_interval_seconds.setter
def group_interval_seconds(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "group_interval_seconds", value)
@property
@pulumi.getter(name="groupWaitSeconds")
def group_wait_seconds(self) -> Optional[pulumi.Input[int]]:
"""
The cluster alert group wait seconds. Default: `180` (int)
"""
return pulumi.get(self, "group_wait_seconds")
@group_wait_seconds.setter
def group_wait_seconds(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "group_wait_seconds", value)
@property
@pulumi.getter
def labels(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
The cluster alert group labels (map)
"""
return pulumi.get(self, "labels")
@labels.setter
def labels(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "labels", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The cluster alert group name (string)
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def recipients(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ClusterAlertGroupRecipientArgs']]]]:
"""
The cluster alert group recipients (list)
"""
return pulumi.get(self, "recipients")
@recipients.setter
def recipients(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ClusterAlertGroupRecipientArgs']]]]):
pulumi.set(self, "recipients", value)
@property
@pulumi.getter(name="repeatIntervalSeconds")
def repeat_interval_seconds(self) -> Optional[pulumi.Input[int]]:
"""
The cluster alert group repeat interval seconds. Default: `3600` (int)
"""
return pulumi.get(self, "repeat_interval_seconds")
@repeat_interval_seconds.setter
def repeat_interval_seconds(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "repeat_interval_seconds", value)
class ClusterAlertGroup(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
annotations: Optional[pulumi.Input[Mapping[str, Any]]] = None,
cluster_id: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
group_interval_seconds: Optional[pulumi.Input[int]] = None,
group_wait_seconds: Optional[pulumi.Input[int]] = None,
labels: Optional[pulumi.Input[Mapping[str, Any]]] = None,
name: Optional[pulumi.Input[str]] = None,
recipients: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ClusterAlertGroupRecipientArgs']]]]] = None,
repeat_interval_seconds: Optional[pulumi.Input[int]] = None,
__props__=None):
"""
Provides a Rancher v2 Cluster Alert Group resource. This can be used to create Cluster Alert Groups for Rancher v2 environments and retrieve their information.
## Example Usage
```python
import pulumi
import pulumi_rancher2 as rancher2
# Create a new Rancher2 Cluster Alert Group
foo = rancher2.ClusterAlertGroup("foo",
cluster_id="<cluster_id>",
description="Terraform cluster alert group",
group_interval_seconds=300,
repeat_interval_seconds=3600)
```
## Import
A Cluster Alert Group can be imported using the Rancher cluster alert group ID:
```sh
$ pulumi import rancher2:index/clusterAlertGroup:ClusterAlertGroup foo <CLUSTER_ALERT_GROUP_ID>
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[Mapping[str, Any]] annotations: The cluster alert group annotations (map)
:param pulumi.Input[str] cluster_id: The cluster ID where the cluster alert group is created (string)
:param pulumi.Input[str] description: The cluster alert group description (string)
:param pulumi.Input[int] group_interval_seconds: The cluster alert group interval seconds. Default: `180` (int)
:param pulumi.Input[int] group_wait_seconds: The cluster alert group wait seconds. Default: `180` (int)
:param pulumi.Input[Mapping[str, Any]] labels: The cluster alert group labels (map)
:param pulumi.Input[str] name: The cluster alert group name (string)
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ClusterAlertGroupRecipientArgs']]]] recipients: The cluster alert group recipients (list)
:param pulumi.Input[int] repeat_interval_seconds: The cluster alert group repeat interval seconds. Default: `3600` (int)
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: ClusterAlertGroupArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Provides a Rancher v2 Cluster Alert Group resource. This can be used to create Cluster Alert Groups for Rancher v2 environments and retrieve their information.
## Example Usage
```python
import pulumi
import pulumi_rancher2 as rancher2
# Create a new Rancher2 Cluster Alert Group
foo = rancher2.ClusterAlertGroup("foo",
cluster_id="<cluster_id>",
description="Terraform cluster alert group",
group_interval_seconds=300,
repeat_interval_seconds=3600)
```
## Import
A Cluster Alert Group can be imported using the Rancher cluster alert group ID:
```sh
$ pulumi import rancher2:index/clusterAlertGroup:ClusterAlertGroup foo <CLUSTER_ALERT_GROUP_ID>
```
:param str resource_name: The name of the resource.
:param ClusterAlertGroupArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(ClusterAlertGroupArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
annotations: Optional[pulumi.Input[Mapping[str, Any]]] = None,
cluster_id: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
group_interval_seconds: Optional[pulumi.Input[int]] = None,
group_wait_seconds: Optional[pulumi.Input[int]] = None,
labels: Optional[pulumi.Input[Mapping[str, Any]]] = None,
name: Optional[pulumi.Input[str]] = None,
recipients: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ClusterAlertGroupRecipientArgs']]]]] = None,
repeat_interval_seconds: Optional[pulumi.Input[int]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = ClusterAlertGroupArgs.__new__(ClusterAlertGroupArgs)
__props__.__dict__["annotations"] = annotations
if cluster_id is None and not opts.urn:
raise TypeError("Missing required property 'cluster_id'")
__props__.__dict__["cluster_id"] = cluster_id
__props__.__dict__["description"] = description
__props__.__dict__["group_interval_seconds"] = group_interval_seconds
__props__.__dict__["group_wait_seconds"] = group_wait_seconds
__props__.__dict__["labels"] = labels
__props__.__dict__["name"] = name
__props__.__dict__["recipients"] = recipients
__props__.__dict__["repeat_interval_seconds"] = repeat_interval_seconds
alias_opts = pulumi.ResourceOptions(aliases=[pulumi.Alias(type_="rancher2:index/clusterAlterGroup:ClusterAlterGroup")])
opts = pulumi.ResourceOptions.merge(opts, alias_opts)
super(ClusterAlertGroup, __self__).__init__(
'rancher2:index/clusterAlertGroup:ClusterAlertGroup',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
annotations: Optional[pulumi.Input[Mapping[str, Any]]] = None,
cluster_id: Optional[pulumi.Input[str]] = None,
description: Optional[pulumi.Input[str]] = None,
group_interval_seconds: Optional[pulumi.Input[int]] = None,
group_wait_seconds: Optional[pulumi.Input[int]] = None,
labels: Optional[pulumi.Input[Mapping[str, Any]]] = None,
name: Optional[pulumi.Input[str]] = None,
recipients: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ClusterAlertGroupRecipientArgs']]]]] = None,
repeat_interval_seconds: Optional[pulumi.Input[int]] = None) -> 'ClusterAlertGroup':
"""
Get an existing ClusterAlertGroup resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[Mapping[str, Any]] annotations: The cluster alert group annotations (map)
:param pulumi.Input[str] cluster_id: The cluster ID where the cluster alert group is created (string)
:param pulumi.Input[str] description: The cluster alert group description (string)
:param pulumi.Input[int] group_interval_seconds: The cluster alert group interval seconds. Default: `180` (int)
:param pulumi.Input[int] group_wait_seconds: The cluster alert group wait seconds. Default: `180` (int)
:param pulumi.Input[Mapping[str, Any]] labels: The cluster alert group labels (map)
:param pulumi.Input[str] name: The cluster alert group name (string)
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ClusterAlertGroupRecipientArgs']]]] recipients: The cluster alert group recipients (list)
:param pulumi.Input[int] repeat_interval_seconds: The cluster alert group repeat interval seconds. Default: `3600` (int)
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _ClusterAlertGroupState.__new__(_ClusterAlertGroupState)
__props__.__dict__["annotations"] = annotations
__props__.__dict__["cluster_id"] = cluster_id
__props__.__dict__["description"] = description
__props__.__dict__["group_interval_seconds"] = group_interval_seconds
__props__.__dict__["group_wait_seconds"] = group_wait_seconds
__props__.__dict__["labels"] = labels
__props__.__dict__["name"] = name
__props__.__dict__["recipients"] = recipients
__props__.__dict__["repeat_interval_seconds"] = repeat_interval_seconds
return ClusterAlertGroup(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def annotations(self) -> pulumi.Output[Mapping[str, Any]]:
"""
The cluster alert group annotations (map)
"""
return pulumi.get(self, "annotations")
@property
@pulumi.getter(name="clusterId")
def cluster_id(self) -> pulumi.Output[str]:
"""
The cluster ID where the cluster alert group is created (string)
"""
return pulumi.get(self, "cluster_id")
@property
@pulumi.getter
def description(self) -> pulumi.Output[Optional[str]]:
"""
The cluster alert group description (string)
"""
return pulumi.get(self, "description")
@property
@pulumi.getter(name="groupIntervalSeconds")
def group_interval_seconds(self) -> pulumi.Output[Optional[int]]:
"""
The cluster alert group interval seconds. Default: `180` (int)
"""
return pulumi.get(self, "group_interval_seconds")
@property
@pulumi.getter(name="groupWaitSeconds")
def group_wait_seconds(self) -> pulumi.Output[Optional[int]]:
"""
The cluster alert group wait seconds. Default: `180` (int)
"""
return pulumi.get(self, "group_wait_seconds")
@property
@pulumi.getter
def labels(self) -> pulumi.Output[Mapping[str, Any]]:
"""
The cluster alert group labels (map)
"""
return pulumi.get(self, "labels")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
The cluster alert group name (string)
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def recipients(self) -> pulumi.Output[Optional[Sequence['outputs.ClusterAlertGroupRecipient']]]:
"""
The cluster alert group recipients (list)
"""
return pulumi.get(self, "recipients")
@property
@pulumi.getter(name="repeatIntervalSeconds")
def repeat_interval_seconds(self) -> pulumi.Output[Optional[int]]:
"""
The cluster alert group repeat interval seconds. Default: `3600` (int)
"""
return pulumi.get(self, "repeat_interval_seconds")
| 43.942408 | 166 | 0.654911 | 2,790 | 25,179 | 5.694265 | 0.062724 | 0.09278 | 0.093284 | 0.070498 | 0.8743 | 0.860767 | 0.84566 | 0.840184 | 0.830364 | 0.828036 | 0 | 0.005268 | 0.238492 | 25,179 | 572 | 167 | 44.019231 | 0.823302 | 0.287541 | 0 | 0.811146 | 1 | 0 | 0.113791 | 0.05385 | 0 | 0 | 0 | 0 | 0 | 1 | 0.160991 | false | 0.003096 | 0.021672 | 0 | 0.278638 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
fa16e06a7b03b5563dbcaac4a2f03aeb2fba1820 | 273 | py | Python | tests/test_hello_crypto.py | janFiOpelito/hello-crypto | 2c9e859a0c8451c9521c1c967f717bbf52c5dcf8 | [
"MIT"
] | null | null | null | tests/test_hello_crypto.py | janFiOpelito/hello-crypto | 2c9e859a0c8451c9521c1c967f717bbf52c5dcf8 | [
"MIT"
] | 1 | 2020-06-10T07:31:17.000Z | 2020-06-10T07:31:17.000Z | tests/test_hello_crypto.py | janFiOpelito/hello-crypto | 2c9e859a0c8451c9521c1c967f717bbf52c5dcf8 | [
"MIT"
] | null | null | null | """Unit tests for the hello_crypto module."""
from hello_crypto import hello_crypto_def
import pytest
def test_function_1():
assert hello_crypto_def.hello_crypto_world() == "hello_crypto_world"
def test_function_2():
assert hello_crypto_def.add_crypto(3, 2) == 5
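The two one-assert tests above could be collapsed into a single parametrized test. A hedged sketch, using a local stand-in for `hello_crypto_def.add_crypto`, whose behavior is assumed to be plain addition based only on the assert above:

```python
import pytest

def add_crypto(a: int, b: int) -> int:
    # Stand-in for hello_crypto_def.add_crypto; assumed to be plain
    # addition, inferred from add_crypto(3, 2) == 5 in the test above.
    return a + b

@pytest.mark.parametrize("a, b, expected", [
    (3, 2, 5),
    (0, 0, 0),
    (-1, 1, 0),
])
def test_add_crypto(a, b, expected):
    assert add_crypto(a, b) == expected
```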
| 24.818182 | 72 | 0.776557 | 43 | 273 | 4.534884 | 0.465116 | 0.394872 | 0.215385 | 0.205128 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021008 | 0.128205 | 273 | 10 | 73 | 27.3 | 0.798319 | 0.142857 | 0 | 0 | 0 | 0 | 0.078947 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
fa24dc644066e9629f1259eb356f4a01d6b02267 | 2,381 | py | Python | tests/core/commands/testExit.py | rrpg/engine | 989f701b82aa7c73ea98003eed13077e5d6f15f9 | [
"MIT"
] | 2 | 2016-04-07T23:36:46.000Z | 2016-12-20T15:35:17.000Z | tests/core/commands/testExit.py | rrpg/engine | 989f701b82aa7c73ea98003eed13077e5d6f15f9 | [
"MIT"
] | 5 | 2016-02-04T16:28:33.000Z | 2016-03-18T17:02:07.000Z | tests/core/commands/testExit.py | rrpg/engine | 989f701b82aa7c73ea98003eed13077e5d6f15f9 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
import unittest
import tests.common
from core.localisation import _
class exitTests(tests.common.common):
def test_no_place_given_text(self):
self.rpg.setAction([_('EXIT_COMMAND')])
output = self.rpg._runAction()
self.assertEqual(output, _('ERROR_EXIT_NO_PLACE_GIVEN'))
def test_no_place_given_json(self):
self.rpg.setAction([_('EXIT_COMMAND')])
output = self.rpg._runAction(True)
self.assertEqual(output, {"error": {"message": _('ERROR_EXIT_NO_PLACE_GIVEN'), "code": 1}})
def test_place_not_available_text(self):
self.rpg.setAction([_('EXIT_COMMAND'), _('PLACE_TYPE_CAVE')])
output = self.rpg._runAction()
self.assertEqual(output, _('ERROR_ENTER_PLACE_NOT_AVAILABLE'))
def test_place_not_available_json(self):
self.rpg.setAction([_('EXIT_COMMAND'), _('PLACE_TYPE_CAVE')])
output = self.rpg._runAction(True)
self.assertEqual(output, {"error": {"message": _('ERROR_ENTER_PLACE_NOT_AVAILABLE'), "code": 1}})
def test_wrong_place_text(self):
self.rpg.setAction([_('EXIT_COMMAND'), 'dummyplace'])
output = self.rpg._runAction()
self.assertEqual(output, _('ERROR_UNKNOWN_PLACE_TYPE'))
def test_wrong_place_given_json(self):
self.rpg.setAction([_('EXIT_COMMAND'), 'dummyplace'])
output = self.rpg._runAction(True)
self.assertEqual(output, {"error": {"message": _('ERROR_UNKNOWN_PLACE_TYPE'), "code": 1}})
def test_when_still_out_text(self):
self.rpg.setAction([_('EXIT_COMMAND'), _('PLACE_TYPE_DUNGEON')])
output = self.rpg._runAction()
self.assertEqual(output, _('ERROR_ENTER_PLACE_NOT_AVAILABLE'))
def test_when_still_out_json(self):
self.rpg.setAction([_('EXIT_COMMAND'), _('PLACE_TYPE_DUNGEON')])
output = self.rpg._runAction(True)
self.assertEqual(output, {"error": {"message": _('ERROR_ENTER_PLACE_NOT_AVAILABLE'), "code": 1}})
def test_text(self):
self.rpg.setAction([_('ENTER_COMMAND'), _('PLACE_TYPE_DUNGEON')])
self.rpg._runAction()
self.rpg.setAction([_('EXIT_COMMAND'), _('PLACE_TYPE_DUNGEON')])
output = self.rpg._runAction()
self.assertEqual(output, _('EXIT_CONFIRMATION'))
def test_json(self):
self.rpg.setAction([_('ENTER_COMMAND'), _('PLACE_TYPE_DUNGEON')])
self.rpg._runAction(True)
self.rpg.setAction([_('EXIT_COMMAND'), _('PLACE_TYPE_DUNGEON')])
output = self.rpg._runAction(True)
self.assertEqual(output, [_('EXIT_CONFIRMATION')])
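In JSON mode these tests expect errors wrapped in a fixed envelope. A minimal sketch of that shape (the helper name is illustrative, not part of the rrpg engine):

```python
def error_payload(message, code=1):
    # Shape asserted by the *_json tests above:
    # {"error": {"message": <localized message>, "code": <int>}}
    return {"error": {"message": message, "code": code}}

payload = error_payload("ERROR_EXIT_NO_PLACE_GIVEN")
assert payload == {"error": {"message": "ERROR_EXIT_NO_PLACE_GIVEN", "code": 1}}
```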
| 38.403226 | 100 | 0.737925 | 308 | 2,381 | 5.262987 | 0.146104 | 0.10364 | 0.118445 | 0.123381 | 0.889574 | 0.780999 | 0.780999 | 0.776064 | 0.742751 | 0.715608 | 0 | 0.002309 | 0.090718 | 2,381 | 61 | 101 | 39.032787 | 0.74642 | 0.00882 | 0 | 0.530612 | 0 | 0 | 0.264631 | 0.094148 | 0 | 0 | 0 | 0 | 0.204082 | 1 | 0.204082 | false | 0 | 0.081633 | 0 | 0.306122 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3af8d5a49e98ec2c269922b467d0cd5c81af0aee | 7,607 | py | Python | jCMIP/CMIPgenerate.py | jmecki/jCMIP | 4a6193e265a499002d17f201915b6bc40afb800b | [
"MIT"
] | null | null | null | jCMIP/CMIPgenerate.py | jmecki/jCMIP | 4a6193e265a499002d17f201915b6bc40afb800b | [
"MIT"
] | null | null | null | jCMIP/CMIPgenerate.py | jmecki/jCMIP | 4a6193e265a499002d17f201915b6bc40afb800b | [
"MIT"
] | 1 | 2021-06-02T13:05:29.000Z | 2021-06-02T13:05:29.000Z | ##############################################################################
#
# Generates list of CMIP model objects:
#
# #############################################################################
import os
import _pickle as pickle
from . import CMIPobject
# Read in list of models:
def readList(filename):
infile = open(filename,'rb')
Clist = pickle.load(infile)
infile.close()
return Clist
# Save list of models:
def saveList(filename,Clist):
outfile = open(filename,'wb')
pickle.dump(Clist,outfile)
outfile.close()
# If the list doesn't exist yet, generates a list of models for the specified
# CMIP; otherwise adds any missing models to the existing list.
def generateList(cmip,filename):
Models = CMIPobject.getModels(cmip=cmip)
if os.path.isfile(filename):
Clist = readList(filename)
else:
Clist = dict()
print('starting models on list: ' + str(len(Clist)))
for mm in Models:
if not mm in Clist:
Clist[mm] = CMIPobject.CMIPmodel(mm,cmip=cmip)
# Save to file:
saveList(filename,Clist)
print('added ' + mm)
print('total models on list: ' + str(len(Clist)))
# Updates information in the CMIP models list:
def updateList(filename,Model,cmip='',datadir='',savedir='',\
Ogrid='',Omeshmask='',Oreg='',Olon='',Olat='',OflipNS='',OextraT='',OextraWE='',\
Agrid='',Ameshmask='',Areg='',Alon='',Alat='',AflipNS='',AextraT='',AextraWE='',\
Igrid='',Imeshmask='',Ireg='',Ilon='',Ilat='',IflipNS='',IextraT='',IextraWE='',\
Lgrid='',Lmeshmask='',Lreg='',Llon='',Llat='',LflipNS='',LextraT='',LextraWE='',\
notes=''):
Clist = readList(filename)
# Generic Model:
if (cmip != ''):
Clist[Model].cmip = cmip
if (datadir != ''):
Clist[Model].datadir = datadir
if (savedir != ''):
Clist[Model].savedir = savedir
# Ocean:
if (Ogrid != ''):
Clist[Model].Ogrid = Ogrid
if (Omeshmask != ''):
Clist[Model].Omeshmask = Omeshmask
if (Oreg != ''):
Clist[Model].Oreg = Oreg
if (Olon != ''):
Clist[Model].Olon = Olon
if (Olat != ''):
Clist[Model].Olat = Olat
if (OflipNS != ''):
Clist[Model].OflipNS = OflipNS
if (OextraT != ''):
Clist[Model].OextraT = OextraT
if (OextraWE != ''):
Clist[Model].OextraWE = OextraWE
# Atmosphere:
if (Agrid != ''):
Clist[Model].Agrid = Agrid
if (Ameshmask != ''):
Clist[Model].Ameshmask = Ameshmask
if (Areg != ''):
Clist[Model].Areg = Areg
if (Alon != ''):
Clist[Model].Alon = Alon
if (Alat != ''):
Clist[Model].Alat = Alat
if (AflipNS != ''):
Clist[Model].AflipNS = AflipNS
if (AextraT != ''):
Clist[Model].AextraT = AextraT
if (AextraWE != ''):
Clist[Model].AextraWE = AextraWE
# Sea Ice:
if (Igrid != ''):
Clist[Model].Igrid = Igrid
if (Imeshmask != ''):
Clist[Model].Imeshmask = Imeshmask
if (Ireg != ''):
Clist[Model].Ireg = Ireg
if (Ilon != ''):
Clist[Model].Ilon = Ilon
if (Ilat != ''):
Clist[Model].Ilat = Ilat
if (IflipNS != ''):
Clist[Model].IflipNS = IflipNS
if (IextraT != ''):
Clist[Model].IextraT = IextraT
if (IextraWE != ''):
Clist[Model].IextraWE = IextraWE
# Land:
if (Lgrid != ''):
Clist[Model].Lgrid = Lgrid
if (Lmeshmask != ''):
Clist[Model].Lmeshmask = Lmeshmask
if (Lreg != ''):
Clist[Model].Lreg = Lreg
if (Llon != ''):
Clist[Model].Llon = Llon
if (Llat != ''):
Clist[Model].Llat = Llat
if (LflipNS != ''):
Clist[Model].LflipNS = LflipNS
if (LextraT != ''):
Clist[Model].LextraT = LextraT
if (LextraWE != ''):
Clist[Model].LextraWE = LextraWE
# General useful information:
if (notes != ''):
Clist[Model].notes = notes
# Save to file:
saveList(filename,Clist)
# Updates information for all members in list:
def updateAllList(filename,cmip='',datadir='',savedir='',\
Ogrid='',Omeshmask='',Oreg='',Olon='',Olat='',OflipNS='',OextraT='',OextraWE='',\
         Agrid='',Ameshmask='',Areg='',Alon='',Alat='',AflipNS='',AextraT='',AextraWE='',\
         Igrid='',Imeshmask='',Ireg='',Ilon='',Ilat='',IflipNS='',IextraT='',IextraWE='',\
         Lgrid='',Lmeshmask='',Lreg='',Llon='',Llat='',LflipNS='',LextraT='',LextraWE='',\
         notes=''):
    Clist = readList(filename)
    Models = list(Clist.keys())
    for Model in Models:
        # Generic Model:
        if (cmip != ''):
            Clist[Model].cmip = cmip
        if (datadir != ''):
            Clist[Model].datadir = datadir
        if (savedir != ''):
            Clist[Model].savedir = savedir
        # Ocean:
        if (Ogrid != ''):
            Clist[Model].Ogrid = Ogrid
        if (Omeshmask != ''):
            Clist[Model].Omeshmask = Omeshmask
        if (Oreg != ''):
            Clist[Model].Oreg = Oreg
        if (Olon != ''):
            Clist[Model].Olon = Olon
        if (Olat != ''):
            Clist[Model].Olat = Olat
        if (OflipNS != ''):
            Clist[Model].OflipNS = OflipNS
        if (OextraT != ''):
            Clist[Model].OextraT = OextraT
        if (OextraWE != ''):
            Clist[Model].OextraWE = OextraWE
        # Atmosphere:
        if (Agrid != ''):
            Clist[Model].Agrid = Agrid
        if (Ameshmask != ''):
            Clist[Model].Ameshmask = Ameshmask
        if (Areg != ''):
            Clist[Model].Areg = Areg
        if (Alon != ''):
            Clist[Model].Alon = Alon
        if (Alat != ''):
            Clist[Model].Alat = Alat
        if (AflipNS != ''):
            Clist[Model].AflipNS = AflipNS
        if (AextraT != ''):
            Clist[Model].AextraT = AextraT
        if (AextraWE != ''):
            Clist[Model].AextraWE = AextraWE
        # Sea Ice:
        if (Igrid != ''):
            Clist[Model].Igrid = Igrid
        if (Imeshmask != ''):
            Clist[Model].Imeshmask = Imeshmask
        if (Ireg != ''):
            Clist[Model].Ireg = Ireg
        if (Ilon != ''):
            Clist[Model].Ilon = Ilon
        if (Ilat != ''):
            Clist[Model].Ilat = Ilat
        if (IflipNS != ''):
            Clist[Model].IflipNS = IflipNS
        if (IextraT != ''):
            Clist[Model].IextraT = IextraT
        if (IextraWE != ''):
            Clist[Model].IextraWE = IextraWE
        # Land:
        if (Lgrid != ''):
            Clist[Model].Lgrid = Lgrid
        if (Lmeshmask != ''):
            Clist[Model].Lmeshmask = Lmeshmask
        if (Lreg != ''):
            Clist[Model].Lreg = Lreg
        if (Llon != ''):
            Clist[Model].Llon = Llon
        if (Llat != ''):
            Clist[Model].Llat = Llat
        if (LflipNS != ''):
            Clist[Model].LflipNS = LflipNS
        if (LextraT != ''):
            Clist[Model].LextraT = LextraT
        if (LextraWE != ''):
            Clist[Model].LextraWE = LextraWE
        # General useful information:
        if (notes != ''):
            Clist[Model].notes = notes
    # Save to file:
    saveList(filename,Clist) | 31.433884 | 96 | 0.477324 | 707 | 7,607 | 5.134371 | 0.155587 | 0.198347 | 0.023141 | 0.014876 | 0.795317 | 0.795317 | 0.774105 | 0.774105 | 0.774105 | 0.774105 | 0 | 0 | 0.337058 | 7,607 | 242 | 97 | 31.433884 | 0.71981 | 0.064283 | 0 | 0.855615 | 1 | 0 | 0.008216 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.026738 | false | 0 | 0.016043 | 0 | 0.048128 | 0.016043 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7
d7114a0e56ba82745942d9d28b8dbb90b359cb3c | 225 | py | Python | PygameHelper/utils/__init__.py | blankRiot96/PygameHelper | 02b156eedcdc1c8cdae8b8c280a49d7cefd149f7 | [
"MIT"
] | null | null | null | PygameHelper/utils/__init__.py | blankRiot96/PygameHelper | 02b156eedcdc1c8cdae8b8c280a49d7cefd149f7 | [
"MIT"
] | null | null | null | PygameHelper/utils/__init__.py | blankRiot96/PygameHelper | 02b156eedcdc1c8cdae8b8c280a49d7cefd149f7 | [
"MIT"
] | null | null | null | from PygameHelper.utils.noise import (
    noise
)
from PygameHelper.utils.pathfinding import pathfinding
from PygameHelper.utils.formulas import *
from PygameHelper.utils.utils import *
from PygameHelper.utils.draw import *
| 28.125 | 54 | 0.817778 | 27 | 225 | 6.814815 | 0.296296 | 0.434783 | 0.570652 | 0.293478 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115556 | 225 | 7 | 55 | 32.142857 | 0.924623 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.714286 | 0 | 0.714286 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
d71574ba544bedaf8d79321193ea136f5e570b81 | 140 | py | Python | isochrones/parsec/__init__.py | aidantmcb/isochrones | d03a68e1fd9e35f14af904962d88084328dd0fb6 | [
"MIT"
] | null | null | null | isochrones/parsec/__init__.py | aidantmcb/isochrones | d03a68e1fd9e35f14af904962d88084328dd0fb6 | [
"MIT"
] | null | null | null | isochrones/parsec/__init__.py | aidantmcb/isochrones | d03a68e1fd9e35f14af904962d88084328dd0fb6 | [
"MIT"
] | 1 | 2019-07-01T21:36:34.000Z | 2019-07-01T21:36:34.000Z | from .grid import ParsecModelGrid
from .isochrone import Parsec_Isochrone
def download_grids():
    return ParsecModelGrid.download_grids() | 28 | 43 | 0.828571 | 16 | 140 | 7.0625 | 0.625 | 0.230089 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 140 | 5 | 43 | 28 | 0.91129 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0.5 | 0.25 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 8
d772804c2d7072690191e320bcfd8c3f778041a0 | 7,719 | py | Python | lyricsgenius/api/public_methods/search.py | jack-debug/LyricsGenius | 7da13630f75e4ff49aeeb080b6d85ce7b97a3b33 | [
"MIT"
] | 692 | 2018-02-21T15:32:21.000Z | 2022-03-26T09:50:13.000Z | lyricsgenius/api/public_methods/search.py | jack-debug/LyricsGenius | 7da13630f75e4ff49aeeb080b6d85ce7b97a3b33 | [
"MIT"
] | 171 | 2018-02-21T05:39:07.000Z | 2022-03-30T22:00:05.000Z | lyricsgenius/api/public_methods/search.py | jack-debug/LyricsGenius | 7da13630f75e4ff49aeeb080b6d85ce7b97a3b33 | [
"MIT"
] | 170 | 2018-02-22T19:56:44.000Z | 2022-03-18T11:12:37.000Z | class SearchMethods(object):
"""Search methods of the public API."""
def search(self, search_term, per_page=None, page=None, type_=''):
"""Searches Genius.
Args:
search_term (:obj:`str`): A term to search on Genius.
per_page (:obj:`int`, optional): Number of results to
return per request. It can't be more than 50.
page (:obj:`int`, optional): Paginated offset (number of the page).
text_format (:obj:`str`, optional): Text format of the results
('dom', 'html', 'markdown' or 'plain').
type_ (:obj:`str`, optional): Type of item to search for
('song', 'lyric', 'artist', 'album', 'video',
'article', 'user' or 'multi').
Returns:
:obj:`dict`
.. Note::
Specifying no :obj:`type_` parameter (which defaults to ``''``) or
setting it as ``song`` will return the same results. Both will return
songs. The only different is they return the hits in different
keys:
* ``type_=''``: ``response['hits']``
* ``type_='song'``: ``response['sections'][0]['hits']``
By Setting the type as ``multi`` the method will perform a search
for all the other types and return an extra section called ``top hits``.
.. Note::
Instead of calling this method directly and specifying a type, you
can use the alias methods.
"""
if type_ == '':
path = 'search'
else:
path = 'search/' + type_
params = {'q': search_term,
'per_page': per_page,
'page': page}
return self._make_request(path, params_=params, public_api=True)
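For illustration only (not part of the library source): the branch above maps `type_` to an endpoint path and bundles the query parameters before `_make_request` is called. `build_search_request` below is a hypothetical standalone helper that mirrors exactly that request-building step, so it can be exercised without any network access:

```python
def build_search_request(search_term, per_page=None, page=None, type_=''):
    # Mirrors search(): an empty type_ targets the generic 'search'
    # endpoint; any other value is appended as a sub-path.
    if type_ == '':
        path = 'search'
    else:
        path = 'search/' + type_
    params = {'q': search_term, 'per_page': per_page, 'page': page}
    return path, params

path, params = build_search_request('Andy Shauf', per_page=5, type_='song')
print(path)    # search/song
print(params)  # {'q': 'Andy Shauf', 'per_page': 5, 'page': None}
```

Note that unset paging arguments stay `None` in the dict; the real client is responsible for dropping or serializing them.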
    def search_albums(self, search_term, per_page=None, page=None):
        """Searches the albums on Genius.

        Alias for :meth:`search() <PublicAPI.search>`

        Args:
            search_term (:obj:`str`): A term to search on Genius
            per_page (:obj:`int`, optional): Number of results to
                return per request. It can't be more than 50.
            page (:obj:`int`, optional): Paginated offset (number of the page).
            text_format (:obj:`str`, optional): Text format of the results
                ('dom', 'html', 'markdown' or 'plain').

        Returns:
            :obj:`dict`

        """
        endpoint = 'album'
        return self.search(search_term, per_page, page, endpoint)

    def search_articles(self, search_term, per_page=None, page=None):
        """Searches the articles on Genius.

        Alias for :meth:`search() <PublicAPI.search>`

        Args:
            search_term (:obj:`str`): A term to search on Genius
            per_page (:obj:`int`, optional): Number of results to
                return per request. It can't be more than 50.
            page (:obj:`int`, optional): Paginated offset (number of the page).
            text_format (:obj:`str`, optional): Text format of the results
                ('dom', 'html', 'markdown' or 'plain').

        Returns:
            :obj:`dict`

        """
        endpoint = 'article'
        return self.search(search_term, per_page, page, endpoint)

    def search_artists(self, search_term, per_page=None, page=None):
        """Searches the artists on Genius.

        Alias for :meth:`search() <PublicAPI.search>`

        Args:
            search_term (:obj:`str`): A term to search on Genius
            per_page (:obj:`int`, optional): Number of results to
                return per request. It can't be more than 50.
            page (:obj:`int`, optional): Paginated offset (number of the page).
            text_format (:obj:`str`, optional): Text format of the results
                ('dom', 'html', 'markdown' or 'plain').

        Returns:
            :obj:`dict`

        """
        endpoint = 'artist'
        return self.search(search_term, per_page, page, endpoint)

    def search_lyrics(self, search_term, per_page=None, page=None):
        """Searches the lyrics on Genius.

        Alias for :meth:`search() <PublicAPI.search>`

        Args:
            search_term (:obj:`str`): A term to search on Genius
            per_page (:obj:`int`, optional): Number of results to
                return per request. It can't be more than 50.
            page (:obj:`int`, optional): Paginated offset (number of the page).
            text_format (:obj:`str`, optional): Text format of the results
                ('dom', 'html', 'markdown' or 'plain').

        Returns:
            :obj:`dict`

        """
        endpoint = 'lyric'
        return self.search(search_term, per_page, page, endpoint)

    def search_songs(self, search_term, per_page=None, page=None):
        """Searches the songs on Genius.

        Alias for :meth:`search() <PublicAPI.search>`

        Args:
            search_term (:obj:`str`): A term to search on Genius
            per_page (:obj:`int`, optional): Number of results to
                return per request. It can't be more than 50.
            page (:obj:`int`, optional): Paginated offset (number of the page).
            text_format (:obj:`str`, optional): Text format of the results
                ('dom', 'html', 'markdown' or 'plain').

        Returns:
            :obj:`dict`

        """
        endpoint = 'song'
        return self.search(search_term, per_page, page, endpoint)

    def search_users(self, search_term, per_page=None, page=None):
        """Searches the users on Genius.

        Alias for :meth:`search() <PublicAPI.search>`

        Args:
            search_term (:obj:`str`): A term to search on Genius
            per_page (:obj:`int`, optional): Number of results to
                return per request. It can't be more than 50.
            page (:obj:`int`, optional): Paginated offset (number of the page).
            text_format (:obj:`str`, optional): Text format of the results
                ('dom', 'html', 'markdown' or 'plain').

        Returns:
            :obj:`dict`

        """
        endpoint = 'user'
        return self.search(search_term, per_page, page, endpoint)

    def search_videos(self, search_term, per_page=None, page=None):
        """Searches the videos on Genius.

        Alias for :meth:`search() <PublicAPI.search>`

        Args:
            search_term (:obj:`str`): A term to search on Genius
            per_page (:obj:`int`, optional): Number of results to
                return per request. It can't be more than 50.
            page (:obj:`int`, optional): Paginated offset (number of the page).
            text_format (:obj:`str`, optional): Text format of the results
                ('dom', 'html', 'markdown' or 'plain').

        Returns:
            :obj:`dict`

        """
        endpoint = 'video'
        return self.search(search_term, per_page, page, endpoint)

    def search_all(self, search_term, per_page=None, page=None):
        """Searches all types.

        Including: albums, articles, lyrics, songs, users and
        videos.

        Alias for :meth:`search() <PublicAPI.search>`

        Args:
            search_term (:obj:`str`): A term to search on Genius.
            per_page (:obj:`int`, optional): Number of results to
                return per page. It can't be more than 5 for this method.
            page (:obj:`int`, optional): Number of the page.

        Returns:
            :obj:`dict`

        Note:
            This method will also return a ``top hits`` section
            alongside other types.

        """
        endpoint = 'multi'
        return self.search(search_term, per_page, page, endpoint)
| 36.582938 | 84 | 0.558492 | 945 | 7,719 | 4.475132 | 0.125926 | 0.048002 | 0.055332 | 0.072358 | 0.766375 | 0.766375 | 0.756444 | 0.756444 | 0.74864 | 0.728305 | 0 | 0.003432 | 0.320508 | 7,719 | 210 | 85 | 36.757143 | 0.80286 | 0.642052 | 0 | 0.235294 | 0 | 0 | 0.037939 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.264706 | false | 0 | 0 | 0 | 0.558824 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 7 |
ad04fc994cfb7d38520c9103fd7ab3ef7cfdf32f | 23,225 | py | Python | etl/parsers/etw/Microsoft_Windows_DataIntegrityScan.py | IMULMUL/etl-parser | 76b7c046866ce0469cd129ee3f7bb3799b34e271 | [
"Apache-2.0"
] | 104 | 2020-03-04T14:31:31.000Z | 2022-03-28T02:59:36.000Z | etl/parsers/etw/Microsoft_Windows_DataIntegrityScan.py | IMULMUL/etl-parser | 76b7c046866ce0469cd129ee3f7bb3799b34e271 | [
"Apache-2.0"
] | 7 | 2020-04-20T09:18:39.000Z | 2022-03-19T17:06:19.000Z | etl/parsers/etw/Microsoft_Windows_DataIntegrityScan.py | IMULMUL/etl-parser | 76b7c046866ce0469cd129ee3f7bb3799b34e271 | [
"Apache-2.0"
] | 16 | 2020-03-05T18:55:59.000Z | 2022-03-01T10:19:28.000Z | # -*- coding: utf-8 -*-
"""
Microsoft-Windows-DataIntegrityScan
GUID : 13bc4371-4e21-4e46-a84f-8c0ffb548ced
"""
from construct import Int8sl, Int8ul, Int16ul, Int16sl, Int32sl, Int32ul, Int64sl, Int64ul, Bytes, Double, Float32l, Struct
from etl.utils import WString, CString, SystemTime, Guid
from etl.dtyp import Sid
from etl.parsers.etw.core import Etw, declare, guid
@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=11, version=0)
class Microsoft_Windows_DataIntegrityScan_11_0(Etw):
    pattern = Struct(
        "DiskNumber" / Int32ul,
        "DiskGuid" / Guid,
        "PortPoolId" / Guid,
        "PortDiskId" / Guid
    )
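A stdlib-only sketch (not part of the parser library) of what the declarative pattern above means on the wire, assuming a little-endian payload laid out in the same field order as `Microsoft_Windows_DataIntegrityScan_11_0`: a 4-byte unsigned disk number followed by three 16-byte GUIDs. The payload bytes here are fabricated for illustration:

```python
import struct
import uuid

# Hypothetical event payload: DiskNumber = 7, then three all-zero GUIDs.
payload = struct.pack('<I', 7) + uuid.UUID(int=0).bytes_le * 3

# Decode the fields manually, the way the Struct pattern would.
disk_number = struct.unpack_from('<I', payload, 0)[0]
disk_guid = uuid.UUID(bytes_le=payload[4:20])
print(disk_number)  # 7
print(disk_guid)    # 00000000-0000-0000-0000-000000000000
```

The `construct` patterns in this file express the same layout declaratively, including variable-length fields such as `Bytes(lambda this: this.VolumeNameLength)` seen in later classes.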
@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=12, version=0)
class Microsoft_Windows_DataIntegrityScan_12_0(Etw):
    pattern = Struct(
        "DiskNumber" / Int32ul,
        "DiskGuid" / Guid,
        "PortPoolId" / Guid,
        "PortDiskId" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=21, version=0)
class Microsoft_Windows_DataIntegrityScan_21_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "VolumeGuid" / Guid,
        "DiskNumber" / Int32ul,
        "DiskGuid" / Guid,
        "PortPoolId" / Guid,
        "PortDiskId" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=22, version=0)
class Microsoft_Windows_DataIntegrityScan_22_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "HResult" / Int32sl,
        "DirectoryCount" / Int64ul,
        "FileCount" / Int64ul,
        "StreamCount" / Int64ul,
        "DataCount" / Int64ul,
        "FsctlCount" / Int64ul,
        "TotalBytesRepaired" / Int64ul,
        "TotalBytesFailed" / Int64ul,
        "MetadataBytesProcessed" / Int64ul,
        "DataBytesProcessed" / Int64ul,
        "TotalMetadataBytesInUse" / Int64ul,
        "TotalDataBytesInUse" / Int64ul,
        "TotalTimeTaken" / Int64ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=23, version=0)
class Microsoft_Windows_DataIntegrityScan_23_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "HResult" / Int32sl,
        "DirectoryCount" / Int64ul,
        "FileCount" / Int64ul,
        "StreamCount" / Int64ul,
        "DataCount" / Int64ul,
        "FsctlCount" / Int64ul,
        "TotalBytesRepaired" / Int64ul,
        "TotalBytesFailed" / Int64ul,
        "MetadataBytesProcessed" / Int64ul,
        "DataBytesProcessed" / Int64ul,
        "TotalMetadataBytesInUse" / Int64ul,
        "TotalDataBytesInUse" / Int64ul,
        "TotalTimeTaken" / Int64ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=24, version=0)
class Microsoft_Windows_DataIntegrityScan_24_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "HResult" / Int32sl,
        "Count" / Int64ul,
        "FileNameLength" / Int16ul,
        "FileName" / Bytes(lambda this: this.FileNameLength),
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=25, version=0)
class Microsoft_Windows_DataIntegrityScan_25_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "HResult" / Int32sl,
        "DirectoryCount" / Int64ul,
        "FileCount" / Int64ul,
        "StreamCount" / Int64ul,
        "DataCount" / Int64ul,
        "FsctlCount" / Int64ul,
        "TotalBytesRepaired" / Int64ul,
        "TotalBytesFailed" / Int64ul,
        "MetadataBytesProcessed" / Int64ul,
        "DataBytesProcessed" / Int64ul,
        "TotalMetadataBytesInUse" / Int64ul,
        "TotalDataBytesInUse" / Int64ul,
        "TotalTimeTaken" / Int64ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=26, version=0)
class Microsoft_Windows_DataIntegrityScan_26_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "HResult" / Int32sl,
        "DirectoryCount" / Int64ul,
        "FileCount" / Int64ul,
        "StreamCount" / Int64ul,
        "DataCount" / Int64ul,
        "FsctlCount" / Int64ul,
        "TotalBytesRepaired" / Int64ul,
        "TotalBytesFailed" / Int64ul,
        "MetadataBytesProcessed" / Int64ul,
        "DataBytesProcessed" / Int64ul,
        "TotalMetadataBytesInUse" / Int64ul,
        "TotalDataBytesInUse" / Int64ul,
        "TotalTimeTaken" / Int64ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=27, version=0)
class Microsoft_Windows_DataIntegrityScan_27_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "DirectoryCount" / Int64ul,
        "FileCount" / Int64ul,
        "StreamCount" / Int64ul,
        "DataCount" / Int64ul,
        "FsctlCount" / Int64ul,
        "TotalBytesRepaired" / Int64ul,
        "TotalBytesFailed" / Int64ul,
        "MetadataBytesProcessed" / Int64ul,
        "DataBytesProcessed" / Int64ul,
        "TotalMetadataBytesInUse" / Int64ul,
        "TotalDataBytesInUse" / Int64ul,
        "TotalTimeTaken" / Int64ul,
        "PercentComplete" / Int16ul,
        "VolumeGuid" / Guid
    )
@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=54, version=0)
class Microsoft_Windows_DataIntegrityScan_54_0(Etw):
    pattern = Struct(
        "FileNameLength" / Int16ul,
        "FileName" / Bytes(lambda this: this.FileNameLength),
        "InternalFileReference" / Int64ul,
        "FileOffset" / Int64ul,
        "Length" / Int64ul,
        "BytesRepaired" / Int64ul,
        "BytesFailed" / Int64ul,
        "Status" / Int32ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=55, version=0)
class Microsoft_Windows_DataIntegrityScan_55_0(Etw):
    pattern = Struct(
        "FileNameLength" / Int16ul,
        "FileName" / Bytes(lambda this: this.FileNameLength),
        "InternalFileReference" / Int64ul,
        "FileOffset" / Int64ul,
        "Length" / Int64ul,
        "BytesRepaired" / Int64ul,
        "BytesFailed" / Int64ul,
        "Status" / Int32ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=56, version=0)
class Microsoft_Windows_DataIntegrityScan_56_0(Etw):
    pattern = Struct(
        "FileNameLength" / Int16ul,
        "FileName" / Bytes(lambda this: this.FileNameLength),
        "InternalFileReference" / Int64ul,
        "FileOffset" / Int64ul,
        "Length" / Int64ul,
        "BytesRepaired" / Int64ul,
        "BytesFailed" / Int64ul,
        "Status" / Int32ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=57, version=0)
class Microsoft_Windows_DataIntegrityScan_57_0(Etw):
    pattern = Struct(
        "FileNameLength" / Int16ul,
        "FileName" / Bytes(lambda this: this.FileNameLength),
        "InternalFileReference" / Int64ul,
        "FileOffset" / Int64ul,
        "Length" / Int64ul,
        "BytesRepaired" / Int64ul,
        "BytesFailed" / Int64ul,
        "Status" / Int32ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=58, version=0)
class Microsoft_Windows_DataIntegrityScan_58_0(Etw):
    pattern = Struct(
        "FileNameLength" / Int16ul,
        "FileName" / Bytes(lambda this: this.FileNameLength),
        "InternalFileReference" / Int64ul,
        "FileOffset" / Int64ul,
        "Length" / Int64ul,
        "BytesRepaired" / Int64ul,
        "BytesFailed" / Int64ul,
        "Status" / Int32ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=59, version=0)
class Microsoft_Windows_DataIntegrityScan_59_0(Etw):
    pattern = Struct(
        "FileNameLength" / Int16ul,
        "FileName" / Bytes(lambda this: this.FileNameLength),
        "InternalFileReference" / Int64ul,
        "FileOffset" / Int64ul,
        "Length" / Int64ul,
        "BytesRepaired" / Int64ul,
        "BytesFailed" / Int64ul,
        "Status" / Int32ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=60, version=0)
class Microsoft_Windows_DataIntegrityScan_60_0(Etw):
    pattern = Struct(
        "FileNameLength" / Int16ul,
        "FileName" / Bytes(lambda this: this.FileNameLength),
        "InternalFileReference" / Int64ul,
        "FileOffset" / Int64ul,
        "Length" / Int64ul,
        "BytesRepaired" / Int64ul,
        "BytesFailed" / Int64ul,
        "Status" / Int32ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=61, version=0)
class Microsoft_Windows_DataIntegrityScan_61_0(Etw):
    pattern = Struct(
        "FileNameLength" / Int16ul,
        "FileName" / Bytes(lambda this: this.FileNameLength),
        "InternalFileReference" / Int64ul,
        "FileOffset" / Int64ul,
        "Length" / Int64ul,
        "BytesRepaired" / Int64ul,
        "BytesFailed" / Int64ul,
        "Status" / Int32ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=62, version=0)
class Microsoft_Windows_DataIntegrityScan_62_0(Etw):
    pattern = Struct(
        "FileNameLength" / Int16ul,
        "FileName" / Bytes(lambda this: this.FileNameLength),
        "InternalFileReference" / Int64ul,
        "FileOffset" / Int64ul,
        "Length" / Int64ul,
        "BytesRepaired" / Int64ul,
        "BytesFailed" / Int64ul,
        "Status" / Int32ul,
        "VolumeGuid" / Guid
    )
@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=111, version=0)
class Microsoft_Windows_DataIntegrityScan_111_0(Etw):
    pattern = Struct(
        "DiskNumber" / Int32ul,
        "DiskGuid" / Guid,
        "PortPoolId" / Guid,
        "PortDiskId" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=112, version=0)
class Microsoft_Windows_DataIntegrityScan_112_0(Etw):
    pattern = Struct(
        "DiskNumber" / Int32ul,
        "DiskGuid" / Guid,
        "PortPoolId" / Guid,
        "PortDiskId" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=121, version=0)
class Microsoft_Windows_DataIntegrityScan_121_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "DiskNumber" / Int32ul,
        "DiskOffset" / Int64ul,
        "Length" / Int64ul,
        "VolumeGuid" / Guid,
        "DiskGuid" / Guid,
        "PortPoolId" / Guid,
        "PortDiskId" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=122, version=0)
class Microsoft_Windows_DataIntegrityScan_122_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "HResult" / Int32sl,
        "DirectoryCount" / Int64ul,
        "FileCount" / Int64ul,
        "StreamCount" / Int64ul,
        "DataCount" / Int64ul,
        "FsctlCount" / Int64ul,
        "TotalBytesRepaired" / Int64ul,
        "TotalBytesFailed" / Int64ul,
        "MetadataBytesProcessed" / Int64ul,
        "DataBytesProcessed" / Int64ul,
        "TotalMetadataBytesInUse" / Int64ul,
        "TotalDataBytesInUse" / Int64ul,
        "MaxThreadCount" / Int32ul,
        "TotalTimeTaken" / Int64ul,
        "DiskNumber" / Int32ul,
        "DiskOffset" / Int64ul,
        "Length" / Int64ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=123, version=0)
class Microsoft_Windows_DataIntegrityScan_123_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "HResult" / Int32sl,
        "DirectoryCount" / Int64ul,
        "FileCount" / Int64ul,
        "StreamCount" / Int64ul,
        "DataCount" / Int64ul,
        "FsctlCount" / Int64ul,
        "TotalBytesRepaired" / Int64ul,
        "TotalBytesFailed" / Int64ul,
        "MetadataBytesProcessed" / Int64ul,
        "DataBytesProcessed" / Int64ul,
        "TotalMetadataBytesInUse" / Int64ul,
        "TotalDataBytesInUse" / Int64ul,
        "MaxThreadCount" / Int32ul,
        "TotalTimeTaken" / Int64ul,
        "DiskNumber" / Int32ul,
        "DiskOffset" / Int64ul,
        "Length" / Int64ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=124, version=0)
class Microsoft_Windows_DataIntegrityScan_124_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "HResult" / Int32sl,
        "Count" / Int64ul,
        "FileNameLength" / Int16ul,
        "FileName" / Bytes(lambda this: this.FileNameLength),
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=125, version=0)
class Microsoft_Windows_DataIntegrityScan_125_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "HResult" / Int32sl,
        "DirectoryCount" / Int64ul,
        "FileCount" / Int64ul,
        "StreamCount" / Int64ul,
        "DataCount" / Int64ul,
        "FsctlCount" / Int64ul,
        "TotalBytesRepaired" / Int64ul,
        "TotalBytesFailed" / Int64ul,
        "MetadataBytesProcessed" / Int64ul,
        "DataBytesProcessed" / Int64ul,
        "TotalMetadataBytesInUse" / Int64ul,
        "TotalDataBytesInUse" / Int64ul,
        "MaxThreadCount" / Int32ul,
        "TotalTimeTaken" / Int64ul,
        "DiskNumber" / Int32ul,
        "DiskOffset" / Int64ul,
        "Length" / Int64ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=126, version=0)
class Microsoft_Windows_DataIntegrityScan_126_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "HResult" / Int32sl,
        "DirectoryCount" / Int64ul,
        "FileCount" / Int64ul,
        "StreamCount" / Int64ul,
        "DataCount" / Int64ul,
        "FsctlCount" / Int64ul,
        "TotalBytesRepaired" / Int64ul,
        "TotalBytesFailed" / Int64ul,
        "MetadataBytesProcessed" / Int64ul,
        "DataBytesProcessed" / Int64ul,
        "TotalMetadataBytesInUse" / Int64ul,
        "TotalDataBytesInUse" / Int64ul,
        "MaxThreadCount" / Int32ul,
        "TotalTimeTaken" / Int64ul,
        "DiskNumber" / Int32ul,
        "DiskOffset" / Int64ul,
        "Length" / Int64ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=127, version=0)
class Microsoft_Windows_DataIntegrityScan_127_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "HResult" / Int32sl,
        "DiskNumber" / Int32ul,
        "DiskOffset" / Int64ul,
        "Length" / Int64ul,
        "VolumeGuid" / Guid,
        "DiskGuid" / Guid,
        "PortPoolId" / Guid,
        "PortDiskId" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=128, version=0)
class Microsoft_Windows_DataIntegrityScan_128_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "HResult" / Int32sl,
        "DiskNumber" / Int32ul,
        "DiskOffset" / Int64ul,
        "Length" / Int64ul,
        "VolumeGuid" / Guid,
        "DiskGuid" / Guid,
        "PortPoolId" / Guid,
        "PortDiskId" / Guid
    )
@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=129, version=0)
class Microsoft_Windows_DataIntegrityScan_129_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "DirectoryCount" / Int64ul,
        "FileCount" / Int64ul,
        "StreamCount" / Int64ul,
        "DataCount" / Int64ul,
        "FsctlCount" / Int64ul,
        "TotalBytesRepaired" / Int64ul,
        "TotalBytesFailed" / Int64ul,
        "MetadataBytesProcessed" / Int64ul,
        "DataBytesProcessed" / Int64ul,
        "TotalMetadataBytesInUse" / Int64ul,
        "TotalDataBytesInUse" / Int64ul,
        "MaxThreadCount" / Int32ul,
        "TotalTimeTaken" / Int64ul,
        "PercentComplete" / Int16ul,
        "DiskNumber" / Int32ul,
        "DiskOffset" / Int64ul,
        "Length" / Int64ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=154, version=0)
class Microsoft_Windows_DataIntegrityScan_154_0(Etw):
    pattern = Struct(
        "FileNameLength" / Int16ul,
        "FileName" / Bytes(lambda this: this.FileNameLength),
        "InternalFileReference" / Int64ul,
        "FileOffset" / Int64ul,
        "Length" / Int64ul,
        "BytesRepaired" / Int64ul,
        "BytesFailed" / Int64ul,
        "Status" / Int32ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=155, version=0)
class Microsoft_Windows_DataIntegrityScan_155_0(Etw):
    pattern = Struct(
        "FileNameLength" / Int16ul,
        "FileName" / Bytes(lambda this: this.FileNameLength),
        "InternalFileReference" / Int64ul,
        "FileOffset" / Int64ul,
        "Length" / Int64ul,
        "BytesRepaired" / Int64ul,
        "BytesFailed" / Int64ul,
        "Status" / Int32ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=156, version=0)
class Microsoft_Windows_DataIntegrityScan_156_0(Etw):
    pattern = Struct(
        "FileNameLength" / Int16ul,
        "FileName" / Bytes(lambda this: this.FileNameLength),
        "InternalFileReference" / Int64ul,
        "FileOffset" / Int64ul,
        "Length" / Int64ul,
        "BytesRepaired" / Int64ul,
        "BytesFailed" / Int64ul,
        "Status" / Int32ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=157, version=0)
class Microsoft_Windows_DataIntegrityScan_157_0(Etw):
    pattern = Struct(
        "FileNameLength" / Int16ul,
        "FileName" / Bytes(lambda this: this.FileNameLength),
        "InternalFileReference" / Int64ul,
        "FileOffset" / Int64ul,
        "Length" / Int64ul,
        "BytesRepaired" / Int64ul,
        "BytesFailed" / Int64ul,
        "Status" / Int32ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=158, version=0)
class Microsoft_Windows_DataIntegrityScan_158_0(Etw):
    pattern = Struct(
        "FileNameLength" / Int16ul,
        "FileName" / Bytes(lambda this: this.FileNameLength),
        "InternalFileReference" / Int64ul,
        "FileOffset" / Int64ul,
        "Length" / Int64ul,
        "BytesRepaired" / Int64ul,
        "BytesFailed" / Int64ul,
        "Status" / Int32ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=159, version=0)
class Microsoft_Windows_DataIntegrityScan_159_0(Etw):
    pattern = Struct(
        "FileNameLength" / Int16ul,
        "FileName" / Bytes(lambda this: this.FileNameLength),
        "InternalFileReference" / Int64ul,
        "FileOffset" / Int64ul,
        "Length" / Int64ul,
        "BytesRepaired" / Int64ul,
        "BytesFailed" / Int64ul,
        "Status" / Int32ul,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=160, version=0)
class Microsoft_Windows_DataIntegrityScan_160_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "ExtentCount" / Int64ul,
        "ExtentSize" / Int64ul,
        "HResult" / Int32sl,
        "VolumeGuid" / Guid
    )


@declare(guid=guid("13bc4371-4e21-4e46-a84f-8c0ffb548ced"), event_id=161, version=0)
class Microsoft_Windows_DataIntegrityScan_161_0(Etw):
    pattern = Struct(
        "VolumeNameLength" / Int16ul,
        "VolumeName" / Bytes(lambda this: this.VolumeNameLength),
        "FriendlyVolumeNameLength" / Int16ul,
        "FriendlyVolumeName" / Bytes(lambda this: this.FriendlyVolumeNameLength),
        "ExtentCount" / Int64ul,
        "ExtentSize" / Int64ul,
        "HResult" / Int32sl,
        "VolumeGuid" / Guid
    )
| 35.083082 | 123 | 0.641765 | 1,933 | 23,225 | 7.615106 | 0.061562 | 0.039606 | 0.054008 | 0.06841 | 0.966848 | 0.966848 | 0.848777 | 0.848777 | 0.848777 | 0.848777 | 0 | 0.092732 | 0.239914 | 23,225 | 661 | 124 | 35.136157 | 0.74112 | 0.004392 | 0 | 0.802065 | 0 | 0 | 0.295362 | 0.109414 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.006885 | 0 | 0.134251 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ad54235145d7a6ae3276dfdd4293d34b60121013 | 74 | py | Python | storagebin/tests/__init__.py | Vraid-Systems/storagebin | 1cd5d4ff8f4848df3d91f5bc48923effa897efc6 | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | null | null | null | storagebin/tests/__init__.py | Vraid-Systems/storagebin | 1cd5d4ff8f4848df3d91f5bc48923effa897efc6 | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | null | null | null | storagebin/tests/__init__.py | Vraid-Systems/storagebin | 1cd5d4ff8f4848df3d91f5bc48923effa897efc6 | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | null | null | null | from storagebin.tests.prune import *
from storagebin.tests.utils import *
| 24.666667 | 36 | 0.810811 | 10 | 74 | 6 | 0.6 | 0.466667 | 0.633333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.108108 | 74 | 2 | 37 | 37 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
ad8d8cd1322efebe019ecc4416a9d18063838dbd | 15,897 | py | Python | Day08/Day8_Prob2_Image_Decoding.py | guilhermebaos/AoC-2020-Python-Solution | 4d473e254b88bacd728338f94788d5592776c4ff | [
"MIT"
] | null | null | null | Day08/Day8_Prob2_Image_Decoding.py | guilhermebaos/AoC-2020-Python-Solution | 4d473e254b88bacd728338f94788d5592776c4ff | [
"MIT"
] | null | null | null | Day08/Day8_Prob2_Image_Decoding.py | guilhermebaos/AoC-2020-Python-Solution | 4d473e254b88bacd728338f94788d5592776c4ff | [
"MIT"
] | null | null | null | from time import sleep


def join_list(lst, string):
    """
    :param lst: List to be joined
    :param string: String that will be used to join the items in the list
    :return: The list items joined into a single string
    """
    return str(string).join(str(x) for x in lst)
# Solution: CYKBY
puzzle_image = '2210022221220202220210222220222222222222222222222212222220222202221120222202220220202222022222222202222222222212222122222222222222222222222222220222222202022221221212220221222221222222222222222222222222222220222212221020222212221222202222022222222202222222222222222022222222222222222222222222222222222211122222222202220212222222222222222222222222222202222221222212221221222212222222212222022222222202222222222202222022222222222222222222222222222222222222122222222202222211222222222222222222222222022212222220222212220022222202220220202222222222222222222222222202222020222222222222222222222222222222222220022221222212220201222220222222222222122222222222220222222202220020222202222221222222122222222222222222222222222120222222222222222222222222221222222212222222222202221222222220222222222222122222022202221221222222220121222202220220202222122221222212222222222212222221222222222222222222222222222222222221222220221202220222222221222222222222122222022222222221222202221022222211221221202222022221222212222222222222222020222022222222222222222222220222222222122220222202221201222221222222222222122222122202222222222212222220222201222221202222022220222212222222222202222121222222222222222222222222220222222201122220221202221210222222222222212222222222222202221221222202222221222201022221202222022221222222222222222222222122222222222222222222221222220222222221222222221212222222222222222222212222222222222202222221222212221122222202120220222222122221220212222222222212222221222122222222222222221222221222222221222220222212222221222222222222212222122222122212200221222202222020222221022222222222122220222202222222222202222120222122222222222222220222220222222220222220222222221202222220222222222222022222022222201220222222220221222212222020202222122221222212222222222212222222222222222222222222220222221222222202122221222222221222222222222222222222022222222212202222222112220220222222022221222222122221220212222222222202222221222022222222222222221222222222222222022221221222221200222222222222
22222212222222220222222022202222222122222202012122222212222022022222222222222222222222202222222222222222022222022222220002222222122222120122222122222222222202222212222222122022202222102122220000222122222202222122120222222222222222222122212222222222222222022222122222221102222222121222222022220122222221222212212202221220222122201222002222220210012120222222222022222222222222222022222122222222222222222222222222122222222102222022122222121222220122222221222212212222221220022022212222112022220220112120222212222122021222222222222022222022202222222222222222222222122222222212222122022222122122222122222221222202212222221221122022201222022222221021102020222222222022020222222222222022222022212222222222222222222222022222220222222122222222220022221122222220222202222222220221022222222222122122220100122221222212222222021222222222221022212122222222222222222222122222122222221212222122120222222222221122222222222222212212221222222022220222022222222120112021222222222022022222222222221222222022222222222222222222122222022222220012222222122222120022220222222222222202202212222222122222202222122022222022022121222222222222122222222222222222212222202222222222222222222222222222222122222222021222222022220122222222222222212202220220022122201222012022221202022121222222222022222222222222222222222222222222222222222222122222222222221012222122020222121222222222202220222202222212222201122122221222112122220202002122222212222222221222022222220022212022212222222222222222122222222222220202222222120222220122221022202220222212202212221210022122211222002122222112212220222212222222221222122222221022222222202222222222222222222222222222221002222122020222121222220022212221222202212202221200022120220222102022220201122122222222222022022222022222021022202022222222222222122222122222122222220022222122020222121122220122202222222212212222222200122022212222222122222121102021222202222022020222022222021222202122222222222222122222022222222222220122222222221222021022222022222222222202212202220221122022200222202222221202112022
22222222202202022202222202012221222222222222222212222202222222222022102222222202022212102222102221222022222221222222021012212022022221222222211200202222220222212202222222222202212222002222222222022212222212222222222022122222222212222222122222112220222222221220221222120002222021022201222222012221222122221222212202222202222222102220122221222222222212222222222202222122100222212202022222022222222220222222222220222222121102202220222212202222012202212122221222222222122212222222222220012220202222122202222202222222222222020222222202022202202222112222222122222220222222020202222222122220212222102120212222222222222212222202222202222222102221202222022202222202222212222022100222212212122212222222222222222222220022220222120002222121222201202222220201212222222222202022122202222222122220002222212222022222222222222222222122102222212212222212012222012220222222221120221222120022212020122201202222002001202022221222202122022222222212102220102221212222022222222202222212222122111222202212222222002222202221222022221221220222000222222022022210222222220121201122222222212202222212222202102221122220222222022202222222222212222222012222202222122212022222102220222122222122220222200002222020022211212222112220202022221222202112022202222202012221012221212222222212022212222222222222101222212222212212012222002221222222221221220222210112221120222212222222010221210222220222212202122222222212012220202221212222122212122202222212222222121222222222202202112222222221222122220020220222202102220221022210202222021100222222221222202112122222222212212221012222212222022222022212122222220222021222212222202202212222002222222022220222222222021012220221222222212121220211221022220222222202222202222222112220102202212222022202222212122202220122200222212222212212002222222222222022222120221222200112211120222200212022021202200022220222212102022202222202102222002212202222022202022222222202220022001222212202222212122222222222222222220122222222100022222221222212202022120102212122220222212012122222222222122220222200202222022222
12220202220222110221122220222200222211222210222222212222202122222210220220002212220120202102001121022222202222212202222222222212222222222022222202221222222002221222202202222220220201120221222211222222212222222122222222110221102202222122202022212020202122011220220212220222222200222221221220222202222202222222221222112212222222221210121210222202222122212222112022222211200221022022222220212212102022122022122221200222220222220202222202221221222202221202222112222222212221122222222100020212222210222022212222012122022212202022222022221022212002021021112122021222212202220222220202222002221121222122221222221002221222202202122222221220221200222210222222212222202222122201121221002012220222222001000022222222122220201222222222222220222102221222222112222212222222222222020220122222222022122202221200222222212220222022222201120021022022222220222210021221022122220220200202220222221221222122220220222122220222221202220222121120222221221111221222221212222022212220202122022200122221112212221121202101211222202122022222200220221222222202222220222221222122222212222012221222010121222221222102021220220202222022202221022222222200102022002210220121202100021220102122100221222220221222222220222012220022222212222222220112221222121210122221222021122210222201202022202221222022102222202121212102220121212100011020122222001221201211221222220220222011221021222002222202221112220222000222122220220021122211222220212222222222202022122201200122002121221222202200110221022022100220210220220222222201222102221221222122221202221202222222111112222222221020122212222212212122222220112022222202201222002011221121222102122021002022022222221221220222222222222110220121222222222222221222220222110022022222222122020211222222202120222220102222102200110222002222222222212201111020112022200222212211220222220201222012220022222212221222221022222222110002122221220020221210221200202122202221212122022212020121022211222020222210022120102022001222201222222222211212222202222221222012222222220222222222201221122221222010021220222202202021
22222121212220220111012211222022002021220020022112202202022222121122222220221222212222112122220222122222012222122212101122222022021022020022121220202221222112202202220101022100220222202021222122212001202222022222222022222221222222202122202222221222222222110222122212021102222222221202221122122021222022222202202200222112222000221022112021210011022222222202022020122122222221220022200122222120202222121222222222222202102222222222110122021022022021202120222112212210222110202011202022012022221100102221202220122022022022122220022222200222202220210222221222121222022200212122222222001122021022021121222021222102212211221000022021220122022222220021002201222201222221221222122221020222200022202020202222022222212222022201012102222222012212220122221020212222222022212211220211202210212122222020011200102001222211122220122122222222220022222122212221202222221222000222220222202122222222020022020022021021202022222122222201222001012010202022102122220022122201212200122120020022122221222022212022222020202222121222121222122200200212222022200012220022120122202120222102222202221212122110212222022020022211102102222220222022020222122222121022220022102220200222120222221222020201010122222122002212021222021122222020222210202201222122102110221122012221111102022112202220222122221122222220220222212222112220211202022222210222020200002212222122020112022022221122202221222021212120222222021011210122212120220121022110212220222120022022122220220122221122012122202202221222102222221221212222222222010012220122221122222120222221202010220112020112211122222021020010212011222222222120122122122221222022211222212122210202220222110222120202120012222122222202222122022021212022222202212222222212102011220122002221000202122100212220122021121220222222022122202222022120211212020222211222002211000012222022001122022122120021222222222011222010222110210200220122222021011000222020212201222022222120222221222022201122112021202222222222020222101200022212222222001102222122222021222020222120202210222110020020210222212222210011012022
21222112222222012002222212102221102210212121122202122211122211222000201222222210202202002212112020212222202122202222200000020220002202202212222112211222222002202222022102222202212222002202222122222202022221222220222000010222222212012212022222022122202122222020211222010111122220122221222010212101211120221002212012012022222102002222102201212222021212022212022220222202210222202210211212122212102110222222220222211122102220211120102201212001222010200122220212022212222012222102022221102200202021121212122211022202021102100222222210022222120211022202222222211122202022022021102221122202202120002201202121222222222102112002222202022221102220212022020202022220122220121222202222202200000222021211022002202022212121212122200221221122102200222211211012211221220012002222022012222222222222122222202221122202122221022222022122112222202211201222020221212122221222221021212022222022102021212222222212221221211120221022122122212122222022012222202202212120022212222220122210021112220222212211120212220212122100201222201020221122002001010221222211202220100211212120220212022022012102222102001221222211222020020202222222022211022211001222202201012222112220001221220022211021201022211122121111012221222120222101202222220222122202112022222212110221212222102122122012022221122211122222020222222221220212101202101020201022202022221022122120001011112220202202022022201120221202202012022122222202100220212202022120121002222211222222221100121222222212002222212210211201202222202221200222022101021212122211212102012101010120222122012222212222222222221222002201222020221222022222022221122211220222222210010222121220000110221022212222200022100100122211022212212221100120012022222012202012022212202112021221122210212020021022222201122202122210120222202202121222012200021011211222202120221222210220022010122210202021020021122122222012112002122202222222000220112212112122021122122220222211220012202222202220122222020212211201200222221220201022021020120221212202222112100122002022220222212012022122212102110220212211222221020212
02221012220012100101222220222101221201121111012121102221102122212221200002010210221222221022212120012222111222200202202221211222022102220010212212110202222222201212112100222220222202222211221000201120112220122120022211111020111022020022202122210020002022011210202211222220202211222002222121202012002222222222211122002011022220220111020210220211010021112122222022212211001220221010120021222120111001212222020220210222212220200211122020220001202112200202201102200002120012222222220000122222221122102220222120102022102222000122221122121022212210012001012222212222211212202222201200022202220022212102201212201022201222010202122221200022221202120121101021202020202121002201000010101111022022221112202022002122022201200202210222210201122120221201212112001212220012220202011010122221220020221210122010211121222022212022022211011201202120210021201222121002202222122221211200220221202221022012220200202102021202200002201012212110022222201210121212022112222221122221102020012210101000012001220221202010022100022122002221200210211222211211222110222022202022012202210122202102000201022220220102122201121220212220102021012022222221211212022022222022222202110200122122002222210210201221200212022100220201022002120222222112221222112011122221202212022202222120021220002020122222122210210010002222020122220110010212112221110212221220222121002211122011222202022002212222222102202122222022020222222120120201220111110120202121022220102212111211111022110221212122020000112122022222202220211121200202222202220200212002202202211122210202220100120221220001120001122112101220012021112221212200120000200211102122202101002110122220110211221220202021002210122020220222022202011202200002221202002212021220202210221202222021211121002121202121022211220210011200221020212010221100002122000221222210202220001200222021221112122112012212201012202222110212120222112002220111120122210221202021022121220220122010100110002220212110221102222022210212200200200220202200122101221200222012221202222112221222201211121221200002220221020221122222
22212012212122122021200222200010202221202000222000202012022022020222002010021022221022022010221202220220020222222200220122122200022022221012122220002212202201202200021001221210111000022222201110221022220020122120022021102201121102220122211201210212122220011120100221212122122220011112120112112110112012202120202210222002112200012011222022011211021201220221121221122022202122101002222222122201201201122222102212221211201222022020102002010212221020022210202110202102020002110222212012222122100122010012022211222021122121202001110222210022211110212202220221212102112202010112222120102112121112121102012221222212022020021000012022111001012221002021022201012100220021120120022220222002221022210120212211002220100200010202010002222212020002112002111212012001222012122222111010001000102000122021020111122010100000221121020221122201000112210022112012222211122220000102200200122211000120212021002220110220200210010001220022100210112221200210200100122202211202002112202002110211210012202100021210110012101012021101101212002112'
puzzle_width = 25
puzzle_height = 6
test_image = '0222112222120000'
test_width = 2
test_height = 2
image = puzzle_image[:]
width = puzzle_width
height = puzzle_height
layers = []
area = width*height
# Cut the flat digit string into layers of width * height pixels each.
for c in range(0, int(len(image)/area)):
    layers += [image[c*area:(c+1)*area]]

# For each pixel, the first non-transparent digit ('2' marks transparency)
# found while scanning the layers front to back is the visible one.
final_image = [0] * area
for p in range(0, area):
    for lay in range(0, len(layers)):
        if layers[lay][p] != '2':
            final_image[p] = layers[lay][p]
            break

# Print the decoded image row by row.
for line in range(0, height):
    print(join_list(final_image[line*width:(line+1)*width], ' '))
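# The layering rule above can be checked against the small test image defined
# earlier. The following is a minimal, self-contained sketch (variables are
# re-declared locally so it runs on its own):

```python
# Minimal sketch of the same decoding rule on the 2x2 test image from above:
# for each pixel position, the first layer digit that is not '2' (transparent)
# is the one that shows through.
test_image = '0222112222120000'
test_width, test_height = 2, 2
area = test_width * test_height
layers = [test_image[i:i + area] for i in range(0, len(test_image), area)]
decoded = ''.join(next(d for d in pixel if d != '2') for pixel in zip(*layers))
print(decoded)  # -> 0110
```

# Here zip(*layers) groups the digits for each pixel position across all
# layers, which replaces the explicit index loops used in the full solution.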
| 397.425 | 15,017 | 0.979996 | 143 | 15,897 | 108.846154 | 0.384615 | 0.001799 | 0.002056 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.957505 | 0.012644 | 15,897 | 39 | 15,018 | 407.615385 | 0.034149 | 0.010442 | 0 | 0 | 0 | 0 | 0.956134 | 0.954988 | 0 | 1 | 0 | 0 | 0 | 1 | 0.04 | false | 0 | 0.04 | 0 | 0.12 | 0.04 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
d10b49496a822e732f07f05984a6af14d7a78d39 | 30,927 | py | Python | test/integration/test_projects.py | PatrickLaban/cfapi | aeecf4a034a9dda3dd033f241f7c37e52a8c02c9 | [
"MIT"
] | 1 | 2021-09-05T17:21:49.000Z | 2021-09-05T17:21:49.000Z | test/integration/test_projects.py | PatrickLaban/cfapi | aeecf4a034a9dda3dd033f241f7c37e52a8c02c9 | [
"MIT"
] | null | null | null | test/integration/test_projects.py | PatrickLaban/cfapi | aeecf4a034a9dda3dd033f241f7c37e52a8c02c9 | [
"MIT"
] | null | null | null | import json
from datetime import datetime, timedelta
from test.factories import ProjectFactory, OrganizationFactory, IssueFactory
from test.harness import IntegrationTest
from app import db, Issue

class TestProjects(IntegrationTest):

    def test_all_projects_order(self):
        """
        Test that projects are returned in order of last_updated
        """
        ProjectFactory(name=u'Project 1', last_updated='Mon, 01 Jan 2010 00:00:00 GMT')
        ProjectFactory(name=u'Project 2', last_updated='Tue, 01 Jan 2011 00:00:00 GMT')
        ProjectFactory(name=u'Non Github Project', last_updated='Wed, 01 Jan 2013 00:00:00', github_details=None)
        ProjectFactory(name=u'Project 3', last_updated='Thu, 01 Jan 2014 00:00:00 GMT')
        db.session.commit()

        response = self.app.get('/api/projects')
        response = json.loads(response.data)
        self.assertEqual(response['objects'][0]['name'], u'Project 3')
        self.assertEqual(response['objects'][1]['name'], u'Non Github Project')
        self.assertEqual(response['objects'][2]['name'], u'Project 2')
        self.assertEqual(response['objects'][3]['name'], u'Project 1')

    def test_projects(self):
        ProjectFactory()
        db.session.commit()

        response = self.app.get('/api/projects')
        response = json.loads(response.data)
        assert isinstance(response, dict)
        assert isinstance(response['pages'], dict)
        assert isinstance(response['total'], int)
        assert isinstance(response['objects'], list)
        assert isinstance(response['objects'][0]['categories'], unicode)
        assert isinstance(response['objects'][0]['tags'], unicode)
        assert isinstance(response['objects'][0]['code_url'], unicode)
        assert isinstance(response['objects'][0]['description'], unicode)
        assert isinstance(response['objects'][0]['github_details'], dict)
        assert isinstance(response['objects'][0]['id'], int)
        assert isinstance(response['objects'][0]['api_url'], unicode)
        assert isinstance(response['objects'][0]['link_url'], unicode)
        assert isinstance(response['objects'][0]['name'], unicode)
        assert isinstance(response['objects'][0]['organization'], dict)
        assert isinstance(response['objects'][0]['organization_name'], unicode)
        assert isinstance(response['objects'][0]['type'], unicode)
        assert isinstance(response['objects'][0]['status'], unicode)

    def test_project_search_nonexisting_text(self):
        ''' Searching for non-existing text in the project and org/project
        endpoints returns no results
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, description=u'Coder')
        db.session.commit()

        project_response = self.app.get('/api/projects?q=ruby')
        project_response = json.loads(project_response.data)
        assert isinstance(project_response['total'], int)
        assert isinstance(project_response['objects'], list)
        self.assertEqual(project_response['total'], 0)
        self.assertEqual(len(project_response['objects']), 0)

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=ruby')
        org_project_response = json.loads(org_project_response.data)
        assert isinstance(org_project_response['total'], int)
        assert isinstance(org_project_response['objects'], list)
        self.assertEqual(org_project_response['total'], 0)
        self.assertEqual(len(org_project_response['objects']), 0)

    def test_project_search_existing_text(self):
        ''' Searching for existing text in the project and org/project endpoints
        returns expected results
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, description=u'ruby')
        ProjectFactory(organization_name=organization.name, description=u'python')
        db.session.commit()

        project_response = self.app.get('/api/projects?q=ruby')
        project_response = json.loads(project_response.data)
        assert isinstance(project_response['total'], int)
        assert isinstance(project_response['objects'], list)
        self.assertEqual(project_response['total'], 1)
        self.assertEqual(len(project_response['objects']), 1)

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=ruby')
        org_project_response = json.loads(org_project_response.data)
        assert isinstance(org_project_response['total'], int)
        assert isinstance(org_project_response['objects'], list)
        self.assertEqual(org_project_response['total'], 1)
        self.assertEqual(len(org_project_response['objects']), 1)

    def test_project_search_existing_phrase(self):
        ''' Searching for an existing phrase in the project and org/project endpoints
        returns expected results
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, description=u'ruby on rails')
        ProjectFactory(organization_name=organization.name, description=u'i love lamp')
        db.session.commit()

        project_response = self.app.get('/api/projects?q=ruby on rails')
        project_response = json.loads(project_response.data)
        assert isinstance(project_response['total'], int)
        assert isinstance(project_response['objects'], list)
        self.assertEqual(project_response['total'], 1)
        self.assertEqual(len(project_response['objects']), 1)

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=ruby on rails')
        org_project_response = json.loads(org_project_response.data)
        assert isinstance(org_project_response['total'], int)
        assert isinstance(org_project_response['objects'], list)
        self.assertEqual(org_project_response['total'], 1)
        self.assertEqual(len(org_project_response['objects']), 1)

    def test_project_search_existing_part_of_phrase(self):
        ''' Searching for a partial phrase in the project and org/project endpoints
        returns expected results
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, description=u'ruby on rails')
        ProjectFactory(organization_name=organization.name, description=u'i love lamp')
        db.session.commit()

        project_response = self.app.get('/api/projects?q=ruby')
        project_response = json.loads(project_response.data)
        assert isinstance(project_response['total'], int)
        assert isinstance(project_response['objects'], list)
        self.assertEqual(project_response['total'], 1)
        self.assertEqual(len(project_response['objects']), 1)

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=ruby')
        org_project_response = json.loads(org_project_response.data)
        assert isinstance(org_project_response['total'], int)
        assert isinstance(org_project_response['objects'], list)
        self.assertEqual(org_project_response['total'], 1)
        self.assertEqual(len(org_project_response['objects']), 1)

    def test_project_search_nonexisting_phrase(self):
        ''' Searching for a term that is not part of an existing phrase in the project and
        org/project endpoints returns no results
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, description=u'ruby on rails')
        db.session.commit()

        project_response = self.app.get('/api/projects?q=joomla')
        project_response = json.loads(project_response.data)
        assert isinstance(project_response['total'], int)
        assert isinstance(project_response['objects'], list)
        self.assertEqual(project_response['total'], 0)
        self.assertEqual(len(project_response['objects']), 0)

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=joomla')
        org_project_response = json.loads(org_project_response.data)
        assert isinstance(org_project_response['total'], int)
        assert isinstance(org_project_response['objects'], list)
        self.assertEqual(org_project_response['total'], 0)
        self.assertEqual(len(org_project_response['objects']), 0)

    def test_project_search_order_by_relevance(self):
        ''' Search results from the project and org/project endpoints are returned
        in order of relevance
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, description=u'ruby ruby ruby ruby ruby', last_updated=datetime.now() - timedelta(10))
        ProjectFactory(organization_name=organization.name, description=u'ruby', last_updated=datetime.now() - timedelta(1))
        db.session.commit()

        project_response = self.app.get('/api/projects?q=ruby')
        project_response = json.loads(project_response.data)
        assert isinstance(project_response['total'], int)
        assert isinstance(project_response['objects'], list)
        self.assertEqual(len(project_response["objects"]), 2)
        self.assertEqual(project_response['objects'][0]['description'], 'ruby ruby ruby ruby ruby')

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=ruby')
        org_project_response = json.loads(org_project_response.data)
        assert isinstance(org_project_response['total'], int)
        assert isinstance(org_project_response['objects'], list)
        self.assertEqual(len(org_project_response["objects"]), 2)
        self.assertEqual(org_project_response['objects'][0]['description'], 'ruby ruby ruby ruby ruby')

    def test_project_search_order_by_relevance_requested(self):
        ''' Search results from the project and org/project endpoints are returned
        in order of relevance when explicitly requested
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, description=u'ruby ruby ruby ruby ruby', last_updated=datetime.now() - timedelta(10))
        ProjectFactory(organization_name=organization.name, description=u'ruby', last_updated=datetime.now() - timedelta(1))
        db.session.commit()

        project_response = self.app.get('/api/projects?q=ruby&sort_by=relevance')
        project_response = json.loads(project_response.data)
        assert isinstance(project_response['total'], int)
        assert isinstance(project_response['objects'], list)
        self.assertEqual(len(project_response["objects"]), 2)
        self.assertEqual(project_response['objects'][0]['description'], 'ruby ruby ruby ruby ruby')

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=ruby&sort_by=relevance')
        org_project_response = json.loads(org_project_response.data)
        assert isinstance(org_project_response['total'], int)
        assert isinstance(org_project_response['objects'], list)
        self.assertEqual(len(org_project_response["objects"]), 2)
        self.assertEqual(org_project_response['objects'][0]['description'], 'ruby ruby ruby ruby ruby')

    def test_project_search_order_by_last_updated(self):
        ''' Search results from the project and org/project endpoints are returned
        in order of last_updated, if requested
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, description=u'ruby ruby ruby ruby ruby', last_updated=datetime.now() - timedelta(10))
        ProjectFactory(organization_name=organization.name, description=u'ruby', last_updated=datetime.now() - timedelta(1))
        db.session.commit()

        project_response = self.app.get('/api/projects?q=ruby&sort_by=last_updated')
        project_response = json.loads(project_response.data)
        assert isinstance(project_response['total'], int)
        assert isinstance(project_response['objects'], list)
        self.assertEqual(len(project_response["objects"]), 2)
        self.assertEqual(project_response['objects'][0]['description'], 'ruby')

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=ruby&sort_by=last_updated')
        org_project_response = json.loads(org_project_response.data)
        assert isinstance(org_project_response['total'], int)
        assert isinstance(org_project_response['objects'], list)
        self.assertEqual(len(org_project_response["objects"]), 2)
        self.assertEqual(org_project_response['objects'][0]['description'], 'ruby')

    def test_project_search_order_by_last_updated_sort_desc(self):
        ''' Search results from the project and org/project endpoints are returned
        in descending order of last_updated, if requested
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, description=u'ruby ruby ruby ruby ruby', last_updated=datetime.now() - timedelta(10))
        ProjectFactory(organization_name=organization.name, description=u'ruby', last_updated=datetime.now() - timedelta(1))
        db.session.commit()

        project_response = self.app.get('/api/projects?q=ruby&sort_by=last_updated&sort_dir=desc')
        project_response = json.loads(project_response.data)
        assert isinstance(project_response['total'], int)
        assert isinstance(project_response['objects'], list)
        self.assertEqual(len(project_response["objects"]), 2)
        self.assertEqual(project_response['objects'][0]['description'], 'ruby')

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=ruby&sort_by=last_updated&sort_dir=desc')
        org_project_response = json.loads(org_project_response.data)
        assert isinstance(org_project_response['total'], int)
        assert isinstance(org_project_response['objects'], list)
        self.assertEqual(len(org_project_response["objects"]), 2)
        self.assertEqual(org_project_response['objects'][0]['description'], 'ruby')

    def test_project_search_order_by_last_updated_sort_asc(self):
        ''' Search results from the project and org/project endpoints are returned
        in ascending order of last_updated, if requested
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, description=u'ruby ruby ruby ruby ruby', last_updated=datetime.now() - timedelta(10))
        ProjectFactory(organization_name=organization.name, description=u'ruby', last_updated=datetime.now() - timedelta(1))
        db.session.commit()

        project_response = self.app.get('/api/projects?q=ruby&sort_by=last_updated&sort_dir=asc')
        project_response = json.loads(project_response.data)
        assert isinstance(project_response['total'], int)
        assert isinstance(project_response['objects'], list)
        self.assertEqual(len(project_response["objects"]), 2)
        self.assertEqual(project_response['objects'][0]['description'], 'ruby ruby ruby ruby ruby')

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=ruby&sort_by=last_updated&sort_dir=asc')
        org_project_response = json.loads(org_project_response.data)
        assert isinstance(org_project_response['total'], int)
        assert isinstance(org_project_response['objects'], list)
        self.assertEqual(len(org_project_response["objects"]), 2)
        self.assertEqual(org_project_response['objects'][0]['description'], 'ruby ruby ruby ruby ruby')

    def test_project_return_only_ids(self):
        ''' Search results from the project and org/project endpoints are returned
        as only IDs if requested
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        project_one = ProjectFactory(organization_name=organization.name, description=u'ruby ruby ruby ruby ruby', last_updated=datetime.now() - timedelta(10))
        project_two = ProjectFactory(organization_name=organization.name, description=u'ruby', last_updated=datetime.now() - timedelta(1))
        db.session.commit()

        project_one_id = project_one.id
        project_two_id = project_two.id

        project_response = self.app.get('/api/projects?q=ruby&only_ids=true')
        project_response = json.loads(project_response.data)
        assert isinstance(project_response['total'], int)
        assert isinstance(project_response['objects'], list)
        self.assertEqual(len(project_response["objects"]), 2)
        assert isinstance(project_response['objects'][0], int)
        assert isinstance(project_response['objects'][1], int)
        self.assertEqual(project_response['objects'][0], project_one_id)
        self.assertEqual(project_response['objects'][1], project_two_id)

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=ruby&only_ids=true')
        org_project_response = json.loads(org_project_response.data)
        assert isinstance(org_project_response['total'], int)
        assert isinstance(org_project_response['objects'], list)
        self.assertEqual(len(org_project_response["objects"]), 2)
        assert isinstance(org_project_response['objects'][0], int)
        assert isinstance(org_project_response['objects'][1], int)
        self.assertEqual(org_project_response['objects'][0], project_one_id)
        self.assertEqual(org_project_response['objects'][1], project_two_id)

    def test_project_search_empty_string(self):
        ''' Searching an empty string on the project and org/project endpoints returns all projects
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, description=u'ruby ruby ruby ruby ruby', last_updated=datetime.now() - timedelta(10))
        ProjectFactory(organization_name=organization.name, description=u'ruby', last_updated=datetime.now() - timedelta(1))
        db.session.commit()

        project_response = self.app.get('/api/projects?q=')
        project_response = json.loads(project_response.data)
        assert isinstance(project_response['total'], int)
        assert isinstance(project_response['objects'], list)
        self.assertEqual(project_response['total'], 2)
        self.assertEqual(len(project_response['objects']), 2)

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=')
        org_project_response = json.loads(org_project_response.data)
        assert isinstance(org_project_response['total'], int)
        assert isinstance(org_project_response['objects'], list)
        self.assertEqual(org_project_response['total'], 2)
        self.assertEqual(len(org_project_response['objects']), 2)

    def test_project_search_tsv_body_not_in_response(self):
        ''' The tsv_body field is not in the response from the project and org/project endpoints
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, description=u'ruby ruby ruby ruby ruby', last_updated=datetime.now() - timedelta(10))
        ProjectFactory(organization_name=organization.name, description=u'ruby', last_updated=datetime.now() - timedelta(1))
        db.session.commit()

        project_response = self.app.get('/api/projects?q=')
        project_response = json.loads(project_response.data)
        self.assertEqual(len(project_response['objects']), 2)
        self.assertFalse('tsv_body' in project_response['objects'][0])
        self.assertFalse('tsv_body' in project_response['objects'][1])

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=')
        org_project_response = json.loads(org_project_response.data)
        self.assertEqual(len(org_project_response['objects']), 2)
        self.assertFalse('tsv_body' in org_project_response['objects'][0])
        self.assertFalse('tsv_body' in org_project_response['objects'][1])

    def test_project_orgs_dont_include_tsv(self):
        ''' The tsv_body field is not included in the organization nested in a project response
        '''
        OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=u"Code for San Francisco")
        db.session.commit()

        response = self.app.get('/api/projects')
        response = json.loads(response.data)
        self.assertFalse('tsv_body' in response['objects'][0]['organization'])

    def test_project_search_includes_status(self):
        ''' The status field is included in search results from the project and org/project endpoints
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, status=u'Beta')
        ProjectFactory(organization_name=organization.name, status=u'Alpha')
        db.session.commit()

        project_response = self.app.get('/api/projects?q=alpha')
        project_response = json.loads(project_response.data)
        self.assertEqual(len(project_response['objects']), 1)
        self.assertEqual(project_response['objects'][0]['status'], 'Alpha')

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=alpha')
        org_project_response = json.loads(org_project_response.data)
        self.assertEqual(len(org_project_response['objects']), 1)
        self.assertEqual(org_project_response['objects'][0]['status'], 'Alpha')

    def test_project_search_includes_name(self):
        ''' The name field is included in search results from the project and org/project endpoints
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, name=u'My Cool Project')
        ProjectFactory(organization_name=organization.name, name=u'My Dumb Project')
        db.session.commit()

        project_response = self.app.get('/api/projects?q=cool')
        project_response = json.loads(project_response.data)
        self.assertEqual(len(project_response['objects']), 1)
        self.assertEqual(project_response['objects'][0]['name'], 'My Cool Project')

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=cool')
        org_project_response = json.loads(org_project_response.data)
        self.assertEqual(len(org_project_response['objects']), 1)
        self.assertEqual(org_project_response['objects'][0]['name'], 'My Cool Project')

    def test_project_search_includes_type(self):
        ''' The type field is included in search results from the project and org/project endpoints
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, type=u'mobile app')
        ProjectFactory(organization_name=organization.name, type=u'data portal')
        db.session.commit()

        project_response = self.app.get('/api/projects?q=portal')
        project_response = json.loads(project_response.data)
        self.assertEqual(len(project_response['objects']), 1)
        self.assertEqual(project_response['objects'][0]['type'], 'data portal')

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=portal')
        org_project_response = json.loads(org_project_response.data)
        self.assertEqual(len(org_project_response['objects']), 1)
        self.assertEqual(org_project_response['objects'][0]['type'], 'data portal')

    def test_project_search_includes_categories(self):
        ''' The categories field is included in search results from the project and org/project endpoints
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, categories=u'project management, civic hacking')
        ProjectFactory(organization_name=organization.name, categories=u'animal control, twitter')
        db.session.commit()

        project_response = self.app.get('/api/projects?q=control')
        project_response = json.loads(project_response.data)
        self.assertEqual(len(project_response['objects']), 1)
        self.assertEqual(project_response['objects'][0]['categories'], 'animal control, twitter')

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=control')
        org_project_response = json.loads(org_project_response.data)
        self.assertEqual(len(org_project_response['objects']), 1)
        self.assertEqual(org_project_response['objects'][0]['categories'], 'animal control, twitter')

    def test_project_search_includes_tags(self):
        ''' The tags field is included in search results from the project and org/project endpoints
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, tags=u'mapping, philly')
        ProjectFactory(organization_name=organization.name, tags=u'food stamps, health')
        db.session.commit()

        project_response = self.app.get('/api/projects?q=stamps')
        project_response = json.loads(project_response.data)
        self.assertEqual(len(project_response['objects']), 1)
        self.assertEqual(project_response['objects'][0]['tags'], 'food stamps, health')

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=stamps')
        org_project_response = json.loads(org_project_response.data)
        self.assertEqual(len(org_project_response['objects']), 1)
        self.assertEqual(org_project_response['objects'][0]['tags'], 'food stamps, health')

    def test_project_search_includes_github_details(self):
        ''' The github_details field is included in search results from the project and org/project endpoints
        '''
        organization = OrganizationFactory(name=u"Code for San Francisco")
        ProjectFactory(organization_name=organization.name, github_details=json.dumps({'panic': 'disco'}))
        ProjectFactory(organization_name=organization.name, github_details=json.dumps({'button': 'red'}))
        db.session.commit()

        project_response = self.app.get('/api/projects?q=disco')
        project_response = json.loads(project_response.data)
        self.assertEqual(len(project_response['objects']), 1)
        self.assertEqual(project_response['objects'][0]['github_details'], '{"panic": "disco"}')

        org_project_response = self.app.get('/api/organizations/Code-for-San-Francisco/projects?q=disco')
        org_project_response = json.loads(org_project_response.data)
        self.assertEqual(len(org_project_response['objects']), 1)
        self.assertEqual(org_project_response['objects'][0]['github_details'], '{"panic": "disco"}')

    def test_project_query_filter(self):
        ''' Test that project query params work as expected.
        '''
        brigade = OrganizationFactory(name=u'Whatever', type=u'Brigade')
        brigade_somewhere_far = OrganizationFactory(name=u'Brigade Organization', type=u'Brigade, Code for All')
        web_project = ProjectFactory(name=u'Random Web App', type=u'web service')
        other_web_project = ProjectFactory(name=u'Random Web App 2', type=u'web service', description=u'Another')
        non_web_project = ProjectFactory(name=u'Random Other App', type=u'other service')
        web_project.organization = brigade
        non_web_project.organization = brigade_somewhere_far
        db.session.add(web_project)
        db.session.add(non_web_project)
        db.session.commit()

        # filter on a single field
        response = self.app.get('/api/projects?type=web%20service')
        self.assertEqual(response.status_code, 200)
        response = json.loads(response.data)
        self.assertEqual(response['total'], 2)
        self.assertEqual(response['objects'][0]['name'], u'Random Web App')
        self.assertEqual(response['objects'][1]['name'], u'Random Web App 2')

        # filter on multiple fields at once
        response = self.app.get('/api/projects?type=web%20service&description=Another')
        self.assertEqual(response.status_code, 200)
        response = json.loads(response.data)
        self.assertEqual(response['total'], 1)
        self.assertEqual(response['objects'][0]['name'], u'Random Web App 2')

        # a value that matches nothing returns no results
        response = self.app.get('/api/projects?type=different%20service')
        self.assertEqual(response.status_code, 200)
        response = json.loads(response.data)
        self.assertEqual(response['total'], 0)

        # filter on a field of the parent organization
        response = self.app.get('/api/projects?organization_type=Code+for+All')
        self.assertEqual(response.status_code, 200)
        response = json.loads(response.data)
        self.assertEqual(response['total'], 1)

    def test_project_cascading_deletes(self):
        ''' Test that issues get deleted when their parent project and org are deleted
        '''
        # set up test objects and delete a project
        organization = OrganizationFactory(name=u'TEST ORG')
        db.session.flush()
        project = ProjectFactory(organization_name=organization.name, name=u'TEST PROJECT')
        db.session.flush()
        issue = IssueFactory(title=u'TEST ISSUE', project_id=project.id)
        another_issue = IssueFactory(title=u'ANOTHER TEST ISSUE', project_id=project.id)
        a_third_issue = IssueFactory(title=u'A THIRD TEST ISSUE', project_id=project.id)
        db.session.commit()

        # make sure the issues are in the db
        issues = db.session.query(Issue).all()
        self.assertEqual(len(issues), 3)

        # deleting the project should delete its issues
        db.session.execute('DELETE FROM project')
        db.session.commit()
        issues = db.session.query(Issue).all()
        self.assertEqual(len(issues), 0)

        # delete an organization
        project = ProjectFactory(organization_name=organization.name, name=u'TEST PROJECT')
        db.session.flush()
        issue = IssueFactory(title=u'TEST ISSUE', project_id=project.id)
        another_issue = IssueFactory(title=u'ANOTHER TEST ISSUE', project_id=project.id)
        a_third_issue = IssueFactory(title=u'A THIRD TEST ISSUE', project_id=project.id)
        db.session.commit()

        # make sure the issues are in the db
        issues = db.session.query(Issue).all()
        self.assertEqual(len(issues), 3)

        # deleting the organization should cascade to projects and issues
        db.session.execute('DELETE FROM organization')
        db.session.commit()
        issues = db.session.query(Issue).all()
        self.assertEqual(len(issues), 0)
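
The raw `DELETE FROM project` / `DELETE FROM organization` statements in the test above bypass the ORM, so they only cascade if the schema declares delete cascades. As a minimal, self-contained sketch of the kind of setup being exercised (hypothetical `Project`/`Issue` models on an in-memory SQLite database — not this project's actual schema):

```python
import sqlalchemy as sa
from sqlalchemy.orm import declarative_base, relationship, Session

Base = declarative_base()

class Project(Base):
    __tablename__ = 'project'
    id = sa.Column(sa.Integer, primary_key=True)
    # ORM-level cascade: deleting a Project via the session also deletes its Issues
    issues = relationship('Issue', cascade='all, delete-orphan', backref='project')

class Issue(Base):
    __tablename__ = 'issue'
    id = sa.Column(sa.Integer, primary_key=True)
    title = sa.Column(sa.Unicode)
    # DB-level cascade for raw SQL deletes that never touch the ORM
    project_id = sa.Column(sa.Integer, sa.ForeignKey('project.id', ondelete='CASCADE'))

engine = sa.create_engine('sqlite://')
Base.metadata.create_all(engine)

with Session(engine) as session:
    project = Project(issues=[Issue(title=u'TEST ISSUE'), Issue(title=u'ANOTHER TEST ISSUE')])
    session.add(project)
    session.commit()
    assert session.query(Issue).count() == 2
    session.delete(project)  # cascades to the child Issue rows
    session.commit()
    remaining = session.query(Issue).count()
```

Note that the ORM-level `cascade='all, delete-orphan'` only fires on `session.delete()`; raw SQL deletes like the ones in the test rely on the database-level `ON DELETE CASCADE` declared on the foreign key.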


# ---------------------------------------------------------------------------
# micropython/badger2040_modules_py/badgerpunk.py
# (repo: nathanmayall/pimoroni-pico, license: MIT)
# ---------------------------------------------------------------------------
version = '0.1'
_data =\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x0f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x1f\xc0\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x3f\xf0\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0f\xfc\x00\x00\x00\x7d'\
b'\xf0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x7f'\
b'\xff\x80\x00\x01\xf0\xf8\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\xff\xff\xf8\x0f\xff\xf0\x7c\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x03\xfd\x5f\xff\xff\xff\xf5\x3e'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0f\xe0\x01'\
b'\xff\xff\xff\xeb\x5f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x3f\x80\x00\x7f\xfc\x00\x01\xff\xc0\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\xff\x00\x01\x3f\xe0\x00\x00\x1f\xf0'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xfe\x17\x6d\x5f'\
b'\xff\xa0\x00\x07\xf8\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x01\xf8\x2b\xf6\xf7\xef\xfe\x00\x01\xfe\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x01\xe0\x2f\xdf\xfe\xfe\xff\x80\x00\x7f\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\xf1\x5f\xff\xfb\xf7'\
b'\xfb\xf0\x00\x1f\xc0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01'\
b'\xfe\xef\xff\xff\xff\x6f\xf4\x00\x0f\xc0\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x01\xff\xbf\xff\xbf\xff\xff\xf8\x00\x03\xe0\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xfb\xff\xff\xff\xbd\xfd'\
b'\xae\x00\x01\xf0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xff'\
b'\xdf\xff\xff\xff\xb7\xff\x00\x01\xf8\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\xff\xff\xff\xfe\xf7\xfe\xff\xc0\x00\x78\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\xff\xff\xff\xff\xdf\xdf\xff'\
b'\xc0\x00\x7c\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xff\xff'\
b'\xff\xff\xff\x7f\x7f\xd0\x00\x3f\xfe\x80\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x01\xf7\xff\xff\xf7\xfd\xff\xff\xf0\x00\x3f\xff\xfe\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\xff\xff\xff\xff\x7f\xff\xff\xfc'\
b'\x00\x1f\xff\xff\x80\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xff\xff\xff'\
b'\x7f\xff\xfb\xff\xfa\x00\x2a\xb7\xff\xe0\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x01\xf7\xff\xff\xff\xff\xff\xf5\x56\xff\xdd\x54\x0b\xf0\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x01\xfb\xff\xdd\xff\xfd\x7f\xff\xff\xff'\
b'\xff\xff\xe5\xf8\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\xeb\xfe\xff\xfd'\
b'\xf7\xff\x7f\xff\xff\xf6\xb4\x08\x78\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x01\xf5\xf7\xfb\xf7\xff\xfe\xe7\xfe\x80\x00\x00\x01\x3c\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x01\xed\x4b\xee\x5f\xde\xf7\xcf\xa0\x00\xab'\
b'\x52\x00\x3c\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x03\xf5\x64\x55\x57\x7f'\
b'\xff\x90\x02\xff\xff\xfe\xa8\x1e\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x03'\
b'\xea\x11\x01\x49\xe2\xfe\x00\xbf\xff\xeb\xf0\x14\x0f\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x03\xf0\x80\x54\x10\x14\xdc\x2f\xff\x40\x01\xe0'\
b'\x00\xaf\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x03\xda\x42\x00\x00\x00\x7c'\
b'\xff\x60\x02\xb8\x47\x50\x07\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x07\xa0'\
b'\x00\x00\x00\x00\xfc\xf8\x02\xff\xfc\x23\xaf\x07\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x07\x80\x00\x00\x00\x01\xfc\x00\xff\xff\xa0\x08\x01'\
b'\x6f\x80\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x07\x80\x00\x00\x00\x03\xfe\x0f'\
b'\xff\x80\x02\x04\x00\x1f\x80\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0f\x00\x00'\
b'\x00\x00\x01\xff\xff\xc0\x01\x7e\x07\xdc\x03\xc0\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x0f\x00\x00\x00\x00\x03\xff\xf8\x01\x7f\xfc\x03\xaf\x83'\
b'\xc0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x0e\x00\x00\x00\x00\x01\xff\xc0\x7f'\
b'\xfe\x8e\x03\x80\x13\xc0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0f\x00\x00\x00'\
b'\x00\x00\xff\x87\xff\xa0\x04\x01\x08\x01\xc0\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x7f\xff\xff\xe8\x00'\
b'\x00\x1e\x00\x00\x00\x00\x00\xff\x3f\xe8\x05\xf4\x00\x97\xa1\xc0'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0f'\
b'\xff\xff\xff\xff\xfe\x80\x7e\x00\x00\x00\x00\x00\x7f\x8a\x01\x7f'\
b'\xf2\x00\x04\x43\xc0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x7f\xff\xff\xff\xff\xff\xf5\xfe\x00\x00\x00\x00'\
b'\x00\x01\x80\x17\xfe\x80\x00\x60\x03\xc0\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x01\xff\xd0\x00\x00\xff\xff\xff'\
b'\xff\x80\x00\x00\x00\x00\x00\x42\xff\x40\x08\x00\x3f\x43\xc0\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x07\xf8\x00'\
b'\x00\x00\x00\x0f\xff\xf7\xf0\x00\x04\x00\x08\x00\x3d\x80\x02\xf8'\
b'\x00\x17\xf9\xc0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x0f\xc0\x00\x00\x00\x00\x00\x1f\xc0\xfe\x00\x00\x84\x52'\
b'\x81\x0f\x01\x7f\xf4\x00\x00\x23\xc0\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x3f\x00\x00\x00\x00\x00\x00\x00\x40'\
b'\x57\x00\x12\x11\x04\x00\x47\xbf\xff\xc8\x00\x07\xdf\x80\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xfc\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x01\xe8\x48\x82\x52\x04\x23\xff\x6a\x7a\x00'\
b'\x03\xff\x80\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x01\xf8\x20\x00\x49\x10\x00\x00\x00\x00\x00\x6f\x53\x68\x48\x90'\
b'\xa8\xb4\xaf\xfa\x00\x00\xff\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x03\xe0\x82\x83\xff\xff\xe9\x00\x00\x00\x80'\
b'\x95\xee\xbb\x0a\x45\x2a\xc5\xff\xec\x00\x00\x7e\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x03\xc0\x0f\x83\x7f\xff'\
b'\xff\xc0\x00\x00\x00\x6a\xbf\xed\x69\x15\x55\x2a\xff\xfa\x80\x00'\
b'\x3e\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x03'\
b'\x80\x70\x00\x00\x00\xaf\x83\xf4\x00\x00\x94\x5e\xff\xca\xaa\xd4'\
b'\xa9\x7f\xdd\x00\x00\x1f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x03\x80\x80\x00\x00\x00\x00\x03\xff\x80\x00\x2a'\
b'\x9b\xfb\x74\xd6\xb5\x4a\xff\xf4\x00\x10\x4f\x80\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x03\xc1\xc0\x00\x00\x00\x00'\
b'\x00\x5f\xc0\x81\x5d\x0f\xff\xee\x52\xfa\x37\x7f\xf5\x01\x45\x17'\
b'\x80\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x03\xc7'\
b'\xfa\xff\xff\xb4\x00\x00\x01\x80\x00\xb5\x56\xff\x7f\x5b\x5a\x94'\
b'\xff\xec\x04\x08\x4f\xc0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x01\xff\xff\xff\xff\xff\xff\x00\x00\x00\x02\x54\xdb'\
b'\xff\xfb\xed\x6b\x6b\x57\xb0\x01\x41\x27\xc0\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\xff\xff\xff\xff\xff\xff\xf0'\
b'\x08\x00\x00\xba\x4e\xff\xdf\x75\xad\xa9\x2e\xca\x02\x94\xb7\xc0'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x7f\xbf'\
b'\x68\x94\xff\xff\xae\x05\x05\x05\x6d\x1b\xdf\xff\xff\xf5\x6d\x6f'\
b'\xd0\x13\xd7\xdf\xe0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x08\x00\x00\x00\x0f\xfc\x24\x08\xb7\xf5\xfd\x5a\xff'\
b'\xff\xdd\xd5\x75\x2f\x49\x09\xdb\xff\xc0\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x3f\xe8\x09\x02'\
b'\xdf\xba\xb5\x6e\x7f\xff\xff\xdd\xfd\x5d\xd4\x07\xbf\xff\xc0\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x7f\x62\x28\x03\xeb\xef\xdb\xbd\xff\xff\xff\x77\xad\xef\xa4'\
b'\x55\xff\xff\xc0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\xfc\x94\x02\x09\xff\x7f\xfa\x4b\xff\xff'\
b'\xff\xff\x7a\x2f\x4a\x2b\xff\xff\xc0\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x03\xf2\xa0\x14\x83\xd5'\
b'\xd5\xbd\xaf\xff\xff\xff\xff\xef\x7f\xea\x8f\xff\xff\x80\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x07'\
b'\xca\xd0\x20\x01\xff\xbe\x6f\xef\xff\xff\xff\xff\xf5\xdf\xb1\x2b'\
b'\xff\xff\xc0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x0f\xc9\x08\x47\x07\xe5\xf5\x96\xff\xff\xff\xff'\
b'\xff\xff\xfb\xfc\x37\xff\xff\x80\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x1f\x14\xa8\x29\x83\x7a\xdf'\
b'\x51\xff\xff\xff\xff\xff\xff\xff\xfa\xc7\xff\xff\x80\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x3e\x00'\
b'\x00\x75\x89\xef\x7f\xef\x7f\xff\xff\xff\xff\xff\xff\xef\x5b\xff'\
b'\xff\x80\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x7c\x44\x50\xbf\xd7\xb5\xfd\x7b\xff\xff\xff\xff\xfd'\
b'\xff\xff\xff\x6f\xff\xff\x80\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\xf8\x13\xfc\x75\xa6\xfa\xdf\xff'\
b'\xff\xff\xff\xfb\xff\xff\xff\xff\xbf\xff\xff\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xf0\x01\xd0'\
b'\x41\xd3\xf6\xf8\xff\xff\xff\xff\xff\xfb\xff\xff\xff\xf5\xdf\xfe'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x01\xe0\x20\x5c\x20\x52\xbf\x76\xff\xff\xff\xff\xff\x73\xff'\
b'\xff\xff\xfe\xff\xfe\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x03\xc0\x0c\x20\x01\x45\xeb\xdd\xff\xff'\
b'\xff\xff\xff\xff\xff\x00\x40\xbb\xff\xf8\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x07\x80\x0f\x38\x00'\
b'\x83\xed\xf5\xff\xff\xff\xff\xff\xd1\xff\xe0\x00\x1f\xff\xe0\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x07\x80\x05\x01\x12\x40\xb7\xbf\xff\xff\xff\xff\xfd\xf5\x3f\xfa'\
b'\x00\x07\xff\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x0f\x00\x4b\xc8\x00\x01\xfa\xf6\xff\xff\xff'\
b'\xff\xff\xde\x4f\xfe\x00\x01\xff\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x1e\x00\x5f\xc0\x45\x41'\
b'\xd9\xdb\xff\xff\xff\xff\xff\x7f\xa1\xff\xc0\x00\x3f\xc0\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x1e'\
b'\x00\xfb\xe0\x80\x02\xb7\x6f\xff\xff\xff\xff\xff\xff\xf8\x3f\xf8'\
b'\x00\x0f\xf8\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x3c\x00\xff\xf0\x0a\x80\xf9\xff\xff\xff\xff\xff'\
b'\xfb\xff\xf7\x0f\xfe\x00\x03\xfe\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x38\x03\xff\xf0\xaf\x40\x3c'\
b'\xaf\xff\xff\xff\xff\xff\xbf\xff\xc3\xff\xc0\x00\x7f\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x38\x03'\
b'\xfd\xf0\x4e\xe0\xfd\x77\xff\xff\xff\xff\xff\xff\xff\xf8\x7f\xf0'\
b'\x00\x0f\x80\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x78\x07\x81\xf9\x1f\x40\x5a\x97\xff\xff\xff\xff\xfb'\
b'\xcf\xff\xf4\x0f\xfe\x00\x07\xc0\x00\x00\x03\x40\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x70\x06\x00\x34\x19\xe0\x7d\x5f'\
b'\xff\xff\xff\xff\xff\xff\xff\xff\x03\xff\x80\x03\xe0\x00\x00\x1f'\
b'\xf8\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x70\x00\x00'\
b'\x08\x9c\x60\xf6\x6f\xff\xff\xff\xff\xff\xef\xff\xff\xe0\x7f\xc0'\
b'\x03\xf8\x00\x00\x7f\xfe\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x70\x00\x04\x04\x18\x41\xfe\xbd\xff\xff\xff\xff\xfe\xbb'\
b'\xff\xff\xf0\x1f\xa8\x00\xfc\x00\x00\xff\xff\xc0\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x70\x00\xbf\x00\x38\x60\xfb\x3f\xff'\
b'\xff\xff\xff\xff\x6e\xbf\xff\xfe\x07\x07\x00\xff\x00\x03\xff\xff'\
b'\xf0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x70\x07\xf5\x80'\
b'\x3a\xe1\x7e\xdf\xff\xff\xff\xff\xff\xbf\xff\xff\xff\x40\x07\xc4'\
b'\x1f\xc0\x07\xff\xff\xfc\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x7a\x47\xf1\x80\x78\xb1\xef\x6f\xff\xff\xff\xff\xff\xef\xbf'\
b'\xff\x7f\xd0\x07\xf8\x07\xf0\x0f\xff\xff\xfe\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x79\x0f\xf3\xc0\x70\xfd\xff\x7d\xff\xff'\
b'\xff\xff\xff\xdd\xff\xff\x8f\xfa\x0f\xff\x03\xf8\x1f\xff\xff\xff'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x7a\x4f\xfb\xf8\x38'\
b'\xb7\xbf\xbf\xff\xff\xff\xff\xff\xf7\xff\xff\x83\xfd\x53\x1f\xc0'\
b'\xfe\x3f\xff\xff\xff\x80\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x7a\x8f\xff\xf8\x72\xfd\xf6\xf7\xff\xff\xff\xff\xff\xf7\xbf\xff'\
b'\xc0\x7f\x80\xe7\xf0\x3f\xff\xff\xbf\xff\xe0\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x7f\x0f\x1f\xfc\x7b\x7b\xff\xbf\xff\xff\xff'\
b'\xff\xff\xef\xff\xff\xc0\x3f\xe0\x11\xe8\x0f\xff\xfe\xff\xff\xf0'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x3a\x87\x00\xfc\x3e\xff'\
b'\x7f\xdf\xff\xff\xff\xff\xff\xf5\xdf\xff\xc0\x07\xf8\x15\x6c\x03'\
b'\xff\xff\xff\xff\xf8\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x3e'\
b'\x02\x20\x7c\x7b\xfd\xfd\xbb\xff\xff\xff\xff\xff\xfb\xff\xbf\xe0'\
b'\x01\xfe\x03\x1e\x80\xcf\xff\xff\xff\xfc\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x3f\x46\x00\x7c\x3f\x7d\xf7\xf7\xff\xff\xff\xff'\
b'\xff\xfb\xff\xef\xf0\x00\x7f\xc2\x1f\xd0\x0f\xff\xff\xfb\xfe\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x1e\x87\x80\x78\x3f\xdf\xff'\
b'\xbf\xff\xff\xff\xff\xff\xfb\x7f\xdf\xf0\x00\x1f\xf5\x3f\xf4\x0f'\
b'\xff\xff\xde\xbe\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x1f\x07'\
b'\xf8\xf0\x7f\xed\xdf\xff\xff\xff\xff\xff\xff\xfd\xef\xff\xf8\x00'\
b'\x03\xfe\xff\xf8\x7f\xff\xff\x7f\xff\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x0f\x4f\x54\xe0\x3d\xdf\xbf\x6f\xff\xff\xff\xff\xff'\
b'\xfd\xff\xd7\xf8\x00\x01\xff\xff\xfd\xbf\xff\xfd\xff\xdf\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x0f\x2f\x89\xc0\x7f\xfc\xfd\xfd'\
b'\xff\xff\xff\xff\xff\xfe\xff\xbf\xfc\x00\x00\x3f\xff\xfc\xf7\xff'\
b'\xff\xff\x7f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0e\x97\xff'\
b'\xc0\x7f\xd7\xf7\xf7\xff\xff\xff\xff\xff\xfe\xf7\xbf\xfc\x00\x00'\
b'\x0f\xff\xf9\xef\xff\xef\xfd\xef\x80\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x00\x1e\x27\xff\x80\x7f\xfe\xdf\xdf\xff\xff\xff\xff\xff\xff'\
b'\xff\xff\xfc\x00\x00\x01\xff\xd3\xd7\xff\xff\xff\xff\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x00\x1e\x11\xfe\x02\x7f\xef\xff\xff\xff'\
b'\xff\xff\xff\xff\xfe\xff\xff\xfe\x00\x00\x00\xff\xc0\xc7\xff\xff'\
b'\xff\xdf\x80\x00\x00\x00\x00\x00\x00\x00\x00\x00\x3c\x0a\x18\x06'\
b'\x7f\xfe\xff\xff\xff\xff\xff\xff\xff\xff\x7f\xef\xff\x00\x00\x00'\
b'\x3f\xc0\x47\xff\xff\xff\xff\x80\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\x00\x7c\x05\x20\x0e\xff\xff\xff\xef\xff\xff\xff\xff\xff\xff\x7f'\
b'\xff\xff\x00\x00\x00\x0f\xf0\x01\xff\xff\xff\x6f\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x00\x00\x78\x08\x8a\x5e\x7f\xfe\xff\xff\xff\xff'\
b'\xff\xff\xff\xff\xff\xff\xff\x80\x00\x00\x0f\xf8\x01\xff\xff\xff'\
b'\xff\x80\x00\x00\x00\x00\x00\x00\x00\x00\x00\xf0\x16\xa9\xfc\xff'\
b'\xff\xff\x77\xff\xff\xff\xff\xff\xfb\xbf\xf7\xff\x80\x00\x00\x1f'\
b'\xff\x00\xdf\xff\xff\x7f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'\
b'\xe2\x0a\x92\xfd\xff\xfd\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'\
b'\xff\x80\x00\x00\x3f\xff\x82\x0f\xff\xfb\xdf\x80\x00\x00\x00\x00'\
b'\x00\x00\x00\x00\x01\xe0\x2a\xd5\xfc\xff\xff\xfd\xff\xff\xff\xff'\
b'\xff\xff\xff\xff\xef\xff\xc0\x00\x00\x7f\xff\xe2\x47\xfd\xff\xff'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x03\xc2\x0f\x5b\xfb\xff\xff'\
b'\xf7\xbf\xff\xff\xff\xff\xff\xfb\xfd\xad\xff\xc0\x00\x01\xff\xff'\
b'\xe4\x85\xff\xff\x7f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x03\x80'\
b'\x6c\xff\xfb\xff\xff\xff\xf7\xff\xff\xff\xff\xff\xff\xdf\xe7\xff'\
b'\xe0\x00\x07\xff\xff\xe2\x01\xbf\xfd\xff\x80\x00\x00\x00\x00\x00'\
b'\x00\x00\x00\x07\x84\xbb\x6f\xe3\xff\xff\xf7\xff\xff\xff\xff\xff'\
b'\xff\xfb\xff\xff\xff\xe0\x00\x3f\xff\xff\xca\x02\x3f\xff\x8f\xef'\
b'\x00\x00\x00\x00\x00\x00\x00\x00\x0f\x02\x5e\xbf\xe7\xff\xff\xf6'\
b'\xff\xff\xff\xff\xff\xff\xfd\xff\xdf\xff\xf0\x00\xff\xff\xff\xc6'\
b'\x20\x3f\xfe\xc3\xff\x80\x00\x00\x00\x00\x00\x00\x00\x0f\x0a\xbe'\
b'\xff\x9f\xff\xff\xeb\xdf\xff\xff\xff\xff\xff\xff\xdf\xeb\xff\xf0'\
b'\x03\xff\xff\xff\x4c\x72\x3f\xfb\x80\xff\x80\x00\x00\x00\x00\x00'\
b'\x00\x00\x1e\x0c\xff\xfa\x1f\xff\xff\xff\xfd\xff\xff\xff\xff\xff'\
b'\xfd\x7f\xbf\xef\xf8\x0f\xff\xff\xfd\x04\x78\x0f\xfb\x60\x7f\xc0'\
b'\x00\x00\x00\x00\x00\x00\x00\x1e\x19\xbf\x7c\x7f\xff\xff\xff\xbb'\
b'\xff\xff\xff\xff\xff\xfe\xef\xff\xff\xfc\x3f\xff\xff\xfe\x88\xf0'\
b'\x03\x7b\x70\x03\xc0\x00\x00\x00\x00\x00\x00\x00\x1c\x15\x7b\xf9'\
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x7f\xff\xd7\xff\xff'\
b'\xff\xff\xfd\x25\x61\x88\x66\xfe\x01\xf0\x00\x00\x00\x00\x00\x00'\
b'\x38\x3c\x3d\xff\xf7\xff\xff\xff\xef\xff\xff\xff\xff\xff\xff\xfe'\
b'\xff\xff\xff\xff\xff\xff\xff\xfe\x51\xe3\xc0\x1f\xff\x01\xf8\x00'\
b'\x00\x00\x00\x00\x00\xff\xf8\x77\xff\xff\xff\xff\xff\xff\xdd\xff'\
b'\xff\xff\xff\xff\xff\xaf\xff\xef\xff\xff\xff\xff\xfe\xa1\x43\x88'\
b'\x0d\xff\xc0\x7e\x00\x00\x00\x00\x00\x03\xff\xf8\x2d\xff\xff\xff'\
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xfe\xff\xfd\xff\xff\xff\xff'\
b'\xff\xff\x82\x8d\x8a\x0b\xff\xf0\x3f\x80\x00\x00\x00\x00\x07\xff'\
b'\xf0\xff\xfe\xff\xff\xff\xff\xff\xf7\xff\xff\xff\xff\xff\xff\xff'\
b'\xff\xff\xff\xff\xff\xff\xff\x51\x87\xd5\xb0\xff\xe0\x0f\xe0\x00'\
b'\x00\x00\x00\x0f\xc7\xf0\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'\
b'\xff\xff\xff\xff\xef\xff\xff\xff\xff\xff\xff\xff\xd6\x8a\xfb\xf0'\
b'\x3f\xc0\x03\xf8\x00\x00\x00\x00\x0f\x3c\xf0\xff\xff\xff\xff\xff'\
b'\xff\xff\xff\xff\xff\xff\xff\xff\xfd\xff\xfe\xff\xff\xff\xff\xff'\
b'\xff\xe9\x57\xdf\xf8\x0f\x03\xc0\xff\x00\x00\x00\x00\x1f\x7f\xe1'\
b'\xff\xff\xff\xff\xff\xff\xfb\xef\xff\xff\xff\xff\xff\xff\xef\xff'\
b'\xff\xff\xff\xff\xff\xff\xf4\xaa\xff\xfe\x04\x0f\xf0\x7f\xc0\x00'\
b'\x00\x00\x1c\xff\xe3\x9f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'\
b'\xff\xff\xfd\xff\xff\xff\x7f\xff\xff\xff\xff\xfd\x47\xff\xff\x80'\
b'\x3f\xfc\x0f\xf0\x00\x00\x00\x1d\xff\xe3\xef\xff\xff\xff\xff\xff'\
b'\xf7\xf7\xff\xff\xff\xff\xff\xff\xff\xff\xfd\xff\xff\xff\xff\xff'\
b'\xff\xaf\xff\xff\xe0\x3f\xfe\x03\xf8\x00\x00\x00\x3d\xff\xc6\xaf'\
b'\xff\xff\xff\xff\xff\xdf\xff\xff\xff\xff\xff\xff\xff\xff\xfe\xff'\
b'\xff\xff\xff\xff\xff\xff\xf7\xff\xc7\xf0\x1f\xff\x80\xff\x00\x00'\
b'\x00\x3d\xff\xcb\x3f\xff\xff\xff\xff\xff\xf7\xff\xff\xff\xff\xff'\
b'\xff\xfe\xfb\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x01\xfe\x07'\
b'\xff\xe0\x1f\x80\x00\x00\x3d\xff\xfd\x4b\xff\xff\xff\xff\xff\xfd'\
b'\xf7\xff\xff\xff\xff\xff\xfd\xff\xff\xff\xff\xff\xff\xff\xff\xff'\
b'\xff\xfc\x00\x7f\x01\xff\xf0\x0f\xe0\x00\x00\x7b\xff\xd5\x3f\xff'\
b'\xff\xff\xff\xff\xbf\xff\xff\xff\xff\xff\xff\xff\x7f\xff\xef\xff'\
b'\xff\xff\xff\xff\xff\xff\xf8\x00\x1f\xc0\xff\xfe\x07\xf0\x00\x00'

_mvdata = memoryview(_data)


def data():
    return _mvdata
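The module tail above wraps the embedded byte payload in a `memoryview` so callers can index and slice the asset without copying it. A minimal, self-contained sketch of the same pattern — the 16-byte `_data` payload below is a placeholder, not this module's real data:

```python
# Zero-copy access pattern: expose immutable bytes through a memoryview.
# _data is a placeholder payload standing in for the embedded asset.
_data = bytes(range(16))
_mvdata = memoryview(_data)


def data():
    return _mvdata


view = data()
assert view[0] == 0                               # indexing reads straight from _data
assert view[:4].tobytes() == b'\x00\x01\x02\x03'  # slicing yields another view, no copy
assert view.readonly                              # views over bytes are read-only
```

Returning one shared `memoryview` means every consumer sees the same underlying buffer; since the source is `bytes`, the view is read-only and safe to hand out.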
| 66.931596 | 68 | 0.709899 | 5,050 | 20,548 | 2.887327 | 0.044356 | 0.783074 | 0.999314 | 1.134902 | 0.691173 | 0.65942 | 0.612441 | 0.55442 | 0.529593 | 0.513065 | 0 | 0.303817 | 0.015671 | 20,548 | 306 | 69 | 67.150327 | 0.417087 | 0.001557 | 0 | 0.096346 | 1 | 0.983389 | 0.923613 | 0.923467 | 0 | 1 | 0 | 0 | 0 | 1 | 0.003322 | false | 0 | 0 | 0.003322 | 0.006645 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
d1241e30242c8ff364e144daed1afc7d3dd6fc69 | 4,675 | py | Python | data/train/python/d1241e30242c8ff364e144daed1afc7d3dd6fc69collector_test.py | harshp8l/deep-learning-lang-detection | 2a54293181c1c2b1a2b840ddee4d4d80177efb33 | [
"MIT"
] | 84 | 2017-10-25T15:49:21.000Z | 2021-11-28T21:25:54.000Z | data/train/python/d1241e30242c8ff364e144daed1afc7d3dd6fc69collector_test.py | vassalos/deep-learning-lang-detection | cbb00b3e81bed3a64553f9c6aa6138b2511e544e | [
"MIT"
] | 5 | 2018-03-29T11:50:46.000Z | 2021-04-26T13:33:18.000Z | data/train/python/d1241e30242c8ff364e144daed1afc7d3dd6fc69collector_test.py | vassalos/deep-learning-lang-detection | cbb00b3e81bed3a64553f9c6aa6138b2511e544e | [
"MIT"
] | 24 | 2017-11-22T08:31:00.000Z | 2022-03-27T01:22:31.000Z |
__author__ = 'Ahmed Gamal A. Ali'
import errno
import os
from tests.utils import remove_save_dir
import unittest
from src.image_collector import CollectImages
class ImagesCollectorTest(unittest.TestCase):
def test_image_file_not_exists(self):
image_file = '/path/not/found/file.txt'
save_dir = '/some/save/dir'
try:
CollectImages(image_file=image_file, save_dir=save_dir)
        except OSError as e:
            self.assertEqual(str(e), "File %s Not Found" % image_file)
        except Exception as e:
            self.fail('Unexpected exception thrown: %s' % e)
else:
self.fail('Expected Exception not thrown')
remove_save_dir(save_dir)
def test_image_save_dir_not_exists(self):
image_file = './tests/fixtures/empty.txt'
save_dir = '/some/save/dir'
try:
CollectImages(image_file=image_file, save_dir=save_dir)
        except OSError as e:
            self.assertIn("Save Directory not found", str(e))
        except Exception as e:
            self.fail('Unexpected exception thrown: %s' % e)
else:
self.fail('Expected Exception not thrown')
remove_save_dir(save_dir)
def test_image_save_dir_permission_denied(self):
image_file = './tests/fixtures/empty.txt'
save_dir = '/dir'
try:
CollectImages(image_file=image_file, save_dir=save_dir, create=True)
        except OSError as e:
            self.assertEqual(e.errno, errno.EACCES)
        except Exception as e:
            self.fail('Unexpected exception thrown: %s' % e)
else:
self.fail('Expected Exception not thrown')
remove_save_dir(save_dir)
def test_image_empty_file(self):
image_file = './tests/fixtures/empty.txt'
save_dir = '/tmp/save_dir/'
try:
CollectImages(image_file=image_file, save_dir=save_dir, create=True)
        except Exception as e:
            self.assertEqual('Images File is Empty', str(e))
else:
self.fail('Expected Exception not thrown')
remove_save_dir(save_dir)
def test_no_urls_in_file(self):
image_file = './tests/fixtures/invalid.txt'
save_dir = '/tmp/save_dir/'
try:
CollectImages(image_file=image_file, save_dir=save_dir, create=True)
        except Exception as e:
            self.assertEqual('No Valid URLs Found', str(e))
else:
self.fail('Expected Exception not thrown')
remove_save_dir(save_dir)
def test_one_urls_in_file(self):
image_file = './tests/fixtures/one_url.txt'
save_dir = '/tmp/save_dir/'
try:
CollectImages(image_file=image_file, save_dir=save_dir, create=True)
        except Exception as e:
self.fail('Not Expected Exception thrown: ' + str(e))
self.assertEqual(1, len(os.listdir(os.path.join(save_dir, 'images'))))
remove_save_dir(save_dir)
def test_valid_but_not_found_urls(self):
image_file = './tests/fixtures/valid_not_found.txt'
save_dir = '/tmp/save_dir/'
try:
CollectImages(image_file=image_file, save_dir=save_dir, create=True)
        except Exception as e:
self.fail('Not Expected Exception thrown: ' + str(e))
self.assertEqual(0, len(os.listdir(os.path.join(save_dir, 'images'))))
remove_save_dir(save_dir)
def test_mixed_file(self):
image_file = './tests/fixtures/mixed.txt'
save_dir = '/tmp/save_dir/'
try:
CollectImages(image_file=image_file, save_dir=save_dir, create=True)
        except Exception as e:
self.fail('Not Expected Exception thrown: ' + str(e))
self.assertEqual(3, len(os.listdir(os.path.join(save_dir, 'images'))))
remove_save_dir(save_dir)
def test_repeated_urls(self):
image_file = './tests/fixtures/repeated.txt'
save_dir = '/tmp/save_dir/'
try:
CollectImages(image_file=image_file, save_dir=save_dir, create=True)
        except Exception as e:
self.fail('Not Expected Exception thrown: ' + str(e))
self.assertEqual(3, len(os.listdir(os.path.join(save_dir, 'images'))))
remove_save_dir(save_dir)
def test_all_valid(self):
image_file = './tests/fixtures/all_valid.txt'
save_dir = '/tmp/save_dir/'
try:
CollectImages(image_file=image_file, save_dir=save_dir, create=True)
        except Exception as e:
self.fail('Not Expected Exception thrown: ' + str(e))
self.assertEqual(4, len(os.listdir(os.path.join(save_dir, 'images'))))
        remove_save_dir(save_dir)
| 38.00813 | 80 | 0.633155 | 608 | 4,675 | 4.621711 | 0.125 | 0.166904 | 0.078292 | 0.099644 | 0.848399 | 0.828114 | 0.781851 | 0.781851 | 0.756228 | 0.719573 | 0 | 0.001439 | 0.256684 | 4,675 | 123 | 81 | 38.00813 | 0.807194 | 0 | 0 | 0.694444 | 0 | 0 | 0.198888 | 0.059666 | 0 | 0 | 0 | 0 | 0.092593 | 0 | null | null | 0 | 0.046296 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7
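Under Python 3, the recurring `try:/except …:/else: self.fail(...)` scaffolding in the test file above is usually written with `assertRaises` as a context manager. A hedged sketch of that style — `collect()` is a hypothetical stand-in for `CollectImages`, used only to show the shape:

```python
import unittest


def collect(image_file):
    # Hypothetical stand-in for CollectImages: raises like the code under test.
    raise OSError("File %s Not Found" % image_file)


class CollectorStyleDemo(unittest.TestCase):
    def test_missing_file_raises(self):
        # assertRaises as a context manager replaces the
        # try/except/else: self.fail(...) boilerplate.
        with self.assertRaises(OSError) as ctx:
            collect('/path/not/found/file.txt')
        self.assertIn('Not Found', str(ctx.exception))


suite = unittest.TestLoader().loadTestsFromTestCase(CollectorStyleDemo)
result = unittest.TextTestRunner(verbosity=0).run(suite)
assert result.wasSuccessful()
```

The context manager both fails the test automatically when no exception is raised and exposes the caught exception via `ctx.exception` for follow-up assertions.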
d15ababcf9a482f7209fa433c4a6e43b1b2c171a | 29,745 | py | Python | great_international/migrations/0014_auto_20190404_1322.py | uktrade/directory-cms | 8c8d13ce29ea74ddce7a40f3dd29c8847145d549 | [
"MIT"
] | 6 | 2018-03-20T11:19:07.000Z | 2021-10-05T07:53:11.000Z | great_international/migrations/0014_auto_20190404_1322.py | uktrade/directory-cms | 8c8d13ce29ea74ddce7a40f3dd29c8847145d549 | [
"MIT"
] | 802 | 2018-02-05T14:16:13.000Z | 2022-02-10T10:59:21.000Z | great_international/migrations/0014_auto_20190404_1322.py | uktrade/directory-cms | 8c8d13ce29ea74ddce7a40f3dd29c8847145d549 | [
"MIT"
] | 6 | 2019-01-22T13:19:37.000Z | 2019-07-01T10:35:26.000Z |
# -*- coding: utf-8 -*-
# Generated by Django 1.11.18 on 2019-04-04 13:22
from __future__ import unicode_literals
import core.model_fields
import core.validators
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('wagtailimages', '0021_image_file_hash'),
('great_international', '0013_auto_20190404_1321'),
]
operations = [
migrations.AddField(
model_name='internationalguidelandingpage',
name='display_title_ar',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='display_title_de',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='display_title_en_gb',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='display_title_es',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='display_title_fr',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='display_title_ja',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='display_title_pt',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='display_title_pt_br',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='display_title_ru',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='display_title_zh_hans',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='guides_section_heading_ar',
field=models.CharField(max_length=100, null=True, verbose_name='section heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='guides_section_heading_de',
field=models.CharField(max_length=100, null=True, verbose_name='section heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='guides_section_heading_en_gb',
field=models.CharField(max_length=100, null=True, verbose_name='section heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='guides_section_heading_es',
field=models.CharField(max_length=100, null=True, verbose_name='section heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='guides_section_heading_fr',
field=models.CharField(max_length=100, null=True, verbose_name='section heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='guides_section_heading_ja',
field=models.CharField(max_length=100, null=True, verbose_name='section heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='guides_section_heading_pt',
field=models.CharField(max_length=100, null=True, verbose_name='section heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='guides_section_heading_pt_br',
field=models.CharField(max_length=100, null=True, verbose_name='section heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='guides_section_heading_ru',
field=models.CharField(max_length=100, null=True, verbose_name='section heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='guides_section_heading_zh_hans',
field=models.CharField(max_length=100, null=True, verbose_name='section heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='hero_image_ar',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='hero_image_de',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='hero_image_en_gb',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='hero_image_es',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='hero_image_fr',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='hero_image_ja',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='hero_image_pt',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='hero_image_pt_br',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='hero_image_ru',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='hero_image_zh_hans',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_content_ar',
field=core.model_fields.MarkdownField(null=True, validators=[core.validators.slug_hyperlinks], verbose_name='content'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_content_de',
field=core.model_fields.MarkdownField(null=True, validators=[core.validators.slug_hyperlinks], verbose_name='content'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_content_en_gb',
field=core.model_fields.MarkdownField(null=True, validators=[core.validators.slug_hyperlinks], verbose_name='content'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_content_es',
field=core.model_fields.MarkdownField(null=True, validators=[core.validators.slug_hyperlinks], verbose_name='content'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_content_fr',
field=core.model_fields.MarkdownField(null=True, validators=[core.validators.slug_hyperlinks], verbose_name='content'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_content_ja',
field=core.model_fields.MarkdownField(null=True, validators=[core.validators.slug_hyperlinks], verbose_name='content'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_content_pt',
field=core.model_fields.MarkdownField(null=True, validators=[core.validators.slug_hyperlinks], verbose_name='content'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_content_pt_br',
field=core.model_fields.MarkdownField(null=True, validators=[core.validators.slug_hyperlinks], verbose_name='content'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_content_ru',
field=core.model_fields.MarkdownField(null=True, validators=[core.validators.slug_hyperlinks], verbose_name='content'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_content_zh_hans',
field=core.model_fields.MarkdownField(null=True, validators=[core.validators.slug_hyperlinks], verbose_name='content'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_ar',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_caption_ar',
field=models.CharField(blank=True, max_length=100, null=True, verbose_name='image caption'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_caption_de',
field=models.CharField(blank=True, max_length=100, null=True, verbose_name='image caption'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_caption_en_gb',
field=models.CharField(blank=True, max_length=100, null=True, verbose_name='image caption'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_caption_es',
field=models.CharField(blank=True, max_length=100, null=True, verbose_name='image caption'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_caption_fr',
field=models.CharField(blank=True, max_length=100, null=True, verbose_name='image caption'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_caption_ja',
field=models.CharField(blank=True, max_length=100, null=True, verbose_name='image caption'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_caption_pt',
field=models.CharField(blank=True, max_length=100, null=True, verbose_name='image caption'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_caption_pt_br',
field=models.CharField(blank=True, max_length=100, null=True, verbose_name='image caption'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_caption_ru',
field=models.CharField(blank=True, max_length=100, null=True, verbose_name='image caption'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_caption_zh_hans',
field=models.CharField(blank=True, max_length=100, null=True, verbose_name='image caption'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_de',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_en_gb',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_es',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_fr',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_ja',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_pt',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_pt_br',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_ru',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_one_image_zh_hans',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_text_ar',
field=models.CharField(max_length=100, null=True, verbose_name='button text'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_text_de',
field=models.CharField(max_length=100, null=True, verbose_name='button text'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_text_en_gb',
field=models.CharField(max_length=100, null=True, verbose_name='button text'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_text_es',
field=models.CharField(max_length=100, null=True, verbose_name='button text'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_text_fr',
field=models.CharField(max_length=100, null=True, verbose_name='button text'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_text_ja',
field=models.CharField(max_length=100, null=True, verbose_name='button text'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_text_pt',
field=models.CharField(max_length=100, null=True, verbose_name='button text'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_text_pt_br',
field=models.CharField(max_length=100, null=True, verbose_name='button text'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_text_ru',
field=models.CharField(max_length=100, null=True, verbose_name='button text'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_text_zh_hans',
field=models.CharField(max_length=100, null=True, verbose_name='button text'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_url_ar',
field=models.CharField(max_length=255, null=True, verbose_name='button URL'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_url_de',
field=models.CharField(max_length=255, null=True, verbose_name='button URL'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_url_en_gb',
field=models.CharField(max_length=255, null=True, verbose_name='button URL'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_url_es',
field=models.CharField(max_length=255, null=True, verbose_name='button URL'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_url_fr',
field=models.CharField(max_length=255, null=True, verbose_name='button URL'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_url_ja',
field=models.CharField(max_length=255, null=True, verbose_name='button URL'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_url_pt',
field=models.CharField(max_length=255, null=True, verbose_name='button URL'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_url_pt_br',
field=models.CharField(max_length=255, null=True, verbose_name='button URL'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_url_ru',
field=models.CharField(max_length=255, null=True, verbose_name='button URL'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_button_url_zh_hans',
field=models.CharField(max_length=255, null=True, verbose_name='button URL'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_heading_ar',
field=models.CharField(max_length=100, null=True, verbose_name='heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_heading_de',
field=models.CharField(max_length=100, null=True, verbose_name='heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_heading_en_gb',
field=models.CharField(max_length=100, null=True, verbose_name='heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_heading_es',
field=models.CharField(max_length=100, null=True, verbose_name='heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_heading_fr',
field=models.CharField(max_length=100, null=True, verbose_name='heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_heading_ja',
field=models.CharField(max_length=100, null=True, verbose_name='heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_heading_pt',
field=models.CharField(max_length=100, null=True, verbose_name='heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_heading_pt_br',
field=models.CharField(max_length=100, null=True, verbose_name='heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_heading_ru',
field=models.CharField(max_length=100, null=True, verbose_name='heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_heading_zh_hans',
field=models.CharField(max_length=100, null=True, verbose_name='heading'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_image_ar',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_image_de',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_image_en_gb',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_image_es',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_image_fr',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_image_ja',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_image_pt',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_image_pt_br',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_image_ru',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_image_zh_hans',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image', verbose_name='image'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_teaser_ar',
field=models.TextField(null=True, verbose_name='teaser'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_teaser_de',
field=models.TextField(null=True, verbose_name='teaser'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_teaser_en_gb',
field=models.TextField(null=True, verbose_name='teaser'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_teaser_es',
field=models.TextField(null=True, verbose_name='teaser'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_teaser_fr',
field=models.TextField(null=True, verbose_name='teaser'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_teaser_ja',
field=models.TextField(null=True, verbose_name='teaser'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_teaser_pt',
field=models.TextField(null=True, verbose_name='teaser'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_teaser_pt_br',
field=models.TextField(null=True, verbose_name='teaser'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_teaser_ru',
field=models.TextField(null=True, verbose_name='teaser'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='section_two_teaser_zh_hans',
field=models.TextField(null=True, verbose_name='teaser'),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='teaser_ar',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='teaser_de',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='teaser_en_gb',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='teaser_es',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='teaser_fr',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='teaser_ja',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='teaser_pt',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='teaser_pt_br',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='teaser_ru',
field=models.CharField(max_length=255, null=True),
),
migrations.AddField(
model_name='internationalguidelandingpage',
name='teaser_zh_hans',
field=models.CharField(max_length=255, null=True),
),
]
| 47.975806 | 159 | 0.645756 | 2,901 | 29,745 | 6.34919 | 0.031024 | 0.11727 | 0.149845 | 0.175905 | 0.980944 | 0.978989 | 0.978338 | 0.978012 | 0.973614 | 0.971008 | 0 | 0.011019 | 0.243369 | 29,745 | 619 | 160 | 48.053312 | 0.807385 | 0.00232 | 0 | 0.784314 | 1 | 0 | 0.254339 | 0.183096 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.00817 | 0 | 0.013072 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
d164b4b2c4ae2d59311570f47189753e26b2b2c9 | 206,749 | py | Python | openbook_posts/tests/views/test_posts.py | TamaraAbells/okuna-api | f87d8e80d2f182c01dbce68155ded0078ee707e4 | [
"MIT"
] | 164 | 2019-07-29T17:59:06.000Z | 2022-03-19T21:36:01.000Z | openbook_posts/tests/views/test_posts.py | TamaraAbells/okuna-api | f87d8e80d2f182c01dbce68155ded0078ee707e4 | [
"MIT"
] | 188 | 2019-03-16T09:53:25.000Z | 2019-07-25T14:57:24.000Z | openbook_posts/tests/views/test_posts.py | TamaraAbells/okuna-api | f87d8e80d2f182c01dbce68155ded0078ee707e4 | [
"MIT"
] | 80 | 2019-08-03T17:49:08.000Z | 2022-02-28T16:56:33.000Z | # Create your tests here.
import tempfile
from unittest import mock
from PIL import Image
from django.conf import settings
from django.core.files import File
from django.core.files.uploadedfile import SimpleUploadedFile
from django.urls import reverse
from django_rq import get_worker
from faker import Faker
from rest_framework import status
from rq import SimpleWorker
from openbook_common.tests.models import OpenbookAPITestCase
from mixer.backend.django import mixer
from openbook.settings import POST_MAX_LENGTH
from openbook_auth.models import User, UserNotificationsSubscription
import random
import logging
import json
from openbook_circles.models import Circle
from openbook_common.tests.helpers import make_user, make_users, make_fake_post_text, \
make_authentication_headers_for_user, make_circle, make_community, make_list, make_moderation_category, \
get_test_usernames, get_test_videos, get_test_image, make_global_moderator, \
make_fake_post_comment_text, make_reactions_emoji_group, make_emoji, make_hashtag_name, make_hashtag, \
get_test_valid_hashtags, get_test_invalid_hashtags, get_post_links
from openbook_common.utils.helpers import sha256sum, normalize_url
from openbook_communities.models import Community
from openbook_hashtags.models import Hashtag
from openbook_lists.models import List
from openbook_moderation.models import ModeratedObject
from openbook_notifications.models import PostUserMentionNotification, Notification, UserNewPostNotification
from openbook_posts.jobs import curate_top_posts, curate_trending_posts
from openbook_posts.models import Post, PostUserMention, PostMedia, TopPost, TrendingPost, PostLink
logger = logging.getLogger(__name__)
fake = Faker()
# TODO A lot of setup duplication. Perhaps it's a good idea to create a single factory on top of mixer or Factory Boy
class PostsAPITests(OpenbookAPITestCase):
"""
PostsAPI
"""
fixtures = [
'openbook_circles/fixtures/circles.json',
'openbook_common/fixtures/languages.json'
]
def test_create_text_post(self):
"""
should be able to create a text post and return 201
"""
user = make_user()
auth_token = user.auth_token.key
post_text = fake.text(max_nb_chars=POST_MAX_LENGTH)
headers = {'HTTP_AUTHORIZATION': 'Token %s' % auth_token}
data = {
'text': post_text
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertTrue(user.posts.filter(text=post_text).count() == 1)
world_circle = Circle.get_world_circle()
self.assertTrue(world_circle.posts.filter(text=post_text).count() == 1)
def test_create_text_post_with_only_link_should_not_throw_error(self):
"""
should be able to create a text post with only a link and return 201
"""
user = make_user()
auth_token = user.auth_token.key
post_text = 'https://www.okuna.io'
headers = {'HTTP_AUTHORIZATION': 'Token %s' % auth_token}
data = {
'text': post_text
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertTrue(user.posts.filter(text=post_text).count() == 1)
world_circle = Circle.get_world_circle()
self.assertTrue(world_circle.posts.filter(text=post_text).count() == 1)
def test_create_text_post_detect_language(self):
"""
should be able to create a text post and detect its language and return 201
"""
user = make_user()
auth_token = user.auth_token.key
post_text = fake.text(max_nb_chars=POST_MAX_LENGTH)
headers = {'HTTP_AUTHORIZATION': 'Token %s' % auth_token}
data = {
'text': post_text
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertTrue(user.posts.get(text=post_text).language.code is not None)
def test_create_text_post_with_hashtag_creates_hashtag_if_not_exist(self):
"""
when creating a post with a hashtag, should create the hashtag if it does not exist
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
hashtag = make_hashtag_name()
post_text = 'A hashtag #' + hashtag
data = {
'text': post_text
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(status.HTTP_201_CREATED, response.status_code)
post = Post.objects.get(text=post_text, creator_id=user.pk)
created_hashtag = Hashtag.objects.get(name=hashtag)
self.assertTrue(post.hashtags.filter(pk=created_hashtag.pk).exists())
self.assertEqual(post.hashtags.all().count(), 1)
def test_create_text_post_with_existing_hashtag_adds_it(self):
"""
when creating a post with an existing hashtag, should add it
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
hashtag = make_hashtag()
new_post_text = 'A hashtag #' + hashtag.name
data = {
'text': new_post_text
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(status.HTTP_201_CREATED, response.status_code)
post = Post.objects.get(text=new_post_text, creator_id=user.pk)
self.assertTrue(post.hashtags.filter(pk=hashtag.pk).exists())
self.assertEqual(Hashtag.objects.filter(pk=hashtag.pk).count(), 1)
self.assertEqual(post.hashtags.all().count(), 1)
def test_create_text_post_with_repeated_hashtag_does_not_create_double_hashtags(self):
"""
when creating a post with a repeated hashtag, should not create the hashtag twice
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
hashtag_name = make_hashtag_name()
post_text = '#%s #%s' % (hashtag_name, hashtag_name)
data = {
'text': post_text
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(status.HTTP_201_CREATED, response.status_code)
post = Post.objects.get(text=post_text, creator_id=user.pk)
hashtag = Hashtag.objects.get(name=hashtag_name)
self.assertEqual(post.hashtags.all().count(), 1)
self.assertEqual(post.hashtags.filter(name=hashtag.name).count(), 1)
self.assertEqual(Hashtag.objects.filter(name=hashtag.name).count(), 1)
def test_create_text_post_with_valid_hashtags_creates_them(self):
"""
when creating a post with valid hashtags, should create them
"""
user = make_user()
valid_hashtags = get_test_valid_hashtags()
for valid_hashtag in valid_hashtags:
headers = make_authentication_headers_for_user(user=user)
post_text = 'Valid hashtag #' + valid_hashtag
data = {
'text': post_text
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(status.HTTP_201_CREATED, response.status_code)
post = Post.objects.get(text=post_text, creator_id=user.pk)
created_hashtag = Hashtag.objects.get(name=valid_hashtag)
self.assertTrue(post.hashtags.filter(pk=created_hashtag.pk).exists())
self.assertEqual(post.hashtags.all().count(), 1)
def test_create_text_post_with_invalid_hashtags_does_not_create_them(self):
"""
when creating a post with invalid hashtags, should not create them
"""
user = make_user()
invalid_hashtags = get_test_invalid_hashtags()
for invalid_hashtag in invalid_hashtags:
headers = make_authentication_headers_for_user(user=user)
post_text = 'Invalid hashtag #' + invalid_hashtag
data = {
'text': post_text
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(status.HTTP_201_CREATED, response.status_code)
post = Post.objects.get(text=post_text, creator_id=user.pk)
self.assertFalse(Hashtag.objects.filter(name=invalid_hashtag).exists())
self.assertFalse(post.hashtags.all().exists())
def test_create_text_post_with_exceedingly_long_hashtag_should_not_create_it(self):
"""
when creating a post with an exceedingly long hashtag, should not create it
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
long_hashtag = 'a' * (settings.HASHTAG_NAME_MAX_LENGTH + 1)
post_text = 'Long hashtag #' + long_hashtag
data = {
'text': post_text
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(status.HTTP_400_BAD_REQUEST, response.status_code)
self.assertFalse(Post.objects.filter(text=post_text).exists())
self.assertFalse(Hashtag.objects.filter(name=long_hashtag).exists())
def test_create_text_post_with_more_hashtags_than_allowed_should_not_create_it(self):
"""
when creating a post with exceeding hashtags, should not create it
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
post_hashtags = []
for i in range(0, settings.POST_MAX_HASHTAGS + 1):
hashtag = '#%s' % make_hashtag_name()
post_hashtags.append(hashtag)
post_text = ' '.join(post_hashtags)
data = {
'text': post_text
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(status.HTTP_400_BAD_REQUEST, response.status_code)
self.assertFalse(Post.objects.filter(text=post_text).exists())
def test_create_text_post_detects_mentions_once(self):
"""
should be able to create a text post with a mention and detect it once
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
test_usernames = get_test_usernames()
for test_username in test_usernames:
test_user = make_user(username=test_username)
post_text = 'Hello @' + test_user.username + ' @' + test_user.username
data = {
'text': post_text
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
post = Post.objects.get(text=post_text, creator_id=user.pk)
self.assertEqual(PostUserMention.objects.filter(user_id=test_user.pk, post_id=post.pk).count(), 1)
def test_create_text_detect_mention_is_case_insensitive(self):
"""
should detect mention regardless of the username letter cases
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
mentioned_user = make_user(username='joel132')
post_text = 'Hello @JoEl132'
data = {
'text': post_text,
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
post = Post.objects.get(text=post_text, creator_id=user.pk)
self.assertTrue(PostUserMention.objects.filter(post_id=post.pk, user_id=mentioned_user.pk).exists())
def test_create_text_detect_mention_ignores_casing_of_username(self):
"""
should detect mention regardless of the casing of the username
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
mentioned_user = make_user(username='Joel')
post_text = 'Hello @joel'
data = {
'text': post_text,
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
get_worker('high', worker_class=SimpleWorker).work(burst=True)
post = Post.objects.get(text=post_text, creator_id=user.pk)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertTrue(PostUserMention.objects.filter(post_id=post.pk, user_id=mentioned_user.pk).exists())
def test_create_text_post_ignores_non_existing_mentioned_usernames(self):
"""
should ignore non-existing mentioned usernames when creating a post
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
fake_username = 'nonexistinguser'
post_text = 'Hello @' + fake_username
data = {
'text': post_text
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
post = Post.objects.get(text=post_text, creator_id=user.pk)
self.assertEqual(PostUserMention.objects.filter(post_id=post.pk).count(), 0)
def test_create_text_post_creates_mention_notifications(self):
"""
should be able to create a text post with a mention notification
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
test_user = make_user()
post_text = 'Hello @' + test_user.username
data = {
'text': post_text
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
post = Post.objects.get(text=post_text, creator_id=user.pk)
post_user_mention = PostUserMention.objects.get(user_id=test_user.pk, post_id=post.pk)
self.assertEqual(PostUserMentionNotification.objects.filter(post_user_mention_id=post_user_mention.pk,
notification__owner_id=test_user.pk,
notification__notification_type=Notification.POST_USER_MENTION).count(),
1)
def test_create_text_post_does_not_detect_mention_if_encircled(self):
"""
should not detect mention if the post is encircled and the mentioned person is outside the circle
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
circle = make_circle(creator=user)
mentioned_user = make_user()
post_text = 'Hello @' + mentioned_user.username
data = {
'text': post_text,
'circle_id': circle.pk
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
post = Post.objects.get(text=post_text, creator_id=user.pk)
self.assertFalse(PostUserMention.objects.filter(post_id=post.pk, user_id=mentioned_user.pk).exists())
def test_create_text_detect_mention_if_encircled_and_part_of(self):
"""
should detect mention if the post is encircled and the mentioned person is in the circle
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
circle = make_circle(creator=user)
mentioned_user = make_user()
user.connect_with_user_with_id(user_id=mentioned_user.pk, circles_ids=[circle.pk])
mentioned_user.confirm_connection_with_user_with_id(user_id=user.pk)
post_text = 'Hello @' + mentioned_user.username
data = {
'text': post_text,
'circle_id': circle.pk
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
post = Post.objects.get(text=post_text, creator_id=user.pk)
self.assertTrue(PostUserMention.objects.filter(post_id=post.pk, user_id=mentioned_user.pk).exists())
def test_create_text_detect_mention_if_public(self):
"""
should detect mention if the post is public
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
mentioned_user = make_user()
post_text = 'Hello @' + mentioned_user.username
data = {
'text': post_text,
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
post = Post.objects.get(text=post_text, creator_id=user.pk)
self.assertTrue(PostUserMention.objects.filter(post_id=post.pk, user_id=mentioned_user.pk).exists())
def test_create_text_post_does_not_detect_creator_mention(self):
"""
should not detect mention if the mentioned person is the creator
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
circle = make_circle(creator=user)
post_text = 'Hello @' + user.username
data = {
'text': post_text,
'circle_id': circle.pk
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
post = Post.objects.get(text=post_text, creator_id=user.pk)
self.assertFalse(PostUserMention.objects.filter(post_id=post.pk, user_id=user.pk).exists())
def test_create_text_post_detects_all_urls(self):
"""
should detect different links in post text and create post links models from them
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
links = get_post_links()
post_text = " | ".join(links)
data = {
'text': post_text
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
post = Post.objects.get(text=post_text, creator_id=user.pk)
post_links = PostLink.objects.filter(post_id=post.pk)
result_links = [post_link.link for post_link in post_links]
self.assertEqual(len(result_links), len(links))
for link in links:
link = normalize_url(link)
self.assertTrue(link in result_links)
def test_create_text_post_does_not_skip_http_urls(self):
"""
should not skip http urls in post text while creating post links models from them
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
post_text = 'http://unsafe.com/'
data = {
'text': post_text
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
post = Post.objects.get(text=post_text, creator_id=user.pk)
post_links = PostLink.objects.filter(post_id=post.pk)
result_links = [post_link.link for post_link in post_links]
self.assertEqual(len(result_links), 1)
result_link = result_links[0]
self.assertEqual(result_link, post_text)
def test_create_post_is_added_to_world_circle(self):
"""
the created text post should automatically be added to the world circle
"""
user = make_user()
auth_token = user.auth_token.key
post_text = fake.text(max_nb_chars=POST_MAX_LENGTH)
headers = {'HTTP_AUTHORIZATION': 'Token %s' % auth_token}
data = {
'text': post_text
}
url = self._get_url()
self.client.put(url, data, **headers, format='multipart')
world_circle = Circle.get_world_circle()
self.assertTrue(world_circle.posts.filter(text=post_text).count() == 1)
def test_create_post_in_circle(self):
"""
should be able to create a text post in a specified circle and return 201
"""
user = make_user()
circle = mixer.blend(Circle, creator=user)
auth_token = user.auth_token.key
post_text = fake.text(max_nb_chars=POST_MAX_LENGTH)
headers = {'HTTP_AUTHORIZATION': 'Token %s' % auth_token}
data = {
'text': post_text,
'circle_id': circle.pk
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertTrue(user.posts.filter(text=post_text).count() == 1)
self.assertTrue(circle.posts.filter(text=post_text).count() == 1)
def test_cannot_create_post_in_foreign_circle(self):
"""
should NOT be able to create a text post in a foreign circle and return 400
"""
user = make_user()
foreign_user = make_user()
circle = mixer.blend(Circle, creator=foreign_user)
post_text = fake.text(max_nb_chars=POST_MAX_LENGTH)
headers = make_authentication_headers_for_user(user)
data = {
'text': post_text,
'circle_id': circle.pk
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertTrue(user.posts.filter(text=post_text).count() == 0)
self.assertTrue(circle.posts.filter(text=post_text).count() == 0)
def test_can_create_post_in_world_circle(self):
"""
should be able to create a post in the world circle and return 201
"""
user = make_user()
circle = Circle.get_world_circle()
post_text = fake.text(max_nb_chars=POST_MAX_LENGTH)
headers = make_authentication_headers_for_user(user)
data = {
'text': post_text,
'circle_id': circle.pk
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertTrue(user.posts.filter(text=post_text).exists())
self.assertTrue(circle.posts.filter(text=post_text).exists())
def test_create_image_post(self):
"""
should be able to create an image post and return 201
"""
user = make_user()
auth_token = user.auth_token.key
image = Image.new('RGB', (100, 100))
tmp_file = tempfile.NamedTemporaryFile(suffix='.jpg')
image.save(tmp_file)
tmp_file.seek(0)
headers = {'HTTP_AUTHORIZATION': 'Token %s' % auth_token}
data = {
'image': tmp_file
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
response_post = json.loads(response.content)
response_post_id = response_post.get('id')
self.assertTrue(user.posts.count() == 1)
created_post = user.posts.filter(pk=response_post_id).get()
# To be removed
self.assertTrue(hasattr(created_post, 'image'))
self.assertEqual(created_post.status, Post.STATUS_PUBLISHED)
post_media_image = created_post.media.get(post_id=response_post_id, type=PostMedia.MEDIA_TYPE_IMAGE)
post_image = post_media_image.content_object
self.assertTrue(hasattr(post_image, 'image'))
def test_create_image_and_text_post(self):
"""
should be able to create an image and text post and return 201
"""
user = make_user()
auth_token = user.auth_token.key
post_text = fake.text(max_nb_chars=POST_MAX_LENGTH)
image = Image.new('RGB', (100, 100))
tmp_file = tempfile.NamedTemporaryFile(suffix='.jpg')
image.save(tmp_file)
tmp_file.seek(0)
headers = {'HTTP_AUTHORIZATION': 'Token %s' % auth_token}
data = {
'text': post_text,
'image': tmp_file
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
response_post = json.loads(response.content)
response_post_id = response_post.get('id')
self.assertTrue(user.posts.count() == 1)
created_post = user.posts.filter(pk=response_post_id).get()
self.assertEqual(created_post.text, post_text)
# To be removed
self.assertTrue(hasattr(created_post, 'image'))
self.assertEqual(created_post.status, Post.STATUS_PUBLISHED)
post_media_image = created_post.media.get(post_id=response_post_id, type=PostMedia.MEDIA_TYPE_IMAGE)
post_image = post_media_image.content_object
self.assertTrue(hasattr(post_image, 'image'))
def test_create_image_post_creates_hash(self):
"""
creating an image post should create a hash and return 201
"""
user = make_user()
image = Image.new('RGB', (100, 100))
tmp_file = tempfile.NamedTemporaryFile(suffix='.jpg')
image.save(tmp_file)
tmp_file.seek(0)
filehash = sha256sum(filename=tmp_file.name)
headers = make_authentication_headers_for_user(user)
data = {
'image': tmp_file
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
response_post = json.loads(response.content)
response_post_id = response_post.get('id')
self.assertTrue(user.posts.count() == 1)
media = PostMedia.objects.get(post_id=response_post_id, type=PostMedia.MEDIA_TYPE_IMAGE)
self.assertEqual(media.content_object.hash, filehash)
def test_create_video_post(self):
"""
should be able to create a video post and return 201
"""
test_videos = get_test_videos()
for test_video in test_videos:
with open(test_video['path'], 'rb') as file:
user = make_user()
headers = make_authentication_headers_for_user(user)
data = {
'video': file
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
response_post = json.loads(response.content)
response_post_id = response_post.get('id')
self.assertTrue(user.posts.count() == 1)
created_post = user.posts.filter(pk=response_post_id).get()
self.assertEqual(created_post.status, Post.STATUS_PROCESSING)
get_worker('high', worker_class=SimpleWorker).work(burst=True)
created_post.refresh_from_db()
self.assertEqual(created_post.status, Post.STATUS_PUBLISHED)
post_media_video = created_post.media.get(post_id=response_post_id, type=PostMedia.MEDIA_TYPE_VIDEO)
self.assertTrue(post_media_video.content_object.format_set.exists())
def test_create_video_post_creates_hash(self):
"""
creating a video post should create a hash and return 201
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
test_video = get_test_videos()[0]
with open(test_video['path'], 'rb') as file:
filehash = sha256sum(file=file)
data = {
'video': file
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
response_post = json.loads(response.content)
response_post_id = response_post.get('id')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
media = PostMedia.objects.get(post_id=response_post_id, type=PostMedia.MEDIA_TYPE_VIDEO)
self.assertEqual(media.content_object.hash, filehash)
def test_create_video_post_creates_thumbnail(self):
"""
creating a video post should create a thumbnail and return 201
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
for test_video in get_test_videos():
with open(test_video['path'], 'rb') as file:
data = {
'video': file
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
response_post = json.loads(response.content)
response_post_id = response_post.get('id')
media = PostMedia.objects.get(post_id=response_post_id, type=PostMedia.MEDIA_TYPE_VIDEO)
self.assertIsNotNone(media.content_object.thumbnail)
self.assertIsNotNone(media.content_object.thumbnail_width)
self.assertIsNotNone(media.content_object.thumbnail_height)
def test_create_video_and_text_post(self):
"""
should be able to create a video and text post and return 201
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
test_video = get_test_videos()[0]
with open(test_video['path'], 'rb') as file:
post_text = fake.text(max_nb_chars=POST_MAX_LENGTH)
data = {
'text': post_text,
'video': file
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
response_post = json.loads(response.content)
response_post_id = response_post.get('id')
self.assertTrue(user.posts.count() == 1)
created_post = user.posts.filter(pk=response_post_id).get()
self.assertEqual(created_post.status, Post.STATUS_PROCESSING)
self.assertEqual(created_post.text, post_text)
get_worker('high', worker_class=SimpleWorker).work(burst=True)
created_post.refresh_from_db()
self.assertEqual(created_post.status, Post.STATUS_PUBLISHED)
post_media_video = created_post.media.get(post_id=response_post_id, type=PostMedia.MEDIA_TYPE_VIDEO)
self.assertTrue(post_media_video.content_object.format_set.exists())
def test_cannot_create_both_video_and_image_post(self):
"""
should not be able to create a post with both a video and an image
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
video = SimpleUploadedFile("file.mp4", b"video_file_content", content_type="video/mp4")
image = Image.new('RGB', (100, 100))
tmp_file = tempfile.NamedTemporaryFile(suffix='.jpg')
image.save(tmp_file)
tmp_file.seek(0)
data = {
'image': tmp_file,
'video': video
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_create_post_publishes_post_by_default(self):
"""
should be able to create a post and have it automatically published
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
post_text = make_fake_post_text()
data = {
'text': post_text
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
response_post = json.loads(response.content)
response_post_id = response_post.get('id')
self.assertEqual(1, user.posts.filter(pk=response_post_id, status=Post.STATUS_PUBLISHED).count())
def test_can_create_draft_post_with_no_text_image_nor_video(self):
"""
should be able to create a draft post with no text, image nor video and return 201
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
data = {
'is_draft': True
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
response_post = json.loads(response.content)
response_post_id = response_post.get('id')
self.assertEqual(1, user.posts.filter(pk=response_post_id, status=Post.STATUS_DRAFT).count())
def test_cant_create_non_draft_post_with_no_text_image_nor_video(self):
"""
should not be able to create a non-draft post with no text, image nor video and return 400
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
post_text = make_fake_post_text()
data = {
# It's the default
# 'is_draft': False
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertFalse(user.posts.filter(text=post_text).exists())
def test_create_public_draft_post(self):
"""
should be able to create a public draft post and return 201
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
post_text = make_fake_post_text()
data = {
'text': post_text,
'is_draft': True
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
response_post = json.loads(response.content)
response_post_id = response_post.get('id')
self.assertEqual(user.posts.filter(text=post_text, pk=response_post_id, status=Post.STATUS_DRAFT).count(), 1)
def test_create_draft_post_in_circle(self):
"""
should be able to create a draft post in a specified circle and return 201
"""
user = make_user()
circle = mixer.blend(Circle, creator=user)
post_text = fake.text(max_nb_chars=POST_MAX_LENGTH)
headers = make_authentication_headers_for_user(user)
data = {
'text': post_text,
'circle_id': circle.pk,
'is_draft': True
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(user.posts.filter(text=post_text, status=Post.STATUS_DRAFT, circles__id=circle.pk).count(), 1)
def test_get_all_posts(self):
"""
should be able to retrieve all posts
"""
# BEWARE The max count for the API is 20. If we are checking for more than 20
# posts, it will fail
user = make_user()
auth_token = user.auth_token.key
amount_of_own_posts = 5
user_posts_ids = []
for i in range(amount_of_own_posts):
post = user.create_public_post(text=fake.text(max_nb_chars=POST_MAX_LENGTH))
user_posts_ids.append(post.pk)
amount_of_users_to_follow = 5
lists_to_follow_in = mixer.cycle(amount_of_users_to_follow).blend(List, creator=user)
users_to_follow = mixer.cycle(amount_of_users_to_follow).blend(User)
users_to_follow_posts_ids = []
for index, user_to_follow in enumerate(users_to_follow):
user.follow_user(user_to_follow, lists_ids=[lists_to_follow_in[index].pk])
post = user_to_follow.create_public_post(text=fake.text(max_nb_chars=POST_MAX_LENGTH))
users_to_follow_posts_ids.append(post.pk)
amount_of_users_to_connect = 5
circles_to_connect_in = mixer.cycle(amount_of_users_to_connect).blend(Circle, creator=user)
users_to_connect = mixer.cycle(amount_of_users_to_connect).blend(User)
users_to_connect_posts_ids = []
for index, user_to_connect in enumerate(users_to_connect):
user.connect_with_user_with_id(user_to_connect.pk, circles_ids=[circles_to_connect_in[index].pk])
post = user_to_connect.create_public_post(text=fake.text(max_nb_chars=POST_MAX_LENGTH))
users_to_connect_posts_ids.append(post.pk)
amount_of_community_posts = 5
community_posts_ids = []
for i in range(0, amount_of_community_posts):
community_creator = make_user()
community = make_community(creator=community_creator, type='P')
user.join_community_with_name(community_name=community.name)
community_member = make_user()
community_member.join_community_with_name(community_name=community.name)
community_post = community_member.create_community_post(text=make_fake_post_text(),
community_name=community.name)
community_posts_ids.append(community_post.pk)
headers = {'HTTP_AUTHORIZATION': 'Token %s' % auth_token}
all_posts_ids = users_to_connect_posts_ids + users_to_follow_posts_ids + user_posts_ids + community_posts_ids
url = self._get_url()
response = self.client.get(url, {'count': len(all_posts_ids)}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(all_posts_ids), len(response_posts))
for response_post in response_posts:
self.assertIn(response_post.get('id'), all_posts_ids)
def test_get_all_circle_posts(self):
"""
should be able to retrieve all posts for a given circle
"""
user = make_user()
auth_token = user.auth_token.key
amount_of_users_to_follow = 5
lists_to_follow_in = mixer.cycle(amount_of_users_to_follow).blend(List, creator=user)
users_to_follow = mixer.cycle(amount_of_users_to_follow).blend(User)
for index, user_to_follow in enumerate(users_to_follow):
user.follow_user(user_to_follow, lists_ids=[lists_to_follow_in[index].pk])
user_to_follow.create_public_post(text=fake.text(max_nb_chars=POST_MAX_LENGTH))
amount_of_users_to_connect = 5
circles_to_connect_in = mixer.cycle(amount_of_users_to_connect).blend(Circle, creator=user)
users_to_connect = mixer.cycle(amount_of_users_to_connect).blend(User)
for index, user_to_connect in enumerate(users_to_connect):
user.connect_with_user_with_id(user_to_connect.pk, circles_ids=[circles_to_connect_in[index].pk])
user_to_connect.create_public_post(text=fake.text(max_nb_chars=POST_MAX_LENGTH))
number_of_circles_to_retrieve_posts_from = 3
circles_to_retrieve_posts_from = mixer.cycle(number_of_circles_to_retrieve_posts_from).blend(Circle,
creator=user)
in_circle_posts_ids = []
for index, circle_to_retrieve_posts_from in enumerate(circles_to_retrieve_posts_from):
user_in_circle = make_user()
user.connect_with_user_with_id(user_in_circle.pk, circles_ids=[circle_to_retrieve_posts_from.pk])
post_in_circle = user_in_circle.create_public_post(text=fake.text(max_nb_chars=POST_MAX_LENGTH))
in_circle_posts_ids.append(post_in_circle.pk)
number_of_expected_posts = number_of_circles_to_retrieve_posts_from
headers = {'HTTP_AUTHORIZATION': 'Token %s' % auth_token}
url = self._get_url()
circles_query_str_value = ','.join(map(str, [circle.pk for circle in circles_to_retrieve_posts_from]))
response = self.client.get(url, {'circle_id': circles_query_str_value}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(response_posts), number_of_expected_posts)
for response_post in response_posts:
self.assertIn(response_post.get('id'), in_circle_posts_ids)
def test_get_all_lists_posts(self):
"""
should be able to retrieve all posts for a given list
"""
user = make_user()
auth_token = user.auth_token.key
amount_of_users_to_follow = 5
lists_to_follow_in = mixer.cycle(amount_of_users_to_follow).blend(List, creator=user)
users_to_follow = mixer.cycle(amount_of_users_to_follow).blend(User)
for index, user_to_follow in enumerate(users_to_follow):
user.follow_user(user_to_follow, lists_ids=[lists_to_follow_in[index].pk])
user_to_follow.create_public_post(text=fake.text(max_nb_chars=POST_MAX_LENGTH))
amount_of_users_to_connect = 5
circles_to_connect_in = mixer.cycle(amount_of_users_to_connect).blend(Circle, creator=user)
users_to_connect = mixer.cycle(amount_of_users_to_connect).blend(User)
for index, user_to_connect in enumerate(users_to_connect):
user.connect_with_user_with_id(user_to_connect.pk, circles_ids=[circles_to_connect_in[index].pk])
user_to_connect.create_public_post(text=fake.text(max_nb_chars=POST_MAX_LENGTH))
number_of_lists_to_retrieve_posts_from = 3
lists_to_retrieve_posts_from = mixer.cycle(number_of_lists_to_retrieve_posts_from).blend(List,
creator=user)
in_list_posts_ids = []
for index, list_to_retrieve_posts_from in enumerate(lists_to_retrieve_posts_from):
user_in_list = make_user()
user.follow_user(user_in_list, lists_ids=[list_to_retrieve_posts_from.pk])
post_in_list = user_in_list.create_public_post(text=fake.text(max_nb_chars=POST_MAX_LENGTH))
in_list_posts_ids.append(post_in_list.pk)
number_of_expected_posts = number_of_lists_to_retrieve_posts_from
headers = {'HTTP_AUTHORIZATION': 'Token %s' % auth_token}
url = self._get_url()
lists_query_str_value = ','.join(map(str, [user_list.pk for user_list in lists_to_retrieve_posts_from]))
response = self.client.get(url, {'list_id': lists_query_str_value}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(number_of_expected_posts, len(response_posts))
for response_post in response_posts:
self.assertIn(response_post.get('id'), in_list_posts_ids)
def test_get_all_posts_with_max_id_and_count(self):
"""
should be able to retrieve all posts with a max id and count
"""
user = make_user()
auth_token = user.auth_token.key
amount_of_own_posts = 10
user_posts_ids = []
for i in range(amount_of_own_posts):
post = user.create_public_post(text=fake.text(max_nb_chars=POST_MAX_LENGTH))
user_posts_ids.append(post.pk)
amount_of_users_to_follow = 5
lists_to_follow_in = mixer.cycle(amount_of_users_to_follow).blend(List, creator=user)
users_to_follow = mixer.cycle(amount_of_users_to_follow).blend(User)
users_to_follow_posts_ids = []
for index, user_to_follow in enumerate(users_to_follow):
user.follow_user(user_to_follow, lists_ids=[lists_to_follow_in[index].pk])
post = user_to_follow.create_public_post(text=fake.text(max_nb_chars=POST_MAX_LENGTH))
users_to_follow_posts_ids.append(post.pk)
amount_of_users_to_connect = 5
circles_to_connect_in = mixer.cycle(amount_of_users_to_connect).blend(Circle, creator=user)
users_to_connect = mixer.cycle(amount_of_users_to_connect).blend(User)
users_to_connect_posts_ids = []
for index, user_to_connect in enumerate(users_to_connect):
user.connect_with_user_with_id(user_to_connect.pk, circles_ids=[circles_to_connect_in[index].pk])
post = user_to_connect.create_public_post(text=fake.text(max_nb_chars=POST_MAX_LENGTH))
users_to_connect_posts_ids.append(post.pk)
headers = {'HTTP_AUTHORIZATION': 'Token %s' % auth_token}
all_posts_ids = users_to_connect_posts_ids + users_to_follow_posts_ids + user_posts_ids
url = self._get_url()
max_id = 10
count = 3
response = self.client.get(url, {
'count': count,
'max_id': max_id
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(count, len(response_posts))
for response_post in response_posts:
response_post_id = response_post.get('id')
self.assertIn(response_post_id, all_posts_ids)
self.assertLess(response_post_id, max_id)
def test_get_all_posts_with_min_id_and_count(self):
"""
should be able to retrieve all posts with a min id and count
"""
user = make_user()
auth_token = user.auth_token.key
amount_of_own_posts = 10
user_posts_ids = []
for i in range(amount_of_own_posts):
post = user.create_public_post(text=fake.text(max_nb_chars=POST_MAX_LENGTH))
user_posts_ids.append(post.pk)
amount_of_users_to_follow = 5
lists_to_follow_in = mixer.cycle(amount_of_users_to_follow).blend(List, creator=user)
users_to_follow = mixer.cycle(amount_of_users_to_follow).blend(User)
users_to_follow_posts_ids = []
for index, user_to_follow in enumerate(users_to_follow):
user.follow_user(user_to_follow, lists_ids=[lists_to_follow_in[index].pk])
post = user_to_follow.create_public_post(text=fake.text(max_nb_chars=POST_MAX_LENGTH))
users_to_follow_posts_ids.append(post.pk)
amount_of_users_to_connect = 5
circles_to_connect_in = mixer.cycle(amount_of_users_to_connect).blend(Circle, creator=user)
users_to_connect = mixer.cycle(amount_of_users_to_connect).blend(User)
users_to_connect_posts_ids = []
for index, user_to_connect in enumerate(users_to_connect):
user.connect_with_user_with_id(user_to_connect.pk, circles_ids=[circles_to_connect_in[index].pk])
post = user_to_connect.create_public_post(text=fake.text(max_nb_chars=POST_MAX_LENGTH))
users_to_connect_posts_ids.append(post.pk)
headers = {'HTTP_AUTHORIZATION': 'Token %s' % auth_token}
all_posts_ids = users_to_connect_posts_ids + users_to_follow_posts_ids + user_posts_ids
url = self._get_url()
min_id = 10
count = 3
response = self.client.get(url, {
'count': count,
'min_id': min_id
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(count, len(response_posts))
for response_post in response_posts:
response_post_id = response_post.get('id')
self.assertIn(response_post_id, all_posts_ids)
self.assertGreater(response_post_id, min_id)
def test_get_all_public_posts_for_unconnected_user(self):
"""
should be able to retrieve all the public posts of an unconnected user
and return 200
"""
user = make_user()
amount_of_users_to_follow = random.randint(1, 5)
users_to_retrieve_posts_from = make_users(amount_of_users_to_follow)
for user_to_retrieve_posts_from in users_to_retrieve_posts_from:
post_text = make_fake_post_text()
user_to_retrieve_posts_from.create_public_post(text=post_text)
user_to_retrieve_posts_from = random.choice(users_to_retrieve_posts_from)
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, {
'username': user_to_retrieve_posts_from.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(response_posts), 1)
post = response_posts[0]
self.assertEqual(post['creator']['id'], user_to_retrieve_posts_from.pk)
def test_get_all_public_posts_for_connected_user(self):
"""
should be able to retrieve all the posts of a connected user
and return 200
"""
user = make_user()
user_to_connect_with = make_user()
user.connect_with_user_with_id(user_to_connect_with.pk)
user_to_connect_with.confirm_connection_with_user_with_id(user.pk)
amount_of_public_posts = random.randint(1, 5)
amount_of_encircled_posts = random.randint(1, 5)
created_posts_ids = []
for i in range(amount_of_public_posts):
post = user_to_connect_with.create_public_post(make_fake_post_text())
created_posts_ids.append(post.pk)
circle = make_circle(creator=user_to_connect_with)
user_to_connect_with.update_connection_with_user_with_id(user_id=user.pk, circles_ids=[circle.pk])
for i in range(amount_of_encircled_posts):
post = user_to_connect_with.create_encircled_post(text=make_fake_post_text(), circles_ids=[circle.pk])
created_posts_ids.append(post.pk)
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, {
'username': user_to_connect_with.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(created_posts_ids), len(response_posts))
response_posts_ids = [post['id'] for post in response_posts]
for post_id in created_posts_ids:
self.assertIn(post_id, response_posts_ids)
def test_get_all_public_posts_for_public_user_unauthenticated(self):
"""
should be able to retrieve all the public posts of a specific public visibility user
being unauthenticated and return 200
"""
user = make_user(visibility=User.VISIBILITY_TYPE_PUBLIC)
amount_of_user_public_posts = random.randint(1, 5)
amount_of_user_encircled_posts = random.randint(1, 5)
public_posts_ids = []
for i in range(amount_of_user_public_posts):
post_text = make_fake_post_text()
public_post = user.create_public_post(text=post_text)
public_posts_ids.append(public_post.pk)
for i in range(amount_of_user_encircled_posts):
post_text = make_fake_post_text()
circle = make_circle(creator=user)
user.create_encircled_post(text=post_text, circles_ids=[circle.pk])
url = self._get_url()
response = self.client.get(url, {
'username': user.username
})
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(response_posts), len(public_posts_ids))
response_posts_ids = [response_post['id'] for response_post in response_posts]
for public_post_id in public_posts_ids:
self.assertIn(public_post_id, response_posts_ids)
def test_cant_get_public_posts_for_private_user_unauthenticated(self):
"""
should not be able to retrieve the public posts of a specific private visibility user
being unauthenticated and return 400
"""
user = make_user(visibility=User.VISIBILITY_TYPE_PRIVATE)
amount_of_user_public_posts = random.randint(1, 5)
public_posts_ids = []
for i in range(amount_of_user_public_posts):
post_text = make_fake_post_text()
public_post = user.create_public_post(text=post_text)
public_posts_ids.append(public_post.pk)
url = self._get_url()
response = self.client.get(url, {
'username': user.username
})
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_can_get_public_posts_for_okuna_visibility_user_authenticated(self):
"""
should be able to retrieve the public posts of a specific okuna visibility user
being authenticated and return 200
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
user_to_retrieve_posts_from = make_user(visibility=User.VISIBILITY_TYPE_OKUNA)
amount_of_user_public_posts = random.randint(1, 5)
public_posts_ids = []
for i in range(amount_of_user_public_posts):
post_text = make_fake_post_text()
public_post = user_to_retrieve_posts_from.create_public_post(text=post_text)
public_posts_ids.append(public_post.pk)
url = self._get_url()
response = self.client.get(url, {
'username': user_to_retrieve_posts_from.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(response_posts), len(public_posts_ids))
response_posts_ids = [response_post['id'] for response_post in response_posts]
for public_post_id in public_posts_ids:
self.assertIn(public_post_id, response_posts_ids)
def test_cant_get_public_posts_for_okuna_visibility_user_unauthenticated(self):
"""
should not be able to retrieve the public posts of a specific okuna visibility user
being unauthenticated and return 400
"""
user = make_user(visibility=User.VISIBILITY_TYPE_OKUNA)
amount_of_user_public_posts = random.randint(1, 5)
public_posts_ids = []
for i in range(amount_of_user_public_posts):
post_text = make_fake_post_text()
public_post = user.create_public_post(text=post_text)
public_posts_ids.append(public_post.pk)
url = self._get_url()
response = self.client.get(url, {
'username': user.username
})
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_can_get_public_posts_for_public_visibility_user_authenticated(self):
"""
should be able to retrieve the public posts of a specific public visibility user
being authenticated and return 200
"""
user = make_user()
headers = make_authentication_headers_for_user(user=user)
user_to_retrieve_posts_from = make_user(visibility=User.VISIBILITY_TYPE_PUBLIC)
amount_of_user_public_posts = random.randint(1, 5)
public_posts_ids = []
for i in range(amount_of_user_public_posts):
post_text = make_fake_post_text()
public_post = user_to_retrieve_posts_from.create_public_post(text=post_text)
public_posts_ids.append(public_post.pk)
url = self._get_url()
response = self.client.get(url, {
'username': user_to_retrieve_posts_from.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(response_posts), len(public_posts_ids))
response_posts_ids = [response_post['id'] for response_post in response_posts]
for public_post_id in public_posts_ids:
self.assertIn(public_post_id, response_posts_ids)
def test_get_all_own_posts(self):
"""
should be able to retrieve all own posts
and return 200
"""
user = make_user()
amount_of_public_posts = random.randint(1, 5)
amount_of_encircled_posts = random.randint(1, 5)
created_posts_ids = []
for i in range(amount_of_public_posts):
post = user.create_public_post(make_fake_post_text())
created_posts_ids.append(post.pk)
circle = make_circle(creator=user)
for i in range(amount_of_encircled_posts):
post = user.create_encircled_post(text=make_fake_post_text(), circles_ids=[circle.pk])
created_posts_ids.append(post.pk)
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, {
'username': user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(response_posts), len(created_posts_ids))
response_posts_ids = [post['id'] for post in response_posts]
for post_id in created_posts_ids:
self.assertIn(post_id, response_posts_ids)
def test_filter_public_community_post_from_own_posts_when_not_community_posts_visible(self):
"""
should filter public community posts when community_posts_visible is false and return 200
"""
user = make_user()
user.update(community_posts_visible=False)
community_owner = make_user()
community = make_community(creator=community_owner, type=Community.COMMUNITY_TYPE_PUBLIC)
community_owner.invite_user_with_username_to_community_with_name(username=user.username,
community_name=community.name)
user.join_community_with_name(community_name=community.name)
amount_of_community_posts = random.randint(1, 5)
created_posts_ids = []
for i in range(amount_of_community_posts):
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
created_posts_ids.append(post.pk)
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, {
'username': user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(response_posts), 0)
def test_filter_private_community_post_from_own_posts_when_not_community_posts_visible(self):
"""
should filter private community posts when community_posts_visible is false and return 200
"""
user = make_user()
user.update(community_posts_visible=False)
community_owner = make_user()
community = make_community(creator=community_owner, type=Community.COMMUNITY_TYPE_PRIVATE)
community_owner.invite_user_with_username_to_community_with_name(username=user.username,
community_name=community.name)
user.join_community_with_name(community_name=community.name)
amount_of_community_posts = random.randint(1, 5)
created_posts_ids = []
for i in range(amount_of_community_posts):
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
created_posts_ids.append(post.pk)
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, {
'username': user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(response_posts), 0)
def test_retrieve_own_public_community_post_from_own_posts_when_community_posts_visible(self):
"""
should retrieve our own public community posts when community_posts_visible is true and return 200
"""
user = make_user()
user.update(community_posts_visible=True)
community_owner = make_user()
community = make_community(creator=community_owner, type=Community.COMMUNITY_TYPE_PUBLIC)
community_owner.invite_user_with_username_to_community_with_name(username=user.username,
community_name=community.name)
user.join_community_with_name(community_name=community.name)
amount_of_community_posts = random.randint(1, 5)
created_posts_ids = []
for i in range(amount_of_community_posts):
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
created_posts_ids.append(post.pk)
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, {
'username': user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
response_posts_ids = [post['id'] for post in response_posts]
for post_id in created_posts_ids:
self.assertIn(post_id, response_posts_ids)
def test_retrieve_private_community_post_from_own_posts_when_community_posts_visible(self):
"""
should retrieve the private community posts when community_posts_visible is true and return 200
"""
user = make_user()
user.update(community_posts_visible=True)
community_owner = make_user()
community = make_community(creator=community_owner, type=Community.COMMUNITY_TYPE_PRIVATE)
community_owner.invite_user_with_username_to_community_with_name(username=user.username,
community_name=community.name)
user.join_community_with_name(community_name=community.name)
amount_of_community_posts = random.randint(1, 5)
created_posts_ids = []
for i in range(amount_of_community_posts):
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
created_posts_ids.append(post.pk)
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, {
'username': user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
response_posts_ids = [post['id'] for post in response_posts]
for post_id in created_posts_ids:
self.assertIn(post_id, response_posts_ids)
def test_filter_public_community_post_from_foreign_user_posts_when_not_community_posts_visible(self):
"""
should filter public community posts from foreign user when community_posts_visible is false and return 200
"""
user = make_user()
foreign_user = make_user()
foreign_user.update(community_posts_visible=False)
community_owner = make_user()
community = make_community(creator=community_owner, type=Community.COMMUNITY_TYPE_PUBLIC)
foreign_user.join_community_with_name(community_name=community.name)
amount_of_community_posts = random.randint(1, 5)
created_posts_ids = []
for i in range(amount_of_community_posts):
post = foreign_user.create_community_post(community_name=community.name, text=make_fake_post_text())
created_posts_ids.append(post.pk)
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, {
'username': foreign_user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(response_posts), 0)
def test_filter_private_community_post_from_foreign_user_posts_when_not_community_posts_visible(self):
"""
should filter private community posts from foreign user when community_posts_visible is false and return 200
"""
user = make_user()
foreign_user = make_user()
foreign_user.update(community_posts_visible=False)
community_owner = make_user()
community = make_community(creator=community_owner, type=Community.COMMUNITY_TYPE_PRIVATE)
community_owner.invite_user_with_username_to_community_with_name(username=foreign_user.username,
community_name=community.name)
foreign_user.join_community_with_name(community_name=community.name)
amount_of_community_posts = random.randint(1, 5)
created_posts_ids = []
for i in range(amount_of_community_posts):
post = foreign_user.create_community_post(community_name=community.name, text=make_fake_post_text())
created_posts_ids.append(post.pk)
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, {
'username': foreign_user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(response_posts), 0)
def test_filter_private_joined_community_post_from_foreign_user_posts_when_not_community_posts_visible(self):
"""
should filter private joined community posts from foreign user when community_posts_visible is false and return 200
"""
user = make_user()
foreign_user = make_user()
foreign_user.update(community_posts_visible=False)
community_owner = make_user()
community = make_community(creator=community_owner, type=Community.COMMUNITY_TYPE_PRIVATE)
community_owner.invite_user_with_username_to_community_with_name(username=foreign_user.username,
community_name=community.name)
foreign_user.join_community_with_name(community_name=community.name)
community_owner.invite_user_with_username_to_community_with_name(username=user.username,
community_name=community.name)
user.join_community_with_name(community_name=community.name)
amount_of_community_posts = random.randint(1, 5)
created_posts_ids = []
for i in range(amount_of_community_posts):
post = foreign_user.create_community_post(community_name=community.name, text=make_fake_post_text())
created_posts_ids.append(post.pk)
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, {
'username': foreign_user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(response_posts), 0)
def test_retrieve_public_community_post_from_foreign_user_posts_when_community_posts_visible(self):
"""
should retrieve public community posts from foreign user when community_posts_visible is true and return 200
"""
user = make_user()
foreign_user = make_user()
foreign_user.update(community_posts_visible=True)
community_owner = make_user()
community = make_community(creator=community_owner, type=Community.COMMUNITY_TYPE_PUBLIC)
foreign_user.join_community_with_name(community_name=community.name)
amount_of_community_posts = random.randint(1, 5)
created_posts_ids = []
for i in range(amount_of_community_posts):
post = foreign_user.create_community_post(community_name=community.name, text=make_fake_post_text())
created_posts_ids.append(post.pk)
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, {
'username': foreign_user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
response_posts_ids = [post['id'] for post in response_posts]
for post_id in created_posts_ids:
self.assertIn(post_id, response_posts_ids)
def test_retrieve_joined_private_community_post_from_foreign_user_posts_when_community_posts_visible(self):
"""
should retrieve joined private community posts from foreign user when community_posts_visible is true and return 200
"""
user = make_user()
foreign_user = make_user()
foreign_user.update(community_posts_visible=True)
community_owner = make_user()
community = make_community(creator=community_owner, type=Community.COMMUNITY_TYPE_PRIVATE)
community_owner.invite_user_with_username_to_community_with_name(username=foreign_user.username,
community_name=community.name)
foreign_user.join_community_with_name(community_name=community.name)
community_owner.invite_user_with_username_to_community_with_name(username=user.username,
community_name=community.name)
user.join_community_with_name(community_name=community.name)
amount_of_community_posts = random.randint(1, 5)
created_posts_ids = []
for i in range(amount_of_community_posts):
post = foreign_user.create_community_post(community_name=community.name, text=make_fake_post_text())
created_posts_ids.append(post.pk)
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, {
'username': foreign_user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
response_posts_ids = [post['id'] for post in response_posts]
for post_id in created_posts_ids:
self.assertIn(post_id, response_posts_ids)
def test_filter_private_community_post_from_foreign_user_posts_when_community_posts_visible(self):
"""
should filter private community posts from foreign user when community_posts_visible is true and return 200
"""
user = make_user()
foreign_user = make_user()
foreign_user.update(community_posts_visible=True)
community_owner = make_user()
community = make_community(creator=community_owner, type=Community.COMMUNITY_TYPE_PRIVATE)
community_owner.invite_user_with_username_to_community_with_name(username=foreign_user.username,
community_name=community.name)
foreign_user.join_community_with_name(community_name=community.name)
amount_of_community_posts = random.randint(1, 5)
created_posts_ids = []
for i in range(amount_of_community_posts):
post = foreign_user.create_community_post(community_name=community.name, text=make_fake_post_text())
created_posts_ids.append(post.pk)
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, {
'username': foreign_user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(response_posts), 0)
def test_filter_excluded_public_community_post_from_foreign_user_posts(self):
"""
should filter posts of an excluded public community from foreign user and return 200
"""
user = make_user()
foreign_user = make_user()
foreign_user.update(community_posts_visible=True)
community_owner = make_user()
community = make_community(creator=community_owner, type=Community.COMMUNITY_TYPE_PUBLIC)
foreign_user.join_community_with_name(community_name=community.name)
foreign_user.create_community_post(community_name=community.name, text=make_fake_post_text())
foreign_user.exclude_community_from_profile_posts(community=community)
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, {
'username': foreign_user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(response_posts), 0)
def test_filter_excluded_private_community_member_of_post_from_foreign_user_posts(self):
"""
should filter posts of an excluded private community, of which the user is a member, from foreign user and return 200
"""
user = make_user()
foreign_user = make_user()
foreign_user.update(community_posts_visible=True)
community_owner = make_user()
community = make_community(creator=community_owner, type=Community.COMMUNITY_TYPE_PRIVATE)
community_owner.invite_user_with_username_to_community_with_name(username=foreign_user.username,
community_name=community.name)
foreign_user.join_community_with_name(community_name=community.name)
community_owner.invite_user_with_username_to_community_with_name(username=user.username,
community_name=community.name)
user.join_community_with_name(community_name=community.name)
foreign_user.create_community_post(community_name=community.name, text=make_fake_post_text())
foreign_user.exclude_community_from_profile_posts(community=community)
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, {
'username': foreign_user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(response_posts), 0)
def test_get_all_public_posts_for_public_visibility_user_unauthenticated_with_max_id_and_count(self):
"""
should be able to retrieve all the public posts of a specific public visibility user
using max_id and count being unauthenticated and return 200
"""
user = make_user(visibility=User.VISIBILITY_TYPE_PUBLIC)
amount_of_user_public_posts = 10
public_posts_ids = []
for i in range(amount_of_user_public_posts):
post_text = make_fake_post_text()
public_post = user.create_public_post(text=post_text)
public_posts_ids.append(public_post.pk)
url = self._get_url()
count = 5
max_id = 6
response = self.client.get(url, {
'username': user.username,
'count': count,
'max_id': max_id
})
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(response_posts), count)
response_posts_ids = [response_post['id'] for response_post in response_posts]
for response_post_id in response_posts_ids:
self.assertLess(response_post_id, max_id)
def test_retrieves_no_posts_when_filtering_on_empty_circle(self):
"""
should retrieve no posts when filtering on an empty circle
"""
user = make_user()
connections_circle_id = user.connections_circle_id
headers = make_authentication_headers_for_user(user)
amount_of_foreign_public_posts = 10
public_posts_ids = []
for i in range(amount_of_foreign_public_posts):
post_text = make_fake_post_text()
foreign_user = make_user()
public_post = foreign_user.create_public_post(text=post_text)
public_posts_ids.append(public_post.pk)
url = self._get_url()
response = self.client.get(url, {'circle_id': connections_circle_id}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(response_posts), 0)
def test_retrieves_own_posts_of_own_filtered_circle(self):
"""
should retrieve own posts when filtering on a circle of our own
"""
user = make_user()
circle = make_circle(creator=user)
circle_id = circle.pk
headers = make_authentication_headers_for_user(user)
amount_of_posts = 10
posts_ids = []
for i in range(amount_of_posts):
post_text = make_fake_post_text()
post = user.create_encircled_post(text=post_text, circles_ids=[circle_id])
posts_ids.append(post.pk)
url = self._get_url()
response = self.client.get(url, {'circle_id': circle_id}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(posts_ids), len(response_posts))
response_posts_ids = [post['id'] for post in response_posts]
for post_id in posts_ids:
self.assertIn(post_id, response_posts_ids)
def test_can_retrieve_encircled_posts_of_confirmed_connection(self):
"""
should be able to retrieve the encircled posts of a confirmed connection
"""
user = make_user()
user_to_connect_to = make_user()
user.connect_with_user_with_id(user_id=user_to_connect_to.pk)
user_to_connect_to_circle = make_circle(creator=user_to_connect_to)
user_to_connect_to.confirm_connection_with_user_with_id(user_id=user.pk,
circles_ids=[user_to_connect_to_circle.pk])
post = user_to_connect_to.create_encircled_post(text=make_fake_post_text(),
circles_ids=[user_to_connect_to_circle.pk])
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
response_post = response_posts[0]
self.assertEqual(response_post['id'], post.pk)
def test_cant_retrieve_encircled_posts_of_unconfirmed_connection(self):
"""
should not be able to retrieve encircled posts of an unconfirmed connection
"""
user = make_user()
user_to_connect_to = make_user()
user.connect_with_user_with_id(user_id=user_to_connect_to.pk)
user_to_connect_to_circle = make_circle(creator=user_to_connect_to)
user_to_connect_to.create_encircled_post(text=make_fake_post_text(),
circles_ids=[user_to_connect_to_circle.pk])
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_can_retrieve_posts_with_recent_unconfirmed_connection_encircled_post(self):
"""
should be able to retrieve the timeline posts when an unconfirmed connection has recent posts
https://github.com/OpenbookOrg/openbook-api/issues/301
"""
user = make_user()
user_timeline_posts_amount = 5
posts_ids = []
for i in range(0, user_timeline_posts_amount):
foreign_user = make_user()
foreign_post = foreign_user.create_encircled_post(text=make_fake_post_text(),
circles_ids=[foreign_user.connections_circle_id])
posts_ids.append(foreign_post.pk)
user.connect_with_user_with_id(user_id=foreign_user.pk)
foreign_user.confirm_connection_with_user_with_id(user_id=user.pk)
connection_requester_user = make_user()
connection_requester_user.connect_with_user_with_id(user_id=user.pk, circles_ids=[
connection_requester_user.connections_circle.pk])
user.confirm_connection_with_user_with_id(user_id=connection_requester_user.pk)
connection_requester_user.disconnect_from_user_with_id(user_id=user.pk)
connection_requester_user.connect_with_user_with_id(user_id=user.pk, circles_ids=[
connection_requester_user.connections_circle.pk])
connection_requester_user_circle = make_circle(creator=connection_requester_user)
connection_requester_user.update_connection_with_user_with_id(user_id=user.pk,
circles_ids=[connection_requester_user_circle.pk])
connection_requester_user.create_encircled_post(text=make_fake_post_text(),
circles_ids=[connection_requester_user_circle.pk])
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(posts_ids), len(response_posts))
response_posts_ids = [post['id'] for post in response_posts]
for post_id in posts_ids:
self.assertIn(post_id, response_posts_ids)
def test_can_retrieve_own_community_post(self):
"""
should be able to retrieve an own post posted to a community
"""
user = make_user()
community_creator = make_user()
community = make_community(creator=community_creator)
user.join_community_with_name(community_name=community.name)
post_text = make_fake_post_text()
user.create_community_post(community_name=community.name, text=post_text)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
response_post = response_posts[0]
response_post_id = response_post['id']
retrieved_post = Post.objects.get(pk=response_post_id)
self.assertEqual(retrieved_post.community_id, community.pk)
self.assertEqual(retrieved_post.text, post_text)
def test_does_not_retrieve_duplicate_connections_posts_when_multiple_circles(self):
"""
should not retrieve duplicate connections posts when posted to multiple circles
"""
user = make_user()
user_to_connect_to = make_user()
circle = make_circle(creator=user)
user.connect_with_user_with_id(user_id=user_to_connect_to.pk, circles_ids=[circle.pk])
user_to_connect_to.confirm_connection_with_user_with_id(user_id=user.pk)
post = user.create_encircled_post(text=make_fake_post_text(),
circles_ids=[circle.pk, user.connections_circle_id])
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
response_post = response_posts[0]
self.assertEqual(response_post['id'], post.pk)
def test_cant_retrieve_moderated_approved_community_posts(self):
"""
should not be able to retrieve moderated approved community posts
"""
user = make_user()
community_creator = make_user()
community = make_community(creator=community_creator)
post_creator = make_user()
user.join_community_with_name(community_name=community.name)
post_creator.join_community_with_name(community_name=community.name)
number_of_posts = 5
post_reporter = make_user()
report_category = make_moderation_category()
for i in range(0, number_of_posts):
post = post_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
post_reporter.report_post(post=post, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
community_creator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_can_retrieve_moderated_rejected_community_posts(self):
"""
should be able to retrieve moderated rejected community posts
"""
user = make_user()
community_creator = make_user()
community = make_community(creator=community_creator)
post_creator = make_user()
user.join_community_with_name(community_name=community.name)
post_creator.join_community_with_name(community_name=community.name)
number_of_posts = 5
post_reporter = make_user()
report_category = make_moderation_category()
posts_ids = []
for i in range(0, number_of_posts):
post = post_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
posts_ids.append(post.pk)
post_reporter.report_post(post=post, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
community_creator.reject_moderated_object(moderated_object=moderated_object)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(posts_ids), len(response_posts))
response_posts_ids = [post['id'] for post in response_posts]
for post_id in posts_ids:
self.assertIn(post_id, response_posts_ids)
def test_can_retrieve_moderated_pending_community_posts(self):
"""
should be able to retrieve moderated pending community posts
"""
user = make_user()
community_creator = make_user()
community = make_community(creator=community_creator)
post_creator = make_user()
user.join_community_with_name(community_name=community.name)
post_creator.join_community_with_name(community_name=community.name)
number_of_posts = 5
post_reporter = make_user()
report_category = make_moderation_category()
posts_ids = []
for i in range(0, number_of_posts):
post = post_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
posts_ids.append(post.pk)
post_reporter.report_post(post=post, category_id=report_category.pk)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(len(posts_ids), len(response_posts))
response_posts_ids = [post['id'] for post in response_posts]
for post_id in posts_ids:
self.assertIn(post_id, response_posts_ids)
def test_cant_retrieve_soft_deleted_community_posts(self):
"""
should not be able to retrieve soft deleted community posts
"""
user = make_user()
community_creator = make_user()
community = make_community(creator=community_creator)
post_creator = make_user()
user.join_community_with_name(community_name=community.name)
post_creator.join_community_with_name(community_name=community.name)
number_of_posts = 5
for i in range(0, number_of_posts):
post = post_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
post.soft_delete()
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_soft_deleted_following_user_posts(self):
"""
should not be able to retrieve soft deleted following user posts
"""
user = make_user()
following_user = make_user()
user.follow_user_with_id(user_id=following_user.pk)
following_user_post = following_user.create_public_post(text=make_fake_post_text())
following_user_post.soft_delete()
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_soft_deleted_following_user_posts_when_filtering(self):
"""
should not be able to retrieve soft deleted following user posts when filtering
"""
user = make_user()
following_user = make_user()
follow_list = make_list(creator=user)
user.follow_user_with_id(user_id=following_user.pk, lists_ids=[follow_list.pk])
following_user_post = following_user.create_public_post(text=make_fake_post_text())
following_user_post.soft_delete()
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'list_id': follow_list.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_soft_deleted_connected_user_posts(self):
"""
should not be able to retrieve soft deleted connected user posts
"""
user = make_user()
connected_user = make_user()
user.connect_with_user_with_id(user_id=connected_user.pk)
connected_user_post_circle = make_circle(creator=connected_user)
connected_user.confirm_connection_with_user_with_id(user_id=user.pk,
circles_ids=[connected_user_post_circle.pk])
connected_user_post = connected_user.create_encircled_post(text=make_fake_post_text(),
circles_ids=[connected_user_post_circle.pk])
connected_user_post.soft_delete()
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_soft_deleted_connected_user_posts_when_filtering(self):
"""
should not be able to retrieve soft deleted connected user posts when filtering
"""
user = make_user()
connected_user = make_user()
user.connect_with_user_with_id(user_id=connected_user.pk)
connected_user_post_circle = make_circle(creator=connected_user)
connected_user.confirm_connection_with_user_with_id(user_id=user.pk,
circles_ids=[connected_user_post_circle.pk])
connected_user_post = connected_user.create_encircled_post(text=make_fake_post_text(),
circles_ids=[connected_user_post_circle.pk])
connected_user_post.soft_delete()
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'circle_id': connected_user_post_circle.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_reported_connected_user_posts(self):
"""
should not be able to retrieve reported connected user posts
"""
user = make_user()
connected_user = make_user()
user.connect_with_user_with_id(user_id=connected_user.pk)
connected_user_post_circle = make_circle(creator=connected_user)
connected_user.confirm_connection_with_user_with_id(user_id=user.pk,
circles_ids=[connected_user_post_circle.pk])
connected_user_post = connected_user.create_encircled_post(text=make_fake_post_text(),
circles_ids=[connected_user_post_circle.pk])
user.report_post(post=connected_user_post, category_id=make_moderation_category().pk)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_reported_connected_user_posts_when_filtering(self):
"""
should not be able to retrieve reported connected user posts when filtering
"""
user = make_user()
connected_user = make_user()
user.connect_with_user_with_id(user_id=connected_user.pk)
connected_user_post_circle = make_circle(creator=connected_user)
connected_user.confirm_connection_with_user_with_id(user_id=user.pk,
circles_ids=[connected_user_post_circle.pk])
connected_user_post = connected_user.create_encircled_post(text=make_fake_post_text(),
circles_ids=[connected_user_post_circle.pk])
user.report_post(post=connected_user_post, category_id=make_moderation_category().pk)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'circle_id': connected_user_post_circle.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_reported_following_user_posts(self):
"""
should not be able to retrieve reported following user posts
"""
user = make_user()
following_user = make_user()
user.follow_user_with_id(user_id=following_user.pk)
following_user_post = following_user.create_public_post(text=make_fake_post_text())
user.report_post(post=following_user_post, category_id=make_moderation_category().pk)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_reported_following_user_posts_when_filtering(self):
"""
should not be able to retrieve reported following user posts when filtering
"""
user = make_user()
following_user = make_user()
follow_list = make_list(creator=user)
user.follow_user_with_id(user_id=following_user.pk, lists_ids=[follow_list.pk])
following_user_post = following_user.create_public_post(text=make_fake_post_text())
user.report_post(post=following_user_post, category_id=make_moderation_category().pk)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'list_id': follow_list.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_reported_community_posts(self):
"""
should not be able to retrieve reported community posts
"""
user = make_user()
community = make_community()
post_creator = make_user()
user.join_community_with_name(community_name=community.name)
post_creator.join_community_with_name(community_name=community.name)
number_of_posts = 5
report_category = make_moderation_category()
for i in range(0, number_of_posts):
post = post_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
user.report_post(post=post, category_id=report_category.pk)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_reported_community_posts_by_username(self):
"""
should not be able to retrieve reported community posts by username
"""
user = make_user()
community = make_community()
post_creator = make_user()
user.join_community_with_name(community_name=community.name)
post_creator.join_community_with_name(community_name=community.name)
report_category = make_moderation_category()
post = post_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
user.report_post(post=post, category_id=report_category.pk)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'username': post_creator.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_reported_and_approved_community_posts_by_username(self):
"""
should not be able to retrieve reported and approved community posts by username
"""
user = make_user()
community_creator = make_user()
community = make_community(creator=community_creator)
post_creator = make_user()
user.join_community_with_name(community_name=community.name)
post_creator.join_community_with_name(community_name=community.name)
post_reporter = make_user()
report_category = make_moderation_category()
post = post_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
post_reporter.report_post(post=post, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
community_creator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'username': post_creator.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_soft_deleted_community_posts_by_username(self):
"""
should not be able to retrieve soft deleted community posts by username
"""
user = make_user()
community_creator = make_user()
community = make_community(creator=community_creator)
post_creator = make_user()
user.join_community_with_name(community_name=community.name)
post_creator.join_community_with_name(community_name=community.name)
post = post_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
post.soft_delete()
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'username': post_creator.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_closed_community_posts_by_username(self):
"""
should not be able to retrieve closed community posts by username
"""
user = make_user()
community_creator = make_user()
community = make_community(creator=community_creator)
post_creator = make_user()
user.join_community_with_name(community_name=community.name)
post_creator.join_community_with_name(community_name=community.name)
post = post_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
community_creator.close_post(post=post)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'username': post_creator.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_draft_public_community_posts(self):
"""
should not be able to retrieve draft public community posts
"""
user = make_user()
community_creator = make_user()
community = make_community(creator=community_creator)
post_creator = make_user()
user.join_community_with_name(community_name=community.name)
post_creator.join_community_with_name(community_name=community.name)
number_of_posts = 5
for i in range(0, number_of_posts):
post_creator.create_community_post(community_name=community.name, text=make_fake_post_text(),
is_draft=True)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_draft_private_community_part_of_posts(self):
"""
should not be able to retrieve draft posts of a private community the user is part of
"""
user = make_user()
community_creator = make_user()
community = make_community(creator=community_creator, type=Community.COMMUNITY_TYPE_PRIVATE)
post_creator = make_user()
community_creator.invite_user_with_username_to_community_with_name(username=user.username,
community_name=community.name)
community_creator.invite_user_with_username_to_community_with_name(username=post_creator.username,
community_name=community.name)
user.join_community_with_name(community_name=community.name)
post_creator.join_community_with_name(community_name=community.name)
number_of_posts = 5
for i in range(0, number_of_posts):
post_creator.create_community_post(community_name=community.name, text=make_fake_post_text(),
is_draft=True)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_following_user_draft_posts(self):
"""
should not be able to retrieve following user draft posts
"""
user = make_user()
following_user = make_user()
user.follow_user_with_id(user_id=following_user.pk)
following_user.create_public_post(text=make_fake_post_text(), is_draft=True)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_following_user_draft_posts_when_filtering(self):
"""
should not be able to retrieve following user draft posts when filtering
"""
user = make_user()
following_user = make_user()
follow_list = make_list(creator=user)
user.follow_user_with_id(user_id=following_user.pk, lists_ids=[follow_list.pk])
following_user.create_public_post(text=make_fake_post_text(), is_draft=True)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'list_id': follow_list.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_connected_user_draft_posts(self):
"""
should not be able to retrieve connected user draft posts
"""
user = make_user()
connected_user = make_user()
user.connect_with_user_with_id(user_id=connected_user.pk)
connected_user_post_circle = make_circle(creator=connected_user)
connected_user.confirm_connection_with_user_with_id(user_id=user.pk,
circles_ids=[connected_user_post_circle.pk])
connected_user.create_encircled_post(text=make_fake_post_text(),
circles_ids=[connected_user_post_circle.pk],
is_draft=True)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_connected_user_draft_posts_when_filtering(self):
"""
should not be able to retrieve connected user draft posts when filtering
"""
user = make_user()
connected_user = make_user()
user.connect_with_user_with_id(user_id=connected_user.pk)
connected_user_post_circle = make_circle(creator=connected_user)
connected_user.confirm_connection_with_user_with_id(user_id=user.pk,
circles_ids=[connected_user_post_circle.pk])
connected_user.create_encircled_post(text=make_fake_post_text(),
circles_ids=[connected_user_post_circle.pk], is_draft=True)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'circle_id': connected_user_post_circle.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_public_community_draft_posts(self):
"""
should not be able to retrieve public community draft posts
"""
user = make_user()
community = make_community()
community_member = make_user()
user.join_community_with_name(community_name=community.name)
community_member.join_community_with_name(community_name=community.name)
community_member.create_community_post(text=make_fake_post_text(), is_draft=True, community_name=community.name)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_private_community_part_of_draft_posts(self):
"""
should not be able to retrieve draft posts of a private community the user is part of
"""
user = make_user()
community_creator = make_user()
community = make_community(type=Community.COMMUNITY_TYPE_PRIVATE, creator=community_creator)
community_member = make_user()
community_creator.invite_user_with_username_to_community_with_name(username=community_member.username,
community_name=community.name)
community_creator.invite_user_with_username_to_community_with_name(username=user.username,
community_name=community.name)
user.join_community_with_name(community_name=community.name)
community_member.join_community_with_name(community_name=community.name)
community_member.create_community_post(text=make_fake_post_text(), is_draft=True, community_name=community.name)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_public_community_processing_posts(self):
"""
should not be able to retrieve public community processing posts
"""
user = make_user()
community = make_community()
community_member = make_user()
user.join_community_with_name(community_name=community.name)
community_member.join_community_with_name(community_name=community.name)
test_image = get_test_image()
with open(test_image['path'], 'rb') as file:
file = File(file)
community_member.create_community_post(text=make_fake_post_text(), image=file,
community_name=community.name)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_private_community_part_of_processing_posts(self):
"""
should not be able to retrieve processing posts of a private community the user is part of
"""
user = make_user()
community_creator = make_user()
community = make_community(type=Community.COMMUNITY_TYPE_PRIVATE, creator=community_creator)
community_member = make_user()
community_creator.invite_user_with_username_to_community_with_name(username=community_member.username,
community_name=community.name)
community_creator.invite_user_with_username_to_community_with_name(username=user.username,
community_name=community.name)
user.join_community_with_name(community_name=community.name)
community_member.join_community_with_name(community_name=community.name)
test_image = get_test_image()
with open(test_image['path'], 'rb') as file:
file = File(file)
community_member.create_community_post(text=make_fake_post_text(), image=file,
community_name=community.name)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_connected_user_draft_posts_by_username(self):
"""
should not be able to retrieve connected user draft posts by username
"""
user = make_user()
connected_user = make_user()
user.connect_with_user_with_id(user_id=connected_user.pk)
connected_user_post_circle = make_circle(creator=connected_user)
connected_user.confirm_connection_with_user_with_id(user_id=user.pk,
circles_ids=[connected_user_post_circle.pk])
connected_user.create_encircled_post(text=make_fake_post_text(),
circles_ids=[connected_user_post_circle.pk], is_draft=True)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'username': connected_user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_following_user_draft_posts_by_username(self):
"""
should not be able to retrieve following user draft posts by username
"""
user = make_user()
following_user = make_user()
follow_list = make_list(creator=user)
user.follow_user_with_id(user_id=following_user.pk, lists_ids=[follow_list.pk])
following_user.create_public_post(text=make_fake_post_text(), is_draft=True)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'username': following_user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_own_reported_and_approved_community_posts_by_username(self):
"""
should not be able to retrieve own reported and approved community posts by username
"""
user = make_user()
community_creator = make_user()
community = make_community(creator=community_creator)
user.join_community_with_name(community_name=community.name)
post_reporter = make_user()
report_category = make_moderation_category()
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
post_reporter.report_post(post=post, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
community_creator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'username': user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_own_reported_and_approved_posts_by_username(self):
"""
should not be able to retrieve own approved and reported posts by username
"""
user = make_user()
global_moderator = make_global_moderator()
post_reporter = make_user()
report_category = make_moderation_category()
post = user.create_public_post(text=make_fake_post_text())
post_reporter.report_post(post=post, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'username': user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_comment_counts_on_community_post_should_exclude_blocked_users(self):
"""
should not count blocked users that are not admins in the comment counts on a community post
"""
user = make_user()
blocked_user = make_user()
community_creator = make_user()
community = make_community(creator=community_creator)
user.join_community_with_name(community_name=community.name)
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
blocked_user.comment_post_with_id(post_id=post.pk, text=make_fake_post_comment_text())
user.block_user_with_id(user_id=blocked_user.pk)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
response_post = response_posts[0]
comments_count = response_post['comments_count']
self.assertEqual(comments_count, 0)
def test_comment_counts_on_community_post_should_include_blocked_users_if_they_are_admins(self):
"""
should count blocked users that ARE admins of that community in the comment counts on a community post
"""
user = make_user()
blocked_user = make_user()
community = make_community(creator=blocked_user)
user.join_community_with_name(community_name=community.name)
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
blocked_user.comment_post_with_id(post_id=post.pk, text=make_fake_post_comment_text())
user.block_user_with_id(user_id=blocked_user.pk)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
response_post = response_posts[0]
comments_count = response_post['comments_count']
self.assertEqual(comments_count, 1)
def test_comment_counts_on_posts_should_include_replies(self):
"""
should count replies in the comment counts on posts
"""
user = make_user()
replier = make_user()
post = user.create_public_post(text=make_fake_post_text())
post_comment = user.comment_post_with_id(post_id=post.pk, text=make_fake_post_comment_text())
replier.reply_to_comment_with_id_for_post_with_uuid(post_comment_id=post_comment.pk,
post_uuid=post.uuid,
text=make_fake_post_comment_text())
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
response_post = response_posts[0]
comments_count = response_post['comments_count']
self.assertEqual(comments_count, 2)
def test_cant_retrieve_own_draft_posts_by_username(self):
"""
should not be able to retrieve own draft posts by username
"""
user = make_user()
user.create_public_post(text=make_fake_post_text(), is_draft=True)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'username': user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_own_public_community_draft_posts_by_username(self):
"""
should not be able to retrieve own public community draft posts by username
"""
user = make_user()
community = make_community()
user.join_community_with_name(community_name=community.name)
user.create_community_post(text=make_fake_post_text(), is_draft=True, community_name=community.name)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'username': user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_own_private_community_part_of_draft_posts_by_username(self):
"""
should not be able to retrieve own private community part of draft posts by username
"""
user = make_user()
community_creator = make_user()
community = make_community(type=Community.COMMUNITY_TYPE_PRIVATE, creator=community_creator)
community_creator.invite_user_with_username_to_community_with_name(username=user.username,
community_name=community.name)
user.join_community_with_name(community_name=community.name)
user.create_community_post(text=make_fake_post_text(), is_draft=True, community_name=community.name)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'username': user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
@mock.patch('openbook_posts.jobs.process_post_media')
def test_cant_retrieve_processing_public_community_posts(self, process_post_media_mock):
"""
should not be able to retrieve processing public community posts
"""
user = make_user()
community_creator = make_user()
community = make_community(creator=community_creator)
post_creator = make_user()
user.join_community_with_name(community_name=community.name)
post_creator.join_community_with_name(community_name=community.name)
test_image = get_test_image()
with open(test_image['path'], 'rb') as file:
file = File(file)
post_creator.create_community_post(community_name=community.name, text=make_fake_post_text(),
image=file)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
@mock.patch('openbook_posts.jobs.process_post_media')
def test_cant_retrieve_processing_private_community_part_of_posts(self, process_post_media_mock):
"""
should not be able to retrieve processing private community posts
"""
user = make_user()
community_creator = make_user()
community = make_community(creator=community_creator, type=Community.COMMUNITY_TYPE_PRIVATE)
post_creator = make_user()
community_creator.invite_user_with_username_to_community_with_name(username=user.username,
community_name=community.name)
community_creator.invite_user_with_username_to_community_with_name(username=post_creator.username,
community_name=community.name)
user.join_community_with_name(community_name=community.name)
post_creator.join_community_with_name(community_name=community.name)
test_image = get_test_image()
with open(test_image['path'], 'rb') as file:
file = File(file)
post_creator.create_community_post(community_name=community.name, text=make_fake_post_text(),
image=file)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
@mock.patch('openbook_posts.jobs.process_post_media')
def test_cant_retrieve_following_user_processing_posts(self, process_post_media_mock):
"""
should not be able to retrieve following user processing posts
"""
user = make_user()
following_user = make_user()
user.follow_user_with_id(user_id=following_user.pk)
test_image = get_test_image()
with open(test_image['path'], 'rb') as file:
file = File(file)
following_user.create_public_post(text=make_fake_post_text(), image=file)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
@mock.patch('openbook_posts.jobs.process_post_media')
def test_cant_retrieve_following_user_processing_posts_when_filtering(self, process_post_media_mock):
"""
should not be able to retrieve following user processing posts when filtering
"""
user = make_user()
following_user = make_user()
follow_list = make_list(creator=user)
user.follow_user_with_id(user_id=following_user.pk, lists_ids=[follow_list.pk])
test_image = get_test_image()
with open(test_image['path'], 'rb') as file:
file = File(file)
following_user.create_public_post(text=make_fake_post_text(), image=file)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'list_id': follow_list.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
@mock.patch('openbook_posts.jobs.process_post_media')
def test_cant_retrieve_connected_user_processing_posts(self, process_post_media_mock):
"""
should not be able to retrieve connected user processing posts
"""
user = make_user()
connected_user = make_user()
user.connect_with_user_with_id(user_id=connected_user.pk)
connected_user_post_circle = make_circle(creator=connected_user)
connected_user.confirm_connection_with_user_with_id(user_id=user.pk,
circles_ids=[connected_user_post_circle.pk])
test_image = get_test_image()
with open(test_image['path'], 'rb') as file:
file = File(file)
connected_user.create_encircled_post(text=make_fake_post_text(),
circles_ids=[connected_user_post_circle.pk],
image=file)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
@mock.patch('openbook_posts.jobs.process_post_media')
def test_cant_retrieve_connected_user_processing_posts_when_filtering(self, process_post_media_mock):
"""
should not be able to retrieve connected user processing posts when filtering
"""
user = make_user()
connected_user = make_user()
user.connect_with_user_with_id(user_id=connected_user.pk)
connected_user_post_circle = make_circle(creator=connected_user)
connected_user.confirm_connection_with_user_with_id(user_id=user.pk,
circles_ids=[connected_user_post_circle.pk])
test_image = get_test_image()
with open(test_image['path'], 'rb') as file:
file = File(file)
connected_user.create_encircled_post(text=make_fake_post_text(),
circles_ids=[connected_user_post_circle.pk], image=file)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'circle_id': connected_user_post_circle.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
@mock.patch('openbook_posts.jobs.process_post_media')
def test_cant_retrieve_connected_user_processing_posts_by_username(self, process_post_media_mock):
"""
should not be able to retrieve connected user processing posts by username
"""
user = make_user()
connected_user = make_user()
user.connect_with_user_with_id(user_id=connected_user.pk)
connected_user_post_circle = make_circle(creator=connected_user)
connected_user.confirm_connection_with_user_with_id(user_id=user.pk,
circles_ids=[connected_user_post_circle.pk])
test_image = get_test_image()
with open(test_image['path'], 'rb') as file:
file = File(file)
connected_user.create_encircled_post(text=make_fake_post_text(),
circles_ids=[connected_user_post_circle.pk], image=file)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'username': connected_user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
@mock.patch('openbook_posts.jobs.process_post_media')
def test_cant_retrieve_following_user_processing_posts_by_username(self, process_post_media_mock):
"""
should not be able to retrieve following user processing posts by username
"""
user = make_user()
following_user = make_user()
follow_list = make_list(creator=user)
user.follow_user_with_id(user_id=following_user.pk, lists_ids=[follow_list.pk])
test_image = get_test_image()
with open(test_image['path'], 'rb') as file:
file = File(file)
following_user.create_public_post(text=make_fake_post_text(), image=file)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'username': following_user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
@mock.patch('openbook_posts.jobs.process_post_media')
def test_cant_retrieve_own_processing_posts_by_username(self, process_post_media_mock):
"""
should not be able to retrieve own processing posts by username
"""
user = make_user()
test_image = get_test_image()
with open(test_image['path'], 'rb') as file:
file = File(file)
user.create_public_post(text=make_fake_post_text(), image=file)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'username': user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
@mock.patch('openbook_posts.jobs.process_post_media')
def test_cant_retrieve_own_public_community_processing_posts_by_username(self, process_post_media_mock):
"""
should not be able to retrieve own public community processing posts by username
"""
user = make_user()
community = make_community()
user.join_community_with_name(community_name=community.name)
test_image = get_test_image()
with open(test_image['path'], 'rb') as file:
file = File(file)
user.create_community_post(text=make_fake_post_text(), image=file, community_name=community.name)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'username': user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
@mock.patch('openbook_posts.jobs.process_post_media')
def test_cant_retrieve_own_private_community_part_of_processing_posts_by_username(self, process_post_media_mock):
"""
should not be able to retrieve own private community part of processing posts by username
"""
user = make_user()
community_creator = make_user()
community = make_community(type=Community.COMMUNITY_TYPE_PRIVATE, creator=community_creator)
community_creator.invite_user_with_username_to_community_with_name(username=user.username,
community_name=community.name)
user.join_community_with_name(community_name=community.name)
test_image = get_test_image()
with open(test_image['path'], 'rb') as file:
file = File(file)
user.create_community_post(text=make_fake_post_text(), image=file, community_name=community.name)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'username': user.username
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_create_post_notifies_subscribers(self):
"""
should notify subscribers when a post is created
"""
user = make_user()
subscriber = make_user()
post_text = fake.text(max_nb_chars=POST_MAX_LENGTH)
headers = make_authentication_headers_for_user(user)
data = {'text': post_text}
subscriber.enable_new_post_notifications_for_user_with_username(user.username)
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
user_notifications_subscription = UserNotificationsSubscription.objects.get(subscriber=subscriber, user=user)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertTrue(UserNewPostNotification.objects.filter(
user_notifications_subscription=user_notifications_subscription).count() == 1)
def test_create_post_does_not_notify_subscribers_if_post_creator_is_blocked(self):
"""
should NOT notify subscribers if creator is blocked when a post is created
"""
user = make_user()
subscriber = make_user()
post_text = fake.text(max_nb_chars=POST_MAX_LENGTH)
headers = make_authentication_headers_for_user(user)
data = {'text': post_text}
subscriber.enable_new_post_notifications_for_user_with_username(user.username)
subscriber.block_user_with_id(user_id=user.pk)
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
response_post = json.loads(response.content)
post = Post.objects.get(id=response_post['id'])
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertTrue(UserNewPostNotification.objects.filter(
post=post).count() == 0)
def test_create_post_does_not_notify_subscribers_if_they_have_been_blocked(self):
"""
should NOT notify subscribers if creator has blocked them
"""
user = make_user()
subscriber = make_user()
post_text = fake.text(max_nb_chars=POST_MAX_LENGTH)
headers = make_authentication_headers_for_user(user)
data = {'text': post_text}
subscriber.enable_new_post_notifications_for_user_with_username(user.username)
user.block_user_with_id(user_id=subscriber.pk)
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
response_post = json.loads(response.content)
post = Post.objects.get(id=response_post['id'])
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertTrue(UserNewPostNotification.objects.filter(
post=post).count() == 0)
def test_encircled_post_should_notify_only_subscribers_in_that_circle(self):
"""
should only notify subscribers in a circle if encircled post
"""
post_creator = make_user()
subscriber = make_user()
other_subscriber = make_user()
post_text = fake.text(max_nb_chars=POST_MAX_LENGTH)
headers = make_authentication_headers_for_user(post_creator)
circle = mixer.blend(Circle, creator=post_creator)
# connect with subscriber and add them to circle
subscriber.connect_with_user_with_id(post_creator.pk)
post_creator.confirm_connection_with_user_with_id(user_id=subscriber.pk, circles_ids=[circle.pk])
# both users subscribe
subscriber.enable_new_post_notifications_for_user_with_username(post_creator.username)
other_subscriber.enable_new_post_notifications_for_user_with_username(post_creator.username)
data = {
'text': post_text,
'circle_id': circle.pk
}
url = self._get_url()
response = self.client.put(url, data, **headers, format='multipart')
other_subscriber_notifications_subscription = UserNotificationsSubscription.objects.get(
subscriber=other_subscriber, user=post_creator)
subscriber_notifications_subscription = UserNotificationsSubscription.objects.get(
subscriber=subscriber, user=post_creator)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertTrue(UserNewPostNotification.objects.filter(
user_notifications_subscription=other_subscriber_notifications_subscription).count() == 0)
self.assertTrue(UserNewPostNotification.objects.filter(
user_notifications_subscription=subscriber_notifications_subscription).count() == 1)
def _get_url(self):
return reverse('posts')
class TrendingPostsAPITests(OpenbookAPITestCase):
"""
TrendingPostsAPITests
"""
fixtures = [
'openbook_circles/fixtures/circles.json'
]
def test_displays_community_posts_only(self):
"""
should display community posts only and return 200
"""
user = make_user()
community = make_community(creator=user)
user.create_public_post(text=make_fake_post_text())
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
headers = make_authentication_headers_for_user(user)
emoji_group = make_reactions_emoji_group()
emoji = make_emoji(group=emoji_group)
# react once, min required while testing
user.react_to_post_with_id(post_id=post.pk, emoji_id=emoji.pk)
curate_trending_posts()
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
response_post = response_posts[0]
self.assertEqual(response_post['post']['id'], post.pk)
self.assertTrue(TrendingPost.objects.filter(post__id=post.pk).exists())
def test_does_not_curate_community_posts_with_less_than_min_reactions(self):
"""
should not curate community posts with less than minimum reactions and return 200
"""
user = make_user()
community = make_community(creator=user)
user.create_public_post(text=make_fake_post_text())
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
headers = make_authentication_headers_for_user(user)
curate_trending_posts()
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
self.assertFalse(TrendingPost.objects.filter(post__id=post.pk).exists())
def test_does_not_display_closed_community_posts(self):
"""
should not display community posts that are closed
"""
user = make_user()
community = make_community(creator=user)
user.create_public_post(text=make_fake_post_text())
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
post_two = user.create_community_post(community_name=community.name, text=make_fake_post_text())
post_two.is_closed = True
post_two.save()
headers = make_authentication_headers_for_user(user)
emoji_group = make_reactions_emoji_group()
emoji = make_emoji(group=emoji_group)
# react once, min required while testing
user.react_to_post_with_id(post_id=post.pk, emoji_id=emoji.pk)
user.react_to_post_with_id(post_id=post_two.pk, emoji_id=emoji.pk)
curate_trending_posts()
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
response_post = response_posts[0]
self.assertEqual(response_post['post']['id'], post.pk)
self.assertFalse(TrendingPost.objects.filter(post__id=post_two.pk).exists())
self.assertTrue(TrendingPost.objects.filter(post__id=post.pk).exists())
def test_does_not_display_post_from_community_banned_from(self):
"""
should not display posts from a community banned from and return 200
"""
user = make_user()
community_owner = make_user()
community = make_community(creator=community_owner)
user.join_community_with_name(community_name=community.name)
post = community_owner.create_community_post(community_name=community.name, text=make_fake_post_text())
emoji_group = make_reactions_emoji_group()
emoji = make_emoji(group=emoji_group)
# react once, min required while testing
user.react_to_post_with_id(post_id=post.pk, emoji_id=emoji.pk)
community_owner.ban_user_with_username_from_community_with_name(username=user.username,
community_name=community.name)
headers = make_authentication_headers_for_user(user)
curate_trending_posts()
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_post_of_blocked_user(self):
"""
should not be able to retrieve posts of a blocked user
"""
user = make_user()
community = make_community(creator=user)
user_to_retrieve_posts_from = make_user()
user_to_retrieve_posts_from.join_community_with_name(community_name=community.name)
post = user_to_retrieve_posts_from.create_community_post(text=make_fake_post_text(),
community_name=community.name)
# react once, min required while testing
emoji_group = make_reactions_emoji_group()
emoji = make_emoji(group=emoji_group)
user.react_to_post_with_id(post_id=post.pk, emoji_id=emoji.pk)
user.block_user_with_id(user_id=user_to_retrieve_posts_from.pk)
curate_trending_posts()
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_post_of_blocking_user(self):
"""
should not be able to retrieve posts of a blocking user
"""
user = make_user()
community = make_community(creator=user)
user_to_retrieve_posts_from = make_user()
user_to_retrieve_posts_from.join_community_with_name(community_name=community.name)
post = user_to_retrieve_posts_from.create_community_post(text=make_fake_post_text(),
community_name=community.name)
emoji_group = make_reactions_emoji_group()
emoji = make_emoji(group=emoji_group)
# react once, min required while testing
user.react_to_post_with_id(post_id=post.pk, emoji_id=emoji.pk)
user_to_retrieve_posts_from.block_user_with_id(user_id=user.pk)
curate_trending_posts()
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_post_of_blocked_community_staff_member(self):
"""
should not be able to retrieve posts of a blocked community staff member
"""
user = make_user()
community_owner = make_user()
community = make_community(creator=community_owner)
user.join_community_with_name(community_name=community.name)
post = community_owner.create_community_post(text=make_fake_post_text(), community_name=community.name)
emoji_group = make_reactions_emoji_group()
emoji = make_emoji(group=emoji_group)
# react once, min required while testing
user.react_to_post_with_id(post_id=post.pk, emoji_id=emoji.pk)
# block user
user.block_user_with_id(user_id=community_owner.pk)
curate_trending_posts()
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_does_not_curate_encircled_posts(self):
"""
should not curate encircled posts in trending posts
"""
post_creator = make_user()
user = make_user()
circle = make_circle(creator=post_creator)
post_creator.connect_with_user_with_id(user_id=user.pk, circles_ids=[circle.pk])
user.confirm_connection_with_user_with_id(user_id=post_creator.pk)
post_text = make_fake_post_text()
post = post_creator.create_encircled_post(text=post_text, circles_ids=[circle.pk])
emoji_group = make_reactions_emoji_group()
emoji = make_emoji(group=emoji_group)
# react once, min required while testing
user.react_to_post_with_id(post_id=post.pk, emoji_id=emoji.pk)
# curate trending posts
curate_trending_posts()
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
trending_posts = TrendingPost.objects.all()
self.assertEqual(0, len(trending_posts))
self.assertFalse(TrendingPost.objects.filter(post__id=post.pk).exists())
def test_does_not_curate_private_community_posts(self):
"""
should not curate private community posts in trending posts
"""
user = make_user()
community = make_community(creator=user, type=Community.COMMUNITY_TYPE_PRIVATE)
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
emoji_group = make_reactions_emoji_group()
emoji = make_emoji(group=emoji_group)
# react once, min required while testing
user.react_to_post_with_id(post_id=post.pk, emoji_id=emoji.pk)
# curate trending posts
curate_trending_posts()
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
trending_posts = TrendingPost.objects.all()
self.assertEqual(0, len(trending_posts))
self.assertFalse(TrendingPost.objects.filter(post__id=post.pk).exists())
def test_does_not_return_recently_turned_private_community_posts(self):
"""
should not return recently turned private community posts in trending posts
"""
user = make_user()
community = make_community(creator=user, type=Community.COMMUNITY_TYPE_PUBLIC)
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
emoji_group = make_reactions_emoji_group()
emoji = make_emoji(group=emoji_group)
# react once, min required while testing
user.react_to_post_with_id(post_id=post.pk, emoji_id=emoji.pk)
# curate trending posts
curate_trending_posts()
community.type = Community.COMMUNITY_TYPE_PRIVATE
community.save()
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
trending_posts = TrendingPost.objects.all()
self.assertEqual(1, len(trending_posts))
self.assertTrue(TrendingPost.objects.filter(post__id=post.pk).exists())
def test_does_not_display_curated_closed_community_posts(self):
"""
should not display community posts that are closed after already curated in trending posts
"""
user = make_user()
community = make_community(creator=user)
user.create_public_post(text=make_fake_post_text())
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
post_two = user.create_community_post(community_name=community.name, text=make_fake_post_text())
emoji_group = make_reactions_emoji_group()
emoji = make_emoji(group=emoji_group)
# react once, min required while testing
user.react_to_post_with_id(post_id=post.pk, emoji_id=emoji.pk)
user.react_to_post_with_id(post_id=post_two.pk, emoji_id=emoji.pk)
# curate trending posts
curate_trending_posts()
post_two.is_closed = True
post_two.save()
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
response_post = response_posts[0]
self.assertEqual(response_post['post']['id'], post.pk)
def test_does_not_display_reported_community_posts_that_are_approved(self):
"""
should not display community posts whose reports have been approved by staff in trending posts
"""
user = make_user()
post_reporter = make_user()
community = make_community(creator=user)
post_reporter.join_community_with_name(community_name=community.name)
user.create_public_post(text=make_fake_post_text())
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
post_two = user.create_community_post(community_name=community.name, text=make_fake_post_text())
# report and approve the report for one post
moderation_category = make_moderation_category()
post_reporter.report_post(post=post, category_id=moderation_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=moderation_category.pk)
user.approve_moderated_object(moderated_object=moderated_object)
emoji_group = make_reactions_emoji_group()
emoji = make_emoji(group=emoji_group)
# react once, min required while testing
user.react_to_post_with_id(post_id=post.pk, emoji_id=emoji.pk)
user.react_to_post_with_id(post_id=post_two.pk, emoji_id=emoji.pk)
# curate trending posts
curate_trending_posts()
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
response_post = response_posts[0]
self.assertEqual(response_post['post']['id'], post_two.pk)
trending_posts = TrendingPost.objects.all()
self.assertEqual(1, len(trending_posts))
self.assertTrue(TrendingPost.objects.filter(post__id=post_two.pk).exists())
def test_does_not_display_reported_community_posts_that_are_approved_after_curation(self):
"""
should not display community posts whose reports are approved by staff after the posts have already been curated into trending posts
"""
user = make_user()
post_reporter = make_user()
community = make_community(creator=user)
post_reporter.join_community_with_name(community_name=community.name)
user.create_public_post(text=make_fake_post_text())
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
post_two = user.create_community_post(community_name=community.name, text=make_fake_post_text())
# report and approve the report for one post
moderation_category = make_moderation_category()
post_reporter.report_post(post=post, category_id=moderation_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=moderation_category.pk)
emoji_group = make_reactions_emoji_group()
emoji = make_emoji(group=emoji_group)
# react once, min required while testing
user.react_to_post_with_id(post_id=post.pk, emoji_id=emoji.pk)
user.react_to_post_with_id(post_id=post_two.pk, emoji_id=emoji.pk)
# curate trending posts
curate_trending_posts()
user.approve_moderated_object(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
response_post = response_posts[0]
self.assertEqual(response_post['post']['id'], post_two.pk)
def _get_url(self):
return reverse('trending-posts-new')
class TopPostsAPITests(OpenbookAPITestCase):
"""
TopPostsAPITests
"""
fixtures = [
'openbook_circles/fixtures/circles.json',
]
def test_displays_community_posts_only(self):
"""
should display community posts only in top posts and return 200
"""
user = make_user()
community = make_community(creator=user)
public_post = user.create_public_post(text=make_fake_post_text())
community_post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
# comment on both posts to qualify for top
user.comment_post(community_post, text=make_fake_post_comment_text())
user.comment_post(public_post, text=make_fake_post_comment_text())
# curate top posts
curate_top_posts()
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
response_post = response_posts[0]
self.assertEqual(response_post['post']['id'], community_post.pk)
top_posts = TopPost.objects.all()
self.assertEqual(1, len(top_posts))
self.assertTrue(TopPost.objects.filter(post__id=community_post.pk).exists())
def test_excludes_joined_communities_if_true(self):
"""
should display posts only from communities not joined by user if exclude_joined_communities is true
"""
user = make_user()
community_creator = make_user()
user_community = make_community(creator=user)
community = make_community(creator=community_creator)
user_community_post = user.create_community_post(community_name=user_community.name, text=make_fake_post_text())
community_post = community_creator.create_community_post(community_name=community.name,
text=make_fake_post_text())
# comment on both posts to qualify for top
user.comment_post(user_community_post, text=make_fake_post_comment_text())
community_creator.comment_post(community_post, text=make_fake_post_comment_text())
# curate top posts
curate_top_posts()
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, {'exclude_joined_communities': True}, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
response_post = response_posts[0]
self.assertEqual(response_post['post']['id'], community_post.pk)
top_posts = TopPost.objects.all()
self.assertEqual(2, len(top_posts))
def test_does_not_display_excluded_community_posts(self):
"""
should not display excluded community posts in top posts
"""
user = make_user()
community = make_community(creator=user)
public_post = user.create_public_post(text=make_fake_post_text())
community_post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
# comment on both posts to qualify for top
user.comment_post(community_post, text=make_fake_post_comment_text())
user.comment_post(public_post, text=make_fake_post_comment_text())
user.exclude_community_with_name_from_top_posts(community.name)
# curate top posts
curate_top_posts()
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
top_posts = TopPost.objects.all()
self.assertEqual(1, len(top_posts))
self.assertTrue(TopPost.objects.filter(post__id=community_post.pk).exists())
def test_does_not_curate_encircled_posts(self):
"""
should not curate encircled posts in top posts
"""
post_creator = make_user()
user = make_user()
circle = make_circle(creator=post_creator)
post_creator.connect_with_user_with_id(user_id=user.pk, circles_ids=[circle.pk])
user.confirm_connection_with_user_with_id(user_id=post_creator.pk)
post_text = make_fake_post_text()
post = post_creator.create_encircled_post(text=post_text, circles_ids=[circle.pk])
# comment on post to qualify for top
user.comment_post(post, text=make_fake_post_comment_text())
# curate top posts
curate_top_posts()
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
top_posts = TopPost.objects.all()
self.assertEqual(0, len(top_posts))
self.assertFalse(TopPost.objects.filter(post__id=post.pk).exists())
def test_does_not_curate_private_community_posts(self):
"""
should not curate private community posts in top posts
"""
user = make_user()
community = make_community(creator=user, type=Community.COMMUNITY_TYPE_PRIVATE)
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
# comment on post to qualify for top
user.comment_post(post, text=make_fake_post_comment_text())
# curate top posts
curate_top_posts()
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
top_posts = TopPost.objects.all()
self.assertEqual(0, len(top_posts))
self.assertFalse(TopPost.objects.filter(post__id=post.pk).exists())
def test_does_not_return_recently_turned_private_community_posts(self):
"""
should not return posts from a community that was recently made private in top posts
"""
user = make_user()
community = make_community(creator=user, type=Community.COMMUNITY_TYPE_PUBLIC)
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
# comment on post to qualify for top
user.comment_post(post, text=make_fake_post_comment_text())
# curate top posts
curate_top_posts()
community.type = Community.COMMUNITY_TYPE_PRIVATE
community.save()
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
top_posts = TopPost.objects.all()
self.assertEqual(1, len(top_posts))
self.assertTrue(TopPost.objects.filter(post__id=post.pk).exists())
def test_does_not_display_closed_community_posts(self):
"""
should not display community posts that are closed in top posts
"""
user = make_user()
community = make_community(creator=user)
user.create_public_post(text=make_fake_post_text())
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
post_two = user.create_community_post(community_name=community.name, text=make_fake_post_text())
post_two.is_closed = True
post_two.save()
# comment on both posts to qualify for top
user.comment_post(post, text=make_fake_post_comment_text())
user.comment_post(post_two, text=make_fake_post_comment_text())
# curate top posts
curate_top_posts()
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
response_post = response_posts[0]
self.assertEqual(response_post['post']['id'], post.pk)
top_posts = TopPost.objects.all()
self.assertEqual(1, len(top_posts))
self.assertTrue(TopPost.objects.filter(post__id=post.pk).exists())
def test_does_not_display_curated_closed_community_posts(self):
"""
should not display community posts that are closed after having already been curated into top posts
"""
user = make_user()
community = make_community(creator=user)
user.create_public_post(text=make_fake_post_text())
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
post_two = user.create_community_post(community_name=community.name, text=make_fake_post_text())
# comment on both posts to qualify for top
user.comment_post(post, text=make_fake_post_comment_text())
user.comment_post(post_two, text=make_fake_post_comment_text())
# curate top posts
curate_top_posts()
post_two.is_closed = True
post_two.save()
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
response_post = response_posts[0]
self.assertEqual(response_post['post']['id'], post.pk)
def test_does_not_display_reported_community_posts_that_are_approved(self):
"""
should not display community posts whose reports have been approved by staff in top posts
"""
user = make_user()
post_reporter = make_user()
community = make_community(creator=user)
post_reporter.join_community_with_name(community_name=community.name)
user.create_public_post(text=make_fake_post_text())
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
post_two = user.create_community_post(community_name=community.name, text=make_fake_post_text())
# comment on both posts to qualify for top
user.comment_post(post, text=make_fake_post_comment_text())
user.comment_post(post_two, text=make_fake_post_comment_text())
# report and approve the report for one post
moderation_category = make_moderation_category()
post_reporter.report_post(post=post, category_id=moderation_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=moderation_category.pk)
user.approve_moderated_object(moderated_object=moderated_object)
# curate top posts
curate_top_posts()
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
response_post = response_posts[0]
self.assertEqual(response_post['post']['id'], post_two.pk)
top_posts = TopPost.objects.all()
self.assertEqual(1, len(top_posts))
self.assertTrue(TopPost.objects.filter(post__id=post_two.pk).exists())
def test_does_not_display_reported_community_posts_that_are_approved_after_curation(self):
"""
should not display community posts whose reports are approved by staff after the posts have already been curated into top posts
"""
user = make_user()
post_reporter = make_user()
community = make_community(creator=user)
post_reporter.join_community_with_name(community_name=community.name)
user.create_public_post(text=make_fake_post_text())
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
post_two = user.create_community_post(community_name=community.name, text=make_fake_post_text())
# comment on both posts to qualify for top
user.comment_post(post, text=make_fake_post_comment_text())
user.comment_post(post_two, text=make_fake_post_comment_text())
# report and approve the report for one post
moderation_category = make_moderation_category()
post_reporter.report_post(post=post, category_id=moderation_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=moderation_category.pk)
# curate top posts
curate_top_posts()
user.approve_moderated_object(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
response_post = response_posts[0]
self.assertEqual(response_post['post']['id'], post_two.pk)
def test_does_not_display_post_from_community_banned_from(self):
"""
should not display posts from a community the user is banned from and return 200 in top posts
"""
user = make_user()
community_owner = make_user()
community = make_community(creator=community_owner)
user.join_community_with_name(community_name=community.name)
community_owner.ban_user_with_username_from_community_with_name(username=user.username,
community_name=community.name)
post = community_owner.create_community_post(community_name=community.name, text=make_fake_post_text())
# comment on post to qualify for top
community_owner.comment_post(post, text=make_fake_post_comment_text())
# curate top posts
curate_top_posts()
headers = make_authentication_headers_for_user(user)
url = self._get_url()
response = self.client.get(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_post_of_blocked_user(self):
"""
should not be able to retrieve posts of a blocked user in top posts
"""
user = make_user()
user_to_retrieve_posts_from = make_user()
community = make_community(creator=user_to_retrieve_posts_from)
post = user_to_retrieve_posts_from.create_community_post(community_name=community.name,
text=make_fake_post_text())
user_to_retrieve_posts_from.comment_post(post, text=make_fake_post_comment_text())
user.follow_user_with_id(user_id=user_to_retrieve_posts_from.pk)
user.block_user_with_id(user_id=user_to_retrieve_posts_from.pk)
# curate top posts
curate_top_posts()
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_post_of_blocking_user(self):
"""
should not be able to retrieve posts of a blocking user in top posts
"""
user = make_user()
user_to_retrieve_posts_from = make_user()
community = make_community(creator=user_to_retrieve_posts_from)
post = user_to_retrieve_posts_from.create_community_post(community_name=community.name,
text=make_fake_post_text())
user_to_retrieve_posts_from.comment_post(post, text=make_fake_post_comment_text())
user_to_retrieve_posts_from.block_user_with_id(user_id=user.pk)
# curate top posts
curate_top_posts()
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_cant_retrieve_post_of_blocked_community_staff_member(self):
"""
should not be able to retrieve posts of a blocked community staff member
"""
user = make_user()
community_owner = make_user()
community = make_community(creator=community_owner)
post = community_owner.create_community_post(community_name=community.name, text=make_fake_post_text())
community_owner.comment_post(post, text=make_fake_post_comment_text())
user.block_user_with_id(user_id=community_owner.pk)
# curate top posts
curate_top_posts()
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(0, len(response_posts))
def test_should_have_minimum_comments_to_curate_as_top_post(self):
"""
should not curate a post as a top post unless it has the minimum number of comments
"""
user = make_user()
community_owner = make_user()
community = make_community(creator=community_owner)
post = community_owner.create_community_post(community_name=community.name, text=make_fake_post_text())
post_two = community_owner.create_community_post(community_name=community.name, text=make_fake_post_text())
# comment once, min comments required while testing
community_owner.comment_post(post, text=make_fake_post_comment_text())
# curate top posts
curate_top_posts()
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
top_posts = TopPost.objects.all()
self.assertEqual(1, len(top_posts))
self.assertTrue(TopPost.objects.filter(post__id=post.pk).exists())
def test_should_have_minimum_reactions_to_curate_as_top_post(self):
"""
should not curate a post as a top post unless it has the minimum number of reactions
"""
user = make_user()
community_owner = make_user()
community = make_community(creator=community_owner)
post = community_owner.create_community_post(community_name=community.name, text=make_fake_post_text())
post_two = community_owner.create_community_post(community_name=community.name, text=make_fake_post_text())
emoji_group = make_reactions_emoji_group()
emoji = make_emoji(group=emoji_group)
# react once, min required while testing
community_owner.react_to_post_with_id(post_id=post.pk, emoji_id=emoji.pk)
# curate top posts
curate_top_posts()
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(1, len(response_posts))
top_posts = TopPost.objects.all()
self.assertEqual(1, len(top_posts))
self.assertTrue(TopPost.objects.filter(post__id=post.pk).exists())
def test_should_respect_max_id_param_for_top_posts(self):
"""
should take the max_id query parameter into account when returning top posts
"""
user = make_user()
total_posts = 10
community = make_community(creator=user)
for i in range(total_posts):
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
user.comment_post(post, text=make_fake_post_comment_text())
# curate top posts
curate_top_posts()
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {'max_id': 5}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(4, len(response_posts))
for top_post in response_posts:
self.assertLess(top_post['id'], 5)
def test_should_respect_min_id_param_for_top_posts(self):
"""
should take the min_id query parameter into account when returning top posts
"""
user = make_user()
total_posts = 10
community = make_community(creator=user)
for i in range(total_posts):
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
user.comment_post(post, text=make_fake_post_comment_text())
# curate top posts
curate_top_posts()
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {'min_id': 5}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(5, len(response_posts))
for top_post in response_posts:
self.assertGreater(top_post['id'], 5)
def test_should_respect_count_param_for_top_posts(self):
"""
should take into account count when returning top posts
"""
user = make_user()
total_posts = 10
community = make_community(creator=user)
for i in range(total_posts):
post = user.create_community_post(community_name=community.name, text=make_fake_post_text())
user.comment_post(post, text=make_fake_post_comment_text())
# curate top posts
curate_top_posts()
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {'count': 5}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_posts = json.loads(response.content)
self.assertEqual(5, len(response_posts))
def _get_url(self):
return reverse('top-posts')
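# The max_id/min_id/count tests above describe keyset pagination over the
# 'top-posts' endpoint. The helper below is a minimal client-side sketch of
# that contract, not part of the suite: `fetch_page` is a hypothetical callable
# standing in for self.client.get, and the [{'id': ..., 'post': {...}}]
# response shape is taken from the assertions above.
def paginate_top_posts(fetch_page, page_size=5):
    """Yield every top post, newest first, paging with count and max_id."""
    max_id = None
    while True:
        params = {'count': page_size}
        if max_id is not None:
            # per the max_id test, only items with id < max_id come back
            params['max_id'] = max_id
        page = fetch_page(params)
        if not page:
            return
        yield from page
        max_id = min(item['id'] for item in page)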
class ProfilePostsExcludedCommunitiesAPITests(OpenbookAPITestCase):
"""
ProfilePostsExcludedCommunitiesAPI
"""
def test_retrieve_excluded_communities(self):
"""
should be able to retrieve all excluded communities and return 200
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
communities = mixer.cycle(5).blend(Community, creator=user)
communities_ids = [community.pk for community in communities]
for community in communities:
user.join_community_with_name(community_name=community.name)
user.exclude_community_with_name_from_profile_posts(community_name=community.name)
url = self._get_url()
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_communities = json.loads(response.content)
self.assertEqual(len(response_communities), len(communities_ids))
for response_community in response_communities:
response_community_id = response_community.get('id')
self.assertIn(response_community_id, communities_ids)
def test_should_not_retrieve_non_excluded_communities(self):
"""
should NOT retrieve non-excluded communities and return 200
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
communities = mixer.cycle(5).blend(Community, creator=user)
for community in communities:
user.join_community_with_name(community_name=community.name)
url = self._get_url()
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_communities = json.loads(response.content)
self.assertEqual(len(response_communities), 0)
def test_retrieve_excluded_communities_offset(self):
"""
should be able to retrieve all excluded communities with an offset and return 200
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
total_amount_of_communities = 10
offset = 5
communities = mixer.cycle(total_amount_of_communities).blend(Community, creator=user)
offsetted_communities = communities[offset:]
offsetted_communities_ids = [community.pk for community in offsetted_communities]
for community in communities:
user.join_community_with_name(community_name=community.name)
user.exclude_community_with_name_from_profile_posts(community_name=community.name)
url = self._get_url()
response = self.client.get(url, {
'offset': offset
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_communities = json.loads(response.content)
self.assertEqual(len(response_communities), total_amount_of_communities - offset)
for response_community in response_communities:
response_community_id = response_community.get('id')
self.assertIn(response_community_id, offsetted_communities_ids)
def test_can_exclude_public_community(self):
"""
should be able to exclude a public community from profile posts
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
other_user = make_user()
community = make_community(creator=other_user)
url = self._get_url()
data = {
'community_name': community.name
}
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_202_ACCEPTED)
self.assertTrue(user.has_excluded_community_with_name_from_profile_posts(community_name=community.name))
def test_can_exclude_private_community(self):
"""
should be able to exclude a private community from profile posts
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
other_user = make_user()
community = make_community(creator=other_user, type=Community.COMMUNITY_TYPE_PRIVATE)
url = self._get_url()
data = {
'community_name': community.name
}
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_202_ACCEPTED)
self.assertTrue(user.has_excluded_community_with_name_from_profile_posts(community_name=community.name))
def test_cannot_exclude_community_already_excluded(self):
"""
should not be able to exclude a community if already excluded from profile posts
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
other_user = make_user()
community = make_community(creator=other_user)
user.exclude_community_with_name_from_profile_posts(community.name)
url = self._get_url()
data = {
'community_name': community.name
}
response = self.client.put(url, data, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertTrue(user.has_excluded_community_with_name_from_profile_posts(community_name=community.name))
def _get_url(self):
return reverse('profile-posts-excluded-communities')
class SearchProfilePostsExcludedCommunitiesAPITests(OpenbookAPITestCase):
"""
SearchProfilePostsExcludedCommunitiesAPI
"""
def test_can_search_excluded_communities_by_name(self):
"""
should be able to search for excluded communities by their name and return 200
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
amount_of_joined_communities_to_search_for = 5
for i in range(0, amount_of_joined_communities_to_search_for):
community_name = fake.user_name().lower()
community = mixer.blend(Community, name=community_name, type=Community.COMMUNITY_TYPE_PUBLIC)
user.exclude_community_with_name_from_profile_posts(community_name)
amount_of_characters_to_query = random.randint(1, len(community_name))
query = community_name[0:amount_of_characters_to_query]
final_query = ''
for character in query:
final_query = final_query + (character.upper() if fake.boolean() else character.lower())
url = self._get_url()
response = self.client.get(url, {
'query': final_query
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
parsed_response = json.loads(response.content)
self.assertEqual(len(parsed_response), 1)
retrieved_community = parsed_response[0]
self.assertEqual(retrieved_community['name'], community_name.lower())
community.delete()
def test_can_search_excluded_communities_by_title(self):
"""
should be able to search for excluded communities by their title and return 200
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
amount_of_joined_communities_to_search_for = 5
for i in range(0, amount_of_joined_communities_to_search_for):
community_title = fake.user_name().lower()
community = mixer.blend(Community, title=community_title, type=Community.COMMUNITY_TYPE_PUBLIC)
user.exclude_community_with_name_from_profile_posts(community.name)
amount_of_characters_to_query = random.randint(1, len(community_title))
query = community_title[0:amount_of_characters_to_query]
final_query = ''
for character in query:
final_query = final_query + (character.upper() if fake.boolean() else character.lower())
url = self._get_url()
response = self.client.get(url, {
'query': final_query
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
parsed_response = json.loads(response.content)
self.assertEqual(len(parsed_response), 1)
retrieved_community = parsed_response[0]
self.assertEqual(retrieved_community['title'], community_title.lower())
community.delete()
def _get_url(self):
return reverse('search-profile-posts-excluded-communities')
class ProfilePostsExcludedCommunityAPITests(OpenbookAPITestCase):
"""
ProfilePostsExcludedCommunityAPI
"""
def test_can_remove_excluded_community(self):
"""
should be able to remove a community exclusion
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
other_user = make_user()
community = make_community(creator=other_user)
user.exclude_community_with_name_from_profile_posts(community.name)
url = self._get_url(community=community)
response = self.client.delete(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_202_ACCEPTED)
self.assertFalse(user.has_excluded_community_with_name_from_profile_posts(community_name=community.name))
def test_cannot_remove_exclusion_for_community_if_not_excluded(self):
"""
should not be able to remove a community exclusion if the community is not excluded in the first place
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
other_user = make_user()
community = make_community(creator=other_user)
url = self._get_url(community=community)
response = self.client.delete(url, **headers, format='multipart')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertFalse(user.has_excluded_community_with_name_from_profile_posts(community_name=community.name))
def _get_url(self, community):
return reverse('profile-posts-excluded-community', kwargs={
'community_name': community.name
})
class TopPostsExcludedCommunitiesAPITests(OpenbookAPITestCase):
    """
    TopPostsExcludedCommunitiesAPI
    """

    def test_retrieve_excluded_communities(self):
        """
        should be able to retrieve all excluded communities and return 200
        """
        user = make_user()
        headers = make_authentication_headers_for_user(user)

        communities = mixer.cycle(5).blend(Community, creator=user)
        communities_ids = [community.pk for community in communities]

        for community in communities:
            user.join_community_with_name(community_name=community.name)
            user.exclude_community_with_name_from_top_posts(community_name=community.name)

        url = self._get_url()
        response = self.client.get(url, **headers)

        self.assertEqual(response.status_code, status.HTTP_200_OK)

        response_communities = json.loads(response.content)
        self.assertEqual(len(response_communities), len(communities_ids))

        for response_community in response_communities:
            response_community_id = response_community.get('id')
            self.assertIn(response_community_id, communities_ids)

    def test_should_not_retrieve_non_excluded_communities(self):
        """
        should NOT retrieve non-excluded communities and return 200
        """
        user = make_user()
        headers = make_authentication_headers_for_user(user)

        communities = mixer.cycle(5).blend(Community, creator=user)
        for community in communities:
            user.join_community_with_name(community_name=community.name)

        url = self._get_url()
        response = self.client.get(url, **headers)

        self.assertEqual(response.status_code, status.HTTP_200_OK)

        response_communities = json.loads(response.content)
        self.assertEqual(len(response_communities), 0)

    def test_retrieve_excluded_communities_offset(self):
        """
        should be able to retrieve all excluded communities with an offset and return 200
        """
        user = make_user()
        headers = make_authentication_headers_for_user(user)

        total_amount_of_communities = 10
        offset = 5

        communities = mixer.cycle(total_amount_of_communities).blend(Community, creator=user)
        offsetted_communities = communities[offset: total_amount_of_communities]
        offsetted_communities_ids = [community.pk for community in offsetted_communities]

        for community in communities:
            user.join_community_with_name(community_name=community.name)
            user.exclude_community_with_name_from_top_posts(community_name=community.name)

        url = self._get_url()
        response = self.client.get(url, {
            'offset': offset
        }, **headers)

        self.assertEqual(response.status_code, status.HTTP_200_OK)

        response_communities = json.loads(response.content)
        self.assertEqual(len(response_communities), total_amount_of_communities - offset)

        for response_community in response_communities:
            response_community_id = response_community.get('id')
            self.assertIn(response_community_id, offsetted_communities_ids)
    def test_can_exclude_public_community(self):
        """
        should be able to exclude a public community from top posts
        """
        user = make_user()
        headers = make_authentication_headers_for_user(user)

        other_user = make_user()
        community = make_community(creator=other_user)

        url = self._get_url()
        data = {
            'community_name': community.name
        }
        response = self.client.put(url, data, **headers, format='multipart')

        self.assertEqual(response.status_code, status.HTTP_202_ACCEPTED)
        self.assertTrue(user.has_excluded_community_with_name_from_top_posts(community_name=community.name))

    def test_cannot_exclude_private_community(self):
        """
        should NOT be able to exclude a private community from top posts
        """
        user = make_user()
        headers = make_authentication_headers_for_user(user)

        other_user = make_user()
        community = make_community(creator=other_user, type=Community.COMMUNITY_TYPE_PRIVATE)

        url = self._get_url()
        data = {
            'community_name': community.name
        }
        response = self.client.put(url, data, **headers, format='multipart')

        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        self.assertFalse(user.has_excluded_community_with_name_from_top_posts(community_name=community.name))

    def test_cannot_exclude_community_already_excluded(self):
        """
        should not be able to exclude a community if it is already excluded from top posts
        """
        user = make_user()
        headers = make_authentication_headers_for_user(user)

        other_user = make_user()
        community = make_community(creator=other_user)
        user.exclude_community_with_name_from_top_posts(community.name)

        url = self._get_url()
        data = {
            'community_name': community.name
        }
        response = self.client.put(url, data, **headers, format='multipart')

        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        self.assertTrue(user.has_excluded_community_with_name_from_top_posts(community_name=community.name))

    def _get_url(self):
        return reverse('top-posts-excluded-communities')
class SearchTopPostsExcludedCommunitiesAPITests(OpenbookAPITestCase):
    """
    SearchTopPostsExcludedCommunitiesAPI
    """

    def test_can_search_excluded_communities_by_name(self):
        """
        should be able to search for excluded communities by their name and return 200
        """
        user = make_user()
        headers = make_authentication_headers_for_user(user)

        amount_of_joined_communities_to_search_for = 5

        for i in range(0, amount_of_joined_communities_to_search_for):
            community_name = fake.user_name().lower()
            community = mixer.blend(Community, name=community_name, type=Community.COMMUNITY_TYPE_PUBLIC)
            user.exclude_community_with_name_from_top_posts(community_name)

            amount_of_characters_to_query = random.randint(1, len(community_name))
            query = community_name[0:amount_of_characters_to_query]

            final_query = ''
            for character in query:
                final_query = final_query + (character.upper() if fake.boolean() else character.lower())

            url = self._get_url()
            response = self.client.get(url, {
                'query': final_query
            }, **headers)

            self.assertEqual(response.status_code, status.HTTP_200_OK)

            parsed_response = json.loads(response.content)
            self.assertEqual(len(parsed_response), 1)
            retrieved_community = parsed_response[0]
            self.assertEqual(retrieved_community['name'], community_name.lower())

            community.delete()

    def test_can_search_excluded_communities_by_title(self):
        """
        should be able to search for excluded communities by their title and return 200
        """
        user = make_user()
        headers = make_authentication_headers_for_user(user)

        amount_of_joined_communities_to_search_for = 5

        for i in range(0, amount_of_joined_communities_to_search_for):
            community_title = fake.user_name().lower()
            community = mixer.blend(Community, title=community_title, type=Community.COMMUNITY_TYPE_PUBLIC)
            user.exclude_community_with_name_from_top_posts(community.name)

            amount_of_characters_to_query = random.randint(1, len(community_title))
            query = community_title[0:amount_of_characters_to_query]

            final_query = ''
            for character in query:
                final_query = final_query + (character.upper() if fake.boolean() else character.lower())

            url = self._get_url()
            response = self.client.get(url, {
                'query': final_query
            }, **headers)

            self.assertEqual(response.status_code, status.HTTP_200_OK)

            parsed_response = json.loads(response.content)
            self.assertEqual(len(parsed_response), 1)
            retrieved_community = parsed_response[0]
            self.assertEqual(retrieved_community['title'], community_title.lower())

            community.delete()

    def _get_url(self):
        return reverse('search-top-posts-excluded-communities')
class TopPostsExcludedCommunityAPITests(OpenbookAPITestCase):
    """
    TopPostsExcludedCommunityAPI
    """

    def test_can_remove_excluded_community(self):
        """
        should be able to remove a community exclusion
        """
        user = make_user()
        headers = make_authentication_headers_for_user(user)

        other_user = make_user()
        community = make_community(creator=other_user)
        user.exclude_community_with_name_from_top_posts(community.name)

        url = self._get_url(community=community)
        response = self.client.delete(url, **headers, format='multipart')

        self.assertEqual(response.status_code, status.HTTP_202_ACCEPTED)
        self.assertFalse(user.has_excluded_community_with_name_from_top_posts(community_name=community.name))

    def test_cannot_remove_exclusion_for_community_if_not_excluded(self):
        """
        should not be able to remove a community exclusion if the community is not excluded in the first place
        """
        user = make_user()
        headers = make_authentication_headers_for_user(user)

        other_user = make_user()
        community = make_community(creator=other_user)

        url = self._get_url(community=community)
        response = self.client.delete(url, **headers, format='multipart')

        self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
        self.assertFalse(user.has_excluded_community_with_name_from_top_posts(community_name=community.name))

    def _get_url(self, community):
        return reverse('top-posts-excluded-community', kwargs={
            'community_name': community.name
        })
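The search tests above all build their query the same way: take a random-length prefix of the community name or title, then randomize the casing of each character to check that the search is case-insensitive. A minimal, self-contained sketch of that construction (using `random` directly instead of the tests' `fake.boolean()`; the function name is illustrative):

```python
import random


def randomized_case_prefix(name, rng=None):
    """Build a random-length, randomly-cased prefix of `name`.

    Mirrors the query-construction loop used by the search tests above.
    """
    rng = rng or random.Random()
    # random prefix length between 1 and len(name), inclusive
    prefix = name[:rng.randint(1, len(name))]
    # flip a coin per character to decide its casing
    return ''.join(c.upper() if rng.random() < 0.5 else c.lower()
                   for c in prefix)


query = randomized_case_prefix('okuna', random.Random(42))
```

Lower-casing the result always recovers a prefix of the original name, which is exactly what the tests rely on when they assert the retrieved community matches.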
66f8d0c9595e0343750df400df87188fda90a233 | 15,293 | py | Python | graphpype/pipelines/conmat_to_graph.py | balandongiv/graphpype | 7bce7c7b78793adcea6758d8e1355bb8ccb3558d | ["BSD-3-Clause"] | 17 | 2017-12-26T18:51:43.000Z | 2022-02-25T19:42:09.000Z | graphpype/pipelines/conmat_to_graph.py | balandongiv/graphpype | 7bce7c7b78793adcea6758d8e1355bb8ccb3558d | ["BSD-3-Clause"] | 44 | 2017-12-09T19:14:08.000Z | 2021-08-17T14:42:48.000Z | graphpype/pipelines/conmat_to_graph.py | balandongiv/graphpype | 7bce7c7b78793adcea6758d8e1355bb8ccb3558d | ["BSD-3-Clause"] | 12 | 2017-05-28T20:38:27.000Z | 2022-03-16T20:57:47.000Z |
"""
Pipeline to compute graph and modularity with radatools
"""
import nipype.pipeline.engine as pe
import nipype.interfaces.utility as niu

from graphpype.interfaces.bct import KCore
from graphpype.interfaces.radatools.rada import PrepRada, NetPropRada, CommRada
from graphpype.nodes.modularity import (ComputeNetList, ComputeNodeRoles,
                                        ComputeModuleMatProp)
def create_pipeline_conmat_to_graph_density(
        main_path, pipeline_name="graph_den_pipe", con_den=1.0, multi=False,
        mod=True, plot=False, optim_seq="WS trfr 100", compute_ndi=False):
    """
    Pipeline from connectivity matrices to graph analysis

    Threshold is density based

    Inputs (inputnode):

        * conmat_file
        * coords_file
        * labels_file
    """
    # TODO plot=True is kept for sake of clarity but is now unused

    pipeline = pe.Workflow(name=pipeline_name + "_den_" +
                           str(con_den).replace(".", "_"))
    pipeline.base_dir = main_path

    inputnode = pe.Node(niu.IdentityInterface(
        fields=['conmat_file', 'coords_file', 'labels_file']),
        name='inputnode')

    if multi:
        # density-based graphs

        # net_list
        compute_net_List = pe.MapNode(interface=ComputeNetList(
        ), name='compute_net_List', iterfield=["Z_cor_mat_file"])
        compute_net_List.inputs.density = con_den

        pipeline.connect(inputnode, 'conmat_file',
                         compute_net_List, 'Z_cor_mat_file')

        # radatools
        # prepare net_list for radatools processing
        prep_rada = pe.MapNode(interface=PrepRada(),
                               name='prep_rada', iterfield=["net_List_file"])
        prep_rada.inputs.network_type = "U"

        pipeline.connect(compute_net_List, 'net_List_file',
                         prep_rada, 'net_List_file')

        if mod:
            # compute community with radatools
            community_rada = pe.MapNode(interface=CommRada(
            ), name='community_rada', iterfield=["Pajek_net_file"])
            community_rada.inputs.optim_seq = optim_seq

            pipeline.connect(prep_rada, 'Pajek_net_file',
                             community_rada, 'Pajek_net_file')

            # node roles
            node_roles = pe.MapNode(
                interface=ComputeNodeRoles(role_type="4roles"),
                name='node_roles',
                iterfield=['Pajek_net_file', 'rada_lol_file'])

            pipeline.connect(prep_rada, 'Pajek_net_file',
                             node_roles, 'Pajek_net_file')
            pipeline.connect(community_rada, 'rada_lol_file',
                             node_roles, 'rada_lol_file')
            node_roles.inputs.compute_ndi = compute_ndi

        # compute network properties with rada
        net_prop = pe.MapNode(interface=NetPropRada(
            optim_seq="A"), name='net_prop', iterfield=["Pajek_net_file"])

        pipeline.connect(prep_rada, 'Pajek_net_file',
                         net_prop, 'Pajek_net_file')

    else:
        # density-based graphs

        # net_list
        compute_net_List = pe.Node(
            interface=ComputeNetList(), name='compute_net_List')
        compute_net_List.inputs.density = con_den

        pipeline.connect(inputnode, 'conmat_file',
                         compute_net_List, 'Z_cor_mat_file')

        # radatools
        # prepare net_list for radatools processing
        prep_rada = pe.Node(interface=PrepRada(), name='prep_rada')
        prep_rada.inputs.network_type = "U"

        pipeline.connect(compute_net_List, 'net_List_file',
                         prep_rada, 'net_List_file')

        if mod:
            # compute community with radatools
            community_rada = pe.Node(
                interface=CommRada(), name='community_rada')
            community_rada.inputs.optim_seq = optim_seq

            pipeline.connect(prep_rada, 'Pajek_net_file',
                             community_rada, 'Pajek_net_file')

            # node roles
            node_roles = pe.Node(interface=ComputeNodeRoles(
                role_type="4roles"), name='node_roles')

            pipeline.connect(prep_rada, 'Pajek_net_file',
                             node_roles, 'Pajek_net_file')
            pipeline.connect(community_rada, 'rada_lol_file',
                             node_roles, 'rada_lol_file')
            node_roles.inputs.compute_ndi = compute_ndi

        # compute network properties with rada
        net_prop = pe.Node(interface=NetPropRada(
            optim_seq="A"), name='net_prop')

        pipeline.connect(prep_rada, 'Pajek_net_file',
                         net_prop, 'Pajek_net_file')

    return pipeline
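The `ComputeNetList` node above keeps only the strongest fraction `con_den` of connections before handing the network to radatools. The thresholding idea can be sketched in plain Python (illustrative only — the real node also writes radatools net-list files; the function name is an assumption):

```python
def threshold_by_density(mat, density):
    """Zero every off-diagonal weight except the strongest `density` fraction.

    Pure-Python sketch of density-based thresholding on a square
    connectivity matrix given as a list of lists.
    """
    n = len(mat)
    # collect absolute off-diagonal weights, strongest first
    weights = sorted((abs(mat[i][j]) for i in range(n) for j in range(n)
                      if i != j), reverse=True)
    n_keep = max(1, round(density * len(weights)))
    cutoff = weights[n_keep - 1]
    return [[mat[i][j] if i != j and abs(mat[i][j]) >= cutoff else 0.0
             for j in range(n)] for i in range(n)]


demo = [[0.0, 0.9, 0.1],
        [0.9, 0.0, 0.5],
        [0.1, 0.5, 0.0]]
# keep the top half of the off-diagonal weights (0.9s and 0.5s survive)
sparse = threshold_by_density(demo, density=0.5)
```

Because the cutoff is taken from the sorted weights, ties at the cutoff value are all kept, which preserves the symmetry of an undirected matrix.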
def create_pipeline_conmat_to_graph_threshold(
        main_path, pipeline_name="graph_thr_pipe", con_thr=1.0, multi=False,
        mod=True, plot=True, optim_seq="WS trfr 100", compute_ndi=False):
    """
    Pipeline from connectivity matrices to graph analysis

    Threshold is value based (con_thr)

    Inputs (inputnode):

        * conmat_file
        * coords_file
        * labels_file
    """
    # TODO Warning, need to be checked...
    # TODO Warning, should be merged with previous function
    # create_pipeline_conmat_to_graph_density
    # TODO plot=True is kept for sake of clarity but is now unused

    pipeline = pe.Workflow(name=pipeline_name)
    pipeline.base_dir = main_path

    inputnode = pe.Node(niu.IdentityInterface(
        fields=['conmat_file', 'coords_file', 'labels_file']),
        name='inputnode')

    if not multi:
        # threshold-based graphs

        # net_list
        compute_net_List = pe.Node(
            interface=ComputeNetList(), name='compute_net_List')
        compute_net_List.inputs.threshold = con_thr

        pipeline.connect(inputnode, 'conmat_file',
                         compute_net_List, 'Z_cor_mat_file')

        # radatools
        # prepare net_list for radatools processing
        # (iterfield only applies to MapNode, so it is not passed here)
        prep_rada = pe.Node(interface=PrepRada(), name='prep_rada')
        prep_rada.inputs.network_type = "U"

        pipeline.connect(compute_net_List, 'net_List_file',
                         prep_rada, 'net_List_file')

        if mod:
            # compute community with radatools
            community_rada = pe.Node(interface=CommRada(),
                                     name='community_rada')
            community_rada.inputs.optim_seq = optim_seq

            pipeline.connect(prep_rada, 'Pajek_net_file',
                             community_rada, 'Pajek_net_file')

            # node roles
            node_roles = pe.Node(
                interface=ComputeNodeRoles(role_type="4roles"),
                name='node_roles')

            pipeline.connect(prep_rada, 'Pajek_net_file',
                             node_roles, 'Pajek_net_file')
            pipeline.connect(community_rada, 'rada_lol_file',
                             node_roles, 'rada_lol_file')
            node_roles.inputs.compute_ndi = compute_ndi

        # compute network properties with rada
        net_prop = pe.Node(interface=NetPropRada(
            optim_seq="A"), name='net_prop')

        pipeline.connect(prep_rada, 'Pajek_net_file',
                         net_prop, 'Pajek_net_file')

    else:
        # threshold-based graphs

        # net_list
        compute_net_List = pe.MapNode(interface=ComputeNetList(
        ), name='compute_net_List', iterfield=["Z_cor_mat_file"])
        compute_net_List.inputs.threshold = con_thr

        pipeline.connect(inputnode, 'conmat_file',
                         compute_net_List, 'Z_cor_mat_file')

        # radatools
        # prepare net_list for radatools processing
        prep_rada = pe.MapNode(interface=PrepRada(),
                               name='prep_rada', iterfield=["net_List_file"])
        prep_rada.inputs.network_type = "U"

        pipeline.connect(compute_net_List, 'net_List_file',
                         prep_rada, 'net_List_file')

        if mod:
            # compute community with radatools
            community_rada = pe.MapNode(interface=CommRada(
            ), name='community_rada', iterfield=["Pajek_net_file"])
            community_rada.inputs.optim_seq = optim_seq

            pipeline.connect(prep_rada, 'Pajek_net_file',
                             community_rada, 'Pajek_net_file')

            # node roles
            node_roles = pe.MapNode(
                interface=ComputeNodeRoles(role_type="4roles"),
                name='node_roles',
                iterfield=['Pajek_net_file', 'rada_lol_file'])

            pipeline.connect(prep_rada, 'Pajek_net_file',
                             node_roles, 'Pajek_net_file')
            pipeline.connect(community_rada, 'rada_lol_file',
                             node_roles, 'rada_lol_file')
            node_roles.inputs.compute_ndi = compute_ndi

        # compute network properties with rada
        net_prop = pe.MapNode(interface=NetPropRada(
            optim_seq="A"), name='net_prop', iterfield=["Pajek_net_file"])

        pipeline.connect(prep_rada, 'Pajek_net_file',
                         net_prop, 'Pajek_net_file')

    return pipeline
def create_pipeline_net_list_to_graph(
        main_path, pipeline_name="graph_net_pipe", multi=False, mod=True,
        plot=False, optim_seq="WS trfr 100", compute_ndi=False):
    """
    Pipeline from net_List (txt file) to graph analysis

    Inputs (inputnode):

        * net_List_file
        * coords_file
        * labels_file

    Could be used in the previous functions
    (create_pipeline_conmat_to_graph_density and
    create_pipeline_conmat_to_graph_threshold)
    """
    # TODO plot=True is kept for sake of clarity but is now unused

    pipeline = pe.Workflow(name=pipeline_name)
    pipeline.base_dir = main_path

    inputnode = pe.Node(niu.IdentityInterface(
        fields=['net_List_file', 'coords_file', 'labels_file']),
        name='inputnode')

    if not multi:
        # prepare net_list for radatools processing
        prep_rada = pe.Node(interface=PrepRada(), name='prep_rada')
        prep_rada.inputs.network_type = "U"

        pipeline.connect(inputnode, 'net_List_file',
                         prep_rada, 'net_List_file')

        if mod:
            # compute community with radatools
            community_rada = pe.Node(
                interface=CommRada(), name='community_rada')
            community_rada.inputs.optim_seq = optim_seq

            pipeline.connect(prep_rada, 'Pajek_net_file',
                             community_rada, 'Pajek_net_file')

            # node roles
            node_roles = pe.Node(interface=ComputeNodeRoles(
                role_type="4roles"), name='node_roles')

            pipeline.connect(prep_rada, 'Pajek_net_file',
                             node_roles, 'Pajek_net_file')
            pipeline.connect(community_rada, 'rada_lol_file',
                             node_roles, 'rada_lol_file')
            node_roles.inputs.compute_ndi = compute_ndi

        # compute network properties with rada
        net_prop = pe.Node(interface=NetPropRada(
            optim_seq="A"), name='net_prop')

        pipeline.connect(prep_rada, 'Pajek_net_file',
                         net_prop, 'Pajek_net_file')

    else:
        assert False, "Error, never tested"

        # radatools
        # prepare net_list for radatools processing
        prep_rada = pe.MapNode(interface=PrepRada(),
                               name='prep_rada', iterfield=["net_List_file"])
        prep_rada.inputs.network_type = "U"

        pipeline.connect(inputnode, 'net_List_file',
                         prep_rada, 'net_List_file')

        if mod:
            # compute community with radatools
            community_rada = pe.MapNode(interface=CommRada(
            ), name='community_rada', iterfield=["Pajek_net_file"])
            community_rada.inputs.optim_seq = optim_seq

            pipeline.connect(prep_rada, 'Pajek_net_file',
                             community_rada, 'Pajek_net_file')

            # node roles
            node_roles = pe.MapNode(interface=ComputeNodeRoles(
                role_type="4roles"), name='node_roles',
                iterfield=['Pajek_net_file', 'rada_lol_file'])

            pipeline.connect(prep_rada, 'Pajek_net_file',
                             node_roles, 'Pajek_net_file')
            pipeline.connect(community_rada, 'rada_lol_file',
                             node_roles, 'rada_lol_file')
            node_roles.inputs.compute_ndi = compute_ndi

        # compute network properties with rada
        net_prop = pe.MapNode(interface=NetPropRada(
            optim_seq="A"), name='net_prop', iterfield=["Pajek_net_file"])

        pipeline.connect(prep_rada, 'Pajek_net_file',
                         net_prop, 'Pajek_net_file')

    return pipeline
# create_pipeline_bct_graph
def create_pipeline_bct_graph(
        main_path, pipeline_name="graph_bct_pipe", con_den=1.0):
    """
    Description:

    Pipeline for computing module based graph properties

    Threshold is density based

    Inputs (inputnode):

        * conmat_file
    """
    pipeline = pe.Workflow(name=pipeline_name)
    pipeline.base_dir = main_path

    # input node
    inputnode = pe.Node(niu.IdentityInterface(
        fields=['conmat_file']),
        name='inputnode')

    # compute binary version
    bin_mat = pe.Node(interface=ComputeNetList(export_np_bin=True,
                                               density=con_den),
                      name="bin_mat")

    pipeline.connect(inputnode, 'conmat_file',
                     bin_mat, 'Z_cor_mat_file')

    # compute K core
    k_core = pe.Node(
        interface=KCore(),
        name="k_core")
    k_core.inputs.is_directed = False

    pipeline.connect(bin_mat, 'np_bin_mat_file',
                     k_core, 'np_mat_file')

    return pipeline
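The `KCore` interface above wraps a BCT-style k-core computation on the binarized matrix. The underlying idea — repeatedly peeling off nodes whose degree falls below `k` — can be sketched independently of bct (the adjacency-dict representation here is illustrative, not the interface's actual input format):

```python
def k_core(adjacency, k):
    """Return the node set of the k-core of an undirected graph.

    `adjacency` maps each node to a list of its neighbours.
    Nodes of degree < k are removed until the graph stabilises.
    """
    nodes = set(adjacency)
    changed = True
    while changed:
        changed = False
        for node in list(nodes):
            # degree counted only among the nodes still present
            degree = sum(1 for nb in adjacency[node] if nb in nodes)
            if degree < k:
                nodes.discard(node)
                changed = True
    return nodes


# triangle 0-1-2 with a pendant node 3 attached to node 2
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
core2 = k_core(graph, 2)
```

Removing the pendant node is enough here: the remaining triangle is already a 2-core, so the loop terminates after one extra pass.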
# create_pipeline_graph_module_properties
def create_pipeline_graph_module_properties(
        main_path, pipeline_name="graph_mod_pipe", con_den=1.0, multi=False,
        plot=True, export_excel=False):
    """
    Description:

    Pipeline for computing module based graph properties

    Threshold is density based

    Inputs (inputnode):

        * group_conmat_file
        * rada_lol_file
        * Pajek_net_file
    """
    # TODO plot=True is kept for sake of clarity but is now unused

    pipeline = pe.Workflow(name=pipeline_name)
    pipeline.base_dir = main_path

    # input node
    inputnode = pe.Node(niu.IdentityInterface(
        fields=['group_conmat_file', 'rada_lol_file', 'Pajek_net_file']),
        name='inputnode')

    # module to graph properties
    mod_graph = pe.Node(
        interface=ComputeModuleMatProp(),
        name="mod_graph")
    mod_graph.inputs.export_excel = export_excel

    pipeline.connect(inputnode, 'Pajek_net_file',
                     mod_graph, 'Pajek_net_file')
    pipeline.connect(inputnode, 'rada_lol_file',
                     mod_graph, 'rada_lol_file')
    pipeline.connect(inputnode, 'group_conmat_file',
                     mod_graph, 'group_conmat_file')

    return pipeline
66f927e5f67bd76fac1c75af40c97b779a2a456d | 4,804 | py | Python | tests/integration/billing_request_flows_integration_test.py | gdvalderrama/gocardless-pro-python | 0ff8001f5bba11673c4fa0f30d26eca61a1219ba | ["MIT"] | null | null | null | tests/integration/billing_request_flows_integration_test.py | gdvalderrama/gocardless-pro-python | 0ff8001f5bba11673c4fa0f30d26eca61a1219ba | ["MIT"] | null | null | null | tests/integration/billing_request_flows_integration_test.py | gdvalderrama/gocardless-pro-python | 0ff8001f5bba11673c4fa0f30d26eca61a1219ba | ["MIT"] | null | null | null |
# WARNING: Do not edit by hand, this file was generated by Crank:
#
#   https://github.com/gocardless/crank
#
import json

import requests
import responses

from nose.tools import (
    assert_equal,
    assert_is_instance,
    assert_is_none,
    assert_is_not_none,
    assert_not_equal,
    assert_raises
)

from gocardless_pro.errors import MalformedResponseError
from gocardless_pro import resources
from gocardless_pro import list_response

from .. import helpers


@responses.activate
def test_billing_request_flows_create():
    fixture = helpers.load_fixture('billing_request_flows')['create']
    helpers.stub_response(fixture)
    response = helpers.client.billing_request_flows.create(*fixture['url_params'])
    body = fixture['body']['billing_request_flows']

    assert_is_instance(response, resources.BillingRequestFlow)
    assert_is_not_none(responses.calls[-1].request.headers.get('Idempotency-Key'))
    assert_equal(response.authorisation_url, body.get('authorisation_url'))
    assert_equal(response.auto_fulfil, body.get('auto_fulfil'))
    assert_equal(response.created_at, body.get('created_at'))
    assert_equal(response.exit_uri, body.get('exit_uri'))
    assert_equal(response.expires_at, body.get('expires_at'))
    assert_equal(response.id, body.get('id'))
    assert_equal(response.lock_bank_account, body.get('lock_bank_account'))
    assert_equal(response.lock_customer_details, body.get('lock_customer_details'))
    assert_equal(response.redirect_uri, body.get('redirect_uri'))
    assert_equal(response.session_token, body.get('session_token'))
    assert_equal(response.links.billing_request,
                 body.get('links')['billing_request'])


@responses.activate
def test_timeout_billing_request_flows_create_retries():
    fixture = helpers.load_fixture('billing_request_flows')['create']
    with helpers.stub_timeout_then_response(fixture) as rsps:
        response = helpers.client.billing_request_flows.create(*fixture['url_params'])
        assert_equal(2, len(rsps.calls))
        assert_equal(rsps.calls[0].request.headers.get('Idempotency-Key'),
                     rsps.calls[1].request.headers.get('Idempotency-Key'))
    body = fixture['body']['billing_request_flows']

    assert_is_instance(response, resources.BillingRequestFlow)


def test_502_billing_request_flows_create_retries():
    fixture = helpers.load_fixture('billing_request_flows')['create']
    with helpers.stub_502_then_response(fixture) as rsps:
        response = helpers.client.billing_request_flows.create(*fixture['url_params'])
        assert_equal(2, len(rsps.calls))
        assert_equal(rsps.calls[0].request.headers.get('Idempotency-Key'),
                     rsps.calls[1].request.headers.get('Idempotency-Key'))
    body = fixture['body']['billing_request_flows']

    assert_is_instance(response, resources.BillingRequestFlow)


@responses.activate
def test_billing_request_flows_initialise():
    fixture = helpers.load_fixture('billing_request_flows')['initialise']
    helpers.stub_response(fixture)
    response = helpers.client.billing_request_flows.initialise(*fixture['url_params'])
    body = fixture['body']['billing_request_flows']

    assert_is_instance(response, resources.BillingRequestFlow)
    assert_is_not_none(responses.calls[-1].request.headers.get('Idempotency-Key'))
    assert_equal(response.authorisation_url, body.get('authorisation_url'))
    assert_equal(response.auto_fulfil, body.get('auto_fulfil'))
    assert_equal(response.created_at, body.get('created_at'))
    assert_equal(response.exit_uri, body.get('exit_uri'))
    assert_equal(response.expires_at, body.get('expires_at'))
    assert_equal(response.id, body.get('id'))
    assert_equal(response.lock_bank_account, body.get('lock_bank_account'))
    assert_equal(response.lock_customer_details, body.get('lock_customer_details'))
    assert_equal(response.redirect_uri, body.get('redirect_uri'))
    assert_equal(response.session_token, body.get('session_token'))
    assert_equal(response.links.billing_request,
                 body.get('links')['billing_request'])


def test_timeout_billing_request_flows_initialise_doesnt_retry():
    fixture = helpers.load_fixture('billing_request_flows')['initialise']
    with helpers.stub_timeout(fixture) as rsps:
        with assert_raises(requests.ConnectTimeout):
            response = helpers.client.billing_request_flows.initialise(*fixture['url_params'])
        assert_equal(1, len(rsps.calls))


def test_502_billing_request_flows_initialise_doesnt_retry():
    fixture = helpers.load_fixture('billing_request_flows')['initialise']
    with helpers.stub_502(fixture) as rsps:
        with assert_raises(MalformedResponseError):
            response = helpers.client.billing_request_flows.initialise(*fixture['url_params'])
        assert_equal(1, len(rsps.calls))
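The retry tests above hinge on a single invariant: the timed-out first attempt and its retry carry the same `Idempotency-Key`, so the server can de-duplicate the create. A minimal, library-free sketch of that client-side pattern (all names here are illustrative, not the gocardless_pro internals):

```python
import uuid


def post_with_retry(transport, payload, max_attempts=2):
    """Retry a create call while reusing one Idempotency-Key.

    The key is generated once, before the first attempt, so a
    timed-out request and its retry are de-duplicated server-side.
    """
    key = str(uuid.uuid4())
    last_error = None
    for _ in range(max_attempts):
        try:
            return transport(payload, headers={'Idempotency-Key': key})
        except TimeoutError as exc:
            last_error = exc
    raise last_error


calls = []


def flaky_transport(payload, headers):
    # record the key of every attempt, failing only the first one
    calls.append(headers['Idempotency-Key'])
    if len(calls) == 1:
        raise TimeoutError('first attempt times out')
    return {'ok': True}


result = post_with_retry(flaky_transport, {'amount': 100})
```

Both recorded attempts share one key, which is exactly what the `assert_equal` on the two `Idempotency-Key` headers checks in the tests above.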
7e2dcf690ce7b3810b83f25d056395a5953dbc00 | 39 | py | Python | MAPPO-Lagrangian/mappo_lagrangian/algorithms/r_mappo/__init__.py | Anonymous-ICML2022/Multi-Agent-Constrained-Policy-Optimisation | 0bfe52024e4d07600a39d3228de36fd75a3cd65d | ["MIT"] | 13 | 2021-10-20T20:02:27.000Z | 2022-03-28T05:58:01.000Z | MAPPO-Lagrangian/mappo_lagrangian/algorithms/r_mappo/__init__.py | Anonymous-ICML2022/Multi-Agent-Constrained-Policy-Optimisation | 0bfe52024e4d07600a39d3228de36fd75a3cd65d | ["MIT"] | null | null | null | MAPPO-Lagrangian/mappo_lagrangian/algorithms/r_mappo/__init__.py | Anonymous-ICML2022/Multi-Agent-Constrained-Policy-Optimisation | 0bfe52024e4d07600a39d3228de36fd75a3cd65d | ["MIT"] | 3 | 2022-02-14T04:04:19.000Z | 2022-03-15T09:22:34.000Z |
def cost_trpo_macppo():
    return None
7e80e3453a03c5280647aeb51a9dd5eac80fbc8a | 64,497 | py | Python | sdk/python/pulumi_aws/servicecatalog/provisioned_product.py | chivandikwa/pulumi-aws | 19c08bf9dcb90544450ffa4eec7bf6751058fde2 | ["ECL-2.0", "Apache-2.0"] | 1 | 2021-11-10T16:33:40.000Z | 2021-11-10T16:33:40.000Z | sdk/python/pulumi_aws/servicecatalog/provisioned_product.py | chivandikwa/pulumi-aws | 19c08bf9dcb90544450ffa4eec7bf6751058fde2 | ["ECL-2.0", "Apache-2.0"] | null | null | null | sdk/python/pulumi_aws/servicecatalog/provisioned_product.py | chivandikwa/pulumi-aws | 19c08bf9dcb90544450ffa4eec7bf6751058fde2 | ["ECL-2.0", "Apache-2.0"] | null | null | null |
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['ProvisionedProductArgs', 'ProvisionedProduct']
@pulumi.input_type
class ProvisionedProductArgs:
def __init__(__self__, *,
accept_language: Optional[pulumi.Input[str]] = None,
ignore_errors: Optional[pulumi.Input[bool]] = None,
name: Optional[pulumi.Input[str]] = None,
notification_arns: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
path_id: Optional[pulumi.Input[str]] = None,
path_name: Optional[pulumi.Input[str]] = None,
product_id: Optional[pulumi.Input[str]] = None,
product_name: Optional[pulumi.Input[str]] = None,
provisioning_artifact_id: Optional[pulumi.Input[str]] = None,
provisioning_artifact_name: Optional[pulumi.Input[str]] = None,
provisioning_parameters: Optional[pulumi.Input[Sequence[pulumi.Input['ProvisionedProductProvisioningParameterArgs']]]] = None,
retain_physical_resources: Optional[pulumi.Input[bool]] = None,
stack_set_provisioning_preferences: Optional[pulumi.Input['ProvisionedProductStackSetProvisioningPreferencesArgs']] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None):
"""
The set of arguments for constructing a ProvisionedProduct resource.
:param pulumi.Input[str] accept_language: Language code. Valid values: `en` (English), `jp` (Japanese), `zh` (Chinese). Default value is `en`.
:param pulumi.Input[bool] ignore_errors: _Only applies to deleting._ If set to `true`, AWS Service Catalog stops managing the specified provisioned product even if it cannot delete the underlying resources. The default value is `false`.
:param pulumi.Input[str] name: User-friendly name of the provisioned product.
:param pulumi.Input[Sequence[pulumi.Input[str]]] notification_arns: Passed to CloudFormation. The SNS topic ARNs to which to publish stack-related events.
:param pulumi.Input[str] path_id: Path identifier of the product. This value is optional if the product has a default path, and required if the product has more than one path. To list the paths for a product, use `servicecatalog.get_launch_paths`. When required, you must provide `path_id` or `path_name`, but not both.
:param pulumi.Input[str] path_name: Name of the path. You must provide `path_id` or `path_name`, but not both.
:param pulumi.Input[str] product_id: Product identifier. For example, `prod-abcdzk7xy33qa`. You must provide `product_id` or `product_name`, but not both.
:param pulumi.Input[str] product_name: Name of the product. You must provide `product_id` or `product_name`, but not both.
:param pulumi.Input[str] provisioning_artifact_id: Identifier of the provisioning artifact. For example, `pa-4abcdjnxjj6ne`. You must provide the `provisioning_artifact_id` or `provisioning_artifact_name`, but not both.
:param pulumi.Input[str] provisioning_artifact_name: Name of the provisioning artifact. You must provide the `provisioning_artifact_id` or `provisioning_artifact_name`, but not both.
:param pulumi.Input[Sequence[pulumi.Input['ProvisionedProductProvisioningParameterArgs']]] provisioning_parameters: Configuration block with parameters specified by the administrator that are required for provisioning the product. See details below.
:param pulumi.Input[bool] retain_physical_resources: _Only applies to deleting._ Whether to delete the Service Catalog provisioned product but leave the CloudFormation stack, stack set, or the underlying resources of the deleted provisioned product. The default value is `false`.
:param pulumi.Input['ProvisionedProductStackSetProvisioningPreferencesArgs'] stack_set_provisioning_preferences: Configuration block with information about the provisioning preferences for a stack set. See details below.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: Tags to apply to the provisioned product. If the provider's `default_tags` configuration block is present, tags with matching keys will overwrite those defined at the provider level.
"""
if accept_language is not None:
pulumi.set(__self__, "accept_language", accept_language)
if ignore_errors is not None:
pulumi.set(__self__, "ignore_errors", ignore_errors)
if name is not None:
pulumi.set(__self__, "name", name)
if notification_arns is not None:
pulumi.set(__self__, "notification_arns", notification_arns)
if path_id is not None:
pulumi.set(__self__, "path_id", path_id)
if path_name is not None:
pulumi.set(__self__, "path_name", path_name)
if product_id is not None:
pulumi.set(__self__, "product_id", product_id)
if product_name is not None:
pulumi.set(__self__, "product_name", product_name)
if provisioning_artifact_id is not None:
pulumi.set(__self__, "provisioning_artifact_id", provisioning_artifact_id)
if provisioning_artifact_name is not None:
pulumi.set(__self__, "provisioning_artifact_name", provisioning_artifact_name)
if provisioning_parameters is not None:
pulumi.set(__self__, "provisioning_parameters", provisioning_parameters)
if retain_physical_resources is not None:
pulumi.set(__self__, "retain_physical_resources", retain_physical_resources)
if stack_set_provisioning_preferences is not None:
pulumi.set(__self__, "stack_set_provisioning_preferences", stack_set_provisioning_preferences)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter(name="acceptLanguage")
def accept_language(self) -> Optional[pulumi.Input[str]]:
"""
Language code. Valid values: `en` (English), `jp` (Japanese), `zh` (Chinese). Default value is `en`.
"""
return pulumi.get(self, "accept_language")
@accept_language.setter
def accept_language(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "accept_language", value)
@property
@pulumi.getter(name="ignoreErrors")
def ignore_errors(self) -> Optional[pulumi.Input[bool]]:
"""
_Only applies to deleting._ If set to `true`, AWS Service Catalog stops managing the specified provisioned product even if it cannot delete the underlying resources. The default value is `false`.
"""
return pulumi.get(self, "ignore_errors")
@ignore_errors.setter
def ignore_errors(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "ignore_errors", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
User-friendly name of the provisioned product.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="notificationArns")
def notification_arns(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Passed to CloudFormation. The SNS topic ARNs to which to publish stack-related events.
"""
return pulumi.get(self, "notification_arns")
@notification_arns.setter
def notification_arns(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "notification_arns", value)
@property
@pulumi.getter(name="pathId")
def path_id(self) -> Optional[pulumi.Input[str]]:
"""
Path identifier of the product. This value is optional if the product has a default path, and required if the product has more than one path. To list the paths for a product, use `servicecatalog.get_launch_paths`. When required, you must provide `path_id` or `path_name`, but not both.
"""
return pulumi.get(self, "path_id")
@path_id.setter
def path_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path_id", value)
@property
@pulumi.getter(name="pathName")
def path_name(self) -> Optional[pulumi.Input[str]]:
"""
Name of the path. You must provide `path_id` or `path_name`, but not both.
"""
return pulumi.get(self, "path_name")
@path_name.setter
def path_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path_name", value)
@property
@pulumi.getter(name="productId")
def product_id(self) -> Optional[pulumi.Input[str]]:
"""
Product identifier. For example, `prod-abcdzk7xy33qa`. You must provide `product_id` or `product_name`, but not both.
"""
return pulumi.get(self, "product_id")
@product_id.setter
def product_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "product_id", value)
@property
@pulumi.getter(name="productName")
def product_name(self) -> Optional[pulumi.Input[str]]:
"""
Name of the product. You must provide `product_id` or `product_name`, but not both.
"""
return pulumi.get(self, "product_name")
@product_name.setter
def product_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "product_name", value)
@property
@pulumi.getter(name="provisioningArtifactId")
def provisioning_artifact_id(self) -> Optional[pulumi.Input[str]]:
"""
Identifier of the provisioning artifact. For example, `pa-4abcdjnxjj6ne`. You must provide the `provisioning_artifact_id` or `provisioning_artifact_name`, but not both.
"""
return pulumi.get(self, "provisioning_artifact_id")
@provisioning_artifact_id.setter
def provisioning_artifact_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "provisioning_artifact_id", value)
@property
@pulumi.getter(name="provisioningArtifactName")
def provisioning_artifact_name(self) -> Optional[pulumi.Input[str]]:
"""
Name of the provisioning artifact. You must provide the `provisioning_artifact_id` or `provisioning_artifact_name`, but not both.
"""
return pulumi.get(self, "provisioning_artifact_name")
@provisioning_artifact_name.setter
def provisioning_artifact_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "provisioning_artifact_name", value)
@property
@pulumi.getter(name="provisioningParameters")
def provisioning_parameters(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ProvisionedProductProvisioningParameterArgs']]]]:
"""
Configuration block with parameters specified by the administrator that are required for provisioning the product. See details below.
"""
return pulumi.get(self, "provisioning_parameters")
@provisioning_parameters.setter
def provisioning_parameters(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ProvisionedProductProvisioningParameterArgs']]]]):
pulumi.set(self, "provisioning_parameters", value)
@property
@pulumi.getter(name="retainPhysicalResources")
def retain_physical_resources(self) -> Optional[pulumi.Input[bool]]:
"""
_Only applies to deleting._ Whether to delete the Service Catalog provisioned product but leave the CloudFormation stack, stack set, or the underlying resources of the deleted provisioned product. The default value is `false`.
"""
return pulumi.get(self, "retain_physical_resources")
@retain_physical_resources.setter
def retain_physical_resources(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "retain_physical_resources", value)
@property
@pulumi.getter(name="stackSetProvisioningPreferences")
def stack_set_provisioning_preferences(self) -> Optional[pulumi.Input['ProvisionedProductStackSetProvisioningPreferencesArgs']]:
"""
Configuration block with information about the provisioning preferences for a stack set. See details below.
"""
return pulumi.get(self, "stack_set_provisioning_preferences")
@stack_set_provisioning_preferences.setter
def stack_set_provisioning_preferences(self, value: Optional[pulumi.Input['ProvisionedProductStackSetProvisioningPreferencesArgs']]):
pulumi.set(self, "stack_set_provisioning_preferences", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
Tags to apply to the provisioned product. If the provider's `default_tags` configuration block is present, tags with matching keys will overwrite those defined at the provider level.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@pulumi.input_type
class _ProvisionedProductState:
def __init__(__self__, *,
accept_language: Optional[pulumi.Input[str]] = None,
arn: Optional[pulumi.Input[str]] = None,
cloudwatch_dashboard_names: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
created_time: Optional[pulumi.Input[str]] = None,
ignore_errors: Optional[pulumi.Input[bool]] = None,
last_provisioning_record_id: Optional[pulumi.Input[str]] = None,
last_record_id: Optional[pulumi.Input[str]] = None,
last_successful_provisioning_record_id: Optional[pulumi.Input[str]] = None,
launch_role_arn: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
notification_arns: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
path_id: Optional[pulumi.Input[str]] = None,
path_name: Optional[pulumi.Input[str]] = None,
product_id: Optional[pulumi.Input[str]] = None,
product_name: Optional[pulumi.Input[str]] = None,
provisioning_artifact_id: Optional[pulumi.Input[str]] = None,
provisioning_artifact_name: Optional[pulumi.Input[str]] = None,
provisioning_parameters: Optional[pulumi.Input[Sequence[pulumi.Input['ProvisionedProductProvisioningParameterArgs']]]] = None,
retain_physical_resources: Optional[pulumi.Input[bool]] = None,
stack_set_provisioning_preferences: Optional[pulumi.Input['ProvisionedProductStackSetProvisioningPreferencesArgs']] = None,
status: Optional[pulumi.Input[str]] = None,
status_message: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
tags_all: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
type: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering ProvisionedProduct resources.
:param pulumi.Input[str] accept_language: Language code. Valid values: `en` (English), `jp` (Japanese), `zh` (Chinese). Default value is `en`.
:param pulumi.Input[str] arn: ARN of the provisioned product.
:param pulumi.Input[Sequence[pulumi.Input[str]]] cloudwatch_dashboard_names: Set of CloudWatch dashboards that were created when provisioning the product.
:param pulumi.Input[str] created_time: Time when the provisioned product was created.
:param pulumi.Input[bool] ignore_errors: _Only applies to deleting._ If set to `true`, AWS Service Catalog stops managing the specified provisioned product even if it cannot delete the underlying resources. The default value is `false`.
:param pulumi.Input[str] last_provisioning_record_id: Record identifier of the last request performed on this provisioned product of the following types: `ProvisionProduct`, `UpdateProvisionedProduct`, `ExecuteProvisionedProductPlan`, `TerminateProvisionedProduct`.
:param pulumi.Input[str] last_record_id: Record identifier of the last request performed on this provisioned product.
:param pulumi.Input[str] last_successful_provisioning_record_id: Record identifier of the last successful request performed on this provisioned product of the following types: `ProvisionProduct`, `UpdateProvisionedProduct`, `ExecuteProvisionedProductPlan`, `TerminateProvisionedProduct`.
:param pulumi.Input[str] launch_role_arn: ARN of the launch role associated with the provisioned product.
:param pulumi.Input[str] name: User-friendly name of the provisioned product.
:param pulumi.Input[Sequence[pulumi.Input[str]]] notification_arns: Passed to CloudFormation. The SNS topic ARNs to which to publish stack-related events.
:param pulumi.Input[str] path_id: Path identifier of the product. This value is optional if the product has a default path, and required if the product has more than one path. To list the paths for a product, use `servicecatalog.get_launch_paths`. When required, you must provide `path_id` or `path_name`, but not both.
:param pulumi.Input[str] path_name: Name of the path. You must provide `path_id` or `path_name`, but not both.
:param pulumi.Input[str] product_id: Product identifier. For example, `prod-abcdzk7xy33qa`. You must provide `product_id` or `product_name`, but not both.
:param pulumi.Input[str] product_name: Name of the product. You must provide `product_id` or `product_name`, but not both.
:param pulumi.Input[str] provisioning_artifact_id: Identifier of the provisioning artifact. For example, `pa-4abcdjnxjj6ne`. You must provide the `provisioning_artifact_id` or `provisioning_artifact_name`, but not both.
:param pulumi.Input[str] provisioning_artifact_name: Name of the provisioning artifact. You must provide the `provisioning_artifact_id` or `provisioning_artifact_name`, but not both.
:param pulumi.Input[Sequence[pulumi.Input['ProvisionedProductProvisioningParameterArgs']]] provisioning_parameters: Configuration block with parameters specified by the administrator that are required for provisioning the product. See details below.
:param pulumi.Input[bool] retain_physical_resources: _Only applies to deleting._ Whether to delete the Service Catalog provisioned product but leave the CloudFormation stack, stack set, or the underlying resources of the deleted provisioned product. The default value is `false`.
:param pulumi.Input['ProvisionedProductStackSetProvisioningPreferencesArgs'] stack_set_provisioning_preferences: Configuration block with information about the provisioning preferences for a stack set. See details below.
:param pulumi.Input[str] status: Current status of the provisioned product. Valid values are `AVAILABLE`, `UNDER_CHANGE`, `TAINTED`, `ERROR`, and `PLAN_IN_PROGRESS`.
:param pulumi.Input[str] status_message: Current status message of the provisioned product.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: Tags to apply to the provisioned product. If the provider's `default_tags` configuration block is present, tags with matching keys will overwrite those defined at the provider level.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags_all: Map of tags assigned to the resource, including those inherited from the provider `default_tags` configuration block.
:param pulumi.Input[str] type: Type of provisioned product. Valid values are `CFN_STACK` and `CFN_STACKSET`.
"""
if accept_language is not None:
pulumi.set(__self__, "accept_language", accept_language)
if arn is not None:
pulumi.set(__self__, "arn", arn)
if cloudwatch_dashboard_names is not None:
pulumi.set(__self__, "cloudwatch_dashboard_names", cloudwatch_dashboard_names)
if created_time is not None:
pulumi.set(__self__, "created_time", created_time)
if ignore_errors is not None:
pulumi.set(__self__, "ignore_errors", ignore_errors)
if last_provisioning_record_id is not None:
pulumi.set(__self__, "last_provisioning_record_id", last_provisioning_record_id)
if last_record_id is not None:
pulumi.set(__self__, "last_record_id", last_record_id)
if last_successful_provisioning_record_id is not None:
pulumi.set(__self__, "last_successful_provisioning_record_id", last_successful_provisioning_record_id)
if launch_role_arn is not None:
pulumi.set(__self__, "launch_role_arn", launch_role_arn)
if name is not None:
pulumi.set(__self__, "name", name)
if notification_arns is not None:
pulumi.set(__self__, "notification_arns", notification_arns)
if path_id is not None:
pulumi.set(__self__, "path_id", path_id)
if path_name is not None:
pulumi.set(__self__, "path_name", path_name)
if product_id is not None:
pulumi.set(__self__, "product_id", product_id)
if product_name is not None:
pulumi.set(__self__, "product_name", product_name)
if provisioning_artifact_id is not None:
pulumi.set(__self__, "provisioning_artifact_id", provisioning_artifact_id)
if provisioning_artifact_name is not None:
pulumi.set(__self__, "provisioning_artifact_name", provisioning_artifact_name)
if provisioning_parameters is not None:
pulumi.set(__self__, "provisioning_parameters", provisioning_parameters)
if retain_physical_resources is not None:
pulumi.set(__self__, "retain_physical_resources", retain_physical_resources)
if stack_set_provisioning_preferences is not None:
pulumi.set(__self__, "stack_set_provisioning_preferences", stack_set_provisioning_preferences)
if status is not None:
pulumi.set(__self__, "status", status)
if status_message is not None:
pulumi.set(__self__, "status_message", status_message)
if tags is not None:
pulumi.set(__self__, "tags", tags)
if tags_all is not None:
pulumi.set(__self__, "tags_all", tags_all)
if type is not None:
pulumi.set(__self__, "type", type)
@property
@pulumi.getter(name="acceptLanguage")
def accept_language(self) -> Optional[pulumi.Input[str]]:
"""
Language code. Valid values: `en` (English), `jp` (Japanese), `zh` (Chinese). Default value is `en`.
"""
return pulumi.get(self, "accept_language")
@accept_language.setter
def accept_language(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "accept_language", value)
@property
@pulumi.getter
def arn(self) -> Optional[pulumi.Input[str]]:
"""
ARN of the provisioned product.
"""
return pulumi.get(self, "arn")
@arn.setter
def arn(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "arn", value)
@property
@pulumi.getter(name="cloudwatchDashboardNames")
def cloudwatch_dashboard_names(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Set of CloudWatch dashboards that were created when provisioning the product.
"""
return pulumi.get(self, "cloudwatch_dashboard_names")
@cloudwatch_dashboard_names.setter
def cloudwatch_dashboard_names(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "cloudwatch_dashboard_names", value)
@property
@pulumi.getter(name="createdTime")
def created_time(self) -> Optional[pulumi.Input[str]]:
"""
Time when the provisioned product was created.
"""
return pulumi.get(self, "created_time")
@created_time.setter
def created_time(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "created_time", value)
@property
@pulumi.getter(name="ignoreErrors")
def ignore_errors(self) -> Optional[pulumi.Input[bool]]:
"""
_Only applies to deleting._ If set to `true`, AWS Service Catalog stops managing the specified provisioned product even if it cannot delete the underlying resources. The default value is `false`.
"""
return pulumi.get(self, "ignore_errors")
@ignore_errors.setter
def ignore_errors(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "ignore_errors", value)
@property
@pulumi.getter(name="lastProvisioningRecordId")
def last_provisioning_record_id(self) -> Optional[pulumi.Input[str]]:
"""
Record identifier of the last request performed on this provisioned product of the following types: `ProvisionProduct`, `UpdateProvisionedProduct`, `ExecuteProvisionedProductPlan`, `TerminateProvisionedProduct`.
"""
return pulumi.get(self, "last_provisioning_record_id")
@last_provisioning_record_id.setter
def last_provisioning_record_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "last_provisioning_record_id", value)
@property
@pulumi.getter(name="lastRecordId")
def last_record_id(self) -> Optional[pulumi.Input[str]]:
"""
Record identifier of the last request performed on this provisioned product.
"""
return pulumi.get(self, "last_record_id")
@last_record_id.setter
def last_record_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "last_record_id", value)
@property
@pulumi.getter(name="lastSuccessfulProvisioningRecordId")
def last_successful_provisioning_record_id(self) -> Optional[pulumi.Input[str]]:
"""
Record identifier of the last successful request performed on this provisioned product of the following types: `ProvisionProduct`, `UpdateProvisionedProduct`, `ExecuteProvisionedProductPlan`, `TerminateProvisionedProduct`.
"""
return pulumi.get(self, "last_successful_provisioning_record_id")
@last_successful_provisioning_record_id.setter
def last_successful_provisioning_record_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "last_successful_provisioning_record_id", value)
@property
@pulumi.getter(name="launchRoleArn")
def launch_role_arn(self) -> Optional[pulumi.Input[str]]:
"""
ARN of the launch role associated with the provisioned product.
"""
return pulumi.get(self, "launch_role_arn")
@launch_role_arn.setter
def launch_role_arn(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "launch_role_arn", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
User-friendly name of the provisioned product.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="notificationArns")
def notification_arns(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Passed to CloudFormation. The SNS topic ARNs to which to publish stack-related events.
"""
return pulumi.get(self, "notification_arns")
@notification_arns.setter
def notification_arns(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "notification_arns", value)
@property
@pulumi.getter(name="pathId")
def path_id(self) -> Optional[pulumi.Input[str]]:
"""
Path identifier of the product. This value is optional if the product has a default path, and required if the product has more than one path. To list the paths for a product, use `servicecatalog.get_launch_paths`. When required, you must provide `path_id` or `path_name`, but not both.
"""
return pulumi.get(self, "path_id")
@path_id.setter
def path_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path_id", value)
@property
@pulumi.getter(name="pathName")
def path_name(self) -> Optional[pulumi.Input[str]]:
"""
Name of the path. You must provide `path_id` or `path_name`, but not both.
"""
return pulumi.get(self, "path_name")
@path_name.setter
def path_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path_name", value)
@property
@pulumi.getter(name="productId")
def product_id(self) -> Optional[pulumi.Input[str]]:
"""
Product identifier. For example, `prod-abcdzk7xy33qa`. You must provide `product_id` or `product_name`, but not both.
"""
return pulumi.get(self, "product_id")
@product_id.setter
def product_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "product_id", value)
@property
@pulumi.getter(name="productName")
def product_name(self) -> Optional[pulumi.Input[str]]:
"""
Name of the product. You must provide `product_id` or `product_name`, but not both.
"""
return pulumi.get(self, "product_name")
@product_name.setter
def product_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "product_name", value)
@property
@pulumi.getter(name="provisioningArtifactId")
def provisioning_artifact_id(self) -> Optional[pulumi.Input[str]]:
"""
Identifier of the provisioning artifact. For example, `pa-4abcdjnxjj6ne`. You must provide the `provisioning_artifact_id` or `provisioning_artifact_name`, but not both.
"""
return pulumi.get(self, "provisioning_artifact_id")
@provisioning_artifact_id.setter
def provisioning_artifact_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "provisioning_artifact_id", value)
@property
@pulumi.getter(name="provisioningArtifactName")
def provisioning_artifact_name(self) -> Optional[pulumi.Input[str]]:
"""
Name of the provisioning artifact. You must provide the `provisioning_artifact_id` or `provisioning_artifact_name`, but not both.
"""
return pulumi.get(self, "provisioning_artifact_name")
@provisioning_artifact_name.setter
def provisioning_artifact_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "provisioning_artifact_name", value)
@property
@pulumi.getter(name="provisioningParameters")
def provisioning_parameters(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ProvisionedProductProvisioningParameterArgs']]]]:
"""
Configuration block with parameters specified by the administrator that are required for provisioning the product. See details below.
"""
return pulumi.get(self, "provisioning_parameters")
@provisioning_parameters.setter
def provisioning_parameters(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ProvisionedProductProvisioningParameterArgs']]]]):
pulumi.set(self, "provisioning_parameters", value)
@property
@pulumi.getter(name="retainPhysicalResources")
def retain_physical_resources(self) -> Optional[pulumi.Input[bool]]:
"""
_Only applies to deleting._ Whether to delete the Service Catalog provisioned product but leave the CloudFormation stack, stack set, or the underlying resources of the deleted provisioned product. The default value is `false`.
"""
return pulumi.get(self, "retain_physical_resources")
@retain_physical_resources.setter
def retain_physical_resources(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "retain_physical_resources", value)
@property
@pulumi.getter(name="stackSetProvisioningPreferences")
def stack_set_provisioning_preferences(self) -> Optional[pulumi.Input['ProvisionedProductStackSetProvisioningPreferencesArgs']]:
"""
Configuration block with information about the provisioning preferences for a stack set. See details below.
"""
return pulumi.get(self, "stack_set_provisioning_preferences")
@stack_set_provisioning_preferences.setter
def stack_set_provisioning_preferences(self, value: Optional[pulumi.Input['ProvisionedProductStackSetProvisioningPreferencesArgs']]):
pulumi.set(self, "stack_set_provisioning_preferences", value)
@property
@pulumi.getter
def status(self) -> Optional[pulumi.Input[str]]:
"""
Current status of the provisioned product. Valid values are `AVAILABLE`, `UNDER_CHANGE`, `TAINTED`, `ERROR`, and `PLAN_IN_PROGRESS`.
"""
return pulumi.get(self, "status")
@status.setter
def status(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "status", value)
@property
@pulumi.getter(name="statusMessage")
def status_message(self) -> Optional[pulumi.Input[str]]:
"""
Current status message of the provisioned product.
"""
return pulumi.get(self, "status_message")
@status_message.setter
def status_message(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "status_message", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
Tags to apply to the provisioned product. If the provider's `default_tags` configuration block is present, tags with matching keys will overwrite those defined at the provider level.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@property
@pulumi.getter(name="tagsAll")
def tags_all(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
Map of tags assigned to the resource, including those inherited from the provider `default_tags` configuration block.
"""
return pulumi.get(self, "tags_all")
@tags_all.setter
def tags_all(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags_all", value)
@property
@pulumi.getter
def type(self) -> Optional[pulumi.Input[str]]:
"""
Type of provisioned product. Valid values are `CFN_STACK` and `CFN_STACKSET`.
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "type", value)
class ProvisionedProduct(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
accept_language: Optional[pulumi.Input[str]] = None,
ignore_errors: Optional[pulumi.Input[bool]] = None,
name: Optional[pulumi.Input[str]] = None,
notification_arns: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
path_id: Optional[pulumi.Input[str]] = None,
path_name: Optional[pulumi.Input[str]] = None,
product_id: Optional[pulumi.Input[str]] = None,
product_name: Optional[pulumi.Input[str]] = None,
provisioning_artifact_id: Optional[pulumi.Input[str]] = None,
provisioning_artifact_name: Optional[pulumi.Input[str]] = None,
provisioning_parameters: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ProvisionedProductProvisioningParameterArgs']]]]] = None,
retain_physical_resources: Optional[pulumi.Input[bool]] = None,
stack_set_provisioning_preferences: Optional[pulumi.Input[pulumi.InputType['ProvisionedProductStackSetProvisioningPreferencesArgs']]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
__props__=None):
"""
This resource provisions and manages a Service Catalog provisioned product.
A provisioned product is a resourced instance of a product. For example, provisioning a product based on a CloudFormation template launches a CloudFormation stack and its underlying resources.
Like this resource, the `aws_servicecatalog_record` data source also provides information about a provisioned product. Although a Service Catalog record provides some overlapping information with this resource, a record is tied to a provisioned product event, such as provisioning, termination, and updating.
> **Tip:** If you include conflicting keys as tags, AWS will report the error, "Parameter validation failed: Missing required parameter in Tags[N]:Value".
> **Tip:** A "provisioning artifact" is also referred to as a "version." A "distributor" is also referred to as a "vendor."
## Example Usage
### Basic Usage
```python
import pulumi
import pulumi_aws as aws
example = aws.servicecatalog.ProvisionedProduct("example",
product_name="Example product",
provisioning_artifact_name="Example version",
provisioning_parameters=[aws.servicecatalog.ProvisionedProductProvisioningParameterArgs(
key="foo",
value="bar",
)],
tags={
"foo": "bar",
})
```
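### Provisioning By ID
The same resource can reference the product and provisioning artifact by identifier instead of by name; this is a minimal sketch, and the IDs shown are placeholders for illustration only:
```python
import pulumi
import pulumi_aws as aws
example = aws.servicecatalog.ProvisionedProduct("example",
    # Placeholder IDs; substitute the identifiers of your own product and artifact.
    product_id="prod-abcdzk7xy33qa",
    provisioning_artifact_id="pa-4abcdjnxjj6ne",
    provisioning_parameters=[aws.servicecatalog.ProvisionedProductProvisioningParameterArgs(
        key="foo",
        value="bar",
    )])
```
Note that `product_id`/`product_name` and `provisioning_artifact_id`/`provisioning_artifact_name` are mutually exclusive pairs; supply exactly one member of each pair.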
## Import
`aws_servicecatalog_provisioned_product` can be imported using the provisioned product ID, e.g.,
```sh
$ pulumi import aws:servicecatalog/provisionedProduct:ProvisionedProduct example pp-dnigbtea24ste
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] accept_language: Language code. Valid values: `en` (English), `jp` (Japanese), `zh` (Chinese). Default value is `en`.
:param pulumi.Input[bool] ignore_errors: _Only applies to deleting._ If set to `true`, AWS Service Catalog stops managing the specified provisioned product even if it cannot delete the underlying resources. The default value is `false`.
:param pulumi.Input[str] name: User-friendly name of the provisioned product.
:param pulumi.Input[Sequence[pulumi.Input[str]]] notification_arns: Passed to CloudFormation. The SNS topic ARNs to which to publish stack-related events.
:param pulumi.Input[str] path_id: Path identifier of the product. This value is optional if the product has a default path, and required if the product has more than one path. To list the paths for a product, use `servicecatalog.get_launch_paths`. When required, you must provide `path_id` or `path_name`, but not both.
:param pulumi.Input[str] path_name: Name of the path. You must provide `path_id` or `path_name`, but not both.
:param pulumi.Input[str] product_id: Product identifier. For example, `prod-abcdzk7xy33qa`. You must provide `product_id` or `product_name`, but not both.
:param pulumi.Input[str] product_name: Name of the product. You must provide `product_id` or `product_name`, but not both.
:param pulumi.Input[str] provisioning_artifact_id: Identifier of the provisioning artifact. For example, `pa-4abcdjnxjj6ne`. You must provide the `provisioning_artifact_id` or `provisioning_artifact_name`, but not both.
:param pulumi.Input[str] provisioning_artifact_name: Name of the provisioning artifact. You must provide the `provisioning_artifact_id` or `provisioning_artifact_name`, but not both.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ProvisionedProductProvisioningParameterArgs']]]] provisioning_parameters: Configuration block with parameters specified by the administrator that are required for provisioning the product. See details below.
:param pulumi.Input[bool] retain_physical_resources: _Only applies to deleting._ Whether to delete the Service Catalog provisioned product but leave the CloudFormation stack, stack set, or the underlying resources of the deleted provisioned product. The default value is `false`.
:param pulumi.Input[pulumi.InputType['ProvisionedProductStackSetProvisioningPreferencesArgs']] stack_set_provisioning_preferences: Configuration block with information about the provisioning preferences for a stack set. See details below.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: Tags to apply to the provisioned product. If configured with a provider `default_tags` configuration block present, tags with matching keys will overwrite those defined at the provider-level.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: Optional[ProvisionedProductArgs] = None,
opts: Optional[pulumi.ResourceOptions] = None):
"""
This resource provisions and manages a Service Catalog provisioned product.
A provisioned product is a resourced instance of a product. For example, provisioning a product based on a CloudFormation template launches a CloudFormation stack and its underlying resources.
Like this resource, the `aws_servicecatalog_record` data source also provides information about a provisioned product. Although a Service Catalog record provides some overlapping information with this resource, a record is tied to a provisioned product event, such as provisioning, termination, and updating.
> **Tip:** If you include conflicted keys as tags, AWS will report an error, "Parameter validation failed: Missing required parameter in Tags[N]:Value".
> **Tip:** A "provisioning artifact" is also referred to as a "version." A "distributor" is also referred to as a "vendor."
## Example Usage
### Basic Usage
```python
import pulumi
import pulumi_aws as aws
example = aws.servicecatalog.ProvisionedProduct("example",
product_name="Example product",
provisioning_artifact_name="Example version",
provisioning_parameters=[aws.servicecatalog.ProvisionedProductProvisioningParameterArgs(
key="foo",
value="bar",
)],
tags={
"foo": "bar",
})
```
## Import
`aws_servicecatalog_provisioned_product` can be imported using the provisioned product ID, e.g.,
```sh
$ pulumi import aws:servicecatalog/provisionedProduct:ProvisionedProduct example pp-dnigbtea24ste
```
:param str resource_name: The name of the resource.
:param ProvisionedProductArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(ProvisionedProductArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
accept_language: Optional[pulumi.Input[str]] = None,
ignore_errors: Optional[pulumi.Input[bool]] = None,
name: Optional[pulumi.Input[str]] = None,
notification_arns: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
path_id: Optional[pulumi.Input[str]] = None,
path_name: Optional[pulumi.Input[str]] = None,
product_id: Optional[pulumi.Input[str]] = None,
product_name: Optional[pulumi.Input[str]] = None,
provisioning_artifact_id: Optional[pulumi.Input[str]] = None,
provisioning_artifact_name: Optional[pulumi.Input[str]] = None,
provisioning_parameters: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ProvisionedProductProvisioningParameterArgs']]]]] = None,
retain_physical_resources: Optional[pulumi.Input[bool]] = None,
stack_set_provisioning_preferences: Optional[pulumi.Input[pulumi.InputType['ProvisionedProductStackSetProvisioningPreferencesArgs']]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = ProvisionedProductArgs.__new__(ProvisionedProductArgs)
__props__.__dict__["accept_language"] = accept_language
__props__.__dict__["ignore_errors"] = ignore_errors
__props__.__dict__["name"] = name
__props__.__dict__["notification_arns"] = notification_arns
__props__.__dict__["path_id"] = path_id
__props__.__dict__["path_name"] = path_name
__props__.__dict__["product_id"] = product_id
__props__.__dict__["product_name"] = product_name
__props__.__dict__["provisioning_artifact_id"] = provisioning_artifact_id
__props__.__dict__["provisioning_artifact_name"] = provisioning_artifact_name
__props__.__dict__["provisioning_parameters"] = provisioning_parameters
__props__.__dict__["retain_physical_resources"] = retain_physical_resources
__props__.__dict__["stack_set_provisioning_preferences"] = stack_set_provisioning_preferences
__props__.__dict__["tags"] = tags
__props__.__dict__["arn"] = None
__props__.__dict__["cloudwatch_dashboard_names"] = None
__props__.__dict__["created_time"] = None
__props__.__dict__["last_provisioning_record_id"] = None
__props__.__dict__["last_record_id"] = None
__props__.__dict__["last_successful_provisioning_record_id"] = None
__props__.__dict__["launch_role_arn"] = None
__props__.__dict__["status"] = None
__props__.__dict__["status_message"] = None
__props__.__dict__["tags_all"] = None
__props__.__dict__["type"] = None
super(ProvisionedProduct, __self__).__init__(
'aws:servicecatalog/provisionedProduct:ProvisionedProduct',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
accept_language: Optional[pulumi.Input[str]] = None,
arn: Optional[pulumi.Input[str]] = None,
cloudwatch_dashboard_names: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
created_time: Optional[pulumi.Input[str]] = None,
ignore_errors: Optional[pulumi.Input[bool]] = None,
last_provisioning_record_id: Optional[pulumi.Input[str]] = None,
last_record_id: Optional[pulumi.Input[str]] = None,
last_successful_provisioning_record_id: Optional[pulumi.Input[str]] = None,
launch_role_arn: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
notification_arns: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
path_id: Optional[pulumi.Input[str]] = None,
path_name: Optional[pulumi.Input[str]] = None,
product_id: Optional[pulumi.Input[str]] = None,
product_name: Optional[pulumi.Input[str]] = None,
provisioning_artifact_id: Optional[pulumi.Input[str]] = None,
provisioning_artifact_name: Optional[pulumi.Input[str]] = None,
provisioning_parameters: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ProvisionedProductProvisioningParameterArgs']]]]] = None,
retain_physical_resources: Optional[pulumi.Input[bool]] = None,
stack_set_provisioning_preferences: Optional[pulumi.Input[pulumi.InputType['ProvisionedProductStackSetProvisioningPreferencesArgs']]] = None,
status: Optional[pulumi.Input[str]] = None,
status_message: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
tags_all: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
type: Optional[pulumi.Input[str]] = None) -> 'ProvisionedProduct':
"""
Get an existing ProvisionedProduct resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] accept_language: Language code. Valid values: `en` (English), `jp` (Japanese), `zh` (Chinese). Default value is `en`.
:param pulumi.Input[str] arn: ARN of the provisioned product.
:param pulumi.Input[Sequence[pulumi.Input[str]]] cloudwatch_dashboard_names: Set of CloudWatch dashboards that were created when provisioning the product.
:param pulumi.Input[str] created_time: Time when the provisioned product was created.
:param pulumi.Input[bool] ignore_errors: _Only applies to deleting._ If set to `true`, AWS Service Catalog stops managing the specified provisioned product even if it cannot delete the underlying resources. The default value is `false`.
:param pulumi.Input[str] last_provisioning_record_id: Record identifier of the last request performed on this provisioned product of the following types: `ProvisionedProduct`, `UpdateProvisionedProduct`, `ExecuteProvisionedProductPlan`, `TerminateProvisionedProduct`.
:param pulumi.Input[str] last_record_id: Record identifier of the last request performed on this provisioned product.
:param pulumi.Input[str] last_successful_provisioning_record_id: Record identifier of the last successful request performed on this provisioned product of the following types: `ProvisionedProduct`, `UpdateProvisionedProduct`, `ExecuteProvisionedProductPlan`, `TerminateProvisionedProduct`.
:param pulumi.Input[str] launch_role_arn: ARN of the launch role associated with the provisioned product.
:param pulumi.Input[str] name: User-friendly name of the provisioned product.
:param pulumi.Input[Sequence[pulumi.Input[str]]] notification_arns: Passed to CloudFormation. The SNS topic ARNs to which to publish stack-related events.
:param pulumi.Input[str] path_id: Path identifier of the product. This value is optional if the product has a default path, and required if the product has more than one path. To list the paths for a product, use `servicecatalog.get_launch_paths`. When required, you must provide `path_id` or `path_name`, but not both.
:param pulumi.Input[str] path_name: Name of the path. You must provide `path_id` or `path_name`, but not both.
:param pulumi.Input[str] product_id: Product identifier. For example, `prod-abcdzk7xy33qa`. You must provide `product_id` or `product_name`, but not both.
:param pulumi.Input[str] product_name: Name of the product. You must provide `product_id` or `product_name`, but not both.
:param pulumi.Input[str] provisioning_artifact_id: Identifier of the provisioning artifact. For example, `pa-4abcdjnxjj6ne`. You must provide the `provisioning_artifact_id` or `provisioning_artifact_name`, but not both.
:param pulumi.Input[str] provisioning_artifact_name: Name of the provisioning artifact. You must provide the `provisioning_artifact_id` or `provisioning_artifact_name`, but not both.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ProvisionedProductProvisioningParameterArgs']]]] provisioning_parameters: Configuration block with parameters specified by the administrator that are required for provisioning the product. See details below.
:param pulumi.Input[bool] retain_physical_resources: _Only applies to deleting._ Whether to delete the Service Catalog provisioned product but leave the CloudFormation stack, stack set, or the underlying resources of the deleted provisioned product. The default value is `false`.
:param pulumi.Input[pulumi.InputType['ProvisionedProductStackSetProvisioningPreferencesArgs']] stack_set_provisioning_preferences: Configuration block with information about the provisioning preferences for a stack set. See details below.
:param pulumi.Input[str] status: Current status of the provisioned product. See meanings below.
:param pulumi.Input[str] status_message: Current status message of the provisioned product.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: Tags to apply to the provisioned product. If configured with a provider `default_tags` configuration block present, tags with matching keys will overwrite those defined at the provider-level.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags_all: Map of tags assigned to the resource, including those inherited from the provider `default_tags` configuration block.
:param pulumi.Input[str] type: Type of provisioned product. Valid values are `CFN_STACK` and `CFN_STACKSET`.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _ProvisionedProductState.__new__(_ProvisionedProductState)
__props__.__dict__["accept_language"] = accept_language
__props__.__dict__["arn"] = arn
__props__.__dict__["cloudwatch_dashboard_names"] = cloudwatch_dashboard_names
__props__.__dict__["created_time"] = created_time
__props__.__dict__["ignore_errors"] = ignore_errors
__props__.__dict__["last_provisioning_record_id"] = last_provisioning_record_id
__props__.__dict__["last_record_id"] = last_record_id
__props__.__dict__["last_successful_provisioning_record_id"] = last_successful_provisioning_record_id
__props__.__dict__["launch_role_arn"] = launch_role_arn
__props__.__dict__["name"] = name
__props__.__dict__["notification_arns"] = notification_arns
__props__.__dict__["path_id"] = path_id
__props__.__dict__["path_name"] = path_name
__props__.__dict__["product_id"] = product_id
__props__.__dict__["product_name"] = product_name
__props__.__dict__["provisioning_artifact_id"] = provisioning_artifact_id
__props__.__dict__["provisioning_artifact_name"] = provisioning_artifact_name
__props__.__dict__["provisioning_parameters"] = provisioning_parameters
__props__.__dict__["retain_physical_resources"] = retain_physical_resources
__props__.__dict__["stack_set_provisioning_preferences"] = stack_set_provisioning_preferences
__props__.__dict__["status"] = status
__props__.__dict__["status_message"] = status_message
__props__.__dict__["tags"] = tags
__props__.__dict__["tags_all"] = tags_all
__props__.__dict__["type"] = type
return ProvisionedProduct(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="acceptLanguage")
def accept_language(self) -> pulumi.Output[Optional[str]]:
"""
Language code. Valid values: `en` (English), `jp` (Japanese), `zh` (Chinese). Default value is `en`.
"""
return pulumi.get(self, "accept_language")
@property
@pulumi.getter
def arn(self) -> pulumi.Output[str]:
"""
ARN of the provisioned product.
"""
return pulumi.get(self, "arn")
@property
@pulumi.getter(name="cloudwatchDashboardNames")
def cloudwatch_dashboard_names(self) -> pulumi.Output[Sequence[str]]:
"""
Set of CloudWatch dashboards that were created when provisioning the product.
"""
return pulumi.get(self, "cloudwatch_dashboard_names")
@property
@pulumi.getter(name="createdTime")
def created_time(self) -> pulumi.Output[str]:
"""
Time when the provisioned product was created.
"""
return pulumi.get(self, "created_time")
@property
@pulumi.getter(name="ignoreErrors")
def ignore_errors(self) -> pulumi.Output[Optional[bool]]:
"""
_Only applies to deleting._ If set to `true`, AWS Service Catalog stops managing the specified provisioned product even if it cannot delete the underlying resources. The default value is `false`.
"""
return pulumi.get(self, "ignore_errors")
@property
@pulumi.getter(name="lastProvisioningRecordId")
def last_provisioning_record_id(self) -> pulumi.Output[str]:
"""
Record identifier of the last request performed on this provisioned product of the following types: `ProvisionedProduct`, `UpdateProvisionedProduct`, `ExecuteProvisionedProductPlan`, `TerminateProvisionedProduct`.
"""
return pulumi.get(self, "last_provisioning_record_id")
@property
@pulumi.getter(name="lastRecordId")
def last_record_id(self) -> pulumi.Output[str]:
"""
Record identifier of the last request performed on this provisioned product.
"""
return pulumi.get(self, "last_record_id")
@property
@pulumi.getter(name="lastSuccessfulProvisioningRecordId")
def last_successful_provisioning_record_id(self) -> pulumi.Output[str]:
"""
Record identifier of the last successful request performed on this provisioned product of the following types: `ProvisionedProduct`, `UpdateProvisionedProduct`, `ExecuteProvisionedProductPlan`, `TerminateProvisionedProduct`.
"""
return pulumi.get(self, "last_successful_provisioning_record_id")
@property
@pulumi.getter(name="launchRoleArn")
def launch_role_arn(self) -> pulumi.Output[str]:
"""
ARN of the launch role associated with the provisioned product.
"""
return pulumi.get(self, "launch_role_arn")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
User-friendly name of the provisioned product.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="notificationArns")
def notification_arns(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
Passed to CloudFormation. The SNS topic ARNs to which to publish stack-related events.
"""
return pulumi.get(self, "notification_arns")
@property
@pulumi.getter(name="pathId")
def path_id(self) -> pulumi.Output[str]:
"""
Path identifier of the product. This value is optional if the product has a default path, and required if the product has more than one path. To list the paths for a product, use `servicecatalog.get_launch_paths`. When required, you must provide `path_id` or `path_name`, but not both.
"""
return pulumi.get(self, "path_id")
@property
@pulumi.getter(name="pathName")
def path_name(self) -> pulumi.Output[Optional[str]]:
"""
Name of the path. You must provide `path_id` or `path_name`, but not both.
"""
return pulumi.get(self, "path_name")
@property
@pulumi.getter(name="productId")
def product_id(self) -> pulumi.Output[str]:
"""
Product identifier. For example, `prod-abcdzk7xy33qa`. You must provide `product_id` or `product_name`, but not both.
"""
return pulumi.get(self, "product_id")
@property
@pulumi.getter(name="productName")
def product_name(self) -> pulumi.Output[Optional[str]]:
"""
Name of the product. You must provide `product_id` or `product_name`, but not both.
"""
return pulumi.get(self, "product_name")
@property
@pulumi.getter(name="provisioningArtifactId")
def provisioning_artifact_id(self) -> pulumi.Output[str]:
"""
Identifier of the provisioning artifact. For example, `pa-4abcdjnxjj6ne`. You must provide the `provisioning_artifact_id` or `provisioning_artifact_name`, but not both.
"""
return pulumi.get(self, "provisioning_artifact_id")
@property
@pulumi.getter(name="provisioningArtifactName")
def provisioning_artifact_name(self) -> pulumi.Output[Optional[str]]:
"""
Name of the provisioning artifact. You must provide the `provisioning_artifact_id` or `provisioning_artifact_name`, but not both.
"""
return pulumi.get(self, "provisioning_artifact_name")
@property
@pulumi.getter(name="provisioningParameters")
def provisioning_parameters(self) -> pulumi.Output[Optional[Sequence['outputs.ProvisionedProductProvisioningParameter']]]:
"""
Configuration block with parameters specified by the administrator that are required for provisioning the product. See details below.
"""
return pulumi.get(self, "provisioning_parameters")
@property
@pulumi.getter(name="retainPhysicalResources")
def retain_physical_resources(self) -> pulumi.Output[Optional[bool]]:
"""
_Only applies to deleting._ Whether to delete the Service Catalog provisioned product but leave the CloudFormation stack, stack set, or the underlying resources of the deleted provisioned product. The default value is `false`.
"""
return pulumi.get(self, "retain_physical_resources")
@property
@pulumi.getter(name="stackSetProvisioningPreferences")
def stack_set_provisioning_preferences(self) -> pulumi.Output[Optional['outputs.ProvisionedProductStackSetProvisioningPreferences']]:
"""
Configuration block with information about the provisioning preferences for a stack set. See details below.
"""
return pulumi.get(self, "stack_set_provisioning_preferences")
@property
@pulumi.getter
def status(self) -> pulumi.Output[str]:
"""
Current status of the provisioned product. See meanings below.
"""
return pulumi.get(self, "status")
@property
@pulumi.getter(name="statusMessage")
def status_message(self) -> pulumi.Output[str]:
"""
Current status message of the provisioned product.
"""
return pulumi.get(self, "status_message")
@property
@pulumi.getter
def tags(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
"""
Tags to apply to the provisioned product. If configured with a provider `default_tags` configuration block present, tags with matching keys will overwrite those defined at the provider-level.
"""
return pulumi.get(self, "tags")
@property
@pulumi.getter(name="tagsAll")
def tags_all(self) -> pulumi.Output[Mapping[str, str]]:
"""
Map of tags assigned to the resource, including those inherited from the provider `default_tags` configuration block.
"""
return pulumi.get(self, "tags_all")
@property
@pulumi.getter
def type(self) -> pulumi.Output[str]:
"""
Type of provisioned product. Valid values are `CFN_STACK` and `CFN_STACKSET`.
"""
return pulumi.get(self, "type")
# File: engine/gamestate.py | repo: manuel-huez/PyAngine | license: MIT
# engine.gamestate
class GameState:
def set(self):
return True
def unset(self):
return True
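A hypothetical usage sketch of this base class (the `MenuState` subclass and `switch_state` helper are illustrations, not part of the repo; `GameState` is restated so the snippet is self-contained):

```python
class GameState:
    """Mirror of engine.gamestate.GameState, restated for a standalone example."""
    def set(self):
        return True

    def unset(self):
        return True


class MenuState(GameState):
    """Hypothetical subclass: tracks visibility when (de)activated."""
    def set(self):
        self.visible = True
        return True

    def unset(self):
        self.visible = False
        return True


def switch_state(current, new):
    """Deactivate the current state, activate the new one, and return it."""
    current.unset()
    new.set()
    return new


active = switch_state(GameState(), MenuState())
print(type(active).__name__, active.visible)  # MenuState True
```

The base class returning `True` from both hooks lets subclasses opt into real setup/teardown while keeping a uniform interface for the engine loop.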
| 11.111111 | 18 | 0.71 | 14 | 100 | 5.071429 | 0.642857 | 0.28169 | 0.394366 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 100 | 8 | 19 | 12.5 | 0.8875 | 0.16 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0 | 0.4 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
# File: src/solvers.py | repo: ReinholdM/OpenASR | license: Apache-2.0
"""
"""
import os
import time
import logging
import torch
from torch.nn.utils import clip_grad_norm_
import utils
import schedule
from utils import cycle
class Solver(object):
def __init__(self, model, config, tr_loader, cv_loader):
self.config = config
self.tr_loader = tr_loader
self.cv_loader = cv_loader
self.model = model
if config["multi_gpu"] == True:
self.model_to_pack = self.model.module
else:
self.model_to_pack = self.model
self.device = torch.device('cuda:0') if torch.cuda.is_available() else torch.device('cpu')
self.num_epoch = config["num_epoch"]
self.exp_dir = config["exp_dir"]
self.print_inteval = config["print_inteval"]
self.accumulate_grad_batch = config["accumulate_grad_batch"]
self.init_lr = config["init_lr"]
self.grad_max_norm = config["grad_max_norm"]
self.label_smooth = config["label_smooth"]
self.num_last_ckpt_keep = None
if "num_last_ckpt_keep" in config:
self.num_last_ckpt_keep = config["num_last_ckpt_keep"]
self.lr_scheduler = schedule.get_scheduler(config["lr_scheduler"])
# Solver state
self.epoch = 0
self.step = 0
self.tr_loss = []
self.cv_loss = []
self.lr = self.init_lr
if config["optimtype"] == "sgd":
self.optimizer = torch.optim.SGD(self.model_to_pack.parameters(), lr=self.lr, momentum=0.9)
elif config["optimtype"] == "adam":
self.optimizer = torch.optim.Adam(self.model_to_pack.parameters(), lr=self.lr,
betas=(0.9, 0.999), eps=1e-08, weight_decay=0)
else:
raise ValueError("Unknown optimizer.")
if not os.path.isdir(self.exp_dir):
os.makedirs(self.exp_dir)
def training_state(self):
return {
"epoch": self.epoch,
"step": self.step,
"tr_loss": self.tr_loss,
"cv_loss": self.cv_loss,
"lr": self.lr,
}
def restore_training_state(self, state):
self.epoch = state["epoch"]
self.step = state["step"]
self.tr_loss = state["tr_loss"]
self.cv_loss = state["cv_loss"]
self.lr = state["lr"]
def package(self):
return {
"model": self.model_to_pack.package(),
"Solver_config": self.config,
"Solver_state": self.training_state(),
"optim_state": self.optimizer.state_dict(),
"scheduler_state": self.lr_scheduler.pack_state()
}
def save(self, path):
pkg = self.package()
torch.save(pkg, path)
logging.info("Saving model to {}".format(path))
def restore(self, pkg):
self.restore_training_state(pkg["Solver_state"])
self.optimizer.load_state_dict(pkg['optim_state'])
self.lr_scheduler.restore_state(pkg["scheduler_state"])
def train(self):
timer = utils.Timer()
self.best_cvloss = 9e20
if self.cv_loss:
self.best_cvloss = min(self.cv_loss)
while self.epoch < self.num_epoch:
timer.tic()
self.epoch += 1
logging.info("Training")
tr_loss = self.iter_one_epoch()
tr_msg = ("tr loss: {:.4f}").format(tr_loss)
msg = "\n" + "-"*85 + "\n"
msg += "Epoch {} Training Summary:\n{}\n".format(self.epoch, tr_msg)
msg += "-"*85
logging.info(msg)
self.save(os.path.join(self.exp_dir, "ep-{:04d}.pt".format(self.epoch)))
self.save(os.path.join(self.exp_dir, "last.pt"))
logging.info("Validation")
cv_loss = self.iter_one_epoch(cross_valid=True)
if self.best_cvloss > cv_loss:
self.best_cvloss = cv_loss
train_time = timer.toc()
cv_msg = ("cv loss: {:.4f} | best cv loss {:.4f}").format(cv_loss, self.best_cvloss)
msg = "\n" + "-"*85 + "\n"
msg += "Epoch {} Validation Summary:\n{}\n".format(self.epoch, cv_msg)
msg += "Time cost: {:.4f} min".format(train_time/60.)
msg += "\n" + "-"*85 + '\n'
logging.info(msg)
self.tr_loss.append(tr_loss)
self.cv_loss.append(cv_loss)
if self.num_last_ckpt_keep:
utils.cleanup_ckpt(self.exp_dir, self.num_last_ckpt_keep)
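The scheduler object used above is only seen through its call sites (`step`, `set_lr`, `pack_state`, `restore_state`). A minimal stand-in honoring that inferred interface — the class name and constant-lr policy here are illustrative assumptions, not the repo's actual `schedule` module:

```python
class ConstantScheduler:
    """Minimal stand-in for the object schedule.get_scheduler() returns.
    Interface inferred from solvers.py call sites; the constant-lr policy
    is an assumption for illustration only."""

    def __init__(self):
        self.nstep = 0

    def step(self):
        # Called once per optimizer update in iter_one_epoch.
        self.nstep += 1

    def set_lr(self, optimizer, init_lr):
        # A real schedule would scale init_lr by a warmup/decay factor here.
        for group in optimizer.param_groups:
            group["lr"] = init_lr

    def pack_state(self):
        # Serialized into the checkpoint by Solver.package().
        return {"nstep": self.nstep}

    def restore_state(self, state):
        # Reloaded from the checkpoint by Solver.restore().
        self.nstep = state["nstep"]


class _FakeOptimizer:
    """Tiny optimizer double exposing the param_groups attribute set_lr touches."""
    def __init__(self):
        self.param_groups = [{"lr": 0.0}]


opt = _FakeOptimizer()
sched = ConstantScheduler()
sched.step()
sched.set_lr(opt, 1e-3)
print(sched.pack_state(), opt.param_groups[0]["lr"])  # {'nstep': 1} 0.001
```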
class CE_Solver(Solver):
def iter_one_epoch(self, cross_valid=False):
if cross_valid:
loader = self.cv_loader
self.model.eval()
else:
loader = self.tr_loader
self.model.train()
timer = utils.Timer()
timer.tic()
tot_loss = 0.
tot_token = 0
tot_sequence = 0
n_accu_batch = self.accumulate_grad_batch
tot_iter_num = len(loader)
for niter, (utts, data) in enumerate(loader):
niter += 1
feats, len_feat, target_in, targets, paddings = \
(i.to(self.device) for i in data)
if niter == 1 and self.epoch == 1:
print('feats:\t{}\nlen_feat:\t{}\ntarget_in:\t{}\ntargets:\t{}\npaddings:\t{}'.format(
feats.size(), len_feat.size(), target_in.size(), targets.size(), paddings.size()))
print('feats:\n{}\nlen_feat:\t{}\ntarget_in:\t{}\ntargets:\t{}\npaddings:\t{}'.format(
feats[0], len_feat[0], target_in[0], targets[0], paddings[0]))
if cross_valid:
with torch.no_grad():
ce_loss = self.model(feats, len_feat, target_in, targets, paddings)
else:
ce_loss = self.model(feats, len_feat, target_in, targets, paddings,
label_smooth=self.label_smooth)
n_token = torch.sum(1-paddings).float()
tot_token += n_token
n_sequence = len(utts)
tot_sequence += n_sequence
loss = ce_loss.sum()/n_token
tot_loss += ce_loss
# compute gradients
if not cross_valid:
if n_accu_batch == self.accumulate_grad_batch:
self.optimizer.zero_grad()
loss.backward()
n_accu_batch -= 1
if n_accu_batch == 0 or niter == tot_iter_num:
self.step += 1 # to be consistent with the metric
clip_grad_norm_(self.model.parameters(), self.grad_max_norm)
self.lr_scheduler.step() # then, update learning rate
self.lr_scheduler.set_lr(self.optimizer, self.init_lr)
self.optimizer.step()
n_accu_batch = self.accumulate_grad_batch
else:
continue
timer.toc()
if niter % self.print_inteval == 0:
print('Epoch {} | Step {} | Batch {}/{} {} \ncur_all_loss: {:.3f} ce_loss: {:.3f} lr: {:.3e} sent/sec: {:.3f}\n'.format(
self.epoch, self.step, niter, tot_iter_num, list(feats.size()),
loss, tot_loss / tot_token, list(self.optimizer.param_groups)[0]["lr"], tot_sequence/timer.toc()
), flush=True)
torch.cuda.empty_cache()
time.sleep(2)
return (tot_loss/tot_token).item()
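The accumulation bookkeeping in `iter_one_epoch` (decrement a counter per micro-batch, step the optimizer every `accumulate_grad_batch` batches or at the end of the loader, then reset) can be sketched framework-free:

```python
def accumulation_steps(num_batches, accumulate_grad_batch):
    """Return the 1-based batch indices at which the optimizer steps,
    mirroring the n_accu_batch counter logic in iter_one_epoch."""
    steps = []
    n_accu_batch = accumulate_grad_batch
    for niter in range(1, num_batches + 1):
        n_accu_batch -= 1
        if n_accu_batch == 0 or niter == num_batches:
            steps.append(niter)
            n_accu_batch = accumulate_grad_batch
    return steps

# With 7 batches and accumulation of 3, the optimizer steps at batches 3, 6
# and 7 -- the trailing partial accumulation still steps, as in the solver.
print(accumulation_steps(7, 3))  # [3, 6, 7]
```

Note the `niter == tot_iter_num` clause guarantees gradients from a final partial group are never silently discarded.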
class CTC_CE_Solver(Solver):
def __init__(self, model, config, tr_loader, cv_loader):
self.config = config
self.tr_loader = tr_loader
self.cv_loader = cv_loader
self.model = model
if config["multi_gpu"] == True:
self.model_to_pack = self.model.module
else:
self.model_to_pack = self.model
self.device = torch.device('cuda:0') if torch.cuda.is_available() else torch.device('cpu')
self.num_epoch = config["num_epoch"]
self.exp_dir = config["exp_dir"]
self.print_inteval = config["print_inteval"]
self.accumulate_grad_batch = config["accumulate_grad_batch"]
self.init_lr = config["init_lr"]
self.grad_max_norm = config["grad_max_norm"]
self.label_smooth = config["label_smooth"]
self.lambda_ctc = config["lambda_ctc"]
self.num_last_ckpt_keep = None
if "num_last_ckpt_keep" in config:
self.num_last_ckpt_keep = config["num_last_ckpt_keep"]
self.lr_scheduler = schedule.get_scheduler(config["lr_scheduler"])
# Solver state
self.epoch = 0
self.step = 0
self.tr_loss = []
self.cv_loss = []
self.lr = self.init_lr
if config["optimtype"] == "sgd":
self.optimizer = torch.optim.SGD(self.model_to_pack.parameters(), lr=self.lr, momentum=0.9)
elif config["optimtype"] == "adam":
self.optimizer = torch.optim.Adam(self.model_to_pack.parameters(), lr=self.lr,
betas=(0.9, 0.999), eps=1e-08, weight_decay=0)
else:
raise ValueError("Unknown optimizer.")
if not os.path.isdir(self.exp_dir):
os.makedirs(self.exp_dir)
def iter_one_epoch(self, cross_valid=False):
if cross_valid:
loader = self.cv_loader
self.model.eval()
else:
loader = self.tr_loader
self.model.train()
timer = utils.Timer()
timer.tic()
tot_loss = 0.
tot_ctc_loss = 0.
tot_token = 0
tot_sequence = 0
n_accu_batch = self.accumulate_grad_batch
tot_iter_num = len(loader)
for niter, data in enumerate(loader):
niter += 1
utts, padded_waveforms, wave_lengths, ids, labels, paddings = data
if cross_valid:
with torch.no_grad():
ctc_loss, ce_loss = self.model(padded_waveforms,
wave_lengths.long(),
ids.long(),
labels.long(),
paddings.long())
else:
ctc_loss, ce_loss = self.model(padded_waveforms,
wave_lengths.long(),
ids.long(),
labels.long(),
paddings.long(),
label_smooth=self.label_smooth)
n_token = torch.sum(1-paddings).float()
tot_token += n_token
n_sequence = len(utts)
tot_sequence += n_sequence
loss = ce_loss.sum()/n_token + self.lambda_ctc * ctc_loss.sum()/n_sequence
tot_ctc_loss += ctc_loss
tot_loss += ce_loss
# compute gradients
if not cross_valid:
if n_accu_batch == self.accumulate_grad_batch:
self.optimizer.zero_grad()
loss.backward()
n_accu_batch -= 1
if n_accu_batch == 0 or niter == tot_iter_num:
self.step += 1 # to be consistent with the metric
clip_grad_norm_(self.model.parameters(), self.grad_max_norm)
self.lr_scheduler.step() # then, update learning rate
self.lr_scheduler.set_lr(self.optimizer, self.init_lr)
self.optimizer.step()
n_accu_batch = self.accumulate_grad_batch
else:
continue
timer.toc()
if niter % self.print_inteval == 0:
print('Epoch {} | Step {} | Iter {} batch {} \ncur_all_loss: {:.3f} ce_loss: {:.3f} ctc_loss/sent: {:.3f} lr: {:.3e} sent/sec: {:.3f}\n'.format(
self.epoch, self.step, niter, list(padded_waveforms.size()),
loss, tot_loss / tot_token, tot_ctc_loss / tot_sequence,
list(self.optimizer.param_groups)[0]["lr"], tot_sequence/timer.toc()
), flush=True)
torch.cuda.empty_cache()
time.sleep(2)
return (tot_loss/tot_token).item()
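The `n_accu_batch` counter above drives gradient accumulation: it is decremented every batch, an optimizer step fires when it reaches zero or at the final batch of the epoch, and it is then reset. A minimal pure-Python sketch of that control flow (the function name is illustrative, not part of the solver):

```python
def accumulation_step_indices(num_batches, accumulate_grad_batch):
    """Return the 1-based batch indices at which an optimizer step fires,
    mirroring the solver's n_accu_batch counter: one step every
    accumulate_grad_batch batches, plus a forced step on the last batch
    so a partial final window is flushed rather than dropped."""
    steps = []
    n_accu_batch = accumulate_grad_batch
    for niter in range(1, num_batches + 1):
        n_accu_batch -= 1
        if n_accu_batch == 0 or niter == num_batches:
            steps.append(niter)
            n_accu_batch = accumulate_grad_batch
    return steps
```

With 10 batches and `accumulate_grad_batch=4`, steps fire at batches 4, 8 and 10; the trailing window of two batches still produces an update.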
class CTC_Solver(CE_Solver):
def iter_one_epoch(self, cross_valid=False):
if cross_valid:
loader = self.cv_loader
self.model.eval()
else:
loader = self.tr_loader
self.model.train()
timer = utils.Timer()
timer.tic()
tot_loss = 0.
tot_token = 0
tot_sequence = 0
n_accu_batch = self.accumulate_grad_batch
tot_iter_num = len(loader)
for niter, (utts, data) in enumerate(loader):
niter += 1
feats, len_feat, _, targets, paddings = (i.to(self.device) for i in data)
if niter == 1 and self.epoch == 1:
print('feats:\t{}\nlen_feat:\t{}\ntargets:\t{}\npaddings:\t{}'.format(
feats.size(), len_feat.size(), targets.size(), paddings.size()))
print('feats:\n{}\nlen_feat:\t{}\ntargets:\t{}\npaddings:\t{}'.format(
feats[0], len_feat[0], targets[0], paddings[0]))
len_target = (1-paddings).int().sum(-1)
n_token = len_target.sum().float()
tot_token += n_token
n_sequence = len(utts)
tot_sequence += n_sequence
if cross_valid:
with torch.no_grad():
ctc_loss = self.model(feats, len_feat, targets, len_target)
if niter == 1:
logits, len_logits = self.model.get_logits(feats[:1], len_feat[:1])
blk_idx = logits.size(-1) - 1
align = torch.argmax(logits[0], -1)[:len_logits[0]]
print('infer:\n', align[align<blk_idx].tolist())
print('target:\n', targets[0][:len_target[0]].tolist())
else:
ctc_loss = self.model(feats, len_feat, targets, len_target)
loss = ctc_loss.sum()/n_sequence
            tot_loss += ctc_loss.sum().detach()
# compute gradients
if not cross_valid:
if n_accu_batch == self.accumulate_grad_batch:
self.optimizer.zero_grad()
loss.backward()
n_accu_batch -= 1
if n_accu_batch == 0 or niter == tot_iter_num:
                    self.step += 1  # to be consistent with metric
clip_grad_norm_(self.model.parameters(), self.grad_max_norm)
self.lr_scheduler.step() # then, update learning rate
self.lr_scheduler.set_lr(self.optimizer, self.init_lr)
self.optimizer.step()
n_accu_batch = self.accumulate_grad_batch
else:
continue
timer.toc()
if niter % self.print_inteval == 0:
print('Epoch {} | Step {} | Batch {}/{} {} \ncur_loss: {:.3f} avg_loss: {:.3f} lr: {:.3e} sent/sec: {:.3f}\n'.format(
self.epoch, self.step, niter, tot_iter_num, list(feats.size()),
loss, tot_loss / tot_sequence,
list(self.optimizer.param_groups)[0]["lr"],
tot_sequence/timer.toc()
), flush=True)
torch.cuda.empty_cache()
time.sleep(2)
return (tot_loss/tot_sequence).item()
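The cross-validation debug print above only drops blank frames from the argmax alignment (`align[align < blk_idx]`) and keeps consecutive repeated frames. The standard greedy CTC collapse merges repeats first, then removes blanks; a small sketch, with the blank id parameterized (the solver uses the last logit index as blank, the function name is illustrative):

```python
def ctc_greedy_collapse(frame_ids, blank_id):
    """Collapse a per-frame argmax alignment into a label sequence:
    merge consecutive repeats, then drop blank symbols."""
    out = []
    prev = None
    for t in frame_ids:
        # emit only on a change of symbol, and never emit the blank
        if t != prev and t != blank_id:
            out.append(t)
        prev = t
    return out
```

For the alignment `[1, 1, 0, 1, 2, 2, 0, 0, 3]` with blank id 0 this yields `[1, 1, 2, 3]`: the repeat separated by a blank survives, while adjacent repeats collapse.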
class CIF_Solver(Solver):
    def __init__(self, model, config, tr_loader, cv_loader):
self.config = config
self.tr_loader = tr_loader
self.cv_loader = cv_loader
self.model = model
        if config["multi_gpu"]:
self.model_to_pack = self.model.module
else:
self.model_to_pack = self.model
self.device = torch.device('cuda:0') if torch.cuda.is_available() else torch.device('cpu')
self.num_epoch = config["num_epoch"]
self.exp_dir = config["exp_dir"]
self.print_inteval = config["print_inteval"]
self.accumulate_grad_batch = config["accumulate_grad_batch"]
self.init_lr = config["init_lr"]
self.grad_max_norm = config["grad_max_norm"]
self.label_smooth = config["label_smooth"]
self.lambda_qua = config["lambda_qua"]
self.lambda_ctc = config["lambda_ctc"]
self.num_last_ckpt_keep = None
if "num_last_ckpt_keep" in config:
self.num_last_ckpt_keep = config["num_last_ckpt_keep"]
self.lr_scheduler = schedule.get_scheduler(config["lr_scheduler"])
# Solver state
self.epoch = 0
self.step = 0
self.tr_loss = []
self.cv_loss = []
self.lr = self.init_lr
if config["optimtype"] == "sgd":
self.optimizer = torch.optim.SGD(self.model_to_pack.parameters(), lr=self.lr, momentum=0.9)
elif config["optimtype"] == "adam":
self.optimizer = torch.optim.Adam(self.model_to_pack.parameters(), lr=self.lr,
betas=(0.9, 0.999), eps=1e-08, weight_decay=0)
else:
raise ValueError("Unknown optimizer.")
if not os.path.isdir(self.exp_dir):
os.makedirs(self.exp_dir)
def iter_one_epoch(self, cross_valid=False):
if cross_valid:
loader = self.cv_loader
self.model.eval()
else:
loader = self.tr_loader
self.model.train()
timer = utils.Timer()
timer.tic()
tot_loss = 0.
tot_ctc_loss = 0.
tot_qua_loss = 0.
tot_token = 0
tot_sequence = 0
n_accu_batch = self.accumulate_grad_batch
tot_iter_num = len(loader)
for niter, data in enumerate(loader):
niter += 1
utts, padded_waveforms, wave_lengths, target_input, target, paddings = data
if cross_valid:
with torch.no_grad():
ctc_loss, qua_loss, ce_loss = self.model(padded_waveforms,
wave_lengths.long(),
target_input.long(),
target.long(),
paddings.long())
else:
ctc_loss, qua_loss, ce_loss = self.model(padded_waveforms,
wave_lengths.long(),
target_input.long(),
target.long(),
paddings.long(),
label_smooth=self.label_smooth)
n_token = torch.sum(1-paddings).float()
tot_token += n_token
n_sequence = len(utts)
tot_sequence += n_sequence
loss = ce_loss.sum()/n_token + \
self.lambda_qua * qua_loss.sum()/n_sequence + \
self.lambda_ctc * ctc_loss.sum()/n_sequence
            tot_ctc_loss += ctc_loss.sum().detach()
            tot_qua_loss += qua_loss.sum().detach()
            tot_loss += ce_loss.sum().detach()
# compute gradients
if not cross_valid:
if n_accu_batch == self.accumulate_grad_batch:
self.optimizer.zero_grad()
loss.backward()
n_accu_batch -= 1
if n_accu_batch == 0 or niter == tot_iter_num:
                    self.step += 1  # to be consistent with metric
clip_grad_norm_(self.model.parameters(), self.grad_max_norm)
self.lr_scheduler.step() # then, update learning rate
self.lr_scheduler.set_lr(self.optimizer, self.init_lr)
self.optimizer.step()
n_accu_batch = self.accumulate_grad_batch
else:
continue
timer.toc()
if niter % self.print_inteval == 0:
            print('Epoch {} | Step {} | Iter {} batch {} \ncur_all_loss: {:.3f} ce_loss: {:.3f} ctc_loss: {:.3f} qua_loss: {:.3f} lr: {:.3e} sent/sec: {:.3f}\n'.format(
self.epoch, self.step, niter, list(padded_waveforms.size()),
loss, tot_loss/tot_token, tot_ctc_loss/tot_sequence, tot_qua_loss/tot_sequence,
list(self.optimizer.param_groups)[0]["lr"], tot_sequence/timer.toc()
), flush=True)
torch.cuda.empty_cache()
time.sleep(2)
return (tot_loss/tot_token).item()
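The CIF objective above combines three differently normalized terms: cross-entropy per token, and the quantity and CTC losses per sequence, weighted by `lambda_qua` and `lambda_ctc`. The combination can be sketched with plain batch-summed scalars (function name illustrative):

```python
def combined_cif_loss(ce_sum, qua_sum, ctc_sum, n_token, n_sequence,
                      lambda_qua, lambda_ctc):
    """CE is normalized per token; the quantity and CTC terms are
    normalized per sequence before weighting, as in CIF_Solver."""
    return (ce_sum / n_token
            + lambda_qua * qua_sum / n_sequence
            + lambda_ctc * ctc_sum / n_sequence)
```

For example, with `ce_sum=10`, `qua_sum=2`, `ctc_sum=4`, 5 tokens, 2 sequences, `lambda_qua=0.5` and `lambda_ctc=1.0`, the objective is `2.0 + 0.5 + 2.0 = 4.5`.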
class Phone2Char_Solver(Solver):
def iter_one_epoch(self, cross_valid=False):
if cross_valid:
loader = self.cv_loader
self.model.eval()
else:
loader = self.tr_loader
self.model.train()
timer = utils.Timer()
timer.tic()
tot_loss = 0.
tot_token = 0
tot_sequence = 0
n_accu_batch = self.accumulate_grad_batch
tot_iter_num = len(loader)
for niter, (utts, data) in enumerate(loader):
niter += 1
xs_in, len_xs, target_in, target, paddings = (i.to(self.device) for i in data)
if cross_valid:
with torch.no_grad():
ce_loss = self.model(xs_in, len_xs, target_in, target, paddings)
# elif cross_valid == 'performance':
# self.model.eval()
# with torch.no_grad():
# encoded, len_encoded = self.model.get_encoded(xs_in, len_xs)
# pred_ids, len_decodeds, scores = self.model.batch_beam_decode(
# encoded, len_encoded, beam_size=self.nbest, max_decode_len=self.maxlen)
#
# self.model.train()
else:
ce_loss = self.model(xs_in, len_xs, target_in, target, paddings,
label_smooth=self.label_smooth)
n_token = torch.sum(1-paddings).float()
tot_token += n_token
n_sequence = len(utts)
tot_sequence += n_sequence
loss = ce_loss.sum()/n_token
            tot_loss += ce_loss.sum().detach()
# compute gradients
if not cross_valid:
if n_accu_batch == self.accumulate_grad_batch:
self.optimizer.zero_grad()
loss.backward()
n_accu_batch -= 1
if n_accu_batch == 0 or niter == tot_iter_num:
                    self.step += 1  # to be consistent with metric
clip_grad_norm_(self.model.parameters(), self.grad_max_norm)
self.lr_scheduler.step() # then, update learning rate
self.lr_scheduler.set_lr(self.optimizer, self.init_lr)
self.optimizer.step()
n_accu_batch = self.accumulate_grad_batch
else:
continue
timer.toc()
if niter % self.print_inteval == 0:
print('Epoch {} | Step {} | Iter {} batch {} \ncur_all_loss: {:.3f} ce_loss: {:.3f} lr: {:.3e} sent/sec: {:.3f}\n'.format(
self.epoch, self.step, niter, list(xs_in.size()),
loss, tot_loss / tot_token, list(self.optimizer.param_groups)[0]["lr"], tot_sequence/timer.toc()
), flush=True)
torch.cuda.empty_cache()
time.sleep(2)
return (tot_loss/tot_token).item()
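The `label_smooth` value passed to the model during training mixes the one-hot target with a uniform distribution over the vocabulary. One common formulation, sketched below with plain log-probabilities (the model's internal implementation may differ):

```python
def smoothed_nll(log_probs, target_idx, smooth):
    """Label-smoothed NLL for a single token: weight (1 - smooth) on the
    target's log-probability plus weight smooth spread uniformly over
    all classes. log_probs is a list of per-class log-probabilities."""
    vocab = len(log_probs)
    uniform_term = sum(log_probs) / vocab
    return -((1.0 - smooth) * log_probs[target_idx] + smooth * uniform_term)
```

With `smooth=0.0` this reduces to plain negative log-likelihood of the target class.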
def train(self):
timer = utils.Timer()
self.best_cvloss = 9e20
if self.cv_loss:
self.best_cvloss = min(self.cv_loss)
while self.epoch < self.num_epoch:
timer.tic()
self.epoch += 1
logging.info("Training")
tr_loss = self.iter_one_epoch()
tr_msg = ("tr loss: {:.4f}").format(tr_loss)
msg = "\n" + "-"*85 + "\n"
msg += "Epoch {} Training Summary:\n{}\n".format(self.epoch, tr_msg)
msg += "-"*85
logging.info(msg)
self.save(os.path.join(self.exp_dir, "ep-{:04d}.pt".format(self.epoch)))
self.save(os.path.join(self.exp_dir, "last.pt"))
logging.info("Validation")
cv_loss = self.iter_one_epoch(cross_valid=True)
if self.best_cvloss > cv_loss:
self.best_cvloss = cv_loss
train_time = timer.toc()
cv_msg = ("cv loss: {:.4f} | best cv loss {:.4f}").format(cv_loss, self.best_cvloss)
msg = "\n" + "-"*85 + "\n"
msg += "Epoch {} Validation Summary:\n{}\n".format(self.epoch, cv_msg)
msg += "Time cost: {:.4f} min".format(train_time/60.)
msg += "\n" + "-"*85 + '\n'
logging.info(msg)
self.tr_loss.append(tr_loss)
self.cv_loss.append(cv_loss)
if self.num_last_ckpt_keep:
utils.cleanup_ckpt(self.exp_dir, self.num_last_ckpt_keep)
class CIF_FC_Solver(CIF_Solver):
def iter_one_epoch(self, cross_valid=False):
if cross_valid:
loader = self.cv_loader
self.model.eval()
else:
loader = self.tr_loader
self.model.train()
timer = utils.Timer()
timer.tic()
tot_loss = 0.
tot_sequence = 0
tot_phone_acoustic = 0
n_accu_batch = self.accumulate_grad_batch
tot_iter_num = len(loader)
for niter, (utts, data_acoustic) in enumerate(loader):
niter += 1
            if not cross_valid and n_accu_batch == self.accumulate_grad_batch:
                self.optimizer.zero_grad()
            feats_acoustic, len_feat_acoustic, phones_acoustic, len_phone_acoustic = \
                (i.to(self.device) for i in data_acoustic)
            # general acoustic loss
            n_sequence = len(utts)
            n_phone_acoustic = len_phone_acoustic.sum()
            tot_phone_acoustic += n_phone_acoustic
            tot_sequence += n_sequence
            with torch.set_grad_enabled(not cross_valid):
                loss_qua_acoustic, loss_ce_phone_acoustic = \
                    self.model(feats_acoustic, len_feat_acoustic, phones_acoustic, len_phone_acoustic,
                               label_smooth=self.label_smooth)
            loss_ce_phone_acoustic = loss_ce_phone_acoustic.sum() / n_phone_acoustic
            loss_qua_acoustic = loss_qua_acoustic.sum() / n_sequence
            loss_acoustic = loss_ce_phone_acoustic + \
                self.lambda_qua * loss_qua_acoustic
            tot_loss += loss_acoustic.detach()
            # compute gradients (training only)
            if not cross_valid:
                loss_acoustic.backward()
                n_accu_batch -= 1
                if n_accu_batch == 0 or niter == tot_iter_num:
                    self.step += 1  # to be consistent with metric
                    clip_grad_norm_(self.model.parameters(), self.grad_max_norm)
                    self.lr_scheduler.step()  # then, update learning rate
                    self.lr_scheduler.set_lr(self.optimizer, self.init_lr)
                    self.optimizer.step()
                    n_accu_batch = self.accumulate_grad_batch
                else:
                    continue
timer.toc()
if niter % self.print_inteval == 0:
print('''Epoch {} | Step {} | Iter {} acoustic {} | lr: {:.3e} | sent/sec: {:.1f}
acoustic cur_all_loss: {:.3f} loss_ce_phone: {:.3f} loss_qua: {:.3f}
'''.format(
self.epoch, self.step, niter, list(feats_acoustic.size()),
list(self.optimizer.param_groups)[0]["lr"], tot_sequence/timer.toc(),
loss_acoustic, loss_ce_phone_acoustic, loss_qua_acoustic
), flush=True)
torch.cuda.empty_cache()
time.sleep(2)
return (tot_loss/tot_phone_acoustic).item()
class CIF_CTC_FC_Solver(CIF_FC_Solver):
def iter_one_epoch(self, cross_valid=False):
if cross_valid:
loader = self.cv_loader
self.model.eval()
else:
loader = self.tr_loader
self.model.train()
timer = utils.Timer()
timer.tic()
tot_loss = 0.
tot_sequence = 0
tot_phone_acoustic = 0
n_accu_batch = self.accumulate_grad_batch
tot_iter_num = len(loader)
for niter, (utts, data_acoustic) in enumerate(loader):
niter += 1
            if not cross_valid and n_accu_batch == self.accumulate_grad_batch:
                self.optimizer.zero_grad()
            feats_acoustic, len_feat_acoustic, phones_acoustic, len_phone_acoustic = \
                (i.to(self.device) for i in data_acoustic)
            # general acoustic loss
            n_sequence = len(utts)
            n_phone_acoustic = len_phone_acoustic.sum()
            tot_phone_acoustic += n_phone_acoustic
            tot_sequence += n_sequence
            with torch.set_grad_enabled(not cross_valid):
                loss_ctc_acoustic, loss_qua_acoustic, loss_ce_phone_acoustic = \
                    self.model(feats_acoustic, len_feat_acoustic, phones_acoustic, len_phone_acoustic,
                               label_smooth=self.label_smooth)
            loss_ce_phone_acoustic = loss_ce_phone_acoustic.sum() / n_phone_acoustic
            loss_ctc_acoustic = loss_ctc_acoustic.sum() / n_sequence
            loss_qua_acoustic = loss_qua_acoustic.sum() / n_sequence
            loss_acoustic = loss_ce_phone_acoustic + \
                self.lambda_qua * loss_qua_acoustic + \
                self.lambda_ctc * loss_ctc_acoustic
            tot_loss += loss_acoustic.detach()
            # compute gradients (training only)
            if not cross_valid:
                loss_acoustic.backward()
                n_accu_batch -= 1
                if n_accu_batch == 0 or niter == tot_iter_num:
                    self.step += 1  # to be consistent with metric
                    clip_grad_norm_(self.model.parameters(), self.grad_max_norm)
                    self.lr_scheduler.step()  # then, update learning rate
                    self.lr_scheduler.set_lr(self.optimizer, self.init_lr)
                    self.optimizer.step()
                    n_accu_batch = self.accumulate_grad_batch
                else:
                    continue
timer.toc()
if niter % self.print_inteval == 0:
print('''Epoch {} | Step {} | Iter {} acoustic {} | lr: {:.3e} | sent/sec: {:.1f}
acoustic cur_all_loss: {:.3f} loss_ce_phone: {:.3f} loss_ctc: {:.3f} loss_qua: {:.3f}
'''.format(
self.epoch, self.step, niter, list(feats_acoustic.size()),
list(self.optimizer.param_groups)[0]["lr"], tot_sequence/timer.toc(),
loss_acoustic, loss_ce_phone_acoustic, loss_ctc_acoustic, loss_qua_acoustic
), flush=True)
torch.cuda.empty_cache()
time.sleep(2)
return (tot_loss/tot_phone_acoustic).item()
class CIF_MIX_Solver(CIF_Solver):
    def __init__(self, model, config, batchiter_acoustic, batchiter_train, batchiter_dev):
self.config = config
self.batchiter_acoustic = batchiter_acoustic
self.batchiter_train = batchiter_train
self.batchiter_dev = batchiter_dev
self.model = model
        if config["multi_gpu"]:
self.model_to_pack = self.model.module
else:
self.model_to_pack = self.model
self.device = torch.device('cuda:0') if torch.cuda.is_available() else torch.device('cpu')
self.num_epoch = config["num_epoch"]
self.exp_dir = config["exp_dir"]
self.print_inteval = config["print_inteval"]
self.accumulate_grad_batch = config["accumulate_grad_batch"]
self.init_lr = config["init_lr"]
self.grad_max_norm = config["grad_max_norm"]
self.label_smooth = config["label_smooth"]
self.lambda_qua = config["lambda_qua"]
self.lambda_ctc = config["lambda_ctc"]
self.num_last_ckpt_keep = None
if "num_last_ckpt_keep" in config:
self.num_last_ckpt_keep = config["num_last_ckpt_keep"]
self.lr_scheduler = schedule.get_scheduler(config["lr_scheduler"])
# Solver state
self.epoch = 0
self.step = 0
self.tr_loss = []
self.cv_loss = []
self.lr = self.init_lr
if config["optimtype"] == "sgd":
self.optimizer = torch.optim.SGD(self.model_to_pack.parameters(), lr=self.lr, momentum=0.9)
elif config["optimtype"] == "adam":
self.optimizer = torch.optim.Adam(self.model_to_pack.parameters(), lr=self.lr,
betas=(0.9, 0.999), eps=1e-08, weight_decay=0)
else:
raise ValueError("Unknown optimizer.")
if not os.path.isdir(self.exp_dir):
os.makedirs(self.exp_dir)
def train(self):
timer = utils.Timer()
self.best_cvloss = 9e20
if self.cv_loss:
self.best_cvloss = min(self.cv_loss)
while self.epoch < self.num_epoch:
timer.tic()
self.epoch += 1
logging.info("Training")
tr_loss = self.iter_train_epoch()
tr_msg = ("tr loss: {:.4f}").format(tr_loss)
msg = "\n" + "-"*85 + "\n"
msg += "Epoch {} Training Summary:\n{}\n".format(self.epoch, tr_msg)
msg += "-"*85
logging.info(msg)
self.save(os.path.join(self.exp_dir, "ep-{:04d}.pt".format(self.epoch)))
self.save(os.path.join(self.exp_dir, "last.pt"))
logging.info("Validation")
cv_loss = self.iter_dev_epoch()
if self.best_cvloss > cv_loss:
self.best_cvloss = cv_loss
train_time = timer.toc()
cv_msg = ("cv loss: {:.4f} | best cv loss {:.4f}").format(cv_loss, self.best_cvloss)
msg = "\n" + "-"*85 + "\n"
msg += "Epoch {} Validation Summary:\n{}\n".format(self.epoch, cv_msg)
msg += "Time cost: {:.4f} min".format(train_time/60.)
msg += "\n" + "-"*85 + '\n'
logging.info(msg)
self.tr_loss.append(tr_loss)
self.cv_loss.append(cv_loss)
if self.num_last_ckpt_keep:
utils.cleanup_ckpt(self.exp_dir, self.num_last_ckpt_keep)
def iter_train_epoch(self):
loader_acoustic = self.batchiter_acoustic
loader = iter(cycle(self.batchiter_train))
self.model.train()
timer = utils.Timer()
timer.tic()
tot_loss = 0.
tot_phone = 0
tot_token = 0
tot_sequence = 0
tot_phone_acoustic = 0
tot_sequence_acoustic = 0
n_accu_batch = self.accumulate_grad_batch
tot_iter_num = len(loader_acoustic)
for niter, ((utts_acoustic, data_acoustic), (utts, data)) in enumerate(zip(loader_acoustic, loader)):
niter += 1
if n_accu_batch == self.accumulate_grad_batch:
self.optimizer.zero_grad()
feats_acoustic, len_feat_acoustic, phones_acoustic, len_phone_acoustic = \
(i.to(self.device) for i in data_acoustic)
feats, len_feat, phones, len_phone, target_in, targets, paddings = \
(i.to(self.device) for i in data)
if niter == 1 and self.epoch == 1:
print('feats_acoustic:\t{}\nlen_feat_acoustic:\t{}\nphones_acoustic:\t{}\nlen_phone_acoustic:\t{}'.format(
feats_acoustic.size(), len_feat_acoustic.size(), phones_acoustic.size(), len_phone_acoustic.size()))
print('feats_acoustic:\n{}\nlen_feat_acoustic:\t{}\nphones_acoustic:\t{}\nlen_phone_acoustic:\t{}'.format(
feats_acoustic[0], len_feat_acoustic[0], phones_acoustic[0], len_phone_acoustic[0]))
                print('feats:\t{}\nlen_feat:\t{}\nphones:\t{}\nlen_phone:\t{}\ntarget_in:\t{}\ntargets:\t{}\npaddings:\t{}'.format(
feats.size(), len_feat.size(), phones.size(), len_phone.size(), target_in.size(), targets.size(), paddings.size()))
print('feats:\n{}\nlen_feat:\t{}\nphones:\t{}\nlen_phone:\t{}\ntarget_in:\t{}\ntargets:\t{}\npaddings:\t{}'.format(
feats[0], len_feat[0], phones[0], len_phone[0], target_in[0], targets[0], paddings[0]))
timer.tic()
# general acoustic loss
n_phone_acoustic = len_phone_acoustic.sum()
tot_phone_acoustic += n_phone_acoustic
n_sequence_acoustic = len(utts_acoustic)
tot_sequence_acoustic += n_sequence_acoustic
loss_ctc_acoustic, loss_qua_acoustic, loss_ce_phone_acoustic = \
self.model(feats_acoustic, len_feat_acoustic, phones_acoustic, len_phone_acoustic,
label_smooth=self.label_smooth)
            # print(timer.toc())  # per-iteration timing probe; disabled to avoid spamming stdout
loss_ce_phone_acoustic = loss_ce_phone_acoustic.sum() / n_phone_acoustic
loss_ctc_acoustic = loss_ctc_acoustic.sum() / n_sequence_acoustic
loss_qua_acoustic = loss_qua_acoustic.sum() / n_sequence_acoustic
loss_acoustic = loss_ce_phone_acoustic + \
self.lambda_qua * loss_qua_acoustic + \
self.lambda_ctc * loss_ctc_acoustic
loss_acoustic.backward()
loss_ctc, loss_qua, loss_ce_phone, loss_ce_target = \
self.model(feats, len_feat, phones, len_phone, target_in, targets, paddings,
label_smooth=self.label_smooth)
            # print(timer.toc())  # per-iteration timing probe; disabled to avoid spamming stdout
n_phone = len_phone.sum()
n_token = torch.sum(1-paddings).float()
tot_phone += n_phone
tot_token += n_token
n_sequence = len(utts)
tot_sequence += n_sequence
loss_ce_phone = loss_ce_phone.sum() / n_phone
loss_ce_target = loss_ce_target.sum() / n_token
loss_ctc = loss_ctc.sum() / n_sequence
loss_qua = loss_qua.sum() / n_sequence
loss = loss_ce_phone + loss_ce_target + \
self.lambda_qua * loss_qua + \
self.lambda_ctc * loss_ctc
loss.backward()
            tot_loss += loss.detach()
n_accu_batch -= 1
if n_accu_batch == 0 or niter == tot_iter_num:
                self.step += 1  # to be consistent with metric
clip_grad_norm_(self.model.parameters(), self.grad_max_norm)
self.lr_scheduler.step() # then, update learning rate
self.lr_scheduler.set_lr(self.optimizer, self.init_lr)
self.optimizer.step()
n_accu_batch = self.accumulate_grad_batch
else:
continue
if niter % self.print_inteval == 0:
print('''Epoch {} | Step {} | acoustic {}/{} {} | target {} | lr: {:.3e} | sent/sec: {:.1f}
acoustic cur_all_loss: {:.3f} loss_ce_phone: {:.3f} loss_ctc: {:.3f} loss_qua: {:.3f}
target cur_all_loss: {:.3f} loss_ce_phone: {:.3f} loss_ctc: {:.3f} loss_qua: {:.3f} loss_ce_char: {:.3f}
'''.format(
self.epoch, self.step, niter, tot_iter_num, list(feats_acoustic.size()), list(feats.size()),
list(self.optimizer.param_groups)[0]["lr"], tot_sequence_acoustic/timer.toc(),
loss_acoustic, loss_ce_phone_acoustic, loss_ctc_acoustic, loss_qua_acoustic,
loss, loss_ce_phone, loss_ctc, loss_qua, loss_ce_target,
), flush=True)
torch.cuda.empty_cache()
time.sleep(2)
return (tot_loss/tot_token).item()
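`iter_train_epoch` pairs every batch from the (larger) acoustic-only loader with one from the supervised loader by wrapping the latter in `itertools.cycle`, so the epoch length is set by the acoustic loader and the smaller loader restarts transparently when exhausted. The pairing can be sketched as (function name illustrative):

```python
from itertools import cycle

def mixed_batches(acoustic_batches, paired_batches):
    """Zip a finite iterable with a cycled one: iteration stops with the
    acoustic iterable, while the paired one wraps around as needed."""
    return list(zip(acoustic_batches, cycle(paired_batches)))
```

Note that `cycle` caches everything from its first pass, so with a real DataLoader every cycle replays the same batch order (and holds references to the cached batches); reconstructing the iterator per epoch avoids that if it matters.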
def iter_dev_epoch(self):
loader = self.batchiter_dev
self.model.eval()
timer = utils.Timer()
timer.tic()
tot_loss = 0.
tot_phone = 0
tot_token = 0
tot_sequence = 0
for niter, (utts, data) in enumerate(loader):
niter += 1
feats, len_feat, phones, len_phone, target_in, target_out, paddings = \
(i.to(self.device) for i in data)
with torch.no_grad():
ctc_loss, qua_loss, ce_phone_loss, ce_target_loss = \
self.model(feats, len_feat, phones, len_phone, target_in, target_out, paddings)
n_phone = len_phone.sum()
n_token = torch.sum(1-paddings).float()
tot_phone += n_phone
tot_token += n_token
n_sequence = len(utts)
tot_sequence += n_sequence
loss = ce_phone_loss.sum()/n_phone + \
ce_target_loss.sum()/n_token + \
self.lambda_qua * qua_loss.sum()/n_sequence + \
self.lambda_ctc * ctc_loss.sum()/n_sequence
tot_loss += loss
timer.toc()
if niter % self.print_inteval == 0:
print('Epoch {} | Step {} | Iter {} batch {} \ncur_all_loss: {:.3f} ce_loss: {:.3f} lr: {:.3e} sent/sec: {:.3f}\n'.format(
self.epoch, self.step, niter, list(feats.size()),
loss, tot_loss / tot_token, list(self.optimizer.param_groups)[0]["lr"], tot_sequence/timer.toc()
), flush=True)
torch.cuda.empty_cache()
time.sleep(2)
return (tot_loss/tot_token).item()
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
import uuid
from msrest.pipeline import ClientRawResponse
from .. import models
class ServiceTasksOperations(object):
"""ServiceTasksOperations operations.
    You should not instantiate this class directly, but create a Client instance that will create it for you and attach it as an attribute.
:param client: Client for service requests.
:param config: Configuration of service client.
:param serializer: An object model serializer.
:param deserializer: An object model deserializer.
:ivar api_version: Version of the API. Constant value: "2018-07-15-preview".
"""
models = models
def __init__(self, client, config, serializer, deserializer):
self._client = client
self._serialize = serializer
self._deserialize = deserializer
self.api_version = "2018-07-15-preview"
self.config = config
def list(
self, group_name, service_name, task_type=None, custom_headers=None, raw=False, **operation_config):
"""Get service level tasks for a service.
The services resource is the top-level resource that represents the
Database Migration Service. This method returns a list of service level
tasks owned by a service resource. Some tasks may have a status of
Unknown, which indicates that an error occurred while querying the
status of that task.
:param group_name: Name of the resource group
:type group_name: str
:param service_name: Name of the service
:type service_name: str
:param task_type: Filter tasks by task type
:type task_type: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: An iterator like instance of ProjectTask
:rtype:
~azure.mgmt.datamigration.models.ProjectTaskPaged[~azure.mgmt.datamigration.models.ProjectTask]
:raises:
:class:`ApiErrorException<azure.mgmt.datamigration.models.ApiErrorException>`
"""
def prepare_request(next_link=None):
if not next_link:
# Construct URL
url = self.list.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'groupName': self._serialize.url("group_name", group_name, 'str'),
'serviceName': self._serialize.url("service_name", service_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
if task_type is not None:
query_parameters['taskType'] = self._serialize.query("task_type", task_type, 'str')
else:
url = next_link
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
return request
def internal_paging(next_link=None):
request = prepare_request(next_link)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
raise models.ApiErrorException(self._deserialize, response)
return response
# Deserialize response
header_dict = None
if raw:
header_dict = {}
deserialized = models.ProjectTaskPaged(internal_paging, self._deserialize.dependencies, header_dict)
return deserialized
list.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{groupName}/providers/Microsoft.DataMigration/services/{serviceName}/serviceTasks'}
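The `list` operation above returns a lazy pager: `internal_paging` issues one request per page, following the `next_link` returned by the service until no pages remain, and `ProjectTaskPaged` deserializes each page on demand. The general next-link pattern can be sketched with a stubbed fetcher (illustrative, not the SDK's actual classes):

```python
def iterate_pages(fetch_page):
    """Generic next-link pagination: fetch_page(next_link) returns a
    (items, next_link) pair, where next_link is None once the service
    reports no further pages. Yields items across all pages lazily."""
    items, next_link = fetch_page(None)  # first call has no next_link
    yield from items
    while next_link is not None:
        items, next_link = fetch_page(next_link)
        yield from items
```

A caller simply iterates the generator; each page is fetched only when the previous one is exhausted.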
def create_or_update(
self, group_name, service_name, task_name, etag=None, properties=None, custom_headers=None, raw=False, **operation_config):
"""Create or update service task.
The service tasks resource is a nested, proxy-only resource
representing work performed by a DMS instance. The PUT method creates a
new service task or updates an existing one, although since service
tasks have no mutable custom properties, there is little reason to
update an existing one.
:param group_name: Name of the resource group
:type group_name: str
:param service_name: Name of the service
:type service_name: str
:param task_name: Name of the Task
:type task_name: str
:param etag: HTTP strong entity tag value. This is ignored if
submitted.
:type etag: str
:param properties: Custom task properties
:type properties:
~azure.mgmt.datamigration.models.ProjectTaskProperties
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: ProjectTask or ClientRawResponse if raw=true
:rtype: ~azure.mgmt.datamigration.models.ProjectTask or
~msrest.pipeline.ClientRawResponse
:raises:
:class:`ApiErrorException<azure.mgmt.datamigration.models.ApiErrorException>`
"""
parameters = models.ProjectTask(etag=etag, properties=properties)
# Construct URL
url = self.create_or_update.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'groupName': self._serialize.url("group_name", group_name, 'str'),
'serviceName': self._serialize.url("service_name", service_name, 'str'),
'taskName': self._serialize.url("task_name", task_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(parameters, 'ProjectTask')
# Construct and send request
request = self._client.put(url, query_parameters, header_parameters, body_content)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200, 201]:
raise models.ApiErrorException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('ProjectTask', response)
if response.status_code == 201:
deserialized = self._deserialize('ProjectTask', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
create_or_update.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{groupName}/providers/Microsoft.DataMigration/services/{serviceName}/serviceTasks/{taskName}'}
    def get(
            self, group_name, service_name, task_name, expand=None, custom_headers=None, raw=False, **operation_config):
        """Get service task information.

        The service tasks resource is a nested, proxy-only resource
        representing work performed by a DMS instance. The GET method retrieves
        information about a service task.

        :param group_name: Name of the resource group
        :type group_name: str
        :param service_name: Name of the service
        :type service_name: str
        :param task_name: Name of the Task
        :type task_name: str
        :param expand: Expand the response
        :type expand: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: ProjectTask or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.datamigration.models.ProjectTask or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ApiErrorException<azure.mgmt.datamigration.models.ApiErrorException>`
        """
        # Construct URL
        url = self.get.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'groupName': self._serialize.url("group_name", group_name, 'str'),
            'serviceName': self._serialize.url("service_name", service_name, 'str'),
            'taskName': self._serialize.url("task_name", task_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        if expand is not None:
            query_parameters['$expand'] = self._serialize.query("expand", expand, 'str')
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.get(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ApiErrorException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('ProjectTask', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{groupName}/providers/Microsoft.DataMigration/services/{serviceName}/serviceTasks/{taskName}'}

    def delete(
            self, group_name, service_name, task_name, delete_running_tasks=None, custom_headers=None, raw=False, **operation_config):
        """Delete service task.

        The service tasks resource is a nested, proxy-only resource
        representing work performed by a DMS instance. The DELETE method
        deletes a service task, canceling it first if it's running.

        :param group_name: Name of the resource group
        :type group_name: str
        :param service_name: Name of the service
        :type service_name: str
        :param task_name: Name of the Task
        :type task_name: str
        :param delete_running_tasks: Delete the resource even if it contains
         running tasks
        :type delete_running_tasks: bool
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: None or ClientRawResponse if raw=true
        :rtype: None or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ApiErrorException<azure.mgmt.datamigration.models.ApiErrorException>`
        """
        # Construct URL
        url = self.delete.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'groupName': self._serialize.url("group_name", group_name, 'str'),
            'serviceName': self._serialize.url("service_name", service_name, 'str'),
            'taskName': self._serialize.url("task_name", task_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        if delete_running_tasks is not None:
            query_parameters['deleteRunningTasks'] = self._serialize.query("delete_running_tasks", delete_running_tasks, 'bool')
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.delete(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 204]:
            raise models.ApiErrorException(self._deserialize, response)

        if raw:
            client_raw_response = ClientRawResponse(None, response)
            return client_raw_response
    delete.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{groupName}/providers/Microsoft.DataMigration/services/{serviceName}/serviceTasks/{taskName}'}

    def update(
            self, group_name, service_name, task_name, etag=None, properties=None, custom_headers=None, raw=False, **operation_config):
        """Update service task.

        The service tasks resource is a nested, proxy-only resource
        representing work performed by a DMS instance. The PATCH method updates
        an existing service task, but since service tasks have no mutable
        custom properties, there is little reason to do so.

        :param group_name: Name of the resource group
        :type group_name: str
        :param service_name: Name of the service
        :type service_name: str
        :param task_name: Name of the Task
        :type task_name: str
        :param etag: HTTP strong entity tag value. This is ignored if
         submitted.
        :type etag: str
        :param properties: Custom task properties
        :type properties:
         ~azure.mgmt.datamigration.models.ProjectTaskProperties
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: ProjectTask or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.datamigration.models.ProjectTask or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ApiErrorException<azure.mgmt.datamigration.models.ApiErrorException>`
        """
        parameters = models.ProjectTask(etag=etag, properties=properties)

        # Construct URL
        url = self.update.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'groupName': self._serialize.url("group_name", group_name, 'str'),
            'serviceName': self._serialize.url("service_name", service_name, 'str'),
            'taskName': self._serialize.url("task_name", task_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(parameters, 'ProjectTask')

        # Construct and send request
        request = self._client.patch(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ApiErrorException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('ProjectTask', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    update.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{groupName}/providers/Microsoft.DataMigration/services/{serviceName}/serviceTasks/{taskName}'}

    def cancel(
            self, group_name, service_name, task_name, custom_headers=None, raw=False, **operation_config):
        """Cancel a service task.

        The service tasks resource is a nested, proxy-only resource
        representing work performed by a DMS instance. This method cancels a
        service task if it's currently queued or running.

        :param group_name: Name of the resource group
        :type group_name: str
        :param service_name: Name of the service
        :type service_name: str
        :param task_name: Name of the Task
        :type task_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: ProjectTask or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.datamigration.models.ProjectTask or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ApiErrorException<azure.mgmt.datamigration.models.ApiErrorException>`
        """
        # Construct URL
        url = self.cancel.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'groupName': self._serialize.url("group_name", group_name, 'str'),
            'serviceName': self._serialize.url("service_name", service_name, 'str'),
            'taskName': self._serialize.url("task_name", task_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.post(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ApiErrorException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('ProjectTask', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    cancel.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{groupName}/providers/Microsoft.DataMigration/services/{serviceName}/serviceTasks/{taskName}/cancel'}
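
    # Usage sketch. Assumption: these operations are exposed as a
    # `service_tasks` attribute on the generated DataMigration management
    # client; the client construction and the resource names below are
    # illustrative only, not taken from this file.
    #
    #     client = DataMigrationServiceClient(credentials, subscription_id)
    #     task = client.service_tasks.get('my-group', 'my-service', 'my-task')
    #     client.service_tasks.cancel('my-group', 'my-service', 'my-task')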
# -*- coding: utf-8 -*-
from south.utils import datetime_utils as datetime
from south.db import db
from south.v2 import SchemaMigration
from django.db import models


class Migration(SchemaMigration):

    def forwards(self, orm):
        # Adding model 'Node'
        db.create_table(u'data_node', (
            ('node_id', self.gf('django.db.models.fields.IntegerField')(primary_key=True, db_index=True)),
            ('name', self.gf('django.db.models.fields.CharField')(max_length=200, null=True, blank=True)),
            ('latitude', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('longitude', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('added_on', self.gf('django.db.models.fields.DateTimeField')(auto_now_add=True, blank=True)),
            ('last_modified', self.gf('django.db.models.fields.DateTimeField')(auto_now=True, auto_now_add=True, blank=True)),
            ('indoor', self.gf('django.db.models.fields.BooleanField')(default=False)),
        ))
        db.send_create_signal(u'data', ['Node'])

        # Adding model 'Latest'
        db.create_table(u'data_latest', (
            ('node_id', self.gf('django.db.models.fields.IntegerField')(primary_key=True, db_index=True)),
            ('name', self.gf('django.db.models.fields.CharField')(max_length=200, null=True, blank=True)),
            ('indoor', self.gf('django.db.models.fields.BooleanField')(default=False)),
            ('latitude', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('longitude', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('temperature', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('rh', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('dylos_bin_1', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('dylos_bin_2', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('dylos_bin_3', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('dylos_bin_4', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_1', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_2', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_3', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_4', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_5', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_6', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_7', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_8', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('added_on', self.gf('django.db.models.fields.DateTimeField')(auto_now_add=True, db_index=True, blank=True)),
            ('last_modified', self.gf('django.db.models.fields.DateTimeField')(auto_now=True, auto_now_add=True, blank=True)),
        ))
        db.send_create_signal(u'data', ['Latest'])

        # Adding model 'DataPoint'
        db.create_table(u'data_datapoint', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('node_id', self.gf('django.db.models.fields.IntegerField')(db_index=True)),
            ('temperature', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('rh', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('dylos_bin_1', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('dylos_bin_2', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('dylos_bin_3', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('dylos_bin_4', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_1', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_2', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_3', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_4', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_5', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_6', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_7', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_8', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('added_on', self.gf('django.db.models.fields.DateTimeField')(auto_now_add=True, db_index=True, blank=True)),
            ('last_modified', self.gf('django.db.models.fields.DateTimeField')(auto_now=True, auto_now_add=True, blank=True)),
            ('reading_time', self.gf('django.db.models.fields.DateTimeField')(null=True, blank=True)),
        ))
        db.send_create_signal(u'data', ['DataPoint'])

        # Adding model 'Dylos'
        db.create_table(u'data_dylos', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('node_id', self.gf('django.db.models.fields.IntegerField')(db_index=True)),
            ('dylos_bin_1', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('dylos_bin_2', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('dylos_bin_3', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('dylos_bin_4', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('added_on', self.gf('django.db.models.fields.DateTimeField')(auto_now_add=True, db_index=True, blank=True)),
            ('last_modified', self.gf('django.db.models.fields.DateTimeField')(auto_now=True, auto_now_add=True, blank=True)),
            ('reading_time', self.gf('django.db.models.fields.DateTimeField')(null=True, blank=True)),
        ))
        db.send_create_signal(u'data', ['Dylos'])

        # Adding model 'Alphasense'
        db.create_table(u'data_alphasense', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('node_id', self.gf('django.db.models.fields.IntegerField')(db_index=True)),
            ('alphasense_1', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_2', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_3', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_4', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_5', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_6', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_7', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('alphasense_8', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('added_on', self.gf('django.db.models.fields.DateTimeField')(auto_now_add=True, db_index=True, blank=True)),
            ('last_modified', self.gf('django.db.models.fields.DateTimeField')(auto_now=True, auto_now_add=True, blank=True)),
            ('reading_time', self.gf('django.db.models.fields.DateTimeField')(null=True, blank=True)),
        ))
        db.send_create_signal(u'data', ['Alphasense'])

        # Adding model 'Met'
        db.create_table(u'data_met', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('node_id', self.gf('django.db.models.fields.IntegerField')(db_index=True)),
            ('temperature', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('rh', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('added_on', self.gf('django.db.models.fields.DateTimeField')(auto_now_add=True, db_index=True, blank=True)),
            ('last_modified', self.gf('django.db.models.fields.DateTimeField')(auto_now=True, auto_now_add=True, blank=True)),
            ('reading_time', self.gf('django.db.models.fields.DateTimeField')(null=True, blank=True)),
        ))
        db.send_create_signal(u'data', ['Met'])

        # Adding model 'AQI'
        db.create_table(u'data_aqi', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('node_id', self.gf('django.db.models.fields.IntegerField')(db_index=True)),
            ('no', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('no2', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('o3', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('co', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('dylos_bin_1', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('dylos_bin_2', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('dylos_bin_3', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('dylos_bin_4', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('mitaqi', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('no_rank', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('no2_rank', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('o3_rank', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('co_rank', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('dylos_bin_1_rank', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('dylos_bin_2_rank', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('dylos_bin_3_rank', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('dylos_bin_4_rank', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('mitaqi_rank', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('added_on', self.gf('django.db.models.fields.DateTimeField')(auto_now_add=True, db_index=True, blank=True)),
            ('last_modified', self.gf('django.db.models.fields.DateTimeField')(auto_now=True, auto_now_add=True, blank=True)),
        ))
        db.send_create_signal(u'data', ['AQI'])

        # Adding model 'SensorDetail'
        db.create_table(u'data_sensordetail', (
            (u'id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('node_id', self.gf('django.db.models.fields.IntegerField')(db_index=True)),
            ('no_serial', self.gf('django.db.models.fields.CharField')(max_length=200, null=True, blank=True)),
            ('o3_serial', self.gf('django.db.models.fields.CharField')(max_length=200, null=True, blank=True)),
            ('no2_serial', self.gf('django.db.models.fields.CharField')(max_length=200, null=True, blank=True)),
            ('co_serial', self.gf('django.db.models.fields.CharField')(max_length=200, null=True, blank=True)),
            ('no_electronic_we_zero', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('no_total_we_zero', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('no_electronic_aux_zero', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('no_total_aux_zero', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('no_electronic_we_sens', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('no_total_we_sens', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('o3_electronic_we_zero', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('o3_total_we_zero', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('o3_electronic_aux_zero', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('o3_total_aux_zero', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('o3_electronic_we_sens', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('o3_total_we_sens', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('no2_electronic_we_zero', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('no2_total_we_zero', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('no2_electronic_aux_zero', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('no2_total_aux_zero', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('no2_electronic_we_sens', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('no2_total_we_sens', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
            ('co_electronic_we_zero', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('co_total_we_zero', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('co_electronic_aux_zero', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('co_total_aux_zero', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('co_electronic_we_sens', self.gf('django.db.models.fields.IntegerField')(null=True, blank=True)),
            ('co_total_we_sens', self.gf('django.db.models.fields.FloatField')(null=True, blank=True)),
        ))
        db.send_create_signal(u'data', ['SensorDetail'])

    def backwards(self, orm):
        # Deleting model 'Node'
        db.delete_table(u'data_node')

        # Deleting model 'Latest'
        db.delete_table(u'data_latest')

        # Deleting model 'DataPoint'
        db.delete_table(u'data_datapoint')

        # Deleting model 'Dylos'
        db.delete_table(u'data_dylos')

        # Deleting model 'Alphasense'
        db.delete_table(u'data_alphasense')

        # Deleting model 'Met'
        db.delete_table(u'data_met')

        # Deleting model 'AQI'
        db.delete_table(u'data_aqi')

        # Deleting model 'SensorDetail'
        db.delete_table(u'data_sensordetail')

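    # Usage sketch. Assumption: standard South workflow for the `data` app;
    # the command below is illustrative of how this migration would be
    # applied (forwards) or reversed (backwards), not taken from this file.
    #
    #     python manage.py migrate data 0001_initial
    #     python manage.py migrate data zero   # runs backwards()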
    models = {
        u'data.alphasense': {
            'Meta': {'object_name': 'Alphasense'},
            'added_on': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'db_index': 'True', 'blank': 'True'}),
            'alphasense_1': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_2': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_3': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_4': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_5': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_6': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_7': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_8': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'last_modified': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'auto_now_add': 'True', 'blank': 'True'}),
            'node_id': ('django.db.models.fields.IntegerField', [], {'db_index': 'True'}),
            'reading_time': ('django.db.models.fields.DateTimeField', [], {'null': 'True', 'blank': 'True'})
        },
        u'data.aqi': {
            'Meta': {'object_name': 'AQI'},
            'added_on': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'db_index': 'True', 'blank': 'True'}),
            'co': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'co_rank': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_1': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_1_rank': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_2': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_2_rank': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_3': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_3_rank': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_4': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_4_rank': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
            u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'last_modified': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'auto_now_add': 'True', 'blank': 'True'}),
            'mitaqi': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'mitaqi_rank': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
            'no': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'no2': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'no2_rank': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
            'no_rank': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
            'node_id': ('django.db.models.fields.IntegerField', [], {'db_index': 'True'}),
            'o3': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'o3_rank': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'})
        },
        u'data.datapoint': {
            'Meta': {'object_name': 'DataPoint'},
            'added_on': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'db_index': 'True', 'blank': 'True'}),
            'alphasense_1': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_2': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_3': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_4': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_5': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_6': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_7': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_8': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_1': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_2': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_3': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_4': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'last_modified': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'auto_now_add': 'True', 'blank': 'True'}),
            'node_id': ('django.db.models.fields.IntegerField', [], {'db_index': 'True'}),
            'reading_time': ('django.db.models.fields.DateTimeField', [], {'null': 'True', 'blank': 'True'}),
            'rh': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'temperature': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'})
        },
        u'data.dylos': {
            'Meta': {'object_name': 'Dylos'},
            'added_on': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'db_index': 'True', 'blank': 'True'}),
            'dylos_bin_1': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_2': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_3': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_4': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'last_modified': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'auto_now_add': 'True', 'blank': 'True'}),
            'node_id': ('django.db.models.fields.IntegerField', [], {'db_index': 'True'}),
            'reading_time': ('django.db.models.fields.DateTimeField', [], {'null': 'True', 'blank': 'True'})
        },
        u'data.latest': {
            'Meta': {'object_name': 'Latest'},
            'added_on': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'db_index': 'True', 'blank': 'True'}),
            'alphasense_1': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_2': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_3': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_4': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_5': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_6': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_7': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'alphasense_8': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_1': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_2': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_3': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'dylos_bin_4': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'indoor': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
            'last_modified': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'auto_now_add': 'True', 'blank': 'True'}),
            'latitude': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'longitude': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'name': ('django.db.models.fields.CharField', [], {'max_length': '200', 'null': 'True', 'blank': 'True'}),
            'node_id': ('django.db.models.fields.IntegerField', [], {'primary_key': 'True', 'db_index': 'True'}),
            'rh': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'temperature': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'})
        },
        u'data.met': {
            'Meta': {'object_name': 'Met'},
            'added_on': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'db_index': 'True', 'blank': 'True'}),
            u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'last_modified': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'auto_now_add': 'True', 'blank': 'True'}),
            'node_id': ('django.db.models.fields.IntegerField', [], {'db_index': 'True'}),
            'reading_time': ('django.db.models.fields.DateTimeField', [], {'null': 'True', 'blank': 'True'}),
            'rh': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'temperature': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'})
        },
        u'data.node': {
            'Meta': {'object_name': 'Node'},
            'added_on': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
            'indoor': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
            'last_modified': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'auto_now_add': 'True', 'blank': 'True'}),
            'latitude': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
            'longitude': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '200', 'null': 'True', 'blank': 'True'}),
'node_id': ('django.db.models.fields.IntegerField', [], {'primary_key': 'True', 'db_index': 'True'})
},
u'data.sensordetail': {
'Meta': {'object_name': 'SensorDetail'},
'co_electronic_aux_zero': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'co_electronic_we_sens': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'co_electronic_we_zero': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'co_serial': ('django.db.models.fields.CharField', [], {'max_length': '200', 'null': 'True', 'blank': 'True'}),
'co_total_aux_zero': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'co_total_we_sens': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
'co_total_we_zero': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'no2_electronic_aux_zero': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'no2_electronic_we_sens': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'no2_electronic_we_zero': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'no2_serial': ('django.db.models.fields.CharField', [], {'max_length': '200', 'null': 'True', 'blank': 'True'}),
'no2_total_aux_zero': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'no2_total_we_sens': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
'no2_total_we_zero': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'no_electronic_aux_zero': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'no_electronic_we_sens': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'no_electronic_we_zero': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'no_serial': ('django.db.models.fields.CharField', [], {'max_length': '200', 'null': 'True', 'blank': 'True'}),
'no_total_aux_zero': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'no_total_we_sens': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
'no_total_we_zero': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'node_id': ('django.db.models.fields.IntegerField', [], {'db_index': 'True'}),
'o3_electronic_aux_zero': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'o3_electronic_we_sens': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'o3_electronic_we_zero': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'o3_serial': ('django.db.models.fields.CharField', [], {'max_length': '200', 'null': 'True', 'blank': 'True'}),
'o3_total_aux_zero': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'}),
'o3_total_we_sens': ('django.db.models.fields.FloatField', [], {'null': 'True', 'blank': 'True'}),
'o3_total_we_zero': ('django.db.models.fields.IntegerField', [], {'null': 'True', 'blank': 'True'})
}
}
complete_apps = ['data']
# ===========================================================================
# archs/obsolete.py  (repo: idiap/archs, license: BSD-3-Clause)
# ===========================================================================

"""
obsolete.py
Copyright (c) 2018 Idiap Research Institute, http://www.idiap.ch/
Written by Weipeng He <weipeng.he@idiap.ch>
"""
import torch
import torch.nn as nn
from base import SerializableModule, ResidualBlock, _act_funcs, ACT_NONE, ACT_SIGMOID, ACT_RELU
# MultiTaskMLP is referenced below but was missing from the imports; assuming it
# is defined alongside the other building blocks in `base`.
from base import MultiTaskMLP

class ResNet(SerializableModule):
def __init__(self, input_size, output_act=ACT_NONE, n_out_map=1,
n_out_hidden=0, output_size=360):
super(ResNet, self).__init__({
'input_size' : input_size,
'output_act' : output_act,
'n_out_map' : n_out_map,
'n_out_hidden' : n_out_hidden,
'output_size' : output_size
})
self.output_size = output_size
ic, x, y = input_size
# conv layers
seq = []
# initial layers (no residual)
# layer 1
oc = 4 * ic
seq.append(nn.Conv2d(ic, oc, kernel_size=(1, 7), stride=(1, 3)))
seq.append(nn.BatchNorm2d(oc))
seq.append(nn.ReLU(inplace=True))
ic = oc
x = x
y = (y - 7 + 3) // 3
# layer 2
oc = 4 * ic
seq.append(nn.Conv2d(ic, oc, kernel_size=(1, 5), stride=(1, 2)))
seq.append(nn.BatchNorm2d(oc))
seq.append(nn.ReLU(inplace=True))
ic = oc
x = x
y = (y - 5 + 2) // 2
# residual layers
seq.append(ResidualBlock(ic, oc))
seq.append(ResidualBlock(ic, oc))
seq.append(ResidualBlock(ic, oc))
seq.append(ResidualBlock(ic, oc))
seq.append(ResidualBlock(ic, oc))
# reduce map size
        oc = ic // 4  # floor division: channel counts must be ints under Python 3
seq.append(nn.Conv2d(ic, oc, kernel_size=(x, 9), stride=(1, 3)))
seq.append(nn.BatchNorm2d(oc))
seq.append(nn.ReLU(inplace=True))
ic = oc
x = 1
y = (y - 9 + 3) // 3
self.cseq = nn.Sequential(*seq)
# output layers
if n_out_hidden == 0:
outseq = [nn.Linear(ic * x * y, output_size * n_out_map)]
if output_act != ACT_NONE:
outseq.append(_act_funcs[output_act]())
self.out = nn.Sequential(*outseq)
else:
hidden_struct = [1000] * n_out_hidden + [output_size]
self.out = MultiTaskMLP(ic * x * y,
[hidden_struct] * n_out_map,
hidden_act=ACT_RELU,
output_act=output_act,
batch_norm=True)
def forward(self, x):
x = self.cseq(x)
x = x.view(x.size(0), -1)
output = self.out(x)
return output.view(x.size(0), -1, self.output_size)

class ResNetv2(SerializableModule):
"""Expected map sizes:
15x168
-> conv(1,5) stride(1,3)
15x55
-> conv(1,3) stride(1,2)
15x27
-> residual modules
15x27
-> conv(3,3) stride(2,2)
7x13
-> conv(3,3) stride(2,2)
3x6
-> fully connected
1000
-> fully connected
360
"""
def __init__(self, input_size, init_ch_factor=4, output_act=ACT_NONE,
n_out_map=1, n_out_hidden=0, output_size=360):
super(ResNetv2, self).__init__({
'input_size' : input_size,
'init_ch_factor' : init_ch_factor,
'output_act' : output_act,
'n_out_map' : n_out_map,
'n_out_hidden' : n_out_hidden,
'output_size' : output_size
})
self.output_size = output_size
ic, x, y = input_size
# conv layers
seq = []
# initial layers (no residual)
# layer 1
oc = init_ch_factor * ic
seq.append(nn.Conv2d(ic, oc, kernel_size=(1, 5), stride=(1, 3)))
seq.append(nn.BatchNorm2d(oc))
seq.append(nn.ReLU(inplace=True))
ic = oc
x = x
y = (y - 5 + 3) // 3
# layer 2
oc = 4 * ic
seq.append(nn.Conv2d(ic, oc, kernel_size=(1, 3), stride=(1, 2)))
seq.append(nn.BatchNorm2d(oc))
seq.append(nn.ReLU(inplace=True))
ic = oc
x = x
y = (y - 3 + 2) // 2
# residual layers
seq.append(ResidualBlock(ic, oc))
seq.append(ResidualBlock(ic, oc))
seq.append(ResidualBlock(ic, oc))
seq.append(ResidualBlock(ic, oc))
seq.append(ResidualBlock(ic, oc))
# reduce map size 1
        oc = ic // 4
seq.append(nn.Conv2d(ic, oc, kernel_size=(3, 3), stride=(2, 2)))
seq.append(nn.BatchNorm2d(oc))
seq.append(nn.ReLU(inplace=True))
ic = oc
x = (x - 3 + 2) // 2
y = (y - 3 + 2) // 2
# reduce map size 2
        oc = ic // 4
seq.append(nn.Conv2d(ic, oc, kernel_size=(3, 3), stride=(2, 2)))
seq.append(nn.BatchNorm2d(oc))
seq.append(nn.ReLU(inplace=True))
ic = oc
x = (x - 3 + 2) // 2
y = (y - 3 + 2) // 2
self.cseq = nn.Sequential(*seq)
# output layers
if n_out_hidden == 0:
outseq = [nn.Linear(ic * x * y, output_size * n_out_map)]
if output_act != ACT_NONE:
outseq.append(_act_funcs[output_act]())
self.out = nn.Sequential(*outseq)
else:
hidden_struct = [1000] * n_out_hidden + [output_size]
self.out = MultiTaskMLP(ic * x * y,
[hidden_struct] * n_out_map,
hidden_act=ACT_RELU,
output_act=output_act,
batch_norm=True)
def forward(self, x):
x = self.cseq(x)
x = x.view(x.size(0), -1)
output = self.out(x)
return output.view(x.size(0), -1, self.output_size)
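The map sizes traced in the ResNetv2 docstring follow the standard output-size formula for an unpadded convolution, floor((n - k) / s) + 1, which the code writes as `(n - k + s) // s`. A quick sketch (the helper name `conv_out` is ours, not part of the module) reproducing the 168 → 55 → 27 → 13 → 6 chain:

```python
def conv_out(n, kernel, stride):
    # Unpadded conv output length: floor((n - kernel) / stride) + 1,
    # which equals (n - kernel + stride) // stride for positive integers.
    return (n - kernel + stride) // stride

y = 168
y = conv_out(y, 5, 3)  # initial layer 1: 168 -> 55
y = conv_out(y, 3, 2)  # initial layer 2: 55 -> 27
# the residual blocks preserve the 15x27 map
y = conv_out(y, 3, 2)  # reduce 1: 27 -> 13
y = conv_out(y, 3, 2)  # reduce 2: 13 -> 6
assert y == 6
```

The same formula applied along the height axis gives 15 → 7 → 3 for the two reducing layers, matching the docstring's 7x13 and 3x6 maps.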

class ResNetCtx32(SerializableModule):
def __init__(self, input_size, output_act=ACT_NONE, n_out_map=1,
n_out_hidden=0):
super(ResNetCtx32, self).__init__({
'input_size' : input_size,
'output_act' : output_act,
'n_out_map' : n_out_map,
'n_out_hidden' : n_out_hidden
})
ic, x, y = input_size
# conv layers
seq = []
# initial layers (no residual)
# layer 1
oc = 4 * ic
seq.append(nn.Conv2d(ic, oc, kernel_size=(1, 7), stride=(1, 3)))
seq.append(nn.BatchNorm2d(oc))
seq.append(nn.ReLU(inplace=True))
ic = oc
x = x
y = (y - 7 + 3) // 3
# layer 2
oc = 4 * ic
seq.append(nn.Conv2d(ic, oc, kernel_size=(2, 5), stride=(2, 2)))
seq.append(nn.BatchNorm2d(oc))
seq.append(nn.ReLU(inplace=True))
ic = oc
x = x // 2
y = (y - 5 + 2) // 2
# residual layers
seq.append(ResidualBlock(ic, oc))
seq.append(ResidualBlock(ic, oc))
seq.append(ResidualBlock(ic, oc))
seq.append(ResidualBlock(ic, oc))
seq.append(ResidualBlock(ic, oc))
# reduce map size
        oc = ic // 8
seq.append(nn.Conv2d(ic, oc, kernel_size=(4, 9), stride=(4, 3)))
seq.append(nn.BatchNorm2d(oc))
seq.append(nn.ReLU(inplace=True))
ic = oc
x = x // 4
y = (y - 9 + 3) // 3
self.cseq = nn.Sequential(*seq)
# output layers
if n_out_hidden == 0:
outseq = [nn.Linear(ic * x * y, 360 * n_out_map)]
if output_act != ACT_NONE:
outseq.append(_act_funcs[output_act]())
self.out = nn.Sequential(*outseq)
else:
hidden_struct = [1000] * n_out_hidden + [360]
self.out = MultiTaskMLP(ic * x * y,
[hidden_struct] * n_out_map,
hidden_act=ACT_RELU,
output_act=output_act,
batch_norm=True)
def forward(self, x):
x = self.cseq(x)
x = x.view(x.size(0), -1)
output = self.out(x)
return output.view(x.size(0), -1, 360)

class MLP_Softmax(SerializableModule):
def __init__(self, layer_size, hidden_act=ACT_SIGMOID,
batch_norm=False):
super(MLP_Softmax, self).__init__({
'layer_size' : layer_size,
'hidden_act' : hidden_act,
'batch_norm' : batch_norm,
})
nlayer = len(layer_size) - 1
seq = []
        for i in range(nlayer):  # xrange is Python 2 only
seq.append(nn.Linear(layer_size[i], layer_size[i+1]))
if i < nlayer - 1:
seq.append(_act_funcs[hidden_act]())
if batch_norm:
seq.append(nn.BatchNorm1d(layer_size[i+1]))
else:
                seq.append(nn.Softmax(dim=1))  # softmax over the class axis of (N, C)
self.seq = nn.Sequential(*seq)
def forward(self, x):
return self.seq(x.view(x.size(0), -1))

class ResNetClassification(SerializableModule):
def __init__(self, input_size, output_size, channel_x=1):
super(ResNetClassification, self).__init__({
'input_size' : input_size,
'output_size' : output_size,
'channel_x' : channel_x
})
x, y = input_size
ic = 1
# conv layers
seq = []
# initial layers (no residual)
# layer 1
oc = 8 * ic * channel_x
seq.append(nn.Conv2d(ic, oc, kernel_size=(1, 7), stride=(1, 3)))
seq.append(nn.BatchNorm2d(oc))
seq.append(nn.ReLU(inplace=True))
ic = oc
x = x
y = (y - 7 + 3) // 3
# layer 2
oc = 8 * ic
seq.append(nn.Conv2d(ic, oc, kernel_size=(1, 5), stride=(1, 2)))
seq.append(nn.BatchNorm2d(oc))
seq.append(nn.ReLU(inplace=True))
ic = oc
x = x
y = (y - 5 + 2) // 2
# residual layers
seq.append(ResidualBlock(ic, oc))
seq.append(ResidualBlock(ic, oc))
seq.append(ResidualBlock(ic, oc))
seq.append(ResidualBlock(ic, oc))
seq.append(ResidualBlock(ic, oc))
# reduce map size
        oc = ic // 4
seq.append(nn.Conv2d(ic, oc, kernel_size=(x, 9), stride=(1, 3)))
seq.append(nn.BatchNorm2d(oc))
seq.append(nn.ReLU(inplace=True))
ic = oc
x = 1
y = (y - 9 + 3) // 3
self.cseq = nn.Sequential(*seq)
# output layers
outseq = [nn.Linear(ic * x * y, output_size)]
        outseq.append(nn.Softmax(dim=1))
self.out = nn.Sequential(*outseq)
def forward(self, x):
x = x.unsqueeze(1)
x = self.cseq(x)
x = x.view(x.size(0), -1)
output = self.out(x)
return output
# -*- Mode: Python -*-
# vi:si:et:sw=4:sts=4:ts=4
# ===========================================================================
# picbackend/views/v2/provider_plan_network_views/__init__.py
# (repo: bbcawodu/careadvisors-backend, license: MIT)
# ===========================================================================

from .carrier_views import *
from .plans_views import *
from .provider_location_views import *
from .provider_network_views import *
# ===========================================================================
# blackdoc/tests/data/doctest.py  (repo: keewis/blackdoc, license: MIT)
# ===========================================================================

from .utils import from_dict
docstring = """ a function to open files
with a very long description
>>> file = open(
... "very_long_filepath",
... mode="a",
... )
>>> file
<_io.TextIOWrapper name='very_long_filepath' mode='w' encoding='UTF-8'>
text after the first example, spanning
multiple lines
>>> file.closed
False
>>> ''' arbitrary triple-quoted string
...
... with a empty continuation line
... '''
>>> def myfunc2(arg1, arg2):
... pass
>>>
>>> if myfunc2(2, 1) is not None:
... print("caught")
>>> a = 2
...
>>> # this is not a block:
"""
lines = docstring.split("\n")
labels = {
1: "none",
2: "none",
3: "none",
4: "none",
(5, 9): "doctest",
9: "doctest",
10: "none",
11: "none",
12: "none",
13: "none",
14: "none",
15: "doctest",
16: "none",
17: "none",
(18, 22): "doctest",
(22, 24): "doctest",
24: "doctest",
25: "none",
(26, 28): "doctest",
(28, 30): "doctest",
30: "doctest",
31: "none",
}
line_ranges, line_labels = from_dict(labels)
expected = """ a function to open files
with a very long description
>>> file = open(
... "very_long_filepath",
... mode="a",
... )
>>> file
<_io.TextIOWrapper name='very_long_filepath' mode='w' encoding='UTF-8'>
text after the first example, spanning
multiple lines
>>> file.closed
False
>>> ''' arbitrary triple-quoted string
...
... with a empty continuation line
... '''
>>> def myfunc2(arg1, arg2):
... pass
...
>>>
>>> if myfunc2(2, 1) is not None:
... print("caught")
...
>>> a = 2
>>> # this is not a block:
"""
expected_lines = expected.split("\n")
expected_labels = {
1: "none",
2: "none",
3: "none",
4: "none",
(5, 9): "doctest",
9: "doctest",
10: "none",
11: "none",
12: "none",
13: "none",
14: "none",
15: "doctest",
16: "none",
17: "none",
(18, 22): "doctest",
(22, 25): "doctest",
25: "doctest",
26: "none",
(27, 30): "doctest",
30: "doctest",
31: "doctest",
32: "none",
}
expected_line_ranges, expected_line_labels = from_dict(expected_labels)
| 19.478992 | 75 | 0.503451 | 273 | 2,318 | 4.194139 | 0.307692 | 0.041921 | 0.055895 | 0.069869 | 0.766812 | 0.731878 | 0.731878 | 0.731878 | 0.731878 | 0.731878 | 0 | 0.066749 | 0.301984 | 2,318 | 118 | 76 | 19.644068 | 0.640915 | 0 | 0 | 0.778846 | 0 | 0 | 0.632442 | 0.039689 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.019231 | 0.009615 | 0 | 0.009615 | 0.019231 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# ===========================================================================
# main.py  (repo: DimensionPrism/Student-Test-Score-Analysis, license: MIT)
# ===========================================================================

import os
import pandas as pd
import numpy as np
import overall_analysis
import subject_analysis
import seperate_school_data
import school_subject_analysis
import school_overall_analysis
# Read per-question item scores
ITEM_SCORE = pd.read_excel("./考试数据/item_score.xlsx").fillna(0)
# Read each student's total scores
SCHOOL_NAME = []
STUDENT_TOTAL = pd.read_excel("./考试数据/报名信息-学生成绩.xlsx").fillna(0).replace(['弘金国际地学校'], '弘金地国际学校')
[SCHOOL_NAME.append(x) for x in STUDENT_TOTAL['学校'] if x not in SCHOOL_NAME]
SCHOOL_NAME.append("全体")
# Read per-subject score sheets and per-question full marks
def load_subject_scores(subject_name):
    """Read one subject's per-question scores, fix the misspelled school name,
    back-fill students missing from the sheet, and read the per-question full
    marks. DataFrame.append was removed in pandas 2.0, so missing rows are
    added with pd.concat instead."""
    total = (pd.read_excel("./考试数据/小题分/小题分({}).xls".format(subject_name))
             .fillna(0)
             .replace(['弘金国际地学校'], '弘金地国际学校'))
    missing = [
        {'学号': student['学号'], '考号': student['考号'], '姓名': student['姓名'],
         '班级': student['班级'], '学校': student['学校']}
        for _, student in STUDENT_TOTAL.iterrows()
        if student['学号'] not in total['学号'].values
    ]
    if missing:
        total = pd.concat([total, pd.DataFrame(missing)], ignore_index=True).fillna(0)
    prob_total = pd.read_excel("./考试数据/小题满分.xlsx", sheet_name=subject_name).fillna(0)
    return total, prob_total


CHIN_TOTAL, CHIN_PROB_TOTAL = load_subject_scores("语文")
MATH_TOTAL, MATH_PROB_TOTAL = load_subject_scores("数学")
ENGL_TOTAL, ENGL_PROB_TOTAL = load_subject_scores("英语")
HIST_TOTAL, HIST_PROB_TOTAL = load_subject_scores("历史")
PHYS_TOTAL, PHYS_PROB_TOTAL = load_subject_scores("物理")
CHEM_TOTAL, CHEM_PROB_TOTAL = load_subject_scores("化学")
POLI_TOTAL, POLI_PROB_TOTAL = load_subject_scores("道德与法治")
SUBJECT_LIST = ["语文", "数学", "英语", "历史", "物理", "化学", "道德与法治", "总分"]
SUBJECT_FULL_SCORE_LIST = [120, 100, 100, 100, 100, 100, 100, 720]
SUBJECT_TOTAL_LIST = [
CHIN_TOTAL,
MATH_TOTAL,
ENGL_TOTAL,
HIST_TOTAL,
PHYS_TOTAL,
CHEM_TOTAL,
POLI_TOTAL,
]
SUBJECT_PROB_TOTAL_LIST = [
CHIN_PROB_TOTAL,
MATH_PROB_TOTAL,
ENGL_PROB_TOTAL,
HIST_PROB_TOTAL,
PHYS_PROB_TOTAL,
CHEM_PROB_TOTAL,
POLI_PROB_TOTAL,
]
SUBJECT_TOTAL_SCORE = pd.concat([
STUDENT_TOTAL[['学校', '分数']],
STUDENT_TOTAL['分数.1'],
STUDENT_TOTAL['分数.2'],
STUDENT_TOTAL['分数.3'],
STUDENT_TOTAL['分数.4'],
STUDENT_TOTAL['分数.5'],
STUDENT_TOTAL['分数.6'],
STUDENT_TOTAL['分数.7']], axis=1)

def ability_analysis(item_score, subject_name, subject_total, subject_prob_total):
ability_name_list = list(subject_prob_total.columns[4:-2])
prob_name_list = list(subject_total.columns[8:])
this_subject = item_score.groupby('sskm_mc').get_group(subject_name)
ability_score_rate = {"能力层次": [], "满分": [], "平均分": [], "标准差": [], "得分率": []}
for ability_name in ability_name_list:
ability_total = subject_prob_total[ability_name].sum()
ability_ave = 0
ability_prob = []
for prob_index in range(len(subject_prob_total.index)):
this_prob_total = subject_prob_total.groupby('sj_tid').get_group(prob_index + 1)
target = int(this_prob_total[ability_name])
total = int(this_prob_total['score'])
if target != 0:
prob_name = prob_name_list[prob_index]
this_prob = subject_total[prob_name]
this_prob_item_score = this_subject.groupby('sj_tid').get_group(prob_index + 1)
ability_prob.append(this_prob_item_score)
coefficient = target / total
this_prob_ave = this_prob.mean()
this_prob_ability_ave = this_prob_ave * coefficient
ability_ave += this_prob_ability_ave
ability_score_rate['能力层次'].append(ability_name)
if len(ability_prob) != 0:
ability_concated = pd.concat(ability_prob)
ability_std = ability_concated['score'].std()
else:
ability_std = 0
if ability_ave == 0:
score_rate = 0
else:
score_rate = np.round(ability_ave / ability_total * 100, 2)
ability_score_rate['满分'].append(ability_total)
ability_score_rate['平均分'].append(np.round(ability_ave, 2))
ability_score_rate['标准差'].append(np.round(ability_std, 2))
ability_score_rate['得分率'].append(score_rate)
ability_score_rate_df = pd.DataFrame(ability_score_rate)
return ability_score_rate_df
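`ability_analysis` attributes to each ability a fraction of every question's class average, weighted by `coefficient = target / total` (marks tagged with the ability over the question's full marks), then divides the weighted sum by the ability's full score. The core arithmetic, on made-up numbers:

```python
# Each entry: (question class average, marks tagged with this ability, question full marks)
probs = [(6.0, 4, 10), (3.5, 5, 5)]

ability_total = sum(target for _, target, _ in probs)  # 4 + 5 = 9
ability_ave = sum(ave * target / total for ave, target, total in probs)
# 6.0 * (4 / 10) + 3.5 * (5 / 5) = 2.4 + 3.5 = 5.9
score_rate = round(ability_ave / ability_total * 100, 2)
assert score_rate == 65.56
```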

def knowledge_field_and_point(item_score, subject_name, subject_total, subject_prob_total):
this_subject = item_score.groupby("sskm_mc").get_group(subject_name)
knowledge_field_list = list(subject_prob_total['知识模块'])
knowledge_field_list_compact = []
[knowledge_field_list_compact.append(x)
for x in knowledge_field_list
if x not in knowledge_field_list_compact and x != 0]
knowledge_point_list = list(subject_prob_total['知识点'])
knowledge_point_list_compact = []
[knowledge_point_list_compact.append(x)
for x in knowledge_point_list
if x not in knowledge_point_list_compact and x != 0]
knowledge_field_dict = {}
knowledge_point_dict = {}
concated_field = {}
concated_point = {}
if len(knowledge_field_list_compact) != 0:
knowledge_field_dict["知识模块"] = knowledge_field_list_compact
knowledge_field_dict["知识模块满分"] = []
knowledge_field_dict["知识模块平均分"] = []
knowledge_field_dict["知识模块标准差"] = []
knowledge_field_dict["知识模块得分率"] = []
for length in range(len(knowledge_field_list_compact)):
knowledge_field_dict["知识模块满分"].append(0)
knowledge_field_dict["知识模块平均分"].append(0)
knowledge_field_dict["知识模块标准差"].append(0)
knowledge_field_dict["知识模块得分率"].append(0)
prob_name_list = list(subject_total.columns[8:])
for index in range(len(subject_prob_total)):
prob_name = prob_name_list[index]
this_prob_total = subject_prob_total.groupby('sj_tid').get_group(index + 1)
this_prob_item_score = this_subject.groupby('sj_tid').get_group(index+1)
knowledge_field_name = this_prob_total['知识模块'].values
if knowledge_field_name != 0:
knowledge_field_index = knowledge_field_list_compact.index(knowledge_field_name)
if knowledge_field_name[0] not in concated_field:
concated_field[knowledge_field_name[0]] = []
concated_field[knowledge_field_name[0]].append(this_prob_item_score)
else:
concated_field[knowledge_field_name[0]].append(this_prob_item_score)
prob_ave = subject_total[prob_name].mean()
total = int(this_prob_total['score'].values)
knowledge_field_dict["知识模块满分"][knowledge_field_index] += total
if prob_ave != 0:
knowledge_field_dict["知识模块平均分"][knowledge_field_index] += prob_ave
for x in range(len(knowledge_field_list_compact)):
this_ave = knowledge_field_dict["知识模块平均分"][x]
this_total = knowledge_field_dict["知识模块满分"][x]
this_field_all = pd.concat(concated_field[knowledge_field_list_compact[x]])
knowledge_field_dict["知识模块标准差"][x] = np.round(this_field_all['score'].std(), 2)
knowledge_field_dict["知识模块得分率"][x] = np.round(this_ave / this_total * 100, 2)
knowledge_field_df = pd.DataFrame(knowledge_field_dict)
if len(knowledge_point_list_compact) != 0:
knowledge_point_dict["知识点"] = knowledge_point_list_compact
knowledge_point_dict["知识点满分"] = []
knowledge_point_dict["知识点平均分"] = []
knowledge_point_dict["知识点标准差"] = []
knowledge_point_dict["知识点得分率"] = []
for length in range(len(knowledge_point_list_compact)):
knowledge_point_dict["知识点满分"].append(0)
knowledge_point_dict["知识点平均分"].append(0)
knowledge_point_dict["知识点标准差"].append(0)
knowledge_point_dict["知识点得分率"].append(0)
prob_name_list = list(subject_total.columns[8:])
for index in range(len(subject_prob_total)):
prob_name = prob_name_list[index]
this_prob_total = subject_prob_total.groupby('sj_tid').get_group(index + 1)
this_prob_item_score = this_subject.groupby('sj_tid').get_group(index + 1)
knowledge_point_name = this_prob_total['知识点'].values
if knowledge_point_name != 0:
knowledge_point_index = knowledge_point_list_compact.index(knowledge_point_name)
if knowledge_point_name[0] not in concated_point:
concated_point[knowledge_point_name[0]] = []
concated_point[knowledge_point_name[0]].append(this_prob_item_score)
else:
concated_point[knowledge_point_name[0]].append(this_prob_item_score)
prob_ave = subject_total[prob_name].mean()
total = int(this_prob_total['score'].values)
knowledge_point_dict["知识点满分"][knowledge_point_index] += total
if prob_ave != 0:
knowledge_point_dict["知识点平均分"][knowledge_point_index] += prob_ave
for x in range(len(knowledge_point_list_compact)):
this_ave = knowledge_point_dict["知识点平均分"][x]
this_total = knowledge_point_dict["知识点满分"][x]
this_point_all = pd.concat(concated_point[knowledge_point_list_compact[x]])
knowledge_point_dict["知识点标准差"][x] = np.round(this_point_all['score'].std(), 2)
knowledge_point_dict["知识点得分率"][x] = np.round(this_ave / this_total * 100, 2)
knowledge_point_df = pd.DataFrame(knowledge_point_dict)
output_dict = {"知识模块分析.xlsx": knowledge_field_df, "知识点分析.xlsx": knowledge_point_df}
return output_dict

def subject(path):
print("学科信息计算中......\n")
for x in range(0, 7):
subject_name = SUBJECT_LIST[x]
subject_full_score = SUBJECT_FULL_SCORE_LIST[x]
subject_total = SUBJECT_TOTAL_LIST[x]
subject_prob_total = SUBJECT_PROB_TOTAL_LIST[x]
this_subject = subject_analysis.Analysis(subject_name, SCHOOL_NAME, subject_full_score,
subject_prob_total, subject_total, ITEM_SCORE,
STUDENT_TOTAL, SUBJECT_TOTAL_SCORE, x)
problem_analysis = this_subject.problem_analysis()
print(subject_name + "科目报表计算完成")
this_subject_export_path = path + "{}/".format(subject_name)
if not os.path.exists(this_subject_export_path):
os.makedirs(this_subject_export_path)
for each in problem_analysis.keys():
excel_name = "{}.xlsx".format(each)
this_excel = problem_analysis[each]
with pd.ExcelWriter(this_subject_export_path + excel_name, engine='xlsxwriter') as writer:
                if isinstance(this_excel, dict):
for content in this_excel.keys():
this_excel[content].to_excel(writer, index=False, sheet_name=content)
if each == "2-1-3-2试题难度分布":
if content == "难度分布":
workbook = writer.book
worksheet = writer.sheets[content]
chart = workbook.add_chart({'type': 'column'})
max_row = len(this_excel[content])
col_x = this_excel[content].columns.get_loc('难度范围')
col_y = this_excel[content].columns.get_loc("比例")
chart.add_series({
'name': '总分占比',
'categories': [content, 1, col_x, max_row, col_x],
'values': [content, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "难度比例图"})
chart.set_x_axis({'name': '难度范围', 'text_axis': True})
chart.set_y_axis({'name': '比例'})
chart.set_size({'width': 720, 'height': 576})
worksheet.insert_chart('A8', chart)
if each == "2-1-3-3试题区分度分布":
if content == "区分度分布":
workbook = writer.book
worksheet = writer.sheets[content]
chart = workbook.add_chart({'type': 'column'})
max_row = len(this_excel[content])
col_x = this_excel[content].columns.get_loc('区分度范围')
col_y = this_excel[content].columns.get_loc("比例")
chart.add_series({
'name': '总分占比',
'categories': [content, 1, col_x, max_row, col_x],
'values': [content, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "区分度比例图"})
chart.set_x_axis({'name': '区分度范围', 'text_axis': True})
chart.set_y_axis({'name': '比例'})
chart.set_size({'width': 720, 'height': 576})
worksheet.insert_chart('A8', chart)
if each == "2-1-3-9全卷及各题难度曲线图":
workbook = writer.book
worksheet = writer.sheets[content]
chart = workbook.add_chart({'type': 'line'})
max_row = len(this_excel[content])
col_x = this_excel[content].columns.get_loc('分数段')
col_y = this_excel[content].columns.get_loc('难度值')
chart.add_series({
'name': '难度值',
'categories': [content, 1, col_x, max_row, col_x],
'values': [content, 1, col_y, max_row, col_y],
'marker': {
'type': 'square',
'size': 4,
'border': {'color': 'black'},
'fill': {'color': 'red'},
},
'data_labels': {'value': True, 'position': 'left'},
})
if content == "全卷":
chart.set_title({'name': content + '难度曲线'})
else:
chart.set_title({'name': '第' + str(content) + '题难度曲线'})
chart.set_x_axis({'name': '分数段'})
chart.set_y_axis({'name': '难度值', 'min': 0, 'max': 1})
chart.set_size({'width': 720, 'height': 576})
worksheet.insert_chart('D2', chart)
else:
problem_analysis[each].to_excel(writer, index=False, sheet_name=each)
if each == "2-1-3-1各题目指标":
workbook = writer.book
worksheet = writer.sheets[each]
# 难度与区分度分布图
chart = workbook.add_chart({'type': 'scatter'})
max_row = len(problem_analysis[each])
col_x = problem_analysis[each].columns.get_loc('难度')
col_y = problem_analysis[each].columns.get_loc('区分度')
prob_list = problem_analysis[each]["题号"]
custom_labels = []
for prob in prob_list:
custom_labels.append({'value': prob})
chart.add_series({
'name': '题号',
'categories': [each, 1, col_x, max_row, col_x],
'values': [each, 1, col_y, max_row, col_y],
'marker': {'type': 'circle', 'size': 4, 'fill': {'color': 'orange'}},
'data_labels': {'value': True, 'custom': custom_labels},
})
chart.set_title({'name': '题目难度与区分度分布图'})
chart.set_x_axis({'name': '难度', 'min': 0, 'max': 1})
chart.set_y_axis({'name': '区分度', 'min': -1, 'max': 1})
chart.set_size({'width': 920, 'height': 860})
worksheet.insert_chart('A30', chart)
# 预估难度与实测难度图
chart = workbook.add_chart({'type': 'line'})
max_row = len(problem_analysis[each])
col_x = problem_analysis[each].columns.get_loc('题号')
col_y_estimated_difficulty = problem_analysis[each].columns.get_loc('预估难度')
col_y_actual_difficulty = problem_analysis[each].columns.get_loc('难度')
chart.add_series({
'name': '预估难度',
'categories': [each, 1, col_x, max_row, col_x],
'values': [each, 1, col_y_estimated_difficulty, max_row, col_y_estimated_difficulty],
'data_labels': {'value': True},
})
chart.add_series({
'name': '实测难度',
'categories': [each, 1, col_x, max_row, col_x],
'values': [each, 1, col_y_actual_difficulty, max_row, col_y_actual_difficulty],
'data_labels': {'value': True},
})
chart.set_title({'name': subject_name + '预估难度与实测难度对比图'})
chart.set_x_axis({'name': '题号', 'text_axis': True})
chart.set_y_axis({'name': '难度值', 'min': 0, 'max': 1})
chart.set_size({'width': 920, 'height': 860})
worksheet.insert_chart('P30', chart)
if each == "2-1-4-1各学校得分情况":
problem_analysis[each].to_excel(writer, index=False, sheet_name=each)
workbook = writer.book
worksheet = writer.sheets[each]
# 平均分
chart = workbook.add_chart({'type': 'column'})
max_row = len(problem_analysis[each])
col_x = problem_analysis[each].columns.get_loc('学校')
col_y = problem_analysis[each].columns.get_loc('全体平均分')
chart.add_series({
'name': '平均分',
'categories': [each, 1, col_x, max_row, col_x],
'values': [each, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "各学校平均分分布图"})
chart.set_x_axis({'name': '学校', 'text_axis': True})
chart.set_y_axis({'name': '平均分'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('A20', chart)
# 得分率
chart = workbook.add_chart({'type': 'column'})
max_row = len(problem_analysis[each])
col_x = problem_analysis[each].columns.get_loc('学校')
col_y = problem_analysis[each].columns.get_loc('得分率')
chart.add_series({
'name': '得分率',
'categories': [each, 1, col_x, max_row, col_x],
'values': [each, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "各学校得分率分布图"})
chart.set_x_axis({'name': '学校', 'text_axis': True})
chart.set_y_axis({'name': '得分率'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('M20', chart)
# 超优率
chart = workbook.add_chart({'type': 'column'})
max_row = len(problem_analysis[each])
col_x = problem_analysis[each].columns.get_loc('学校')
col_y = problem_analysis[each].columns.get_loc('超优率')
chart.add_series({
'name': '超优率',
'categories': [each, 1, col_x, max_row, col_x],
'values': [each, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "各学校超优率分布图"})
chart.set_x_axis({'name': '学校', 'text_axis': True})
chart.set_y_axis({'name': '超优率'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('A50', chart)
# 优秀率
chart = workbook.add_chart({'type': 'column'})
max_row = len(problem_analysis[each])
col_x = problem_analysis[each].columns.get_loc('学校')
col_y = problem_analysis[each].columns.get_loc('优秀率')
chart.add_series({
'name': '优秀率',
'categories': [each, 1, col_x, max_row, col_x],
'values': [each, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "各学校优秀率分布图"})
chart.set_x_axis({'name': '学校', 'text_axis': True})
chart.set_y_axis({'name': '优秀率'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('M50', chart)
# 及格率
chart = workbook.add_chart({'type': 'column'})
max_row = len(problem_analysis[each])
col_x = problem_analysis[each].columns.get_loc('学校')
col_y = problem_analysis[each].columns.get_loc('及格率')
chart.add_series({
'name': '及格率',
'categories': [each, 1, col_x, max_row, col_x],
'values': [each, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "各学校及格率分布图"})
chart.set_x_axis({'name': '学校', 'text_axis': True})
chart.set_y_axis({'name': '及格率'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('A80', chart)
# 低分率
chart = workbook.add_chart({'type': 'column'})
max_row = len(problem_analysis[each])
col_x = problem_analysis[each].columns.get_loc('学校')
col_y = problem_analysis[each].columns.get_loc('低分率')
chart.add_series({
'name': '低分率',
'categories': [each, 1, col_x, max_row, col_x],
'values': [each, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "各学校低分率分布图"})
chart.set_x_axis({'name': '学校', 'text_axis': True})
chart.set_y_axis({'name': '低分率'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('M80', chart)
print("学科报表输出完成\n")
print("学科知识领域与能力层次分析中......")
for x in range(6):
subject_name = SUBJECT_LIST[x]
print("当前科目:{}".format(subject_name))
subject_total = SUBJECT_TOTAL_LIST[x]
subject_prob_total = SUBJECT_PROB_TOTAL_LIST[x]
ability_analysis_result = ability_analysis(ITEM_SCORE, subject_name, subject_total, subject_prob_total)
ability_and_knowledge_output_path = "{}/{}/{}".format(path, subject_name, subject_name)
with pd.ExcelWriter(ability_and_knowledge_output_path+'能力层次得分率.xlsx', engine='xlsxwriter') as writer:
ability_analysis_result.to_excel(writer, index=False, sheet_name=subject_name+"能力层次得分率")
workbook = writer.book
worksheet = writer.sheets[subject_name+"能力层次得分率"]
chart = workbook.add_chart({'type': 'column'})
max_row = len(ability_analysis_result)
col_x = ability_analysis_result.columns.get_loc('能力层次')
col_y = ability_analysis_result.columns.get_loc("得分率")
chart.add_series({
'name': '能力层次得分率',
'categories': [subject_name+"能力层次得分率", 1, col_x, max_row, col_x],
'values': [subject_name+"能力层次得分率", 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': subject_name + "能力层次得分率图"})
chart.set_x_axis({'name': '能力层次', 'text_axis': True})
chart.set_y_axis({'name': '得分率'})
chart.set_size({'width': 720, 'height': 576})
worksheet.insert_chart('A7', chart)
knowledge_analysis = knowledge_field_and_point(ITEM_SCORE, subject_name, subject_total, subject_prob_total)
for excel_name in knowledge_analysis.keys():
with pd.ExcelWriter(ability_and_knowledge_output_path+excel_name, engine='xlsxwriter') as writer:
knowledge_analysis[excel_name].to_excel(writer, index=False, sheet_name=excel_name[:-5])
if excel_name == "知识模块分析.xlsx":
if len(knowledge_analysis[excel_name]) != 0:
workbook = writer.book
worksheet = writer.sheets["知识模块分析"]
chart = workbook.add_chart({'type': 'column'})
max_row = len(knowledge_analysis[excel_name])
col_x = knowledge_analysis[excel_name].columns.get_loc('知识模块')
col_y = knowledge_analysis[excel_name].columns.get_loc("知识模块得分率")
chart.add_series({
'name': '知识模块得分率',
'categories': ["知识模块分析", 1, col_x, max_row, col_x],
'values': ["知识模块分析", 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': subject_name + "知识模块得分率图"})
chart.set_x_axis({'name': '知识模块', 'text_axis': True})
chart.set_y_axis({'name': '得分率'})
chart.set_size({'width': 720, 'height': 576})
worksheet.insert_chart('F2', chart)
if excel_name == "知识点分析.xlsx":
if len(knowledge_analysis[excel_name]) != 0:
workbook = writer.book
worksheet = writer.sheets["知识点分析"]
chart = workbook.add_chart({'type': 'column'})
max_row = len(knowledge_analysis[excel_name])
col_x = knowledge_analysis[excel_name].columns.get_loc('知识点')
col_y = knowledge_analysis[excel_name].columns.get_loc("知识点得分率")
chart.add_series({
'name': '知识点得分率',
'categories': ["知识点分析", 1, col_x, max_row, col_x],
'values': ["知识点分析", 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': subject_name + "知识点得分率图"})
chart.set_x_axis({'name': '知识点', 'text_axis': True})
chart.set_y_axis({'name': '得分率'})
chart.set_size({'width': 720, 'height': 576})
worksheet.insert_chart('F2', chart)
print("输出完成")
print("学科知识领域与能力层次分析完成\n")
def overall(path):
print("全区信息计算中......\n")
overall = overall_analysis.Analysis(SCHOOL_NAME, SUBJECT_LIST, SUBJECT_FULL_SCORE_LIST,
SUBJECT_TOTAL_LIST, SUBJECT_TOTAL_SCORE)
overall_analysis_output = overall.overall_output()
print("全区报表计算完成")
if not os.path.exists(path):
os.makedirs(path)
for each in overall_analysis_output.keys():
excel_name = "{}.xlsx".format(each)
this_excel = overall_analysis_output[each]
if each == "2-2-2总分及各科上线的有效分数线":
effective_score_analysis = overall.subject_effective_score_analysis(this_excel)
for subject in effective_score_analysis.keys():
score_line_path = "./OUTPUT/学科/{}/".format(subject)
subject_dict = effective_score_analysis[subject]
for goals in subject_dict.keys():
with pd.ExcelWriter(score_line_path + goals, engine='xlsxwriter') as writer:
subject_dict[goals].to_excel(writer, index=False, sheet_name=goals)
workbook = writer.book
worksheet = writer.sheets[goals]
# 总分上线
chart = workbook.add_chart({'type': 'column'})
max_row = len(subject_dict[goals]) - 1
col_x = subject_dict[goals].columns.get_loc('学校')
col_y = subject_dict[goals].columns.get_loc("总分上线人数")
chart.add_series({
'name': '总分上线人数',
'categories': [goals, 1, col_x, max_row, col_x],
'values': [goals, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "总分上线各校分布图"})
chart.set_x_axis({'name': '学校', 'text_axis': True})
chart.set_y_axis({'name': '总分上线人数'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('G2', chart)
# 单上线
chart = workbook.add_chart({'type': 'column'})
max_row = len(subject_dict[goals]) - 1
col_x = subject_dict[goals].columns.get_loc('学校')
col_y = subject_dict[goals].columns.get_loc(subject+'单上线')
chart.add_series({
'name': '单上线人数',
'categories': [goals, 1, col_x, max_row, col_x],
'values': [goals, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': subject + "单科上线各校分布图"})
chart.set_x_axis({'name': '学校', 'text_axis': True})
chart.set_y_axis({'name': '单上线人数'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('A30', chart)
# 双上线
chart = workbook.add_chart({'type': 'column'})
max_row = len(subject_dict[goals]) - 1
col_x = subject_dict[goals].columns.get_loc('学校')
col_y = subject_dict[goals].columns.get_loc(subject+'双上线')
chart.add_series({
'name': '双上线人数',
'categories': [goals, 1, col_x, max_row, col_x],
'values': [goals, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': subject + "双上线各校分布图"})
chart.set_x_axis({'name': '学校', 'text_axis': True})
chart.set_y_axis({'name': '双上线人数'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('M30', chart)
# M1
chart = workbook.add_chart({'type': 'column'})
max_row = len(subject_dict[goals]) - 1
col_x = subject_dict[goals].columns.get_loc('学校')
col_y = subject_dict[goals].columns.get_loc(subject + 'M1')
chart.add_series({
'name': 'M1人数',
'categories': [goals, 1, col_x, max_row, col_x],
'values': [goals, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': subject + "M1各校分布图"})
chart.set_x_axis({'name': '学校', 'text_axis': True})
chart.set_y_axis({'name': 'M1人数'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('A60', chart)
# M2
chart = workbook.add_chart({'type': 'column'})
max_row = len(subject_dict[goals]) - 1
col_x = subject_dict[goals].columns.get_loc('学校')
col_y = subject_dict[goals].columns.get_loc(subject + 'M2')
chart.add_series({
'name': 'M2人数',
'categories': [goals, 1, col_x, max_row, col_x],
'values': [goals, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': subject + "M2各校分布图"})
chart.set_x_axis({'name': '学校', 'text_axis': True})
chart.set_y_axis({'name': 'M2人数'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('M60', chart)
with pd.ExcelWriter(path + excel_name, engine='xlsxwriter') as writer:
if isinstance(this_excel, dict):
for content in this_excel.keys():
this_excel[content].to_excel(writer, index=False, sheet_name=content)
else:
overall_analysis_output[each].to_excel(writer, index=False, sheet_name=each)
print("全区报表输出完成\n")
def school_output(path):
print("各学校学科信息计算中......\n")
for school in SCHOOL_NAME[:-1]:
read_path = "./考试数据/学校数据/{}/".format(school)
this_school_export_path = path + "{}/".format(school)
item_score = pd.read_excel(read_path + "item_score.xlsx").fillna(0)
school_student_total = pd.read_excel(read_path + "{}-学生信息.xlsx".format(school)).fillna(0)
school_subject_total_score = pd.concat([
school_student_total[['班级', '分数']],
school_student_total['分数.1'],
school_student_total['分数.2'],
school_student_total['分数.3'],
school_student_total['分数.4'],
school_student_total['分数.5'],
school_student_total['分数.6'],
school_student_total['分数.7']], axis=1)
Class_name = list(dict.fromkeys(school_student_total['班级']))
Class_name.append("全校")
print("当前学校:" + school)
school_subject_total_list = []
for index in range(0, 7):
subject_name = SUBJECT_LIST[index]
subject_full_score = SUBJECT_FULL_SCORE_LIST[index]
subject_total = pd.read_excel(read_path + "/小题分/小题分({}).xls".format(subject_name)).fillna(0)
school_subject_total_list.append(subject_total)
subject_prob_total = SUBJECT_PROB_TOTAL_LIST[index]
this_subject = school_subject_analysis.Analysis(subject_name, Class_name, subject_full_score,
subject_prob_total, subject_total, item_score,
school_student_total, school_subject_total_score, index)
problem_analysis = this_subject.problem_analysis()
print(subject_name + "科目报表计算完成")
this_subject_export_path = this_school_export_path + "学科/{}/".format(subject_name)
if not os.path.exists(this_subject_export_path):
os.makedirs(this_subject_export_path)
for each in problem_analysis.keys():
excel_name = "{}.xlsx".format(each)
this_excel = problem_analysis[each]
with pd.ExcelWriter(this_subject_export_path + excel_name, engine='xlsxwriter') as writer:
if isinstance(this_excel, dict):
for content in this_excel.keys():
this_excel[content].to_excel(writer, index=False, sheet_name=content)
if each == "2-1-3-2试题难度分布":
if content == "难度分布":
workbook = writer.book
worksheet = writer.sheets[content]
chart = workbook.add_chart({'type': 'column'})
max_row = len(this_excel[content])
col_x = this_excel[content].columns.get_loc('难度范围')
col_y = this_excel[content].columns.get_loc("比例")
chart.add_series({
'name': '总分占比',
'categories': [content, 1, col_x, max_row, col_x],
'values': [content, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "难度比例图"})
chart.set_x_axis({'name': '难度范围', 'text_axis': True})
chart.set_y_axis({'name': '比例'})
chart.set_size({'width': 720, 'height': 576})
worksheet.insert_chart('A8', chart)
if each == "2-1-3-3试题区分度分布":
if content == "区分度分布":
workbook = writer.book
worksheet = writer.sheets[content]
chart = workbook.add_chart({'type': 'column'})
max_row = len(this_excel[content])
col_x = this_excel[content].columns.get_loc('区分度范围')
col_y = this_excel[content].columns.get_loc("比例")
chart.add_series({
'name': '总分占比',
'categories': [content, 1, col_x, max_row, col_x],
'values': [content, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "区分度比例图"})
chart.set_x_axis({'name': '区分度范围', 'text_axis': True})
chart.set_y_axis({'name': '比例'})
chart.set_size({'width': 720, 'height': 576})
worksheet.insert_chart('A8', chart)
if each == "2-1-3-9全卷及各题难度曲线图":
workbook = writer.book
worksheet = writer.sheets[content]
chart = workbook.add_chart({'type': 'line'})
max_row = len(this_excel[content])
col_x = this_excel[content].columns.get_loc('分数段')
col_y = this_excel[content].columns.get_loc('难度值')
chart.add_series({
'name': '难度值',
'categories': [content, 1, col_x, max_row, col_x],
'values': [content, 1, col_y, max_row, col_y],
'marker': {
'type': 'square',
'size': 4,
'border': {'color': 'black'},
'fill': {'color': 'red'},
},
'data_labels': {'value': True, 'position': 'left'},
})
if content == "全卷":
chart.set_title({'name': content + '难度曲线'})
else:
chart.set_title({'name': '第' + str(content) + '题难度曲线'})
chart.set_x_axis({'name': '分数段'})
chart.set_y_axis({'name': '难度值', 'min': 0, 'max': 1})
chart.set_size({'width': 720, 'height': 576})
worksheet.insert_chart('D2', chart)
else:
problem_analysis[each].to_excel(writer, index=False, sheet_name=each)
if each == "2-1-3-1各题目指标":
workbook = writer.book
worksheet = writer.sheets[each]
chart = workbook.add_chart({'type': 'scatter'})
max_row = len(problem_analysis[each])
col_x = problem_analysis[each].columns.get_loc('难度')
col_y = problem_analysis[each].columns.get_loc('区分度')
prob_list = problem_analysis[each]["题号"]
custom_labels = []
for prob in prob_list:
custom_labels.append({'value': prob})
chart.add_series({
'name': '题号',
'categories': [each, 1, col_x, max_row, col_x],
'values': [each, 1, col_y, max_row, col_y],
'marker': {'type': 'circle', 'size': 4, 'fill': {'color': 'orange'}},
'data_labels': {'value': True, 'custom': custom_labels},
})
chart.set_title({'name': '题目难度与区分度分布图'})
chart.set_x_axis({'name': '难度', 'min': 0, 'max': 1})
chart.set_y_axis({'name': '区分度', 'min': -1, 'max': 1})
chart.set_size({'width': 920, 'height': 860})
worksheet.insert_chart('A30', chart)
if each == "2-1-4-1各班级得分情况":
problem_analysis[each].to_excel(writer, index=False, sheet_name=each)
workbook = writer.book
worksheet = writer.sheets[each]
# 平均分
chart = workbook.add_chart({'type': 'column'})
max_row = len(problem_analysis[each])
col_x = problem_analysis[each].columns.get_loc('班级')
col_y = problem_analysis[each].columns.get_loc('全体平均分')
chart.add_series({
'name': '平均分',
'categories': [each, 1, col_x, max_row, col_x],
'values': [each, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "各班级平均分分布图"})
chart.set_x_axis({'name': '班级', 'text_axis': True})
chart.set_y_axis({'name': '平均分'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('A20', chart)
# 得分率
chart = workbook.add_chart({'type': 'column'})
max_row = len(problem_analysis[each])
col_x = problem_analysis[each].columns.get_loc('班级')
col_y = problem_analysis[each].columns.get_loc('得分率')
chart.add_series({
'name': '得分率',
'categories': [each, 1, col_x, max_row, col_x],
'values': [each, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "各班级得分率分布图"})
chart.set_x_axis({'name': '班级', 'text_axis': True})
chart.set_y_axis({'name': '得分率'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('M20', chart)
# 超优率
chart = workbook.add_chart({'type': 'column'})
max_row = len(problem_analysis[each])
col_x = problem_analysis[each].columns.get_loc('班级')
col_y = problem_analysis[each].columns.get_loc('超优率')
chart.add_series({
'name': '超优率',
'categories': [each, 1, col_x, max_row, col_x],
'values': [each, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "各班级超优率分布图"})
chart.set_x_axis({'name': '班级', 'text_axis': True})
chart.set_y_axis({'name': '超优率'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('A50', chart)
# 优秀率
chart = workbook.add_chart({'type': 'column'})
max_row = len(problem_analysis[each])
col_x = problem_analysis[each].columns.get_loc('班级')
col_y = problem_analysis[each].columns.get_loc('优秀率')
chart.add_series({
'name': '优秀率',
'categories': [each, 1, col_x, max_row, col_x],
'values': [each, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "各班级优秀率分布图"})
chart.set_x_axis({'name': '班级', 'text_axis': True})
chart.set_y_axis({'name': '优秀率'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('M50', chart)
# 及格率
chart = workbook.add_chart({'type': 'column'})
max_row = len(problem_analysis[each])
col_x = problem_analysis[each].columns.get_loc('班级')
col_y = problem_analysis[each].columns.get_loc('及格率')
chart.add_series({
'name': '及格率',
'categories': [each, 1, col_x, max_row, col_x],
'values': [each, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "各班级及格率分布图"})
chart.set_x_axis({'name': '班级', 'text_axis': True})
chart.set_y_axis({'name': '及格率'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('A80', chart)
# 低分率
chart = workbook.add_chart({'type': 'column'})
max_row = len(problem_analysis[each])
col_x = problem_analysis[each].columns.get_loc('班级')
col_y = problem_analysis[each].columns.get_loc('低分率')
chart.add_series({
'name': '低分率',
'categories': [each, 1, col_x, max_row, col_x],
'values': [each, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "各班级低分率分布图"})
chart.set_x_axis({'name': '班级', 'text_axis': True})
chart.set_y_axis({'name': '低分率'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('M80', chart)
print("\n{}学科报表输出完成\n".format(school))
print(school + "学科知识领域与能力层次分析中......")
for x in range(6):
subject_name = SUBJECT_LIST[x]
subject_total = school_subject_total_list[x]
subject_prob_total = SUBJECT_PROB_TOTAL_LIST[x]
ability_analysis_result = ability_analysis(item_score, subject_name, subject_total, subject_prob_total)
ability_and_knowledge_output_path = "{}/学科/{}/{}".format(this_school_export_path, subject_name, subject_name)
with pd.ExcelWriter(ability_and_knowledge_output_path + '能力层次得分率.xlsx', engine='xlsxwriter') as writer:
ability_analysis_result.to_excel(writer, index=False, sheet_name=subject_name + "能力层次得分率")
workbook = writer.book
worksheet = writer.sheets[subject_name + "能力层次得分率"]
chart = workbook.add_chart({'type': 'column'})
max_row = len(ability_analysis_result)
col_x = ability_analysis_result.columns.get_loc('能力层次')
col_y = ability_analysis_result.columns.get_loc("得分率")
chart.add_series({
'name': '能力层次得分率',
'categories': [subject_name + "能力层次得分率", 1, col_x, max_row, col_x],
'values': [subject_name + "能力层次得分率", 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': subject_name + "能力层次得分率图"})
chart.set_x_axis({'name': '能力层次', 'text_axis': True})
chart.set_y_axis({'name': '得分率'})
chart.set_size({'width': 720, 'height': 576})
worksheet.insert_chart('A7', chart)
knowledge_analysis = knowledge_field_and_point(item_score, subject_name, subject_total, subject_prob_total)
for excel_name in knowledge_analysis.keys():
with pd.ExcelWriter(ability_and_knowledge_output_path + excel_name, engine='xlsxwriter') as writer:
knowledge_analysis[excel_name].to_excel(writer, index=False, sheet_name=excel_name[:-5])
if excel_name == "知识模块分析.xlsx":
if len(knowledge_analysis[excel_name]) != 0:
workbook = writer.book
worksheet = writer.sheets["知识模块分析"]
chart = workbook.add_chart({'type': 'column'})
max_row = len(knowledge_analysis[excel_name])
col_x = knowledge_analysis[excel_name].columns.get_loc('知识模块')
col_y = knowledge_analysis[excel_name].columns.get_loc("知识模块得分率")
chart.add_series({
'name': '知识模块得分率',
'categories': ["知识模块分析", 1, col_x, max_row, col_x],
'values': ["知识模块分析", 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': subject_name + "知识模块得分率图"})
chart.set_x_axis({'name': '知识模块', 'text_axis': True})
chart.set_y_axis({'name': '得分率'})
chart.set_size({'width': 720, 'height': 576})
worksheet.insert_chart('F2', chart)
if excel_name == "知识点分析.xlsx":
if len(knowledge_analysis[excel_name]) != 0:
workbook = writer.book
worksheet = writer.sheets["知识点分析"]
chart = workbook.add_chart({'type': 'column'})
max_row = len(knowledge_analysis[excel_name])
col_x = knowledge_analysis[excel_name].columns.get_loc('知识点')
col_y = knowledge_analysis[excel_name].columns.get_loc("知识点得分率")
chart.add_series({
'name': '知识点得分率',
'categories': ["知识点分析", 1, col_x, max_row, col_x],
'values': ["知识点分析", 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': subject_name + "知识点得分率图"})
chart.set_x_axis({'name': '知识点', 'text_axis': True})
chart.set_y_axis({'name': '得分率'})
chart.set_size({'width': 720, 'height': 576})
worksheet.insert_chart('F2', chart)
print(school + "学科知识领域与能力层次分析完成")
print(school + "整体报表计算中")
overall = school_overall_analysis.Analysis(Class_name, SUBJECT_LIST, SUBJECT_FULL_SCORE_LIST,
school_subject_total_list, school_subject_total_score)
overall_analysis_output = overall.overall_output()
print(school + "整体报表计算完成\n")
if not os.path.exists(path):
os.makedirs(path)
for each in overall_analysis_output.keys():
excel_name = "{}.xlsx".format(each)
this_excel = overall_analysis_output[each]
if each == "2-2-2总分及各科上线的有效分数线":
effective_score_analysis = overall.subject_effective_score_analysis(this_excel)
for subject in effective_score_analysis.keys():
score_line_path = "./OUTPUT/学校/{}/学科/{}/".format(school, subject)
subject_dict = effective_score_analysis[subject]
for goals in subject_dict.keys():
with pd.ExcelWriter(score_line_path + goals, engine='xlsxwriter') as writer:
subject_dict[goals].to_excel(writer, index=False, sheet_name=goals)
workbook = writer.book
worksheet = writer.sheets[goals]
# 总分上线
chart = workbook.add_chart({'type': 'column'})
max_row = len(subject_dict[goals]) - 1
col_x = subject_dict[goals].columns.get_loc('班级')
col_y = subject_dict[goals].columns.get_loc("总分上线人数")
chart.add_series({
'name': '总分上线人数',
'categories': [goals, 1, col_x, max_row, col_x],
'values': [goals, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': "总分上线各班分布图"})
chart.set_x_axis({'name': '班级', 'text_axis': True})
chart.set_y_axis({'name': '总分上线人数'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('G2', chart)
# 单上线
chart = workbook.add_chart({'type': 'column'})
max_row = len(subject_dict[goals]) - 1
col_x = subject_dict[goals].columns.get_loc('班级')
col_y = subject_dict[goals].columns.get_loc(subject + '单上线')
chart.add_series({
'name': '单上线人数',
'categories': [goals, 1, col_x, max_row, col_x],
'values': [goals, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': subject + "单科上线各班分布图"})
chart.set_x_axis({'name': '班级', 'text_axis': True})
chart.set_y_axis({'name': '单上线人数'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('A30', chart)
# 双上线
chart = workbook.add_chart({'type': 'column'})
max_row = len(subject_dict[goals]) - 1
col_x = subject_dict[goals].columns.get_loc('班级')
col_y = subject_dict[goals].columns.get_loc(subject + '双上线')
chart.add_series({
'name': '双上线人数',
'categories': [goals, 1, col_x, max_row, col_x],
'values': [goals, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': subject + "双上线各班分布图"})
chart.set_x_axis({'name': '班级', 'text_axis': True})
chart.set_y_axis({'name': '双上线人数'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('M30', chart)
# M1
chart = workbook.add_chart({'type': 'column'})
max_row = len(subject_dict[goals]) - 1
col_x = subject_dict[goals].columns.get_loc('班级')
col_y = subject_dict[goals].columns.get_loc(subject + 'M1')
chart.add_series({
'name': 'M1人数',
'categories': [goals, 1, col_x, max_row, col_x],
'values': [goals, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': subject + "M1各班分布图"})
chart.set_x_axis({'name': '班级', 'text_axis': True})
chart.set_y_axis({'name': 'M1人数'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('A60', chart)
# M2
chart = workbook.add_chart({'type': 'column'})
max_row = len(subject_dict[goals]) - 1
col_x = subject_dict[goals].columns.get_loc('班级')
col_y = subject_dict[goals].columns.get_loc(subject + 'M2')
chart.add_series({
'name': 'M2人数',
'categories': [goals, 1, col_x, max_row, col_x],
'values': [goals, 1, col_y, max_row, col_y],
'data_labels': {'value': True},
})
chart.set_title({'name': subject + "M2各班分布图"})
chart.set_x_axis({'name': '班级', 'text_axis': True})
chart.set_y_axis({'name': 'M2人数'})
chart.set_size({'width': 720, 'height': 560})
worksheet.insert_chart('M60', chart)
with pd.ExcelWriter(path + "{}/".format(school) + excel_name, engine='xlsxwriter') as writer:
if isinstance(this_excel, dict):
for content in this_excel.keys():
this_excel[content].to_excel(writer, index=False, sheet_name=content)
else:
overall_analysis_output[each].to_excel(writer, index=False, sheet_name=each)
print("各学校报表输出完成\n")
if __name__ == '__main__':
export_path = "./OUTPUT/"
subject_export_path = export_path + "学科/"
school_export_path = export_path + "学校/"
overall_export_path = export_path + "全区/"
school_data_path = "./考试数据/学校数据/"
# 计算并输出学科报表
subject(subject_export_path)
# 计算并输出全区报表
overall(overall_export_path)
# 按学校拆分各表
print("按学校拆分各表中......")
print("学校名单:" + ",".join(SCHOOL_NAME))
seperate_school_data.student_total_overall_to_school(STUDENT_TOTAL, SCHOOL_NAME, school_data_path)
for index in range(7):
subject_total = SUBJECT_TOTAL_LIST[index]
subject_name = SUBJECT_LIST[index]
seperate_school_data.subject_total_overall_to_school(subject_total, SCHOOL_NAME, school_data_path, subject_name)
print("拆分完成\n")
# 计算并输出学校报表
school_output(school_export_path)
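school_output builds its class list by de-duplicating 班级 values while keeping their first-occurrence order. A compact, testable form of that idiom — a sketch with the hypothetical name `ordered_unique`, relying on dicts preserving insertion order in Python 3.7+ — is:

```python
def ordered_unique(values):
    # dict.fromkeys keeps only the first occurrence of each key, and dicts
    # preserve insertion order, so the original ordering survives.
    return list(dict.fromkeys(values))
```

This could build the class-name list before "全校" is appended, replacing the append-in-a-comprehension pattern.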
# jina/peapods/__init__.py
from jina.peapods.peas import BasePea
from jina.peapods.peas import Pea
from jina.peapods.pods import BasePod
from jina.peapods.pods import Pod
| 28.8 | 37 | 0.833333 | 24 | 144 | 5 | 0.416667 | 0.266667 | 0.5 | 0.316667 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 144 | 4 | 38 | 36 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 9 |
bc30011240d43c8bc74be3c57a33383377704779 | 4,761 | py | Python | tests/analysis/test_analysis__plot_3d_regression.py | kamilazdybal/PCAfold | 251ca0dc8c8f98976266b94147504247ddd09bd2 | [
"MIT"
] | 1 | 2022-02-01T08:57:18.000Z | 2022-02-01T08:57:18.000Z | tests/analysis/test_analysis__plot_3d_regression.py | kamilazdybal/PCAfold | 251ca0dc8c8f98976266b94147504247ddd09bd2 | [
"MIT"
] | null | null | null | tests/analysis/test_analysis__plot_3d_regression.py | kamilazdybal/PCAfold | 251ca0dc8c8f98976266b94147504247ddd09bd2 | [
"MIT"
] | 1 | 2022-03-13T13:19:56.000Z | 2022-03-13T13:19:56.000Z | import unittest
import numpy as np
from PCAfold import preprocess
from PCAfold import reduction
from PCAfold import analysis
class Analysis(unittest.TestCase):
def test_analysis__plot_3d_regression__allowed_calls(self):
X = np.random.rand(100,5)
pca_X = reduction.PCA(X, scaling='auto', n_components=2)
X_rec = pca_X.reconstruct(pca_X.transform(X))
try:
plt = analysis.plot_3d_regression(X[:,0], X[:,1], X[:,0], X_rec[:,0])
plt.close()
except Exception:
self.assertTrue(False)
try:
plt = analysis.plot_3d_regression(X[:,0],
X[:,1],
X[:,0],
X_rec[:,0],
elev=45,
azim=-45,
x_label='$x$',
y_label='$y$',
z_label='$z$',
figure_size=(7,7),
title='3D regression')
plt.close()
except Exception:
self.assertTrue(False)
try:
plt = analysis.plot_3d_regression(X[:,0:1],
X[:,1:2],
X[:,2:3],
X_rec[:,0:1],
elev=45,
azim=-45,
x_label='$x$',
y_label='$y$',
z_label='$z$',
figure_size=(7,7),
title='3D regression')
plt.close()
except Exception:
self.assertTrue(False)
# ------------------------------------------------------------------------------
def test_analysis__plot_3d_regression__not_allowed_calls(self):
X = np.random.rand(100,5)
with self.assertRaises(ValueError):
plt = analysis.plot_3d_regression(X[:,0:2], X[:,1], X[:,2], X[:,0])
plt.close()
with self.assertRaises(ValueError):
plt = analysis.plot_3d_regression(X[:,0], X[:,1:3], X[:,2], X[:,0])
plt.close()
with self.assertRaises(ValueError):
plt = analysis.plot_3d_regression(X[:,0], X[:,1], X[:,2:4], X[:,0])
plt.close()
with self.assertRaises(ValueError):
plt = analysis.plot_3d_regression(X[:,0], X[:,1], X[:,2], X[:,0:2])
plt.close()
with self.assertRaises(ValueError):
plt = analysis.plot_3d_regression([1,2,3], X[:,1], X[:,2], X[:,2])
plt.close()
with self.assertRaises(ValueError):
plt = analysis.plot_3d_regression(X[:,0], [1,2,3], X[:,2], X[:,2])
plt.close()
with self.assertRaises(ValueError):
plt = analysis.plot_3d_regression(X[:,0], X[:,1], [1,2,3], X[:,2])
plt.close()
with self.assertRaises(ValueError):
plt = analysis.plot_3d_regression(X[:,0], X[:,1], X[:,2], [1,2,3])
plt.close()
with self.assertRaises(ValueError):
plt = analysis.plot_3d_regression(X[:,0], X[:,1], X[:,2], X[:,2], elev=[1])
plt.close()
with self.assertRaises(ValueError):
plt = analysis.plot_3d_regression(X[:,0], X[:,1], X[:,2], X[:,2], azim=[1])
plt.close()
with self.assertRaises(ValueError):
plt = analysis.plot_3d_regression(X[:,0], X[:,1], X[:,2], X[:,2], x_label=[1])
plt.close()
with self.assertRaises(ValueError):
plt = analysis.plot_3d_regression(X[:,0], X[:,1], X[:,2], X[:,2], y_label=[1])
plt.close()
with self.assertRaises(ValueError):
plt = analysis.plot_3d_regression(X[:,0], X[:,1], X[:,2], X[:,2], z_label=[1])
plt.close()
with self.assertRaises(ValueError):
plt = analysis.plot_3d_regression(X[:,0], X[:,1], X[:,2], X[:,2], figure_size=[1])
plt.close()
with self.assertRaises(ValueError):
plt = analysis.plot_3d_regression(X[:,0], X[:,1], X[:,2], X[:,2], title=[1])
plt.close()
with self.assertRaises(ValueError):
plt = analysis.plot_3d_regression(X[:,0], X[:,1], X[:,2], X[:,2], save_filename=[1])
plt.close()
# ------------------------------------------------------------------------------
| 38.395161 | 96 | 0.426381 | 515 | 4,761 | 3.794175 | 0.118447 | 0.027636 | 0.150461 | 0.257932 | 0.856704 | 0.854145 | 0.819857 | 0.819857 | 0.818833 | 0.785056 | 0 | 0.049587 | 0.390044 | 4,761 | 123 | 97 | 38.707317 | 0.623278 | 0.032976 | 0 | 0.631579 | 0 | 0 | 0.010433 | 0 | 0 | 0 | 0 | 0 | 0.2 | 1 | 0.021053 | false | 0 | 0.052632 | 0 | 0.084211 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
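The disallowed-call tests above expect a `ValueError` when an argument is a plain list or has more than one effective column, while one-dimensional vectors and `(n, 1)` column slices are accepted. A hedged sketch of that kind of shape check (my own helper, not PCAfold's actual validation code):

```python
def check_1d(name, arr):
    """Raise ValueError unless ``arr`` is 1-D or an (n, 1) column array.

    Plain lists (no ``.shape``) and arrays with more than one effective
    column are rejected, mirroring the behaviour exercised above.
    """
    shape = getattr(arr, "shape", None)
    if shape is None:
        raise ValueError("%s must be an array, got %r" % (name, type(arr).__name__))
    if len(shape) == 1 or (len(shape) == 2 and shape[1] == 1):
        return arr
    raise ValueError("%s must be one-dimensional, shape was %r" % (name, shape))
```

With such a guard, `check_1d("x", X[:,0])` and `check_1d("x", X[:,0:1])` pass through, while `check_1d("x", [1, 2, 3])` and `check_1d("x", X[:,0:2])` raise.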
cb00958990fc7627c25a533160b5eaf48f723207 | 93 | py | Python | pylib/cstw/__init__.py | eddiehoyle/constraint-workshop | 958190f50a524420d91da3b52244b177729c1656 | [
"MIT"
] | null | null | null | pylib/cstw/__init__.py | eddiehoyle/constraint-workshop | 958190f50a524420d91da3b52244b177729c1656 | [
"MIT"
] | null | null | null | pylib/cstw/__init__.py | eddiehoyle/constraint-workshop | 958190f50a524420d91da3b52244b177729c1656 | [
"MIT"
] | null | null | null | from cstw import angle
from cstw import matrix
from cstw import quat
from cstw import vector
| 18.6 | 23 | 0.827957 | 16 | 93 | 4.8125 | 0.4375 | 0.415584 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.172043 | 93 | 4 | 24 | 23.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
1dac617857e04ee7991d1bb5dc668cac43525d6a | 4,537 | py | Python | tests/Intervals_test.py | cgat-developers/cgat-apps | 4de381418e80cfd085b17fc8a3048a48f55e57cf | [
"BSD-2-Clause",
"MIT"
] | 19 | 2018-09-07T11:32:49.000Z | 2021-08-24T01:39:37.000Z | tests/Intervals_test.py | cgat-developers/cgat-apps | 4de381418e80cfd085b17fc8a3048a48f55e57cf | [
"BSD-2-Clause",
"MIT"
] | 55 | 2018-04-05T15:26:41.000Z | 2022-03-17T09:27:43.000Z | tests/Intervals_test.py | cgat-developers/cgat-apps | 4de381418e80cfd085b17fc8a3048a48f55e57cf | [
"BSD-2-Clause",
"MIT"
] | 12 | 2018-02-11T20:07:03.000Z | 2021-04-07T04:50:12.000Z | """unit testing module for the Tree.py class."""
import cgat.Intervals as Intervals
import unittest
class TruncateCheck(unittest.TestCase):
def testEmpty(self):
"""test empty input."""
self.assertEqual(Intervals.truncate([], []), [])
def testHalfEmpty(self):
"""test empty input."""
self.assertEqual(Intervals.truncate([], [(0, 5)]), [])
self.assertEqual(Intervals.truncate([(0, 5)], []), [(0, 5)])
def testSingle(self):
"""test empty input."""
self.assertEqual(Intervals.truncate([(0, 5)], [(0, 5)]), [])
self.assertEqual(Intervals.truncate([(0, 5)], [(0, 3)]), [(3, 5)])
self.assertEqual(Intervals.truncate([(0, 3)], [(0, 5)]), [])
self.assertEqual(Intervals.truncate([(0, 5)], [(3, 5)]), [(0, 3)])
self.assertEqual(Intervals.truncate([(3, 5)], [(0, 5)]), [])
self.assertEqual(Intervals.truncate([(5, 10)], [(5, 10)]), [])
self.assertEqual(Intervals.truncate([(5, 10)], [(5, 20)]), [])
self.assertEqual(Intervals.truncate([(5, 10)], [(0, 10)]), [])
self.assertEqual(Intervals.truncate([(5, 10)], [(0, 10)]), [])
self.assertEqual(Intervals.truncate([(5, 10)], [(0, 20)]), [])
def testMultiple(self):
"""test empty input."""
self.assertEqual(
Intervals.truncate([(0, 5), (10, 15)], [(0, 5)]), [(10, 15)])
self.assertEqual(
Intervals.truncate([(0, 5), (10, 15)], [(0, 10)]), [(10, 15)])
self.assertEqual(Intervals.truncate([(0, 5), (10, 15)], [(0, 15)]), [])
self.assertEqual(Intervals.truncate([(0, 5), (5, 10)], [(0, 10)]), [])
self.assertEqual(
Intervals.truncate([(0, 5), (5, 10)], []), [(0, 5), (5, 10)])
def testNoOverlap(self):
"""test empty input."""
self.assertEqual(
Intervals.truncate([(0, 5), (10, 15)], [(5, 10)]), [(0, 5), (10, 15)])
self.assertEqual(
Intervals.truncate([(5, 10)], [(0, 5), (10, 15)]), [(5, 10)])
self.assertEqual(
Intervals.truncate([(0, 5), (5, 10)], [(10, 15)]), [(0, 5), (5, 10)])
class IntersectCheck(unittest.TestCase):
def testEmpty(self):
"""test empty input."""
self.assertEqual(Intervals.intersect([], []), [])
def testHalfEmpty(self):
"""test empty input."""
self.assertEqual(Intervals.intersect([(0, 5)], []), [])
self.assertEqual(Intervals.intersect([], [(0, 5)]), [])
def testSingle(self):
"""test empty input."""
self.assertEqual(Intervals.intersect([(0, 5)], [(0, 5)]), [(0, 5)])
self.assertEqual(Intervals.intersect([(0, 5)], [(0, 3)]), [(0, 3)])
self.assertEqual(Intervals.intersect([(0, 3)], [(0, 5)]), [(0, 3)])
self.assertEqual(Intervals.intersect([(0, 5)], [(3, 5)]), [(3, 5)])
self.assertEqual(Intervals.intersect([(3, 5)], [(0, 5)]), [(3, 5)])
self.assertEqual(Intervals.intersect([(5, 10)], [(5, 20)]), [(5, 10)])
self.assertEqual(Intervals.intersect([(5, 10)], [(0, 20)]), [(5, 10)])
def testMultiple(self):
"""test empty input."""
self.assertEqual(
Intervals.intersect([(0, 5), (10, 15)], [(0, 5)]), [(0, 5)])
self.assertEqual(
Intervals.intersect([(0, 5), (10, 15)], [(0, 10)]), [(0, 5)])
self.assertEqual(
Intervals.intersect([(0, 5), (10, 15)], [(0, 15)]), [(0, 5), (10, 15)])
self.assertEqual(
Intervals.intersect([(0, 5), (5, 10)], [(0, 10)]), [(0, 5), (5, 10)])
def testNoOverlap(self):
"""test empty input."""
self.assertEqual(
Intervals.intersect([(0, 5), (10, 15)], [(5, 10)]), [])
self.assertEqual(
Intervals.intersect([(5, 10)], [(0, 5), (10, 15)]), [])
class FromArrayCheck(unittest.TestCase):
def testEmpty(self):
"""test empty input."""
self.assertEqual(Intervals.fromArray([]), [])
def testArray1(self):
"""test simple array."""
a = [1, 1, 1, 0, 0, 0, 1, 1, 1]
self.assertEqual(Intervals.fromArray(a), [(0, 3), (6, 9)])
self.assertEqual(Intervals.fromArray([not x for x in a]), [(3, 6)])
def testArray2(self):
"""test longer array."""
a = [1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0, 0, 1, 1, 1]
self.assertEqual(Intervals.fromArray(a), [(0, 3), (6, 9), (12, 15)])
self.assertEqual(
Intervals.fromArray([not x for x in a]), [(3, 6), (9, 12)])
if __name__ == "__main__":
unittest.main()
| 39.798246 | 83 | 0.514878 | 542 | 4,537 | 4.295203 | 0.097786 | 0.270619 | 0.43299 | 0.28866 | 0.885739 | 0.885309 | 0.845361 | 0.747852 | 0.680412 | 0.536512 | 0 | 0.091857 | 0.239365 | 4,537 | 113 | 84 | 40.150442 | 0.58273 | 0.061274 | 0 | 0.337662 | 0 | 0 | 0.00191 | 0 | 0 | 0 | 0 | 0 | 0.545455 | 1 | 0.168831 | false | 0 | 0.025974 | 0 | 0.233766 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
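The assertions above fully pin down the semantics of `truncate`, `intersect`, and `fromArray` over half-open `(start, end)` intervals. The following are reference implementations consistent with those assertions (my own sketches, not cgat's actual code):

```python
def truncate(a, b):
    """Remove from each half-open interval in ``a`` the parts covered by ``b``."""
    result = []
    for lo, hi in a:
        cur = lo
        for blo, bhi in b:
            if bhi <= cur or blo >= hi:
                continue  # no overlap with the remaining part
            if blo > cur:
                result.append((cur, blo))  # keep the uncovered left piece
            cur = max(cur, bhi)
        if cur < hi:
            result.append((cur, hi))
    return result


def intersect(a, b):
    """Intersect two sorted lists of half-open intervals (two-pointer sweep)."""
    result, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        lo = max(a[i][0], b[j][0])
        hi = min(a[i][1], b[j][1])
        if lo < hi:
            result.append((lo, hi))
        if a[i][1] < b[j][1]:
            i += 1
        else:
            j += 1
    return result


def from_array(a):
    """Convert a boolean array into intervals covering the truthy runs."""
    intervals, start = [], None
    for i, v in enumerate(a):
        if v and start is None:
            start = i
        elif not v and start is not None:
            intervals.append((start, i))
            start = None
    if start is not None:
        intervals.append((start, len(a)))
    return intervals
```

For example, `truncate([(0, 5)], [(0, 3)])` returns `[(3, 5)]` and `from_array([1, 1, 1, 0, 0, 0, 1, 1, 1])` returns `[(0, 3), (6, 9)]`, matching the test cases.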
1dbcd7694bfc88a060e7e01e92c660a046f13882 | 173 | py | Python | pythonFiles/vscode_datascience_helpers/getJupyterKernelspecVersion.py | ChaseKnowlden/vscode-jupyter | 9bdaf87f0b6dcd717c508e9023350499a6093f97 | [
"MIT"
] | 615 | 2020-11-11T22:55:28.000Z | 2022-03-30T21:48:08.000Z | pythonFiles/vscode_datascience_helpers/getJupyterKernelspecVersion.py | ChaseKnowlden/vscode-jupyter | 9bdaf87f0b6dcd717c508e9023350499a6093f97 | [
"MIT"
] | 8,428 | 2020-11-11T22:06:43.000Z | 2022-03-31T23:42:34.000Z | pythonFiles/vscode_datascience_helpers/getJupyterKernelspecVersion.py | ChaseKnowlden/vscode-jupyter | 9bdaf87f0b6dcd717c508e9023350499a6093f97 | [
"MIT"
] | 158 | 2020-11-12T07:49:02.000Z | 2022-03-27T20:50:20.000Z | # Check whether kernelspec module exists.
import sys
import jupyter_client
import jupyter_client.kernelspec
sys.stdout.write(jupyter_client.__version__)
sys.stdout.flush()
| 21.625 | 44 | 0.843931 | 23 | 173 | 6.043478 | 0.565217 | 0.280576 | 0.273381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086705 | 173 | 7 | 45 | 24.714286 | 0.879747 | 0.225434 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
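The snippet above prints `jupyter_client.__version__` after importing the module. A hedged generalization using the standard library, which reports any installed package's version without importing it (helper name is my own; requires Python 3.8+):

```python
import importlib.metadata


def get_version(package):
    """Return the installed version string of ``package``, or None if absent."""
    try:
        return importlib.metadata.version(package)
    except importlib.metadata.PackageNotFoundError:
        return None
```

This avoids the side effects of importing the package itself, at the cost of not verifying that the package is actually importable.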
38433340f08f3818bea0d322fe0d37ba49a6015e | 23,062 | py | Python | appendix/torch_nsolt/test_nsoltInitialRotation2dLayer.py | msiplab/SaivDr | 7015dea02e955c71337db6e7b29bb8c35294fa0e | [
"BSD-2-Clause"
] | 7 | 2018-11-26T08:49:24.000Z | 2021-09-15T08:46:35.000Z | appendix/torch_nsolt/test_nsoltInitialRotation2dLayer.py | msiplab/SaivDr | 7015dea02e955c71337db6e7b29bb8c35294fa0e | [
"BSD-2-Clause"
] | 11 | 2019-12-02T11:20:18.000Z | 2022-02-14T05:17:11.000Z | appendix/torch_nsolt/test_nsoltInitialRotation2dLayer.py | msiplab/SaivDr | 7015dea02e955c71337db6e7b29bb8c35294fa0e | [
"BSD-2-Clause"
] | 7 | 2019-06-05T07:45:20.000Z | 2021-09-15T08:46:36.000Z | import itertools
import unittest
from parameterized import parameterized
import math
import torch
import torch.nn as nn
from nsoltInitialRotation2dLayer import NsoltInitialRotation2dLayer
from nsoltUtility import Direction, OrthonormalMatrixGenerationSystem
nchs = [ [2, 2], [3, 3], [4, 4] ]
stride = [ [1, 1], [1, 2], [2, 1], [2, 2] ]
mus = [ -1, 1 ]
datatype = [ torch.float, torch.double ]
nrows = [ 4, 8, 16 ]
ncols = [ 4, 8, 16 ]
isdevicetest = True
class NsoltInitialRotation2dLayerTestCase(unittest.TestCase):
"""
NSOLTINITIALROTATION2DLAYERTESTCASE
Input per component (nComponents):
nSamples x nRows x nCols x nDecs
Output per component (nComponents):
nSamples x nRows x nCols x nChs
Requirements: Python 3.7.x, PyTorch 1.7.x
Copyright (c) 2020-2021, Shogo MURAMATSU
All rights reserved.
Contact address: Shogo MURAMATSU,
Faculty of Engineering, Niigata University,
8050 2-no-cho Ikarashi, Nishi-ku,
Niigata, 950-2181, JAPAN
http://msiplab.eng.niigata-u.ac.jp/
"""
@parameterized.expand(
list(itertools.product(nchs,stride))
)
def testConstructor(self,
nchs, stride):
# Expected values
expctdName = 'V0'
expctdDescription = "NSOLT initial rotation " \
+ "(ps,pa) = (" \
+ str(nchs[0]) + "," + str(nchs[1]) + "), " \
+ "(mv,mh) = (" \
+ str(stride[Direction.VERTICAL]) + "," + str(stride[Direction.HORIZONTAL]) + ")"
# Instantiation of target class
layer = NsoltInitialRotation2dLayer(
number_of_channels=nchs,
decimation_factor=stride,
name=expctdName
)
# Actual values
actualName = layer.name
actualDescription = layer.description
# Evaluation
self.assertTrue(isinstance(layer, nn.Module))
self.assertEqual(actualName,expctdName)
self.assertEqual(actualDescription,expctdDescription)
@parameterized.expand(
list(itertools.product(nchs,stride,nrows,ncols,datatype))
)
def testPredictGrayscale(self,
nchs, stride, nrows, ncols, datatype):
rtol,atol=1e-5,1e-8
if isdevicetest:
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
else:
device = torch.device("cpu")
# Parameters
nSamples = 8
nDecs = stride[0]*stride[1] # math.prod(stride)
nChsTotal = sum(nchs)
# nSamples x nRows x nCols x nDecs
X = torch.randn(nSamples,nrows,ncols,nDecs,dtype=datatype)
X = X.to(device)
# Expected values
# nSamplex x nRows x nCols x nChs
ps, pa = nchs
W0 = torch.eye(ps,dtype=datatype).to(device)
U0 = torch.eye(pa,dtype=datatype).to(device)
ms,ma = int(math.ceil(nDecs/2.)), int(math.floor(nDecs/2.))
Zsa = torch.zeros(nChsTotal,nrows*ncols*nSamples,dtype=datatype).to(device)
Ys = X[:,:,:,:ms].view(-1,ms).T
Zsa[:ps,:] = W0[:,:ms] @ Ys
if ma > 0:
Ya = X[:,:,:,ms:].view(-1,ma).T
Zsa[ps:,:] = U0[:,:ma] @ Ya
expctdZ = Zsa.T.view(nSamples,nrows,ncols,nChsTotal)
# Instantiation of target class
layer = NsoltInitialRotation2dLayer(
number_of_channels=nchs,
decimation_factor=stride,
name='V0'
)
layer = layer.to(device)
# Actual values
with torch.no_grad():
actualZ = layer.forward(X)
# Evaluation
self.assertEqual(actualZ.dtype,datatype)
self.assertTrue(torch.allclose(actualZ,expctdZ,rtol=rtol,atol=atol))
self.assertFalse(actualZ.requires_grad)
@parameterized.expand(
list(itertools.product(nchs,stride,nrows,ncols,datatype))
)
def testPredictGrayscaleWithRandomAngles(self,
nchs, stride, nrows, ncols, datatype):
rtol,atol=1e-3,1e-6
if isdevicetest:
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
else:
device = torch.device("cpu")
gen = OrthonormalMatrixGenerationSystem(dtype=datatype)
# Parameters
nSamples = 8
nDecs = stride[0]*stride[1] # math.prod(stride)
nChsTotal = sum(nchs)
# nSamples x nRows x nCols x nDecs
X = torch.randn(nSamples,nrows,ncols,nDecs,dtype=datatype)
X = X.to(device)
angles = torch.randn(int((nChsTotal-2)*nChsTotal/4),dtype=datatype)
# Expected values
# nSamples x nRows x nCols x nChs
ps,pa = nchs
nAngsW = int(len(angles)/2)
angsW,angsU = angles[:nAngsW],angles[nAngsW:]
W0,U0 = gen(angsW).to(device),gen(angsU).to(device)
ms,ma = int(math.ceil(nDecs/2.)), int(math.floor(nDecs/2.))
Zsa = torch.zeros(nChsTotal,nrows*ncols*nSamples,dtype=datatype)
Zsa = Zsa.to(device)
Ys = X[:,:,:,:ms].view(-1,ms).T
Zsa[:ps,:] = W0[:,:ms] @ Ys
if ma > 0:
Ya = X[:,:,:,ms:].view(-1,ma).T
Zsa[ps:,:] = U0[:,:ma] @ Ya
expctdZ = Zsa.T.view(nSamples,nrows,ncols,nChsTotal)
# Instantiation of target class
layer = NsoltInitialRotation2dLayer(
number_of_channels=nchs,
decimation_factor=stride,
name='V0')
layer.orthTransW0.angles.data = angsW
layer.orthTransW0.mus = 1
layer.orthTransU0.angles.data = angsU
layer.orthTransU0.mus = 1
layer = layer.to(device)
# Actual values
with torch.no_grad():
actualZ = layer.forward(X)
# Evaluation
self.assertEqual(actualZ.dtype,datatype)
self.assertTrue(torch.allclose(actualZ,expctdZ,rtol=rtol,atol=atol))
self.assertFalse(actualZ.requires_grad)
@parameterized.expand(
list(itertools.product(nchs,stride,nrows,ncols,datatype,mus))
)
def testPredictGrayscaleWithRandomAnglesNoDcLeackage(self,
nchs, stride, nrows, ncols, datatype,mus):
rtol,atol=1e-3,1e-6
if isdevicetest:
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
else:
device = torch.device("cpu")
gen = OrthonormalMatrixGenerationSystem(dtype=datatype)
# Parameters
nSamples = 8
nDecs = stride[0]*stride[1] # math.prod(stride)
nChsTotal = sum(nchs)
# nSamples x nRows x nCols x nDecs
X = torch.randn(nSamples,nrows,ncols,nDecs,dtype=datatype)
X = X.to(device)
angles = torch.randn(int((nChsTotal-2)*nChsTotal/4),dtype=datatype)
# Expected values
# nSamples x nRows x nCols x nChs
ps,pa = nchs
nAngsW = int(len(angles)/2)
angsW,angsU = angles[:nAngsW],angles[nAngsW:]
angsWNoDcLeak = angsW.clone()
angsWNoDcLeak[:ps-1] = torch.zeros(ps-1,dtype=angles.dtype)
musW,musU = mus*torch.ones(ps,dtype=datatype),mus*torch.ones(pa,dtype=datatype)
musW[0] = 1
W0,U0 = gen(angsWNoDcLeak,musW).to(device),gen(angsU,musU).to(device)
ms,ma = int(math.ceil(nDecs/2.)), int(math.floor(nDecs/2.))
Zsa = torch.zeros(nChsTotal,nrows*ncols*nSamples,dtype=datatype).to(device)
Ys = X[:,:,:,:ms].view(-1,ms).T
Zsa[:ps,:] = W0[:,:ms] @ Ys
if ma > 0:
Ya = X[:,:,:,ms:].view(-1,ma).T
Zsa[ps:,:] = U0[:,:ma] @ Ya
expctdZ = Zsa.T.view(nSamples,nrows,ncols,nChsTotal)
# Instantiation of target class
layer = NsoltInitialRotation2dLayer(
number_of_channels=nchs,
decimation_factor=stride,
no_dc_leakage=True,
name='V0')
layer.orthTransW0.angles.data = angsW
layer.orthTransW0.mus = musW
layer.orthTransU0.angles.data = angsU
layer.orthTransU0.mus = musU
layer = layer.to(device)
# Actual values
with torch.no_grad():
actualZ = layer.forward(X)
# Evaluation
self.assertEqual(actualZ.dtype,datatype)
self.assertTrue(torch.allclose(actualZ,expctdZ,rtol=rtol,atol=atol))
self.assertFalse(actualZ.requires_grad)
@parameterized.expand(
list(itertools.product(nchs,stride,nrows,ncols,datatype))
)
def testBackwardGrayscale(self,
nchs, stride, nrows, ncols, datatype):
rtol,atol=1e-5,1e-8
if isdevicetest:
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
else:
device = torch.device("cpu")
omgs = OrthonormalMatrixGenerationSystem(dtype=datatype,partial_difference=False)
# Parameters
nSamples = 8
nDecs = stride[0]*stride[1] # math.prod(stride)
nChsTotal = sum(nchs)
nAnglesH = int((nChsTotal-2)*nChsTotal/8)
anglesW = torch.zeros(nAnglesH,dtype=datatype)
anglesU = torch.zeros(nAnglesH,dtype=datatype)
mus = 1
# nSamples x nRows x nCols x nDecs
X = torch.randn(nSamples,nrows,ncols,nDecs,dtype=datatype,device=device,requires_grad=True)
dLdZ = torch.randn(nSamples,nrows,ncols,nChsTotal,dtype=datatype)
dLdZ = dLdZ.to(device)
# Expected values
# nSamples x nRows x nCols x nDecs
ps,pa = nchs
W0T = omgs(anglesW,mus).T.to(device)
U0T = omgs(anglesU,mus).T.to(device)
# dLdX = dZdX x dLdZ
ms,ma = int(math.ceil(nDecs/2.)),int(math.floor(nDecs/2.))
Ys = dLdZ[:,:,:,:ps].view(nSamples*nrows*ncols,ps).T # ps * n
Ya = dLdZ[:,:,:,ps:].view(nSamples*nrows*ncols,pa).T # pa * n
Y = torch.cat(
( W0T[:ms,:] @ Ys, # ms x ps @ ps x n
U0T[:ma,:] @ Ya ), dim=0) # ma x pa @ pa x n
expctddLdX = Y.T.view(nSamples,nrows,ncols,nDecs) # n x (ms+ma)
# dLdWi = <dLdZ,(dVdWi)X>
expctddLdW_W = torch.zeros(nAnglesH,dtype=datatype).to(device)
expctddLdW_U = torch.zeros(nAnglesH,dtype=datatype).to(device)
omgs.partial_difference = True
for iAngle in range(nAnglesH):
dW0 = omgs(anglesW,mus,index_pd_angle=iAngle).to(device)
Xs = X[:,:,:,:ms].view(-1,ms).T
Zs = dW0[:,:ms] @ Xs # ps x n
expctddLdW_W[iAngle] = torch.sum(Ys * Zs) # ps x n
if ma>0:
dU0 = omgs(anglesU,mus,index_pd_angle=iAngle).to(device)
Xa = X[:,:,:,ms:].view(-1,ma).T
Za = dU0[:,:ma] @ Xa # pa x n
expctddLdW_U[iAngle] = torch.sum(Ya * Za) # pa x n
# Instantiation of target class
layer = NsoltInitialRotation2dLayer(
number_of_channels=nchs,
decimation_factor=stride,
name='V0')
layer.orthTransW0.angles.data = anglesW
layer.orthTransW0.mus = mus
layer.orthTransU0.angles.data = anglesU
layer.orthTransU0.mus = mus
layer = layer.to(device)
# Actual values
torch.autograd.set_detect_anomaly(True)
Z = layer.forward(X)
layer.zero_grad()
Z.backward(dLdZ)
actualdLdX = X.grad
actualdLdW_W = layer.orthTransW0.angles.grad
actualdLdW_U = layer.orthTransU0.angles.grad
# Evaluation
self.assertEqual(actualdLdX.dtype,datatype)
self.assertEqual(actualdLdW_W.dtype,datatype)
self.assertEqual(actualdLdW_U.dtype,datatype)
self.assertTrue(torch.allclose(actualdLdX,expctddLdX,rtol=rtol,atol=atol))
self.assertTrue(torch.allclose(actualdLdW_W,expctddLdW_W,rtol=rtol,atol=atol))
self.assertTrue(torch.allclose(actualdLdW_U,expctddLdW_U,rtol=rtol,atol=atol))
self.assertTrue(Z.requires_grad)
@parameterized.expand(
list(itertools.product(nchs,stride,nrows,ncols,datatype))
)
def testBackwardGrayscaleWithRandomAngles(self,
nchs, stride, nrows, ncols, datatype):
rtol,atol=1e-3,1e-6
if isdevicetest:
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
else:
device = torch.device("cpu")
omgs = OrthonormalMatrixGenerationSystem(dtype=datatype,partial_difference=False)
# Parameters
nSamples = 8
nDecs = stride[0]*stride[1] # math.prod(stride)
nChsTotal = sum(nchs)
nAnglesH = int((nChsTotal-2)*nChsTotal/8)
anglesW = torch.randn(nAnglesH,dtype=datatype)
anglesU = torch.randn(nAnglesH,dtype=datatype)
mus = 1
# nSamples x nRows x nCols x nDecs
X = torch.randn(nSamples,nrows,ncols,nDecs,dtype=datatype,device=device,requires_grad=True)
dLdZ = torch.randn(nSamples,nrows,ncols,nChsTotal,dtype=datatype)
dLdZ = dLdZ.to(device)
# Expected values
ps,pa = nchs
W0T = omgs(anglesW,mus).T.to(device)
U0T = omgs(anglesU,mus).T.to(device)
# dLdX = dZdX x dLdZ
ms,ma = int(math.ceil(nDecs/2.)),int(math.floor(nDecs/2.))
Ys = dLdZ[:,:,:,:ps].view(nSamples*nrows*ncols,ps).T # ps * n
Ya = dLdZ[:,:,:,ps:].view(nSamples*nrows*ncols,pa).T # pa * n
Y = torch.cat(
( W0T[:ms,:] @ Ys, # ms x ps @ ps x n
U0T[:ma,:] @ Ya ), dim=0) # ma x pa @ pa x n
expctddLdX = Y.T.view(nSamples,nrows,ncols,nDecs) # n x (ms+ma)
# dLdWi = <dLdZ,(dVdWi)X>
expctddLdW_W = torch.zeros(nAnglesH,dtype=datatype).to(device)
expctddLdW_U = torch.zeros(nAnglesH,dtype=datatype).to(device)
omgs.partial_difference = True
for iAngle in range(nAnglesH):
dW0 = omgs(anglesW,mus,index_pd_angle=iAngle).to(device)
Xs = X[:,:,:,:ms].view(-1,ms).T
Zs = dW0[:,:ms] @ Xs # ps x n
expctddLdW_W[iAngle] = torch.sum(Ys * Zs) # ps x n
if ma>0:
dU0 = omgs(anglesU,mus,index_pd_angle=iAngle).to(device)
Xa = X[:,:,:,ms:].view(-1,ma).T
Za = dU0[:,:ma] @ Xa # pa x n
expctddLdW_U[iAngle] = torch.sum(Ya * Za) # pa x n
# Instantiation of target class
layer = NsoltInitialRotation2dLayer(
number_of_channels=nchs,
decimation_factor=stride,
name='V0')
layer.orthTransW0.angles.data = anglesW
layer.orthTransW0.mus = mus
layer.orthTransU0.angles.data = anglesU
layer.orthTransU0.mus = mus
layer = layer.to(device)
# Actual values
torch.autograd.set_detect_anomaly(True)
Z = layer.forward(X)
layer.zero_grad()
Z.backward(dLdZ)
actualdLdX = X.grad
actualdLdW_W = layer.orthTransW0.angles.grad
actualdLdW_U = layer.orthTransU0.angles.grad
# Evaluation
self.assertEqual(actualdLdX.dtype,datatype)
self.assertEqual(actualdLdW_W.dtype,datatype)
self.assertEqual(actualdLdW_U.dtype,datatype)
self.assertTrue(torch.allclose(actualdLdX,expctddLdX,rtol=rtol,atol=atol))
self.assertTrue(torch.allclose(actualdLdW_W,expctddLdW_W,rtol=rtol,atol=atol))
self.assertTrue(torch.allclose(actualdLdW_U,expctddLdW_U,rtol=rtol,atol=atol))
self.assertTrue(Z.requires_grad)
@parameterized.expand(
list(itertools.product(nchs,stride,nrows,ncols,datatype,mus))
)
def testBackwardGrayscaleWithRandomAnglesNoDcLeackage(self,
nchs, stride, nrows, ncols, datatype,mus):
rtol,atol=1e-2,1e-5
if isdevicetest:
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
else:
device = torch.device("cpu")
omgs = OrthonormalMatrixGenerationSystem(dtype=datatype,partial_difference=False)
# Parameters
nSamples = 8
nDecs = stride[0]*stride[1] # math.prod(stride)
nChsTotal = sum(nchs)
nAnglesH = int((nChsTotal-2)*nChsTotal/8)
anglesW = torch.randn(nAnglesH,dtype=datatype)
anglesU = torch.randn(nAnglesH,dtype=datatype)
mus = 1
# nSamples x nRows x nCols x nDecs
X = torch.randn(nSamples,nrows,ncols,nDecs,dtype=datatype,device=device,requires_grad=True)
dLdZ = torch.randn(nSamples,nrows,ncols,nChsTotal,dtype=datatype)
dLdZ = dLdZ.to(device)
# Expected values
ps,pa = nchs
anglesWNoDcLeak = anglesW.clone()
anglesWNoDcLeak[:ps-1] = torch.zeros(ps-1,dtype=datatype)
musW,musU = mus*torch.ones(ps,dtype=datatype),mus*torch.ones(pa,dtype=datatype)
musW[0] = 1
W0T = omgs(anglesWNoDcLeak,musW).T.to(device)
U0T = omgs(anglesU,musU).T.to(device)
# dLdX = dZdX x dLdZ
ms,ma = int(math.ceil(nDecs/2.)),int(math.floor(nDecs/2.))
Ys = dLdZ[:,:,:,:ps].view(nSamples*nrows*ncols,ps).T # ps * n
Ya = dLdZ[:,:,:,ps:].view(nSamples*nrows*ncols,pa).T # pa * n
Y = torch.cat(
( W0T[:ms,:] @ Ys, # ms x ps @ ps x n
U0T[:ma,:] @ Ya ), dim=0) # ma x pa @ pa x n
expctddLdX = Y.T.view(nSamples,nrows,ncols,nDecs).to(device) # n x (ms+ma)
# dLdWi = <dLdZ,(dVdWi)X>
expctddLdW_W = torch.zeros(nAnglesH,dtype=datatype).to(device)
expctddLdW_U = torch.zeros(nAnglesH,dtype=datatype).to(device)
omgs.partial_difference = True
for iAngle in range(nAnglesH):
dW0 = omgs(anglesWNoDcLeak,mus,index_pd_angle=iAngle).to(device)
Xs = X[:,:,:,:ms].view(-1,ms).T
Zs = dW0[:,:ms] @ Xs # ps x n
expctddLdW_W[iAngle] = torch.sum(Ys * Zs) # ps x n
if ma>0:
dU0 = omgs(anglesU,mus,index_pd_angle=iAngle).to(device)
Xa = X[:,:,:,ms:].view(-1,ma).T
Za = dU0[:,:ma] @ Xa # pa x n
expctddLdW_U[iAngle] = torch.sum(Ya * Za) # pa x n
# Instantiation of target class
layer = NsoltInitialRotation2dLayer(
number_of_channels=nchs,
decimation_factor=stride,
no_dc_leakage=True,
name='V0')
layer.orthTransW0.angles.data = anglesW
layer.orthTransW0.mus = musW
layer.orthTransU0.angles.data = anglesU
layer.orthTransU0.mus = musU
layer = layer.to(device)
# Actual values
torch.autograd.set_detect_anomaly(True)
Z = layer.forward(X)
layer.zero_grad()
Z.backward(dLdZ)
actualdLdX = X.grad
actualdLdW_W = layer.orthTransW0.angles.grad
actualdLdW_U = layer.orthTransU0.angles.grad
# Evaluation
self.assertEqual(actualdLdX.dtype,datatype)
self.assertEqual(actualdLdW_W.dtype,datatype)
self.assertEqual(actualdLdW_U.dtype,datatype)
self.assertTrue(torch.allclose(actualdLdX,expctddLdX,rtol=rtol,atol=atol))
self.assertTrue(torch.allclose(actualdLdW_W,expctddLdW_W,rtol=rtol,atol=atol))
self.assertTrue(torch.allclose(actualdLdW_U,expctddLdW_U,rtol=rtol,atol=atol))
self.assertTrue(Z.requires_grad)
@parameterized.expand(
list(itertools.product(nchs,stride))
)
def testGradCheck(self,nchs,stride):
datatype = torch.double
if isdevicetest:
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
else:
device = torch.device("cpu")
# Configuration
ps, pa = nchs
nrows = 2
ncols = 3
nSamples = 2
nDecs = stride[0]*stride[1] # math.prod(stride)
nChsTotal = sum(nchs)
nAnglesW = int((ps-1)*ps/2)
anglesW = torch.randn(nAnglesW,dtype=datatype)
musW = (-1)**torch.randint(high=2,size=(ps,))
nAnglesU = int((pa-1)*pa/2)
anglesU = torch.randn(nAnglesU,dtype=datatype)
musU = (-1)**torch.randint(high=2,size=(pa,))
# nSamples x nRows x nCols x nDecs
X = torch.randn(nSamples,nrows,ncols,nDecs,dtype=datatype,device=device,requires_grad=True)
dLdZ = torch.randn(nSamples,nrows,ncols,nChsTotal,dtype=datatype)
dLdZ = dLdZ.to(device)
# Instantiation of target class
layer = NsoltInitialRotation2dLayer(
number_of_channels=nchs,
decimation_factor=stride,
no_dc_leakage=False,
name='V0'
)
layer.orthTransW0.angles.data = anglesW
layer.orthTransW0.mus = musW
layer.orthTransU0.angles.data = anglesU
layer.orthTransU0.mus = musU
layer = layer.to(device)
# Forward
torch.autograd.set_detect_anomaly(True)
Z = layer.forward(X)
layer.zero_grad()
# Evaluation
self.assertTrue(torch.autograd.gradcheck(layer,(X,)))
@parameterized.expand(
list(itertools.product(nchs,stride))
)
def testGradCheckNoDcLeakage(self,nchs,stride):
datatype = torch.double
if isdevicetest:
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
else:
device = torch.device("cpu")
# Configuration
ps, pa = nchs
nrows = 2
ncols = 3
nSamples = 2
nDecs = stride[0]*stride[1] # math.prod(stride)
nChsTotal = sum(nchs)
nAnglesW = int((ps-1)*ps/2)
anglesW = torch.randn(nAnglesW,dtype=datatype)
musW = (-1)**torch.randint(high=2,size=(ps,))
nAnglesU = int((pa-1)*pa/2)
anglesU = torch.randn(nAnglesU,dtype=datatype)
musU = (-1)**torch.randint(high=2,size=(pa,))
# nSamples x nRows x nCols x nDecs
X = torch.randn(nSamples,nrows,ncols,nDecs,dtype=datatype,device=device,requires_grad=True)
dLdZ = torch.randn(nSamples,nrows,ncols,nChsTotal,dtype=datatype)
dLdZ = dLdZ.to(device)
# Instantiation of target class
layer = NsoltInitialRotation2dLayer(
number_of_channels=nchs,
decimation_factor=stride,
no_dc_leakage=True,
name='V0'
)
layer.orthTransW0.angles.data = anglesW
layer.orthTransW0.mus = musW
layer.orthTransU0.angles.data = anglesU
layer.orthTransU0.mus = musU
layer = layer.to(device)
# Forward
torch.autograd.set_detect_anomaly(True)
Z = layer.forward(X)
layer.zero_grad()
# Evaluation
self.assertTrue(torch.autograd.gradcheck(layer,(X,)))
if __name__ == '__main__':
unittest.main()
| 38.956081 | 99 | 0.585075 | 2,707 | 23,062 | 4.934983 | 0.085334 | 0.056441 | 0.033685 | 0.012576 | 0.879183 | 0.875964 | 0.874242 | 0.865259 | 0.849315 | 0.847444 | 0 | 0.016893 | 0.291562 | 23,062 | 591 | 100 | 39.021997 | 0.800771 | 0.092533 | 0 | 0.838496 | 0 | 0 | 0.008344 | 0 | 0 | 0 | 0 | 0 | 0.077434 | 1 | 0.019912 | false | 0 | 0.017699 | 0 | 0.039823 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
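Throughout these tests the nDecs decimation coefficients are split as `ms = ceil(nDecs/2)` rows fed to W0 and `ma = floor(nDecs/2)` rows fed to U0, and with identity rotations the result is zero-padded up to `ps` and `pa` channels. A plain-Python sketch of that bookkeeping (assumption: flat lists instead of torch tensors; function names are mine):

```python
import math


def split_coefficients(coeffs):
    """Split n coefficients into the ms = ceil(n/2) part for W0 and the
    ma = floor(n/2) part for U0."""
    n = len(coeffs)
    ms = int(math.ceil(n / 2.0))
    ma = int(math.floor(n / 2.0))
    return coeffs[:ms], coeffs[ms:], ms, ma


def initial_rotation_identity(coeffs, ps, pa):
    """Apply the initial rotation with identity W0/U0: the first ms
    coefficients land in the first ms of ps 'symmetric' channels, the last
    ma in the first ma of pa 'antisymmetric' channels; the rest are zero."""
    top, bottom, ms, ma = split_coefficients(coeffs)
    sym = list(top) + [0.0] * (ps - ms)
    anti = list(bottom) + [0.0] * (pa - ma)
    return sym + anti
```

For stride (2, 2), nDecs = 4 and (ps, pa) = (3, 3) gives `[1, 2, 0, 3, 4, 0]` for input `[1, 2, 3, 4]`, matching what `Zsa[:ps] = W0[:, :ms] @ Ys` produces when W0 and U0 are identity.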
69e10b65c63602fecdbecc5f81816ceb25680a69 | 539,172 | py | Python | nova/db/sqlalchemy/api.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/db/sqlalchemy/api.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/db/sqlalchemy/api.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | 2 | 2017-07-20T17:31:34.000Z | 2020-07-24T02:42:19.000Z | begin_unit
# Copyright (c) 2011 X.commerce, a business unit of eBay Inc.
# Copyright 2010 United States Government as represented by the
# Administrator of the National Aeronautics and Space Administration.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""Implementation of SQLAlchemy backend."""

import collections
import copy
import datetime
import functools
import inspect
import sys
import uuid

from oslo_config import cfg
from oslo_db import api as oslo_db_api
from oslo_db import exception as db_exc
from oslo_db import options as oslo_db_options
from oslo_db.sqlalchemy import enginefacade
from oslo_db.sqlalchemy import update_match
from oslo_db.sqlalchemy import utils as sqlalchemyutils
from oslo_log import log as logging
from oslo_utils import excutils
from oslo_utils import timeutils
from oslo_utils import uuidutils
import six
from six.moves import range
import sqlalchemy as sa
from sqlalchemy import and_
from sqlalchemy.exc import NoSuchTableError
from sqlalchemy import MetaData
from sqlalchemy import or_
from sqlalchemy.orm import aliased
from sqlalchemy.orm import contains_eager
from sqlalchemy.orm import joinedload
from sqlalchemy.orm import joinedload_all
from sqlalchemy.orm import noload
from sqlalchemy.orm import undefer
from sqlalchemy.schema import Table
from sqlalchemy import sql
from sqlalchemy.sql.expression import asc
from sqlalchemy.sql.expression import desc
from sqlalchemy.sql import false
from sqlalchemy.sql import func
from sqlalchemy.sql import null
from sqlalchemy.sql import true

from nova import block_device
from nova.compute import task_states
from nova.compute import vm_states
import nova.conf
import nova.context
from nova.db.sqlalchemy import models
from nova import exception
from nova.i18n import _, _LI, _LE, _LW
from nova.objects import fields
from nova import quota
from nova import safe_utils

db_opts = [
    cfg.StrOpt('osapi_compute_unique_server_name_scope',
               default='',
               help='When set, compute API will consider duplicate hostnames '
                    'invalid within the specified scope, regardless of case. '
                    'Should be empty, "project" or "global".'),
]

api_db_opts = [
    cfg.StrOpt('connection',
               help='The SQLAlchemy connection string to use to connect to '
                    'the Nova API database.',
               secret=True),
    cfg.BoolOpt('sqlite_synchronous',
                default=True,
                help='If True, SQLite uses synchronous mode.'),
    cfg.StrOpt('slave_connection',
               secret=True,
               help='The SQLAlchemy connection string to use to connect to the'
                    ' slave database.'),
    cfg.StrOpt('mysql_sql_mode',
               default='TRADITIONAL',
               help='The SQL mode to be used for MySQL sessions. '
                    'This option, including the default, overrides any '
                    'server-set SQL mode. To use whatever SQL mode '
                    'is set by the server configuration, '
                    'set this to no value. Example: mysql_sql_mode='),
    cfg.IntOpt('idle_timeout',
               default=3600,
               help='Timeout before idle SQL connections are reaped.'),
    cfg.IntOpt('max_pool_size',
               help='Maximum number of SQL connections to keep open in a '
                    'pool.'),
    cfg.IntOpt('max_retries',
               default=10,
               help='Maximum number of database connection retries '
                    'during startup. Set to -1 to specify an infinite '
                    'retry count.'),
    cfg.IntOpt('retry_interval',
               default=10,
               help='Interval between retries of opening a SQL connection.'),
    cfg.IntOpt('max_overflow',
               help='If set, use this value for max_overflow with '
                    'SQLAlchemy.'),
    cfg.IntOpt('connection_debug',
               default=0,
               help='Verbosity of SQL debugging information: 0=None, '
                    '100=Everything.'),
    cfg.BoolOpt('connection_trace',
                default=False,
                help='Add Python stack traces to SQL as comment strings.'),
    cfg.IntOpt('pool_timeout',
               help='If set, use this value for pool_timeout with '
                    'SQLAlchemy.'),
]

CONF = nova.conf.CONF
CONF.register_opts(db_opts)
CONF.register_opts(oslo_db_options.database_opts, 'database')
CONF.register_opts(api_db_opts, group='api_database')

LOG = logging.getLogger(__name__)

main_context_manager = enginefacade.transaction_context()
api_context_manager = enginefacade.transaction_context()


def _get_db_conf(conf_group, connection=None):
    kw = dict(
        connection=connection or conf_group.connection,
        slave_connection=conf_group.slave_connection,
        sqlite_fk=False,
        __autocommit=True,
        expire_on_commit=False,
        mysql_sql_mode=conf_group.mysql_sql_mode,
        idle_timeout=conf_group.idle_timeout,
        connection_debug=conf_group.connection_debug,
        max_pool_size=conf_group.max_pool_size,
        max_overflow=conf_group.max_overflow,
        pool_timeout=conf_group.pool_timeout,
        sqlite_synchronous=conf_group.sqlite_synchronous,
        connection_trace=conf_group.connection_trace,
        max_retries=conf_group.max_retries,
        retry_interval=conf_group.retry_interval)
    return kw


def _context_manager_from_context(context):
    if context:
        try:
            return context.db_connection
        except AttributeError:
            pass


def configure(conf):
    main_context_manager.configure(**_get_db_conf(conf.database))
    api_context_manager.configure(**_get_db_conf(conf.api_database))


def create_context_manager(connection=None):
    """Create a database context manager object.

    :param connection: The database connection string
    """
    ctxt_mgr = enginefacade.transaction_context()
    ctxt_mgr.configure(**_get_db_conf(CONF.database, connection=connection))
    return ctxt_mgr


def get_context_manager(context):
    """Get a database context manager object.

    :param context: The request context that can contain a context manager
    """
    return _context_manager_from_context(context) or main_context_manager


def get_engine(use_slave=False, context=None):
    """Get a database engine object.

    :param use_slave: Whether to use the slave connection
    :param context: The request context that can contain a context manager
    """
    ctxt_mgr = _context_manager_from_context(context) or main_context_manager
    return ctxt_mgr.get_legacy_facade().get_engine(use_slave=use_slave)


def get_api_engine():
    return api_context_manager.get_legacy_facade().get_engine()


_SHADOW_TABLE_PREFIX = 'shadow_'
_DEFAULT_QUOTA_NAME = 'default'
PER_PROJECT_QUOTAS = ['fixed_ips', 'floating_ips', 'networks']


def get_backend():
    """The backend is this module itself."""
    return sys.modules[__name__]


def require_context(f):
    """Decorator to require *any* user or admin context.

    This does no authorization for user or project access matching, see
    :py:func:`nova.context.authorize_project_context` and
    :py:func:`nova.context.authorize_user_context`.

    The first argument to the wrapped function must be the context.

    """

    @functools.wraps(f)
    def wrapper(*args, **kwargs):
        nova.context.require_context(args[0])
        return f(*args, **kwargs)
    return wrapper


def require_instance_exists_using_uuid(f):
    """Decorator to require the specified instance to exist.

    Requires the wrapped function to use context and instance_uuid as
    their first two arguments.
    """
    @functools.wraps(f)
    def wrapper(context, instance_uuid, *args, **kwargs):
        instance_get_by_uuid(context, instance_uuid)
        return f(context, instance_uuid, *args, **kwargs)

    return wrapper


def require_aggregate_exists(f):
    """Decorator to require the specified aggregate to exist.

    Requires the wrapped function to use context and aggregate_id as
    their first two arguments.
    """

    @functools.wraps(f)
    def wrapper(context, aggregate_id, *args, **kwargs):
        aggregate_get(context, aggregate_id)
        return f(context, aggregate_id, *args, **kwargs)
    return wrapper


def select_db_reader_mode(f):
    """Decorator to select synchronous or asynchronous reader mode.

    The kwarg argument 'use_slave' defines reader mode. Asynchronous reader
    will be used if 'use_slave' is True and synchronous reader otherwise.
    If 'use_slave' is not specified default value 'False' will be used.

    Wrapped function must have a context in the arguments.
    """

    @functools.wraps(f)
    def wrapper(*args, **kwargs):
        wrapped_func = safe_utils.get_wrapped_function(f)
        keyed_args = inspect.getcallargs(wrapped_func, *args, **kwargs)

        context = keyed_args['context']
        use_slave = keyed_args.get('use_slave', False)

        if use_slave:
            reader_mode = main_context_manager.async
        else:
            reader_mode = main_context_manager.reader

        with reader_mode.using(context):
            return f(*args, **kwargs)
    return wrapper


def pick_context_manager_writer(f):
    """Decorator to use a writer db context manager.

    The db context manager will be picked from the RequestContext.

    Wrapped function must have a RequestContext in the arguments.
    """
    @functools.wraps(f)
    def wrapped(context, *args, **kwargs):
        ctxt_mgr = get_context_manager(context)
        with ctxt_mgr.writer.using(context):
            return f(context, *args, **kwargs)
    return wrapped
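`pick_context_manager_writer` wraps a function so its body runs inside a writer transaction scoped to the request context. The same decorator shape can be sketched with the standard library alone (`writer_session` and `create_record` below are hypothetical stand-ins, not nova APIs):

```python
import contextlib
import functools


@contextlib.contextmanager
def writer_session(context):
    # Hypothetical stand-in for ctxt_mgr.writer.using(context):
    # attach a "session" to the context for the duration of the call.
    context['session'] = 'open'
    try:
        yield
    finally:
        context['session'] = 'closed'


def pick_writer(f):
    """Mimic pick_context_manager_writer: run f inside a writer session."""
    @functools.wraps(f)
    def wrapped(context, *args, **kwargs):
        with writer_session(context):
            return f(context, *args, **kwargs)
    return wrapped


@pick_writer
def create_record(context, value):
    # Inside the wrapper the session is open.
    return (context['session'], value)


ctx = {}
assert create_record(ctx, 42) == ('open', 42)
assert ctx['session'] == 'closed'  # closed again once the call returns
```

`functools.wraps` matters here for the same reason it does in the nova code: `select_db_reader_mode` inspects the wrapped function's signature, which only works if the decorator preserves it.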


def pick_context_manager_reader(f):
    """Decorator to use a reader db context manager.

    The db context manager will be picked from the RequestContext.

    Wrapped function must have a RequestContext in the arguments.
    """
    @functools.wraps(f)
    def wrapped(context, *args, **kwargs):
        ctxt_mgr = get_context_manager(context)
        with ctxt_mgr.reader.using(context):
            return f(context, *args, **kwargs)
    return wrapped


def pick_context_manager_reader_allow_async(f):
    """Decorator to use a reader.allow_async db context manager.

    The db context manager will be picked from the RequestContext.

    Wrapped function must have a RequestContext in the arguments.
    """
    @functools.wraps(f)
    def wrapped(context, *args, **kwargs):
        ctxt_mgr = get_context_manager(context)
        with ctxt_mgr.reader.allow_async.using(context):
            return f(context, *args, **kwargs)
    return wrapped


def model_query(context, model,
                args=None,
                read_deleted=None,
                project_only=False):
    """Query helper that accounts for context's `read_deleted` field.

    :param context: NovaContext of the query.
    :param model: Model to query. Must be a subclass of ModelBase.
    :param args: Arguments to query. If None - model is used.
    :param read_deleted: If not None, overrides context's read_deleted field.
                         Permitted values are 'no', which does not return
                         deleted values; 'only', which only returns deleted
                         values; and 'yes', which does not filter deleted
                         values.
    :param project_only: If set and context is user-type, then restrict
                         query to match the context's project_id. If set to
                         'allow_none', restriction includes project_id = None.
    """

    if read_deleted is None:
        read_deleted = context.read_deleted

    query_kwargs = {}
    if 'no' == read_deleted:
        query_kwargs['deleted'] = False
    elif 'only' == read_deleted:
        query_kwargs['deleted'] = True
    elif 'yes' == read_deleted:
        pass
    else:
        raise ValueError(_("Unrecognized read_deleted value '%s'")
                         % read_deleted)

    query = sqlalchemyutils.model_query(
        model, context.session, args, **query_kwargs)

    # We can't use oslo.db model_query's project_id here, as it doesn't allow
    # us to return both our projects and unowned projects.
    if nova.context.is_user_context(context) and project_only:
        if project_only == 'allow_none':
            query = query.filter(or_(model.project_id == context.project_id,
                                     model.project_id == null()))
        else:
            query = query.filter_by(project_id=context.project_id)

    return query


def convert_objects_related_datetimes(values, *datetime_keys):
    if not datetime_keys:
        datetime_keys = ('created_at', 'deleted_at', 'updated_at')

    for key in datetime_keys:
        if key in values and values[key]:
            if isinstance(values[key], six.string_types):
                try:
                    values[key] = timeutils.parse_strtime(values[key])
                except ValueError:
                    # Try alternate parsing since parse_strtime will fail
                    # with say converting '2015-05-28T19:59:38+00:00'
                    values[key] = timeutils.parse_isotime(values[key])
            # NOTE(danms): Strip UTC timezones from datetimes, since they're
            # stored that way in the database
            values[key] = values[key].replace(tzinfo=None)
    return values
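`convert_objects_related_datetimes` first tries oslo's strict `parse_strtime`, falls back to `parse_isotime` for strings carrying a UTC offset, then strips the timezone so the value matches what the database stores. A stdlib-only sketch of that fallback (the format strings here are assumptions for illustration; oslo's defaults differ in detail):

```python
from datetime import datetime


def parse_db_datetime(value):
    """Parse a datetime string, falling back to an offset-bearing ISO-8601
    form, then strip the timezone (values are stored naive)."""
    try:
        # Strict format, roughly what parse_strtime expects.
        dt = datetime.strptime(value, '%Y-%m-%dT%H:%M:%S.%f')
    except ValueError:
        # Fallback for strings like '2015-05-28T19:59:38+00:00',
        # roughly what parse_isotime handles.
        dt = datetime.strptime(value, '%Y-%m-%dT%H:%M:%S%z')
    return dt.replace(tzinfo=None)


# The offset-bearing form fails the first format and hits the fallback.
assert parse_db_datetime('2015-05-28T19:59:38+00:00') == \
    datetime(2015, 5, 28, 19, 59, 38)
```

Note the `%z` directive accepting a colon in the offset requires Python 3.7 or newer.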
nl|'\n'
nl|'\n'
DECL|function|_sync_instances
dedent|''
name|'def'
name|'_sync_instances'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'dict'
op|'('
name|'zip'
op|'('
op|'('
string|"'instances'"
op|','
string|"'cores'"
op|','
string|"'ram'"
op|')'
op|','
nl|'\n'
name|'_instance_data_get_for_user'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
op|')'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_sync_floating_ips
dedent|''
name|'def'
name|'_sync_floating_ips'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'dict'
op|'('
name|'floating_ips'
op|'='
name|'_floating_ip_count_by_project'
op|'('
nl|'\n'
name|'context'
op|','
name|'project_id'
op|')'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_sync_fixed_ips
dedent|''
name|'def'
name|'_sync_fixed_ips'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'dict'
op|'('
name|'fixed_ips'
op|'='
name|'_fixed_ip_count_by_project'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_sync_security_groups
dedent|''
name|'def'
name|'_sync_security_groups'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'dict'
op|'('
name|'security_groups'
op|'='
name|'_security_group_count_by_project_and_user'
op|'('
nl|'\n'
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_sync_server_groups
dedent|''
name|'def'
name|'_sync_server_groups'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'dict'
op|'('
name|'server_groups'
op|'='
name|'_instance_group_count_by_project_and_user'
op|'('
nl|'\n'
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|variable|QUOTA_SYNC_FUNCTIONS
dedent|''
name|'QUOTA_SYNC_FUNCTIONS'
op|'='
op|'{'
nl|'\n'
string|"'_sync_instances'"
op|':'
name|'_sync_instances'
op|','
nl|'\n'
string|"'_sync_floating_ips'"
op|':'
name|'_sync_floating_ips'
op|','
nl|'\n'
string|"'_sync_fixed_ips'"
op|':'
name|'_sync_fixed_ips'
op|','
nl|'\n'
string|"'_sync_security_groups'"
op|':'
name|'_sync_security_groups'
op|','
nl|'\n'
string|"'_sync_server_groups'"
op|':'
name|'_sync_server_groups'
op|','
nl|'\n'
op|'}'
newline|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
DECL|function|constraint
name|'def'
name|'constraint'
op|'('
op|'**'
name|'conditions'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'Constraint'
op|'('
name|'conditions'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|equal_any
dedent|''
name|'def'
name|'equal_any'
op|'('
op|'*'
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'EqualityCondition'
op|'('
name|'values'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|not_equal
dedent|''
name|'def'
name|'not_equal'
op|'('
op|'*'
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'InequalityCondition'
op|'('
name|'values'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|Constraint
dedent|''
name|'class'
name|'Constraint'
op|'('
name|'object'
op|')'
op|':'
newline|'\n'
nl|'\n'
DECL|member|__init__
indent|' '
name|'def'
name|'__init__'
op|'('
name|'self'
op|','
name|'conditions'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'conditions'
op|'='
name|'conditions'
newline|'\n'
nl|'\n'
DECL|member|apply
dedent|''
name|'def'
name|'apply'
op|'('
name|'self'
op|','
name|'model'
op|','
name|'query'
op|')'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'key'
op|','
name|'condition'
name|'in'
name|'self'
op|'.'
name|'conditions'
op|'.'
name|'items'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'clause'
name|'in'
name|'condition'
op|'.'
name|'clauses'
op|'('
name|'getattr'
op|'('
name|'model'
op|','
name|'key'
op|')'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'clause'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'return'
name|'query'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|EqualityCondition
dedent|''
dedent|''
name|'class'
name|'EqualityCondition'
op|'('
name|'object'
op|')'
op|':'
newline|'\n'
nl|'\n'
DECL|member|__init__
indent|' '
name|'def'
name|'__init__'
op|'('
name|'self'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'values'
op|'='
name|'values'
newline|'\n'
nl|'\n'
DECL|member|clauses
dedent|''
name|'def'
name|'clauses'
op|'('
name|'self'
op|','
name|'field'
op|')'
op|':'
newline|'\n'
comment|'# The method signature requires us to return an iterable, even though'
nl|'\n'
comment|'# for the OR operator this will actually be a single clause.'
nl|'\n'
indent|' '
name|'return'
op|'['
name|'or_'
op|'('
op|'*'
op|'['
name|'field'
op|'=='
name|'value'
name|'for'
name|'value'
name|'in'
name|'self'
op|'.'
name|'values'
op|']'
op|')'
op|']'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|InequalityCondition
dedent|''
dedent|''
name|'class'
name|'InequalityCondition'
op|'('
name|'object'
op|')'
op|':'
newline|'\n'
nl|'\n'
DECL|member|__init__
indent|' '
name|'def'
name|'__init__'
op|'('
name|'self'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'values'
op|'='
name|'values'
newline|'\n'
nl|'\n'
DECL|member|clauses
dedent|''
name|'def'
name|'clauses'
op|'('
name|'self'
op|','
name|'field'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
op|'['
name|'field'
op|'!='
name|'value'
name|'for'
name|'value'
name|'in'
name|'self'
op|'.'
name|'values'
op|']'
newline|'\n'
nl|'\n'
nl|'\n'
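The `constraint()` / `equal_any()` / `not_equal()` helpers above build a `Constraint` whose `apply()` chains one SQLAlchemy `filter()` call per clause. The following is a dependency-free sketch of that same pattern, substituting plain predicates over dicts for SQLAlchemy clauses (all names here are illustrative, not the real API):

```python
class EqualAny:
    """OR semantics: the field must equal one of the given values."""
    def __init__(self, *values):
        self.values = values

    def clauses(self, field):
        # Like EqualityCondition.clauses(), return an iterable even
        # though the OR collapses into a single clause.
        return [lambda row, f=field: row[f] in self.values]


class NotEqual:
    """AND semantics: the field must differ from every given value."""
    def __init__(self, *values):
        self.values = values

    def clauses(self, field):
        return [lambda row, f=field, v=v: row[f] != v for v in self.values]


class Constraint:
    def __init__(self, conditions):
        self.conditions = conditions

    def apply(self, rows):
        # Mirror Constraint.apply(): chain one filter per clause.
        for field, condition in self.conditions.items():
            for clause in condition.clauses(field):
                rows = [r for r in rows if clause(r)]
        return rows


def constraint(**conditions):
    return Constraint(conditions)


rows = [{'host': 'a', 'topic': 'compute'},
        {'host': 'b', 'topic': 'network'},
        {'host': 'c', 'topic': 'compute'}]
result = constraint(topic=EqualAny('compute'), host=NotEqual('a')).apply(rows)
print(result)  # [{'host': 'c', 'topic': 'compute'}]
```

The design point survives the simplification: condition objects expose `clauses(field)` so `Constraint.apply()` never needs to know whether a condition expands to one clause (OR) or many (AND).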
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|service_destroy
name|'def'
name|'service_destroy'
op|'('
name|'context'
op|','
name|'service_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'service'
op|'='
name|'service_get'
op|'('
name|'context'
op|','
name|'service_id'
op|')'
newline|'\n'
nl|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Service'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'service_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
comment|'# TODO(sbauza): Remove the service_id filter in a later release'
nl|'\n'
comment|'# once we are sure that all compute nodes report the host field'
nl|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'ComputeNode'
op|')'
op|'.'
name|'filter'
op|'('
name|'or_'
op|'('
name|'models'
op|'.'
name|'ComputeNode'
op|'.'
name|'service_id'
op|'=='
name|'service_id'
op|','
nl|'\n'
name|'models'
op|'.'
name|'ComputeNode'
op|'.'
name|'host'
op|'=='
name|'service'
op|'['
string|"'host'"
op|']'
op|')'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|service_get
name|'def'
name|'service_get'
op|'('
name|'context'
op|','
name|'service_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Service'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'service_id'
op|')'
newline|'\n'
nl|'\n'
name|'result'
op|'='
name|'query'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ServiceNotFound'
op|'('
name|'service_id'
op|'='
name|'service_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
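`service_get()` shows the first-or-raise lookup pattern used throughout this module: fetch the first matching row and translate an empty result into a domain-specific exception. A minimal sketch of the same flow, using stdlib `sqlite3` in place of SQLAlchemy (table layout and exception name are simplified stand-ins):

```python
import sqlite3


class ServiceNotFound(Exception):
    pass


conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE services (id INTEGER PRIMARY KEY, host TEXT)')
conn.execute("INSERT INTO services VALUES (1, 'node1')")


def service_get(conn, service_id):
    # query.first() in the original maps to fetchone() here; both
    # return None when no row matches.
    row = conn.execute('SELECT id, host FROM services WHERE id = ?',
                       (service_id,)).fetchone()
    if not row:
        # Raising a specific exception lets callers distinguish
        # "missing row" from other database errors.
        raise ServiceNotFound(service_id)
    return row


print(service_get(conn, 1))  # (1, 'node1')
```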
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader_allow_async'
newline|'\n'
DECL|function|service_get_minimum_version
name|'def'
name|'service_get_minimum_version'
op|'('
name|'context'
op|','
name|'binaries'
op|')'
op|':'
newline|'\n'
indent|' '
name|'min_versions'
op|'='
name|'context'
op|'.'
name|'session'
op|'.'
name|'query'
op|'('
nl|'\n'
name|'models'
op|'.'
name|'Service'
op|'.'
name|'binary'
op|','
nl|'\n'
name|'func'
op|'.'
name|'min'
op|'('
name|'models'
op|'.'
name|'Service'
op|'.'
name|'version'
op|')'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Service'
op|'.'
name|'binary'
op|'.'
name|'in_'
op|'('
name|'binaries'
op|')'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Service'
op|'.'
name|'forced_down'
op|'=='
name|'false'
op|'('
op|')'
op|')'
op|'.'
name|'group_by'
op|'('
name|'models'
op|'.'
name|'Service'
op|'.'
name|'binary'
op|')'
newline|'\n'
name|'return'
name|'dict'
op|'('
name|'min_versions'
op|')'
newline|'\n'
nl|'\n'
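`service_get_minimum_version()` computes, per binary, the lowest version among services that are not forced down, then returns the `(binary, min_version)` pairs as a dict. The equivalent aggregation in plain SQL, sketched with stdlib `sqlite3` (schema simplified to the three columns the query touches):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE services '
             '(binary TEXT, version INTEGER, forced_down INTEGER)')
conn.executemany('INSERT INTO services VALUES (?, ?, ?)', [
    ('nova-compute', 15, 0),
    ('nova-compute', 16, 0),
    ('nova-compute', 9, 1),    # forced-down services are excluded
    ('nova-conductor', 16, 0),
])

# MIN(version) GROUP BY binary mirrors the func.min()/group_by() chain;
# dict() over the result pairs mirrors the function's return value.
min_versions = dict(conn.execute(
    'SELECT binary, MIN(version) FROM services '
    'WHERE binary IN (?, ?) AND forced_down = 0 '
    'GROUP BY binary',
    ('nova-compute', 'nova-conductor')).fetchall())
print(min_versions)
```

Note how the forced-down row with version 9 does not drag the minimum down, which is the point of the `forced_down == false()` filter.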
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|service_get_all
name|'def'
name|'service_get_all'
op|'('
name|'context'
op|','
name|'disabled'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Service'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'disabled'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter_by'
op|'('
name|'disabled'
op|'='
name|'disabled'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'query'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|service_get_all_by_topic
name|'def'
name|'service_get_all_by_topic'
op|'('
name|'context'
op|','
name|'topic'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Service'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'disabled'
op|'='
name|'False'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'topic'
op|'='
name|'topic'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|service_get_by_host_and_topic
name|'def'
name|'service_get_by_host_and_topic'
op|'('
name|'context'
op|','
name|'host'
op|','
name|'topic'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Service'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'disabled'
op|'='
name|'False'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'host'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'topic'
op|'='
name|'topic'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|service_get_all_by_binary
name|'def'
name|'service_get_all_by_binary'
op|'('
name|'context'
op|','
name|'binary'
op|','
name|'include_disabled'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Service'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'binary'
op|'='
name|'binary'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'include_disabled'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter_by'
op|'('
name|'disabled'
op|'='
name|'False'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'query'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|service_get_by_host_and_binary
name|'def'
name|'service_get_by_host_and_binary'
op|'('
name|'context'
op|','
name|'host'
op|','
name|'binary'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Service'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'host'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'binary'
op|'='
name|'binary'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'HostBinaryNotFound'
op|'('
name|'host'
op|'='
name|'host'
op|','
name|'binary'
op|'='
name|'binary'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|service_get_all_by_host
name|'def'
name|'service_get_all_by_host'
op|'('
name|'context'
op|','
name|'host'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Service'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'host'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader_allow_async'
newline|'\n'
DECL|function|service_get_by_compute_host
name|'def'
name|'service_get_by_compute_host'
op|'('
name|'context'
op|','
name|'host'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Service'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'host'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'binary'
op|'='
string|"'nova-compute'"
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ComputeHostNotFound'
op|'('
name|'host'
op|'='
name|'host'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|service_create
name|'def'
name|'service_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'service_ref'
op|'='
name|'models'
op|'.'
name|'Service'
op|'('
op|')'
newline|'\n'
name|'service_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'CONF'
op|'.'
name|'enable_new_services'
op|':'
newline|'\n'
indent|' '
name|'service_ref'
op|'.'
name|'disabled'
op|'='
name|'True'
newline|'\n'
dedent|''
name|'try'
op|':'
newline|'\n'
indent|' '
name|'service_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
name|'as'
name|'e'
op|':'
newline|'\n'
indent|' '
name|'if'
string|"'binary'"
name|'in'
name|'e'
op|'.'
name|'columns'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ServiceBinaryExists'
op|'('
name|'host'
op|'='
name|'values'
op|'.'
name|'get'
op|'('
string|"'host'"
op|')'
op|','
nl|'\n'
name|'binary'
op|'='
name|'values'
op|'.'
name|'get'
op|'('
string|"'binary'"
op|')'
op|')'
newline|'\n'
dedent|''
name|'raise'
name|'exception'
op|'.'
name|'ServiceTopicExists'
op|'('
name|'host'
op|'='
name|'values'
op|'.'
name|'get'
op|'('
string|"'host'"
op|')'
op|','
nl|'\n'
name|'topic'
op|'='
name|'values'
op|'.'
name|'get'
op|'('
string|"'topic'"
op|')'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'service_ref'
newline|'\n'
nl|'\n'
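`service_create()` relies on a database uniqueness constraint and translates the resulting `DBDuplicateEntry` into a domain exception rather than checking for duplicates up front (which would race). A sketch of that error-mapping shape with stdlib `sqlite3`; the real code inspects `e.columns` to choose between `ServiceBinaryExists` and `ServiceTopicExists`, which this sketch collapses into one exception:

```python
import sqlite3


class ServiceBinaryExists(Exception):
    pass


conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE services '
             '(host TEXT, binary TEXT, UNIQUE (host, binary))')


def service_create(conn, host, binary):
    try:
        conn.execute('INSERT INTO services VALUES (?, ?)', (host, binary))
    except sqlite3.IntegrityError:
        # Re-raise as a domain-specific error so API layers can return
        # a meaningful conflict response instead of a raw DB error.
        raise ServiceBinaryExists((host, binary))


service_create(conn, 'node1', 'nova-compute')
try:
    service_create(conn, 'node1', 'nova-compute')
except ServiceBinaryExists as exc:
    print('duplicate:', exc)
```

Letting the constraint do the checking keeps create atomic: two concurrent inserts cannot both succeed, and exactly one caller sees the conflict exception.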
nl|'\n'
dedent|''
op|'@'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|')'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|service_update
name|'def'
name|'service_update'
op|'('
name|'context'
op|','
name|'service_id'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'service_ref'
op|'='
name|'service_get'
op|'('
name|'context'
op|','
name|'service_id'
op|')'
newline|'\n'
comment|'# Only servicegroup.drivers.db.DbDriver._report_state() updates'
nl|'\n'
comment|"# 'report_count', so if that value changes then store the timestamp"
nl|'\n'
comment|'# as the last time we got a state report.'
nl|'\n'
name|'if'
string|"'report_count'"
name|'in'
name|'values'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'values'
op|'['
string|"'report_count'"
op|']'
op|'>'
name|'service_ref'
op|'.'
name|'report_count'
op|':'
newline|'\n'
indent|' '
name|'service_ref'
op|'.'
name|'last_seen_up'
op|'='
name|'timeutils'
op|'.'
name|'utcnow'
op|'('
op|')'
newline|'\n'
dedent|''
dedent|''
name|'service_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
nl|'\n'
name|'return'
name|'service_ref'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
DECL|function|_compute_node_select
dedent|''
name|'def'
name|'_compute_node_select'
op|'('
name|'context'
op|','
name|'filters'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
comment|'# NOTE(jaypipes): With the addition of the resource-providers database'
nl|'\n'
comment|'# schema, inventory and allocation information for various resources'
nl|'\n'
comment|'# on a compute node are to be migrated from the compute_nodes and'
nl|'\n'
comment|'# instance_extra tables into the new inventories and allocations tables.'
nl|'\n'
comment|'# During the time that this data migration is ongoing we need to allow'
nl|'\n'
comment|'# the scheduler to essentially be blind to the underlying database'
nl|'\n'
comment|'# schema changes. So, this query here returns three sets of resource'
nl|'\n'
comment|'# attributes:'
nl|'\n'
comment|'# - inv_memory_mb, inv_memory_mb_used, inv_memory_mb_reserved,'
nl|'\n'
comment|'# inv_ram_allocation_ratio'
nl|'\n'
comment|'# - inv_vcpus, inv_vcpus_used, inv_cpu_allocation_ratio'
nl|'\n'
comment|'# - inv_local_gb, inv_local_gb_used, inv_local_gb_reserved,'
nl|'\n'
comment|'# inv_disk_allocation_ratio'
nl|'\n'
comment|'# These resource capacity/usage fields store the total and used values'
nl|'\n'
comment|'# for those three resource classes that are currently stored in similar'
nl|'\n'
comment|'# fields in the compute_nodes table (e.g. memory_mb and memory_mb_used)'
nl|'\n'
comment|'# The code that runs the online data migrations will be able to tell if'
nl|'\n'
comment|'# the compute node has had its inventory information moved to the'
nl|'\n'
comment|'# inventories table by checking for a non-None field value for the'
nl|'\n'
comment|'# inv_memory_mb, inv_vcpus, and inv_local_gb fields.'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# The SQLAlchemy code below produces the following SQL statement'
nl|'\n'
comment|'# exactly:'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# SELECT'
nl|'\n'
comment|'# cn.*,'
nl|'\n'
comment|'# ram_inv.total as inv_memory_mb,'
nl|'\n'
comment|'# ram_inv.reserved as inv_memory_mb_reserved,'
nl|'\n'
comment|'# ram_inv.allocation_ratio as inv_ram_allocation_ratio,'
nl|'\n'
comment|'# ram_usage.used as inv_memory_mb_used,'
nl|'\n'
comment|'# cpu_inv.total as inv_vcpus,'
nl|'\n'
comment|'# cpu_inv.allocation_ratio as inv_cpu_allocation_ratio,'
nl|'\n'
comment|'# cpu_usage.used as inv_vcpus_used,'
nl|'\n'
comment|'# disk_inv.total as inv_local_gb,'
nl|'\n'
comment|'# disk_inv.reserved as inv_local_gb_reserved,'
nl|'\n'
comment|'# disk_inv.allocation_ratio as inv_disk_allocation_ratio,'
nl|'\n'
comment|'# disk_usage.used as inv_local_gb_used'
nl|'\n'
comment|'# FROM compute_nodes AS cn'
nl|'\n'
comment|'# LEFT OUTER JOIN resource_providers AS rp'
nl|'\n'
comment|'# ON cn.uuid = rp.uuid'
nl|'\n'
comment|'# LEFT OUTER JOIN inventories AS ram_inv'
nl|'\n'
comment|'# ON rp.id = ram_inv.resource_provider_id'
nl|'\n'
comment|'# AND ram_inv.resource_class_id = :RAM_MB'
nl|'\n'
comment|'# LEFT OUTER JOIN ('
nl|'\n'
comment|'# SELECT resource_provider_id, SUM(used) as used'
nl|'\n'
comment|'# FROM allocations'
nl|'\n'
comment|'# WHERE resource_class_id = :RAM_MB'
nl|'\n'
comment|'# GROUP BY resource_provider_id'
nl|'\n'
comment|'# ) AS ram_usage'
nl|'\n'
comment|'# ON ram_inv.resource_provider_id = ram_usage.resource_provider_id'
nl|'\n'
comment|'# LEFT OUTER JOIN inventories AS cpu_inv'
nl|'\n'
comment|'# ON rp.id = cpu_inv.resource_provider_id'
nl|'\n'
comment|'# AND cpu_inv.resource_class_id = :VCPUS'
nl|'\n'
comment|'# LEFT OUTER JOIN ('
nl|'\n'
comment|'# SELECT resource_provider_id, SUM(used) as used'
nl|'\n'
comment|'# FROM allocations'
nl|'\n'
comment|'# WHERE resource_class_id = :VCPUS'
nl|'\n'
comment|'# GROUP BY resource_provider_id'
nl|'\n'
comment|'# ) AS cpu_usage'
nl|'\n'
comment|'# ON cpu_inv.resource_provider_id = cpu_usage.resource_provider_id'
nl|'\n'
comment|'# LEFT OUTER JOIN inventories AS disk_inv'
nl|'\n'
comment|'# ON rp.id = disk_inv.resource_provider_id'
nl|'\n'
comment|'# AND disk_inv.resource_class_id = :DISK_GB'
nl|'\n'
comment|'# LEFT OUTER JOIN ('
nl|'\n'
comment|'# SELECT resource_provider_id, SUM(used) as used'
nl|'\n'
comment|'# FROM allocations'
nl|'\n'
comment|'# WHERE resource_class_id = :DISK_GB'
nl|'\n'
comment|'# GROUP BY resource_provider_id'
nl|'\n'
comment|'# ) AS disk_usage'
nl|'\n'
comment|'# ON disk_inv.resource_provider_id = disk_usage.resource_provider_id'
nl|'\n'
comment|'# WHERE cn.deleted = 0;'
nl|'\n'
indent|' '
name|'if'
name|'filters'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'filters'
op|'='
op|'{'
op|'}'
newline|'\n'
nl|'\n'
dedent|''
name|'RAM_MB'
op|'='
name|'fields'
op|'.'
name|'ResourceClass'
op|'.'
name|'index'
op|'('
name|'fields'
op|'.'
name|'ResourceClass'
op|'.'
name|'MEMORY_MB'
op|')'
newline|'\n'
name|'VCPU'
op|'='
name|'fields'
op|'.'
name|'ResourceClass'
op|'.'
name|'index'
op|'('
name|'fields'
op|'.'
name|'ResourceClass'
op|'.'
name|'VCPU'
op|')'
newline|'\n'
name|'DISK_GB'
op|'='
name|'fields'
op|'.'
name|'ResourceClass'
op|'.'
name|'index'
op|'('
name|'fields'
op|'.'
name|'ResourceClass'
op|'.'
name|'DISK_GB'
op|')'
newline|'\n'
nl|'\n'
name|'cn_tbl'
op|'='
name|'sa'
op|'.'
name|'alias'
op|'('
name|'models'
op|'.'
name|'ComputeNode'
op|'.'
name|'__table__'
op|','
name|'name'
op|'='
string|"'cn'"
op|')'
newline|'\n'
name|'rp_tbl'
op|'='
name|'sa'
op|'.'
name|'alias'
op|'('
name|'models'
op|'.'
name|'ResourceProvider'
op|'.'
name|'__table__'
op|','
name|'name'
op|'='
string|"'rp'"
op|')'
newline|'\n'
name|'inv_tbl'
op|'='
name|'models'
op|'.'
name|'Inventory'
op|'.'
name|'__table__'
newline|'\n'
name|'alloc_tbl'
op|'='
name|'models'
op|'.'
name|'Allocation'
op|'.'
name|'__table__'
newline|'\n'
name|'ram_inv'
op|'='
name|'sa'
op|'.'
name|'alias'
op|'('
name|'inv_tbl'
op|','
name|'name'
op|'='
string|"'ram_inv'"
op|')'
newline|'\n'
name|'cpu_inv'
op|'='
name|'sa'
op|'.'
name|'alias'
op|'('
name|'inv_tbl'
op|','
name|'name'
op|'='
string|"'cpu_inv'"
op|')'
newline|'\n'
name|'disk_inv'
op|'='
name|'sa'
op|'.'
name|'alias'
op|'('
name|'inv_tbl'
op|','
name|'name'
op|'='
string|"'disk_inv'"
op|')'
newline|'\n'
nl|'\n'
name|'ram_usage'
op|'='
name|'sa'
op|'.'
name|'select'
op|'('
op|'['
name|'alloc_tbl'
op|'.'
name|'c'
op|'.'
name|'resource_provider_id'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'sum'
op|'('
name|'alloc_tbl'
op|'.'
name|'c'
op|'.'
name|'used'
op|')'
op|'.'
name|'label'
op|'('
string|"'used'"
op|')'
op|']'
op|')'
newline|'\n'
name|'ram_usage'
op|'='
name|'ram_usage'
op|'.'
name|'where'
op|'('
name|'alloc_tbl'
op|'.'
name|'c'
op|'.'
name|'resource_class_id'
op|'=='
name|'RAM_MB'
op|')'
newline|'\n'
name|'ram_usage'
op|'='
name|'ram_usage'
op|'.'
name|'group_by'
op|'('
name|'alloc_tbl'
op|'.'
name|'c'
op|'.'
name|'resource_provider_id'
op|')'
newline|'\n'
name|'ram_usage'
op|'='
name|'sa'
op|'.'
name|'alias'
op|'('
name|'ram_usage'
op|','
name|'name'
op|'='
string|"'ram_usage'"
op|')'
newline|'\n'
nl|'\n'
name|'cpu_usage'
op|'='
name|'sa'
op|'.'
name|'select'
op|'('
op|'['
name|'alloc_tbl'
op|'.'
name|'c'
op|'.'
name|'resource_provider_id'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'sum'
op|'('
name|'alloc_tbl'
op|'.'
name|'c'
op|'.'
name|'used'
op|')'
op|'.'
name|'label'
op|'('
string|"'used'"
op|')'
op|']'
op|')'
newline|'\n'
name|'cpu_usage'
op|'='
name|'cpu_usage'
op|'.'
name|'where'
op|'('
name|'alloc_tbl'
op|'.'
name|'c'
op|'.'
name|'resource_class_id'
op|'=='
name|'VCPU'
op|')'
newline|'\n'
name|'cpu_usage'
op|'='
name|'cpu_usage'
op|'.'
name|'group_by'
op|'('
name|'alloc_tbl'
op|'.'
name|'c'
op|'.'
name|'resource_provider_id'
op|')'
newline|'\n'
name|'cpu_usage'
op|'='
name|'sa'
op|'.'
name|'alias'
op|'('
name|'cpu_usage'
op|','
name|'name'
op|'='
string|"'cpu_usage'"
op|')'
newline|'\n'
nl|'\n'
name|'disk_usage'
op|'='
name|'sa'
op|'.'
name|'select'
op|'('
op|'['
name|'alloc_tbl'
op|'.'
name|'c'
op|'.'
name|'resource_provider_id'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'sum'
op|'('
name|'alloc_tbl'
op|'.'
name|'c'
op|'.'
name|'used'
op|')'
op|'.'
name|'label'
op|'('
string|"'used'"
op|')'
op|']'
op|')'
newline|'\n'
name|'disk_usage'
op|'='
name|'disk_usage'
op|'.'
name|'where'
op|'('
name|'alloc_tbl'
op|'.'
name|'c'
op|'.'
name|'resource_class_id'
op|'=='
name|'DISK_GB'
op|')'
newline|'\n'
name|'disk_usage'
op|'='
name|'disk_usage'
op|'.'
name|'group_by'
op|'('
name|'alloc_tbl'
op|'.'
name|'c'
op|'.'
name|'resource_provider_id'
op|')'
newline|'\n'
name|'disk_usage'
op|'='
name|'sa'
op|'.'
name|'alias'
op|'('
name|'disk_usage'
op|','
name|'name'
op|'='
string|"'disk_usage'"
op|')'
newline|'\n'
nl|'\n'
name|'cn_rp_join'
op|'='
name|'sql'
op|'.'
name|'outerjoin'
op|'('
nl|'\n'
name|'cn_tbl'
op|','
name|'rp_tbl'
op|','
nl|'\n'
name|'cn_tbl'
op|'.'
name|'c'
op|'.'
name|'uuid'
op|'=='
name|'rp_tbl'
op|'.'
name|'c'
op|'.'
name|'uuid'
op|')'
newline|'\n'
name|'ram_inv_join'
op|'='
name|'sql'
op|'.'
name|'outerjoin'
op|'('
nl|'\n'
name|'cn_rp_join'
op|','
name|'ram_inv'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'and_'
op|'('
name|'rp_tbl'
op|'.'
name|'c'
op|'.'
name|'id'
op|'=='
name|'ram_inv'
op|'.'
name|'c'
op|'.'
name|'resource_provider_id'
op|','
nl|'\n'
name|'ram_inv'
op|'.'
name|'c'
op|'.'
name|'resource_class_id'
op|'=='
name|'RAM_MB'
op|')'
op|')'
newline|'\n'
name|'ram_join'
op|'='
name|'sql'
op|'.'
name|'outerjoin'
op|'('
nl|'\n'
name|'ram_inv_join'
op|','
name|'ram_usage'
op|','
nl|'\n'
name|'ram_inv'
op|'.'
name|'c'
op|'.'
name|'resource_provider_id'
op|'=='
name|'ram_usage'
op|'.'
name|'c'
op|'.'
name|'resource_provider_id'
op|')'
newline|'\n'
name|'cpu_inv_join'
op|'='
name|'sql'
op|'.'
name|'outerjoin'
op|'('
nl|'\n'
name|'ram_join'
op|','
name|'cpu_inv'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'and_'
op|'('
name|'rp_tbl'
op|'.'
name|'c'
op|'.'
name|'id'
op|'=='
name|'cpu_inv'
op|'.'
name|'c'
op|'.'
name|'resource_provider_id'
op|','
nl|'\n'
name|'cpu_inv'
op|'.'
name|'c'
op|'.'
name|'resource_class_id'
op|'=='
name|'VCPU'
op|')'
op|')'
newline|'\n'
name|'cpu_join'
op|'='
name|'sql'
op|'.'
name|'outerjoin'
op|'('
nl|'\n'
name|'cpu_inv_join'
op|','
name|'cpu_usage'
op|','
nl|'\n'
name|'cpu_inv'
op|'.'
name|'c'
op|'.'
name|'resource_provider_id'
op|'=='
name|'cpu_usage'
op|'.'
name|'c'
op|'.'
name|'resource_provider_id'
op|')'
newline|'\n'
name|'disk_inv_join'
op|'='
name|'sql'
op|'.'
name|'outerjoin'
op|'('
nl|'\n'
name|'cpu_join'
op|','
name|'disk_inv'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'and_'
op|'('
name|'rp_tbl'
op|'.'
name|'c'
op|'.'
name|'id'
op|'=='
name|'disk_inv'
op|'.'
name|'c'
op|'.'
name|'resource_provider_id'
op|','
nl|'\n'
name|'disk_inv'
op|'.'
name|'c'
op|'.'
name|'resource_class_id'
op|'=='
name|'DISK_GB'
op|')'
op|')'
newline|'\n'
name|'disk_join'
op|'='
name|'sql'
op|'.'
name|'outerjoin'
op|'('
nl|'\n'
name|'disk_inv_join'
op|','
name|'disk_usage'
op|','
nl|'\n'
name|'disk_inv'
op|'.'
name|'c'
op|'.'
name|'resource_provider_id'
op|'=='
name|'disk_usage'
op|'.'
name|'c'
op|'.'
name|'resource_provider_id'
op|')'
newline|'\n'
comment|'# TODO(jaypipes): Remove all capacity and usage fields from this method'
nl|'\n'
comment|'# entirely and deal with allocations and inventory information in a'
nl|'\n'
comment|'# tabular fashion instead of a columnar fashion like the legacy'
nl|'\n'
comment|'# compute_nodes table schema does.'
nl|'\n'
name|'inv_cols'
op|'='
op|'['
nl|'\n'
name|'ram_inv'
op|'.'
name|'c'
op|'.'
name|'total'
op|'.'
name|'label'
op|'('
string|"'inv_memory_mb'"
op|')'
op|','
nl|'\n'
name|'ram_inv'
op|'.'
name|'c'
op|'.'
name|'reserved'
op|'.'
name|'label'
op|'('
string|"'inv_memory_mb_reserved'"
op|')'
op|','
nl|'\n'
name|'ram_inv'
op|'.'
name|'c'
op|'.'
name|'allocation_ratio'
op|'.'
name|'label'
op|'('
string|"'inv_ram_allocation_ratio'"
op|')'
op|','
nl|'\n'
name|'ram_usage'
op|'.'
name|'c'
op|'.'
name|'used'
op|'.'
name|'label'
op|'('
string|"'inv_memory_mb_used'"
op|')'
op|','
nl|'\n'
name|'cpu_inv'
op|'.'
name|'c'
op|'.'
name|'total'
op|'.'
name|'label'
op|'('
string|"'inv_vcpus'"
op|')'
op|','
nl|'\n'
name|'cpu_inv'
op|'.'
name|'c'
op|'.'
name|'allocation_ratio'
op|'.'
name|'label'
op|'('
string|"'inv_cpu_allocation_ratio'"
op|')'
op|','
nl|'\n'
name|'cpu_usage'
op|'.'
name|'c'
op|'.'
name|'used'
op|'.'
name|'label'
op|'('
string|"'inv_vcpus_used'"
op|')'
op|','
nl|'\n'
name|'disk_inv'
op|'.'
name|'c'
op|'.'
name|'total'
op|'.'
name|'label'
op|'('
string|"'inv_local_gb'"
op|')'
op|','
nl|'\n'
name|'disk_inv'
op|'.'
name|'c'
op|'.'
name|'reserved'
op|'.'
name|'label'
op|'('
string|"'inv_local_gb_reserved'"
op|')'
op|','
nl|'\n'
name|'disk_inv'
op|'.'
name|'c'
op|'.'
name|'allocation_ratio'
op|'.'
name|'label'
op|'('
string|"'inv_disk_allocation_ratio'"
op|')'
op|','
nl|'\n'
name|'disk_usage'
op|'.'
name|'c'
op|'.'
name|'used'
op|'.'
name|'label'
op|'('
string|"'inv_local_gb_used'"
op|')'
op|','
nl|'\n'
op|']'
newline|'\n'
name|'cols_in_output'
op|'='
name|'list'
op|'('
name|'cn_tbl'
op|'.'
name|'c'
op|')'
newline|'\n'
name|'cols_in_output'
op|'.'
name|'extend'
op|'('
name|'inv_cols'
op|')'
newline|'\n'
nl|'\n'
name|'select'
op|'='
name|'sa'
op|'.'
name|'select'
op|'('
name|'cols_in_output'
op|')'
op|'.'
name|'select_from'
op|'('
name|'disk_join'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'context'
op|'.'
name|'read_deleted'
op|'=='
string|'"no"'
op|':'
newline|'\n'
indent|' '
name|'select'
op|'='
name|'select'
op|'.'
name|'where'
op|'('
name|'cn_tbl'
op|'.'
name|'c'
op|'.'
name|'deleted'
op|'=='
number|'0'
op|')'
newline|'\n'
dedent|''
name|'if'
string|'"compute_id"'
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'select'
op|'='
name|'select'
op|'.'
name|'where'
op|'('
name|'cn_tbl'
op|'.'
name|'c'
op|'.'
name|'id'
op|'=='
name|'filters'
op|'['
string|'"compute_id"'
op|']'
op|')'
newline|'\n'
dedent|''
name|'if'
string|'"service_id"'
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'select'
op|'='
name|'select'
op|'.'
name|'where'
op|'('
name|'cn_tbl'
op|'.'
name|'c'
op|'.'
name|'service_id'
op|'=='
name|'filters'
op|'['
string|'"service_id"'
op|']'
op|')'
newline|'\n'
dedent|''
name|'if'
string|'"host"'
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'select'
op|'='
name|'select'
op|'.'
name|'where'
op|'('
name|'cn_tbl'
op|'.'
name|'c'
op|'.'
name|'host'
op|'=='
name|'filters'
op|'['
string|'"host"'
op|']'
op|')'
newline|'\n'
dedent|''
name|'if'
string|'"hypervisor_hostname"'
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'hyp_hostname'
op|'='
name|'filters'
op|'['
string|'"hypervisor_hostname"'
op|']'
newline|'\n'
name|'select'
op|'='
name|'select'
op|'.'
name|'where'
op|'('
name|'cn_tbl'
op|'.'
name|'c'
op|'.'
name|'hypervisor_hostname'
op|'=='
name|'hyp_hostname'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'select'
newline|'\n'
nl|'\n'
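The core trick in `_compute_node_select()` is joining each inventory row against a pre-aggregated `SUM(used) GROUP BY resource_provider_id` subquery with a LEFT OUTER JOIN, so providers without any allocations still appear (with NULL usage). One leg of that pattern, condensed into plain SQL over stdlib `sqlite3` (table contents are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.executescript('''
CREATE TABLE inventories (resource_provider_id INT, resource_class_id INT,
                          total INT);
CREATE TABLE allocations (resource_provider_id INT, resource_class_id INT,
                          used INT);
INSERT INTO inventories VALUES (1, 0, 2048), (2, 0, 4096);
INSERT INTO allocations VALUES (1, 0, 512), (1, 0, 256);
''')

RAM_MB = 0  # stand-in for fields.ResourceClass.index(MEMORY_MB)
rows = conn.execute('''
SELECT inv.resource_provider_id, inv.total, ram_usage.used
FROM inventories AS inv
LEFT OUTER JOIN (
    SELECT resource_provider_id, SUM(used) AS used
    FROM allocations
    WHERE resource_class_id = ?
    GROUP BY resource_provider_id
) AS ram_usage
  ON inv.resource_provider_id = ram_usage.resource_provider_id
WHERE inv.resource_class_id = ?
''', (RAM_MB, RAM_MB)).fetchall()
print(rows)  # provider 2 has no allocations, so its used column is None
```

The SQLAlchemy version builds the same shape three times (RAM, VCPU, disk) by chaining `sql.outerjoin()` calls, each joining the previous join tree to an aliased inventory table and its matching usage subquery.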
nl|'\n'
DECL|function|_compute_node_fetchall
dedent|''
name|'def'
name|'_compute_node_fetchall'
op|'('
name|'context'
op|','
name|'filters'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'select'
op|'='
name|'_compute_node_select'
op|'('
name|'context'
op|','
name|'filters'
op|')'
newline|'\n'
name|'engine'
op|'='
name|'get_engine'
op|'('
name|'context'
op|')'
newline|'\n'
name|'conn'
op|'='
name|'engine'
op|'.'
name|'connect'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'results'
op|'='
name|'conn'
op|'.'
name|'execute'
op|'('
name|'select'
op|')'
op|'.'
name|'fetchall'
op|'('
op|')'
newline|'\n'
nl|'\n'
comment|'# Callers expect dict-like objects, not SQLAlchemy RowProxy objects...'
nl|'\n'
name|'results'
op|'='
op|'['
name|'dict'
op|'('
name|'r'
op|')'
name|'for'
name|'r'
name|'in'
name|'results'
op|']'
newline|'\n'
name|'conn'
op|'.'
name|'close'
op|'('
op|')'
newline|'\n'
name|'return'
name|'results'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|compute_node_get
name|'def'
name|'compute_node_get'
op|'('
name|'context'
op|','
name|'compute_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'results'
op|'='
name|'_compute_node_fetchall'
op|'('
name|'context'
op|','
op|'{'
string|'"compute_id"'
op|':'
name|'compute_id'
op|'}'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'results'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ComputeHostNotFound'
op|'('
name|'host'
op|'='
name|'compute_id'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'results'
op|'['
number|'0'
op|']'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|compute_node_get_model
name|'def'
name|'compute_node_get_model'
op|'('
name|'context'
op|','
name|'compute_id'
op|')'
op|':'
newline|'\n'
comment|'# TODO(edleafe): remove once the compute node resource provider migration'
nl|'\n'
comment|'# is complete, and this distinction is no longer necessary.'
nl|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'ComputeNode'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'compute_id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ComputeHostNotFound'
op|'('
name|'host'
op|'='
name|'compute_id'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|compute_nodes_get_by_service_id
name|'def'
name|'compute_nodes_get_by_service_id'
op|'('
name|'context'
op|','
name|'service_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'results'
op|'='
name|'_compute_node_fetchall'
op|'('
name|'context'
op|','
op|'{'
string|'"service_id"'
op|':'
name|'service_id'
op|'}'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'results'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ServiceNotFound'
op|'('
name|'service_id'
op|'='
name|'service_id'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'results'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|compute_node_get_by_host_and_nodename
name|'def'
name|'compute_node_get_by_host_and_nodename'
op|'('
name|'context'
op|','
name|'host'
op|','
name|'nodename'
op|')'
op|':'
newline|'\n'
indent|' '
name|'results'
op|'='
name|'_compute_node_fetchall'
op|'('
name|'context'
op|','
nl|'\n'
op|'{'
string|'"host"'
op|':'
name|'host'
op|','
string|'"hypervisor_hostname"'
op|':'
name|'nodename'
op|'}'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'results'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ComputeHostNotFound'
op|'('
name|'host'
op|'='
name|'host'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'results'
op|'['
number|'0'
op|']'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader_allow_async'
newline|'\n'
DECL|function|compute_node_get_all_by_host
name|'def'
name|'compute_node_get_all_by_host'
op|'('
name|'context'
op|','
name|'host'
op|')'
op|':'
newline|'\n'
indent|' '
name|'results'
op|'='
name|'_compute_node_fetchall'
op|'('
name|'context'
op|','
op|'{'
string|'"host"'
op|':'
name|'host'
op|'}'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'results'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ComputeHostNotFound'
op|'('
name|'host'
op|'='
name|'host'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'results'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|compute_node_get_all
name|'def'
name|'compute_node_get_all'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_compute_node_fetchall'
op|'('
name|'context'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|compute_node_search_by_hypervisor
name|'def'
name|'compute_node_search_by_hypervisor'
op|'('
name|'context'
op|','
name|'hypervisor_match'
op|')'
op|':'
newline|'\n'
indent|' '
name|'field'
op|'='
name|'models'
op|'.'
name|'ComputeNode'
op|'.'
name|'hypervisor_hostname'
newline|'\n'
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'ComputeNode'
op|')'
op|'.'
name|'filter'
op|'('
name|'field'
op|'.'
name|'like'
op|'('
string|"'%%%s%%'"
op|'%'
name|'hypervisor_match'
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
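The `'%%%s%%' % hypervisor_match` expression in `compute_node_search_by_hypervisor` doubles the percent signs so that literal `%` LIKE wildcards survive old-style string formatting. A minimal standalone sketch (the helper name is hypothetical, outside SQLAlchemy):

```python
# Old-style %-formatting treats '%' as a conversion directive, so a
# literal percent sign must be written as '%%'.  The pattern below
# therefore wraps the match string in SQL LIKE substring wildcards.
def like_pattern(match):
    """Build a '%<match>%' substring pattern for a SQL LIKE clause."""
    return '%%%s%%' % match

print(like_pattern('kvm'))  # -> %kvm%
```

The resulting string is what `field.like(...)` receives, so a search for `kvm` matches any hypervisor hostname containing that substring.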
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|compute_node_create
name|'def'
name|'compute_node_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Creates a new ComputeNode and populates the capacity fields\n with the most recent data.\n """'
newline|'\n'
name|'convert_objects_related_datetimes'
op|'('
name|'values'
op|')'
newline|'\n'
nl|'\n'
name|'compute_node_ref'
op|'='
name|'models'
op|'.'
name|'ComputeNode'
op|'('
op|')'
newline|'\n'
name|'compute_node_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'compute_node_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
nl|'\n'
name|'return'
name|'compute_node_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|')'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|compute_node_update
name|'def'
name|'compute_node_update'
op|'('
name|'context'
op|','
name|'compute_id'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Updates the ComputeNode record with the most recent data."""'
newline|'\n'
nl|'\n'
name|'compute_ref'
op|'='
name|'compute_node_get_model'
op|'('
name|'context'
op|','
name|'compute_id'
op|')'
newline|'\n'
comment|"# Always update this, even if there's going to be no other"
nl|'\n'
comment|'# changes in data. This ensures that we invalidate the'
nl|'\n'
comment|'# scheduler cache of compute node data in case of races.'
nl|'\n'
name|'values'
op|'['
string|"'updated_at'"
op|']'
op|'='
name|'timeutils'
op|'.'
name|'utcnow'
op|'('
op|')'
newline|'\n'
name|'convert_objects_related_datetimes'
op|'('
name|'values'
op|')'
newline|'\n'
name|'compute_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
nl|'\n'
name|'return'
name|'compute_ref'
newline|'\n'
nl|'\n'
nl|'\n'
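`compute_node_update` unconditionally refreshes `updated_at` before merging the new values, so any cache keyed on that timestamp is invalidated even when no other field changed. A dict-based sketch of that idea, with hypothetical names, assuming a plain mapping instead of an ORM row:

```python
import datetime

def apply_update(record, values):
    """Merge values into record, always bumping the updated_at stamp."""
    # Bump the timestamp even for otherwise-empty updates so any
    # consumer caching on updated_at sees the record as changed.
    values = dict(values, updated_at=datetime.datetime.utcnow())
    record.update(values)
    return record

node = {'id': 1, 'vcpus': 8, 'updated_at': None}
apply_update(node, {})
print(node['updated_at'] is not None)  # -> True
```

The real function delegates the timestamp to `timeutils.utcnow()` and the merge to the SQLAlchemy model's `update`; the shape of the race-avoidance trick is the same.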
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|compute_node_delete
name|'def'
name|'compute_node_delete'
op|'('
name|'context'
op|','
name|'compute_id'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Delete a ComputeNode record."""'
newline|'\n'
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'ComputeNode'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'compute_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ComputeHostNotFound'
op|'('
name|'host'
op|'='
name|'compute_id'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|compute_node_statistics
name|'def'
name|'compute_node_statistics'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Compute statistics over all compute nodes."""'
newline|'\n'
name|'engine'
op|'='
name|'get_engine'
op|'('
name|'context'
op|')'
newline|'\n'
name|'services_tbl'
op|'='
name|'models'
op|'.'
name|'Service'
op|'.'
name|'__table__'
newline|'\n'
nl|'\n'
name|'inner_sel'
op|'='
name|'sa'
op|'.'
name|'alias'
op|'('
name|'_compute_node_select'
op|'('
name|'context'
op|')'
op|','
name|'name'
op|'='
string|"'inner_sel'"
op|')'
newline|'\n'
nl|'\n'
comment|'# TODO(sbauza): Remove the service_id filter in a later release'
nl|'\n'
comment|'# once we are sure that all compute nodes report the host field'
nl|'\n'
name|'j'
op|'='
name|'sa'
op|'.'
name|'join'
op|'('
nl|'\n'
name|'inner_sel'
op|','
name|'services_tbl'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'and_'
op|'('
nl|'\n'
name|'sql'
op|'.'
name|'or_'
op|'('
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'host'
op|'=='
name|'services_tbl'
op|'.'
name|'c'
op|'.'
name|'host'
op|','
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'service_id'
op|'=='
name|'services_tbl'
op|'.'
name|'c'
op|'.'
name|'id'
nl|'\n'
op|')'
op|','
nl|'\n'
name|'services_tbl'
op|'.'
name|'c'
op|'.'
name|'disabled'
op|'=='
name|'false'
op|'('
op|')'
op|','
nl|'\n'
name|'services_tbl'
op|'.'
name|'c'
op|'.'
name|'binary'
op|'=='
string|"'nova-compute'"
nl|'\n'
op|')'
nl|'\n'
op|')'
newline|'\n'
nl|'\n'
comment|'# NOTE(jaypipes): This COALESCE() stuff is temporary while the data'
nl|'\n'
comment|'# migration to the new resource providers inventories and allocations'
nl|'\n'
comment|'# tables is completed.'
nl|'\n'
name|'agg_cols'
op|'='
op|'['
nl|'\n'
name|'func'
op|'.'
name|'count'
op|'('
op|')'
op|'.'
name|'label'
op|'('
string|"'count'"
op|')'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'sum'
op|'('
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'coalesce'
op|'('
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'inv_vcpus'
op|','
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'vcpus'
nl|'\n'
op|')'
nl|'\n'
op|')'
op|'.'
name|'label'
op|'('
string|"'vcpus'"
op|')'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'sum'
op|'('
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'coalesce'
op|'('
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'inv_memory_mb'
op|','
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'memory_mb'
nl|'\n'
op|')'
nl|'\n'
op|')'
op|'.'
name|'label'
op|'('
string|"'memory_mb'"
op|')'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'sum'
op|'('
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'coalesce'
op|'('
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'inv_local_gb'
op|','
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'local_gb'
nl|'\n'
op|')'
nl|'\n'
op|')'
op|'.'
name|'label'
op|'('
string|"'local_gb'"
op|')'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'sum'
op|'('
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'coalesce'
op|'('
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'inv_vcpus_used'
op|','
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'vcpus_used'
nl|'\n'
op|')'
nl|'\n'
op|')'
op|'.'
name|'label'
op|'('
string|"'vcpus_used'"
op|')'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'sum'
op|'('
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'coalesce'
op|'('
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'inv_memory_mb_used'
op|','
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'memory_mb_used'
nl|'\n'
op|')'
nl|'\n'
op|')'
op|'.'
name|'label'
op|'('
string|"'memory_mb_used'"
op|')'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'sum'
op|'('
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'coalesce'
op|'('
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'inv_local_gb_used'
op|','
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'local_gb_used'
nl|'\n'
op|')'
nl|'\n'
op|')'
op|'.'
name|'label'
op|'('
string|"'local_gb_used'"
op|')'
op|','
nl|'\n'
comment|'# NOTE(jaypipes): This mess cannot be removed until the'
nl|'\n'
comment|'# resource-providers-allocations blueprint is completed and all of the'
nl|'\n'
comment|'# data migrations for BOTH inventory and allocations fields have been'
nl|'\n'
comment|'# completed.'
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'sum'
op|'('
nl|'\n'
comment|'# NOTE(jaypipes): free_ram_mb and free_disk_gb do NOT take'
nl|'\n'
comment|'# allocation ratios for those resources into account but they DO'
nl|'\n'
comment|'# take reserved memory and disk configuration option amounts into'
nl|'\n'
comment|'# account. Awesomesauce.'
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'coalesce'
op|'('
nl|'\n'
op|'('
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'inv_memory_mb'
op|'-'
op|'('
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'inv_memory_mb_used'
op|'+'
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'inv_memory_mb_reserved'
op|')'
nl|'\n'
op|')'
op|','
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'free_ram_mb'
nl|'\n'
op|')'
nl|'\n'
op|')'
op|'.'
name|'label'
op|'('
string|"'free_ram_mb'"
op|')'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'sum'
op|'('
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'coalesce'
op|'('
nl|'\n'
op|'('
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'inv_local_gb'
op|'-'
op|'('
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'inv_local_gb_used'
op|'+'
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'inv_local_gb_reserved'
op|')'
nl|'\n'
op|')'
op|','
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'free_disk_gb'
nl|'\n'
op|')'
nl|'\n'
op|')'
op|'.'
name|'label'
op|'('
string|"'free_disk_gb'"
op|')'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'sum'
op|'('
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'current_workload'
nl|'\n'
op|')'
op|'.'
name|'label'
op|'('
string|"'current_workload'"
op|')'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'sum'
op|'('
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'running_vms'
nl|'\n'
op|')'
op|'.'
name|'label'
op|'('
string|"'running_vms'"
op|')'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'func'
op|'.'
name|'sum'
op|'('
nl|'\n'
name|'inner_sel'
op|'.'
name|'c'
op|'.'
name|'disk_available_least'
nl|'\n'
op|')'
op|'.'
name|'label'
op|'('
string|"'disk_available_least'"
op|')'
op|','
nl|'\n'
op|']'
newline|'\n'
name|'select'
op|'='
name|'sql'
op|'.'
name|'select'
op|'('
name|'agg_cols'
op|')'
op|'.'
name|'select_from'
op|'('
name|'j'
op|')'
newline|'\n'
name|'conn'
op|'='
name|'engine'
op|'.'
name|'connect'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'results'
op|'='
name|'conn'
op|'.'
name|'execute'
op|'('
name|'select'
op|')'
op|'.'
name|'fetchone'
op|'('
op|')'
newline|'\n'
nl|'\n'
comment|'# Build a dict of the info--making no assumptions about result'
nl|'\n'
name|'fields'
op|'='
op|'('
string|"'count'"
op|','
string|"'vcpus'"
op|','
string|"'memory_mb'"
op|','
string|"'local_gb'"
op|','
string|"'vcpus_used'"
op|','
nl|'\n'
string|"'memory_mb_used'"
op|','
string|"'local_gb_used'"
op|','
string|"'free_ram_mb'"
op|','
string|"'free_disk_gb'"
op|','
nl|'\n'
string|"'current_workload'"
op|','
string|"'running_vms'"
op|','
string|"'disk_available_least'"
op|')'
newline|'\n'
name|'results'
op|'='
op|'{'
name|'field'
op|':'
name|'int'
op|'('
name|'results'
op|'['
name|'idx'
op|']'
name|'or'
number|'0'
op|')'
nl|'\n'
name|'for'
name|'idx'
op|','
name|'field'
name|'in'
name|'enumerate'
op|'('
name|'fields'
op|')'
op|'}'
newline|'\n'
name|'conn'
op|'.'
name|'close'
op|'('
op|')'
newline|'\n'
name|'return'
name|'results'
newline|'\n'
nl|'\n'
nl|'\n'
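The tail of `compute_node_statistics` zips a fixed field tuple against the single aggregate row, coercing SQL NULLs to zero with `int(value or 0)`. A standalone sketch of that dict construction, using a hypothetical three-column row where `None` stands in for a NULL sum:

```python
# Aggregates over an empty set come back as NULL (None), so each value
# is coerced with `int(value or 0)` before being keyed by field name.
fields = ('count', 'vcpus', 'memory_mb')
row = (2, 16, None)  # hypothetical aggregate row; None mimics SQL NULL

stats = {field: int(row[idx] or 0)
         for idx, field in enumerate(fields)}
print(stats)  # -> {'count': 2, 'vcpus': 16, 'memory_mb': 0}
```

Positional indexing keeps the code independent of the driver's row type, matching the "making no assumptions about result" comment in the original.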
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|certificate_create
name|'def'
name|'certificate_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'certificate_ref'
op|'='
name|'models'
op|'.'
name|'Certificate'
op|'('
op|')'
newline|'\n'
name|'for'
op|'('
name|'key'
op|','
name|'value'
op|')'
name|'in'
name|'values'
op|'.'
name|'items'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'certificate_ref'
op|'['
name|'key'
op|']'
op|'='
name|'value'
newline|'\n'
dedent|''
name|'certificate_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
name|'return'
name|'certificate_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|certificate_get_all_by_project
name|'def'
name|'certificate_get_all_by_project'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Certificate'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|certificate_get_all_by_user
name|'def'
name|'certificate_get_all_by_user'
op|'('
name|'context'
op|','
name|'user_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Certificate'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'user_id'
op|'='
name|'user_id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|certificate_get_all_by_user_and_project
name|'def'
name|'certificate_get_all_by_user_and_project'
op|'('
name|'context'
op|','
name|'user_id'
op|','
name|'project_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Certificate'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'user_id'
op|'='
name|'user_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|floating_ip_get
name|'def'
name|'floating_ip_get'
op|'('
name|'context'
op|','
name|'id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'try'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FloatingIp'
op|','
name|'project_only'
op|'='
name|'True'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'id'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload_all'
op|'('
string|"'fixed_ip.instance'"
op|')'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FloatingIpNotFound'
op|'('
name|'id'
op|'='
name|'id'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBError'
op|':'
newline|'\n'
indent|' '
name|'msg'
op|'='
name|'_LW'
op|'('
string|'"Invalid floating IP ID %s in request"'
op|')'
op|'%'
name|'id'
newline|'\n'
name|'LOG'
op|'.'
name|'warning'
op|'('
name|'msg'
op|')'
newline|'\n'
name|'raise'
name|'exception'
op|'.'
name|'InvalidID'
op|'('
name|'id'
op|'='
name|'id'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|floating_ip_get_pools
name|'def'
name|'floating_ip_get_pools'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'pools'
op|'='
op|'['
op|']'
newline|'\n'
name|'for'
name|'result'
name|'in'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FloatingIp'
op|','
nl|'\n'
op|'('
name|'models'
op|'.'
name|'FloatingIp'
op|'.'
name|'pool'
op|','
op|')'
op|')'
op|'.'
name|'distinct'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'pools'
op|'.'
name|'append'
op|'('
op|'{'
string|"'name'"
op|':'
name|'result'
op|'['
number|'0'
op|']'
op|'}'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'pools'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|','
nl|'\n'
DECL|variable|retry_on_request
name|'retry_on_request'
op|'='
name|'True'
op|')'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|floating_ip_allocate_address
name|'def'
name|'floating_ip_allocate_address'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'pool'
op|','
nl|'\n'
name|'auto_assigned'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
name|'nova'
op|'.'
name|'context'
op|'.'
name|'authorize_project_context'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
newline|'\n'
name|'floating_ip_ref'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FloatingIp'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'fixed_ip_id'
op|'='
name|'None'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'None'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'pool'
op|'='
name|'pool'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'floating_ip_ref'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'NoMoreFloatingIps'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'params'
op|'='
op|'{'
string|"'project_id'"
op|':'
name|'project_id'
op|','
string|"'auto_assigned'"
op|':'
name|'auto_assigned'
op|'}'
newline|'\n'
nl|'\n'
name|'rows_update'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FloatingIp'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'floating_ip_ref'
op|'['
string|"'id'"
op|']'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'fixed_ip_id'
op|'='
name|'None'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'None'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'pool'
op|'='
name|'pool'
op|')'
op|'.'
name|'update'
op|'('
name|'params'
op|','
name|'synchronize_session'
op|'='
string|"'evaluate'"
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'rows_update'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'debug'
op|'('
string|"'The row was updated in a concurrent transaction, '"
nl|'\n'
string|"'we will fetch another one'"
op|')'
newline|'\n'
name|'raise'
name|'db_exc'
op|'.'
name|'RetryRequest'
op|'('
name|'exception'
op|'.'
name|'FloatingIpAllocateFailed'
op|'('
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'floating_ip_ref'
op|'['
string|"'address'"
op|']'
newline|'\n'
nl|'\n'
nl|'\n'
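`floating_ip_allocate_address` first selects a candidate free IP, then re-applies the same filters in the UPDATE so a concurrent allocator that claimed the row first causes zero rows to match, and a `RetryRequest` restarts the whole function. A minimal in-memory sketch of this compare-and-swap pattern (all names hypothetical; in a single-threaded list the recheck cannot actually lose a race, unlike the real SQL version):

```python
class RetryRequest(Exception):
    """Signals the caller to retry the whole allocation."""

def allocate(pool, project_id):
    """Pick a free IP, then claim it only if it is still free (CAS)."""
    candidate = next((ip for ip in pool if ip['project_id'] is None), None)
    if candidate is None:
        raise LookupError('no more floating IPs')
    # Re-check the precondition at claim time; in the DB version a
    # concurrent transaction may have taken the row between the
    # SELECT and the UPDATE, making the UPDATE match zero rows.
    if candidate['project_id'] is not None:
        raise RetryRequest()
    candidate['project_id'] = project_id
    return candidate['address']

pool = [{'address': '172.16.0.1', 'project_id': None}]
print(allocate(pool, 'demo'))  # -> 172.16.0.1
```

The `retry_on_request=True` argument to `wrap_db_retry` in the original is what turns the raised `RetryRequest` into another pass through the SELECT/UPDATE pair.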
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|floating_ip_bulk_create
name|'def'
name|'floating_ip_bulk_create'
op|'('
name|'context'
op|','
name|'ips'
op|','
name|'want_result'
op|'='
name|'True'
op|')'
op|':'
newline|'\n'
indent|' '
name|'try'
op|':'
newline|'\n'
indent|' '
name|'tab'
op|'='
name|'models'
op|'.'
name|'FloatingIp'
op|'('
op|')'
op|'.'
name|'__table__'
newline|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'execute'
op|'('
name|'tab'
op|'.'
name|'insert'
op|'('
op|')'
op|','
name|'ips'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
name|'as'
name|'e'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FloatingIpExists'
op|'('
name|'address'
op|'='
name|'e'
op|'.'
name|'value'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'want_result'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FloatingIp'
op|')'
op|'.'
name|'filter'
op|'('
nl|'\n'
name|'models'
op|'.'
name|'FloatingIp'
op|'.'
name|'address'
op|'.'
name|'in_'
op|'('
nl|'\n'
op|'['
name|'ip'
op|'['
string|"'address'"
op|']'
name|'for'
name|'ip'
name|'in'
name|'ips'
op|']'
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
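`floating_ip_bulk_create` hands the whole batch to one multi-row INSERT and converts a `DBDuplicateEntry` into `FloatingIpExists`. A toy list-backed sketch of that contract, with hypothetical names and a `ValueError` standing in for the duplicate exception:

```python
def bulk_create(store, ips):
    """Insert addresses as one batch, rejecting it on any duplicate."""
    addresses = [ip['address'] for ip in ips]
    dupes = set(addresses) & set(store)
    if dupes:
        # The real code raises exception.FloatingIpExists with the
        # offending address pulled from the DBDuplicateEntry error.
        raise ValueError('FloatingIpExists: %s' % sorted(dupes)[0])
    store.extend(addresses)
    return addresses

store = ['192.0.2.1']
print(bulk_create(store, [{'address': '192.0.2.2'}]))  # -> ['192.0.2.2']
```

The optional `want_result` re-query in the original exists because a bulk `execute(tab.insert(), ips)` returns no ORM objects, so the rows have to be fetched back by address when the caller needs them.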
DECL|function|_ip_range_splitter
dedent|''
dedent|''
name|'def'
name|'_ip_range_splitter'
op|'('
name|'ips'
op|','
name|'block_size'
op|'='
number|'256'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Yields blocks of IPs no more than block_size elements long."""'
newline|'\n'
name|'out'
op|'='
op|'['
op|']'
newline|'\n'
name|'count'
op|'='
number|'0'
newline|'\n'
name|'for'
name|'ip'
name|'in'
name|'ips'
op|':'
newline|'\n'
indent|' '
name|'out'
op|'.'
name|'append'
op|'('
name|'ip'
op|'['
string|"'address'"
op|']'
op|')'
newline|'\n'
name|'count'
op|'+='
number|'1'
newline|'\n'
nl|'\n'
name|'if'
name|'count'
op|'>'
name|'block_size'
op|'-'
number|'1'
op|':'
newline|'\n'
indent|' '
name|'yield'
name|'out'
newline|'\n'
name|'out'
op|'='
op|'['
op|']'
newline|'\n'
name|'count'
op|'='
number|'0'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'if'
name|'out'
op|':'
newline|'\n'
indent|' '
name|'yield'
name|'out'
newline|'\n'
nl|'\n'
nl|'\n'
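`_ip_range_splitter` chunks the address list so each soft-delete UPDATE's `IN (...)` clause stays bounded. A self-contained copy of the generator shows the chunking behaviour on a small block size:

```python
def ip_range_splitter(ips, block_size=256):
    """Yield lists of addresses no more than block_size elements long."""
    out = []
    count = 0
    for ip in ips:
        out.append(ip['address'])
        count += 1
        if count > block_size - 1:
            yield out
            out = []
            count = 0
    # Flush the final partial block, if any.
    if out:
        yield out

ips = [{'address': '10.0.0.%d' % i} for i in range(5)]
print(list(ip_range_splitter(ips, block_size=2)))
# -> [['10.0.0.0', '10.0.0.1'], ['10.0.0.2', '10.0.0.3'], ['10.0.0.4']]
```

`floating_ip_bulk_destroy` consumes these blocks one at a time, so quota accounting and the soft delete both operate on bounded batches.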
dedent|''
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|floating_ip_bulk_destroy
name|'def'
name|'floating_ip_bulk_destroy'
op|'('
name|'context'
op|','
name|'ips'
op|')'
op|':'
newline|'\n'
indent|' '
name|'project_id_to_quota_count'
op|'='
name|'collections'
op|'.'
name|'defaultdict'
op|'('
name|'int'
op|')'
newline|'\n'
name|'for'
name|'ip_block'
name|'in'
name|'_ip_range_splitter'
op|'('
name|'ips'
op|')'
op|':'
newline|'\n'
comment|'# Find any floating IPs that were not auto_assigned and'
nl|'\n'
comment|'# thus need quota released.'
nl|'\n'
indent|' '
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FloatingIp'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'FloatingIp'
op|'.'
name|'address'
op|'.'
name|'in_'
op|'('
name|'ip_block'
op|')'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'auto_assigned'
op|'='
name|'False'
op|')'
newline|'\n'
name|'for'
name|'row'
name|'in'
name|'query'
op|'.'
name|'all'
op|'('
op|')'
op|':'
newline|'\n'
comment|'# The count is negative since we release quota by'
nl|'\n'
comment|'# reserving negative quota.'
nl|'\n'
indent|' '
name|'project_id_to_quota_count'
op|'['
name|'row'
op|'['
string|"'project_id'"
op|']'
op|']'
op|'-='
number|'1'
newline|'\n'
comment|'# Delete the floating IPs.'
nl|'\n'
dedent|''
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FloatingIp'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'FloatingIp'
op|'.'
name|'address'
op|'.'
name|'in_'
op|'('
name|'ip_block'
op|')'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
string|"'fetch'"
op|')'
newline|'\n'
nl|'\n'
comment|'# Delete the quotas, if needed.'
nl|'\n'
comment|'# Quota update happens in a separate transaction, so previous must have'
nl|'\n'
comment|'# been committed first.'
nl|'\n'
dedent|''
name|'for'
name|'project_id'
op|','
name|'count'
name|'in'
name|'project_id_to_quota_count'
op|'.'
name|'items'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'try'
op|':'
newline|'\n'
indent|' '
name|'reservations'
op|'='
name|'quota'
op|'.'
name|'QUOTAS'
op|'.'
name|'reserve'
op|'('
name|'context'
op|','
nl|'\n'
name|'project_id'
op|'='
name|'project_id'
op|','
nl|'\n'
name|'floating_ips'
op|'='
name|'count'
op|')'
newline|'\n'
name|'quota'
op|'.'
name|'QUOTAS'
op|'.'
name|'commit'
op|'('
name|'context'
op|','
name|'reservations'
op|','
name|'project_id'
op|'='
name|'project_id'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'Exception'
op|':'
newline|'\n'
indent|' '
name|'with'
name|'excutils'
op|'.'
name|'save_and_reraise_exception'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'exception'
op|'('
name|'_LE'
op|'('
string|'"Failed to update usages bulk "'
nl|'\n'
string|'"deallocating floating IP"'
op|')'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
dedent|''
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|floating_ip_create
name|'def'
name|'floating_ip_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'floating_ip_ref'
op|'='
name|'models'
op|'.'
name|'FloatingIp'
op|'('
op|')'
newline|'\n'
name|'floating_ip_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'floating_ip_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FloatingIpExists'
op|'('
name|'address'
op|'='
name|'values'
op|'['
string|"'address'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'floating_ip_ref'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_floating_ip_count_by_project
dedent|''
name|'def'
name|'_floating_ip_count_by_project'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'nova'
op|'.'
name|'context'
op|'.'
name|'authorize_project_context'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
newline|'\n'
comment|'# TODO(tr3buchet): why leave auto_assigned floating IPs out?'
nl|'\n'
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FloatingIp'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'auto_assigned'
op|'='
name|'False'
op|')'
op|'.'
name|'count'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|')'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|floating_ip_fixed_ip_associate
name|'def'
name|'floating_ip_fixed_ip_associate'
op|'('
name|'context'
op|','
name|'floating_address'
op|','
nl|'\n'
name|'fixed_address'
op|','
name|'host'
op|')'
op|':'
newline|'\n'
indent|' '
name|'fixed_ip_ref'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FixedIp'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'address'
op|'='
name|'fixed_address'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'network'"
op|')'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'fixed_ip_ref'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FixedIpNotFoundForAddress'
op|'('
name|'address'
op|'='
name|'fixed_address'
op|')'
newline|'\n'
dedent|''
name|'rows'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FloatingIp'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'address'
op|'='
name|'floating_address'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'FloatingIp'
op|'.'
name|'project_id'
op|'=='
nl|'\n'
name|'context'
op|'.'
name|'project_id'
op|')'
op|'.'
name|'filter'
op|'('
name|'or_'
op|'('
name|'models'
op|'.'
name|'FloatingIp'
op|'.'
name|'fixed_ip_id'
op|'=='
nl|'\n'
name|'fixed_ip_ref'
op|'['
string|"'id'"
op|']'
op|','
nl|'\n'
name|'models'
op|'.'
name|'FloatingIp'
op|'.'
name|'fixed_ip_id'
op|'.'
name|'is_'
op|'('
name|'None'
op|')'
op|')'
op|')'
op|'.'
name|'update'
op|'('
op|'{'
string|"'fixed_ip_id'"
op|':'
name|'fixed_ip_ref'
op|'['
string|"'id'"
op|']'
op|','
string|"'host'"
op|':'
name|'host'
op|'}'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'rows'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FloatingIpAssociateFailed'
op|'('
name|'address'
op|'='
name|'floating_address'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'fixed_ip_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|')'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|floating_ip_deallocate
name|'def'
name|'floating_ip_deallocate'
op|'('
name|'context'
op|','
name|'address'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FloatingIp'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'address'
op|'='
name|'address'
op|')'
op|'.'
name|'filter'
op|'('
name|'and_'
op|'('
name|'models'
op|'.'
name|'FloatingIp'
op|'.'
name|'project_id'
op|'!='
name|'null'
op|'('
op|')'
op|')'
op|','
nl|'\n'
name|'models'
op|'.'
name|'FloatingIp'
op|'.'
name|'fixed_ip_id'
op|'=='
name|'null'
op|'('
op|')'
op|')'
op|'.'
name|'update'
op|'('
op|'{'
string|"'project_id'"
op|':'
name|'None'
op|','
nl|'\n'
string|"'host'"
op|':'
name|'None'
op|','
nl|'\n'
string|"'auto_assigned'"
op|':'
name|'False'
op|'}'
op|','
nl|'\n'
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|floating_ip_destroy
name|'def'
name|'floating_ip_destroy'
op|'('
name|'context'
op|','
name|'address'
op|')'
op|':'
newline|'\n'
indent|' '
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FloatingIp'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'address'
op|'='
name|'address'
op|')'
op|'.'
name|'delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|floating_ip_disassociate
name|'def'
name|'floating_ip_disassociate'
op|'('
name|'context'
op|','
name|'address'
op|')'
op|':'
newline|'\n'
indent|' '
name|'floating_ip_ref'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'FloatingIp'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'address'
op|'='
name|'address'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'floating_ip_ref'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FloatingIpNotFoundForAddress'
op|'('
name|'address'
op|'='
name|'address'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'fixed_ip_ref'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FixedIp'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'floating_ip_ref'
op|'['
string|"'fixed_ip_id'"
op|']'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'network'"
op|')'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'floating_ip_ref'
op|'.'
name|'fixed_ip_id'
op|'='
name|'None'
newline|'\n'
name|'floating_ip_ref'
op|'.'
name|'host'
op|'='
name|'None'
newline|'\n'
nl|'\n'
name|'return'
name|'fixed_ip_ref'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_floating_ip_get_all
dedent|''
name|'def'
name|'_floating_ip_get_all'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FloatingIp'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|floating_ip_get_all
name|'def'
name|'floating_ip_get_all'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'floating_ip_refs'
op|'='
name|'_floating_ip_get_all'
op|'('
name|'context'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'fixed_ip'"
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'floating_ip_refs'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'NoFloatingIpsDefined'
op|'('
op|')'
newline|'\n'
dedent|''
name|'return'
name|'floating_ip_refs'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|floating_ip_get_all_by_host
name|'def'
name|'floating_ip_get_all_by_host'
op|'('
name|'context'
op|','
name|'host'
op|')'
op|':'
newline|'\n'
indent|' '
name|'floating_ip_refs'
op|'='
name|'_floating_ip_get_all'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'host'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'fixed_ip'"
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'floating_ip_refs'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FloatingIpNotFoundForHost'
op|'('
name|'host'
op|'='
name|'host'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'floating_ip_refs'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|floating_ip_get_all_by_project
name|'def'
name|'floating_ip_get_all_by_project'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'nova'
op|'.'
name|'context'
op|'.'
name|'authorize_project_context'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
newline|'\n'
comment|'# TODO(tr3buchet): why do we not want auto_assigned floating IPs here?'
nl|'\n'
name|'return'
name|'_floating_ip_get_all'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'auto_assigned'
op|'='
name|'False'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload_all'
op|'('
string|"'fixed_ip.instance'"
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
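`floating_ip_get_all_by_project` chains two `filter_by` calls: one on `project_id` and one excluding `auto_assigned` rows (the TODO comment in the source questions that exclusion). A small sqlite3 sketch of the equivalent two-predicate query, with made-up data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE floating_ips (
    address TEXT, project_id TEXT, auto_assigned INTEGER);
INSERT INTO floating_ips VALUES
    ('172.16.0.1', 'p1', 0),
    ('172.16.0.2', 'p1', 1),
    ('172.16.0.3', 'p2', 0);
""")

def floating_ip_get_all_by_project(conn, project_id):
    # Mirrors the chained filter_by(project_id=...) and
    # filter_by(auto_assigned=False) of the original query.
    rows = conn.execute(
        "SELECT address FROM floating_ips "
        "WHERE project_id = ? AND auto_assigned = 0",
        (project_id,)).fetchall()
    return [r[0] for r in rows]

addrs = floating_ip_get_all_by_project(conn, 'p1')
```

The second row ('172.16.0.2') belongs to `p1` but is auto-assigned, so it is filtered out, just as the SQLAlchemy version drops auto-assigned floating IPs.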
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|floating_ip_get_by_address
name|'def'
name|'floating_ip_get_by_address'
op|'('
name|'context'
op|','
name|'address'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_floating_ip_get_by_address'
op|'('
name|'context'
op|','
name|'address'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_floating_ip_get_by_address
dedent|''
name|'def'
name|'_floating_ip_get_by_address'
op|'('
name|'context'
op|','
name|'address'
op|')'
op|':'
newline|'\n'
nl|'\n'
comment|'# if address string is empty explicitly set it to None'
nl|'\n'
indent|' '
name|'if'
name|'not'
name|'address'
op|':'
newline|'\n'
indent|' '
name|'address'
op|'='
name|'None'
newline|'\n'
dedent|''
name|'try'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FloatingIp'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'address'
op|'='
name|'address'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload_all'
op|'('
string|"'fixed_ip.instance'"
op|')'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FloatingIpNotFoundForAddress'
op|'('
name|'address'
op|'='
name|'address'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBError'
op|':'
newline|'\n'
indent|' '
name|'msg'
op|'='
name|'_'
op|'('
string|'"Invalid floating IP %s in request"'
op|')'
op|'%'
name|'address'
newline|'\n'
name|'LOG'
op|'.'
name|'warning'
op|'('
name|'msg'
op|')'
newline|'\n'
name|'raise'
name|'exception'
op|'.'
name|'InvalidIpAddressError'
op|'('
name|'msg'
op|')'
newline|'\n'
nl|'\n'
comment|'# If the floating IP has a project ID set, check to make sure'
nl|'\n'
comment|'# the non-admin user has access.'
nl|'\n'
dedent|''
name|'if'
name|'result'
op|'.'
name|'project_id'
name|'and'
name|'nova'
op|'.'
name|'context'
op|'.'
name|'is_user_context'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'nova'
op|'.'
name|'context'
op|'.'
name|'authorize_project_context'
op|'('
name|'context'
op|','
name|'result'
op|'.'
name|'project_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|floating_ip_get_by_fixed_address
name|'def'
name|'floating_ip_get_by_fixed_address'
op|'('
name|'context'
op|','
name|'fixed_address'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FloatingIp'
op|')'
op|'.'
name|'outerjoin'
op|'('
name|'models'
op|'.'
name|'FixedIp'
op|','
nl|'\n'
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'id'
op|'=='
nl|'\n'
name|'models'
op|'.'
name|'FloatingIp'
op|'.'
name|'fixed_ip_id'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'address'
op|'=='
name|'fixed_address'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|floating_ip_get_by_fixed_ip_id
name|'def'
name|'floating_ip_get_by_fixed_ip_id'
op|'('
name|'context'
op|','
name|'fixed_ip_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FloatingIp'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'fixed_ip_id'
op|'='
name|'fixed_ip_id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|floating_ip_update
name|'def'
name|'floating_ip_update'
op|'('
name|'context'
op|','
name|'address'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'float_ip_ref'
op|'='
name|'_floating_ip_get_by_address'
op|'('
name|'context'
op|','
name|'address'
op|')'
newline|'\n'
name|'float_ip_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'float_ip_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FloatingIpExists'
op|'('
name|'address'
op|'='
name|'values'
op|'['
string|"'address'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'float_ip_ref'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|dnsdomain_get
name|'def'
name|'dnsdomain_get'
op|'('
name|'context'
op|','
name|'fqdomain'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'DNSDomain'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'domain'
op|'='
name|'fqdomain'
op|')'
op|'.'
name|'with_lockmode'
op|'('
string|"'update'"
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_dnsdomain_get_or_create
dedent|''
name|'def'
name|'_dnsdomain_get_or_create'
op|'('
name|'context'
op|','
name|'fqdomain'
op|')'
op|':'
newline|'\n'
indent|' '
name|'domain_ref'
op|'='
name|'dnsdomain_get'
op|'('
name|'context'
op|','
name|'fqdomain'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'domain_ref'
op|':'
newline|'\n'
indent|' '
name|'dns_ref'
op|'='
name|'models'
op|'.'
name|'DNSDomain'
op|'('
op|')'
newline|'\n'
name|'dns_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|"'domain'"
op|':'
name|'fqdomain'
op|','
nl|'\n'
string|"'availability_zone'"
op|':'
name|'None'
op|','
nl|'\n'
string|"'project_id'"
op|':'
name|'None'
op|'}'
op|')'
newline|'\n'
name|'return'
name|'dns_ref'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'domain_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|dnsdomain_register_for_zone
name|'def'
name|'dnsdomain_register_for_zone'
op|'('
name|'context'
op|','
name|'fqdomain'
op|','
name|'zone'
op|')'
op|':'
newline|'\n'
indent|' '
name|'domain_ref'
op|'='
name|'_dnsdomain_get_or_create'
op|'('
name|'context'
op|','
name|'fqdomain'
op|')'
newline|'\n'
name|'domain_ref'
op|'.'
name|'scope'
op|'='
string|"'private'"
newline|'\n'
name|'domain_ref'
op|'.'
name|'availability_zone'
op|'='
name|'zone'
newline|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'add'
op|'('
name|'domain_ref'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
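`_dnsdomain_get_or_create` and `dnsdomain_register_for_zone` together implement a get-or-create pattern: fetch the domain row, build a fresh record with default fields if it is missing, then stamp the scope and zone and add it to the session. A dict-based sketch of the same control flow (the dict stands in for the DNSDomain table and `session.add`):

```python
# In-memory stand-in for the DNSDomain table.
domains = {}

def dnsdomain_get(fqdomain):
    return domains.get(fqdomain)

def _dnsdomain_get_or_create(fqdomain):
    # Return the existing row, or build a fresh unsaved record with
    # the same default fields the original sets.
    ref = dnsdomain_get(fqdomain)
    if ref is None:
        ref = {'domain': fqdomain,
               'availability_zone': None,
               'project_id': None}
    return ref

def dnsdomain_register_for_zone(fqdomain, zone):
    ref = _dnsdomain_get_or_create(fqdomain)
    ref['scope'] = 'private'
    ref['availability_zone'] = zone
    domains[fqdomain] = ref  # stands in for context.session.add()
    return ref

ref = dnsdomain_register_for_zone('example.org.', 'nova')
```

Registering the same domain again simply mutates the existing record, which is the point of the get-or-create helper: the caller never needs to know whether the row existed beforehand.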
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|dnsdomain_register_for_project
name|'def'
name|'dnsdomain_register_for_project'
op|'('
name|'context'
op|','
name|'fqdomain'
op|','
name|'project'
op|')'
op|':'
newline|'\n'
indent|' '
name|'domain_ref'
op|'='
name|'_dnsdomain_get_or_create'
op|'('
name|'context'
op|','
name|'fqdomain'
op|')'
newline|'\n'
name|'domain_ref'
op|'.'
name|'scope'
op|'='
string|"'public'"
newline|'\n'
name|'domain_ref'
op|'.'
name|'project_id'
op|'='
name|'project'
newline|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'add'
op|'('
name|'domain_ref'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|dnsdomain_unregister
name|'def'
name|'dnsdomain_unregister'
op|'('
name|'context'
op|','
name|'fqdomain'
op|')'
op|':'
newline|'\n'
indent|' '
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'DNSDomain'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'domain'
op|'='
name|'fqdomain'
op|')'
op|'.'
name|'delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|dnsdomain_get_all
name|'def'
name|'dnsdomain_get_all'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'DNSDomain'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|','
nl|'\n'
name|'retry_on_request'
op|'='
name|'True'
op|')'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|fixed_ip_associate
name|'def'
name|'fixed_ip_associate'
op|'('
name|'context'
op|','
name|'address'
op|','
name|'instance_uuid'
op|','
name|'network_id'
op|'='
name|'None'
op|','
nl|'\n'
name|'reserved'
op|'='
name|'False'
op|','
name|'virtual_interface_id'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Keyword arguments:\n reserved -- should be a boolean value (True or False); the exact value\n will be used to filter on the fixed IP address\n """'
newline|'\n'
name|'if'
name|'not'
name|'uuidutils'
op|'.'
name|'is_uuid_like'
op|'('
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InvalidUUID'
op|'('
name|'uuid'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'network_or_none'
op|'='
name|'or_'
op|'('
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'network_id'
op|'=='
name|'network_id'
op|','
nl|'\n'
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'network_id'
op|'=='
name|'null'
op|'('
op|')'
op|')'
newline|'\n'
name|'fixed_ip_ref'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FixedIp'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter'
op|'('
name|'network_or_none'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'reserved'
op|'='
name|'reserved'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'address'
op|'='
name|'address'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'fixed_ip_ref'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FixedIpNotFoundForNetwork'
op|'('
name|'address'
op|'='
name|'address'
op|','
nl|'\n'
name|'network_uuid'
op|'='
name|'network_id'
op|')'
newline|'\n'
dedent|''
name|'if'
name|'fixed_ip_ref'
op|'.'
name|'instance_uuid'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FixedIpAlreadyInUse'
op|'('
name|'address'
op|'='
name|'address'
op|','
nl|'\n'
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'params'
op|'='
op|'{'
string|"'instance_uuid'"
op|':'
name|'instance_uuid'
op|','
nl|'\n'
string|"'allocated'"
op|':'
name|'virtual_interface_id'
name|'is'
name|'not'
name|'None'
op|'}'
newline|'\n'
name|'if'
name|'not'
name|'fixed_ip_ref'
op|'.'
name|'network_id'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'['
string|"'network_id'"
op|']'
op|'='
name|'network_id'
newline|'\n'
dedent|''
name|'if'
name|'virtual_interface_id'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'['
string|"'virtual_interface_id'"
op|']'
op|'='
name|'virtual_interface_id'
newline|'\n'
nl|'\n'
dedent|''
name|'rows_updated'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FixedIp'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'fixed_ip_ref'
op|'.'
name|'id'
op|')'
op|'.'
name|'filter'
op|'('
name|'network_or_none'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'reserved'
op|'='
name|'reserved'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'address'
op|'='
name|'address'
op|')'
op|'.'
name|'update'
op|'('
name|'params'
op|','
name|'synchronize_session'
op|'='
string|"'evaluate'"
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'rows_updated'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'debug'
op|'('
string|"'The row was updated in a concurrent transaction, '"
nl|'\n'
string|"'we will fetch another row'"
op|')'
newline|'\n'
name|'raise'
name|'db_exc'
op|'.'
name|'RetryRequest'
op|'('
nl|'\n'
name|'exception'
op|'.'
name|'FixedIpAssociateFailed'
op|'('
name|'net'
op|'='
name|'network_id'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'fixed_ip_ref'
newline|'\n'
nl|'\n'
nl|'\n'
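`fixed_ip_associate` avoids races without row locks by doing a compare-and-swap: it SELECTs a candidate row, then repeats the same predicates in the UPDATE and checks the affected row count. Zero rows updated means a concurrent transaction claimed the row first, so it raises `RetryRequest` and the `wrap_db_retry` decorator re-runs the whole function. A sqlite3 sketch of the rowcount check (schema and exception class are illustrative stand-ins):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE fixed_ips (
    id INTEGER PRIMARY KEY, address TEXT,
    instance_uuid TEXT, reserved INTEGER);
INSERT INTO fixed_ips VALUES (1, '10.0.0.5', NULL, 0);
""")

class RetryRequest(Exception):
    """Stand-in for oslo.db's RetryRequest signal."""

def fixed_ip_associate(conn, address, instance_uuid):
    row = conn.execute(
        "SELECT id FROM fixed_ips WHERE address = ? "
        "AND reserved = 0 AND instance_uuid IS NULL",
        (address,)).fetchone()
    if row is None:
        raise LookupError(address)
    # Compare-and-swap: repeat the original predicates in the UPDATE so
    # that a concurrent writer who grabbed the row first leaves rowcount 0.
    cur = conn.execute(
        "UPDATE fixed_ips SET instance_uuid = ? "
        "WHERE id = ? AND reserved = 0 AND instance_uuid IS NULL",
        (instance_uuid, row[0]))
    if cur.rowcount == 0:
        # In the original, wrap_db_retry catches this and retries.
        raise RetryRequest()
    return row[0]

ip_id = fixed_ip_associate(conn, '10.0.0.5', 'uuid-1')
```

The same pattern appears in `fixed_ip_associate_pool` below: optimistic concurrency via rowcount, with the retry decorator supplying the loop.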
dedent|''
op|'@'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|','
nl|'\n'
DECL|variable|retry_on_request
name|'retry_on_request'
op|'='
name|'True'
op|')'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|fixed_ip_associate_pool
name|'def'
name|'fixed_ip_associate_pool'
op|'('
name|'context'
op|','
name|'network_id'
op|','
name|'instance_uuid'
op|'='
name|'None'
op|','
nl|'\n'
name|'host'
op|'='
name|'None'
op|','
name|'virtual_interface_id'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""allocate a fixed ip out of a fixed ip network pool.\n\n This allocates an unallocated fixed ip out of a specified\n network. We sort by updated_at to hand out the oldest address in\n the list.\n\n """'
newline|'\n'
name|'if'
name|'instance_uuid'
name|'and'
name|'not'
name|'uuidutils'
op|'.'
name|'is_uuid_like'
op|'('
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InvalidUUID'
op|'('
name|'uuid'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'network_or_none'
op|'='
name|'or_'
op|'('
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'network_id'
op|'=='
name|'network_id'
op|','
nl|'\n'
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'network_id'
op|'=='
name|'null'
op|'('
op|')'
op|')'
newline|'\n'
name|'fixed_ip_ref'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FixedIp'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter'
op|'('
name|'network_or_none'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'reserved'
op|'='
name|'False'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'None'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'None'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'leased'
op|'='
name|'False'
op|')'
op|'.'
name|'order_by'
op|'('
name|'asc'
op|'('
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'updated_at'
op|')'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'fixed_ip_ref'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'NoMoreFixedIps'
op|'('
name|'net'
op|'='
name|'network_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'params'
op|'='
op|'{'
string|"'allocated'"
op|':'
name|'virtual_interface_id'
name|'is'
name|'not'
name|'None'
op|'}'
newline|'\n'
name|'if'
name|'fixed_ip_ref'
op|'['
string|"'network_id'"
op|']'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'['
string|"'network_id'"
op|']'
op|'='
name|'network_id'
newline|'\n'
dedent|''
name|'if'
name|'instance_uuid'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'['
string|"'instance_uuid'"
op|']'
op|'='
name|'instance_uuid'
newline|'\n'
dedent|''
name|'if'
name|'host'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'['
string|"'host'"
op|']'
op|'='
name|'host'
newline|'\n'
dedent|''
name|'if'
name|'virtual_interface_id'
op|':'
newline|'\n'
indent|' '
name|'params'
op|'['
string|"'virtual_interface_id'"
op|']'
op|'='
name|'virtual_interface_id'
newline|'\n'
nl|'\n'
dedent|''
name|'rows_updated'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FixedIp'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'fixed_ip_ref'
op|'['
string|"'id'"
op|']'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'network_id'
op|'='
name|'fixed_ip_ref'
op|'['
string|"'network_id'"
op|']'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'reserved'
op|'='
name|'False'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'None'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'None'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'leased'
op|'='
name|'False'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'address'
op|'='
name|'fixed_ip_ref'
op|'['
string|"'address'"
op|']'
op|')'
op|'.'
name|'update'
op|'('
name|'params'
op|','
name|'synchronize_session'
op|'='
string|"'evaluate'"
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'rows_updated'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'debug'
op|'('
string|"'The row was updated in a concurrent transaction, '"
nl|'\n'
string|"'we will fetch another row'"
op|')'
newline|'\n'
name|'raise'
name|'db_exc'
op|'.'
name|'RetryRequest'
op|'('
nl|'\n'
name|'exception'
op|'.'
name|'FixedIpAssociateFailed'
op|'('
name|'net'
op|'='
name|'network_id'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'fixed_ip_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|fixed_ip_create
name|'def'
name|'fixed_ip_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'fixed_ip_ref'
op|'='
name|'models'
op|'.'
name|'FixedIp'
op|'('
op|')'
newline|'\n'
name|'fixed_ip_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'fixed_ip_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FixedIpExists'
op|'('
name|'address'
op|'='
name|'values'
op|'['
string|"'address'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'fixed_ip_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|fixed_ip_bulk_create
name|'def'
name|'fixed_ip_bulk_create'
op|'('
name|'context'
op|','
name|'ips'
op|')'
op|':'
newline|'\n'
indent|' '
name|'try'
op|':'
newline|'\n'
indent|' '
name|'tab'
op|'='
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'__table__'
newline|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'execute'
op|'('
name|'tab'
op|'.'
name|'insert'
op|'('
op|')'
op|','
name|'ips'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
name|'as'
name|'e'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FixedIpExists'
op|'('
name|'address'
op|'='
name|'e'
op|'.'
name|'value'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|fixed_ip_disassociate
name|'def'
name|'fixed_ip_disassociate'
op|'('
name|'context'
op|','
name|'address'
op|')'
op|':'
newline|'\n'
indent|' '
name|'_fixed_ip_get_by_address'
op|'('
name|'context'
op|','
name|'address'
op|')'
op|'.'
name|'update'
op|'('
nl|'\n'
op|'{'
string|"'instance_uuid'"
op|':'
name|'None'
op|','
nl|'\n'
string|"'virtual_interface_id'"
op|':'
name|'None'
op|'}'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|fixed_ip_disassociate_all_by_timeout
name|'def'
name|'fixed_ip_disassociate_all_by_timeout'
op|'('
name|'context'
op|','
name|'host'
op|','
name|'time'
op|')'
op|':'
newline|'\n'
comment|'# NOTE(vish): only update fixed ips that "belong" to this'
nl|'\n'
comment|'# host; i.e. the network host or the instance'
nl|'\n'
comment|'# host matches. Two queries necessary because'
nl|'\n'
comment|"# join with update doesn't work."
nl|'\n'
indent|' '
name|'host_filter'
op|'='
name|'or_'
op|'('
name|'and_'
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'host'
op|'=='
name|'host'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Network'
op|'.'
name|'multi_host'
op|'=='
name|'true'
op|'('
op|')'
op|')'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Network'
op|'.'
name|'host'
op|'=='
name|'host'
op|')'
newline|'\n'
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FixedIp'
op|','
op|'('
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'id'
op|','
op|')'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'allocated'
op|'=='
name|'false'
op|'('
op|')'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'updated_at'
op|'<'
name|'time'
op|')'
op|'.'
name|'join'
op|'('
op|'('
name|'models'
op|'.'
name|'Network'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Network'
op|'.'
name|'id'
op|'=='
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'network_id'
op|')'
op|')'
op|'.'
name|'join'
op|'('
op|'('
name|'models'
op|'.'
name|'Instance'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'uuid'
op|'=='
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'instance_uuid'
op|')'
op|')'
op|'.'
name|'filter'
op|'('
name|'host_filter'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'fixed_ip_ids'
op|'='
op|'['
name|'fip'
op|'['
number|'0'
op|']'
name|'for'
name|'fip'
name|'in'
name|'result'
op|']'
newline|'\n'
name|'if'
name|'not'
name|'fixed_ip_ids'
op|':'
newline|'\n'
indent|' '
name|'return'
number|'0'
newline|'\n'
dedent|''
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FixedIp'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'id'
op|'.'
name|'in_'
op|'('
name|'fixed_ip_ids'
op|')'
op|')'
op|'.'
name|'update'
op|'('
op|'{'
string|"'instance_uuid'"
op|':'
name|'None'
op|','
nl|'\n'
string|"'leased'"
op|':'
name|'False'
op|','
nl|'\n'
string|"'updated_at'"
op|':'
name|'timeutils'
op|'.'
name|'utcnow'
op|'('
op|')'
op|'}'
op|','
nl|'\n'
name|'synchronize_session'
op|'='
string|"'fetch'"
op|')'
newline|'\n'
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
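As the NOTE(vish) comment explains, `fixed_ip_disassociate_all_by_timeout` needs two queries: it first SELECTs the matching ids through joins against Network and Instance, then UPDATEs by `id IN (...)`, because an UPDATE with a JOIN is not portable across backends. A sqlite3 sketch of that two-step shape (simplified to a single join and made-up timestamps):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE networks (id INTEGER PRIMARY KEY, host TEXT);
CREATE TABLE fixed_ips (
    id INTEGER PRIMARY KEY, network_id INTEGER,
    instance_uuid TEXT, leased INTEGER, updated_at INTEGER);
INSERT INTO networks VALUES (1, 'host-a'), (2, 'host-b');
INSERT INTO fixed_ips VALUES
    (1, 1, 'u1', 1, 100),
    (2, 2, 'u2', 1, 100);
""")

def disassociate_all_by_timeout(conn, host, cutoff):
    # Query 1: find candidate ids via the join.
    ids = [r[0] for r in conn.execute(
        "SELECT f.id FROM fixed_ips f "
        "JOIN networks n ON n.id = f.network_id "
        "WHERE n.host = ? AND f.updated_at < ?", (host, cutoff))]
    if not ids:
        return 0
    # Query 2: update by primary key, since UPDATE ... JOIN is not
    # portable across backends (the reason for the split).
    marks = ",".join("?" * len(ids))
    cur = conn.execute(
        "UPDATE fixed_ips SET instance_uuid = NULL, leased = 0 "
        "WHERE id IN (%s)" % marks, ids)
    return cur.rowcount

n = disassociate_all_by_timeout(conn, 'host-a', 200)
```

Only the fixed IP on `host-a`'s network is released; the one on `host-b` is untouched, matching the host-scoped behavior of the original.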
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|fixed_ip_get
name|'def'
name|'fixed_ip_get'
op|'('
name|'context'
op|','
name|'id'
op|','
name|'get_network'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FixedIp'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'id'
op|')'
newline|'\n'
name|'if'
name|'get_network'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'network'"
op|')'
op|')'
newline|'\n'
dedent|''
name|'result'
op|'='
name|'query'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FixedIpNotFound'
op|'('
name|'id'
op|'='
name|'id'
op|')'
newline|'\n'
nl|'\n'
comment|"# FIXME(sirp): shouldn't we just use project_only here to restrict the"
nl|'\n'
comment|'# results?'
nl|'\n'
dedent|''
name|'if'
op|'('
name|'nova'
op|'.'
name|'context'
op|'.'
name|'is_user_context'
op|'('
name|'context'
op|')'
name|'and'
nl|'\n'
name|'result'
op|'['
string|"'instance_uuid'"
op|']'
name|'is'
name|'not'
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'='
name|'instance_get_by_uuid'
op|'('
name|'context'
op|'.'
name|'elevated'
op|'('
name|'read_deleted'
op|'='
string|"'yes'"
op|')'
op|','
nl|'\n'
name|'result'
op|'['
string|"'instance_uuid'"
op|']'
op|')'
newline|'\n'
name|'nova'
op|'.'
name|'context'
op|'.'
name|'authorize_project_context'
op|'('
name|'context'
op|','
name|'instance'
op|'.'
name|'project_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|fixed_ip_get_all
name|'def'
name|'fixed_ip_get_all'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FixedIp'
op|','
name|'read_deleted'
op|'='
string|'"yes"'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'NoFixedIpsDefined'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|fixed_ip_get_by_address
name|'def'
name|'fixed_ip_get_by_address'
op|'('
name|'context'
op|','
name|'address'
op|','
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_fixed_ip_get_by_address'
op|'('
name|'context'
op|','
name|'address'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'columns_to_join'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_fixed_ip_get_by_address
dedent|''
name|'def'
name|'_fixed_ip_get_by_address'
op|'('
name|'context'
op|','
name|'address'
op|','
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'columns_to_join'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'columns_to_join'
op|'='
op|'['
op|']'
newline|'\n'
nl|'\n'
dedent|''
name|'try'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FixedIp'
op|')'
newline|'\n'
name|'for'
name|'column'
name|'in'
name|'columns_to_join'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'result'
op|'.'
name|'options'
op|'('
name|'joinedload_all'
op|'('
name|'column'
op|')'
op|')'
newline|'\n'
dedent|''
name|'result'
op|'='
name|'result'
op|'.'
name|'filter_by'
op|'('
name|'address'
op|'='
name|'address'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FixedIpNotFoundForAddress'
op|'('
name|'address'
op|'='
name|'address'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBError'
op|':'
newline|'\n'
indent|' '
name|'msg'
op|'='
name|'_'
op|'('
string|'"Invalid fixed IP Address %s in request"'
op|')'
op|'%'
name|'address'
newline|'\n'
name|'LOG'
op|'.'
name|'warning'
op|'('
name|'msg'
op|')'
newline|'\n'
name|'raise'
name|'exception'
op|'.'
name|'FixedIpInvalid'
op|'('
name|'msg'
op|')'
newline|'\n'
nl|'\n'
comment|"# NOTE(sirp): shouldn't we just use project_only here to restrict the"
nl|'\n'
comment|'# results?'
nl|'\n'
dedent|''
name|'if'
op|'('
name|'nova'
op|'.'
name|'context'
op|'.'
name|'is_user_context'
op|'('
name|'context'
op|')'
name|'and'
nl|'\n'
name|'result'
op|'['
string|"'instance_uuid'"
op|']'
name|'is'
name|'not'
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'='
name|'_instance_get_by_uuid'
op|'('
nl|'\n'
name|'context'
op|'.'
name|'elevated'
op|'('
name|'read_deleted'
op|'='
string|"'yes'"
op|')'
op|','
nl|'\n'
name|'result'
op|'['
string|"'instance_uuid'"
op|']'
op|')'
newline|'\n'
name|'nova'
op|'.'
name|'context'
op|'.'
name|'authorize_project_context'
op|'('
name|'context'
op|','
nl|'\n'
name|'instance'
op|'.'
name|'project_id'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|fixed_ip_get_by_floating_address
name|'def'
name|'fixed_ip_get_by_floating_address'
op|'('
name|'context'
op|','
name|'floating_address'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FixedIp'
op|')'
op|'.'
name|'join'
op|'('
name|'models'
op|'.'
name|'FloatingIp'
op|','
nl|'\n'
name|'models'
op|'.'
name|'FloatingIp'
op|'.'
name|'fixed_ip_id'
op|'=='
nl|'\n'
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'id'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'FloatingIp'
op|'.'
name|'address'
op|'=='
name|'floating_address'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
comment|"# NOTE(tr3buchet) please don't invent an exception here, None is fine"
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|fixed_ip_get_by_instance
name|'def'
name|'fixed_ip_get_by_instance'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'not'
name|'uuidutils'
op|'.'
name|'is_uuid_like'
op|'('
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InvalidUUID'
op|'('
name|'uuid'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'vif_and'
op|'='
name|'and_'
op|'('
name|'models'
op|'.'
name|'VirtualInterface'
op|'.'
name|'id'
op|'=='
nl|'\n'
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'virtual_interface_id'
op|','
nl|'\n'
name|'models'
op|'.'
name|'VirtualInterface'
op|'.'
name|'deleted'
op|'=='
number|'0'
op|')'
newline|'\n'
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FixedIp'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'outerjoin'
op|'('
name|'models'
op|'.'
name|'VirtualInterface'
op|','
name|'vif_and'
op|')'
op|'.'
name|'options'
op|'('
name|'contains_eager'
op|'('
string|'"virtual_interface"'
op|')'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'network'"
op|')'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'floating_ips'"
op|')'
op|')'
op|'.'
name|'order_by'
op|'('
name|'asc'
op|'('
name|'models'
op|'.'
name|'VirtualInterface'
op|'.'
name|'created_at'
op|')'
op|','
nl|'\n'
name|'asc'
op|'('
name|'models'
op|'.'
name|'VirtualInterface'
op|'.'
name|'id'
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FixedIpNotFoundForInstance'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|fixed_ip_get_by_host
name|'def'
name|'fixed_ip_get_by_host'
op|'('
name|'context'
op|','
name|'host'
op|')'
op|':'
newline|'\n'
indent|' '
name|'instance_uuids'
op|'='
name|'_instance_get_all_uuids_by_host'
op|'('
name|'context'
op|','
name|'host'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'instance_uuids'
op|':'
newline|'\n'
indent|' '
name|'return'
op|'['
op|']'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FixedIp'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'instance_uuid'
op|'.'
name|'in_'
op|'('
name|'instance_uuids'
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|fixed_ip_get_by_network_host
name|'def'
name|'fixed_ip_get_by_network_host'
op|'('
name|'context'
op|','
name|'network_id'
op|','
name|'host'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FixedIp'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'network_id'
op|'='
name|'network_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'host'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FixedIpNotFoundForNetworkHost'
op|'('
name|'network_id'
op|'='
name|'network_id'
op|','
nl|'\n'
name|'host'
op|'='
name|'host'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|fixed_ips_by_virtual_interface
name|'def'
name|'fixed_ips_by_virtual_interface'
op|'('
name|'context'
op|','
name|'vif_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FixedIp'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'virtual_interface_id'
op|'='
name|'vif_id'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'network'"
op|')'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'floating_ips'"
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|fixed_ip_update
name|'def'
name|'fixed_ip_update'
op|'('
name|'context'
op|','
name|'address'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'_fixed_ip_get_by_address'
op|'('
name|'context'
op|','
name|'address'
op|')'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_fixed_ip_count_by_project
dedent|''
name|'def'
name|'_fixed_ip_count_by_project'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'nova'
op|'.'
name|'context'
op|'.'
name|'authorize_project_context'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
newline|'\n'
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'FixedIp'
op|','
op|'('
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'id'
op|','
op|')'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'join'
op|'('
op|'('
name|'models'
op|'.'
name|'Instance'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'uuid'
op|'=='
name|'models'
op|'.'
name|'FixedIp'
op|'.'
name|'instance_uuid'
op|')'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'project_id'
op|'=='
name|'project_id'
op|')'
op|'.'
name|'count'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|virtual_interface_create
name|'def'
name|'virtual_interface_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Create a new virtual interface record in the database.\n\n :param values: dict containing column values\n """'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'vif_ref'
op|'='
name|'models'
op|'.'
name|'VirtualInterface'
op|'('
op|')'
newline|'\n'
name|'vif_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'vif_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBError'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'VirtualInterfaceCreateException'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'vif_ref'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_virtual_interface_query
dedent|''
name|'def'
name|'_virtual_interface_query'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'VirtualInterface'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|virtual_interface_get
name|'def'
name|'virtual_interface_get'
op|'('
name|'context'
op|','
name|'vif_id'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Gets a virtual interface from the table.\n\n :param vif_id: id of the virtual interface\n """'
newline|'\n'
name|'vif_ref'
op|'='
name|'_virtual_interface_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'vif_id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'return'
name|'vif_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|virtual_interface_get_by_address
name|'def'
name|'virtual_interface_get_by_address'
op|'('
name|'context'
op|','
name|'address'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Gets a virtual interface from the table.\n\n :param address: the address of the interface you\'re looking to get\n """'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'vif_ref'
op|'='
name|'_virtual_interface_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'address'
op|'='
name|'address'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBError'
op|':'
newline|'\n'
indent|' '
name|'msg'
op|'='
name|'_'
op|'('
string|'"Invalid virtual interface address %s in request"'
op|')'
op|'%'
name|'address'
newline|'\n'
name|'LOG'
op|'.'
name|'warning'
op|'('
name|'msg'
op|')'
newline|'\n'
name|'raise'
name|'exception'
op|'.'
name|'InvalidIpAddressError'
op|'('
name|'msg'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'vif_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|virtual_interface_get_by_uuid
name|'def'
name|'virtual_interface_get_by_uuid'
op|'('
name|'context'
op|','
name|'vif_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Gets a virtual interface from the table.\n\n :param vif_uuid: the uuid of the interface you\'re looking to get\n """'
newline|'\n'
name|'vif_ref'
op|'='
name|'_virtual_interface_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'uuid'
op|'='
name|'vif_uuid'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'return'
name|'vif_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'require_instance_exists_using_uuid'
newline|'\n'
op|'@'
name|'pick_context_manager_reader_allow_async'
newline|'\n'
DECL|function|virtual_interface_get_by_instance
name|'def'
name|'virtual_interface_get_by_instance'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Gets all virtual interfaces for instance.\n\n :param instance_uuid: uuid of the instance to retrieve vifs for\n """'
newline|'\n'
name|'vif_refs'
op|'='
name|'_virtual_interface_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'order_by'
op|'('
name|'asc'
op|'('
string|'"created_at"'
op|')'
op|','
name|'asc'
op|'('
string|'"id"'
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'return'
name|'vif_refs'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|virtual_interface_get_by_instance_and_network
name|'def'
name|'virtual_interface_get_by_instance_and_network'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
nl|'\n'
name|'network_id'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Gets virtual interface for instance that\'s associated with network."""'
newline|'\n'
name|'vif_ref'
op|'='
name|'_virtual_interface_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'network_id'
op|'='
name|'network_id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'return'
name|'vif_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|virtual_interface_delete_by_instance
name|'def'
name|'virtual_interface_delete_by_instance'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Delete virtual interface records that are associated\n with the instance given by instance_uuid.\n\n :param instance_uuid: uuid of instance\n """'
newline|'\n'
name|'_virtual_interface_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|virtual_interface_get_all
name|'def'
name|'virtual_interface_get_all'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Get all vifs."""'
newline|'\n'
name|'vif_refs'
op|'='
name|'_virtual_interface_query'
op|'('
name|'context'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'return'
name|'vif_refs'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
DECL|function|_metadata_refs
dedent|''
name|'def'
name|'_metadata_refs'
op|'('
name|'metadata_dict'
op|','
name|'meta_class'
op|')'
op|':'
newline|'\n'
indent|' '
name|'metadata_refs'
op|'='
op|'['
op|']'
newline|'\n'
name|'if'
name|'metadata_dict'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'k'
op|','
name|'v'
name|'in'
name|'metadata_dict'
op|'.'
name|'items'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'metadata_ref'
op|'='
name|'meta_class'
op|'('
op|')'
newline|'\n'
name|'metadata_ref'
op|'['
string|"'key'"
op|']'
op|'='
name|'k'
newline|'\n'
name|'metadata_ref'
op|'['
string|"'value'"
op|']'
op|'='
name|'v'
newline|'\n'
name|'metadata_refs'
op|'.'
name|'append'
op|'('
name|'metadata_ref'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'return'
name|'metadata_refs'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_validate_unique_server_name
dedent|''
name|'def'
name|'_validate_unique_server_name'
op|'('
name|'context'
op|','
name|'name'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'not'
name|'CONF'
op|'.'
name|'osapi_compute_unique_server_name_scope'
op|':'
newline|'\n'
indent|' '
name|'return'
newline|'\n'
nl|'\n'
dedent|''
name|'lowername'
op|'='
name|'name'
op|'.'
name|'lower'
op|'('
op|')'
newline|'\n'
name|'base_query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Instance'
op|','
name|'read_deleted'
op|'='
string|"'no'"
op|')'
op|'.'
name|'filter'
op|'('
name|'func'
op|'.'
name|'lower'
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'hostname'
op|')'
op|'=='
name|'lowername'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'CONF'
op|'.'
name|'osapi_compute_unique_server_name_scope'
op|'=='
string|"'project'"
op|':'
newline|'\n'
indent|' '
name|'instance_with_same_name'
op|'='
name|'base_query'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'context'
op|'.'
name|'project_id'
op|')'
op|'.'
name|'count'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'elif'
name|'CONF'
op|'.'
name|'osapi_compute_unique_server_name_scope'
op|'=='
string|"'global'"
op|':'
newline|'\n'
indent|' '
name|'instance_with_same_name'
op|'='
name|'base_query'
op|'.'
name|'count'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'msg'
op|'='
name|'_'
op|'('
string|"'Unknown osapi_compute_unique_server_name_scope value: %s'"
nl|'\n'
string|'\' Flag must be empty, "global" or\''
nl|'\n'
string|'\' "project"\''
op|')'
op|'%'
name|'CONF'
op|'.'
name|'osapi_compute_unique_server_name_scope'
newline|'\n'
name|'LOG'
op|'.'
name|'warning'
op|'('
name|'msg'
op|')'
newline|'\n'
name|'return'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'instance_with_same_name'
op|'>'
number|'0'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceExists'
op|'('
name|'name'
op|'='
name|'lowername'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_handle_objects_related_type_conversions
dedent|''
dedent|''
name|'def'
name|'_handle_objects_related_type_conversions'
op|'('
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Make sure that certain things in values (which may have come from\n an objects.instance.Instance object) are in suitable form for the\n database.\n """'
newline|'\n'
comment|'# NOTE(danms): Make sure IP addresses are passed as strings to'
nl|'\n'
comment|'# the database engine'
nl|'\n'
name|'for'
name|'key'
name|'in'
op|'('
string|"'access_ip_v4'"
op|','
string|"'access_ip_v6'"
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'key'
name|'in'
name|'values'
name|'and'
name|'values'
op|'['
name|'key'
op|']'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'values'
op|'['
name|'key'
op|']'
op|'='
name|'str'
op|'('
name|'values'
op|'['
name|'key'
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'datetime_keys'
op|'='
op|'('
string|"'created_at'"
op|','
string|"'deleted_at'"
op|','
string|"'updated_at'"
op|','
nl|'\n'
string|"'launched_at'"
op|','
string|"'terminated_at'"
op|')'
newline|'\n'
name|'convert_objects_related_datetimes'
op|'('
name|'values'
op|','
op|'*'
name|'datetime_keys'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_check_instance_exists_in_project
dedent|''
name|'def'
name|'_check_instance_exists_in_project'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'not'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Instance'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|','
nl|'\n'
name|'project_only'
op|'='
name|'True'
op|')'
op|'.'
name|'filter_by'
op|'('
nl|'\n'
name|'uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'first'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceNotFound'
op|'('
name|'instance_id'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|instance_create
name|'def'
name|'instance_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Create a new Instance record in the database.\n\n context - request context object\n values - dict containing column values.\n """'
newline|'\n'
nl|'\n'
name|'security_group_ensure_default'
op|'('
name|'context'
op|')'
newline|'\n'
nl|'\n'
name|'values'
op|'='
name|'values'
op|'.'
name|'copy'
op|'('
op|')'
newline|'\n'
name|'values'
op|'['
string|"'metadata'"
op|']'
op|'='
name|'_metadata_refs'
op|'('
nl|'\n'
name|'values'
op|'.'
name|'get'
op|'('
string|"'metadata'"
op|')'
op|','
name|'models'
op|'.'
name|'InstanceMetadata'
op|')'
newline|'\n'
nl|'\n'
name|'values'
op|'['
string|"'system_metadata'"
op|']'
op|'='
name|'_metadata_refs'
op|'('
nl|'\n'
name|'values'
op|'.'
name|'get'
op|'('
string|"'system_metadata'"
op|')'
op|','
name|'models'
op|'.'
name|'InstanceSystemMetadata'
op|')'
newline|'\n'
name|'_handle_objects_related_type_conversions'
op|'('
name|'values'
op|')'
newline|'\n'
nl|'\n'
name|'instance_ref'
op|'='
name|'models'
op|'.'
name|'Instance'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'values'
op|'.'
name|'get'
op|'('
string|"'uuid'"
op|')'
op|':'
newline|'\n'
indent|' '
name|'values'
op|'['
string|"'uuid'"
op|']'
op|'='
name|'str'
op|'('
name|'uuid'
op|'.'
name|'uuid4'
op|'('
op|')'
op|')'
newline|'\n'
dedent|''
name|'instance_ref'
op|'['
string|"'info_cache'"
op|']'
op|'='
name|'models'
op|'.'
name|'InstanceInfoCache'
op|'('
op|')'
newline|'\n'
name|'info_cache'
op|'='
name|'values'
op|'.'
name|'pop'
op|'('
string|"'info_cache'"
op|','
name|'None'
op|')'
newline|'\n'
name|'if'
name|'info_cache'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'instance_ref'
op|'['
string|"'info_cache'"
op|']'
op|'.'
name|'update'
op|'('
name|'info_cache'
op|')'
newline|'\n'
dedent|''
name|'security_groups'
op|'='
name|'values'
op|'.'
name|'pop'
op|'('
string|"'security_groups'"
op|','
op|'['
op|']'
op|')'
newline|'\n'
name|'instance_ref'
op|'['
string|"'extra'"
op|']'
op|'='
name|'models'
op|'.'
name|'InstanceExtra'
op|'('
op|')'
newline|'\n'
name|'instance_ref'
op|'['
string|"'extra'"
op|']'
op|'.'
name|'update'
op|'('
nl|'\n'
op|'{'
string|"'numa_topology'"
op|':'
name|'None'
op|','
nl|'\n'
string|"'pci_requests'"
op|':'
name|'None'
op|','
nl|'\n'
string|"'vcpu_model'"
op|':'
name|'None'
op|','
nl|'\n'
op|'}'
op|')'
newline|'\n'
name|'instance_ref'
op|'['
string|"'extra'"
op|']'
op|'.'
name|'update'
op|'('
name|'values'
op|'.'
name|'pop'
op|'('
string|"'extra'"
op|','
op|'{'
op|'}'
op|')'
op|')'
newline|'\n'
name|'instance_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
nl|'\n'
DECL|function|_get_sec_group_models
name|'def'
name|'_get_sec_group_models'
op|'('
name|'security_groups'
op|')'
op|':'
newline|'\n'
indent|' '
name|'models'
op|'='
op|'['
op|']'
newline|'\n'
name|'default_group'
op|'='
name|'_security_group_ensure_default'
op|'('
name|'context'
op|')'
newline|'\n'
name|'if'
string|"'default'"
name|'in'
name|'security_groups'
op|':'
newline|'\n'
indent|' '
name|'models'
op|'.'
name|'append'
op|'('
name|'default_group'
op|')'
newline|'\n'
comment|"# Generate a new list, so we don't modify the original"
nl|'\n'
name|'security_groups'
op|'='
op|'['
name|'x'
name|'for'
name|'x'
name|'in'
name|'security_groups'
name|'if'
name|'x'
op|'!='
string|"'default'"
op|']'
newline|'\n'
dedent|''
name|'if'
name|'security_groups'
op|':'
newline|'\n'
indent|' '
name|'models'
op|'.'
name|'extend'
op|'('
name|'_security_group_get_by_names'
op|'('
name|'context'
op|','
nl|'\n'
name|'context'
op|'.'
name|'project_id'
op|','
name|'security_groups'
op|')'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'models'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
string|"'hostname'"
name|'in'
name|'values'
op|':'
newline|'\n'
indent|' '
name|'_validate_unique_server_name'
op|'('
name|'context'
op|','
name|'values'
op|'['
string|"'hostname'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'instance_ref'
op|'.'
name|'security_groups'
op|'='
name|'_get_sec_group_models'
op|'('
name|'security_groups'
op|')'
newline|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'add'
op|'('
name|'instance_ref'
op|')'
newline|'\n'
nl|'\n'
comment|'# create the instance uuid to ec2_id mapping entry for instance'
nl|'\n'
name|'ec2_instance_create'
op|'('
name|'context'
op|','
name|'instance_ref'
op|'['
string|"'uuid'"
op|']'
op|')'
newline|'\n'
nl|'\n'
name|'return'
name|'instance_ref'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_instance_data_get_for_user
dedent|''
name|'def'
name|'_instance_data_get_for_user'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Instance'
op|','
op|'('
nl|'\n'
name|'func'
op|'.'
name|'count'
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'id'
op|')'
op|','
nl|'\n'
name|'func'
op|'.'
name|'sum'
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'vcpus'
op|')'
op|','
nl|'\n'
name|'func'
op|'.'
name|'sum'
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'memory_mb'
op|')'
op|')'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
newline|'\n'
name|'if'
name|'user_id'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'result'
op|'.'
name|'filter_by'
op|'('
name|'user_id'
op|'='
name|'user_id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'result'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
comment|'# NOTE(vish): convert None to 0'
nl|'\n'
dedent|''
name|'return'
op|'('
name|'result'
op|'['
number|'0'
op|']'
name|'or'
number|'0'
op|','
name|'result'
op|'['
number|'1'
op|']'
name|'or'
number|'0'
op|','
name|'result'
op|'['
number|'2'
op|']'
name|'or'
number|'0'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|')'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|instance_destroy
name|'def'
name|'instance_destroy'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'constraint'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'uuidutils'
op|'.'
name|'is_uuid_like'
op|'('
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'instance_ref'
op|'='
name|'_instance_get_by_uuid'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InvalidUUID'
op|'('
name|'uuid'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Instance'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'uuid'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
name|'if'
name|'constraint'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'constraint'
op|'.'
name|'apply'
op|'('
name|'models'
op|'.'
name|'Instance'
op|','
name|'query'
op|')'
newline|'\n'
dedent|''
name|'count'
op|'='
name|'query'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'if'
name|'count'
op|'=='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ConstraintNotMet'
op|'('
op|')'
newline|'\n'
dedent|''
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'SecurityGroupInstanceAssociation'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceInfoCache'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceMetadata'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceFault'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceExtra'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceSystemMetadata'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceGroupMember'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_id'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'BlockDeviceMapping'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
comment|"# NOTE(snikitin): We can't use model_query here, because there is no"
nl|'\n'
comment|"# column 'deleted' in 'tags' table."
nl|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'query'
op|'('
name|'models'
op|'.'
name|'Tag'
op|')'
op|'.'
name|'filter_by'
op|'('
nl|'\n'
name|'resource_id'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'return'
name|'instance_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader_allow_async'
newline|'\n'
DECL|function|instance_get_by_uuid
name|'def'
name|'instance_get_by_uuid'
op|'('
name|'context'
op|','
name|'uuid'
op|','
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_instance_get_by_uuid'
op|'('
name|'context'
op|','
name|'uuid'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'columns_to_join'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_instance_get_by_uuid
dedent|''
name|'def'
name|'_instance_get_by_uuid'
op|'('
name|'context'
op|','
name|'uuid'
op|','
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'_build_instance_get'
op|'('
name|'context'
op|','
name|'columns_to_join'
op|'='
name|'columns_to_join'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'uuid'
op|'='
name|'uuid'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceNotFound'
op|'('
name|'instance_id'
op|'='
name|'uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|instance_get
name|'def'
name|'instance_get'
op|'('
name|'context'
op|','
name|'instance_id'
op|','
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'try'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'_build_instance_get'
op|'('
name|'context'
op|','
name|'columns_to_join'
op|'='
name|'columns_to_join'
nl|'\n'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'instance_id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceNotFound'
op|'('
name|'instance_id'
op|'='
name|'instance_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBError'
op|':'
newline|'\n'
comment|'# NOTE(sdague): catch all in case the db engine chokes on the'
nl|'\n'
comment|"# id because it's too long of an int to store."
nl|'\n'
indent|' '
name|'msg'
op|'='
name|'_'
op|'('
string|'"Invalid instance id %s in request"'
op|')'
op|'%'
name|'instance_id'
newline|'\n'
name|'LOG'
op|'.'
name|'warning'
op|'('
name|'msg'
op|')'
newline|'\n'
name|'raise'
name|'exception'
op|'.'
name|'InvalidID'
op|'('
name|'id'
op|'='
name|'instance_id'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_build_instance_get
dedent|''
dedent|''
name|'def'
name|'_build_instance_get'
op|'('
name|'context'
op|','
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Instance'
op|','
name|'project_only'
op|'='
name|'True'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload_all'
op|'('
string|"'security_groups.rules'"
op|')'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'info_cache'"
op|')'
op|')'
newline|'\n'
name|'if'
name|'columns_to_join'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'columns_to_join'
op|'='
op|'['
string|"'metadata'"
op|','
string|"'system_metadata'"
op|']'
newline|'\n'
dedent|''
name|'for'
name|'column'
name|'in'
name|'columns_to_join'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'column'
name|'in'
op|'['
string|"'info_cache'"
op|','
string|"'security_groups'"
op|']'
op|':'
newline|'\n'
comment|'# Already always joined above'
nl|'\n'
indent|' '
name|'continue'
newline|'\n'
dedent|''
name|'if'
string|"'extra.'"
name|'in'
name|'column'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'undefer'
op|'('
name|'column'
op|')'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
name|'column'
op|')'
op|')'
newline|'\n'
comment|'# NOTE(alaski) Stop lazy loading of columns not needed.'
nl|'\n'
dedent|''
dedent|''
name|'for'
name|'col'
name|'in'
op|'['
string|"'metadata'"
op|','
string|"'system_metadata'"
op|']'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'col'
name|'not'
name|'in'
name|'columns_to_join'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'noload'
op|'('
name|'col'
op|')'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'return'
name|'query'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_instances_fill_metadata
dedent|''
name|'def'
name|'_instances_fill_metadata'
op|'('
name|'context'
op|','
name|'instances'
op|','
name|'manual_joins'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Selectively fill instances with manually-joined metadata. Note that\n instance will be converted to a dict.\n\n :param context: security context\n :param instances: list of instances to fill\n :param manual_joins: list of tables to manually join (can be any\n combination of \'metadata\' and \'system_metadata\' or\n None to take the default of both)\n """'
newline|'\n'
name|'uuids'
op|'='
op|'['
name|'inst'
op|'['
string|"'uuid'"
op|']'
name|'for'
name|'inst'
name|'in'
name|'instances'
op|']'
newline|'\n'
nl|'\n'
name|'if'
name|'manual_joins'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'manual_joins'
op|'='
op|'['
string|"'metadata'"
op|','
string|"'system_metadata'"
op|']'
newline|'\n'
nl|'\n'
dedent|''
name|'meta'
op|'='
name|'collections'
op|'.'
name|'defaultdict'
op|'('
name|'list'
op|')'
newline|'\n'
name|'if'
string|"'metadata'"
name|'in'
name|'manual_joins'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'row'
name|'in'
name|'_instance_metadata_get_multi'
op|'('
name|'context'
op|','
name|'uuids'
op|')'
op|':'
newline|'\n'
indent|' '
name|'meta'
op|'['
name|'row'
op|'['
string|"'instance_uuid'"
op|']'
op|']'
op|'.'
name|'append'
op|'('
name|'row'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'sys_meta'
op|'='
name|'collections'
op|'.'
name|'defaultdict'
op|'('
name|'list'
op|')'
newline|'\n'
name|'if'
string|"'system_metadata'"
name|'in'
name|'manual_joins'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'row'
name|'in'
name|'_instance_system_metadata_get_multi'
op|'('
name|'context'
op|','
name|'uuids'
op|')'
op|':'
newline|'\n'
indent|' '
name|'sys_meta'
op|'['
name|'row'
op|'['
string|"'instance_uuid'"
op|']'
op|']'
op|'.'
name|'append'
op|'('
name|'row'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'pcidevs'
op|'='
name|'collections'
op|'.'
name|'defaultdict'
op|'('
name|'list'
op|')'
newline|'\n'
name|'if'
string|"'pci_devices'"
name|'in'
name|'manual_joins'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'row'
name|'in'
name|'_instance_pcidevs_get_multi'
op|'('
name|'context'
op|','
name|'uuids'
op|')'
op|':'
newline|'\n'
indent|' '
name|'pcidevs'
op|'['
name|'row'
op|'['
string|"'instance_uuid'"
op|']'
op|']'
op|'.'
name|'append'
op|'('
name|'row'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'filled_instances'
op|'='
op|'['
op|']'
newline|'\n'
name|'for'
name|'inst'
name|'in'
name|'instances'
op|':'
newline|'\n'
indent|' '
name|'inst'
op|'='
name|'dict'
op|'('
name|'inst'
op|')'
newline|'\n'
name|'inst'
op|'['
string|"'system_metadata'"
op|']'
op|'='
name|'sys_meta'
op|'['
name|'inst'
op|'['
string|"'uuid'"
op|']'
op|']'
newline|'\n'
name|'inst'
op|'['
string|"'metadata'"
op|']'
op|'='
name|'meta'
op|'['
name|'inst'
op|'['
string|"'uuid'"
op|']'
op|']'
newline|'\n'
name|'if'
string|"'pci_devices'"
name|'in'
name|'manual_joins'
op|':'
newline|'\n'
indent|' '
name|'inst'
op|'['
string|"'pci_devices'"
op|']'
op|'='
name|'pcidevs'
op|'['
name|'inst'
op|'['
string|"'uuid'"
op|']'
op|']'
newline|'\n'
dedent|''
name|'filled_instances'
op|'.'
name|'append'
op|'('
name|'inst'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'filled_instances'
newline|'\n'
nl|'\n'
nl|'\n'
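The bulk grouping that `_instances_fill_metadata` performs can be sketched independently of the database layer. This is a minimal, hedged sketch: `fill_metadata`, the row shapes, and the sample data are illustrative names, not part of Nova.

```python
import collections

def fill_metadata(instances, meta_rows):
    # Bucket bulk-fetched metadata rows by their owning instance uuid,
    # mirroring the defaultdict(list) grouping in the function above.
    meta = collections.defaultdict(list)
    for row in meta_rows:
        meta[row['instance_uuid']].append(row)
    filled = []
    for inst in instances:
        inst = dict(inst)  # note: each instance becomes a plain dict
        inst['metadata'] = meta[inst['uuid']]
        filled.append(inst)
    return filled
```

Instances with no matching rows simply get an empty list, since `defaultdict(list)` supplies the default.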
DECL|function|_manual_join_columns
dedent|''
name|'def'
name|'_manual_join_columns'
op|'('
name|'columns_to_join'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Separate manually joined columns from columns_to_join\n\n If columns_to_join contains \'metadata\', \'system_metadata\', or\n \'pci_devices\', those columns are removed from columns_to_join and added\n to a manual_joins list to be used with the _instances_fill_metadata method.\n\n The columns_to_join formal parameter is copied and not modified; the return\n tuple has the modified columns_to_join list to be used with joinedload in\n a model query.\n\n :param columns_to_join: List of columns to join in a model query.\n :return: tuple of (manual_joins, columns_to_join)\n """'
newline|'\n'
name|'manual_joins'
op|'='
op|'['
op|']'
newline|'\n'
name|'columns_to_join_new'
op|'='
name|'copy'
op|'.'
name|'copy'
op|'('
name|'columns_to_join'
op|')'
newline|'\n'
name|'for'
name|'column'
name|'in'
op|'('
string|"'metadata'"
op|','
string|"'system_metadata'"
op|','
string|"'pci_devices'"
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'column'
name|'in'
name|'columns_to_join_new'
op|':'
newline|'\n'
indent|' '
name|'columns_to_join_new'
op|'.'
name|'remove'
op|'('
name|'column'
op|')'
newline|'\n'
name|'manual_joins'
op|'.'
name|'append'
op|'('
name|'column'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'return'
name|'manual_joins'
op|','
name|'columns_to_join_new'
newline|'\n'
nl|'\n'
nl|'\n'
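The split performed by `_manual_join_columns` can be exercised standalone. This is a minimal copy of the logic above, assuming `columns_to_join` is a list; the function name is mine, not Nova's.

```python
import copy

# Relations that are fetched with separate bulk queries rather than
# joinedload(), as in the function above.
MANUALLY_JOINED = ('metadata', 'system_metadata', 'pci_devices')

def manual_join_columns(columns_to_join):
    """Return (manual_joins, columns_to_join_new) without mutating input."""
    manual_joins = []
    columns_to_join_new = copy.copy(columns_to_join)
    for column in MANUALLY_JOINED:
        if column in columns_to_join_new:
            columns_to_join_new.remove(column)
            manual_joins.append(column)
    return manual_joins, columns_to_join_new
```

The caller's list is left untouched; only the copy is filtered.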
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|instance_get_all
name|'def'
name|'instance_get_all'
op|'('
name|'context'
op|','
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'columns_to_join'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'columns_to_join_new'
op|'='
op|'['
string|"'info_cache'"
op|','
string|"'security_groups'"
op|']'
newline|'\n'
name|'manual_joins'
op|'='
op|'['
string|"'metadata'"
op|','
string|"'system_metadata'"
op|']'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'manual_joins'
op|','
name|'columns_to_join_new'
op|'='
op|'('
nl|'\n'
name|'_manual_join_columns'
op|'('
name|'columns_to_join'
op|')'
op|')'
newline|'\n'
dedent|''
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Instance'
op|')'
newline|'\n'
name|'for'
name|'column'
name|'in'
name|'columns_to_join_new'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
name|'column'
op|')'
op|')'
newline|'\n'
dedent|''
name|'if'
name|'not'
name|'context'
op|'.'
name|'is_admin'
op|':'
newline|'\n'
comment|"# If we're not an admin context, add the appropriate filter."
nl|'\n'
indent|' '
name|'if'
name|'context'
op|'.'
name|'project_id'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'context'
op|'.'
name|'project_id'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter_by'
op|'('
name|'user_id'
op|'='
name|'context'
op|'.'
name|'user_id'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'instances'
op|'='
name|'query'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'return'
name|'_instances_fill_metadata'
op|'('
name|'context'
op|','
name|'instances'
op|','
name|'manual_joins'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader_allow_async'
newline|'\n'
DECL|function|instance_get_all_by_filters
name|'def'
name|'instance_get_all_by_filters'
op|'('
name|'context'
op|','
name|'filters'
op|','
name|'sort_key'
op|','
name|'sort_dir'
op|','
nl|'\n'
name|'limit'
op|'='
name|'None'
op|','
name|'marker'
op|'='
name|'None'
op|','
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Return instances matching all filters sorted by the primary key.\n\n See instance_get_all_by_filters_sort for more information.\n """'
newline|'\n'
comment|'# Invoke the multi-sort-key API using the single sort key and'
nl|'\n'
comment|'# direction provided by the caller'
nl|'\n'
name|'return'
name|'instance_get_all_by_filters_sort'
op|'('
name|'context'
op|','
name|'filters'
op|','
name|'limit'
op|'='
name|'limit'
op|','
nl|'\n'
name|'marker'
op|'='
name|'marker'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'columns_to_join'
op|','
nl|'\n'
name|'sort_keys'
op|'='
op|'['
name|'sort_key'
op|']'
op|','
nl|'\n'
name|'sort_dirs'
op|'='
op|'['
name|'sort_dir'
op|']'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader_allow_async'
newline|'\n'
DECL|function|instance_get_all_by_filters_sort
name|'def'
name|'instance_get_all_by_filters_sort'
op|'('
name|'context'
op|','
name|'filters'
op|','
name|'limit'
op|'='
name|'None'
op|','
name|'marker'
op|'='
name|'None'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'None'
op|','
name|'sort_keys'
op|'='
name|'None'
op|','
nl|'\n'
name|'sort_dirs'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Return instances that match all filters sorted by the given keys.\n Deleted instances are excluded by default, unless the \'deleted\' filter\n says otherwise.\n\n Depending on the name of a filter, matching for that filter is\n performed using either exact matching or regular expression\n matching. Exact matching is applied for the following filters::\n\n | [\'project_id\', \'user_id\', \'image_ref\',\n | \'vm_state\', \'instance_type_id\', \'uuid\',\n | \'metadata\', \'host\', \'system_metadata\']\n\n Another type of filter (also using exact matching), filters\n based on instance metadata tags when supplied under a special\n key named \'filter\'::\n\n | filters = {\n | \'filter\': [\n | {\'name\': \'tag-key\', \'value\': \'<metakey>\'},\n | {\'name\': \'tag-value\', \'value\': \'<metaval>\'},\n | {\'name\': \'tag:<metakey>\', \'value\': \'<metaval>\'}\n | ]\n | }\n\n Special keys are used to tweak the query further::\n\n | \'changes-since\' - only return instances updated after\n | \'deleted\' - only return (or exclude) deleted instances\n | \'soft_deleted\' - modify behavior of \'deleted\' to either\n | include or exclude instances whose\n | vm_state is SOFT_DELETED.\n\n A further type of filter (also using exact matching), filters\n based on instance tags (not metadata tags). There are four types\n of these tags:\n\n `tags` -- One or more strings that will be used to filter results\n in an AND expression: T1 AND T2\n\n `tags-any` -- One or more strings that will be used to filter results in\n an OR expression: T1 OR T2\n\n `not-tags` -- One or more strings that will be used to filter results in\n a NOT AND expression: NOT (T1 AND T2)\n\n `not-tags-any` -- One or more strings that will be used to filter results\n in a NOT OR expression: NOT (T1 OR T2)\n\n Tags should be represented as a list::\n\n | filters = {\n | \'tags\': [some-tag, some-another-tag],\n | \'tags-any\': [some-any-tag, some-another-any-tag],\n | \'not-tags\': [some-not-tag, some-another-not-tag],\n | \'not-tags-any\': [some-not-any-tag, some-another-not-any-tag]\n | }\n\n """'
newline|'\n'
comment|'# NOTE(mriedem): If the limit is 0 there is no point in even going'
nl|'\n'
comment|'# to the database since nothing is going to be returned anyway.'
nl|'\n'
name|'if'
name|'limit'
op|'=='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'return'
op|'['
op|']'
newline|'\n'
nl|'\n'
dedent|''
name|'sort_keys'
op|','
name|'sort_dirs'
op|'='
name|'process_sort_params'
op|'('
name|'sort_keys'
op|','
nl|'\n'
name|'sort_dirs'
op|','
nl|'\n'
name|'default_dir'
op|'='
string|"'desc'"
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'columns_to_join'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'columns_to_join_new'
op|'='
op|'['
string|"'info_cache'"
op|','
string|"'security_groups'"
op|']'
newline|'\n'
name|'manual_joins'
op|'='
op|'['
string|"'metadata'"
op|','
string|"'system_metadata'"
op|']'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'manual_joins'
op|','
name|'columns_to_join_new'
op|'='
op|'('
nl|'\n'
name|'_manual_join_columns'
op|'('
name|'columns_to_join'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'query_prefix'
op|'='
name|'context'
op|'.'
name|'session'
op|'.'
name|'query'
op|'('
name|'models'
op|'.'
name|'Instance'
op|')'
newline|'\n'
name|'for'
name|'column'
name|'in'
name|'columns_to_join_new'
op|':'
newline|'\n'
indent|' '
name|'if'
string|"'extra.'"
name|'in'
name|'column'
op|':'
newline|'\n'
indent|' '
name|'query_prefix'
op|'='
name|'query_prefix'
op|'.'
name|'options'
op|'('
name|'undefer'
op|'('
name|'column'
op|')'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'query_prefix'
op|'='
name|'query_prefix'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
name|'column'
op|')'
op|')'
newline|'\n'
nl|'\n'
comment|'# Note: order_by is done in the sqlalchemy.utils.py paginate_query(),'
nl|'\n'
comment|'# no need to do it here as well'
nl|'\n'
nl|'\n'
comment|"# Make a copy of the filters dictionary to use going forward, as we'll"
nl|'\n'
comment|"# be modifying it and we shouldn't affect the caller's use of it."
nl|'\n'
dedent|''
dedent|''
name|'filters'
op|'='
name|'filters'
op|'.'
name|'copy'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
string|"'changes-since'"
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'changes_since'
op|'='
name|'timeutils'
op|'.'
name|'normalize_time'
op|'('
name|'filters'
op|'['
string|"'changes-since'"
op|']'
op|')'
newline|'\n'
name|'query_prefix'
op|'='
name|'query_prefix'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'updated_at'
op|'>='
name|'changes_since'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
string|"'deleted'"
name|'in'
name|'filters'
op|':'
newline|'\n'
comment|'# Instances can be soft or hard deleted and the query needs to'
nl|'\n'
comment|'# include or exclude both'
nl|'\n'
indent|' '
name|'deleted'
op|'='
name|'filters'
op|'.'
name|'pop'
op|'('
string|"'deleted'"
op|')'
newline|'\n'
name|'if'
name|'deleted'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'filters'
op|'.'
name|'pop'
op|'('
string|"'soft_deleted'"
op|','
name|'True'
op|')'
op|':'
newline|'\n'
indent|' '
name|'delete'
op|'='
name|'or_'
op|'('
nl|'\n'
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'deleted'
op|'=='
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'id'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'vm_state'
op|'=='
name|'vm_states'
op|'.'
name|'SOFT_DELETED'
nl|'\n'
op|')'
newline|'\n'
name|'query_prefix'
op|'='
name|'query_prefix'
op|'.'
name|'filter'
op|'('
name|'delete'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'query_prefix'
op|'='
name|'query_prefix'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'deleted'
op|'=='
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'id'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'query_prefix'
op|'='
name|'query_prefix'
op|'.'
name|'filter_by'
op|'('
name|'deleted'
op|'='
number|'0'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'filters'
op|'.'
name|'pop'
op|'('
string|"'soft_deleted'"
op|','
name|'False'
op|')'
op|':'
newline|'\n'
comment|'# It would be better to have vm_state not be nullable'
nl|'\n'
comment|'# but until then we test it explicitly as a workaround.'
nl|'\n'
indent|' '
name|'not_soft_deleted'
op|'='
name|'or_'
op|'('
nl|'\n'
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'vm_state'
op|'!='
name|'vm_states'
op|'.'
name|'SOFT_DELETED'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'vm_state'
op|'=='
name|'null'
op|'('
op|')'
nl|'\n'
op|')'
newline|'\n'
name|'query_prefix'
op|'='
name|'query_prefix'
op|'.'
name|'filter'
op|'('
name|'not_soft_deleted'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
dedent|''
name|'if'
string|"'cleaned'"
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'filters'
op|'.'
name|'pop'
op|'('
string|"'cleaned'"
op|')'
op|':'
newline|'\n'
indent|' '
name|'query_prefix'
op|'='
name|'query_prefix'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'cleaned'
op|'=='
number|'1'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'query_prefix'
op|'='
name|'query_prefix'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'cleaned'
op|'=='
number|'0'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'if'
string|"'tags'"
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'tags'
op|'='
name|'filters'
op|'.'
name|'pop'
op|'('
string|"'tags'"
op|')'
newline|'\n'
comment|"# We build a JOIN ladder expression for each tag, JOIN'ing"
nl|'\n'
comment|'# the first tag to the instances table, and each subsequent'
nl|'\n'
comment|"# tag to the last JOIN'd tags table"
nl|'\n'
name|'first_tag'
op|'='
name|'tags'
op|'.'
name|'pop'
op|'('
number|'0'
op|')'
newline|'\n'
name|'query_prefix'
op|'='
name|'query_prefix'
op|'.'
name|'join'
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'tags'
op|')'
newline|'\n'
name|'query_prefix'
op|'='
name|'query_prefix'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Tag'
op|'.'
name|'tag'
op|'=='
name|'first_tag'
op|')'
newline|'\n'
nl|'\n'
name|'for'
name|'tag'
name|'in'
name|'tags'
op|':'
newline|'\n'
indent|' '
name|'tag_alias'
op|'='
name|'aliased'
op|'('
name|'models'
op|'.'
name|'Tag'
op|')'
newline|'\n'
name|'query_prefix'
op|'='
name|'query_prefix'
op|'.'
name|'join'
op|'('
name|'tag_alias'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'tags'
op|')'
newline|'\n'
name|'query_prefix'
op|'='
name|'query_prefix'
op|'.'
name|'filter'
op|'('
name|'tag_alias'
op|'.'
name|'tag'
op|'=='
name|'tag'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'if'
string|"'tags-any'"
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'tags'
op|'='
name|'filters'
op|'.'
name|'pop'
op|'('
string|"'tags-any'"
op|')'
newline|'\n'
name|'tag_alias'
op|'='
name|'aliased'
op|'('
name|'models'
op|'.'
name|'Tag'
op|')'
newline|'\n'
name|'query_prefix'
op|'='
name|'query_prefix'
op|'.'
name|'join'
op|'('
name|'tag_alias'
op|','
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'tags'
op|')'
newline|'\n'
name|'query_prefix'
op|'='
name|'query_prefix'
op|'.'
name|'filter'
op|'('
name|'tag_alias'
op|'.'
name|'tag'
op|'.'
name|'in_'
op|'('
name|'tags'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
string|"'not-tags'"
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'tags'
op|'='
name|'filters'
op|'.'
name|'pop'
op|'('
string|"'not-tags'"
op|')'
newline|'\n'
name|'first_tag'
op|'='
name|'tags'
op|'.'
name|'pop'
op|'('
number|'0'
op|')'
newline|'\n'
name|'subq'
op|'='
name|'query_prefix'
op|'.'
name|'session'
op|'.'
name|'query'
op|'('
name|'models'
op|'.'
name|'Tag'
op|'.'
name|'resource_id'
op|')'
newline|'\n'
name|'subq'
op|'='
name|'subq'
op|'.'
name|'join'
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'tags'
op|')'
newline|'\n'
name|'subq'
op|'='
name|'subq'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Tag'
op|'.'
name|'tag'
op|'=='
name|'first_tag'
op|')'
newline|'\n'
nl|'\n'
name|'for'
name|'tag'
name|'in'
name|'tags'
op|':'
newline|'\n'
indent|' '
name|'tag_alias'
op|'='
name|'aliased'
op|'('
name|'models'
op|'.'
name|'Tag'
op|')'
newline|'\n'
name|'subq'
op|'='
name|'subq'
op|'.'
name|'join'
op|'('
name|'tag_alias'
op|','
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'tags'
op|')'
newline|'\n'
name|'subq'
op|'='
name|'subq'
op|'.'
name|'filter'
op|'('
name|'tag_alias'
op|'.'
name|'tag'
op|'=='
name|'tag'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'query_prefix'
op|'='
name|'query_prefix'
op|'.'
name|'filter'
op|'('
op|'~'
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'uuid'
op|'.'
name|'in_'
op|'('
name|'subq'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
string|"'not-tags-any'"
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'tags'
op|'='
name|'filters'
op|'.'
name|'pop'
op|'('
string|"'not-tags-any'"
op|')'
newline|'\n'
name|'query_prefix'
op|'='
name|'query_prefix'
op|'.'
name|'filter'
op|'('
op|'~'
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'tags'
op|'.'
name|'any'
op|'('
nl|'\n'
name|'models'
op|'.'
name|'Tag'
op|'.'
name|'tag'
op|'.'
name|'in_'
op|'('
name|'tags'
op|')'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'not'
name|'context'
op|'.'
name|'is_admin'
op|':'
newline|'\n'
comment|"# If we're not an admin context, add the appropriate filter."
nl|'\n'
indent|' '
name|'if'
name|'context'
op|'.'
name|'project_id'
op|':'
newline|'\n'
indent|' '
name|'filters'
op|'['
string|"'project_id'"
op|']'
op|'='
name|'context'
op|'.'
name|'project_id'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'filters'
op|'['
string|"'user_id'"
op|']'
op|'='
name|'context'
op|'.'
name|'user_id'
newline|'\n'
nl|'\n'
comment|'# Filters for exact matches that we can do along with the SQL query...'
nl|'\n'
comment|"# For other filters that don't match this, we will do regexp matching"
nl|'\n'
dedent|''
dedent|''
name|'exact_match_filter_names'
op|'='
op|'['
string|"'project_id'"
op|','
string|"'user_id'"
op|','
string|"'image_ref'"
op|','
nl|'\n'
string|"'vm_state'"
op|','
string|"'instance_type_id'"
op|','
string|"'uuid'"
op|','
nl|'\n'
string|"'metadata'"
op|','
string|"'host'"
op|','
string|"'task_state'"
op|','
nl|'\n'
string|"'system_metadata'"
op|']'
newline|'\n'
nl|'\n'
comment|'# Filter the query'
nl|'\n'
name|'query_prefix'
op|'='
name|'_exact_instance_filter'
op|'('
name|'query_prefix'
op|','
nl|'\n'
name|'filters'
op|','
name|'exact_match_filter_names'
op|')'
newline|'\n'
name|'if'
name|'query_prefix'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'return'
op|'['
op|']'
newline|'\n'
dedent|''
name|'query_prefix'
op|'='
name|'_regex_instance_filter'
op|'('
name|'query_prefix'
op|','
name|'filters'
op|')'
newline|'\n'
name|'query_prefix'
op|'='
name|'_tag_instance_filter'
op|'('
name|'context'
op|','
name|'query_prefix'
op|','
name|'filters'
op|')'
newline|'\n'
nl|'\n'
comment|'# paginate query'
nl|'\n'
name|'if'
name|'marker'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'try'
op|':'
newline|'\n'
indent|' '
name|'marker'
op|'='
name|'_instance_get_by_uuid'
op|'('
nl|'\n'
name|'context'
op|'.'
name|'elevated'
op|'('
name|'read_deleted'
op|'='
string|"'yes'"
op|')'
op|','
name|'marker'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'exception'
op|'.'
name|'InstanceNotFound'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'MarkerNotFound'
op|'('
name|'marker'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'try'
op|':'
newline|'\n'
indent|' '
name|'query_prefix'
op|'='
name|'sqlalchemyutils'
op|'.'
name|'paginate_query'
op|'('
name|'query_prefix'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Instance'
op|','
name|'limit'
op|','
nl|'\n'
name|'sort_keys'
op|','
nl|'\n'
name|'marker'
op|'='
name|'marker'
op|','
nl|'\n'
name|'sort_dirs'
op|'='
name|'sort_dirs'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'InvalidSortKey'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InvalidSortKey'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'_instances_fill_metadata'
op|'('
name|'context'
op|','
name|'query_prefix'
op|'.'
name|'all'
op|'('
op|')'
op|','
name|'manual_joins'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
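The `'deleted'`/`'soft_deleted'` branch above can be restated as a plain-Python predicate over instance dicts. This is a sketch of the clause logic only, not the actual SQLAlchemy filters; `SOFT_DELETED` stands in for Nova's `vm_states.SOFT_DELETED`, and the defaults mirror the two `filters.pop('soft_deleted', …)` calls above.

```python
SOFT_DELETED = 'soft-delete'  # stand-in for nova's vm_states.SOFT_DELETED

def matches_deleted_filter(inst, deleted, soft_deleted):
    """Mirror the SQL clauses built for the 'deleted' filter."""
    if deleted:
        # Hard-deleted rows (deleted == id) always match; soft-deleted
        # rows match unless excluded (soft_deleted defaults to True here).
        return (inst['deleted'] == inst['id'] or
                (soft_deleted and inst['vm_state'] == SOFT_DELETED))
    # Only live rows (deleted == 0); soft-deleted rows are excluded
    # unless requested (soft_deleted defaults to False in this branch).
    return (inst['deleted'] == 0 and
            (soft_deleted or inst['vm_state'] != SOFT_DELETED))
```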
DECL|function|_tag_instance_filter
dedent|''
name|'def'
name|'_tag_instance_filter'
op|'('
name|'context'
op|','
name|'query'
op|','
name|'filters'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Applies tag filtering to an Instance query.\n\n Returns the updated query. This method alters filters to remove\n keys that are tags. It filters resources by tags and assumes\n that the caller will take care of access control.\n\n :param context: request context object\n :param query: query to apply filters to\n :param filters: dictionary of filters\n """'
newline|'\n'
name|'if'
name|'filters'
op|'.'
name|'get'
op|'('
string|"'filter'"
op|')'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'query'
newline|'\n'
nl|'\n'
dedent|''
name|'model'
op|'='
name|'models'
op|'.'
name|'Instance'
newline|'\n'
name|'model_metadata'
op|'='
name|'models'
op|'.'
name|'InstanceMetadata'
newline|'\n'
name|'model_uuid'
op|'='
name|'model_metadata'
op|'.'
name|'instance_uuid'
newline|'\n'
nl|'\n'
name|'or_query'
op|'='
name|'None'
newline|'\n'
nl|'\n'
DECL|function|_to_list
name|'def'
name|'_to_list'
op|'('
name|'val'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'isinstance'
op|'('
name|'val'
op|','
name|'dict'
op|')'
op|':'
newline|'\n'
indent|' '
name|'val'
op|'='
name|'val'
op|'.'
name|'values'
op|'('
op|')'
newline|'\n'
dedent|''
name|'if'
name|'not'
name|'isinstance'
op|'('
name|'val'
op|','
op|'('
name|'tuple'
op|','
name|'list'
op|','
name|'set'
op|')'
op|')'
op|':'
newline|'\n'
indent|' '
name|'val'
op|'='
op|'('
name|'val'
op|','
op|')'
newline|'\n'
dedent|''
name|'return'
name|'val'
newline|'\n'
nl|'\n'
dedent|''
name|'for'
name|'filter_block'
name|'in'
name|'filters'
op|'['
string|"'filter'"
op|']'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'not'
name|'isinstance'
op|'('
name|'filter_block'
op|','
name|'dict'
op|')'
op|':'
newline|'\n'
indent|' '
name|'continue'
newline|'\n'
nl|'\n'
dedent|''
name|'filter_name'
op|'='
name|'filter_block'
op|'.'
name|'get'
op|'('
string|"'name'"
op|')'
newline|'\n'
name|'if'
name|'filter_name'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'continue'
newline|'\n'
nl|'\n'
dedent|''
name|'tag_name'
op|'='
name|'filter_name'
op|'['
number|'4'
op|':'
op|']'
newline|'\n'
name|'tag_val'
op|'='
name|'_to_list'
op|'('
name|'filter_block'
op|'.'
name|'get'
op|'('
string|"'value'"
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'filter_name'
op|'.'
name|'startswith'
op|'('
string|"'tag-'"
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'tag_name'
name|'not'
name|'in'
op|'['
string|"'key'"
op|','
string|"'value'"
op|']'
op|':'
newline|'\n'
indent|' '
name|'msg'
op|'='
name|'_'
op|'('
string|'"Invalid field name: %s"'
op|')'
op|'%'
name|'tag_name'
newline|'\n'
name|'raise'
name|'exception'
op|'.'
name|'InvalidParameterValue'
op|'('
name|'err'
op|'='
name|'msg'
op|')'
newline|'\n'
dedent|''
name|'subq'
op|'='
name|'getattr'
op|'('
name|'model_metadata'
op|','
name|'tag_name'
op|')'
op|'.'
name|'in_'
op|'('
name|'tag_val'
op|')'
newline|'\n'
name|'or_query'
op|'='
name|'subq'
name|'if'
name|'or_query'
name|'is'
name|'None'
name|'else'
name|'or_'
op|'('
name|'or_query'
op|','
name|'subq'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'elif'
name|'filter_name'
op|'.'
name|'startswith'
op|'('
string|"'tag:'"
op|')'
op|':'
newline|'\n'
indent|' '
name|'subq'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'model_metadata'
op|','
op|'('
name|'model_uuid'
op|','
op|')'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'key'
op|'='
name|'tag_name'
op|')'
op|'.'
name|'filter'
op|'('
name|'model_metadata'
op|'.'
name|'value'
op|'.'
name|'in_'
op|'('
name|'tag_val'
op|')'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'model'
op|'.'
name|'uuid'
op|'.'
name|'in_'
op|'('
name|'subq'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'if'
name|'or_query'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'subq'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'model_metadata'
op|','
op|'('
name|'model_uuid'
op|','
op|')'
op|')'
op|'.'
name|'filter'
op|'('
name|'or_query'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'model'
op|'.'
name|'uuid'
op|'.'
name|'in_'
op|'('
name|'subq'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'query'
newline|'\n'
nl|'\n'
nl|'\n'
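The nested `_to_list` helper above normalizes a scalar, dict, or sequence filter value into an iterable of candidate values. A standalone copy follows; note the original ran under Python 2, where `dict.values()` already returned a list, so the listify step is made explicit here for Python 3.

```python
def to_list(val):
    # Dicts contribute their values (listified explicitly; Python 2
    # returned a list here, Python 3 returns a view).
    if isinstance(val, dict):
        val = list(val.values())
    # Scalars are wrapped in a 1-tuple; sequences pass through.
    if not isinstance(val, (tuple, list, set)):
        val = (val,)
    return val
```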
DECL|function|_get_regexp_op_for_connection
dedent|''
name|'def'
name|'_get_regexp_op_for_connection'
op|'('
name|'db_connection'
op|')'
op|':'
newline|'\n'
indent|' '
name|'db_string'
op|'='
name|'db_connection'
op|'.'
name|'split'
op|'('
string|"':'"
op|')'
op|'['
number|'0'
op|']'
op|'.'
name|'split'
op|'('
string|"'+'"
op|')'
op|'['
number|'0'
op|']'
newline|'\n'
name|'regexp_op_map'
op|'='
op|'{'
nl|'\n'
string|"'postgresql'"
op|':'
string|"'~'"
op|','
nl|'\n'
string|"'mysql'"
op|':'
string|"'REGEXP'"
op|','
nl|'\n'
string|"'sqlite'"
op|':'
string|"'REGEXP'"
nl|'\n'
op|'}'
newline|'\n'
name|'return'
name|'regexp_op_map'
op|'.'
name|'get'
op|'('
name|'db_string'
op|','
string|"'LIKE'"
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
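A standalone copy of the dialect lookup above: the operator is derived from the scheme of the SQLAlchemy connection URL (so `'mysql+pymysql://…'` yields `'mysql'`), with `'LIKE'` as the fallback for unknown backends. The function name is mine.

```python
def get_regexp_op(db_connection):
    # 'mysql+pymysql://u@h/db' -> 'mysql+pymysql' -> 'mysql'
    db_string = db_connection.split(':')[0].split('+')[0]
    regexp_op_map = {
        'postgresql': '~',
        'mysql': 'REGEXP',
        'sqlite': 'REGEXP',
    }
    return regexp_op_map.get(db_string, 'LIKE')
```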
DECL|function|_regex_instance_filter
dedent|''
name|'def'
name|'_regex_instance_filter'
op|'('
name|'query'
op|','
name|'filters'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Applies regular expression filtering to an Instance query.\n\n Returns the updated query.\n\n :param query: query to apply filters to\n :param filters: dictionary of filters with regex values\n """'
newline|'\n'
nl|'\n'
name|'model'
op|'='
name|'models'
op|'.'
name|'Instance'
newline|'\n'
name|'db_regexp_op'
op|'='
name|'_get_regexp_op_for_connection'
op|'('
name|'CONF'
op|'.'
name|'database'
op|'.'
name|'connection'
op|')'
newline|'\n'
name|'for'
name|'filter_name'
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'try'
op|':'
newline|'\n'
indent|' '
name|'column_attr'
op|'='
name|'getattr'
op|'('
name|'model'
op|','
name|'filter_name'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'AttributeError'
op|':'
newline|'\n'
indent|' '
name|'continue'
newline|'\n'
dedent|''
name|'if'
string|"'property'"
op|'=='
name|'type'
op|'('
name|'column_attr'
op|')'
op|'.'
name|'__name__'
op|':'
newline|'\n'
indent|' '
name|'continue'
newline|'\n'
dedent|''
name|'filter_val'
op|'='
name|'filters'
op|'['
name|'filter_name'
op|']'
newline|'\n'
comment|'# Sometimes the REGEX filter value is not a string'
nl|'\n'
name|'if'
name|'not'
name|'isinstance'
op|'('
name|'filter_val'
op|','
name|'six'
op|'.'
name|'string_types'
op|')'
op|':'
newline|'\n'
indent|' '
name|'filter_val'
op|'='
name|'str'
op|'('
name|'filter_val'
op|')'
newline|'\n'
dedent|''
name|'if'
name|'db_regexp_op'
op|'=='
string|"'LIKE'"
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'column_attr'
op|'.'
name|'op'
op|'('
name|'db_regexp_op'
op|')'
op|'('
nl|'\n'
string|"u'%'"
op|'+'
name|'filter_val'
op|'+'
string|"u'%'"
op|')'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'column_attr'
op|'.'
name|'op'
op|'('
name|'db_regexp_op'
op|')'
op|'('
nl|'\n'
name|'filter_val'
op|')'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'return'
name|'query'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_exact_instance_filter
dedent|''
name|'def'
name|'_exact_instance_filter'
op|'('
name|'query'
op|','
name|'filters'
op|','
name|'legal_keys'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Applies exact match filtering to an Instance query.\n\n Returns the updated query. Modifies filters argument to remove\n filters consumed.\n\n :param query: query to apply filters to\n :param filters: dictionary of filters; values that are lists,\n tuples, sets, or frozensets cause an \'IN\' test to\n be performed, while exact matching (\'==\' operator)\n is used for other values\n :param legal_keys: list of keys to apply exact filtering to\n """'
newline|'\n'
nl|'\n'
name|'filter_dict'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'model'
op|'='
name|'models'
op|'.'
name|'Instance'
newline|'\n'
nl|'\n'
comment|'# Walk through all the keys'
nl|'\n'
name|'for'
name|'key'
name|'in'
name|'legal_keys'
op|':'
newline|'\n'
comment|"# Skip ones we're not filtering on"
nl|'\n'
indent|' '
name|'if'
name|'key'
name|'not'
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'continue'
newline|'\n'
nl|'\n'
comment|'# OK, filtering on this key; what value do we search for?'
nl|'\n'
dedent|''
name|'value'
op|'='
name|'filters'
op|'.'
name|'pop'
op|'('
name|'key'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'key'
name|'in'
op|'('
string|"'metadata'"
op|','
string|"'system_metadata'"
op|')'
op|':'
newline|'\n'
indent|' '
name|'column_attr'
op|'='
name|'getattr'
op|'('
name|'model'
op|','
name|'key'
op|')'
newline|'\n'
name|'if'
name|'isinstance'
op|'('
name|'value'
op|','
name|'list'
op|')'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'item'
name|'in'
name|'value'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'k'
op|','
name|'v'
name|'in'
name|'item'
op|'.'
name|'items'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'column_attr'
op|'.'
name|'any'
op|'('
name|'key'
op|'='
name|'k'
op|')'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'column_attr'
op|'.'
name|'any'
op|'('
name|'value'
op|'='
name|'v'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'k'
op|','
name|'v'
name|'in'
name|'value'
op|'.'
name|'items'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'column_attr'
op|'.'
name|'any'
op|'('
name|'key'
op|'='
name|'k'
op|')'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'column_attr'
op|'.'
name|'any'
op|'('
name|'value'
op|'='
name|'v'
op|')'
op|')'
newline|'\n'
dedent|''
dedent|''
dedent|''
name|'elif'
name|'isinstance'
op|'('
name|'value'
op|','
op|'('
name|'list'
op|','
name|'tuple'
op|','
name|'set'
op|','
name|'frozenset'
op|')'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'not'
name|'value'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'None'
comment|'# empty IN-predicate; short circuit'
newline|'\n'
comment|'# Looking for values in a list; apply to query directly'
nl|'\n'
dedent|''
name|'column_attr'
op|'='
name|'getattr'
op|'('
name|'model'
op|','
name|'key'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'column_attr'
op|'.'
name|'in_'
op|'('
name|'value'
op|')'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
comment|'# OK, simple exact match; save for later'
nl|'\n'
indent|' '
name|'filter_dict'
op|'['
name|'key'
op|']'
op|'='
name|'value'
newline|'\n'
nl|'\n'
comment|'# Apply simple exact matches'
nl|'\n'
dedent|''
dedent|''
name|'if'
name|'filter_dict'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
op|'*'
op|'['
name|'getattr'
op|'('
name|'models'
op|'.'
name|'Instance'
op|','
name|'k'
op|')'
op|'=='
name|'v'
nl|'\n'
name|'for'
name|'k'
op|','
name|'v'
name|'in'
name|'filter_dict'
op|'.'
name|'items'
op|'('
op|')'
op|']'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'query'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|process_sort_params
dedent|''
name|'def'
name|'process_sort_params'
op|'('
name|'sort_keys'
op|','
name|'sort_dirs'
op|','
nl|'\n'
name|'default_keys'
op|'='
op|'['
string|"'created_at'"
op|','
string|"'id'"
op|']'
op|','
nl|'\n'
name|'default_dir'
op|'='
string|"'asc'"
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Process the sort parameters to include default keys.\n\n Creates a list of sort keys and a list of sort directions. Adds the default\n keys to the end of the list if they are not already included.\n\n When adding the default keys to the sort keys list, the associated\n direction is:\n 1) The first element in the \'sort_dirs\' list (if specified), else\n 2) \'default_dir\' value (Note that \'asc\' is the default value since this is\n the default in sqlalchemy.utils.paginate_query)\n\n :param sort_keys: List of sort keys to include in the processed list\n :param sort_dirs: List of sort directions to include in the processed list\n :param default_keys: List of sort keys that need to be included in the\n processed list, they are added at the end of the list\n if not already specified.\n :param default_dir: Sort direction associated with each of the default\n keys that are not supplied, used when they are added\n to the processed list\n :returns: list of sort keys, list of sort directions\n :raise exception.InvalidInput: If more sort directions than sort keys\n are specified or if an invalid sort\n direction is specified\n """'
newline|'\n'
comment|'# Determine direction to use when adding default keys'
nl|'\n'
name|'if'
name|'sort_dirs'
op|':'
newline|'\n'
indent|' '
name|'default_dir_value'
op|'='
name|'sort_dirs'
op|'['
number|'0'
op|']'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'default_dir_value'
op|'='
name|'default_dir'
newline|'\n'
nl|'\n'
comment|'# Create list of keys (do not modify the input list)'
nl|'\n'
dedent|''
name|'if'
name|'sort_keys'
op|':'
newline|'\n'
indent|' '
name|'result_keys'
op|'='
name|'list'
op|'('
name|'sort_keys'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'result_keys'
op|'='
op|'['
op|']'
newline|'\n'
nl|'\n'
comment|'# If a list of directions is not provided, use the default sort direction'
nl|'\n'
comment|'# for all provided keys'
nl|'\n'
dedent|''
name|'if'
name|'sort_dirs'
op|':'
newline|'\n'
indent|' '
name|'result_dirs'
op|'='
op|'['
op|']'
newline|'\n'
comment|'# Verify sort direction'
nl|'\n'
name|'for'
name|'sort_dir'
name|'in'
name|'sort_dirs'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'sort_dir'
name|'not'
name|'in'
op|'('
string|"'asc'"
op|','
string|"'desc'"
op|')'
op|':'
newline|'\n'
indent|' '
name|'msg'
op|'='
name|'_'
op|'('
string|'"Unknown sort direction, must be \'desc\' or \'asc\'"'
op|')'
newline|'\n'
name|'raise'
name|'exception'
op|'.'
name|'InvalidInput'
op|'('
name|'reason'
op|'='
name|'msg'
op|')'
newline|'\n'
dedent|''
name|'result_dirs'
op|'.'
name|'append'
op|'('
name|'sort_dir'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'result_dirs'
op|'='
op|'['
name|'default_dir_value'
name|'for'
name|'_sort_key'
name|'in'
name|'result_keys'
op|']'
newline|'\n'
nl|'\n'
comment|'# Ensure that the key and direction length match'
nl|'\n'
dedent|''
name|'while'
name|'len'
op|'('
name|'result_dirs'
op|')'
op|'<'
name|'len'
op|'('
name|'result_keys'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result_dirs'
op|'.'
name|'append'
op|'('
name|'default_dir_value'
op|')'
newline|'\n'
comment|'# Unless more directions are specified, which is an error'
nl|'\n'
dedent|''
name|'if'
name|'len'
op|'('
name|'result_dirs'
op|')'
op|'>'
name|'len'
op|'('
name|'result_keys'
op|')'
op|':'
newline|'\n'
indent|' '
name|'msg'
op|'='
name|'_'
op|'('
string|'"Sort direction size exceeds sort key size"'
op|')'
newline|'\n'
name|'raise'
name|'exception'
op|'.'
name|'InvalidInput'
op|'('
name|'reason'
op|'='
name|'msg'
op|')'
newline|'\n'
nl|'\n'
comment|'# Ensure defaults are included'
nl|'\n'
dedent|''
name|'for'
name|'key'
name|'in'
name|'default_keys'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'key'
name|'not'
name|'in'
name|'result_keys'
op|':'
newline|'\n'
indent|' '
name|'result_keys'
op|'.'
name|'append'
op|'('
name|'key'
op|')'
newline|'\n'
name|'result_dirs'
op|'.'
name|'append'
op|'('
name|'default_dir_value'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'return'
name|'result_keys'
op|','
name|'result_dirs'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader_allow_async'
newline|'\n'
DECL|function|instance_get_active_by_window_joined
name|'def'
name|'instance_get_active_by_window_joined'
op|'('
name|'context'
op|','
name|'begin'
op|','
name|'end'
op|'='
name|'None'
op|','
nl|'\n'
name|'project_id'
op|'='
name|'None'
op|','
name|'host'
op|'='
name|'None'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Return instances and joins that were active during window."""'
newline|'\n'
name|'query'
op|'='
name|'context'
op|'.'
name|'session'
op|'.'
name|'query'
op|'('
name|'models'
op|'.'
name|'Instance'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'columns_to_join'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'columns_to_join_new'
op|'='
op|'['
string|"'info_cache'"
op|','
string|"'security_groups'"
op|']'
newline|'\n'
name|'manual_joins'
op|'='
op|'['
string|"'metadata'"
op|','
string|"'system_metadata'"
op|']'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'manual_joins'
op|','
name|'columns_to_join_new'
op|'='
op|'('
nl|'\n'
name|'_manual_join_columns'
op|'('
name|'columns_to_join'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'for'
name|'column'
name|'in'
name|'columns_to_join_new'
op|':'
newline|'\n'
indent|' '
name|'if'
string|"'extra.'"
name|'in'
name|'column'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'undefer'
op|'('
name|'column'
op|')'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
name|'column'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'or_'
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'terminated_at'
op|'=='
name|'null'
op|'('
op|')'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'terminated_at'
op|'>'
name|'begin'
op|')'
op|')'
newline|'\n'
name|'if'
name|'end'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'launched_at'
op|'<'
name|'end'
op|')'
newline|'\n'
dedent|''
name|'if'
name|'project_id'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
newline|'\n'
dedent|''
name|'if'
name|'host'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'host'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'_instances_fill_metadata'
op|'('
name|'context'
op|','
name|'query'
op|'.'
name|'all'
op|'('
op|')'
op|','
name|'manual_joins'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_instance_get_all_query
dedent|''
name|'def'
name|'_instance_get_all_query'
op|'('
name|'context'
op|','
name|'project_only'
op|'='
name|'False'
op|','
name|'joins'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'joins'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'joins'
op|'='
op|'['
string|"'info_cache'"
op|','
string|"'security_groups'"
op|']'
newline|'\n'
nl|'\n'
dedent|''
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Instance'
op|','
nl|'\n'
name|'project_only'
op|'='
name|'project_only'
op|')'
newline|'\n'
name|'for'
name|'column'
name|'in'
name|'joins'
op|':'
newline|'\n'
indent|' '
name|'if'
string|"'extra.'"
name|'in'
name|'column'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'undefer'
op|'('
name|'column'
op|')'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
name|'column'
op|')'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'return'
name|'query'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader_allow_async'
newline|'\n'
DECL|function|instance_get_all_by_host
name|'def'
name|'instance_get_all_by_host'
op|'('
name|'context'
op|','
name|'host'
op|','
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_instances_fill_metadata'
op|'('
name|'context'
op|','
nl|'\n'
name|'_instance_get_all_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'host'
op|')'
op|'.'
name|'all'
op|'('
op|')'
op|','
nl|'\n'
name|'manual_joins'
op|'='
name|'columns_to_join'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_instance_get_all_uuids_by_host
dedent|''
name|'def'
name|'_instance_get_all_uuids_by_host'
op|'('
name|'context'
op|','
name|'host'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Return a list of the instance uuids on a given host.\n\n Returns a list of UUIDs, not Instance model objects.\n """'
newline|'\n'
name|'uuids'
op|'='
op|'['
op|']'
newline|'\n'
name|'for'
name|'tuple'
name|'in'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Instance'
op|','
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'uuid'
op|','
op|')'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'host'
op|')'
op|'.'
name|'all'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'uuids'
op|'.'
name|'append'
op|'('
name|'tuple'
op|'['
number|'0'
op|']'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'uuids'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|instance_get_all_by_host_and_node
name|'def'
name|'instance_get_all_by_host_and_node'
op|'('
name|'context'
op|','
name|'host'
op|','
name|'node'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'columns_to_join'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'manual_joins'
op|'='
op|'['
op|']'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'candidates'
op|'='
op|'['
string|"'system_metadata'"
op|','
string|"'metadata'"
op|']'
newline|'\n'
name|'manual_joins'
op|'='
op|'['
name|'x'
name|'for'
name|'x'
name|'in'
name|'columns_to_join'
name|'if'
name|'x'
name|'in'
name|'candidates'
op|']'
newline|'\n'
name|'columns_to_join'
op|'='
name|'list'
op|'('
name|'set'
op|'('
name|'columns_to_join'
op|')'
op|'-'
name|'set'
op|'('
name|'candidates'
op|')'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'_instances_fill_metadata'
op|'('
name|'context'
op|','
nl|'\n'
name|'_instance_get_all_query'
op|'('
nl|'\n'
name|'context'
op|','
nl|'\n'
name|'joins'
op|'='
name|'columns_to_join'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'host'
op|')'
op|'.'
nl|'\n'
name|'filter_by'
op|'('
name|'node'
op|'='
name|'node'
op|')'
op|'.'
name|'all'
op|'('
op|')'
op|','
name|'manual_joins'
op|'='
name|'manual_joins'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|instance_get_all_by_host_and_not_type
name|'def'
name|'instance_get_all_by_host_and_not_type'
op|'('
name|'context'
op|','
name|'host'
op|','
name|'type_id'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_instances_fill_metadata'
op|'('
name|'context'
op|','
nl|'\n'
name|'_instance_get_all_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'host'
op|')'
op|'.'
nl|'\n'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'instance_type_id'
op|'!='
name|'type_id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|instance_get_all_by_grantee_security_groups
name|'def'
name|'instance_get_all_by_grantee_security_groups'
op|'('
name|'context'
op|','
name|'group_ids'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'not'
name|'group_ids'
op|':'
newline|'\n'
indent|' '
name|'return'
op|'['
op|']'
newline|'\n'
dedent|''
name|'return'
name|'_instances_fill_metadata'
op|'('
name|'context'
op|','
nl|'\n'
name|'_instance_get_all_query'
op|'('
name|'context'
op|')'
op|'.'
nl|'\n'
name|'join'
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'security_groups'
op|')'
op|'.'
nl|'\n'
name|'filter'
op|'('
name|'models'
op|'.'
name|'SecurityGroup'
op|'.'
name|'rules'
op|'.'
name|'any'
op|'('
nl|'\n'
name|'models'
op|'.'
name|'SecurityGroupIngressRule'
op|'.'
name|'group_id'
op|'.'
name|'in_'
op|'('
name|'group_ids'
op|')'
op|')'
op|')'
op|'.'
nl|'\n'
name|'all'
op|'('
op|')'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|instance_floating_address_get_all
name|'def'
name|'instance_floating_address_get_all'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'not'
name|'uuidutils'
op|'.'
name|'is_uuid_like'
op|'('
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InvalidUUID'
op|'('
name|'uuid'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'floating_ips'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'FloatingIp'
op|','
nl|'\n'
op|'('
name|'models'
op|'.'
name|'FloatingIp'
op|'.'
name|'address'
op|','
op|')'
op|')'
op|'.'
name|'join'
op|'('
name|'models'
op|'.'
name|'FloatingIp'
op|'.'
name|'fixed_ip'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
name|'return'
op|'['
name|'floating_ip'
op|'.'
name|'address'
name|'for'
name|'floating_ip'
name|'in'
name|'floating_ips'
op|']'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'# NOTE(hanlind): This method can be removed as conductor RPC API moves to v2.0.'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|instance_get_all_hung_in_rebooting
name|'def'
name|'instance_get_all_hung_in_rebooting'
op|'('
name|'context'
op|','
name|'reboot_window'
op|')'
op|':'
newline|'\n'
indent|' '
name|'reboot_window'
op|'='
op|'('
name|'timeutils'
op|'.'
name|'utcnow'
op|'('
op|')'
op|'-'
nl|'\n'
name|'datetime'
op|'.'
name|'timedelta'
op|'('
name|'seconds'
op|'='
name|'reboot_window'
op|')'
op|')'
newline|'\n'
nl|'\n'
comment|'# NOTE(danms): this is only used in the _poll_rebooting_instances()'
nl|'\n'
comment|'# call in compute/manager, so we can avoid the metadata lookups'
nl|'\n'
comment|'# explicitly'
nl|'\n'
name|'return'
name|'_instances_fill_metadata'
op|'('
name|'context'
op|','
nl|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Instance'
op|')'
op|'.'
nl|'\n'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Instance'
op|'.'
name|'updated_at'
op|'<='
name|'reboot_window'
op|')'
op|'.'
nl|'\n'
name|'filter_by'
op|'('
name|'task_state'
op|'='
name|'task_states'
op|'.'
name|'REBOOTING'
op|')'
op|'.'
name|'all'
op|'('
op|')'
op|','
nl|'\n'
name|'manual_joins'
op|'='
op|'['
op|']'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_retry_instance_update
dedent|''
name|'def'
name|'_retry_instance_update'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Wrap with oslo_db_api.wrap_db_retry, and also retry on\n UnknownInstanceUpdateConflict.\n """'
newline|'\n'
name|'exception_checker'
op|'='
name|'lambda'
name|'exc'
op|':'
name|'isinstance'
op|'('
name|'exc'
op|','
op|'('
name|'exception'
op|'.'
name|'UnknownInstanceUpdateConflict'
op|','
op|')'
op|')'
newline|'\n'
name|'return'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|','
nl|'\n'
name|'exception_checker'
op|'='
name|'exception_checker'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'_retry_instance_update'
op|'('
op|')'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|instance_update
name|'def'
name|'instance_update'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'values'
op|','
name|'expected'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_instance_update'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'values'
op|','
name|'expected'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'_retry_instance_update'
op|'('
op|')'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|instance_update_and_get_original
name|'def'
name|'instance_update_and_get_original'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'values'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'None'
op|','
name|'expected'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Set the given properties on an instance and update it. Return\n a shallow copy of the original instance reference, as well as the\n updated one.\n\n :param context: = request context object\n :param instance_uuid: = instance uuid\n :param values: = dict containing column values\n\n If "expected_task_state" exists in values, the update can only happen\n when the task state before update matches expected_task_state. Otherwise\n an UnexpectedTaskStateError is raised.\n\n :returns: a tuple of the form (old_instance_ref, new_instance_ref)\n\n Raises NotFound if instance does not exist.\n """'
newline|'\n'
name|'instance_ref'
op|'='
name|'_instance_get_by_uuid'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'columns_to_join'
op|')'
newline|'\n'
name|'return'
op|'('
name|'copy'
op|'.'
name|'copy'
op|'('
name|'instance_ref'
op|')'
op|','
name|'_instance_update'
op|'('
nl|'\n'
name|'context'
op|','
name|'instance_uuid'
op|','
name|'values'
op|','
name|'expected'
op|','
name|'original'
op|'='
name|'instance_ref'
op|')'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|"# NOTE(danms): This updates the instance's metadata list in-place and in"
nl|'\n'
comment|'# the database to avoid stale data and refresh issues. It assumes the'
nl|'\n'
comment|'# delete=True behavior of instance_metadata_update(...)'
nl|'\n'
DECL|function|_instance_metadata_update_in_place
dedent|''
name|'def'
name|'_instance_metadata_update_in_place'
op|'('
name|'context'
op|','
name|'instance'
op|','
name|'metadata_type'
op|','
name|'model'
op|','
nl|'\n'
name|'metadata'
op|')'
op|':'
newline|'\n'
indent|' '
name|'metadata'
op|'='
name|'dict'
op|'('
name|'metadata'
op|')'
newline|'\n'
name|'to_delete'
op|'='
op|'['
op|']'
newline|'\n'
name|'for'
name|'keyvalue'
name|'in'
name|'instance'
op|'['
name|'metadata_type'
op|']'
op|':'
newline|'\n'
indent|' '
name|'key'
op|'='
name|'keyvalue'
op|'['
string|"'key'"
op|']'
newline|'\n'
name|'if'
name|'key'
name|'in'
name|'metadata'
op|':'
newline|'\n'
indent|' '
name|'keyvalue'
op|'['
string|"'value'"
op|']'
op|'='
name|'metadata'
op|'.'
name|'pop'
op|'('
name|'key'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'to_delete'
op|'.'
name|'append'
op|'('
name|'keyvalue'
op|')'
newline|'\n'
nl|'\n'
comment|'# NOTE: we have to hard_delete here otherwise we will get more than one'
nl|'\n'
comment|'# system_metadata record when we read deleted for an instance;'
nl|'\n'
comment|"# regular metadata doesn't have the same problem because we don't"
nl|'\n'
comment|'# allow reading deleted regular metadata anywhere.'
nl|'\n'
dedent|''
dedent|''
name|'if'
name|'metadata_type'
op|'=='
string|"'system_metadata'"
op|':'
newline|'\n'
indent|' '
name|'for'
name|'condemned'
name|'in'
name|'to_delete'
op|':'
newline|'\n'
indent|' '
name|'context'
op|'.'
name|'session'
op|'.'
name|'delete'
op|'('
name|'condemned'
op|')'
newline|'\n'
name|'instance'
op|'['
name|'metadata_type'
op|']'
op|'.'
name|'remove'
op|'('
name|'condemned'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'condemned'
name|'in'
name|'to_delete'
op|':'
newline|'\n'
indent|' '
name|'condemned'
op|'.'
name|'soft_delete'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'for'
name|'key'
op|','
name|'value'
name|'in'
name|'metadata'
op|'.'
name|'items'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'newitem'
op|'='
name|'model'
op|'('
op|')'
newline|'\n'
name|'newitem'
op|'.'
name|'update'
op|'('
op|'{'
string|"'key'"
op|':'
name|'key'
op|','
string|"'value'"
op|':'
name|'value'
op|','
nl|'\n'
string|"'instance_uuid'"
op|':'
name|'instance'
op|'['
string|"'uuid'"
op|']'
op|'}'
op|')'
newline|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'add'
op|'('
name|'newitem'
op|')'
newline|'\n'
name|'instance'
op|'['
name|'metadata_type'
op|']'
op|'.'
name|'append'
op|'('
name|'newitem'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_instance_update
dedent|''
dedent|''
name|'def'
name|'_instance_update'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'values'
op|','
name|'expected'
op|','
name|'original'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'not'
name|'uuidutils'
op|'.'
name|'is_uuid_like'
op|'('
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InvalidUUID'
op|'('
name|'uuid'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'expected'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'expected'
op|'='
op|'{'
op|'}'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
comment|'# Coerce all single values to singleton lists'
nl|'\n'
indent|' '
name|'expected'
op|'='
op|'{'
name|'k'
op|':'
op|'['
name|'None'
op|']'
name|'if'
name|'v'
name|'is'
name|'None'
name|'else'
name|'sqlalchemyutils'
op|'.'
name|'to_list'
op|'('
name|'v'
op|')'
nl|'\n'
name|'for'
op|'('
name|'k'
op|','
name|'v'
op|')'
name|'in'
name|'six'
op|'.'
name|'iteritems'
op|'('
name|'expected'
op|')'
op|'}'
newline|'\n'
nl|'\n'
comment|"# Extract 'expected_' values from values dict, as these aren't actually"
nl|'\n'
comment|'# updates'
nl|'\n'
dedent|''
name|'for'
name|'field'
name|'in'
op|'('
string|"'task_state'"
op|','
string|"'vm_state'"
op|')'
op|':'
newline|'\n'
indent|' '
name|'expected_field'
op|'='
string|"'expected_%s'"
op|'%'
name|'field'
newline|'\n'
name|'if'
name|'expected_field'
name|'in'
name|'values'
op|':'
newline|'\n'
indent|' '
name|'value'
op|'='
name|'values'
op|'.'
name|'pop'
op|'('
name|'expected_field'
op|','
name|'None'
op|')'
newline|'\n'
comment|'# Coerce all single values to singleton lists'
nl|'\n'
name|'if'
name|'value'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'expected'
op|'['
name|'field'
op|']'
op|'='
op|'['
name|'None'
op|']'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'expected'
op|'['
name|'field'
op|']'
op|'='
name|'sqlalchemyutils'
op|'.'
name|'to_list'
op|'('
name|'value'
op|')'
newline|'\n'
nl|'\n'
comment|'# Values which need to be updated separately'
nl|'\n'
dedent|''
dedent|''
dedent|''
name|'metadata'
op|'='
name|'values'
op|'.'
name|'pop'
op|'('
string|"'metadata'"
op|','
name|'None'
op|')'
newline|'\n'
name|'system_metadata'
op|'='
name|'values'
op|'.'
name|'pop'
op|'('
string|"'system_metadata'"
op|','
name|'None'
op|')'
newline|'\n'
nl|'\n'
name|'_handle_objects_related_type_conversions'
op|'('
name|'values'
op|')'
newline|'\n'
nl|'\n'
comment|'# Hostname is potentially unique, but this is enforced in code rather'
nl|'\n'
comment|'# than the DB. The query below races, but the number of users of'
nl|'\n'
comment|'# osapi_compute_unique_server_name_scope is small, and a robust fix'
nl|'\n'
comment|'# will be complex. This is intentionally left as is for the moment.'
nl|'\n'
name|'if'
string|"'hostname'"
name|'in'
name|'values'
op|':'
newline|'\n'
indent|' '
name|'_validate_unique_server_name'
op|'('
name|'context'
op|','
name|'values'
op|'['
string|"'hostname'"
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'compare'
op|'='
name|'models'
op|'.'
name|'Instance'
op|'('
name|'uuid'
op|'='
name|'instance_uuid'
op|','
op|'**'
name|'expected'
op|')'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'instance_ref'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Instance'
op|','
nl|'\n'
name|'project_only'
op|'='
name|'True'
op|')'
op|'.'
name|'update_on_match'
op|'('
name|'compare'
op|','
string|"'uuid'"
op|','
name|'values'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'update_match'
op|'.'
name|'NoRowsMatched'
op|':'
newline|'\n'
comment|'# Update failed. Try to find why and raise a specific error.'
nl|'\n'
nl|'\n'
comment|'# We should get here only because our expected values were not current'
nl|'\n'
comment|'# when update_on_match executed. Having failed, we now have a hint that'
nl|'\n'
comment|'# the values are out of date and should check them.'
nl|'\n'
nl|'\n'
comment|'# This code is made more complex because we are using repeatable reads.'
nl|'\n'
comment|'# If we have previously read the original instance in the current'
nl|'\n'
comment|'# transaction, reading it again will return the same data, even though'
nl|'\n'
comment|'# the above update failed because it has changed: it is not possible to'
nl|'\n'
comment|'# determine what has changed in this transaction. In this case we raise'
nl|'\n'
comment|'# UnknownInstanceUpdateConflict, which will cause the operation to be'
nl|'\n'
comment|'# retried in a new transaction.'
nl|'\n'
nl|'\n'
comment|'# Because of the above, if we have previously read the instance in the'
nl|'\n'
comment|"# current transaction it will have been passed as 'original', and there"
nl|'\n'
comment|'# is no point refreshing it. If we have not previously read the'
nl|'\n'
comment|'# instance, we can fetch it here and we will get fresh data.'
nl|'\n'
indent|' '
name|'if'
name|'original'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'original'
op|'='
name|'_instance_get_by_uuid'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'conflicts_expected'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'conflicts_actual'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'for'
op|'('
name|'field'
op|','
name|'expected_values'
op|')'
name|'in'
name|'six'
op|'.'
name|'iteritems'
op|'('
name|'expected'
op|')'
op|':'
newline|'\n'
indent|' '
name|'actual'
op|'='
name|'original'
op|'['
name|'field'
op|']'
newline|'\n'
name|'if'
name|'actual'
name|'not'
name|'in'
name|'expected_values'
op|':'
newline|'\n'
indent|' '
name|'conflicts_expected'
op|'['
name|'field'
op|']'
op|'='
name|'expected_values'
newline|'\n'
name|'conflicts_actual'
op|'['
name|'field'
op|']'
op|'='
name|'actual'
newline|'\n'
nl|'\n'
comment|'# Exception properties'
nl|'\n'
dedent|''
dedent|''
name|'exc_props'
op|'='
op|'{'
nl|'\n'
string|"'instance_uuid'"
op|':'
name|'instance_uuid'
op|','
nl|'\n'
string|"'expected'"
op|':'
name|'conflicts_expected'
op|','
nl|'\n'
string|"'actual'"
op|':'
name|'conflicts_actual'
nl|'\n'
op|'}'
newline|'\n'
nl|'\n'
comment|'# There was a conflict, but something (probably the MySQL read view,'
nl|'\n'
comment|'# but possibly an exceptionally unlikely second race) is preventing us'
nl|'\n'
comment|"# from seeing what it is. When we go round again we'll get a fresh"
nl|'\n'
comment|'# transaction and a fresh read view.'
nl|'\n'
name|'if'
name|'len'
op|'('
name|'conflicts_actual'
op|')'
op|'=='
number|'0'
op|':'
newline|'\n'
            raise exception.UnknownInstanceUpdateConflict(**exc_props)

        # Task state gets special handling for convenience. We raise the
        # specific error UnexpectedDeletingTaskStateError or
        # UnexpectedTaskStateError as appropriate
        if 'task_state' in conflicts_actual:
            conflict_task_state = conflicts_actual['task_state']
            if conflict_task_state == task_states.DELETING:
                exc = exception.UnexpectedDeletingTaskStateError
            else:
                exc = exception.UnexpectedTaskStateError

        # Everything else is an InstanceUpdateConflict
        else:
            exc = exception.InstanceUpdateConflict

        raise exc(**exc_props)

    if metadata is not None:
        _instance_metadata_update_in_place(context, instance_ref,
                                           'metadata',
                                           models.InstanceMetadata,
                                           metadata)

    if system_metadata is not None:
        _instance_metadata_update_in_place(context, instance_ref,
                                           'system_metadata',
                                           models.InstanceSystemMetadata,
                                           system_metadata)

    return instance_ref


@pick_context_manager_writer
def instance_add_security_group(context, instance_uuid, security_group_id):
    """Associate the given security group with the given instance."""
    sec_group_ref = models.SecurityGroupInstanceAssociation()
    sec_group_ref.update({'instance_uuid': instance_uuid,
                          'security_group_id': security_group_id})
    sec_group_ref.save(context.session)


@require_context
@pick_context_manager_writer
def instance_remove_security_group(context, instance_uuid, security_group_id):
    """Disassociate the given security group from the given instance."""
    model_query(context, models.SecurityGroupInstanceAssociation).\
        filter_by(instance_uuid=instance_uuid).\
        filter_by(security_group_id=security_group_id).\
        soft_delete()


###################


@require_context
@pick_context_manager_reader
def instance_info_cache_get(context, instance_uuid):
    """Gets an instance info cache from the table.

    :param instance_uuid: = uuid of the info cache's instance
    """
    return model_query(context, models.InstanceInfoCache).\
        filter_by(instance_uuid=instance_uuid).\
        first()


@require_context
@oslo_db_api.wrap_db_retry(max_retries=5, retry_on_deadlock=True)
@pick_context_manager_writer
def instance_info_cache_update(context, instance_uuid, values):
    """Update an instance info cache record in the table.

    :param instance_uuid: = uuid of info cache's instance
    :param values: = dict containing column values to update
    """
    convert_objects_related_datetimes(values)

    info_cache = model_query(context, models.InstanceInfoCache).\
        filter_by(instance_uuid=instance_uuid).\
        first()
    needs_create = False
    if info_cache and info_cache['deleted']:
        raise exception.InstanceInfoCacheNotFound(
            instance_uuid=instance_uuid)
    elif not info_cache:
        # NOTE(tr3buchet): just in case someone blows away an instance's
        # cache entry, re-create it.
        values['instance_uuid'] = instance_uuid
        info_cache = models.InstanceInfoCache(**values)
        needs_create = True

    try:
        with main_context_manager.writer.savepoint.using(context):
            if needs_create:
                info_cache.save(context.session)
            else:
                info_cache.update(values)
    except db_exc.DBDuplicateEntry:
        # NOTE(sirp): Possible race if two greenthreads attempt to
        # recreate the instance cache entry at the same time. First one
        # wins.
        pass

    return info_cache


@require_context
@pick_context_manager_writer
def instance_info_cache_delete(context, instance_uuid):
    """Deletes an existing instance_info_cache record

    :param instance_uuid: = uuid of the instance tied to the cache record
    """
    model_query(context, models.InstanceInfoCache).\
        filter_by(instance_uuid=instance_uuid).\
        soft_delete()

###################


def _instance_extra_create(context, values):
    inst_extra_ref = models.InstanceExtra()
    inst_extra_ref.update(values)
    inst_extra_ref.save(context.session)
    return inst_extra_ref


@pick_context_manager_writer
def instance_extra_update_by_uuid(context, instance_uuid, values):
    rows_updated = model_query(context, models.InstanceExtra).\
        filter_by(instance_uuid=instance_uuid).\
        update(values)
    if not rows_updated:
        LOG.debug("Created instance_extra for %s", instance_uuid)
        create_values = copy.copy(values)
        create_values["instance_uuid"] = instance_uuid
        _instance_extra_create(context, create_values)
        rows_updated = 1
    return rows_updated


@pick_context_manager_reader
def instance_extra_get_by_instance_uuid(context, instance_uuid,
                                        columns=None):
    query = model_query(context, models.InstanceExtra).\
        filter_by(instance_uuid=instance_uuid)
    if columns is None:
        columns = ['numa_topology', 'pci_requests', 'flavor', 'vcpu_model',
                   'migration_context']
    for column in columns:
        query = query.options(undefer(column))
    instance_extra = query.first()
    return instance_extra

###################


@require_context
@main_context_manager.writer
def key_pair_create(context, values):
    try:
        key_pair_ref = models.KeyPair()
        key_pair_ref.update(values)
        key_pair_ref.save(context.session)
        return key_pair_ref
    except db_exc.DBDuplicateEntry:
        raise exception.KeyPairExists(key_name=values['name'])


@require_context
@main_context_manager.writer
def key_pair_destroy(context, user_id, name):
    result = model_query(context, models.KeyPair).\
        filter_by(user_id=user_id).\
        filter_by(name=name).\
        soft_delete()
    if not result:
        raise exception.KeypairNotFound(user_id=user_id, name=name)


@require_context
@main_context_manager.reader
def key_pair_get(context, user_id, name):
    result = model_query(context, models.KeyPair).\
        filter_by(user_id=user_id).\
        filter_by(name=name).\
        first()

    if not result:
        raise exception.KeypairNotFound(user_id=user_id, name=name)

    return result


@require_context
@main_context_manager.reader
def key_pair_get_all_by_user(context, user_id):
    return model_query(context, models.KeyPair, read_deleted="no").\
        filter_by(user_id=user_id).\
        all()


@require_context
@main_context_manager.reader
def key_pair_count_by_user(context, user_id):
    return model_query(context, models.KeyPair, read_deleted="no").\
        filter_by(user_id=user_id).\
        count()

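`key_pair_create` translates the driver-level `DBDuplicateEntry` into the domain-specific `KeyPairExists`, so callers never handle database internals. The same translation can be sketched with sqlite3's `IntegrityError` and a unique constraint (`KeyPairExists` below is a local stand-in, not Nova's exception class):

```python
import sqlite3


class KeyPairExists(Exception):
    pass


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE key_pairs (user_id TEXT, name TEXT, "
             "UNIQUE (user_id, name))")


def key_pair_create(user_id, name):
    try:
        # "with conn" commits on success and rolls back on error.
        with conn:
            conn.execute("INSERT INTO key_pairs VALUES (?, ?)",
                         (user_id, name))
    except sqlite3.IntegrityError:
        # Duplicate (user_id, name): surface a domain error instead of
        # leaking the driver exception to the caller.
        raise KeyPairExists(name)
```

Catching the narrow driver exception at the persistence boundary keeps the rest of the code free of database-specific error handling.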
###################


@main_context_manager.writer
def network_associate(context, project_id, network_id=None, force=False):
    """Associate a project with a network.

    called by project_get_networks under certain conditions
    and network manager add_network_to_project()

    only associate if the project doesn't already have a network
    or if force is True

    force solves race condition where a fresh project has multiple instance
    builds simultaneously picked up by multiple network hosts which attempt
    to associate the project with multiple networks
    force should only be used as a direct consequence of user request
    all automated requests should not use force
    """
    def network_query(project_filter, id=None):
        filter_kwargs = {'project_id': project_filter}
        if id is not None:
            filter_kwargs['id'] = id
        return model_query(context, models.Network, read_deleted="no").\
            filter_by(**filter_kwargs).\
            with_lockmode('update').\
            first()

    if not force:
        # find out if project has a network
        network_ref = network_query(project_id)

    if force or not network_ref:
        # in force mode or project doesn't have a network so associate
        # with a new network

        # get new network
        network_ref = network_query(None, network_id)
        if not network_ref:
            raise exception.NoMoreNetworks()

        # associate with network
        # NOTE(vish): if with_lockmode isn't supported, as in sqlite,
        # then this has concurrency issues
        network_ref['project_id'] = project_id
        context.session.add(network_ref)
    return network_ref


def _network_ips_query(context, network_id):
    return model_query(context, models.FixedIp, read_deleted="no").\
        filter_by(network_id=network_id)


@main_context_manager.reader
def network_count_reserved_ips(context, network_id):
    return _network_ips_query(context, network_id).\
        filter_by(reserved=True).\
        count()


@main_context_manager.writer
def network_create_safe(context, values):
    network_ref = models.Network()
    network_ref['uuid'] = str(uuid.uuid4())
    network_ref.update(values)

    try:
        network_ref.save(context.session)
        return network_ref
    except db_exc.DBDuplicateEntry:
        raise exception.DuplicateVlan(vlan=values['vlan'])


@main_context_manager.writer
def network_delete_safe(context, network_id):
    result = model_query(context, models.FixedIp, read_deleted="no").\
        filter_by(network_id=network_id).\
        filter_by(allocated=True).\
        count()
    if result != 0:
        raise exception.NetworkInUse(network_id=network_id)
    network_ref = _network_get(context, network_id=network_id)

    model_query(context, models.FixedIp, read_deleted="no").\
        filter_by(network_id=network_id).\
        soft_delete()

    context.session.delete(network_ref)


@main_context_manager.writer
def network_disassociate(context, network_id, disassociate_host,
                         disassociate_project):
    net_update = {}
    if disassociate_project:
        net_update['project_id'] = None
    if disassociate_host:
        net_update['host'] = None
    network_update(context, network_id, net_update)


def _network_get(context, network_id, project_only='allow_none'):
    result = model_query(context, models.Network,
                         project_only=project_only).\
        filter_by(id=network_id).\
        first()

    if not result:
        raise exception.NetworkNotFound(network_id=network_id)

    return result


@require_context
@main_context_manager.reader
def network_get(context, network_id, project_only='allow_none'):
    return _network_get(context, network_id, project_only=project_only)


@require_context
@main_context_manager.reader
def network_get_all(context, project_only):
    result = model_query(context, models.Network, read_deleted="no",
                         project_only=project_only).all()

    if not result:
        raise exception.NoNetworksFound()

    return result


@require_context
@main_context_manager.reader
def network_get_all_by_uuids(context, network_uuids, project_only):
    result = model_query(context, models.Network, read_deleted="no",
                         project_only=project_only).\
        filter(models.Network.uuid.in_(network_uuids)).\
        all()

    if not result:
        raise exception.NoNetworksFound()

    # check if the result contains all the networks
    # we are looking for
    for network_uuid in network_uuids:
        for network in result:
            if network['uuid'] == network_uuid:
                break
        else:
            if project_only:
                raise exception.NetworkNotFoundForProject(
                    network_uuid=network_uuid,
                    project_id=context.project_id)
            raise exception.NetworkNotFound(network_id=network_uuid)

    return result

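The completeness check in `network_get_all_by_uuids` leans on Python's `for`/`else`: the `else` clause runs only when the inner loop finishes without hitting `break`, i.e. when a requested uuid matched none of the returned networks. The idiom in isolation (`NetworkNotFound` here is a local stand-in for the real exception):

```python
class NetworkNotFound(Exception):
    pass


def check_all_found(requested_uuids, networks):
    # networks: list of dicts with a 'uuid' key, as a query would return.
    for network_uuid in requested_uuids:
        for network in networks:
            if network['uuid'] == network_uuid:
                break  # found a match, move to the next requested uuid
        else:
            # Inner loop exhausted without break: this uuid is missing.
            raise NetworkNotFound(network_uuid)
    return networks
```

Without `for`/`else` this needs a sentinel flag per iteration; the clause expresses "no match found" directly.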

def _get_associated_fixed_ips_query(context, network_id, host=None):
    # NOTE(vish): The ugly joins here are to solve a performance issue and
    # should be removed once we can add and remove leases
    # without regenerating the whole list
    vif_and = and_(models.VirtualInterface.id ==
                   models.FixedIp.virtual_interface_id,
                   models.VirtualInterface.deleted == 0)
    inst_and = and_(models.Instance.uuid == models.FixedIp.instance_uuid,
                    models.Instance.deleted == 0)
    # NOTE(vish): This subquery left joins the minimum interface id for each
    # instance. If the join succeeds (i.e. the 11th column is not
    # null), then the fixed ip is on the first interface.
    subq = context.session.query(
        func.min(models.VirtualInterface.id).label("id"),
        models.VirtualInterface.instance_uuid).\
        group_by(models.VirtualInterface.instance_uuid).subquery()
    subq_and = and_(subq.c.id == models.FixedIp.virtual_interface_id,
                    subq.c.instance_uuid ==
                    models.VirtualInterface.instance_uuid)
    query = context.session.query(
        models.FixedIp.address,
        models.FixedIp.instance_uuid,
        models.FixedIp.network_id,
        models.FixedIp.virtual_interface_id,
        models.VirtualInterface.address,
        models.Instance.hostname,
        models.Instance.updated_at,
        models.Instance.created_at,
        models.FixedIp.allocated,
        models.FixedIp.leased,
        subq.c.id).\
        filter(models.FixedIp.deleted == 0).\
        filter(models.FixedIp.network_id == network_id).\
        join((models.VirtualInterface, vif_and)).\
        join((models.Instance, inst_and)).\
        outerjoin((subq, subq_and)).\
        filter(models.FixedIp.instance_uuid != null()).\
        filter(models.FixedIp.virtual_interface_id != null())
    if host:
        query = query.filter(models.Instance.host == host)
    return query


@main_context_manager.reader
def network_get_associated_fixed_ips(context, network_id, host=None):
    # FIXME(sirp): since this returns fixed_ips, this would be better named
    # fixed_ip_get_all_by_network.
    query = _get_associated_fixed_ips_query(context, network_id, host)
    result = query.all()
    data = []
    for datum in result:
        cleaned = {}
        cleaned['address'] = datum[0]
        cleaned['instance_uuid'] = datum[1]
        cleaned['network_id'] = datum[2]
        cleaned['vif_id'] = datum[3]
        cleaned['vif_address'] = datum[4]
        cleaned['instance_hostname'] = datum[5]
        cleaned['instance_updated'] = datum[6]
        cleaned['instance_created'] = datum[7]
        cleaned['allocated'] = datum[8]
        cleaned['leased'] = datum[9]
        # NOTE(vish): default_route is True if this fixed ip is on the first
        # interface of its instance.
        cleaned['default_route'] = datum[10] is not None
        data.append(cleaned)
    return data

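`network_get_associated_fixed_ips` unpacks each 11-column result row by position; the 11th column comes from the outer-joined minimum-interface-id subquery, so a non-NULL value marks the fixed ip as sitting on its instance's first interface. The per-row cleanup can be expressed compactly with `zip` (field names follow the function above; rows here are plain tuples, not real query results):

```python
_FIELDS = ('address', 'instance_uuid', 'network_id', 'vif_id',
           'vif_address', 'instance_hostname', 'instance_updated',
           'instance_created', 'allocated', 'leased')


def clean_row(datum):
    # zip pairs the first 10 columns with their names, in order.
    cleaned = dict(zip(_FIELDS, datum))
    # The 11th column is the outer-joined minimum interface id: non-NULL
    # means this fixed ip lives on the instance's first interface.
    cleaned['default_route'] = datum[10] is not None
    return cleaned
```

This is behaviorally identical to the ten explicit assignments, and keeps the column-order dependency in one tuple.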

@main_context_manager.reader
def network_in_use_on_host(context, network_id, host):
    query = _get_associated_fixed_ips_query(context, network_id, host)
    return query.count() > 0


def _network_get_query(context):
    return model_query(context, models.Network, read_deleted="no")


@main_context_manager.reader
def network_get_by_uuid(context, uuid):
    result = _network_get_query(context).filter_by(uuid=uuid).first()

    if not result:
        raise exception.NetworkNotFoundForUUID(uuid=uuid)

    return result


@main_context_manager.reader
def network_get_by_cidr(context, cidr):
    result = _network_get_query(context).\
        filter(or_(models.Network.cidr == cidr,
                   models.Network.cidr_v6 == cidr)).\
        first()

    if not result:
        raise exception.NetworkNotFoundForCidr(cidr=cidr)

    return result


@main_context_manager.reader
def network_get_all_by_host(context, host):
    fixed_host_filter = or_(models.FixedIp.host == host,
                            and_(models.FixedIp.instance_uuid != null(),
                                 models.Instance.host == host))
    fixed_ip_query = model_query(context, models.FixedIp,
                                 (models.FixedIp.network_id,)).\
        outerjoin((models.Instance,
                   models.Instance.uuid ==
                   models.FixedIp.instance_uuid)).\
        filter(fixed_host_filter)
    # NOTE(vish): return networks that have host set
    # or that have a fixed ip with host set
    # or that have an instance with host set
    host_filter = or_(models.Network.host == host,
                      models.Network.id.in_(fixed_ip_query.subquery
op|'('
op|')'
op|')'
op|')'
newline|'\n'
name|'return'
name|'_network_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter'
op|'('
name|'host_filter'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|','
nl|'\n'
DECL|variable|retry_on_request
name|'retry_on_request'
op|'='
name|'True'
op|')'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|network_set_host
name|'def'
name|'network_set_host'
op|'('
name|'context'
op|','
name|'network_id'
op|','
name|'host_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'network_ref'
op|'='
name|'_network_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'network_id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'network_ref'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'NetworkNotFound'
op|'('
name|'network_id'
op|'='
name|'network_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'network_ref'
op|'.'
name|'host'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'None'
newline|'\n'
nl|'\n'
dedent|''
name|'rows_updated'
op|'='
name|'_network_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'network_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'None'
op|')'
op|'.'
name|'update'
op|'('
op|'{'
string|"'host'"
op|':'
name|'host_id'
op|'}'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'rows_updated'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'debug'
op|'('
string|"'The row was updated in a concurrent transaction, '"
nl|'\n'
string|"'we will fetch another row'"
op|')'
newline|'\n'
name|'raise'
name|'db_exc'
op|'.'
name|'RetryRequest'
op|'('
nl|'\n'
name|'exception'
op|'.'
name|'NetworkSetHostFailed'
op|'('
name|'network_id'
op|'='
name|'network_id'
op|')'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|network_update
name|'def'
name|'network_update'
op|'('
name|'context'
op|','
name|'network_id'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'network_ref'
op|'='
name|'_network_get'
op|'('
name|'context'
op|','
name|'network_id'
op|')'
newline|'\n'
name|'network_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'network_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'DuplicateVlan'
op|'('
name|'vlan'
op|'='
name|'values'
op|'['
string|"'vlan'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'network_ref'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|quota_get
name|'def'
name|'quota_get'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'resource'
op|','
name|'user_id'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'model'
op|'='
name|'models'
op|'.'
name|'ProjectUserQuota'
name|'if'
name|'user_id'
name|'else'
name|'models'
op|'.'
name|'Quota'
newline|'\n'
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'model'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'resource'
op|'='
name|'resource'
op|')'
newline|'\n'
name|'if'
name|'user_id'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter_by'
op|'('
name|'user_id'
op|'='
name|'user_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'result'
op|'='
name|'query'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'user_id'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ProjectUserQuotaNotFound'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|','
nl|'\n'
name|'user_id'
op|'='
name|'user_id'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ProjectQuotaNotFound'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|quota_get_all_by_project_and_user
name|'def'
name|'quota_get_all_by_project_and_user'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'user_quotas'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'ProjectUserQuota'
op|','
nl|'\n'
op|'('
name|'models'
op|'.'
name|'ProjectUserQuota'
op|'.'
name|'resource'
op|','
nl|'\n'
name|'models'
op|'.'
name|'ProjectUserQuota'
op|'.'
name|'hard_limit'
op|')'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'user_id'
op|'='
name|'user_id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'result'
op|'='
op|'{'
string|"'project_id'"
op|':'
name|'project_id'
op|','
string|"'user_id'"
op|':'
name|'user_id'
op|'}'
newline|'\n'
name|'for'
name|'user_quota'
name|'in'
name|'user_quotas'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'['
name|'user_quota'
op|'.'
name|'resource'
op|']'
op|'='
name|'user_quota'
op|'.'
name|'hard_limit'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|quota_get_all_by_project
name|'def'
name|'quota_get_all_by_project'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'rows'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Quota'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'result'
op|'='
op|'{'
string|"'project_id'"
op|':'
name|'project_id'
op|'}'
newline|'\n'
name|'for'
name|'row'
name|'in'
name|'rows'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'['
name|'row'
op|'.'
name|'resource'
op|']'
op|'='
name|'row'
op|'.'
name|'hard_limit'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|quota_get_all
name|'def'
name|'quota_get_all'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'ProjectUserQuota'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|quota_get_per_project_resources
dedent|''
name|'def'
name|'quota_get_per_project_resources'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'PER_PROJECT_QUOTAS'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|quota_create
name|'def'
name|'quota_create'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'resource'
op|','
name|'limit'
op|','
name|'user_id'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'per_user'
op|'='
name|'user_id'
name|'and'
name|'resource'
name|'not'
name|'in'
name|'PER_PROJECT_QUOTAS'
newline|'\n'
name|'quota_ref'
op|'='
name|'models'
op|'.'
name|'ProjectUserQuota'
op|'('
op|')'
name|'if'
name|'per_user'
name|'else'
name|'models'
op|'.'
name|'Quota'
op|'('
op|')'
newline|'\n'
name|'if'
name|'per_user'
op|':'
newline|'\n'
indent|' '
name|'quota_ref'
op|'.'
name|'user_id'
op|'='
name|'user_id'
newline|'\n'
dedent|''
name|'quota_ref'
op|'.'
name|'project_id'
op|'='
name|'project_id'
newline|'\n'
name|'quota_ref'
op|'.'
name|'resource'
op|'='
name|'resource'
newline|'\n'
name|'quota_ref'
op|'.'
name|'hard_limit'
op|'='
name|'limit'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'quota_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'QuotaExists'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|','
name|'resource'
op|'='
name|'resource'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'quota_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|quota_update
name|'def'
name|'quota_update'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'resource'
op|','
name|'limit'
op|','
name|'user_id'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'per_user'
op|'='
name|'user_id'
name|'and'
name|'resource'
name|'not'
name|'in'
name|'PER_PROJECT_QUOTAS'
newline|'\n'
name|'model'
op|'='
name|'models'
op|'.'
name|'ProjectUserQuota'
name|'if'
name|'per_user'
name|'else'
name|'models'
op|'.'
name|'Quota'
newline|'\n'
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'model'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'resource'
op|'='
name|'resource'
op|')'
newline|'\n'
name|'if'
name|'per_user'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter_by'
op|'('
name|'user_id'
op|'='
name|'user_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'result'
op|'='
name|'query'
op|'.'
name|'update'
op|'('
op|'{'
string|"'hard_limit'"
op|':'
name|'limit'
op|'}'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'per_user'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ProjectUserQuotaNotFound'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|','
nl|'\n'
name|'user_id'
op|'='
name|'user_id'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ProjectQuotaNotFound'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|quota_class_get
name|'def'
name|'quota_class_get'
op|'('
name|'context'
op|','
name|'class_name'
op|','
name|'resource'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'QuotaClass'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'class_name'
op|'='
name|'class_name'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'resource'
op|'='
name|'resource'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'QuotaClassNotFound'
op|'('
name|'class_name'
op|'='
name|'class_name'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|quota_class_get_default
name|'def'
name|'quota_class_get_default'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'rows'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'QuotaClass'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'class_name'
op|'='
name|'_DEFAULT_QUOTA_NAME'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'result'
op|'='
op|'{'
string|"'class_name'"
op|':'
name|'_DEFAULT_QUOTA_NAME'
op|'}'
newline|'\n'
name|'for'
name|'row'
name|'in'
name|'rows'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'['
name|'row'
op|'.'
name|'resource'
op|']'
op|'='
name|'row'
op|'.'
name|'hard_limit'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|quota_class_get_all_by_name
name|'def'
name|'quota_class_get_all_by_name'
op|'('
name|'context'
op|','
name|'class_name'
op|')'
op|':'
newline|'\n'
indent|' '
name|'rows'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'QuotaClass'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'class_name'
op|'='
name|'class_name'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'result'
op|'='
op|'{'
string|"'class_name'"
op|':'
name|'class_name'
op|'}'
newline|'\n'
name|'for'
name|'row'
name|'in'
name|'rows'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'['
name|'row'
op|'.'
name|'resource'
op|']'
op|'='
name|'row'
op|'.'
name|'hard_limit'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|quota_class_create
name|'def'
name|'quota_class_create'
op|'('
name|'context'
op|','
name|'class_name'
op|','
name|'resource'
op|','
name|'limit'
op|')'
op|':'
newline|'\n'
indent|' '
name|'quota_class_ref'
op|'='
name|'models'
op|'.'
name|'QuotaClass'
op|'('
op|')'
newline|'\n'
name|'quota_class_ref'
op|'.'
name|'class_name'
op|'='
name|'class_name'
newline|'\n'
name|'quota_class_ref'
op|'.'
name|'resource'
op|'='
name|'resource'
newline|'\n'
name|'quota_class_ref'
op|'.'
name|'hard_limit'
op|'='
name|'limit'
newline|'\n'
name|'quota_class_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
name|'return'
name|'quota_class_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|quota_class_update
name|'def'
name|'quota_class_update'
op|'('
name|'context'
op|','
name|'class_name'
op|','
name|'resource'
op|','
name|'limit'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'QuotaClass'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'class_name'
op|'='
name|'class_name'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'resource'
op|'='
name|'resource'
op|')'
op|'.'
name|'update'
op|'('
op|'{'
string|"'hard_limit'"
op|':'
name|'limit'
op|'}'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'QuotaClassNotFound'
op|'('
name|'class_name'
op|'='
name|'class_name'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|quota_usage_get
name|'def'
name|'quota_usage_get'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'resource'
op|','
name|'user_id'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'QuotaUsage'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'resource'
op|'='
name|'resource'
op|')'
newline|'\n'
name|'if'
name|'user_id'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'resource'
name|'not'
name|'in'
name|'PER_PROJECT_QUOTAS'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'query'
op|'.'
name|'filter_by'
op|'('
name|'user_id'
op|'='
name|'user_id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'query'
op|'.'
name|'filter_by'
op|'('
name|'user_id'
op|'='
name|'None'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
dedent|''
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'query'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'QuotaUsageNotFound'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_quota_usage_get_all
dedent|''
name|'def'
name|'_quota_usage_get_all'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'QuotaUsage'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
newline|'\n'
name|'result'
op|'='
op|'{'
string|"'project_id'"
op|':'
name|'project_id'
op|'}'
newline|'\n'
name|'if'
name|'user_id'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'or_'
op|'('
name|'models'
op|'.'
name|'QuotaUsage'
op|'.'
name|'user_id'
op|'=='
name|'user_id'
op|','
nl|'\n'
name|'models'
op|'.'
name|'QuotaUsage'
op|'.'
name|'user_id'
op|'=='
name|'null'
op|'('
op|')'
op|')'
op|')'
newline|'\n'
name|'result'
op|'['
string|"'user_id'"
op|']'
op|'='
name|'user_id'
newline|'\n'
nl|'\n'
dedent|''
name|'rows'
op|'='
name|'query'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'for'
name|'row'
name|'in'
name|'rows'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'row'
op|'.'
name|'resource'
name|'in'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'['
name|'row'
op|'.'
name|'resource'
op|']'
op|'['
string|"'in_use'"
op|']'
op|'+='
name|'row'
op|'.'
name|'in_use'
newline|'\n'
name|'result'
op|'['
name|'row'
op|'.'
name|'resource'
op|']'
op|'['
string|"'reserved'"
op|']'
op|'+='
name|'row'
op|'.'
name|'reserved'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'['
name|'row'
op|'.'
name|'resource'
op|']'
op|'='
name|'dict'
op|'('
name|'in_use'
op|'='
name|'row'
op|'.'
name|'in_use'
op|','
nl|'\n'
name|'reserved'
op|'='
name|'row'
op|'.'
name|'reserved'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|quota_usage_get_all_by_project_and_user
name|'def'
name|'quota_usage_get_all_by_project_and_user'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_quota_usage_get_all'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|'='
name|'user_id'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|quota_usage_get_all_by_project
name|'def'
name|'quota_usage_get_all_by_project'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_quota_usage_get_all'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_quota_usage_create
dedent|''
name|'def'
name|'_quota_usage_create'
op|'('
name|'project_id'
op|','
name|'user_id'
op|','
name|'resource'
op|','
name|'in_use'
op|','
nl|'\n'
name|'reserved'
op|','
name|'until_refresh'
op|','
name|'session'
op|')'
op|':'
newline|'\n'
indent|' '
name|'quota_usage_ref'
op|'='
name|'models'
op|'.'
name|'QuotaUsage'
op|'('
op|')'
newline|'\n'
name|'quota_usage_ref'
op|'.'
name|'project_id'
op|'='
name|'project_id'
newline|'\n'
name|'quota_usage_ref'
op|'.'
name|'user_id'
op|'='
name|'user_id'
newline|'\n'
name|'quota_usage_ref'
op|'.'
name|'resource'
op|'='
name|'resource'
newline|'\n'
name|'quota_usage_ref'
op|'.'
name|'in_use'
op|'='
name|'in_use'
newline|'\n'
name|'quota_usage_ref'
op|'.'
name|'reserved'
op|'='
name|'reserved'
newline|'\n'
name|'quota_usage_ref'
op|'.'
name|'until_refresh'
op|'='
name|'until_refresh'
newline|'\n'
comment|'# updated_at is needed for judgement of max_age'
nl|'\n'
name|'quota_usage_ref'
op|'.'
name|'updated_at'
op|'='
name|'timeutils'
op|'.'
name|'utcnow'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'quota_usage_ref'
op|'.'
name|'save'
op|'('
name|'session'
op|')'
newline|'\n'
nl|'\n'
name|'return'
name|'quota_usage_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|quota_usage_update
name|'def'
name|'quota_usage_update'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|','
name|'resource'
op|','
op|'**'
name|'kwargs'
op|')'
op|':'
newline|'\n'
indent|' '
name|'updates'
op|'='
op|'{'
op|'}'
newline|'\n'
nl|'\n'
name|'for'
name|'key'
name|'in'
op|'['
string|"'in_use'"
op|','
string|"'reserved'"
op|','
string|"'until_refresh'"
op|']'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'key'
name|'in'
name|'kwargs'
op|':'
newline|'\n'
indent|' '
name|'updates'
op|'['
name|'key'
op|']'
op|'='
name|'kwargs'
op|'['
name|'key'
op|']'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'QuotaUsage'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'resource'
op|'='
name|'resource'
op|')'
op|'.'
name|'filter'
op|'('
name|'or_'
op|'('
name|'models'
op|'.'
name|'QuotaUsage'
op|'.'
name|'user_id'
op|'=='
name|'user_id'
op|','
nl|'\n'
name|'models'
op|'.'
name|'QuotaUsage'
op|'.'
name|'user_id'
op|'=='
name|'null'
op|'('
op|')'
op|')'
op|')'
op|'.'
name|'update'
op|'('
name|'updates'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'QuotaUsageNotFound'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
DECL|function|_reservation_create
dedent|''
dedent|''
name|'def'
name|'_reservation_create'
op|'('
name|'uuid'
op|','
name|'usage'
op|','
name|'project_id'
op|','
name|'user_id'
op|','
name|'resource'
op|','
nl|'\n'
name|'delta'
op|','
name|'expire'
op|','
name|'session'
op|')'
op|':'
newline|'\n'
indent|' '
name|'reservation_ref'
op|'='
name|'models'
op|'.'
name|'Reservation'
op|'('
op|')'
newline|'\n'
name|'reservation_ref'
op|'.'
name|'uuid'
op|'='
name|'uuid'
newline|'\n'
name|'reservation_ref'
op|'.'
name|'usage_id'
op|'='
name|'usage'
op|'['
string|"'id'"
op|']'
newline|'\n'
name|'reservation_ref'
op|'.'
name|'project_id'
op|'='
name|'project_id'
newline|'\n'
name|'reservation_ref'
op|'.'
name|'user_id'
op|'='
name|'user_id'
newline|'\n'
name|'reservation_ref'
op|'.'
name|'resource'
op|'='
name|'resource'
newline|'\n'
name|'reservation_ref'
op|'.'
name|'delta'
op|'='
name|'delta'
newline|'\n'
name|'reservation_ref'
op|'.'
name|'expire'
op|'='
name|'expire'
newline|'\n'
name|'reservation_ref'
op|'.'
name|'save'
op|'('
name|'session'
op|')'
newline|'\n'
name|'return'
name|'reservation_ref'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
comment|"# NOTE(johannes): The quota code uses SQL locking to ensure races don't"
nl|'\n'
comment|'# cause under or over counting of resources. To avoid deadlocks, this'
nl|'\n'
comment|'# code always acquires the lock on quota_usages before acquiring the lock'
nl|'\n'
comment|'# on reservations.'
nl|'\n'
nl|'\n'
DECL|function|_get_project_user_quota_usages
dedent|''
name|'def'
name|'_get_project_user_quota_usages'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'rows'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'QuotaUsage'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'order_by'
op|'('
name|'models'
op|'.'
name|'QuotaUsage'
op|'.'
name|'id'
op|'.'
name|'asc'
op|'('
op|')'
op|')'
op|'.'
name|'with_lockmode'
op|'('
string|"'update'"
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'proj_result'
op|'='
name|'dict'
op|'('
op|')'
newline|'\n'
name|'user_result'
op|'='
name|'dict'
op|'('
op|')'
newline|'\n'
comment|'# Get the total count of in_use,reserved'
nl|'\n'
name|'for'
name|'row'
name|'in'
name|'rows'
op|':'
newline|'\n'
indent|' '
name|'proj_result'
op|'.'
name|'setdefault'
op|'('
name|'row'
op|'.'
name|'resource'
op|','
nl|'\n'
name|'dict'
op|'('
name|'in_use'
op|'='
number|'0'
op|','
name|'reserved'
op|'='
number|'0'
op|','
name|'total'
op|'='
number|'0'
op|')'
op|')'
newline|'\n'
name|'proj_result'
op|'['
name|'row'
op|'.'
name|'resource'
op|']'
op|'['
string|"'in_use'"
op|']'
op|'+='
name|'row'
op|'.'
name|'in_use'
newline|'\n'
name|'proj_result'
op|'['
name|'row'
op|'.'
name|'resource'
op|']'
op|'['
string|"'reserved'"
op|']'
op|'+='
name|'row'
op|'.'
name|'reserved'
newline|'\n'
name|'proj_result'
op|'['
name|'row'
op|'.'
name|'resource'
op|']'
op|'['
string|"'total'"
op|']'
op|'+='
op|'('
name|'row'
op|'.'
name|'in_use'
op|'+'
name|'row'
op|'.'
name|'reserved'
op|')'
newline|'\n'
name|'if'
name|'row'
op|'.'
name|'user_id'
name|'is'
name|'None'
name|'or'
name|'row'
op|'.'
name|'user_id'
op|'=='
name|'user_id'
op|':'
newline|'\n'
indent|' '
name|'user_result'
op|'['
name|'row'
op|'.'
name|'resource'
op|']'
op|'='
name|'row'
newline|'\n'
dedent|''
dedent|''
name|'return'
name|'proj_result'
op|','
name|'user_result'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_create_quota_usage_if_missing
dedent|''
name|'def'
name|'_create_quota_usage_if_missing'
op|'('
name|'user_usages'
op|','
name|'resource'
op|','
name|'until_refresh'
op|','
nl|'\n'
name|'project_id'
op|','
name|'user_id'
op|','
name|'session'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Creates a QuotaUsage record and adds to user_usages if not present.\n\n :param user_usages: dict of resource keys to QuotaUsage records. This is\n updated if resource is not in user_usages yet or\n until_refresh is not None.\n :param resource: The resource being checked for quota usage.\n :param until_refresh: Count of reservations until usage is refreshed,\n int or None\n :param project_id: The project being checked for quota usage.\n :param user_id: The user being checked for quota usage.\n :param session: DB session holding a transaction lock.\n :return: True if a new QuotaUsage record was created and added\n to user_usages, False otherwise.\n """'
newline|'\n'
name|'new_usage'
op|'='
name|'None'
newline|'\n'
name|'if'
name|'resource'
name|'not'
name|'in'
name|'user_usages'
op|':'
newline|'\n'
indent|' '
name|'user_id_to_use'
op|'='
name|'user_id'
newline|'\n'
name|'if'
name|'resource'
name|'in'
name|'PER_PROJECT_QUOTAS'
op|':'
newline|'\n'
indent|' '
name|'user_id_to_use'
op|'='
name|'None'
newline|'\n'
dedent|''
name|'new_usage'
op|'='
name|'_quota_usage_create'
op|'('
name|'project_id'
op|','
name|'user_id_to_use'
op|','
name|'resource'
op|','
nl|'\n'
number|'0'
op|','
number|'0'
op|','
name|'until_refresh'
name|'or'
name|'None'
op|','
name|'session'
op|')'
newline|'\n'
name|'user_usages'
op|'['
name|'resource'
op|']'
op|'='
name|'new_usage'
newline|'\n'
dedent|''
name|'return'
name|'new_usage'
name|'is'
name|'not'
name|'None'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_is_quota_refresh_needed
dedent|''
name|'def'
name|'_is_quota_refresh_needed'
op|'('
name|'quota_usage'
op|','
name|'max_age'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Determines if a quota usage refresh is needed.\n\n :param quota_usage: A QuotaUsage object for a given resource.\n :param max_age: Number of seconds between subsequent usage refreshes.\n :return: True if a refresh is needed, False otherwise.\n """'
newline|'\n'
name|'refresh'
op|'='
name|'False'
newline|'\n'
name|'if'
name|'quota_usage'
op|'.'
name|'in_use'
op|'<'
number|'0'
op|':'
newline|'\n'
comment|'# Negative in_use count indicates a desync, so try to'
nl|'\n'
comment|'# heal from that...'
nl|'\n'
indent|' '
name|'LOG'
op|'.'
name|'debug'
op|'('
string|"'in_use has dropped below 0; forcing refresh for '"
nl|'\n'
string|"'QuotaUsage: %s'"
op|','
name|'dict'
op|'('
name|'quota_usage'
op|')'
op|')'
newline|'\n'
name|'refresh'
op|'='
name|'True'
newline|'\n'
dedent|''
name|'elif'
name|'quota_usage'
op|'.'
name|'until_refresh'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'quota_usage'
op|'.'
name|'until_refresh'
op|'-='
number|'1'
newline|'\n'
name|'if'
name|'quota_usage'
op|'.'
name|'until_refresh'
op|'<='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'refresh'
op|'='
name|'True'
newline|'\n'
dedent|''
dedent|''
name|'elif'
name|'max_age'
name|'and'
op|'('
name|'timeutils'
op|'.'
name|'utcnow'
op|'('
op|')'
op|'-'
nl|'\n'
name|'quota_usage'
op|'.'
name|'updated_at'
op|')'
op|'.'
name|'seconds'
op|'>='
name|'max_age'
op|':'
newline|'\n'
indent|' '
name|'refresh'
op|'='
name|'True'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'refresh'
newline|'\n'
nl|'\n'
nl|'\n'
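The refresh decision above has three independent triggers: a negative `in_use` (desync), an exhausted `until_refresh` countdown, and a stale `updated_at` older than `max_age`. A minimal standalone sketch of that logic, using a hand-built stand-in object instead of a real QuotaUsage row (`make_usage` is a hypothetical helper for illustration):

```python
import datetime


def make_usage(in_use, until_refresh=None, age_seconds=0):
    """Build a minimal stand-in for a QuotaUsage row (illustrative)."""
    class Usage(object):
        pass
    u = Usage()
    u.in_use = in_use
    u.until_refresh = until_refresh
    u.updated_at = (datetime.datetime.utcnow() -
                    datetime.timedelta(seconds=age_seconds))
    return u


def is_refresh_needed(quota_usage, max_age):
    """Sketch of _is_quota_refresh_needed's decision tree."""
    refresh = False
    if quota_usage.in_use < 0:
        # Negative in_use indicates a desync; force a heal.
        refresh = True
    elif quota_usage.until_refresh is not None:
        # Each check burns one tick of the countdown.
        quota_usage.until_refresh -= 1
        if quota_usage.until_refresh <= 0:
            refresh = True
    elif max_age and (datetime.datetime.utcnow() -
                      quota_usage.updated_at).seconds >= max_age:
        # NOTE: .seconds is only the seconds component of the
        # timedelta (it wraps at one day), mirroring the original.
        refresh = True
    return refresh
```

Note the side effect: merely asking the question decrements `until_refresh`, so two consecutive checks with a countdown of 2 answer False then True.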
DECL|function|_refresh_quota_usages
dedent|''
name|'def'
name|'_refresh_quota_usages'
op|'('
name|'quota_usage'
op|','
name|'until_refresh'
op|','
name|'in_use'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Refreshes quota usage for the given resource.\n\n :param quota_usage: A QuotaUsage object for a given resource.\n :param until_refresh: Count of reservations until usage is refreshed,\n int or None\n :param in_use: Actual quota usage for the resource.\n """'
newline|'\n'
name|'if'
name|'quota_usage'
op|'.'
name|'in_use'
op|'!='
name|'in_use'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'info'
op|'('
name|'_LI'
op|'('
string|"'quota_usages out of sync, updating. '"
nl|'\n'
string|"'project_id: %(project_id)s, '"
nl|'\n'
string|"'user_id: %(user_id)s, '"
nl|'\n'
string|"'resource: %(res)s, '"
nl|'\n'
string|"'tracked usage: %(tracked_use)s, '"
nl|'\n'
string|"'actual usage: %(in_use)s'"
op|')'
op|','
nl|'\n'
op|'{'
string|"'project_id'"
op|':'
name|'quota_usage'
op|'.'
name|'project_id'
op|','
nl|'\n'
string|"'user_id'"
op|':'
name|'quota_usage'
op|'.'
name|'user_id'
op|','
nl|'\n'
string|"'res'"
op|':'
name|'quota_usage'
op|'.'
name|'resource'
op|','
nl|'\n'
string|"'tracked_use'"
op|':'
name|'quota_usage'
op|'.'
name|'in_use'
op|','
nl|'\n'
string|"'in_use'"
op|':'
name|'in_use'
op|'}'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'debug'
op|'('
string|"'QuotaUsage has not changed, refresh is unnecessary for: %s'"
op|','
nl|'\n'
name|'dict'
op|'('
name|'quota_usage'
op|')'
op|')'
newline|'\n'
nl|'\n'
comment|'# Update the usage'
nl|'\n'
dedent|''
name|'quota_usage'
op|'.'
name|'in_use'
op|'='
name|'in_use'
newline|'\n'
name|'quota_usage'
op|'.'
name|'until_refresh'
op|'='
name|'until_refresh'
name|'or'
name|'None'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_refresh_quota_usages_if_needed
dedent|''
name|'def'
name|'_refresh_quota_usages_if_needed'
op|'('
name|'user_usages'
op|','
name|'context'
op|','
name|'resources'
op|','
name|'keys'
op|','
nl|'\n'
name|'project_id'
op|','
name|'user_id'
op|','
name|'until_refresh'
op|','
nl|'\n'
name|'max_age'
op|','
name|'force_refresh'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
name|'elevated'
op|'='
name|'context'
op|'.'
name|'elevated'
op|'('
op|')'
newline|'\n'
nl|'\n'
comment|'# Handle usage refresh'
nl|'\n'
name|'work'
op|'='
name|'set'
op|'('
name|'keys'
op|')'
newline|'\n'
name|'while'
name|'work'
op|':'
newline|'\n'
indent|' '
name|'resource'
op|'='
name|'work'
op|'.'
name|'pop'
op|'('
op|')'
newline|'\n'
nl|'\n'
comment|'# Do we need to refresh the usage?'
nl|'\n'
name|'created'
op|'='
name|'_create_quota_usage_if_missing'
op|'('
name|'user_usages'
op|','
name|'resource'
op|','
nl|'\n'
name|'until_refresh'
op|','
name|'project_id'
op|','
nl|'\n'
name|'user_id'
op|','
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
nl|'\n'
name|'refresh'
op|'='
name|'force_refresh'
newline|'\n'
name|'if'
name|'not'
name|'refresh'
op|':'
newline|'\n'
indent|' '
name|'refresh'
op|'='
name|'created'
name|'or'
name|'_is_quota_refresh_needed'
op|'('
name|'user_usages'
op|'['
name|'resource'
op|']'
op|','
name|'max_age'
op|')'
newline|'\n'
nl|'\n'
comment|'# OK, refresh the usage'
nl|'\n'
dedent|''
name|'if'
name|'refresh'
op|':'
newline|'\n'
comment|'# Grab the sync routine'
nl|'\n'
indent|' '
name|'sync'
op|'='
name|'QUOTA_SYNC_FUNCTIONS'
op|'['
name|'resources'
op|'['
name|'resource'
op|']'
op|'.'
name|'sync'
op|']'
newline|'\n'
nl|'\n'
name|'updates'
op|'='
name|'sync'
op|'('
name|'elevated'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
newline|'\n'
name|'for'
name|'res'
op|','
name|'in_use'
name|'in'
name|'updates'
op|'.'
name|'items'
op|'('
op|')'
op|':'
newline|'\n'
comment|'# Make sure we have a destination for the usage!'
nl|'\n'
indent|' '
name|'_create_quota_usage_if_missing'
op|'('
name|'user_usages'
op|','
name|'res'
op|','
nl|'\n'
name|'until_refresh'
op|','
name|'project_id'
op|','
nl|'\n'
name|'user_id'
op|','
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
name|'_refresh_quota_usages'
op|'('
name|'user_usages'
op|'['
name|'res'
op|']'
op|','
name|'until_refresh'
op|','
nl|'\n'
name|'in_use'
op|')'
newline|'\n'
nl|'\n'
comment|'# Because more than one resource may be refreshed'
nl|'\n'
comment|"# by the call to the sync routine, and we don't"
nl|'\n'
comment|'# want to double-sync, we make sure all refreshed'
nl|'\n'
comment|'# resources are dropped from the work set.'
nl|'\n'
name|'work'
op|'.'
name|'discard'
op|'('
name|'res'
op|')'
newline|'\n'
nl|'\n'
comment|'# NOTE(Vek): We make the assumption that the sync'
nl|'\n'
comment|'# routine actually refreshes the'
nl|'\n'
comment|'# resources that it is the sync routine'
nl|'\n'
comment|"# for. We don't check, because this is"
nl|'\n'
comment|'# a best-effort mechanism.'
nl|'\n'
nl|'\n'
nl|'\n'
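The refresh loop above pops resources from a work set and, because one sync routine may report usage for several resources at once, discards every reported resource from the set so nothing is synced twice. A minimal in-memory sketch of that control flow, with plain dicts and callables standing in for the QuotaUsage rows and the QUOTA_SYNC_FUNCTIONS registry (the `refreshed` return value is added here only for observability):

```python
def refresh_if_needed(user_usages, keys, sync_functions, force=False):
    """Sketch of the work-set loop in _refresh_quota_usages_if_needed.

    user_usages: dict of resource -> {'in_use': int} (stand-in rows).
    sync_functions: dict of resource -> callable returning a
        {resource: actual_in_use} mapping, like a sync routine.
    """
    work = set(keys)
    refreshed = []
    while work:
        resource = work.pop()
        # Refresh when forced or when the tracked usage is desynced.
        needs = force or user_usages.get(resource, {}).get('in_use', 0) < 0
        if not needs:
            continue
        updates = sync_functions[resource]()
        for res, in_use in updates.items():
            # The sync routine may cover resources beyond the one we
            # popped; record each and drop it from the work set so it
            # is not double-synced.
            user_usages.setdefault(res, {})['in_use'] = in_use
            refreshed.append(res)
            work.discard(res)
    return sorted(refreshed)
```

As in the original, the result is best-effort: nothing verifies that a sync routine actually covered the resource it was registered for.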
DECL|function|_calculate_overquota
dedent|''
dedent|''
dedent|''
dedent|''
name|'def'
name|'_calculate_overquota'
op|'('
name|'project_quotas'
op|','
name|'user_quotas'
op|','
name|'deltas'
op|','
nl|'\n'
name|'project_usages'
op|','
name|'user_usages'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Checks if any resources will go over quota based on the request.\n\n :param project_quotas: dict of resource quotas (limits) for the project.\n :param user_quotas: dict of resource quotas (limits) for the user.\n :param deltas: dict of resource keys to positive/negative quota\n changes for the resources in a given operation.\n :param project_usages: dict of resource keys to QuotaUsage records for the\n project.\n :param user_usages: dict of resource keys to QuotaUsage records for the\n user.\n :return: list of resources that are over-quota for the\n operation.\n """'
newline|'\n'
name|'overs'
op|'='
op|'['
op|']'
newline|'\n'
name|'for'
name|'res'
op|','
name|'delta'
name|'in'
name|'deltas'
op|'.'
name|'items'
op|'('
op|')'
op|':'
newline|'\n'
comment|"# We can't go over-quota if we're not reserving anything."
nl|'\n'
indent|' '
name|'if'
name|'delta'
op|'>='
number|'0'
op|':'
newline|'\n'
comment|"# We can't go over-quota if we have unlimited quotas."
nl|'\n'
comment|'# over if the project usage + delta is more than project quota'
nl|'\n'
indent|' '
name|'if'
number|'0'
op|'<='
name|'project_quotas'
op|'['
name|'res'
op|']'
op|'<'
name|'delta'
op|'+'
name|'project_usages'
op|'['
name|'res'
op|']'
op|'['
string|"'total'"
op|']'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'debug'
op|'('
string|"'Request is over project quota for resource '"
nl|'\n'
string|'\'"%(res)s". Project limit: %(limit)s, delta: \''
nl|'\n'
string|"'%(delta)s, current total project usage: %(total)s'"
op|','
nl|'\n'
op|'{'
string|"'res'"
op|':'
name|'res'
op|','
string|"'limit'"
op|':'
name|'project_quotas'
op|'['
name|'res'
op|']'
op|','
nl|'\n'
string|"'delta'"
op|':'
name|'delta'
op|','
nl|'\n'
string|"'total'"
op|':'
name|'project_usages'
op|'['
name|'res'
op|']'
op|'['
string|"'total'"
op|']'
op|'}'
op|')'
newline|'\n'
name|'overs'
op|'.'
name|'append'
op|'('
name|'res'
op|')'
newline|'\n'
comment|"# We can't go over-quota if we have unlimited quotas."
nl|'\n'
comment|'# over if the user usage + delta is more than user quota'
nl|'\n'
dedent|''
name|'elif'
number|'0'
op|'<='
name|'user_quotas'
op|'['
name|'res'
op|']'
op|'<'
name|'delta'
op|'+'
name|'user_usages'
op|'['
name|'res'
op|']'
op|'['
string|"'total'"
op|']'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'debug'
op|'('
string|"'Request is over user quota for resource '"
nl|'\n'
string|'\'"%(res)s". User limit: %(limit)s, delta: \''
nl|'\n'
string|"'%(delta)s, current total user usage: %(total)s'"
op|','
nl|'\n'
op|'{'
string|"'res'"
op|':'
name|'res'
op|','
string|"'limit'"
op|':'
name|'user_quotas'
op|'['
name|'res'
op|']'
op|','
nl|'\n'
string|"'delta'"
op|':'
name|'delta'
op|','
string|"'total'"
op|':'
name|'user_usages'
op|'['
name|'res'
op|']'
op|'['
string|"'total'"
op|']'
op|'}'
op|')'
newline|'\n'
name|'overs'
op|'.'
name|'append'
op|'('
name|'res'
op|')'
newline|'\n'
dedent|''
dedent|''
dedent|''
name|'return'
name|'overs'
newline|'\n'
nl|'\n'
nl|'\n'
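The over-quota test above hinges on the chained comparison `0 <= limit < delta + total`: a negative limit means "unlimited" and can never trip the check, and only non-negative deltas are examined at all. A condensed sketch with plain dicts standing in for the quota and usage structures:

```python
def calculate_overquota(project_quotas, user_quotas, deltas,
                        project_usages, user_usages):
    """Sketch of _calculate_overquota's per-resource check."""
    overs = []
    for res, delta in deltas.items():
        if delta < 0:
            # Releasing quota can never put us over.
            continue
        # A limit < 0 means unlimited, so 0 <= limit filters it out.
        if 0 <= project_quotas[res] < delta + project_usages[res]['total']:
            overs.append(res)
        elif 0 <= user_quotas[res] < delta + user_usages[res]['total']:
            overs.append(res)
    return overs
```

The project-level check runs first; a resource over both limits is reported once.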
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|')'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|quota_usage_refresh
name|'def'
name|'quota_usage_refresh'
op|'('
name|'context'
op|','
name|'resources'
op|','
name|'keys'
op|','
name|'until_refresh'
op|','
name|'max_age'
op|','
nl|'\n'
name|'project_id'
op|'='
name|'None'
op|','
name|'user_id'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'project_id'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'project_id'
op|'='
name|'context'
op|'.'
name|'project_id'
newline|'\n'
dedent|''
name|'if'
name|'user_id'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'user_id'
op|'='
name|'context'
op|'.'
name|'user_id'
newline|'\n'
nl|'\n'
comment|'# Get the current usages'
nl|'\n'
dedent|''
name|'project_usages'
op|','
name|'user_usages'
op|'='
name|'_get_project_user_quota_usages'
op|'('
nl|'\n'
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
newline|'\n'
nl|'\n'
comment|'# Force refresh of the usages'
nl|'\n'
name|'_refresh_quota_usages_if_needed'
op|'('
name|'user_usages'
op|','
name|'context'
op|','
name|'resources'
op|','
name|'keys'
op|','
nl|'\n'
name|'project_id'
op|','
name|'user_id'
op|','
name|'until_refresh'
op|','
nl|'\n'
name|'max_age'
op|','
name|'force_refresh'
op|'='
name|'True'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|')'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|quota_reserve
name|'def'
name|'quota_reserve'
op|'('
name|'context'
op|','
name|'resources'
op|','
name|'project_quotas'
op|','
name|'user_quotas'
op|','
name|'deltas'
op|','
nl|'\n'
name|'expire'
op|','
name|'until_refresh'
op|','
name|'max_age'
op|','
name|'project_id'
op|'='
name|'None'
op|','
nl|'\n'
name|'user_id'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'project_id'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'project_id'
op|'='
name|'context'
op|'.'
name|'project_id'
newline|'\n'
dedent|''
name|'if'
name|'user_id'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'user_id'
op|'='
name|'context'
op|'.'
name|'user_id'
newline|'\n'
nl|'\n'
comment|'# Get the current usages'
nl|'\n'
dedent|''
name|'project_usages'
op|','
name|'user_usages'
op|'='
name|'_get_project_user_quota_usages'
op|'('
nl|'\n'
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
newline|'\n'
nl|'\n'
name|'_refresh_quota_usages_if_needed'
op|'('
name|'user_usages'
op|','
name|'context'
op|','
name|'resources'
op|','
nl|'\n'
name|'deltas'
op|'.'
name|'keys'
op|'('
op|')'
op|','
name|'project_id'
op|','
name|'user_id'
op|','
nl|'\n'
name|'until_refresh'
op|','
name|'max_age'
op|')'
newline|'\n'
nl|'\n'
comment|'# Check for deltas that would go negative'
nl|'\n'
name|'unders'
op|'='
op|'['
name|'res'
name|'for'
name|'res'
op|','
name|'delta'
name|'in'
name|'deltas'
op|'.'
name|'items'
op|'('
op|')'
nl|'\n'
name|'if'
name|'delta'
op|'<'
number|'0'
name|'and'
nl|'\n'
name|'delta'
op|'+'
name|'user_usages'
op|'['
name|'res'
op|']'
op|'.'
name|'in_use'
op|'<'
number|'0'
op|']'
newline|'\n'
nl|'\n'
comment|"# Now, let's check the quotas"
nl|'\n'
comment|"# NOTE(Vek): We're only concerned about positive increments."
nl|'\n'
comment|'# If a project has gone over quota, we want them to'
nl|'\n'
comment|'# be able to reduce their usage without any'
nl|'\n'
comment|'# problems.'
nl|'\n'
name|'for'
name|'key'
op|','
name|'value'
name|'in'
name|'user_usages'
op|'.'
name|'items'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'key'
name|'not'
name|'in'
name|'project_usages'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'debug'
op|'('
string|'\'Copying QuotaUsage for resource "%(key)s" from \''
nl|'\n'
string|"'user_usages into project_usages: %(value)s'"
op|','
nl|'\n'
op|'{'
string|"'key'"
op|':'
name|'key'
op|','
string|"'value'"
op|':'
name|'dict'
op|'('
name|'value'
op|')'
op|'}'
op|')'
newline|'\n'
name|'project_usages'
op|'['
name|'key'
op|']'
op|'='
name|'value'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'overs'
op|'='
name|'_calculate_overquota'
op|'('
name|'project_quotas'
op|','
name|'user_quotas'
op|','
name|'deltas'
op|','
nl|'\n'
name|'project_usages'
op|','
name|'user_usages'
op|')'
newline|'\n'
nl|'\n'
comment|'# NOTE(Vek): The quota check needs to be in the transaction,'
nl|'\n'
comment|"# but the transaction doesn't fail just because"
nl|'\n'
comment|"# we're over quota, so the OverQuota raise is"
nl|'\n'
comment|'# outside the transaction. If we did the raise'
nl|'\n'
comment|'# here, our usage updates would be discarded, but'
nl|'\n'
comment|"# they're not invalidated by being over-quota."
nl|'\n'
nl|'\n'
comment|'# Create the reservations'
nl|'\n'
name|'if'
name|'not'
name|'overs'
op|':'
newline|'\n'
indent|' '
name|'reservations'
op|'='
op|'['
op|']'
newline|'\n'
name|'for'
name|'res'
op|','
name|'delta'
name|'in'
name|'deltas'
op|'.'
name|'items'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'reservation'
op|'='
name|'_reservation_create'
op|'('
nl|'\n'
name|'str'
op|'('
name|'uuid'
op|'.'
name|'uuid4'
op|'('
op|')'
op|')'
op|','
nl|'\n'
name|'user_usages'
op|'['
name|'res'
op|']'
op|','
nl|'\n'
name|'project_id'
op|','
nl|'\n'
name|'user_id'
op|','
nl|'\n'
name|'res'
op|','
name|'delta'
op|','
name|'expire'
op|','
nl|'\n'
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
name|'reservations'
op|'.'
name|'append'
op|'('
name|'reservation'
op|'.'
name|'uuid'
op|')'
newline|'\n'
nl|'\n'
comment|'# Also update the reserved quantity'
nl|'\n'
comment|'# NOTE(Vek): Again, we are only concerned here about'
nl|'\n'
comment|"# positive increments. Here, though, we're"
nl|'\n'
comment|'# worried about the following scenario:'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# 1) User initiates resize down.'
nl|'\n'
comment|'# 2) User allocates a new instance.'
nl|'\n'
comment|'# 3) Resize down fails or is reverted.'
nl|'\n'
comment|'# 4) User is now over quota.'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# To prevent this, we only update the'
nl|'\n'
comment|'# reserved value if the delta is positive.'
nl|'\n'
name|'if'
name|'delta'
op|'>'
number|'0'
op|':'
newline|'\n'
indent|' '
name|'user_usages'
op|'['
name|'res'
op|']'
op|'.'
name|'reserved'
op|'+='
name|'delta'
newline|'\n'
nl|'\n'
comment|'# Apply updates to the usages table'
nl|'\n'
dedent|''
dedent|''
dedent|''
name|'for'
name|'usage_ref'
name|'in'
name|'user_usages'
op|'.'
name|'values'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'context'
op|'.'
name|'session'
op|'.'
name|'add'
op|'('
name|'usage_ref'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'unders'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'warning'
op|'('
name|'_LW'
op|'('
string|'"Change will make usage less than 0 for the following "'
nl|'\n'
string|'"resources: %s"'
op|')'
op|','
name|'unders'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'overs'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'project_quotas'
op|'=='
name|'user_quotas'
op|':'
newline|'\n'
indent|' '
name|'usages'
op|'='
name|'project_usages'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
comment|'# NOTE(mriedem): user_usages is a dict of resource keys to'
nl|'\n'
comment|"# QuotaUsage sqlalchemy dict-like objects and doesn't log well"
nl|'\n'
comment|'# so convert the user_usages values to something useful for'
nl|'\n'
comment|'# logging. Remove this if we ever change how'
nl|'\n'
comment|'# _get_project_user_quota_usages returns the user_usages values.'
nl|'\n'
indent|' '
name|'user_usages'
op|'='
op|'{'
name|'k'
op|':'
name|'dict'
op|'('
name|'in_use'
op|'='
name|'v'
op|'['
string|"'in_use'"
op|']'
op|','
name|'reserved'
op|'='
name|'v'
op|'['
string|"'reserved'"
op|']'
op|','
nl|'\n'
name|'total'
op|'='
name|'v'
op|'['
string|"'total'"
op|']'
op|')'
nl|'\n'
name|'for'
name|'k'
op|','
name|'v'
name|'in'
name|'user_usages'
op|'.'
name|'items'
op|'('
op|')'
op|'}'
newline|'\n'
name|'usages'
op|'='
name|'user_usages'
newline|'\n'
dedent|''
name|'usages'
op|'='
op|'{'
name|'k'
op|':'
name|'dict'
op|'('
name|'in_use'
op|'='
name|'v'
op|'['
string|"'in_use'"
op|']'
op|','
name|'reserved'
op|'='
name|'v'
op|'['
string|"'reserved'"
op|']'
op|')'
nl|'\n'
name|'for'
name|'k'
op|','
name|'v'
name|'in'
name|'usages'
op|'.'
name|'items'
op|'('
op|')'
op|'}'
newline|'\n'
name|'LOG'
op|'.'
name|'debug'
op|'('
string|"'Raise OverQuota exception because: '"
nl|'\n'
string|"'project_quotas: %(project_quotas)s, '"
nl|'\n'
string|"'user_quotas: %(user_quotas)s, deltas: %(deltas)s, '"
nl|'\n'
string|"'overs: %(overs)s, project_usages: %(project_usages)s, '"
nl|'\n'
string|"'user_usages: %(user_usages)s'"
op|','
nl|'\n'
op|'{'
string|"'project_quotas'"
op|':'
name|'project_quotas'
op|','
nl|'\n'
string|"'user_quotas'"
op|':'
name|'user_quotas'
op|','
nl|'\n'
string|"'overs'"
op|':'
name|'overs'
op|','
string|"'deltas'"
op|':'
name|'deltas'
op|','
nl|'\n'
string|"'project_usages'"
op|':'
name|'project_usages'
op|','
nl|'\n'
string|"'user_usages'"
op|':'
name|'user_usages'
op|'}'
op|')'
newline|'\n'
name|'raise'
name|'exception'
op|'.'
name|'OverQuota'
op|'('
name|'overs'
op|'='
name|'sorted'
op|'('
name|'overs'
op|')'
op|','
name|'quotas'
op|'='
name|'user_quotas'
op|','
nl|'\n'
name|'usages'
op|'='
name|'usages'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'reservations'
newline|'\n'
nl|'\n'
nl|'\n'
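Alongside the over-quota check, quota_reserve flags "unders": negative deltas that would drive a resource's `in_use` below zero. The real code only warns about these and still applies them; it does not reject the reservation. A sketch of that list comprehension, with plain `{'in_use': ...}` dicts standing in for the QuotaUsage rows:

```python
def compute_unders(deltas, user_usages):
    """Sketch of the 'unders' check in quota_reserve: collect every
    resource whose negative delta would push in_use below zero."""
    return sorted(res for res, delta in deltas.items()
                  if delta < 0 and delta + user_usages[res]['in_use'] < 0)
```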
DECL|function|_quota_reservations_query
dedent|''
name|'def'
name|'_quota_reservations_query'
op|'('
name|'context'
op|','
name|'reservations'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Return the relevant reservations."""'
newline|'\n'
nl|'\n'
comment|'# Get the listed reservations'
nl|'\n'
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Reservation'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Reservation'
op|'.'
name|'uuid'
op|'.'
name|'in_'
op|'('
name|'reservations'
op|')'
op|')'
op|'.'
name|'with_lockmode'
op|'('
string|"'update'"
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|')'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|reservation_commit
name|'def'
name|'reservation_commit'
op|'('
name|'context'
op|','
name|'reservations'
op|','
name|'project_id'
op|'='
name|'None'
op|','
name|'user_id'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'_project_usages'
op|','
name|'user_usages'
op|'='
name|'_get_project_user_quota_usages'
op|'('
nl|'\n'
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
newline|'\n'
name|'reservation_query'
op|'='
name|'_quota_reservations_query'
op|'('
name|'context'
op|','
name|'reservations'
op|')'
newline|'\n'
name|'for'
name|'reservation'
name|'in'
name|'reservation_query'
op|'.'
name|'all'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'usage'
op|'='
name|'user_usages'
op|'['
name|'reservation'
op|'.'
name|'resource'
op|']'
newline|'\n'
name|'if'
name|'reservation'
op|'.'
name|'delta'
op|'>='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'usage'
op|'.'
name|'reserved'
op|'-='
name|'reservation'
op|'.'
name|'delta'
newline|'\n'
dedent|''
name|'usage'
op|'.'
name|'in_use'
op|'+='
name|'reservation'
op|'.'
name|'delta'
newline|'\n'
dedent|''
name|'reservation_query'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|')'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|reservation_rollback
name|'def'
name|'reservation_rollback'
op|'('
name|'context'
op|','
name|'reservations'
op|','
name|'project_id'
op|'='
name|'None'
op|','
name|'user_id'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'_project_usages'
op|','
name|'user_usages'
op|'='
name|'_get_project_user_quota_usages'
op|'('
nl|'\n'
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
newline|'\n'
name|'reservation_query'
op|'='
name|'_quota_reservations_query'
op|'('
name|'context'
op|','
name|'reservations'
op|')'
newline|'\n'
name|'for'
name|'reservation'
name|'in'
name|'reservation_query'
op|'.'
name|'all'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'usage'
op|'='
name|'user_usages'
op|'['
name|'reservation'
op|'.'
name|'resource'
op|']'
newline|'\n'
name|'if'
name|'reservation'
op|'.'
name|'delta'
op|'>='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'usage'
op|'.'
name|'reserved'
op|'-='
name|'reservation'
op|'.'
name|'delta'
newline|'\n'
dedent|''
dedent|''
name|'reservation_query'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
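Taken together, quota_reserve, reservation_commit, and reservation_rollback implement a two-phase bookkeeping scheme on the `reserved` and `in_use` counters: only positive deltas bump `reserved` at reserve time (so a failed resize-down cannot strand the user over quota), commit releases the reservation and folds the delta into `in_use`, and rollback only undoes the reservation. A minimal in-memory sketch of that lifecycle (`FakeUsage` is an illustrative stand-in for a QuotaUsage row; the DB queries and soft-deletes are omitted):

```python
class FakeUsage(object):
    """Illustrative stand-in for a QuotaUsage row."""
    def __init__(self, in_use=0, reserved=0):
        self.in_use = in_use
        self.reserved = reserved


def reserve(usage, delta):
    # Mirrors quota_reserve: only positive deltas are held in
    # 'reserved'; negative deltas wait until commit.
    if delta > 0:
        usage.reserved += delta


def commit(usage, delta):
    # Mirrors reservation_commit: release the hold for non-negative
    # deltas, then apply the delta to actual usage.
    if delta >= 0:
        usage.reserved -= delta
    usage.in_use += delta


def rollback(usage, delta):
    # Mirrors reservation_rollback: only the hold is undone; in_use
    # is untouched because the resource was never consumed.
    if delta >= 0:
        usage.reserved -= delta
```

reservation_expire follows the same rule as rollback, applied to any reservation whose `expire` timestamp has passed.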
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|quota_destroy_all_by_project_and_user
name|'def'
name|'quota_destroy_all_by_project_and_user'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'ProjectUserQuota'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'user_id'
op|'='
name|'user_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'QuotaUsage'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'user_id'
op|'='
name|'user_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Reservation'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'user_id'
op|'='
name|'user_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|quota_destroy_all_by_project
name|'def'
name|'quota_destroy_all_by_project'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Quota'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'ProjectUserQuota'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'QuotaUsage'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Reservation'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|')'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|reservation_expire
name|'def'
name|'reservation_expire'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'current_time'
op|'='
name|'timeutils'
op|'.'
name|'utcnow'
op|'('
op|')'
newline|'\n'
name|'reservation_query'
op|'='
name|'model_query'
op|'('
nl|'\n'
name|'context'
op|','
name|'models'
op|'.'
name|'Reservation'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Reservation'
op|'.'
name|'expire'
op|'<'
name|'current_time'
op|')'
newline|'\n'
nl|'\n'
name|'for'
name|'reservation'
name|'in'
name|'reservation_query'
op|'.'
name|'join'
op|'('
name|'models'
op|'.'
name|'QuotaUsage'
op|')'
op|'.'
name|'all'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'reservation'
op|'.'
name|'delta'
op|'>='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'reservation'
op|'.'
name|'usage'
op|'.'
name|'reserved'
op|'-='
name|'reservation'
op|'.'
name|'delta'
newline|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'add'
op|'('
name|'reservation'
op|'.'
name|'usage'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'reservation_query'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
DECL|function|_ec2_volume_get_query
dedent|''
name|'def'
name|'_ec2_volume_get_query'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'VolumeIdMapping'
op|','
name|'read_deleted'
op|'='
string|"'yes'"
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_ec2_snapshot_get_query
dedent|''
name|'def'
name|'_ec2_snapshot_get_query'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'SnapshotIdMapping'
op|','
name|'read_deleted'
op|'='
string|"'yes'"
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|ec2_volume_create
name|'def'
name|'ec2_volume_create'
op|'('
name|'context'
op|','
name|'volume_uuid'
op|','
name|'id'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Create ec2 compatible volume by provided uuid."""'
newline|'\n'
name|'ec2_volume_ref'
op|'='
name|'models'
op|'.'
name|'VolumeIdMapping'
op|'('
op|')'
newline|'\n'
name|'ec2_volume_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|"'uuid'"
op|':'
name|'volume_uuid'
op|'}'
op|')'
newline|'\n'
name|'if'
name|'id'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'ec2_volume_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|"'id'"
op|':'
name|'id'
op|'}'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'ec2_volume_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
nl|'\n'
name|'return'
name|'ec2_volume_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|ec2_volume_get_by_uuid
name|'def'
name|'ec2_volume_get_by_uuid'
op|'('
name|'context'
op|','
name|'volume_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'_ec2_volume_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'uuid'
op|'='
name|'volume_uuid'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'VolumeNotFound'
op|'('
name|'volume_id'
op|'='
name|'volume_uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|ec2_volume_get_by_id
name|'def'
name|'ec2_volume_get_by_id'
op|'('
name|'context'
op|','
name|'volume_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'_ec2_volume_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'volume_id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'VolumeNotFound'
op|'('
name|'volume_id'
op|'='
name|'volume_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|ec2_snapshot_create
name|'def'
name|'ec2_snapshot_create'
op|'('
name|'context'
op|','
name|'snapshot_uuid'
op|','
name|'id'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Create ec2 compatible snapshot by provided uuid."""'
newline|'\n'
name|'ec2_snapshot_ref'
op|'='
name|'models'
op|'.'
name|'SnapshotIdMapping'
op|'('
op|')'
newline|'\n'
name|'ec2_snapshot_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|"'uuid'"
op|':'
name|'snapshot_uuid'
op|'}'
op|')'
newline|'\n'
name|'if'
name|'id'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'ec2_snapshot_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|"'id'"
op|':'
name|'id'
op|'}'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'ec2_snapshot_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
nl|'\n'
name|'return'
name|'ec2_snapshot_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|ec2_snapshot_get_by_ec2_id
name|'def'
name|'ec2_snapshot_get_by_ec2_id'
op|'('
name|'context'
op|','
name|'ec2_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'_ec2_snapshot_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'ec2_id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'SnapshotNotFound'
op|'('
name|'snapshot_id'
op|'='
name|'ec2_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|ec2_snapshot_get_by_uuid
name|'def'
name|'ec2_snapshot_get_by_uuid'
op|'('
name|'context'
op|','
name|'snapshot_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'_ec2_snapshot_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'uuid'
op|'='
name|'snapshot_uuid'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'SnapshotNotFound'
op|'('
name|'snapshot_id'
op|'='
name|'snapshot_uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
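Both ec2_snapshot getters above share one shape: run the filtered query, and convert an empty result into a typed NotFound exception instead of returning None. A minimal stand-in sketch of that lookup-or-raise pattern (the list-backed store and the local SnapshotNotFound class are illustrative, not Nova's real models or exceptions):

```python
# Illustrative lookup-or-raise helper in the style of the getters above.
class SnapshotNotFound(Exception):
    def __init__(self, snapshot_id):
        super().__init__('Snapshot %s could not be found.' % snapshot_id)

_snapshots = [{'id': 1, 'uuid': 'abc'}]   # stand-in for the snapshot table

def ec2_snapshot_get_by_uuid(snapshot_uuid):
    result = next((s for s in _snapshots if s['uuid'] == snapshot_uuid), None)
    if not result:
        # Callers get a typed exception rather than having to check for None.
        raise SnapshotNotFound(snapshot_id=snapshot_uuid)
    return result

print(ec2_snapshot_get_by_uuid('abc')['id'])  # 1
```

The typed exception carries the identifier that failed, so callers can surface a precise error without re-deriving it.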
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
DECL|function|_block_device_mapping_get_query
dedent|''
name|'def'
name|'_block_device_mapping_get_query'
op|'('
name|'context'
op|','
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'columns_to_join'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'columns_to_join'
op|'='
op|'['
op|']'
newline|'\n'
nl|'\n'
dedent|''
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'BlockDeviceMapping'
op|')'
newline|'\n'
nl|'\n'
name|'for'
name|'column'
name|'in'
name|'columns_to_join'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
name|'column'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'query'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_scrub_empty_str_values
dedent|''
name|'def'
name|'_scrub_empty_str_values'
op|'('
name|'dct'
op|','
name|'keys_to_scrub'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Remove any keys found in sequence keys_to_scrub from the dict\n if they have the value \'\'.\n """'
newline|'\n'
name|'for'
name|'key'
name|'in'
name|'keys_to_scrub'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'key'
name|'in'
name|'dct'
name|'and'
name|'dct'
op|'['
name|'key'
op|']'
op|'=='
string|"''"
op|':'
newline|'\n'
indent|' '
name|'del'
name|'dct'
op|'['
name|'key'
op|']'
newline|'\n'
nl|'\n'
nl|'\n'
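_scrub_empty_str_values deletes listed keys only when their value is exactly the empty string, so other falsy values such as 0 or None pass through. A standalone, runnable version of the same helper (plain Python, no Nova imports):

```python
def scrub_empty_str_values(dct, keys_to_scrub):
    """Remove any key in keys_to_scrub from dct if its value is ''."""
    for key in keys_to_scrub:
        # Only '' is scrubbed; 0, None and other falsy values survive.
        if key in dct and dct[key] == '':
            del dct[key]


bdm = {'volume_size': '', 'device_name': '/dev/vda'}
scrub_empty_str_values(bdm, ['volume_size'])
print(bdm)  # {'device_name': '/dev/vda'}
```

The strict `== ''` comparison matters: a legitimate `volume_size` of 0 must not be dropped before the values reach the model.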
DECL|function|_from_legacy_values
dedent|''
dedent|''
dedent|''
name|'def'
name|'_from_legacy_values'
op|'('
name|'values'
op|','
name|'legacy'
op|','
name|'allow_updates'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'legacy'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'allow_updates'
name|'and'
name|'block_device'
op|'.'
name|'is_safe_for_update'
op|'('
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'values'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'block_device'
op|'.'
name|'BlockDeviceDict'
op|'.'
name|'from_legacy'
op|'('
name|'values'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'values'
newline|'\n'
nl|'\n'
nl|'\n'
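_from_legacy_values passes values through untouched when legacy is False, and otherwise converts them via BlockDeviceDict.from_legacy unless the update is flagged safe. A minimal stand-in sketch of that branching; the converter and safety check below are hypothetical stubs, not Nova's real nova.block_device helpers:

```python
# Hypothetical stand-ins for block_device helpers, for illustration only.
SAFE_UPDATE_KEYS = {'volume_size', 'delete_on_termination'}

def is_safe_for_update(values):
    # Treat an update as "safe" if it only touches keys that mean the same
    # thing in the legacy and new formats.
    return set(values) <= SAFE_UPDATE_KEYS

def from_legacy(values):
    # Toy conversion: rename the legacy 'size' key to 'volume_size'.
    converted = dict(values)
    if 'size' in converted:
        converted['volume_size'] = converted.pop('size')
    return converted

def from_legacy_values(values, legacy, allow_updates=False):
    if not legacy:
        return values
    if allow_updates and is_safe_for_update(values):
        return values
    return from_legacy(values)

print(from_legacy_values({'size': 10}, legacy=True))  # {'volume_size': 10}
```

Create paths call this with allow_updates left False, so legacy input is always converted; update paths pass allow_updates=True so partial updates that are already unambiguous skip the conversion.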
dedent|''
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|block_device_mapping_create
name|'def'
name|'block_device_mapping_create'
op|'('
name|'context'
op|','
name|'values'
op|','
name|'legacy'
op|'='
name|'True'
op|')'
op|':'
newline|'\n'
indent|' '
name|'_scrub_empty_str_values'
op|'('
name|'values'
op|','
op|'['
string|"'volume_size'"
op|']'
op|')'
newline|'\n'
name|'values'
op|'='
name|'_from_legacy_values'
op|'('
name|'values'
op|','
name|'legacy'
op|')'
newline|'\n'
name|'convert_objects_related_datetimes'
op|'('
name|'values'
op|')'
newline|'\n'
nl|'\n'
name|'bdm_ref'
op|'='
name|'models'
op|'.'
name|'BlockDeviceMapping'
op|'('
op|')'
newline|'\n'
name|'bdm_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'bdm_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
name|'return'
name|'bdm_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|block_device_mapping_update
name|'def'
name|'block_device_mapping_update'
op|'('
name|'context'
op|','
name|'bdm_id'
op|','
name|'values'
op|','
name|'legacy'
op|'='
name|'True'
op|')'
op|':'
newline|'\n'
indent|' '
name|'_scrub_empty_str_values'
op|'('
name|'values'
op|','
op|'['
string|"'volume_size'"
op|']'
op|')'
newline|'\n'
name|'values'
op|'='
name|'_from_legacy_values'
op|'('
name|'values'
op|','
name|'legacy'
op|','
name|'allow_updates'
op|'='
name|'True'
op|')'
newline|'\n'
name|'convert_objects_related_datetimes'
op|'('
name|'values'
op|')'
newline|'\n'
nl|'\n'
name|'query'
op|'='
name|'_block_device_mapping_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'bdm_id'
op|')'
newline|'\n'
name|'query'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'return'
name|'query'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|block_device_mapping_update_or_create
name|'def'
name|'block_device_mapping_update_or_create'
op|'('
name|'context'
op|','
name|'values'
op|','
name|'legacy'
op|'='
name|'True'
op|')'
op|':'
newline|'\n'
indent|' '
name|'_scrub_empty_str_values'
op|'('
name|'values'
op|','
op|'['
string|"'volume_size'"
op|']'
op|')'
newline|'\n'
name|'values'
op|'='
name|'_from_legacy_values'
op|'('
name|'values'
op|','
name|'legacy'
op|','
name|'allow_updates'
op|'='
name|'True'
op|')'
newline|'\n'
name|'convert_objects_related_datetimes'
op|'('
name|'values'
op|')'
newline|'\n'
nl|'\n'
name|'result'
op|'='
name|'None'
newline|'\n'
comment|'# NOTE(xqueralt): Only update a BDM when device_name was provided. We'
nl|'\n'
comment|'# allow empty device names so they will be set later by the manager.'
nl|'\n'
name|'if'
name|'values'
op|'['
string|"'device_name'"
op|']'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'_block_device_mapping_get_query'
op|'('
name|'context'
op|')'
newline|'\n'
name|'result'
op|'='
name|'query'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'values'
op|'['
string|"'instance_uuid'"
op|']'
op|','
nl|'\n'
name|'device_name'
op|'='
name|'values'
op|'['
string|"'device_name'"
op|']'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
comment|"# Either the device_name doesn't exist in the database yet, or no"
nl|'\n'
comment|'# device_name was provided. Both cases mean creating a new BDM.'
nl|'\n'
indent|' '
name|'result'
op|'='
name|'models'
op|'.'
name|'BlockDeviceMapping'
op|'('
op|'**'
name|'values'
op|')'
newline|'\n'
name|'result'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
nl|'\n'
comment|'# NOTE(xqueralt): Prevent from having multiple swap devices for the'
nl|'\n'
comment|'# same instance. This will delete all the existing ones.'
nl|'\n'
dedent|''
name|'if'
name|'block_device'
op|'.'
name|'new_format_is_swap'
op|'('
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'_block_device_mapping_get_query'
op|'('
name|'context'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'values'
op|'['
string|"'instance_uuid'"
op|']'
op|','
nl|'\n'
name|'source_type'
op|'='
string|"'blank'"
op|','
name|'guest_format'
op|'='
string|"'swap'"
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'BlockDeviceMapping'
op|'.'
name|'id'
op|'!='
name|'result'
op|'.'
name|'id'
op|')'
newline|'\n'
name|'query'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
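block_device_mapping_update_or_create is an upsert keyed on (instance_uuid, device_name): an existing row is updated in place, and a missing or empty device_name always creates a new mapping. The control flow can be sketched against a plain in-memory list (the store and row shape below are illustrative, not Nova's models):

```python
# Illustrative in-memory upsert mirroring the update-or-create flow above.
_bdms = []          # stand-in for the block_device_mapping table
_next_id = [1]

def bdm_update_or_create(values):
    result = None
    # Only look up an existing row when a device_name was provided; empty
    # device names always create a new mapping (the manager names it later).
    if values.get('device_name'):
        result = next((b for b in _bdms
                       if b['instance_uuid'] == values['instance_uuid']
                       and b['device_name'] == values['device_name']), None)
    if result:
        result.update(values)
    else:
        result = dict(values, id=_next_id[0])
        _next_id[0] += 1
        _bdms.append(result)
    return result

row = bdm_update_or_create({'instance_uuid': 'u1', 'device_name': '/dev/vda',
                            'volume_size': 10})
row2 = bdm_update_or_create({'instance_uuid': 'u1', 'device_name': '/dev/vda',
                             'volume_size': 20})
print(row2['id'] == row['id'], row['volume_size'])  # True 20
```

The real function adds one more step this sketch omits: when the new values describe a swap device, every other blank/swap mapping for the same instance is soft-deleted so at most one swap device survives.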
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader_allow_async'
newline|'\n'
DECL|function|block_device_mapping_get_all_by_instance_uuids
name|'def'
name|'block_device_mapping_get_all_by_instance_uuids'
op|'('
name|'context'
op|','
name|'instance_uuids'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'not'
name|'instance_uuids'
op|':'
newline|'\n'
indent|' '
name|'return'
op|'['
op|']'
newline|'\n'
dedent|''
name|'return'
name|'_block_device_mapping_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter'
op|'('
nl|'\n'
name|'models'
op|'.'
name|'BlockDeviceMapping'
op|'.'
name|'instance_uuid'
op|'.'
name|'in_'
op|'('
name|'instance_uuids'
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader_allow_async'
newline|'\n'
DECL|function|block_device_mapping_get_all_by_instance
name|'def'
name|'block_device_mapping_get_all_by_instance'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_block_device_mapping_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|block_device_mapping_get_all_by_volume_id
name|'def'
name|'block_device_mapping_get_all_by_volume_id'
op|'('
name|'context'
op|','
name|'volume_id'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_block_device_mapping_get_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'columns_to_join'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'volume_id'
op|'='
name|'volume_id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|block_device_mapping_get_by_instance_and_volume_id
name|'def'
name|'block_device_mapping_get_by_instance_and_volume_id'
op|'('
name|'context'
op|','
name|'volume_id'
op|','
nl|'\n'
name|'instance_uuid'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_block_device_mapping_get_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'columns_to_join'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'volume_id'
op|'='
name|'volume_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|block_device_mapping_destroy
name|'def'
name|'block_device_mapping_destroy'
op|'('
name|'context'
op|','
name|'bdm_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'_block_device_mapping_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'bdm_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|block_device_mapping_destroy_by_instance_and_volume
name|'def'
name|'block_device_mapping_destroy_by_instance_and_volume'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
nl|'\n'
name|'volume_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'_block_device_mapping_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'volume_id'
op|'='
name|'volume_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|block_device_mapping_destroy_by_instance_and_device
name|'def'
name|'block_device_mapping_destroy_by_instance_and_device'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
nl|'\n'
name|'device_name'
op|')'
op|':'
newline|'\n'
indent|' '
name|'_block_device_mapping_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'device_name'
op|'='
name|'device_name'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|security_group_create
name|'def'
name|'security_group_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'security_group_ref'
op|'='
name|'models'
op|'.'
name|'SecurityGroup'
op|'('
op|')'
newline|'\n'
comment|'# FIXME(devcamcar): Unless the rules relationship is touched here, save()'
nl|'\n'
comment|'# fails with a lazy load exception. This will get cleaned up in the next'
nl|'\n'
comment|'# ORM pass.'
nl|'\n'
name|'security_group_ref'
op|'.'
name|'rules'
newline|'\n'
name|'security_group_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'with'
name|'main_context_manager'
op|'.'
name|'writer'
op|'.'
name|'savepoint'
op|'.'
name|'using'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'security_group_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'SecurityGroupExists'
op|'('
nl|'\n'
name|'project_id'
op|'='
name|'values'
op|'['
string|"'project_id'"
op|']'
op|','
nl|'\n'
name|'security_group_name'
op|'='
name|'values'
op|'['
string|"'name'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'security_group_ref'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_security_group_get_query
dedent|''
name|'def'
name|'_security_group_get_query'
op|'('
name|'context'
op|','
name|'read_deleted'
op|'='
name|'None'
op|','
nl|'\n'
name|'project_only'
op|'='
name|'False'
op|','
name|'join_rules'
op|'='
name|'True'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'SecurityGroup'
op|','
nl|'\n'
name|'read_deleted'
op|'='
name|'read_deleted'
op|','
name|'project_only'
op|'='
name|'project_only'
op|')'
newline|'\n'
name|'if'
name|'join_rules'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'joinedload_all'
op|'('
string|"'rules.grantee_group'"
op|')'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'query'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_security_group_get_by_names
dedent|''
name|'def'
name|'_security_group_get_by_names'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'group_names'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Get security group models for a project by a list of names.\n Raise SecurityGroupNotFoundForProject for a name not found.\n """'
newline|'\n'
name|'query'
op|'='
name|'_security_group_get_query'
op|'('
name|'context'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|','
nl|'\n'
name|'join_rules'
op|'='
name|'False'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'SecurityGroup'
op|'.'
name|'name'
op|'.'
name|'in_'
op|'('
name|'group_names'
op|')'
op|')'
newline|'\n'
name|'sg_models'
op|'='
name|'query'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'if'
name|'len'
op|'('
name|'sg_models'
op|')'
op|'=='
name|'len'
op|'('
name|'group_names'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'sg_models'
newline|'\n'
comment|'# Find the first one missing and raise'
nl|'\n'
dedent|''
name|'group_names_from_models'
op|'='
op|'['
name|'x'
op|'.'
name|'name'
name|'for'
name|'x'
name|'in'
name|'sg_models'
op|']'
newline|'\n'
name|'for'
name|'group_name'
name|'in'
name|'group_names'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'group_name'
name|'not'
name|'in'
name|'group_names_from_models'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'SecurityGroupNotFoundForProject'
op|'('
nl|'\n'
name|'project_id'
op|'='
name|'project_id'
op|','
name|'security_group_id'
op|'='
name|'group_name'
op|')'
newline|'\n'
comment|'# Not Reached'
nl|'\n'
nl|'\n'
nl|'\n'
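_security_group_get_by_names returns the models only when every requested name matched a row; otherwise it walks the requested names to find the first one missing and raises for it, which is why the trailing path is marked "Not Reached". A sketch of that all-or-raise lookup (hypothetical exception class, list-backed store):

```python
# Sketch of the all-found-or-raise lookup used for security group names.
class SecurityGroupNotFoundForProject(Exception):
    def __init__(self, project_id, security_group_id):
        super().__init__('Group %s not found for project %s'
                         % (security_group_id, project_id))

_sg_rows = [{'project_id': 'p1', 'name': 'default'},
            {'project_id': 'p1', 'name': 'web'}]

def security_group_get_by_names(project_id, group_names):
    models = [r for r in _sg_rows
              if r['project_id'] == project_id and r['name'] in group_names]
    if len(models) == len(group_names):
        return models
    # At least one name had no row: report the first missing one.
    found = {r['name'] for r in models}
    for name in group_names:
        if name not in found:
            raise SecurityGroupNotFoundForProject(
                project_id=project_id, security_group_id=name)

print([r['name'] for r in security_group_get_by_names('p1', ['web'])])
# ['web']
```

Comparing lengths is enough to detect a miss here because names are unique per project, so the query can never return more rows than names requested.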
dedent|''
dedent|''
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|security_group_get_all
name|'def'
name|'security_group_get_all'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_security_group_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|security_group_get
name|'def'
name|'security_group_get'
op|'('
name|'context'
op|','
name|'security_group_id'
op|','
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'_security_group_get_query'
op|'('
name|'context'
op|','
name|'project_only'
op|'='
name|'True'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'security_group_id'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'columns_to_join'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'columns_to_join'
op|'='
op|'['
op|']'
newline|'\n'
dedent|''
name|'for'
name|'column'
name|'in'
name|'columns_to_join'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'column'
op|'.'
name|'startswith'
op|'('
string|"'instances'"
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'joinedload_all'
op|'('
name|'column'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'result'
op|'='
name|'query'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'SecurityGroupNotFound'
op|'('
nl|'\n'
name|'security_group_id'
op|'='
name|'security_group_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|security_group_get_by_name
name|'def'
name|'security_group_get_by_name'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'group_name'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'_security_group_get_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|'"no"'
op|','
name|'join_rules'
op|'='
name|'False'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'name'
op|'='
name|'group_name'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'columns_to_join'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'columns_to_join'
op|'='
op|'['
string|"'instances'"
op|','
string|"'rules.grantee_group'"
op|']'
newline|'\n'
nl|'\n'
dedent|''
name|'for'
name|'column'
name|'in'
name|'columns_to_join'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'joinedload_all'
op|'('
name|'column'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'result'
op|'='
name|'query'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'SecurityGroupNotFoundForProject'
op|'('
nl|'\n'
name|'project_id'
op|'='
name|'project_id'
op|','
name|'security_group_id'
op|'='
name|'group_name'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|security_group_get_by_project
name|'def'
name|'security_group_get_by_project'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_security_group_get_query'
op|'('
name|'context'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|security_group_get_by_instance
name|'def'
name|'security_group_get_by_instance'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_security_group_get_query'
op|'('
name|'context'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'join'
op|'('
name|'models'
op|'.'
name|'SecurityGroup'
op|'.'
name|'instances'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|security_group_in_use
name|'def'
name|'security_group_in_use'
op|'('
name|'context'
op|','
name|'group_id'
op|')'
op|':'
newline|'\n'
comment|"# Are there any instances that haven't been deleted"
nl|'\n'
comment|'# that include this group?'
nl|'\n'
indent|' '
name|'inst_assoc'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'SecurityGroupInstanceAssociation'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'security_group_id'
op|'='
name|'group_id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'for'
name|'ia'
name|'in'
name|'inst_assoc'
op|':'
newline|'\n'
indent|' '
name|'num_instances'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Instance'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'uuid'
op|'='
name|'ia'
op|'.'
name|'instance_uuid'
op|')'
op|'.'
name|'count'
op|'('
op|')'
newline|'\n'
name|'if'
name|'num_instances'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'True'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'return'
name|'False'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|security_group_update
name|'def'
name|'security_group_update'
op|'('
name|'context'
op|','
name|'security_group_id'
op|','
name|'values'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'SecurityGroup'
op|')'
op|'.'
name|'filter_by'
op|'('
nl|'\n'
name|'id'
op|'='
name|'security_group_id'
op|')'
newline|'\n'
name|'if'
name|'columns_to_join'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'column'
name|'in'
name|'columns_to_join'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'joinedload_all'
op|'('
name|'column'
op|')'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'security_group_ref'
op|'='
name|'query'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'security_group_ref'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'SecurityGroupNotFound'
op|'('
nl|'\n'
name|'security_group_id'
op|'='
name|'security_group_id'
op|')'
newline|'\n'
dedent|''
name|'security_group_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'name'
op|'='
name|'security_group_ref'
op|'['
string|"'name'"
op|']'
newline|'\n'
name|'project_id'
op|'='
name|'security_group_ref'
op|'['
string|"'project_id'"
op|']'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'security_group_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'SecurityGroupExists'
op|'('
nl|'\n'
name|'project_id'
op|'='
name|'project_id'
op|','
nl|'\n'
name|'security_group_name'
op|'='
name|'name'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'security_group_ref'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|security_group_ensure_default
dedent|''
name|'def'
name|'security_group_ensure_default'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Ensure default security group exists for a project_id."""'
newline|'\n'
nl|'\n'
name|'try'
op|':'
newline|'\n'
comment|"# NOTE(rpodolyaka): create the default security group, if it doesn't"
nl|'\n'
comment|'# exist. This must be done in a separate transaction, so that'
nl|'\n'
comment|'# this one is not aborted in case a concurrent one succeeds first'
nl|'\n'
comment|'# and the unique constraint for security group names is violated'
nl|'\n'
comment|'# by a concurrent INSERT'
nl|'\n'
indent|' '
name|'with'
name|'main_context_manager'
op|'.'
name|'writer'
op|'.'
name|'independent'
op|'.'
name|'using'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_security_group_ensure_default'
op|'('
name|'context'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'except'
name|'exception'
op|'.'
name|'SecurityGroupExists'
op|':'
newline|'\n'
comment|'# NOTE(rpodolyaka): a concurrent transaction has succeeded first,'
nl|'\n'
comment|'# suppress the error and proceed'
nl|'\n'
indent|' '
name|'return'
name|'security_group_get_by_name'
op|'('
name|'context'
op|','
name|'context'
op|'.'
name|'project_id'
op|','
nl|'\n'
string|"'default'"
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
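security_group_ensure_default resolves the creation race through the database's unique constraint: the INSERT runs in an independent transaction, and if a concurrent caller wins, the resulting duplicate-entry error is suppressed and the existing row is fetched instead. The shape of that create-or-fetch pattern, with a dict standing in for the table and its unique index (names below are illustrative):

```python
# Sketch of the create-or-fetch race pattern: attempt the insert, and on a
# uniqueness violation fall back to reading the row the concurrent winner made.
class DuplicateEntry(Exception):
    pass

_groups = {}   # (project_id, name) -> row; the dict key plays the unique index

def group_create(project_id, name):
    key = (project_id, name)
    if key in _groups:
        raise DuplicateEntry(key)       # unique constraint violated
    _groups[key] = {'project_id': project_id, 'name': name}
    return _groups[key]

def group_ensure_default(project_id):
    try:
        return group_create(project_id, 'default')
    except DuplicateEntry:
        # A concurrent caller won the race; suppress and return its row.
        return _groups[(project_id, 'default')]

first = group_ensure_default('p1')
second = group_ensure_default('p1')   # hits the duplicate path
print(first is second)  # True
```

Running the insert in a separate (independent) transaction is what keeps the caller's outer transaction alive: only the inner one is aborted by the constraint violation, so the except branch can still read and continue.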
dedent|''
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|_security_group_ensure_default
name|'def'
name|'_security_group_ensure_default'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'try'
op|':'
newline|'\n'
indent|' '
name|'default_group'
op|'='
name|'_security_group_get_by_names'
op|'('
name|'context'
op|','
nl|'\n'
name|'context'
op|'.'
name|'project_id'
op|','
nl|'\n'
op|'['
string|"'default'"
op|']'
op|')'
op|'['
number|'0'
op|']'
newline|'\n'
dedent|''
name|'except'
name|'exception'
op|'.'
name|'NotFound'
op|':'
newline|'\n'
indent|' '
name|'values'
op|'='
op|'{'
string|"'name'"
op|':'
string|"'default'"
op|','
nl|'\n'
string|"'description'"
op|':'
string|"'default'"
op|','
nl|'\n'
string|"'user_id'"
op|':'
name|'context'
op|'.'
name|'user_id'
op|','
nl|'\n'
string|"'project_id'"
op|':'
name|'context'
op|'.'
name|'project_id'
op|'}'
newline|'\n'
name|'default_group'
op|'='
name|'security_group_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
newline|'\n'
name|'usage'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'QuotaUsage'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'context'
op|'.'
name|'project_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'user_id'
op|'='
name|'context'
op|'.'
name|'user_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'resource'
op|'='
string|"'security_groups'"
op|')'
newline|'\n'
comment|'# Create quota usage for auto created default security group'
nl|'\n'
name|'if'
name|'not'
name|'usage'
op|'.'
name|'first'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'_quota_usage_create'
op|'('
name|'context'
op|'.'
name|'project_id'
op|','
nl|'\n'
name|'context'
op|'.'
name|'user_id'
op|','
nl|'\n'
string|"'security_groups'"
op|','
nl|'\n'
number|'1'
op|','
number|'0'
op|','
nl|'\n'
name|'CONF'
op|'.'
name|'until_refresh'
op|','
nl|'\n'
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'usage'
op|'.'
name|'update'
op|'('
op|'{'
string|"'in_use'"
op|':'
name|'int'
op|'('
name|'usage'
op|'.'
name|'first'
op|'('
op|')'
op|'.'
name|'in_use'
op|')'
op|'+'
number|'1'
op|'}'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'default_rules'
op|'='
name|'_security_group_rule_get_default_query'
op|'('
name|'context'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'for'
name|'default_rule'
name|'in'
name|'default_rules'
op|':'
newline|'\n'
comment|'# This is suboptimal; the default_rule values should be'
nl|'\n'
comment|'# determined programmatically rather than enumerated by hand.'
nl|'\n'
indent|' '
name|'rule_values'
op|'='
op|'{'
string|"'protocol'"
op|':'
name|'default_rule'
op|'.'
name|'protocol'
op|','
nl|'\n'
string|"'from_port'"
op|':'
name|'default_rule'
op|'.'
name|'from_port'
op|','
nl|'\n'
string|"'to_port'"
op|':'
name|'default_rule'
op|'.'
name|'to_port'
op|','
nl|'\n'
string|"'cidr'"
op|':'
name|'default_rule'
op|'.'
name|'cidr'
op|','
nl|'\n'
string|"'parent_group_id'"
op|':'
name|'default_group'
op|'.'
name|'id'
op|','
nl|'\n'
op|'}'
newline|'\n'
name|'_security_group_rule_create'
op|'('
name|'context'
op|','
name|'rule_values'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'return'
name|'default_group'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|security_group_destroy
name|'def'
name|'security_group_destroy'
op|'('
name|'context'
op|','
name|'security_group_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'SecurityGroup'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'security_group_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'SecurityGroupInstanceAssociation'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'security_group_id'
op|'='
name|'security_group_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'SecurityGroupIngressRule'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'group_id'
op|'='
name|'security_group_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'SecurityGroupIngressRule'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'parent_group_id'
op|'='
name|'security_group_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_security_group_count_by_project_and_user
dedent|''
name|'def'
name|'_security_group_count_by_project_and_user'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'nova'
op|'.'
name|'context'
op|'.'
name|'authorize_project_context'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
newline|'\n'
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'SecurityGroup'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'user_id'
op|'='
name|'user_id'
op|')'
op|'.'
name|'count'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
DECL|function|_security_group_rule_create
dedent|''
name|'def'
name|'_security_group_rule_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'security_group_rule_ref'
op|'='
name|'models'
op|'.'
name|'SecurityGroupIngressRule'
op|'('
op|')'
newline|'\n'
name|'security_group_rule_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'security_group_rule_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
name|'return'
name|'security_group_rule_ref'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_security_group_rule_get_query
dedent|''
name|'def'
name|'_security_group_rule_get_query'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'SecurityGroupIngressRule'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|security_group_rule_get
name|'def'
name|'security_group_rule_get'
op|'('
name|'context'
op|','
name|'security_group_rule_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
op|'('
name|'_security_group_rule_get_query'
op|'('
name|'context'
op|')'
op|'.'
nl|'\n'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'security_group_rule_id'
op|')'
op|'.'
nl|'\n'
name|'first'
op|'('
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'SecurityGroupNotFoundForRule'
op|'('
nl|'\n'
name|'rule_id'
op|'='
name|'security_group_rule_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|security_group_rule_get_by_security_group
name|'def'
name|'security_group_rule_get_by_security_group'
op|'('
name|'context'
op|','
name|'security_group_id'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'columns_to_join'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'columns_to_join'
op|'='
op|'['
string|"'grantee_group.instances.system_metadata'"
op|','
nl|'\n'
string|"'grantee_group.instances.info_cache'"
op|']'
newline|'\n'
dedent|''
name|'query'
op|'='
op|'('
name|'_security_group_rule_get_query'
op|'('
name|'context'
op|')'
op|'.'
nl|'\n'
name|'filter_by'
op|'('
name|'parent_group_id'
op|'='
name|'security_group_id'
op|')'
op|')'
newline|'\n'
name|'for'
name|'column'
name|'in'
name|'columns_to_join'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'joinedload_all'
op|'('
name|'column'
op|')'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'query'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|security_group_rule_get_by_instance
name|'def'
name|'security_group_rule_get_by_instance'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
op|'('
name|'_security_group_rule_get_query'
op|'('
name|'context'
op|')'
op|'.'
nl|'\n'
name|'join'
op|'('
string|"'parent_group'"
op|','
string|"'instances'"
op|')'
op|'.'
nl|'\n'
name|'filter_by'
op|'('
name|'uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
nl|'\n'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'grantee_group'"
op|')'
op|')'
op|'.'
nl|'\n'
name|'all'
op|'('
op|')'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|security_group_rule_create
name|'def'
name|'security_group_rule_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_security_group_rule_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|security_group_rule_destroy
name|'def'
name|'security_group_rule_destroy'
op|'('
name|'context'
op|','
name|'security_group_rule_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'count'
op|'='
op|'('
name|'_security_group_rule_get_query'
op|'('
name|'context'
op|')'
op|'.'
nl|'\n'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'security_group_rule_id'
op|')'
op|'.'
nl|'\n'
name|'soft_delete'
op|'('
op|')'
op|')'
newline|'\n'
name|'if'
name|'count'
op|'=='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'SecurityGroupNotFoundForRule'
op|'('
nl|'\n'
name|'rule_id'
op|'='
name|'security_group_rule_id'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
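`security_group_rule_destroy` above shows the module's soft-delete-or-raise idiom: attempt the soft delete, and treat a zero row count as "not found". An in-memory analogue (names here are illustrative; the real code uses the query's `soft_delete()` and raises `exception.SecurityGroupNotFoundForRule`):

```python
def soft_delete_rule(rules, rule_id):
    """Soft-delete the matching, not-yet-deleted rule; raise if none matched."""
    count = 0
    for rule in rules:
        if rule['id'] == rule_id and not rule.get('deleted'):
            rule['deleted'] = True
            count += 1
    if count == 0:
        # Mirrors the `if count == 0: raise ...` branch in the original.
        raise LookupError('no security group rule %s' % rule_id)
    return count
```

`security_group_default_rule_destroy` below follows the same count-then-raise shape.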
dedent|''
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|security_group_rule_count_by_group
name|'def'
name|'security_group_rule_count_by_group'
op|'('
name|'context'
op|','
name|'security_group_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
op|'('
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'SecurityGroupIngressRule'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
nl|'\n'
name|'filter_by'
op|'('
name|'parent_group_id'
op|'='
name|'security_group_id'
op|')'
op|'.'
nl|'\n'
name|'count'
op|'('
op|')'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
DECL|function|_security_group_rule_get_default_query
dedent|''
name|'def'
name|'_security_group_rule_get_default_query'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'SecurityGroupIngressDefaultRule'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|security_group_default_rule_get
name|'def'
name|'security_group_default_rule_get'
op|'('
name|'context'
op|','
name|'security_group_rule_default_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'_security_group_rule_get_default_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'security_group_rule_default_id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'SecurityGroupDefaultRuleNotFound'
op|'('
nl|'\n'
name|'rule_id'
op|'='
name|'security_group_rule_default_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|security_group_default_rule_destroy
name|'def'
name|'security_group_default_rule_destroy'
op|'('
name|'context'
op|','
nl|'\n'
name|'security_group_rule_default_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'count'
op|'='
name|'_security_group_rule_get_default_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'security_group_rule_default_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'if'
name|'count'
op|'=='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'SecurityGroupDefaultRuleNotFound'
op|'('
nl|'\n'
name|'rule_id'
op|'='
name|'security_group_rule_default_id'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|security_group_default_rule_create
name|'def'
name|'security_group_default_rule_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'security_group_default_rule_ref'
op|'='
name|'models'
op|'.'
name|'SecurityGroupIngressDefaultRule'
op|'('
op|')'
newline|'\n'
name|'security_group_default_rule_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'security_group_default_rule_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
name|'return'
name|'security_group_default_rule_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|security_group_default_rule_list
name|'def'
name|'security_group_default_rule_list'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_security_group_rule_get_default_query'
op|'('
name|'context'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|provider_fw_rule_create
name|'def'
name|'provider_fw_rule_create'
op|'('
name|'context'
op|','
name|'rule'
op|')'
op|':'
newline|'\n'
indent|' '
name|'fw_rule_ref'
op|'='
name|'models'
op|'.'
name|'ProviderFirewallRule'
op|'('
op|')'
newline|'\n'
name|'fw_rule_ref'
op|'.'
name|'update'
op|'('
name|'rule'
op|')'
newline|'\n'
name|'fw_rule_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
name|'return'
name|'fw_rule_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|provider_fw_rule_get_all
name|'def'
name|'provider_fw_rule_get_all'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'ProviderFirewallRule'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|provider_fw_rule_destroy
name|'def'
name|'provider_fw_rule_destroy'
op|'('
name|'context'
op|','
name|'rule_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'context'
op|'.'
name|'session'
op|'.'
name|'query'
op|'('
name|'models'
op|'.'
name|'ProviderFirewallRule'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'rule_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|project_get_networks
name|'def'
name|'project_get_networks'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'associate'
op|'='
name|'True'
op|')'
op|':'
newline|'\n'
comment|'# NOTE(tr3buchet): as before this function will associate'
nl|'\n'
comment|"# a project with a network if it doesn't have one and"
nl|'\n'
comment|'# associate is true'
nl|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Network'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'not'
name|'associate'
op|':'
newline|'\n'
indent|' '
name|'return'
op|'['
op|']'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
op|'['
name|'network_associate'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
op|']'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'###################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|migration_create
name|'def'
name|'migration_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'migration'
op|'='
name|'models'
op|'.'
name|'Migration'
op|'('
op|')'
newline|'\n'
name|'migration'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'migration'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
name|'return'
name|'migration'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|migration_update
name|'def'
name|'migration_update'
op|'('
name|'context'
op|','
name|'id'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'migration'
op|'='
name|'migration_get'
op|'('
name|'context'
op|','
name|'id'
op|')'
newline|'\n'
name|'migration'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
nl|'\n'
name|'return'
name|'migration'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|migration_get
name|'def'
name|'migration_get'
op|'('
name|'context'
op|','
name|'id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Migration'
op|','
name|'read_deleted'
op|'='
string|'"yes"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'MigrationNotFound'
op|'('
name|'migration_id'
op|'='
name|'id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|migration_get_by_id_and_instance
name|'def'
name|'migration_get_by_id_and_instance'
op|'('
name|'context'
op|','
name|'id'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Migration'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'MigrationNotFoundForInstance'
op|'('
name|'migration_id'
op|'='
name|'id'
op|','
nl|'\n'
name|'instance_id'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|migration_get_by_instance_and_status
name|'def'
name|'migration_get_by_instance_and_status'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'status'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Migration'
op|','
name|'read_deleted'
op|'='
string|'"yes"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'status'
op|'='
name|'status'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'MigrationNotFoundByStatus'
op|'('
name|'instance_id'
op|'='
name|'instance_uuid'
op|','
nl|'\n'
name|'status'
op|'='
name|'status'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader_allow_async'
newline|'\n'
DECL|function|migration_get_unconfirmed_by_dest_compute
name|'def'
name|'migration_get_unconfirmed_by_dest_compute'
op|'('
name|'context'
op|','
name|'confirm_window'
op|','
nl|'\n'
name|'dest_compute'
op|')'
op|':'
newline|'\n'
indent|' '
name|'confirm_window'
op|'='
op|'('
name|'timeutils'
op|'.'
name|'utcnow'
op|'('
op|')'
op|'-'
nl|'\n'
name|'datetime'
op|'.'
name|'timedelta'
op|'('
name|'seconds'
op|'='
name|'confirm_window'
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Migration'
op|','
name|'read_deleted'
op|'='
string|'"yes"'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Migration'
op|'.'
name|'updated_at'
op|'<='
name|'confirm_window'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'status'
op|'='
string|'"finished"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'dest_compute'
op|'='
name|'dest_compute'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|migration_get_in_progress_by_host_and_node
name|'def'
name|'migration_get_in_progress_by_host_and_node'
op|'('
name|'context'
op|','
name|'host'
op|','
name|'node'
op|')'
op|':'
newline|'\n'
nl|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Migration'
op|')'
op|'.'
name|'filter'
op|'('
name|'or_'
op|'('
name|'and_'
op|'('
name|'models'
op|'.'
name|'Migration'
op|'.'
name|'source_compute'
op|'=='
name|'host'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Migration'
op|'.'
name|'source_node'
op|'=='
name|'node'
op|')'
op|','
nl|'\n'
name|'and_'
op|'('
name|'models'
op|'.'
name|'Migration'
op|'.'
name|'dest_compute'
op|'=='
name|'host'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Migration'
op|'.'
name|'dest_node'
op|'=='
name|'node'
op|')'
op|')'
op|')'
op|'.'
name|'filter'
op|'('
op|'~'
name|'models'
op|'.'
name|'Migration'
op|'.'
name|'status'
op|'.'
name|'in_'
op|'('
op|'['
string|"'accepted'"
op|','
string|"'confirmed'"
op|','
nl|'\n'
string|"'reverted'"
op|','
string|"'error'"
op|','
nl|'\n'
string|"'failed'"
op|']'
op|')'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload_all'
op|'('
string|"'instance.system_metadata'"
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|migration_get_in_progress_by_instance
name|'def'
name|'migration_get_in_progress_by_instance'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
nl|'\n'
name|'migration_type'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
comment|'# TODO(Shaohe Feng) we should share the in-progress list.'
nl|'\n'
comment|'# TODO(Shaohe Feng) will also summarize all status to a new'
nl|'\n'
comment|'# MigrationStatus class.'
nl|'\n'
indent|' '
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Migration'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Migration'
op|'.'
name|'status'
op|'.'
name|'in_'
op|'('
op|'['
string|"'queued'"
op|','
string|"'preparing'"
op|','
nl|'\n'
string|"'running'"
op|','
nl|'\n'
string|"'post-migrating'"
op|']'
op|')'
op|')'
newline|'\n'
name|'if'
name|'migration_type'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Migration'
op|'.'
name|'migration_type'
op|'=='
name|'migration_type'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'query'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|migration_get_all_by_filters
name|'def'
name|'migration_get_all_by_filters'
op|'('
name|'context'
op|','
name|'filters'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Migration'
op|')'
newline|'\n'
name|'if'
string|'"status"'
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'status'
op|'='
name|'filters'
op|'['
string|'"status"'
op|']'
newline|'\n'
name|'status'
op|'='
op|'['
name|'status'
op|']'
name|'if'
name|'isinstance'
op|'('
name|'status'
op|','
name|'six'
op|'.'
name|'string_types'
op|')'
name|'else'
name|'status'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Migration'
op|'.'
name|'status'
op|'.'
name|'in_'
op|'('
name|'status'
op|')'
op|')'
newline|'\n'
dedent|''
name|'if'
string|'"host"'
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'host'
op|'='
name|'filters'
op|'['
string|'"host"'
op|']'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'or_'
op|'('
name|'models'
op|'.'
name|'Migration'
op|'.'
name|'source_compute'
op|'=='
name|'host'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Migration'
op|'.'
name|'dest_compute'
op|'=='
name|'host'
op|')'
op|')'
newline|'\n'
dedent|''
name|'elif'
string|'"source_compute"'
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'host'
op|'='
name|'filters'
op|'['
string|"'source_compute'"
op|']'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Migration'
op|'.'
name|'source_compute'
op|'=='
name|'host'
op|')'
newline|'\n'
dedent|''
name|'if'
string|'"migration_type"'
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'migtype'
op|'='
name|'filters'
op|'['
string|'"migration_type"'
op|']'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Migration'
op|'.'
name|'migration_type'
op|'=='
name|'migtype'
op|')'
newline|'\n'
dedent|''
name|'if'
string|'"hidden"'
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'hidden'
op|'='
name|'filters'
op|'['
string|'"hidden"'
op|']'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Migration'
op|'.'
name|'hidden'
op|'=='
name|'hidden'
op|')'
newline|'\n'
dedent|''
name|'if'
string|'"instance_uuid"'
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'uuid'
op|'='
name|'filters'
op|'['
string|'"instance_uuid"'
op|']'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'Migration'
op|'.'
name|'instance_uuid'
op|'=='
name|'uuid'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'query'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
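`migration_get_all_by_filters` above builds its query incrementally from an optional-filters dict, normalizing a bare `"status"` string into a one-element list before applying `status.in_(...)`, and treating `"host"` as matching either side of the migration. An in-memory analogue of the first two filters (a sketch; the real code chains SQLAlchemy filters over `models.Migration`, and the string check uses `six.string_types`):

```python
def migration_filter(migrations, filters):
    """Filter a list of migration dicts the way the query above does."""
    results = list(migrations)
    if "status" in filters:
        status = filters["status"]
        # A single status string is wrapped in a list so membership
        # testing works uniformly, as in the original.
        status = [status] if isinstance(status, str) else status
        results = [m for m in results if m["status"] in status]
    if "host" in filters:
        host = filters["host"]
        # "host" matches the migration's source or destination compute.
        results = [m for m in results
                   if m["source_compute"] == host or m["dest_compute"] == host]
    return results
```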
comment|'##################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|console_pool_create
name|'def'
name|'console_pool_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'pool'
op|'='
name|'models'
op|'.'
name|'ConsolePool'
op|'('
op|')'
newline|'\n'
name|'pool'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'pool'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ConsolePoolExists'
op|'('
nl|'\n'
name|'host'
op|'='
name|'values'
op|'['
string|'"host"'
op|']'
op|','
nl|'\n'
name|'console_type'
op|'='
name|'values'
op|'['
string|'"console_type"'
op|']'
op|','
nl|'\n'
name|'compute_host'
op|'='
name|'values'
op|'['
string|'"compute_host"'
op|']'
op|','
nl|'\n'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'pool'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|console_pool_get_by_host_type
name|'def'
name|'console_pool_get_by_host_type'
op|'('
name|'context'
op|','
name|'compute_host'
op|','
name|'host'
op|','
nl|'\n'
name|'console_type'
op|')'
op|':'
newline|'\n'
nl|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'ConsolePool'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'host'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'console_type'
op|'='
name|'console_type'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'compute_host'
op|'='
name|'compute_host'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'consoles'"
op|')'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ConsolePoolNotFoundForHostType'
op|'('
nl|'\n'
name|'host'
op|'='
name|'host'
op|','
name|'console_type'
op|'='
name|'console_type'
op|','
nl|'\n'
name|'compute_host'
op|'='
name|'compute_host'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|console_pool_get_all_by_host_type
name|'def'
name|'console_pool_get_all_by_host_type'
op|'('
name|'context'
op|','
name|'host'
op|','
name|'console_type'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'ConsolePool'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'host'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'console_type'
op|'='
name|'console_type'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'consoles'"
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'##################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|console_create
name|'def'
name|'console_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'console'
op|'='
name|'models'
op|'.'
name|'Console'
op|'('
op|')'
newline|'\n'
name|'console'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'console'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
name|'return'
name|'console'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|console_delete
name|'def'
name|'console_delete'
op|'('
name|'context'
op|','
name|'console_id'
op|')'
op|':'
newline|'\n'
comment|'# NOTE(mdragon): consoles are meant to be transient.'
nl|'\n'
indent|' '
name|'context'
op|'.'
name|'session'
op|'.'
name|'query'
op|'('
name|'models'
op|'.'
name|'Console'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'console_id'
op|')'
op|'.'
name|'delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|console_get_by_pool_instance
name|'def'
name|'console_get_by_pool_instance'
op|'('
name|'context'
op|','
name|'pool_id'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Console'
op|','
name|'read_deleted'
op|'='
string|'"yes"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'pool_id'
op|'='
name|'pool_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'pool'"
op|')'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ConsoleNotFoundInPoolForInstance'
op|'('
nl|'\n'
name|'pool_id'
op|'='
name|'pool_id'
op|','
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|console_get_all_by_instance
name|'def'
name|'console_get_all_by_instance'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'columns_to_join'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Console'
op|','
name|'read_deleted'
op|'='
string|'"yes"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
name|'if'
name|'columns_to_join'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'column'
name|'in'
name|'columns_to_join'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
name|'column'
op|')'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'return'
name|'query'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|console_get
name|'def'
name|'console_get'
op|'('
name|'context'
op|','
name|'console_id'
op|','
name|'instance_uuid'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Console'
op|','
name|'read_deleted'
op|'='
string|'"yes"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'console_id'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'pool'"
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'instance_uuid'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'result'
op|'='
name|'query'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'instance_uuid'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ConsoleNotFoundForInstance'
op|'('
nl|'\n'
name|'console_id'
op|'='
name|'console_id'
op|','
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ConsoleNotFound'
op|'('
name|'console_id'
op|'='
name|'console_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'##################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|flavor_create
name|'def'
name|'flavor_create'
op|'('
name|'context'
op|','
name|'values'
op|','
name|'projects'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Create a new instance type. In order to pass in extra specs,\n the values dict should contain a \'extra_specs\' key/value pair:\n\n {\'extra_specs\' : {\'k1\': \'v1\', \'k2\': \'v2\', ...}}\n\n """'
newline|'\n'
name|'specs'
op|'='
name|'values'
op|'.'
name|'get'
op|'('
string|"'extra_specs'"
op|')'
newline|'\n'
name|'specs_refs'
op|'='
op|'['
op|']'
newline|'\n'
name|'if'
name|'specs'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'k'
op|','
name|'v'
name|'in'
name|'specs'
op|'.'
name|'items'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'specs_ref'
op|'='
name|'models'
op|'.'
name|'InstanceTypeExtraSpecs'
op|'('
op|')'
newline|'\n'
name|'specs_ref'
op|'['
string|"'key'"
op|']'
op|'='
name|'k'
newline|'\n'
name|'specs_ref'
op|'['
string|"'value'"
op|']'
op|'='
name|'v'
newline|'\n'
name|'specs_refs'
op|'.'
name|'append'
op|'('
name|'specs_ref'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'values'
op|'['
string|"'extra_specs'"
op|']'
op|'='
name|'specs_refs'
newline|'\n'
name|'instance_type_ref'
op|'='
name|'models'
op|'.'
name|'InstanceTypes'
op|'('
op|')'
newline|'\n'
name|'instance_type_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'projects'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'projects'
op|'='
op|'['
op|']'
newline|'\n'
nl|'\n'
dedent|''
name|'try'
op|':'
newline|'\n'
indent|' '
name|'instance_type_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
name|'as'
name|'e'
op|':'
newline|'\n'
indent|' '
name|'if'
string|"'flavorid'"
name|'in'
name|'e'
op|'.'
name|'columns'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FlavorIdExists'
op|'('
name|'flavor_id'
op|'='
name|'values'
op|'['
string|"'flavorid'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'raise'
name|'exception'
op|'.'
name|'FlavorExists'
op|'('
name|'name'
op|'='
name|'values'
op|'['
string|"'name'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'Exception'
name|'as'
name|'e'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'db_exc'
op|'.'
name|'DBError'
op|'('
name|'e'
op|')'
newline|'\n'
dedent|''
name|'for'
name|'project'
name|'in'
name|'set'
op|'('
name|'projects'
op|')'
op|':'
newline|'\n'
indent|' '
name|'access_ref'
op|'='
name|'models'
op|'.'
name|'InstanceTypeProjects'
op|'('
op|')'
newline|'\n'
name|'access_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|'"instance_type_id"'
op|':'
name|'instance_type_ref'
op|'.'
name|'id'
op|','
nl|'\n'
string|'"project_id"'
op|':'
name|'project'
op|'}'
op|')'
newline|'\n'
name|'access_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'_dict_with_extra_specs'
op|'('
name|'instance_type_ref'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_dict_with_extra_specs
dedent|''
name|'def'
name|'_dict_with_extra_specs'
op|'('
name|'inst_type_query'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Takes an instance or instance type query returned\n by sqlalchemy and returns it as a dictionary, converting the\n extra_specs entry from a list of dicts:\n\n \'extra_specs\' : [{\'key\': \'k1\', \'value\': \'v1\', ...}, ...]\n\n to a single dict:\n\n \'extra_specs\' : {\'k1\': \'v1\'}\n\n """'
newline|'\n'
name|'inst_type_dict'
op|'='
name|'dict'
op|'('
name|'inst_type_query'
op|')'
newline|'\n'
name|'extra_specs'
op|'='
op|'{'
name|'x'
op|'['
string|"'key'"
op|']'
op|':'
name|'x'
op|'['
string|"'value'"
op|']'
nl|'\n'
name|'for'
name|'x'
name|'in'
name|'inst_type_query'
op|'['
string|"'extra_specs'"
op|']'
op|'}'
newline|'\n'
name|'inst_type_dict'
op|'['
string|"'extra_specs'"
op|']'
op|'='
name|'extra_specs'
newline|'\n'
name|'return'
name|'inst_type_dict'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_flavor_get_query
dedent|''
name|'def'
name|'_flavor_get_query'
op|'('
name|'context'
op|','
name|'read_deleted'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceTypes'
op|','
nl|'\n'
name|'read_deleted'
op|'='
name|'read_deleted'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'extra_specs'"
op|')'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'context'
op|'.'
name|'is_admin'
op|':'
newline|'\n'
indent|' '
name|'the_filter'
op|'='
op|'['
name|'models'
op|'.'
name|'InstanceTypes'
op|'.'
name|'is_public'
op|'=='
name|'true'
op|'('
op|')'
op|']'
newline|'\n'
name|'the_filter'
op|'.'
name|'extend'
op|'('
op|'['
nl|'\n'
name|'models'
op|'.'
name|'InstanceTypes'
op|'.'
name|'projects'
op|'.'
name|'any'
op|'('
name|'project_id'
op|'='
name|'context'
op|'.'
name|'project_id'
op|')'
nl|'\n'
op|']'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'or_'
op|'('
op|'*'
name|'the_filter'
op|')'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'query'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|flavor_get_all
name|'def'
name|'flavor_get_all'
op|'('
name|'context'
op|','
name|'inactive'
op|'='
name|'False'
op|','
name|'filters'
op|'='
name|'None'
op|','
nl|'\n'
name|'sort_key'
op|'='
string|"'flavorid'"
op|','
name|'sort_dir'
op|'='
string|"'asc'"
op|','
name|'limit'
op|'='
name|'None'
op|','
nl|'\n'
name|'marker'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Returns all flavors.\n """'
newline|'\n'
name|'filters'
op|'='
name|'filters'
name|'or'
op|'{'
op|'}'
newline|'\n'
nl|'\n'
comment|'# FIXME(sirp): now that we have the `disabled` field for flavors, we'
nl|'\n'
comment|'# should probably remove the use of `deleted` to mark inactive. `deleted`'
nl|'\n'
comment|'# should mean truly deleted, e.g. we can safely purge the record out of the'
nl|'\n'
comment|'# database.'
nl|'\n'
name|'read_deleted'
op|'='
string|'"yes"'
name|'if'
name|'inactive'
name|'else'
string|'"no"'
newline|'\n'
nl|'\n'
name|'query'
op|'='
name|'_flavor_get_query'
op|'('
name|'context'
op|','
name|'read_deleted'
op|'='
name|'read_deleted'
op|')'
newline|'\n'
nl|'\n'
name|'if'
string|"'min_memory_mb'"
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
nl|'\n'
name|'models'
op|'.'
name|'InstanceTypes'
op|'.'
name|'memory_mb'
op|'>='
name|'filters'
op|'['
string|"'min_memory_mb'"
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
string|"'min_root_gb'"
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
nl|'\n'
name|'models'
op|'.'
name|'InstanceTypes'
op|'.'
name|'root_gb'
op|'>='
name|'filters'
op|'['
string|"'min_root_gb'"
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
string|"'disabled'"
name|'in'
name|'filters'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
nl|'\n'
name|'models'
op|'.'
name|'InstanceTypes'
op|'.'
name|'disabled'
op|'=='
name|'filters'
op|'['
string|"'disabled'"
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
string|"'is_public'"
name|'in'
name|'filters'
name|'and'
name|'filters'
op|'['
string|"'is_public'"
op|']'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'the_filter'
op|'='
op|'['
name|'models'
op|'.'
name|'InstanceTypes'
op|'.'
name|'is_public'
op|'=='
name|'filters'
op|'['
string|"'is_public'"
op|']'
op|']'
newline|'\n'
name|'if'
name|'filters'
op|'['
string|"'is_public'"
op|']'
name|'and'
name|'context'
op|'.'
name|'project_id'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'the_filter'
op|'.'
name|'extend'
op|'('
op|'['
nl|'\n'
name|'models'
op|'.'
name|'InstanceTypes'
op|'.'
name|'projects'
op|'.'
name|'any'
op|'('
nl|'\n'
name|'project_id'
op|'='
name|'context'
op|'.'
name|'project_id'
op|','
name|'deleted'
op|'='
number|'0'
op|')'
nl|'\n'
op|']'
op|')'
newline|'\n'
dedent|''
name|'if'
name|'len'
op|'('
name|'the_filter'
op|')'
op|'>'
number|'1'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'or_'
op|'('
op|'*'
name|'the_filter'
op|')'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'the_filter'
op|'['
number|'0'
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'marker_row'
op|'='
name|'None'
newline|'\n'
name|'if'
name|'marker'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'marker_row'
op|'='
name|'_flavor_get_query'
op|'('
name|'context'
op|','
name|'read_deleted'
op|'='
name|'read_deleted'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'flavorid'
op|'='
name|'marker'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'marker_row'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'MarkerNotFound'
op|'('
name|'marker'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'query'
op|'='
name|'sqlalchemyutils'
op|'.'
name|'paginate_query'
op|'('
name|'query'
op|','
name|'models'
op|'.'
name|'InstanceTypes'
op|','
name|'limit'
op|','
nl|'\n'
op|'['
name|'sort_key'
op|','
string|"'id'"
op|']'
op|','
nl|'\n'
name|'marker'
op|'='
name|'marker_row'
op|','
nl|'\n'
name|'sort_dir'
op|'='
name|'sort_dir'
op|')'
newline|'\n'
nl|'\n'
name|'inst_types'
op|'='
name|'query'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'return'
op|'['
name|'_dict_with_extra_specs'
op|'('
name|'i'
op|')'
name|'for'
name|'i'
name|'in'
name|'inst_types'
op|']'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_flavor_get_id_from_flavor_query
dedent|''
name|'def'
name|'_flavor_get_id_from_flavor_query'
op|'('
name|'context'
op|','
name|'flavor_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceTypes'
op|','
nl|'\n'
op|'('
name|'models'
op|'.'
name|'InstanceTypes'
op|'.'
name|'id'
op|','
op|')'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'flavorid'
op|'='
name|'flavor_id'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_flavor_get_id_from_flavor
dedent|''
name|'def'
name|'_flavor_get_id_from_flavor'
op|'('
name|'context'
op|','
name|'flavor_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'_flavor_get_id_from_flavor_query'
op|'('
name|'context'
op|','
name|'flavor_id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FlavorNotFound'
op|'('
name|'flavor_id'
op|'='
name|'flavor_id'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'result'
op|'['
number|'0'
op|']'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|flavor_get
name|'def'
name|'flavor_get'
op|'('
name|'context'
op|','
name|'id'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Returns a dict describing specific flavor."""'
newline|'\n'
name|'result'
op|'='
name|'_flavor_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FlavorNotFound'
op|'('
name|'flavor_id'
op|'='
name|'id'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'_dict_with_extra_specs'
op|'('
name|'result'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|flavor_get_by_name
name|'def'
name|'flavor_get_by_name'
op|'('
name|'context'
op|','
name|'name'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Returns a dict describing specific flavor."""'
newline|'\n'
name|'result'
op|'='
name|'_flavor_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'name'
op|'='
name|'name'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FlavorNotFoundByName'
op|'('
name|'flavor_name'
op|'='
name|'name'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'_dict_with_extra_specs'
op|'('
name|'result'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|flavor_get_by_flavor_id
name|'def'
name|'flavor_get_by_flavor_id'
op|'('
name|'context'
op|','
name|'flavor_id'
op|','
name|'read_deleted'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Returns a dict describing specific flavor_id."""'
newline|'\n'
name|'result'
op|'='
name|'_flavor_get_query'
op|'('
name|'context'
op|','
name|'read_deleted'
op|'='
name|'read_deleted'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'flavorid'
op|'='
name|'flavor_id'
op|')'
op|'.'
name|'order_by'
op|'('
name|'asc'
op|'('
name|'models'
op|'.'
name|'InstanceTypes'
op|'.'
name|'deleted'
op|')'
op|','
nl|'\n'
name|'asc'
op|'('
name|'models'
op|'.'
name|'InstanceTypes'
op|'.'
name|'id'
op|')'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FlavorNotFound'
op|'('
name|'flavor_id'
op|'='
name|'flavor_id'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'_dict_with_extra_specs'
op|'('
name|'result'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|flavor_destroy
name|'def'
name|'flavor_destroy'
op|'('
name|'context'
op|','
name|'name'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Marks specific flavor as deleted."""'
newline|'\n'
name|'ref'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceTypes'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'name'
op|'='
name|'name'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'ref'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FlavorNotFoundByName'
op|'('
name|'flavor_name'
op|'='
name|'name'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'ref'
op|'.'
name|'soft_delete'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceTypeExtraSpecs'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_type_id'
op|'='
name|'ref'
op|'['
string|"'id'"
op|']'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceTypeProjects'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_type_id'
op|'='
name|'ref'
op|'['
string|"'id'"
op|']'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_flavor_access_query
dedent|''
name|'def'
name|'_flavor_access_query'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceTypeProjects'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|flavor_access_get_by_flavor_id
name|'def'
name|'flavor_access_get_by_flavor_id'
op|'('
name|'context'
op|','
name|'flavor_id'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Get flavor access list by flavor id."""'
newline|'\n'
name|'instance_type_id_subq'
op|'='
name|'_flavor_get_id_from_flavor_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'flavor_id'
op|')'
newline|'\n'
name|'access_refs'
op|'='
name|'_flavor_access_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_type_id'
op|'='
name|'instance_type_id_subq'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'return'
name|'access_refs'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|flavor_access_add
name|'def'
name|'flavor_access_add'
op|'('
name|'context'
op|','
name|'flavor_id'
op|','
name|'project_id'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Add given tenant to the flavor access list."""'
newline|'\n'
name|'instance_type_id'
op|'='
name|'_flavor_get_id_from_flavor'
op|'('
name|'context'
op|','
name|'flavor_id'
op|')'
newline|'\n'
nl|'\n'
name|'access_ref'
op|'='
name|'models'
op|'.'
name|'InstanceTypeProjects'
op|'('
op|')'
newline|'\n'
name|'access_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|'"instance_type_id"'
op|':'
name|'instance_type_id'
op|','
nl|'\n'
string|'"project_id"'
op|':'
name|'project_id'
op|'}'
op|')'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'access_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FlavorAccessExists'
op|'('
name|'flavor_id'
op|'='
name|'flavor_id'
op|','
nl|'\n'
name|'project_id'
op|'='
name|'project_id'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'access_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|flavor_access_remove
name|'def'
name|'flavor_access_remove'
op|'('
name|'context'
op|','
name|'flavor_id'
op|','
name|'project_id'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Remove given tenant from the flavor access list."""'
newline|'\n'
name|'instance_type_id'
op|'='
name|'_flavor_get_id_from_flavor'
op|'('
name|'context'
op|','
name|'flavor_id'
op|')'
newline|'\n'
nl|'\n'
name|'count'
op|'='
name|'_flavor_access_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_type_id'
op|'='
name|'instance_type_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
name|'if'
name|'count'
op|'=='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FlavorAccessNotFound'
op|'('
name|'flavor_id'
op|'='
name|'flavor_id'
op|','
nl|'\n'
name|'project_id'
op|'='
name|'project_id'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_flavor_extra_specs_get_query
dedent|''
dedent|''
name|'def'
name|'_flavor_extra_specs_get_query'
op|'('
name|'context'
op|','
name|'flavor_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'instance_type_id_subq'
op|'='
name|'_flavor_get_id_from_flavor_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'flavor_id'
op|')'
newline|'\n'
nl|'\n'
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceTypeExtraSpecs'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_type_id'
op|'='
name|'instance_type_id_subq'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|flavor_extra_specs_get
name|'def'
name|'flavor_extra_specs_get'
op|'('
name|'context'
op|','
name|'flavor_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'rows'
op|'='
name|'_flavor_extra_specs_get_query'
op|'('
name|'context'
op|','
name|'flavor_id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'return'
op|'{'
name|'row'
op|'['
string|"'key'"
op|']'
op|':'
name|'row'
op|'['
string|"'value'"
op|']'
name|'for'
name|'row'
name|'in'
name|'rows'
op|'}'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|flavor_extra_specs_delete
name|'def'
name|'flavor_extra_specs_delete'
op|'('
name|'context'
op|','
name|'flavor_id'
op|','
name|'key'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'_flavor_extra_specs_get_query'
op|'('
name|'context'
op|','
name|'flavor_id'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'InstanceTypeExtraSpecs'
op|'.'
name|'key'
op|'=='
name|'key'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
comment|'# did not find the extra spec'
nl|'\n'
name|'if'
name|'result'
op|'=='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FlavorExtraSpecsNotFound'
op|'('
nl|'\n'
name|'extra_specs_key'
op|'='
name|'key'
op|','
name|'flavor_id'
op|'='
name|'flavor_id'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|flavor_extra_specs_update_or_create
name|'def'
name|'flavor_extra_specs_update_or_create'
op|'('
name|'context'
op|','
name|'flavor_id'
op|','
name|'specs'
op|','
nl|'\n'
name|'max_retries'
op|'='
number|'10'
op|')'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'attempt'
name|'in'
name|'range'
op|'('
name|'max_retries'
op|')'
op|':'
newline|'\n'
indent|' '
name|'try'
op|':'
newline|'\n'
indent|' '
name|'instance_type_id'
op|'='
name|'_flavor_get_id_from_flavor'
op|'('
name|'context'
op|','
name|'flavor_id'
op|')'
newline|'\n'
nl|'\n'
name|'spec_refs'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceTypeExtraSpecs'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_type_id'
op|'='
name|'instance_type_id'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'InstanceTypeExtraSpecs'
op|'.'
name|'key'
op|'.'
name|'in_'
op|'('
name|'specs'
op|'.'
name|'keys'
op|'('
op|')'
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'existing_keys'
op|'='
name|'set'
op|'('
op|')'
newline|'\n'
name|'for'
name|'spec_ref'
name|'in'
name|'spec_refs'
op|':'
newline|'\n'
indent|' '
name|'key'
op|'='
name|'spec_ref'
op|'['
string|'"key"'
op|']'
newline|'\n'
name|'existing_keys'
op|'.'
name|'add'
op|'('
name|'key'
op|')'
newline|'\n'
name|'with'
name|'main_context_manager'
op|'.'
name|'writer'
op|'.'
name|'savepoint'
op|'.'
name|'using'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'spec_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|'"value"'
op|':'
name|'specs'
op|'['
name|'key'
op|']'
op|'}'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'for'
name|'key'
op|','
name|'value'
name|'in'
name|'specs'
op|'.'
name|'items'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'key'
name|'in'
name|'existing_keys'
op|':'
newline|'\n'
indent|' '
name|'continue'
newline|'\n'
dedent|''
name|'spec_ref'
op|'='
name|'models'
op|'.'
name|'InstanceTypeExtraSpecs'
op|'('
op|')'
newline|'\n'
name|'with'
name|'main_context_manager'
op|'.'
name|'writer'
op|'.'
name|'savepoint'
op|'.'
name|'using'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'spec_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|'"key"'
op|':'
name|'key'
op|','
string|'"value"'
op|':'
name|'value'
op|','
nl|'\n'
string|'"instance_type_id"'
op|':'
name|'instance_type_id'
op|'}'
op|')'
newline|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'add'
op|'('
name|'spec_ref'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'return'
name|'specs'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
comment|'# a concurrent transaction has been committed,'
nl|'\n'
comment|'# try again unless this was the last attempt'
nl|'\n'
indent|' '
name|'if'
name|'attempt'
op|'=='
name|'max_retries'
op|'-'
number|'1'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'FlavorExtraSpecUpdateCreateFailed'
op|'('
nl|'\n'
name|'id'
op|'='
name|'flavor_id'
op|','
name|'retries'
op|'='
name|'max_retries'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'####################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
dedent|''
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|cell_create
name|'def'
name|'cell_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'cell'
op|'='
name|'models'
op|'.'
name|'Cell'
op|'('
op|')'
newline|'\n'
name|'cell'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'cell'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'CellExists'
op|'('
name|'name'
op|'='
name|'values'
op|'['
string|"'name'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'cell'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_cell_get_by_name_query
dedent|''
name|'def'
name|'_cell_get_by_name_query'
op|'('
name|'context'
op|','
name|'cell_name'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Cell'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'name'
op|'='
name|'cell_name'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|cell_update
name|'def'
name|'cell_update'
op|'('
name|'context'
op|','
name|'cell_name'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'cell_query'
op|'='
name|'_cell_get_by_name_query'
op|'('
name|'context'
op|','
name|'cell_name'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'cell_query'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'CellNotFound'
op|'('
name|'cell_name'
op|'='
name|'cell_name'
op|')'
newline|'\n'
dedent|''
name|'cell'
op|'='
name|'cell_query'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'return'
name|'cell'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|cell_delete
name|'def'
name|'cell_delete'
op|'('
name|'context'
op|','
name|'cell_name'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_cell_get_by_name_query'
op|'('
name|'context'
op|','
name|'cell_name'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|cell_get
name|'def'
name|'cell_get'
op|'('
name|'context'
op|','
name|'cell_name'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'_cell_get_by_name_query'
op|'('
name|'context'
op|','
name|'cell_name'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'CellNotFound'
op|'('
name|'cell_name'
op|'='
name|'cell_name'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|cell_get_all
name|'def'
name|'cell_get_all'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Cell'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'########################'
nl|'\n'
comment|'# User-provided metadata'
nl|'\n'
nl|'\n'
DECL|function|_instance_metadata_get_multi
dedent|''
name|'def'
name|'_instance_metadata_get_multi'
op|'('
name|'context'
op|','
name|'instance_uuids'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'not'
name|'instance_uuids'
op|':'
newline|'\n'
indent|' '
name|'return'
op|'['
op|']'
newline|'\n'
dedent|''
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceMetadata'
op|')'
op|'.'
name|'filter'
op|'('
nl|'\n'
name|'models'
op|'.'
name|'InstanceMetadata'
op|'.'
name|'instance_uuid'
op|'.'
name|'in_'
op|'('
name|'instance_uuids'
op|')'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_instance_metadata_get_query
dedent|''
name|'def'
name|'_instance_metadata_get_query'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceMetadata'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|instance_metadata_get
name|'def'
name|'instance_metadata_get'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'rows'
op|'='
name|'_instance_metadata_get_query'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'return'
op|'{'
name|'row'
op|'['
string|"'key'"
op|']'
op|':'
name|'row'
op|'['
string|"'value'"
op|']'
name|'for'
name|'row'
name|'in'
name|'rows'
op|'}'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|')'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|instance_metadata_delete
name|'def'
name|'instance_metadata_delete'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'key'
op|')'
op|':'
newline|'\n'
indent|' '
name|'_instance_metadata_get_query'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'key'
op|'='
name|'key'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|')'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|instance_metadata_update
name|'def'
name|'instance_metadata_update'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'metadata'
op|','
name|'delete'
op|')'
op|':'
newline|'\n'
indent|' '
name|'all_keys'
op|'='
name|'metadata'
op|'.'
name|'keys'
op|'('
op|')'
newline|'\n'
name|'if'
name|'delete'
op|':'
newline|'\n'
indent|' '
name|'_instance_metadata_get_query'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|'.'
name|'filter'
op|'('
op|'~'
name|'models'
op|'.'
name|'InstanceMetadata'
op|'.'
name|'key'
op|'.'
name|'in_'
op|'('
name|'all_keys'
op|')'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'already_existing_keys'
op|'='
op|'['
op|']'
newline|'\n'
name|'meta_refs'
op|'='
name|'_instance_metadata_get_query'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'InstanceMetadata'
op|'.'
name|'key'
op|'.'
name|'in_'
op|'('
name|'all_keys'
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'for'
name|'meta_ref'
name|'in'
name|'meta_refs'
op|':'
newline|'\n'
indent|' '
name|'already_existing_keys'
op|'.'
name|'append'
op|'('
name|'meta_ref'
op|'.'
name|'key'
op|')'
newline|'\n'
name|'meta_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|'"value"'
op|':'
name|'metadata'
op|'['
name|'meta_ref'
op|'.'
name|'key'
op|']'
op|'}'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'new_keys'
op|'='
name|'set'
op|'('
name|'all_keys'
op|')'
op|'-'
name|'set'
op|'('
name|'already_existing_keys'
op|')'
newline|'\n'
name|'for'
name|'key'
name|'in'
name|'new_keys'
op|':'
newline|'\n'
indent|' '
name|'meta_ref'
op|'='
name|'models'
op|'.'
name|'InstanceMetadata'
op|'('
op|')'
newline|'\n'
name|'meta_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|'"key"'
op|':'
name|'key'
op|','
string|'"value"'
op|':'
name|'metadata'
op|'['
name|'key'
op|']'
op|','
nl|'\n'
string|'"instance_uuid"'
op|':'
name|'instance_uuid'
op|'}'
op|')'
newline|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'add'
op|'('
name|'meta_ref'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'metadata'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'#######################'
nl|'\n'
comment|'# System-owned metadata'
nl|'\n'
nl|'\n'
nl|'\n'
DECL|function|_instance_system_metadata_get_multi
dedent|''
name|'def'
name|'_instance_system_metadata_get_multi'
op|'('
name|'context'
op|','
name|'instance_uuids'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'not'
name|'instance_uuids'
op|':'
newline|'\n'
indent|' '
name|'return'
op|'['
op|']'
newline|'\n'
dedent|''
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceSystemMetadata'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|"'yes'"
op|')'
op|'.'
name|'filter'
op|'('
nl|'\n'
name|'models'
op|'.'
name|'InstanceSystemMetadata'
op|'.'
name|'instance_uuid'
op|'.'
name|'in_'
op|'('
name|'instance_uuids'
op|')'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_instance_system_metadata_get_query
dedent|''
name|'def'
name|'_instance_system_metadata_get_query'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceSystemMetadata'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|instance_system_metadata_get
name|'def'
name|'instance_system_metadata_get'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'rows'
op|'='
name|'_instance_system_metadata_get_query'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'return'
op|'{'
name|'row'
op|'['
string|"'key'"
op|']'
op|':'
name|'row'
op|'['
string|"'value'"
op|']'
name|'for'
name|'row'
name|'in'
name|'rows'
op|'}'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|instance_system_metadata_update
name|'def'
name|'instance_system_metadata_update'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'metadata'
op|','
name|'delete'
op|')'
op|':'
newline|'\n'
indent|' '
name|'all_keys'
op|'='
name|'metadata'
op|'.'
name|'keys'
op|'('
op|')'
newline|'\n'
name|'if'
name|'delete'
op|':'
newline|'\n'
indent|' '
name|'_instance_system_metadata_get_query'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|'.'
name|'filter'
op|'('
op|'~'
name|'models'
op|'.'
name|'InstanceSystemMetadata'
op|'.'
name|'key'
op|'.'
name|'in_'
op|'('
name|'all_keys'
op|')'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'already_existing_keys'
op|'='
op|'['
op|']'
newline|'\n'
name|'meta_refs'
op|'='
name|'_instance_system_metadata_get_query'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'InstanceSystemMetadata'
op|'.'
name|'key'
op|'.'
name|'in_'
op|'('
name|'all_keys'
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'for'
name|'meta_ref'
name|'in'
name|'meta_refs'
op|':'
newline|'\n'
indent|' '
name|'already_existing_keys'
op|'.'
name|'append'
op|'('
name|'meta_ref'
op|'.'
name|'key'
op|')'
newline|'\n'
name|'meta_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|'"value"'
op|':'
name|'metadata'
op|'['
name|'meta_ref'
op|'.'
name|'key'
op|']'
op|'}'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'new_keys'
op|'='
name|'set'
op|'('
name|'all_keys'
op|')'
op|'-'
name|'set'
op|'('
name|'already_existing_keys'
op|')'
newline|'\n'
name|'for'
name|'key'
name|'in'
name|'new_keys'
op|':'
newline|'\n'
indent|' '
name|'meta_ref'
op|'='
name|'models'
op|'.'
name|'InstanceSystemMetadata'
op|'('
op|')'
newline|'\n'
name|'meta_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|'"key"'
op|':'
name|'key'
op|','
string|'"value"'
op|':'
name|'metadata'
op|'['
name|'key'
op|']'
op|','
nl|'\n'
string|'"instance_uuid"'
op|':'
name|'instance_uuid'
op|'}'
op|')'
newline|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'add'
op|'('
name|'meta_ref'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'metadata'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'####################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|agent_build_create
name|'def'
name|'agent_build_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'agent_build_ref'
op|'='
name|'models'
op|'.'
name|'AgentBuild'
op|'('
op|')'
newline|'\n'
name|'agent_build_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'agent_build_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'AgentBuildExists'
op|'('
name|'hypervisor'
op|'='
name|'values'
op|'['
string|"'hypervisor'"
op|']'
op|','
nl|'\n'
name|'os'
op|'='
name|'values'
op|'['
string|"'os'"
op|']'
op|','
name|'architecture'
op|'='
name|'values'
op|'['
string|"'architecture'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'agent_build_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|agent_build_get_by_triple
name|'def'
name|'agent_build_get_by_triple'
op|'('
name|'context'
op|','
name|'hypervisor'
op|','
name|'os'
op|','
name|'architecture'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'AgentBuild'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'hypervisor'
op|'='
name|'hypervisor'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'os'
op|'='
name|'os'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'architecture'
op|'='
name|'architecture'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|agent_build_get_all
name|'def'
name|'agent_build_get_all'
op|'('
name|'context'
op|','
name|'hypervisor'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'hypervisor'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'AgentBuild'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'hypervisor'
op|'='
name|'hypervisor'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'AgentBuild'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|agent_build_destroy
name|'def'
name|'agent_build_destroy'
op|'('
name|'context'
op|','
name|'agent_build_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'rows_affected'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'AgentBuild'
op|')'
op|'.'
name|'filter_by'
op|'('
nl|'\n'
name|'id'
op|'='
name|'agent_build_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'if'
name|'rows_affected'
op|'=='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'AgentBuildNotFound'
op|'('
name|'id'
op|'='
name|'agent_build_id'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|agent_build_update
name|'def'
name|'agent_build_update'
op|'('
name|'context'
op|','
name|'agent_build_id'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'rows_affected'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'AgentBuild'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'agent_build_id'
op|')'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'if'
name|'rows_affected'
op|'=='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'AgentBuildNotFound'
op|'('
name|'id'
op|'='
name|'agent_build_id'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'####################'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader_allow_async'
newline|'\n'
DECL|function|bw_usage_get
name|'def'
name|'bw_usage_get'
op|'('
name|'context'
op|','
name|'uuid'
op|','
name|'start_period'
op|','
name|'mac'
op|')'
op|':'
newline|'\n'
indent|' '
name|'values'
op|'='
op|'{'
string|"'start_period'"
op|':'
name|'start_period'
op|'}'
newline|'\n'
name|'values'
op|'='
name|'convert_objects_related_datetimes'
op|'('
name|'values'
op|','
string|"'start_period'"
op|')'
newline|'\n'
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'BandwidthUsage'
op|','
name|'read_deleted'
op|'='
string|'"yes"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'start_period'
op|'='
name|'values'
op|'['
string|"'start_period'"
op|']'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'uuid'
op|'='
name|'uuid'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'mac'
op|'='
name|'mac'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader_allow_async'
newline|'\n'
DECL|function|bw_usage_get_by_uuids
name|'def'
name|'bw_usage_get_by_uuids'
op|'('
name|'context'
op|','
name|'uuids'
op|','
name|'start_period'
op|')'
op|':'
newline|'\n'
indent|' '
name|'values'
op|'='
op|'{'
string|"'start_period'"
op|':'
name|'start_period'
op|'}'
newline|'\n'
name|'values'
op|'='
name|'convert_objects_related_datetimes'
op|'('
name|'values'
op|','
string|"'start_period'"
op|')'
newline|'\n'
name|'return'
op|'('
nl|'\n'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'BandwidthUsage'
op|','
name|'read_deleted'
op|'='
string|'"yes"'
op|')'
op|'.'
nl|'\n'
name|'filter'
op|'('
name|'models'
op|'.'
name|'BandwidthUsage'
op|'.'
name|'uuid'
op|'.'
name|'in_'
op|'('
name|'uuids'
op|')'
op|')'
op|'.'
nl|'\n'
name|'filter_by'
op|'('
name|'start_period'
op|'='
name|'values'
op|'['
string|"'start_period'"
op|']'
op|')'
op|'.'
nl|'\n'
name|'all'
op|'('
op|')'
nl|'\n'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'oslo_db_api'
op|'.'
name|'wrap_db_retry'
op|'('
name|'max_retries'
op|'='
number|'5'
op|','
name|'retry_on_deadlock'
op|'='
name|'True'
op|')'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|bw_usage_update
name|'def'
name|'bw_usage_update'
op|'('
name|'context'
op|','
name|'uuid'
op|','
name|'mac'
op|','
name|'start_period'
op|','
name|'bw_in'
op|','
name|'bw_out'
op|','
nl|'\n'
name|'last_ctr_in'
op|','
name|'last_ctr_out'
op|','
name|'last_refreshed'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
nl|'\n'
indent|' '
name|'if'
name|'last_refreshed'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'last_refreshed'
op|'='
name|'timeutils'
op|'.'
name|'utcnow'
op|'('
op|')'
newline|'\n'
nl|'\n'
comment|"# NOTE(comstud): More often than not, we'll be updating records vs"
nl|'\n'
comment|'# creating records. Optimize accordingly, trying to update existing'
nl|'\n'
comment|'# records. Fall back to creation when no rows are updated.'
nl|'\n'
dedent|''
name|'ts_values'
op|'='
op|'{'
string|"'last_refreshed'"
op|':'
name|'last_refreshed'
op|','
nl|'\n'
string|"'start_period'"
op|':'
name|'start_period'
op|'}'
newline|'\n'
name|'ts_keys'
op|'='
op|'('
string|"'start_period'"
op|','
string|"'last_refreshed'"
op|')'
newline|'\n'
name|'ts_values'
op|'='
name|'convert_objects_related_datetimes'
op|'('
name|'ts_values'
op|','
op|'*'
name|'ts_keys'
op|')'
newline|'\n'
name|'values'
op|'='
op|'{'
string|"'last_refreshed'"
op|':'
name|'ts_values'
op|'['
string|"'last_refreshed'"
op|']'
op|','
nl|'\n'
string|"'last_ctr_in'"
op|':'
name|'last_ctr_in'
op|','
nl|'\n'
string|"'last_ctr_out'"
op|':'
name|'last_ctr_out'
op|','
nl|'\n'
string|"'bw_in'"
op|':'
name|'bw_in'
op|','
nl|'\n'
string|"'bw_out'"
op|':'
name|'bw_out'
op|'}'
newline|'\n'
name|'bw_usage'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'BandwidthUsage'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|"'yes'"
op|')'
op|'.'
name|'filter_by'
op|'('
name|'start_period'
op|'='
name|'ts_values'
op|'['
string|"'start_period'"
op|']'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'uuid'
op|'='
name|'uuid'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'mac'
op|'='
name|'mac'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'bw_usage'
op|':'
newline|'\n'
indent|' '
name|'bw_usage'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'return'
name|'bw_usage'
newline|'\n'
nl|'\n'
dedent|''
name|'bwusage'
op|'='
name|'models'
op|'.'
name|'BandwidthUsage'
op|'('
op|')'
newline|'\n'
name|'bwusage'
op|'.'
name|'start_period'
op|'='
name|'ts_values'
op|'['
string|"'start_period'"
op|']'
newline|'\n'
name|'bwusage'
op|'.'
name|'uuid'
op|'='
name|'uuid'
newline|'\n'
name|'bwusage'
op|'.'
name|'mac'
op|'='
name|'mac'
newline|'\n'
name|'bwusage'
op|'.'
name|'last_refreshed'
op|'='
name|'ts_values'
op|'['
string|"'last_refreshed'"
op|']'
newline|'\n'
name|'bwusage'
op|'.'
name|'bw_in'
op|'='
name|'bw_in'
newline|'\n'
name|'bwusage'
op|'.'
name|'bw_out'
op|'='
name|'bw_out'
newline|'\n'
name|'bwusage'
op|'.'
name|'last_ctr_in'
op|'='
name|'last_ctr_in'
newline|'\n'
name|'bwusage'
op|'.'
name|'last_ctr_out'
op|'='
name|'last_ctr_out'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'bwusage'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
comment|'# NOTE(sirp): Possible race if two greenthreads attempt to create'
nl|'\n'
comment|'# the usage entry at the same time. First one wins.'
nl|'\n'
indent|' '
name|'pass'
newline|'\n'
dedent|''
name|'return'
name|'bwusage'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'####################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|vol_get_usage_by_time
name|'def'
name|'vol_get_usage_by_time'
op|'('
name|'context'
op|','
name|'begin'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Return volumes usage that have been updated after a specified time."""'
newline|'\n'
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'VolumeUsage'
op|','
name|'read_deleted'
op|'='
string|'"yes"'
op|')'
op|'.'
name|'filter'
op|'('
name|'or_'
op|'('
name|'models'
op|'.'
name|'VolumeUsage'
op|'.'
name|'tot_last_refreshed'
op|'=='
name|'null'
op|'('
op|')'
op|','
nl|'\n'
name|'models'
op|'.'
name|'VolumeUsage'
op|'.'
name|'tot_last_refreshed'
op|'>'
name|'begin'
op|','
nl|'\n'
name|'models'
op|'.'
name|'VolumeUsage'
op|'.'
name|'curr_last_refreshed'
op|'=='
name|'null'
op|'('
op|')'
op|','
nl|'\n'
name|'models'
op|'.'
name|'VolumeUsage'
op|'.'
name|'curr_last_refreshed'
op|'>'
name|'begin'
op|','
nl|'\n'
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|vol_usage_update
name|'def'
name|'vol_usage_update'
op|'('
name|'context'
op|','
name|'id'
op|','
name|'rd_req'
op|','
name|'rd_bytes'
op|','
name|'wr_req'
op|','
name|'wr_bytes'
op|','
nl|'\n'
name|'instance_id'
op|','
name|'project_id'
op|','
name|'user_id'
op|','
name|'availability_zone'
op|','
nl|'\n'
name|'update_totals'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
nl|'\n'
indent|' '
name|'refreshed'
op|'='
name|'timeutils'
op|'.'
name|'utcnow'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'values'
op|'='
op|'{'
op|'}'
newline|'\n'
comment|'# NOTE(dricco): We will be mostly updating current usage records vs'
nl|'\n'
comment|'# updating total or creating records. Optimize accordingly.'
nl|'\n'
name|'if'
name|'not'
name|'update_totals'
op|':'
newline|'\n'
indent|' '
name|'values'
op|'='
op|'{'
string|"'curr_last_refreshed'"
op|':'
name|'refreshed'
op|','
nl|'\n'
string|"'curr_reads'"
op|':'
name|'rd_req'
op|','
nl|'\n'
string|"'curr_read_bytes'"
op|':'
name|'rd_bytes'
op|','
nl|'\n'
string|"'curr_writes'"
op|':'
name|'wr_req'
op|','
nl|'\n'
string|"'curr_write_bytes'"
op|':'
name|'wr_bytes'
op|','
nl|'\n'
string|"'instance_uuid'"
op|':'
name|'instance_id'
op|','
nl|'\n'
string|"'project_id'"
op|':'
name|'project_id'
op|','
nl|'\n'
string|"'user_id'"
op|':'
name|'user_id'
op|','
nl|'\n'
string|"'availability_zone'"
op|':'
name|'availability_zone'
op|'}'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'values'
op|'='
op|'{'
string|"'tot_last_refreshed'"
op|':'
name|'refreshed'
op|','
nl|'\n'
string|"'tot_reads'"
op|':'
name|'models'
op|'.'
name|'VolumeUsage'
op|'.'
name|'tot_reads'
op|'+'
name|'rd_req'
op|','
nl|'\n'
string|"'tot_read_bytes'"
op|':'
name|'models'
op|'.'
name|'VolumeUsage'
op|'.'
name|'tot_read_bytes'
op|'+'
nl|'\n'
name|'rd_bytes'
op|','
nl|'\n'
string|"'tot_writes'"
op|':'
name|'models'
op|'.'
name|'VolumeUsage'
op|'.'
name|'tot_writes'
op|'+'
name|'wr_req'
op|','
nl|'\n'
string|"'tot_write_bytes'"
op|':'
name|'models'
op|'.'
name|'VolumeUsage'
op|'.'
name|'tot_write_bytes'
op|'+'
nl|'\n'
name|'wr_bytes'
op|','
nl|'\n'
string|"'curr_reads'"
op|':'
number|'0'
op|','
nl|'\n'
string|"'curr_read_bytes'"
op|':'
number|'0'
op|','
nl|'\n'
string|"'curr_writes'"
op|':'
number|'0'
op|','
nl|'\n'
string|"'curr_write_bytes'"
op|':'
number|'0'
op|','
nl|'\n'
string|"'instance_uuid'"
op|':'
name|'instance_id'
op|','
nl|'\n'
string|"'project_id'"
op|':'
name|'project_id'
op|','
nl|'\n'
string|"'user_id'"
op|':'
name|'user_id'
op|','
nl|'\n'
string|"'availability_zone'"
op|':'
name|'availability_zone'
op|'}'
newline|'\n'
nl|'\n'
dedent|''
name|'current_usage'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'VolumeUsage'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|'"yes"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'volume_id'
op|'='
name|'id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'current_usage'
op|':'
newline|'\n'
indent|' '
name|'if'
op|'('
name|'rd_req'
op|'<'
name|'current_usage'
op|'['
string|"'curr_reads'"
op|']'
name|'or'
nl|'\n'
name|'rd_bytes'
op|'<'
name|'current_usage'
op|'['
string|"'curr_read_bytes'"
op|']'
name|'or'
nl|'\n'
name|'wr_req'
op|'<'
name|'current_usage'
op|'['
string|"'curr_writes'"
op|']'
name|'or'
nl|'\n'
name|'wr_bytes'
op|'<'
name|'current_usage'
op|'['
string|"'curr_write_bytes'"
op|']'
op|')'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'info'
op|'('
name|'_LI'
op|'('
string|'"Volume(%s) has lower stats than what is in "'
nl|'\n'
string|'"the database. Instance must have been rebooted "'
nl|'\n'
string|'"or crashed. Updating totals."'
op|')'
op|','
name|'id'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'update_totals'
op|':'
newline|'\n'
indent|' '
name|'values'
op|'['
string|"'tot_reads'"
op|']'
op|'='
op|'('
name|'models'
op|'.'
name|'VolumeUsage'
op|'.'
name|'tot_reads'
op|'+'
nl|'\n'
name|'current_usage'
op|'['
string|"'curr_reads'"
op|']'
op|')'
newline|'\n'
name|'values'
op|'['
string|"'tot_read_bytes'"
op|']'
op|'='
op|'('
nl|'\n'
name|'models'
op|'.'
name|'VolumeUsage'
op|'.'
name|'tot_read_bytes'
op|'+'
nl|'\n'
name|'current_usage'
op|'['
string|"'curr_read_bytes'"
op|']'
op|')'
newline|'\n'
name|'values'
op|'['
string|"'tot_writes'"
op|']'
op|'='
op|'('
name|'models'
op|'.'
name|'VolumeUsage'
op|'.'
name|'tot_writes'
op|'+'
nl|'\n'
name|'current_usage'
op|'['
string|"'curr_writes'"
op|']'
op|')'
newline|'\n'
name|'values'
op|'['
string|"'tot_write_bytes'"
op|']'
op|'='
op|'('
nl|'\n'
name|'models'
op|'.'
name|'VolumeUsage'
op|'.'
name|'tot_write_bytes'
op|'+'
nl|'\n'
name|'current_usage'
op|'['
string|"'curr_write_bytes'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'values'
op|'['
string|"'tot_reads'"
op|']'
op|'='
op|'('
name|'models'
op|'.'
name|'VolumeUsage'
op|'.'
name|'tot_reads'
op|'+'
nl|'\n'
name|'current_usage'
op|'['
string|"'curr_reads'"
op|']'
op|'+'
nl|'\n'
name|'rd_req'
op|')'
newline|'\n'
name|'values'
op|'['
string|"'tot_read_bytes'"
op|']'
op|'='
op|'('
nl|'\n'
name|'models'
op|'.'
name|'VolumeUsage'
op|'.'
name|'tot_read_bytes'
op|'+'
nl|'\n'
name|'current_usage'
op|'['
string|"'curr_read_bytes'"
op|']'
op|'+'
name|'rd_bytes'
op|')'
newline|'\n'
name|'values'
op|'['
string|"'tot_writes'"
op|']'
op|'='
op|'('
name|'models'
op|'.'
name|'VolumeUsage'
op|'.'
name|'tot_writes'
op|'+'
nl|'\n'
name|'current_usage'
op|'['
string|"'curr_writes'"
op|']'
op|'+'
nl|'\n'
name|'wr_req'
op|')'
newline|'\n'
name|'values'
op|'['
string|"'tot_write_bytes'"
op|']'
op|'='
op|'('
nl|'\n'
name|'models'
op|'.'
name|'VolumeUsage'
op|'.'
name|'tot_write_bytes'
op|'+'
nl|'\n'
name|'current_usage'
op|'['
string|"'curr_write_bytes'"
op|']'
op|'+'
name|'wr_bytes'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'current_usage'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'current_usage'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'refresh'
op|'('
name|'current_usage'
op|')'
newline|'\n'
name|'return'
name|'current_usage'
newline|'\n'
nl|'\n'
dedent|''
name|'vol_usage'
op|'='
name|'models'
op|'.'
name|'VolumeUsage'
op|'('
op|')'
newline|'\n'
name|'vol_usage'
op|'.'
name|'volume_id'
op|'='
name|'id'
newline|'\n'
name|'vol_usage'
op|'.'
name|'instance_uuid'
op|'='
name|'instance_id'
newline|'\n'
name|'vol_usage'
op|'.'
name|'project_id'
op|'='
name|'project_id'
newline|'\n'
name|'vol_usage'
op|'.'
name|'user_id'
op|'='
name|'user_id'
newline|'\n'
name|'vol_usage'
op|'.'
name|'availability_zone'
op|'='
name|'availability_zone'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'update_totals'
op|':'
newline|'\n'
indent|' '
name|'vol_usage'
op|'.'
name|'curr_last_refreshed'
op|'='
name|'refreshed'
newline|'\n'
name|'vol_usage'
op|'.'
name|'curr_reads'
op|'='
name|'rd_req'
newline|'\n'
name|'vol_usage'
op|'.'
name|'curr_read_bytes'
op|'='
name|'rd_bytes'
newline|'\n'
name|'vol_usage'
op|'.'
name|'curr_writes'
op|'='
name|'wr_req'
newline|'\n'
name|'vol_usage'
op|'.'
name|'curr_write_bytes'
op|'='
name|'wr_bytes'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'vol_usage'
op|'.'
name|'tot_last_refreshed'
op|'='
name|'refreshed'
newline|'\n'
name|'vol_usage'
op|'.'
name|'tot_reads'
op|'='
name|'rd_req'
newline|'\n'
name|'vol_usage'
op|'.'
name|'tot_read_bytes'
op|'='
name|'rd_bytes'
newline|'\n'
name|'vol_usage'
op|'.'
name|'tot_writes'
op|'='
name|'wr_req'
newline|'\n'
name|'vol_usage'
op|'.'
name|'tot_write_bytes'
op|'='
name|'wr_bytes'
newline|'\n'
nl|'\n'
dedent|''
name|'vol_usage'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
nl|'\n'
name|'return'
name|'vol_usage'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'####################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|s3_image_get
name|'def'
name|'s3_image_get'
op|'('
name|'context'
op|','
name|'image_id'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Find local s3 image represented by the provided id."""'
newline|'\n'
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'S3Image'
op|','
name|'read_deleted'
op|'='
string|'"yes"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'image_id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ImageNotFound'
op|'('
name|'image_id'
op|'='
name|'image_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|s3_image_get_by_uuid
name|'def'
name|'s3_image_get_by_uuid'
op|'('
name|'context'
op|','
name|'image_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Find local s3 image represented by the provided uuid."""'
newline|'\n'
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'S3Image'
op|','
name|'read_deleted'
op|'='
string|'"yes"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'uuid'
op|'='
name|'image_uuid'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ImageNotFound'
op|'('
name|'image_id'
op|'='
name|'image_uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|s3_image_create
name|'def'
name|'s3_image_create'
op|'('
name|'context'
op|','
name|'image_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Create local s3 image represented by provided uuid."""'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'s3_image_ref'
op|'='
name|'models'
op|'.'
name|'S3Image'
op|'('
op|')'
newline|'\n'
name|'s3_image_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|"'uuid'"
op|':'
name|'image_uuid'
op|'}'
op|')'
newline|'\n'
name|'s3_image_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'Exception'
name|'as'
name|'e'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'db_exc'
op|'.'
name|'DBError'
op|'('
name|'e'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'s3_image_ref'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'####################'
nl|'\n'
nl|'\n'
nl|'\n'
DECL|function|_aggregate_get_query
dedent|''
name|'def'
name|'_aggregate_get_query'
op|'('
name|'context'
op|','
name|'model_class'
op|','
name|'id_field'
op|'='
name|'None'
op|','
name|'id'
op|'='
name|'None'
op|','
nl|'\n'
name|'read_deleted'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'columns_to_join'
op|'='
op|'{'
name|'models'
op|'.'
name|'Aggregate'
op|':'
op|'['
string|"'_hosts'"
op|','
string|"'_metadata'"
op|']'
op|'}'
newline|'\n'
nl|'\n'
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'model_class'
op|','
name|'read_deleted'
op|'='
name|'read_deleted'
op|')'
newline|'\n'
nl|'\n'
name|'for'
name|'c'
name|'in'
name|'columns_to_join'
op|'.'
name|'get'
op|'('
name|'model_class'
op|','
op|'['
op|']'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
name|'c'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'id'
name|'and'
name|'id_field'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'id_field'
op|'=='
name|'id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'query'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|aggregate_create
name|'def'
name|'aggregate_create'
op|'('
name|'context'
op|','
name|'values'
op|','
name|'metadata'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'_aggregate_get_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Aggregate'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Aggregate'
op|'.'
name|'name'
op|','
nl|'\n'
name|'values'
op|'['
string|"'name'"
op|']'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|"'no'"
op|')'
newline|'\n'
name|'aggregate'
op|'='
name|'query'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'aggregate'
op|':'
newline|'\n'
indent|' '
name|'aggregate'
op|'='
name|'models'
op|'.'
name|'Aggregate'
op|'('
op|')'
newline|'\n'
name|'aggregate'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'aggregate'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
comment|"# We don't want these to be lazy loaded later. We know there is"
nl|'\n'
comment|'# nothing here since we just created this aggregate.'
nl|'\n'
name|'aggregate'
op|'.'
name|'_hosts'
op|'='
op|'['
op|']'
newline|'\n'
name|'aggregate'
op|'.'
name|'_metadata'
op|'='
op|'['
op|']'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'AggregateNameExists'
op|'('
name|'aggregate_name'
op|'='
name|'values'
op|'['
string|"'name'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'if'
name|'metadata'
op|':'
newline|'\n'
indent|' '
name|'aggregate_metadata_add'
op|'('
name|'context'
op|','
name|'aggregate'
op|'.'
name|'id'
op|','
name|'metadata'
op|')'
newline|'\n'
comment|"# NOTE(pkholkin): '_metadata' attribute was updated during"
nl|'\n'
comment|"# 'aggregate_metadata_add' method, so it should be expired and"
nl|'\n'
comment|'# read from db'
nl|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'expire'
op|'('
name|'aggregate'
op|','
op|'['
string|"'_metadata'"
op|']'
op|')'
newline|'\n'
name|'aggregate'
op|'.'
name|'_metadata'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'aggregate'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|aggregate_get
name|'def'
name|'aggregate_get'
op|'('
name|'context'
op|','
name|'aggregate_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'_aggregate_get_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Aggregate'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Aggregate'
op|'.'
name|'id'
op|','
nl|'\n'
name|'aggregate_id'
op|')'
newline|'\n'
name|'aggregate'
op|'='
name|'query'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'aggregate'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'AggregateNotFound'
op|'('
name|'aggregate_id'
op|'='
name|'aggregate_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'aggregate'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|aggregate_get_by_host
name|'def'
name|'aggregate_get_by_host'
op|'('
name|'context'
op|','
name|'host'
op|','
name|'key'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Return rows that match host (mandatory) and metadata key (optional).\n\n :param host Matches host, and is required.\n :param key Matches metadata key, if not None.\n """'
newline|'\n'
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Aggregate'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'_hosts'"
op|')'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'_metadata'"
op|')'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'join'
op|'('
string|"'_hosts'"
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'AggregateHost'
op|'.'
name|'host'
op|'=='
name|'host'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'key'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'join'
op|'('
string|'"_metadata"'
op|')'
op|'.'
name|'filter'
op|'('
nl|'\n'
name|'models'
op|'.'
name|'AggregateMetadata'
op|'.'
name|'key'
op|'=='
name|'key'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'query'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|aggregate_metadata_get_by_host
name|'def'
name|'aggregate_metadata_get_by_host'
op|'('
name|'context'
op|','
name|'host'
op|','
name|'key'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Aggregate'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'join'
op|'('
string|'"_hosts"'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'join'
op|'('
string|'"_metadata"'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'AggregateHost'
op|'.'
name|'host'
op|'=='
name|'host'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'contains_eager'
op|'('
string|'"_metadata"'
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'key'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'AggregateMetadata'
op|'.'
name|'key'
op|'=='
name|'key'
op|')'
newline|'\n'
dedent|''
name|'rows'
op|'='
name|'query'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'metadata'
op|'='
name|'collections'
op|'.'
name|'defaultdict'
op|'('
name|'set'
op|')'
newline|'\n'
name|'for'
name|'agg'
name|'in'
name|'rows'
op|':'
newline|'\n'
indent|' '
name|'for'
name|'kv'
name|'in'
name|'agg'
op|'.'
name|'_metadata'
op|':'
newline|'\n'
indent|' '
name|'metadata'
op|'['
name|'kv'
op|'['
string|"'key'"
op|']'
op|']'
op|'.'
name|'add'
op|'('
name|'kv'
op|'['
string|"'value'"
op|']'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'return'
name|'dict'
op|'('
name|'metadata'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|aggregate_get_by_metadata_key
name|'def'
name|'aggregate_get_by_metadata_key'
op|'('
name|'context'
op|','
name|'key'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Return rows that match metadata key.\n\n :param key Matches metadata key.\n """'
newline|'\n'
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Aggregate'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'join'
op|'('
string|'"_metadata"'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'AggregateMetadata'
op|'.'
name|'key'
op|'=='
name|'key'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'contains_eager'
op|'('
string|'"_metadata"'
op|')'
op|')'
newline|'\n'
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|'"_hosts"'
op|')'
op|')'
newline|'\n'
name|'return'
name|'query'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|aggregate_update
name|'def'
name|'aggregate_update'
op|'('
name|'context'
op|','
name|'aggregate_id'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
string|'"name"'
name|'in'
name|'values'
op|':'
newline|'\n'
indent|' '
name|'aggregate_by_name'
op|'='
op|'('
name|'_aggregate_get_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Aggregate'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Aggregate'
op|'.'
name|'name'
op|','
nl|'\n'
name|'values'
op|'['
string|"'name'"
op|']'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|"'no'"
op|')'
op|'.'
name|'first'
op|'('
op|')'
op|')'
newline|'\n'
name|'if'
name|'aggregate_by_name'
name|'and'
name|'aggregate_by_name'
op|'.'
name|'id'
op|'!='
name|'aggregate_id'
op|':'
newline|'\n'
comment|'# there is another aggregate with the new name'
nl|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'AggregateNameExists'
op|'('
name|'aggregate_name'
op|'='
name|'values'
op|'['
string|"'name'"
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'aggregate'
op|'='
op|'('
name|'_aggregate_get_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Aggregate'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Aggregate'
op|'.'
name|'id'
op|','
nl|'\n'
name|'aggregate_id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'set_delete'
op|'='
name|'True'
newline|'\n'
name|'if'
name|'aggregate'
op|':'
newline|'\n'
indent|' '
name|'if'
string|'"availability_zone"'
name|'in'
name|'values'
op|':'
newline|'\n'
indent|' '
name|'az'
op|'='
name|'values'
op|'.'
name|'pop'
op|'('
string|"'availability_zone'"
op|')'
newline|'\n'
name|'if'
string|"'metadata'"
name|'not'
name|'in'
name|'values'
op|':'
newline|'\n'
indent|' '
name|'values'
op|'['
string|"'metadata'"
op|']'
op|'='
op|'{'
string|"'availability_zone'"
op|':'
name|'az'
op|'}'
newline|'\n'
name|'set_delete'
op|'='
name|'False'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'values'
op|'['
string|"'metadata'"
op|']'
op|'['
string|"'availability_zone'"
op|']'
op|'='
name|'az'
newline|'\n'
dedent|''
dedent|''
name|'metadata'
op|'='
name|'values'
op|'.'
name|'get'
op|'('
string|"'metadata'"
op|')'
newline|'\n'
name|'if'
name|'metadata'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'aggregate_metadata_add'
op|'('
name|'context'
op|','
nl|'\n'
name|'aggregate_id'
op|','
nl|'\n'
name|'values'
op|'.'
name|'pop'
op|'('
string|"'metadata'"
op|')'
op|','
nl|'\n'
name|'set_delete'
op|'='
name|'set_delete'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'aggregate'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'aggregate'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
name|'return'
name|'aggregate_get'
op|'('
name|'context'
op|','
name|'aggregate'
op|'.'
name|'id'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'AggregateNotFound'
op|'('
name|'aggregate_id'
op|'='
name|'aggregate_id'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
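`aggregate_update` treats `availability_zone` as sugar for a metadata key: it is popped from `values` and merged into `values['metadata']`, and `set_delete` is cleared when the caller supplied no metadata dict so that existing keys survive. A standalone sketch of just that dict manipulation (function name hypothetical):

```python
def fold_availability_zone(values):
    """Move 'availability_zone' into values['metadata'], mirroring the
    branch in aggregate_update; returns the mutated dict and set_delete."""
    set_delete = True
    if 'availability_zone' in values:
        az = values.pop('availability_zone')
        if 'metadata' not in values:
            # only the AZ is being set: don't wipe other metadata keys
            values['metadata'] = {'availability_zone': az}
            set_delete = False
        else:
            values['metadata']['availability_zone'] = az
    return values, set_delete
```

When the caller passes an explicit metadata dict, `set_delete` stays `True` and keys absent from that dict are soft-deleted downstream by `aggregate_metadata_add`.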
dedent|''
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|aggregate_delete
name|'def'
name|'aggregate_delete'
op|'('
name|'context'
op|','
name|'aggregate_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'count'
op|'='
name|'_aggregate_get_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Aggregate'
op|','
nl|'\n'
name|'models'
op|'.'
name|'Aggregate'
op|'.'
name|'id'
op|','
nl|'\n'
name|'aggregate_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'if'
name|'count'
op|'=='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'AggregateNotFound'
op|'('
name|'aggregate_id'
op|'='
name|'aggregate_id'
op|')'
newline|'\n'
nl|'\n'
comment|'# Delete Metadata'
nl|'\n'
dedent|''
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'AggregateMetadata'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'aggregate_id'
op|'='
name|'aggregate_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|aggregate_get_all
name|'def'
name|'aggregate_get_all'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_aggregate_get_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Aggregate'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_aggregate_metadata_get_query
dedent|''
name|'def'
name|'_aggregate_metadata_get_query'
op|'('
name|'context'
op|','
name|'aggregate_id'
op|','
name|'read_deleted'
op|'='
string|'"yes"'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'AggregateMetadata'
op|','
nl|'\n'
name|'read_deleted'
op|'='
name|'read_deleted'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'aggregate_id'
op|'='
name|'aggregate_id'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_aggregate_exists'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|aggregate_metadata_get
name|'def'
name|'aggregate_metadata_get'
op|'('
name|'context'
op|','
name|'aggregate_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'rows'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'AggregateMetadata'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'aggregate_id'
op|'='
name|'aggregate_id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'return'
op|'{'
name|'r'
op|'['
string|"'key'"
op|']'
op|':'
name|'r'
op|'['
string|"'value'"
op|']'
name|'for'
name|'r'
name|'in'
name|'rows'
op|'}'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_aggregate_exists'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|aggregate_metadata_delete
name|'def'
name|'aggregate_metadata_delete'
op|'('
name|'context'
op|','
name|'aggregate_id'
op|','
name|'key'
op|')'
op|':'
newline|'\n'
indent|' '
name|'count'
op|'='
name|'_aggregate_get_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'AggregateMetadata'
op|','
nl|'\n'
name|'models'
op|'.'
name|'AggregateMetadata'
op|'.'
name|'aggregate_id'
op|','
nl|'\n'
name|'aggregate_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'key'
op|'='
name|'key'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'if'
name|'count'
op|'=='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'AggregateMetadataNotFound'
op|'('
name|'aggregate_id'
op|'='
name|'aggregate_id'
op|','
nl|'\n'
name|'metadata_key'
op|'='
name|'key'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'require_aggregate_exists'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|aggregate_metadata_add
name|'def'
name|'aggregate_metadata_add'
op|'('
name|'context'
op|','
name|'aggregate_id'
op|','
name|'metadata'
op|','
name|'set_delete'
op|'='
name|'False'
op|','
nl|'\n'
name|'max_retries'
op|'='
number|'10'
op|')'
op|':'
newline|'\n'
indent|' '
name|'all_keys'
op|'='
name|'metadata'
op|'.'
name|'keys'
op|'('
op|')'
newline|'\n'
name|'for'
name|'attempt'
name|'in'
name|'range'
op|'('
name|'max_retries'
op|')'
op|':'
newline|'\n'
indent|' '
name|'try'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'_aggregate_metadata_get_query'
op|'('
name|'context'
op|','
name|'aggregate_id'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|"'no'"
op|')'
newline|'\n'
name|'if'
name|'set_delete'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'.'
name|'filter'
op|'('
op|'~'
name|'models'
op|'.'
name|'AggregateMetadata'
op|'.'
name|'key'
op|'.'
name|'in_'
op|'('
name|'all_keys'
op|')'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'already_existing_keys'
op|'='
name|'set'
op|'('
op|')'
newline|'\n'
name|'if'
name|'all_keys'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
nl|'\n'
name|'models'
op|'.'
name|'AggregateMetadata'
op|'.'
name|'key'
op|'.'
name|'in_'
op|'('
name|'all_keys'
op|')'
op|')'
newline|'\n'
name|'for'
name|'meta_ref'
name|'in'
name|'query'
op|'.'
name|'all'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'key'
op|'='
name|'meta_ref'
op|'.'
name|'key'
newline|'\n'
name|'meta_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|'"value"'
op|':'
name|'metadata'
op|'['
name|'key'
op|']'
op|'}'
op|')'
newline|'\n'
name|'already_existing_keys'
op|'.'
name|'add'
op|'('
name|'key'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'new_entries'
op|'='
op|'['
op|']'
newline|'\n'
name|'for'
name|'key'
op|','
name|'value'
name|'in'
name|'metadata'
op|'.'
name|'items'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'key'
name|'in'
name|'already_existing_keys'
op|':'
newline|'\n'
indent|' '
name|'continue'
newline|'\n'
dedent|''
name|'new_entries'
op|'.'
name|'append'
op|'('
op|'{'
string|'"key"'
op|':'
name|'key'
op|','
nl|'\n'
string|'"value"'
op|':'
name|'value'
op|','
nl|'\n'
string|'"aggregate_id"'
op|':'
name|'aggregate_id'
op|'}'
op|')'
newline|'\n'
dedent|''
name|'if'
name|'new_entries'
op|':'
newline|'\n'
indent|' '
name|'context'
op|'.'
name|'session'
op|'.'
name|'execute'
op|'('
nl|'\n'
name|'models'
op|'.'
name|'AggregateMetadata'
op|'.'
name|'__table__'
op|'.'
name|'insert'
op|'('
op|')'
op|','
nl|'\n'
name|'new_entries'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'metadata'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
comment|'# a concurrent transaction has been committed,'
nl|'\n'
comment|'# try again unless this was the last attempt'
nl|'\n'
indent|' '
name|'with'
name|'excutils'
op|'.'
name|'save_and_reraise_exception'
op|'('
op|')'
name|'as'
name|'ctxt'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'attempt'
op|'<'
name|'max_retries'
op|'-'
number|'1'
op|':'
newline|'\n'
indent|' '
name|'ctxt'
op|'.'
name|'reraise'
op|'='
name|'False'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'msg'
op|'='
name|'_'
op|'('
string|'"Add metadata failed for aggregate %(id)s after "'
nl|'\n'
string|'"%(retries)s retries"'
op|')'
op|'%'
op|'{'
string|'"id"'
op|':'
name|'aggregate_id'
op|','
nl|'\n'
string|'"retries"'
op|':'
name|'max_retries'
op|'}'
newline|'\n'
name|'LOG'
op|'.'
name|'warning'
op|'('
name|'msg'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
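`aggregate_metadata_add` retries the whole read-update-insert sequence up to `max_retries` times when a concurrent transaction wins the race and the bulk insert raises `DBDuplicateEntry`; only the final attempt lets the exception propagate. The retry skeleton in isolation, with a stand-in exception in place of oslo.db's `DBDuplicateEntry`:

```python
class DuplicateEntry(Exception):
    """Stand-in for oslo.db's DBDuplicateEntry."""

def add_with_retries(do_add, max_retries=10):
    """Retry do_add() on DuplicateEntry; re-raise only on the last attempt."""
    for attempt in range(max_retries):
        try:
            return do_add()
        except DuplicateEntry:
            if attempt == max_retries - 1:
                raise  # last attempt: propagate, as the real code re-raises

# Simulate two racing failures followed by success.
calls = {'n': 0}
def flaky_add():
    calls['n'] += 1
    if calls['n'] < 3:
        raise DuplicateEntry()
    return 'metadata-saved'

result = add_with_retries(flaky_add)
```

The real code uses `excutils.save_and_reraise_exception()` for the same effect, setting `ctxt.reraise = False` on non-final attempts and logging a warning when retries are exhausted.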
dedent|''
dedent|''
dedent|''
dedent|''
dedent|''
op|'@'
name|'require_aggregate_exists'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|aggregate_host_get_all
name|'def'
name|'aggregate_host_get_all'
op|'('
name|'context'
op|','
name|'aggregate_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'rows'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'AggregateHost'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'aggregate_id'
op|'='
name|'aggregate_id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'return'
op|'['
name|'r'
op|'.'
name|'host'
name|'for'
name|'r'
name|'in'
name|'rows'
op|']'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_aggregate_exists'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|aggregate_host_delete
name|'def'
name|'aggregate_host_delete'
op|'('
name|'context'
op|','
name|'aggregate_id'
op|','
name|'host'
op|')'
op|':'
newline|'\n'
indent|' '
name|'count'
op|'='
name|'_aggregate_get_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'AggregateHost'
op|','
nl|'\n'
name|'models'
op|'.'
name|'AggregateHost'
op|'.'
name|'aggregate_id'
op|','
nl|'\n'
name|'aggregate_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'host'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'if'
name|'count'
op|'=='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'AggregateHostNotFound'
op|'('
name|'aggregate_id'
op|'='
name|'aggregate_id'
op|','
nl|'\n'
name|'host'
op|'='
name|'host'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'require_aggregate_exists'
newline|'\n'
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|aggregate_host_add
name|'def'
name|'aggregate_host_add'
op|'('
name|'context'
op|','
name|'aggregate_id'
op|','
name|'host'
op|')'
op|':'
newline|'\n'
indent|' '
name|'host_ref'
op|'='
name|'models'
op|'.'
name|'AggregateHost'
op|'('
op|')'
newline|'\n'
name|'host_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|'"host"'
op|':'
name|'host'
op|','
string|'"aggregate_id"'
op|':'
name|'aggregate_id'
op|'}'
op|')'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'host_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'AggregateHostExists'
op|'('
name|'host'
op|'='
name|'host'
op|','
nl|'\n'
name|'aggregate_id'
op|'='
name|'aggregate_id'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'host_ref'
newline|'\n'
nl|'\n'
nl|'\n'
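`aggregate_host_add` relies on the database's unique constraint: it simply inserts and translates the backend's duplicate-key error into the API-level `AggregateHostExists`. A sketch of that translate-on-conflict idiom, with stand-in exception classes and an in-memory set playing the role of the unique constraint:

```python
class DBDuplicateEntry(Exception):
    """Stand-in for the oslo.db duplicate-key error."""

class AggregateHostExists(Exception):
    """Stand-in for nova's API-level exception."""

def db_save(store, key):
    """Fake save that enforces uniqueness like the DB constraint."""
    if key in store:
        raise DBDuplicateEntry()
    store.add(key)

def host_add(store, aggregate_id, host):
    """Insert (aggregate_id, host); translate duplicates into a domain error."""
    try:
        db_save(store, (aggregate_id, host))
    except DBDuplicateEntry:
        raise AggregateHostExists(
            'host %s already in aggregate %s' % (host, aggregate_id))
    return {'host': host, 'aggregate_id': aggregate_id}
```

Letting the constraint detect the conflict avoids a check-then-insert race between concurrent callers.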
comment|'################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|instance_fault_create
name|'def'
name|'instance_fault_create'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Create a new InstanceFault."""'
newline|'\n'
name|'fault_ref'
op|'='
name|'models'
op|'.'
name|'InstanceFault'
op|'('
op|')'
newline|'\n'
name|'fault_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'fault_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
name|'return'
name|'dict'
op|'('
name|'fault_ref'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|instance_fault_get_by_instance_uuids
name|'def'
name|'instance_fault_get_by_instance_uuids'
op|'('
name|'context'
op|','
name|'instance_uuids'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Get all instance faults for the provided instance_uuids."""'
newline|'\n'
name|'if'
name|'not'
name|'instance_uuids'
op|':'
newline|'\n'
indent|' '
name|'return'
op|'{'
op|'}'
newline|'\n'
nl|'\n'
dedent|''
name|'rows'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceFault'
op|','
name|'read_deleted'
op|'='
string|"'no'"
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'InstanceFault'
op|'.'
name|'instance_uuid'
op|'.'
name|'in_'
op|'('
nl|'\n'
name|'instance_uuids'
op|')'
op|')'
op|'.'
name|'order_by'
op|'('
name|'desc'
op|'('
string|'"created_at"'
op|')'
op|','
name|'desc'
op|'('
string|'"id"'
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'output'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'for'
name|'instance_uuid'
name|'in'
name|'instance_uuids'
op|':'
newline|'\n'
indent|' '
name|'output'
op|'['
name|'instance_uuid'
op|']'
op|'='
op|'['
op|']'
newline|'\n'
nl|'\n'
dedent|''
name|'for'
name|'row'
name|'in'
name|'rows'
op|':'
newline|'\n'
indent|' '
name|'data'
op|'='
name|'dict'
op|'('
name|'row'
op|')'
newline|'\n'
name|'output'
op|'['
name|'row'
op|'['
string|"'instance_uuid'"
op|']'
op|']'
op|'.'
name|'append'
op|'('
name|'data'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'output'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'##################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|action_start
name|'def'
name|'action_start'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'convert_objects_related_datetimes'
op|'('
name|'values'
op|','
string|"'start_time'"
op|')'
newline|'\n'
name|'action_ref'
op|'='
name|'models'
op|'.'
name|'InstanceAction'
op|'('
op|')'
newline|'\n'
name|'action_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'action_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
name|'return'
name|'action_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|action_finish
name|'def'
name|'action_finish'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'convert_objects_related_datetimes'
op|'('
name|'values'
op|','
string|"'start_time'"
op|','
string|"'finish_time'"
op|')'
newline|'\n'
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceAction'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'values'
op|'['
string|"'instance_uuid'"
op|']'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'request_id'
op|'='
name|'values'
op|'['
string|"'request_id'"
op|']'
op|')'
newline|'\n'
name|'if'
name|'query'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
op|'!='
number|'1'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceActionNotFound'
op|'('
nl|'\n'
name|'request_id'
op|'='
name|'values'
op|'['
string|"'request_id'"
op|']'
op|','
nl|'\n'
name|'instance_uuid'
op|'='
name|'values'
op|'['
string|"'instance_uuid'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'query'
op|'.'
name|'one'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
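`action_finish` issues a bulk UPDATE filtered by `(instance_uuid, request_id)` and checks that exactly one row changed; any other count means the action record is missing and `InstanceActionNotFound` is raised. The same update-and-verify pattern over an in-memory list of action dicts (helper names hypothetical):

```python
class InstanceActionNotFound(Exception):
    pass

def finish_action(actions, instance_uuid, request_id, values):
    """Update the matching action; require exactly one match, like the
    `query.update(values) != 1` check in action_finish."""
    matches = [a for a in actions
               if a['instance_uuid'] == instance_uuid
               and a['request_id'] == request_id]
    if len(matches) != 1:
        raise InstanceActionNotFound(instance_uuid, request_id)
    matches[0].update(values)
    return matches[0]
```

Checking the rowcount of the UPDATE lets the DB code detect the missing record without a separate SELECT.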
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|actions_get
name|'def'
name|'actions_get'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Get all instance actions for the provided uuid."""'
newline|'\n'
name|'actions'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceAction'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'order_by'
op|'('
name|'desc'
op|'('
string|'"created_at"'
op|')'
op|','
name|'desc'
op|'('
string|'"id"'
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'return'
name|'actions'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|action_get_by_request_id
name|'def'
name|'action_get_by_request_id'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'request_id'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Get the action by request_id and given instance."""'
newline|'\n'
name|'action'
op|'='
name|'_action_get_by_request_id'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'request_id'
op|')'
newline|'\n'
name|'return'
name|'action'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_action_get_by_request_id
dedent|''
name|'def'
name|'_action_get_by_request_id'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'request_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceAction'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'request_id'
op|'='
name|'request_id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_action_get_last_created_by_instance_uuid
dedent|''
name|'def'
name|'_action_get_last_created_by_instance_uuid'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
op|'('
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceAction'
op|')'
op|'.'
nl|'\n'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
nl|'\n'
name|'order_by'
op|'('
name|'desc'
op|'('
string|'"created_at"'
op|')'
op|','
name|'desc'
op|'('
string|'"id"'
op|')'
op|')'
op|'.'
nl|'\n'
name|'first'
op|'('
op|')'
op|')'
newline|'\n'
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|action_event_start
name|'def'
name|'action_event_start'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Start an event on an instance action."""'
newline|'\n'
name|'convert_objects_related_datetimes'
op|'('
name|'values'
op|','
string|"'start_time'"
op|')'
newline|'\n'
name|'action'
op|'='
name|'_action_get_by_request_id'
op|'('
name|'context'
op|','
name|'values'
op|'['
string|"'instance_uuid'"
op|']'
op|','
nl|'\n'
name|'values'
op|'['
string|"'request_id'"
op|']'
op|')'
newline|'\n'
comment|'# When nova-compute restarts, the context is generated again in'
nl|'\n'
comment|'# init_host workflow, the request_id was different with the request_id'
nl|'\n'
comment|"# recorded in InstanceAction, so we can't get the original record"
nl|'\n'
comment|'# according to request_id. Try to get the last created action so that'
nl|'\n'
comment|'# init_instance can continue to finish the recovery action, like:'
nl|'\n'
comment|'# powering_off, unpausing, and so on.'
nl|'\n'
name|'if'
name|'not'
name|'action'
name|'and'
name|'not'
name|'context'
op|'.'
name|'project_id'
op|':'
newline|'\n'
indent|' '
name|'action'
op|'='
name|'_action_get_last_created_by_instance_uuid'
op|'('
nl|'\n'
name|'context'
op|','
name|'values'
op|'['
string|"'instance_uuid'"
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'not'
name|'action'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceActionNotFound'
op|'('
nl|'\n'
name|'request_id'
op|'='
name|'values'
op|'['
string|"'request_id'"
op|']'
op|','
nl|'\n'
name|'instance_uuid'
op|'='
name|'values'
op|'['
string|"'instance_uuid'"
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'values'
op|'['
string|"'action_id'"
op|']'
op|'='
name|'action'
op|'['
string|"'id'"
op|']'
newline|'\n'
nl|'\n'
name|'event_ref'
op|'='
name|'models'
op|'.'
name|'InstanceActionEvent'
op|'('
op|')'
newline|'\n'
name|'event_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'add'
op|'('
name|'event_ref'
op|')'
newline|'\n'
name|'return'
name|'event_ref'
newline|'\n'
nl|'\n'
nl|'\n'
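Both `action_event_start` and `action_event_finish` fall back to the most recently created action when no record matches the request_id and the context carries no project_id, covering the nova-compute-restart case described in the comments above. The fallback selection alone, over plain dicts (helper name hypothetical):

```python
def find_action(actions, instance_uuid, request_id, project_id):
    """Prefer an exact request_id match; otherwise, for service contexts
    (no project_id), fall back to the newest action for the instance."""
    exact = [a for a in actions
             if a['instance_uuid'] == instance_uuid
             and a['request_id'] == request_id]
    if exact:
        return exact[0]
    if project_id is None:
        candidates = [a for a in actions
                      if a['instance_uuid'] == instance_uuid]
        # newest first: order by (created_at, id) descending, like the query
        candidates.sort(key=lambda a: (a['created_at'], a['id']), reverse=True)
        return candidates[0] if candidates else None
    return None
```

Restricting the fallback to contexts without a project_id keeps ordinary user requests from silently attaching events to an unrelated action.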
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|action_event_finish
name|'def'
name|'action_event_finish'
op|'('
name|'context'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Finish an event on an instance action."""'
newline|'\n'
name|'convert_objects_related_datetimes'
op|'('
name|'values'
op|','
string|"'start_time'"
op|','
string|"'finish_time'"
op|')'
newline|'\n'
name|'action'
op|'='
name|'_action_get_by_request_id'
op|'('
name|'context'
op|','
name|'values'
op|'['
string|"'instance_uuid'"
op|']'
op|','
nl|'\n'
name|'values'
op|'['
string|"'request_id'"
op|']'
op|')'
newline|'\n'
comment|'# When nova-compute restarts, the context is generated again in'
nl|'\n'
comment|'# init_host workflow, the request_id was different with the request_id'
nl|'\n'
comment|"# recorded in InstanceAction, so we can't get the original record"
nl|'\n'
comment|'# according to request_id. Try to get the last created action so that'
nl|'\n'
comment|'# init_instance can continue to finish the recovery action, like:'
nl|'\n'
comment|'# powering_off, unpausing, and so on.'
nl|'\n'
name|'if'
name|'not'
name|'action'
name|'and'
name|'not'
name|'context'
op|'.'
name|'project_id'
op|':'
newline|'\n'
indent|' '
name|'action'
op|'='
name|'_action_get_last_created_by_instance_uuid'
op|'('
nl|'\n'
name|'context'
op|','
name|'values'
op|'['
string|"'instance_uuid'"
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'not'
name|'action'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceActionNotFound'
op|'('
nl|'\n'
name|'request_id'
op|'='
name|'values'
op|'['
string|"'request_id'"
op|']'
op|','
nl|'\n'
name|'instance_uuid'
op|'='
name|'values'
op|'['
string|"'instance_uuid'"
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'event_ref'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceActionEvent'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'action_id'
op|'='
name|'action'
op|'['
string|"'id'"
op|']'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'event'
op|'='
name|'values'
op|'['
string|"'event'"
op|']'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'event_ref'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceActionEventNotFound'
op|'('
name|'action_id'
op|'='
name|'action'
op|'['
string|"'id'"
op|']'
op|','
nl|'\n'
name|'event'
op|'='
name|'values'
op|'['
string|"'event'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'event_ref'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'values'
op|'['
string|"'result'"
op|']'
op|'.'
name|'lower'
op|'('
op|')'
op|'=='
string|"'error'"
op|':'
newline|'\n'
indent|' '
name|'action'
op|'.'
name|'update'
op|'('
op|'{'
string|"'message'"
op|':'
string|"'Error'"
op|'}'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'event_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|action_events_get
name|'def'
name|'action_events_get'
op|'('
name|'context'
op|','
name|'action_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'events'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceActionEvent'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'action_id'
op|'='
name|'action_id'
op|')'
op|'.'
name|'order_by'
op|'('
name|'desc'
op|'('
string|'"created_at"'
op|')'
op|','
name|'desc'
op|'('
string|'"id"'
op|')'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'return'
name|'events'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|action_event_get_by_id
name|'def'
name|'action_event_get_by_id'
op|'('
name|'context'
op|','
name|'action_id'
op|','
name|'event_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'event'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceActionEvent'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'action_id'
op|'='
name|'action_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'event_id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'return'
name|'event'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'##################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|ec2_instance_create
name|'def'
name|'ec2_instance_create'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'id'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Create ec2 compatible instance by provided uuid."""'
newline|'\n'
name|'ec2_instance_ref'
op|'='
name|'models'
op|'.'
name|'InstanceIdMapping'
op|'('
op|')'
newline|'\n'
name|'ec2_instance_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|"'uuid'"
op|':'
name|'instance_uuid'
op|'}'
op|')'
newline|'\n'
name|'if'
name|'id'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'ec2_instance_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|"'id'"
op|':'
name|'id'
op|'}'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'ec2_instance_ref'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
nl|'\n'
name|'return'
name|'ec2_instance_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|ec2_instance_get_by_uuid
name|'def'
name|'ec2_instance_get_by_uuid'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'_ec2_instance_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceNotFound'
op|'('
name|'instance_id'
op|'='
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|ec2_instance_get_by_id
name|'def'
name|'ec2_instance_get_by_id'
op|'('
name|'context'
op|','
name|'instance_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'_ec2_instance_get_query'
op|'('
name|'context'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'instance_id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceNotFound'
op|'('
name|'instance_id'
op|'='
name|'instance_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'result'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|get_instance_uuid_by_ec2_id
name|'def'
name|'get_instance_uuid_by_ec2_id'
op|'('
name|'context'
op|','
name|'ec2_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'ec2_instance_get_by_id'
op|'('
name|'context'
op|','
name|'ec2_id'
op|')'
newline|'\n'
name|'return'
name|'result'
op|'['
string|"'uuid'"
op|']'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_ec2_instance_get_query
dedent|''
name|'def'
name|'_ec2_instance_get_query'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceIdMapping'
op|','
name|'read_deleted'
op|'='
string|"'yes'"
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'##################'
nl|'\n'
nl|'\n'
nl|'\n'
DECL|function|_task_log_get_query
dedent|''
name|'def'
name|'_task_log_get_query'
op|'('
name|'context'
op|','
name|'task_name'
op|','
name|'period_beginning'
op|','
nl|'\n'
name|'period_ending'
op|','
name|'host'
op|'='
name|'None'
op|','
name|'state'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'values'
op|'='
op|'{'
string|"'period_beginning'"
op|':'
name|'period_beginning'
op|','
nl|'\n'
string|"'period_ending'"
op|':'
name|'period_ending'
op|'}'
newline|'\n'
name|'values'
op|'='
name|'convert_objects_related_datetimes'
op|'('
name|'values'
op|','
op|'*'
name|'values'
op|'.'
name|'keys'
op|'('
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'TaskLog'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'task_name'
op|'='
name|'task_name'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'period_beginning'
op|'='
name|'values'
op|'['
string|"'period_beginning'"
op|']'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'period_ending'
op|'='
name|'values'
op|'['
string|"'period_ending'"
op|']'
op|')'
newline|'\n'
name|'if'
name|'host'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter_by'
op|'('
name|'host'
op|'='
name|'host'
op|')'
newline|'\n'
dedent|''
name|'if'
name|'state'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter_by'
op|'('
name|'state'
op|'='
name|'state'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'query'
newline|'\n'
nl|'\n'
nl|'\n'
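`_task_log_get_query` builds the base query from the mandatory `task_name` and period bounds, then chains `host` and `state` filters only when those arguments were supplied. The same conditional-filter idea over plain dicts instead of a SQLAlchemy query (helper name hypothetical):

```python
def task_log_filter(logs, task_name, period_beginning, period_ending,
                    host=None, state=None):
    """Filter task-log records; add host/state predicates only if given."""
    result = [log for log in logs
              if log['task_name'] == task_name
              and log['period_beginning'] == period_beginning
              and log['period_ending'] == period_ending]
    if host is not None:
        result = [log for log in result if log['host'] == host]
    if state is not None:
        result = [log for log in result if log['state'] == state]
    return result
```

Testing against `is not None` rather than truthiness matters in the real code too: it lets a caller filter on falsy-but-valid values without the predicate being silently skipped.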
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|task_log_get
name|'def'
name|'task_log_get'
op|'('
name|'context'
op|','
name|'task_name'
op|','
name|'period_beginning'
op|','
name|'period_ending'
op|','
name|'host'
op|','
nl|'\n'
name|'state'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_task_log_get_query'
op|'('
name|'context'
op|','
name|'task_name'
op|','
name|'period_beginning'
op|','
nl|'\n'
name|'period_ending'
op|','
name|'host'
op|','
name|'state'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|task_log_get_all
name|'def'
name|'task_log_get_all'
op|'('
name|'context'
op|','
name|'task_name'
op|','
name|'period_beginning'
op|','
name|'period_ending'
op|','
nl|'\n'
name|'host'
op|'='
name|'None'
op|','
name|'state'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_task_log_get_query'
op|'('
name|'context'
op|','
name|'task_name'
op|','
name|'period_beginning'
op|','
nl|'\n'
name|'period_ending'
op|','
name|'host'
op|','
name|'state'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|task_log_begin_task
name|'def'
name|'task_log_begin_task'
op|'('
name|'context'
op|','
name|'task_name'
op|','
name|'period_beginning'
op|','
name|'period_ending'
op|','
nl|'\n'
name|'host'
op|','
name|'task_items'
op|'='
name|'None'
op|','
name|'message'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'values'
op|'='
op|'{'
string|"'period_beginning'"
op|':'
name|'period_beginning'
op|','
nl|'\n'
string|"'period_ending'"
op|':'
name|'period_ending'
op|'}'
newline|'\n'
name|'values'
op|'='
name|'convert_objects_related_datetimes'
op|'('
name|'values'
op|','
op|'*'
name|'values'
op|'.'
name|'keys'
op|'('
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'task'
op|'='
name|'models'
op|'.'
name|'TaskLog'
op|'('
op|')'
newline|'\n'
name|'task'
op|'.'
name|'task_name'
op|'='
name|'task_name'
newline|'\n'
name|'task'
op|'.'
name|'period_beginning'
op|'='
name|'values'
op|'['
string|"'period_beginning'"
op|']'
newline|'\n'
name|'task'
op|'.'
name|'period_ending'
op|'='
name|'values'
op|'['
string|"'period_ending'"
op|']'
newline|'\n'
name|'task'
op|'.'
name|'host'
op|'='
name|'host'
newline|'\n'
name|'task'
op|'.'
name|'state'
op|'='
string|'"RUNNING"'
newline|'\n'
name|'if'
name|'message'
op|':'
newline|'\n'
indent|' '
name|'task'
op|'.'
name|'message'
op|'='
name|'message'
newline|'\n'
dedent|''
name|'if'
name|'task_items'
op|':'
newline|'\n'
indent|' '
name|'task'
op|'.'
name|'task_items'
op|'='
name|'task_items'
newline|'\n'
dedent|''
name|'try'
op|':'
newline|'\n'
indent|' '
name|'task'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'TaskAlreadyRunning'
op|'('
name|'task_name'
op|'='
name|'task_name'
op|','
name|'host'
op|'='
name|'host'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|task_log_end_task
name|'def'
name|'task_log_end_task'
op|'('
name|'context'
op|','
name|'task_name'
op|','
name|'period_beginning'
op|','
name|'period_ending'
op|','
nl|'\n'
name|'host'
op|','
name|'errors'
op|','
name|'message'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'values'
op|'='
name|'dict'
op|'('
name|'state'
op|'='
string|'"DONE"'
op|','
name|'errors'
op|'='
name|'errors'
op|')'
newline|'\n'
name|'if'
name|'message'
op|':'
newline|'\n'
indent|' '
name|'values'
op|'['
string|'"message"'
op|']'
op|'='
name|'message'
newline|'\n'
nl|'\n'
dedent|''
name|'rows'
op|'='
name|'_task_log_get_query'
op|'('
name|'context'
op|','
name|'task_name'
op|','
name|'period_beginning'
op|','
nl|'\n'
name|'period_ending'
op|','
name|'host'
op|')'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'if'
name|'rows'
op|'=='
number|'0'
op|':'
newline|'\n'
comment|"# It's not running!"
nl|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'TaskNotRunning'
op|'('
name|'task_name'
op|'='
name|'task_name'
op|','
name|'host'
op|'='
name|'host'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'##################'
nl|'\n'
nl|'\n'
nl|'\n'
DECL|function|_archive_deleted_rows_for_table
dedent|''
dedent|''
name|'def'
name|'_archive_deleted_rows_for_table'
op|'('
name|'tablename'
op|','
name|'max_rows'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Move up to max_rows rows from one table to the corresponding\n shadow table.\n\n :returns: number of rows archived\n """'
newline|'\n'
comment|'# NOTE(guochbo): There is a circular import, nova.db.sqlalchemy.utils'
nl|'\n'
comment|'# imports nova.db.sqlalchemy.api.'
nl|'\n'
name|'from'
name|'nova'
op|'.'
name|'db'
op|'.'
name|'sqlalchemy'
name|'import'
name|'utils'
name|'as'
name|'db_utils'
newline|'\n'
nl|'\n'
name|'engine'
op|'='
name|'get_engine'
op|'('
op|')'
newline|'\n'
name|'conn'
op|'='
name|'engine'
op|'.'
name|'connect'
op|'('
op|')'
newline|'\n'
name|'metadata'
op|'='
name|'MetaData'
op|'('
op|')'
newline|'\n'
name|'metadata'
op|'.'
name|'bind'
op|'='
name|'engine'
newline|'\n'
comment|'# NOTE(tdurakov): table metadata should be received'
nl|'\n'
comment|'# from models, not db tables. Default value specified by SoftDeleteMixin'
nl|'\n'
comment|'# is known only by models, not DB layer.'
nl|'\n'
comment|'# IMPORTANT: please do not change the source of metadata information for the table.'
nl|'\n'
name|'table'
op|'='
name|'models'
op|'.'
name|'BASE'
op|'.'
name|'metadata'
op|'.'
name|'tables'
op|'['
name|'tablename'
op|']'
newline|'\n'
nl|'\n'
name|'shadow_tablename'
op|'='
name|'_SHADOW_TABLE_PREFIX'
op|'+'
name|'tablename'
newline|'\n'
name|'rows_archived'
op|'='
number|'0'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'shadow_table'
op|'='
name|'Table'
op|'('
name|'shadow_tablename'
op|','
name|'metadata'
op|','
name|'autoload'
op|'='
name|'True'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'NoSuchTableError'
op|':'
newline|'\n'
comment|'# No corresponding shadow table; skip it.'
nl|'\n'
indent|' '
name|'return'
name|'rows_archived'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'tablename'
op|'=='
string|'"dns_domains"'
op|':'
newline|'\n'
comment|'# We have one table (dns_domains) where the key is called'
nl|'\n'
comment|'# "domain" rather than "id"'
nl|'\n'
indent|' '
name|'column'
op|'='
name|'table'
op|'.'
name|'c'
op|'.'
name|'domain'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'column'
op|'='
name|'table'
op|'.'
name|'c'
op|'.'
name|'id'
newline|'\n'
comment|'# NOTE(guochbo): Use DeleteFromSelect to avoid'
nl|'\n'
comment|"# database's limit of maximum parameter in one SQL statement."
nl|'\n'
dedent|''
name|'deleted_column'
op|'='
name|'table'
op|'.'
name|'c'
op|'.'
name|'deleted'
newline|'\n'
name|'columns'
op|'='
op|'['
name|'c'
op|'.'
name|'name'
name|'for'
name|'c'
name|'in'
name|'table'
op|'.'
name|'c'
op|']'
newline|'\n'
nl|'\n'
comment|'# NOTE(clecomte): Tables instance_actions and instance_actions_events'
nl|'\n'
comment|'# have to be managed differently so we soft-delete them here to let'
nl|'\n'
comment|'# the archive work the same for all tables'
nl|'\n'
name|'if'
name|'tablename'
op|'=='
string|'"instance_actions"'
op|':'
newline|'\n'
indent|' '
name|'instances'
op|'='
name|'models'
op|'.'
name|'BASE'
op|'.'
name|'metadata'
op|'.'
name|'tables'
op|'['
string|'"instances"'
op|']'
newline|'\n'
name|'deleted_instances'
op|'='
name|'sql'
op|'.'
name|'select'
op|'('
op|'['
name|'instances'
op|'.'
name|'c'
op|'.'
name|'uuid'
op|']'
op|')'
op|'.'
name|'where'
op|'('
name|'instances'
op|'.'
name|'c'
op|'.'
name|'deleted'
op|'!='
name|'instances'
op|'.'
name|'c'
op|'.'
name|'deleted'
op|'.'
name|'default'
op|'.'
name|'arg'
op|')'
newline|'\n'
name|'update_statement'
op|'='
name|'table'
op|'.'
name|'update'
op|'('
op|')'
op|'.'
name|'values'
op|'('
name|'deleted'
op|'='
name|'table'
op|'.'
name|'c'
op|'.'
name|'id'
op|')'
op|'.'
name|'where'
op|'('
name|'table'
op|'.'
name|'c'
op|'.'
name|'instance_uuid'
op|'.'
name|'in_'
op|'('
name|'deleted_instances'
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'conn'
op|'.'
name|'execute'
op|'('
name|'update_statement'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'elif'
name|'tablename'
op|'=='
string|'"instance_actions_events"'
op|':'
newline|'\n'
comment|'# NOTE(clecomte): we have to grab all the relations from'
nl|'\n'
comment|'# instances because instance_actions_events relies on'
nl|'\n'
comment|'# action_id and not uuid'
nl|'\n'
indent|' '
name|'instances'
op|'='
name|'models'
op|'.'
name|'BASE'
op|'.'
name|'metadata'
op|'.'
name|'tables'
op|'['
string|'"instances"'
op|']'
newline|'\n'
name|'instance_actions'
op|'='
name|'models'
op|'.'
name|'BASE'
op|'.'
name|'metadata'
op|'.'
name|'tables'
op|'['
string|'"instance_actions"'
op|']'
newline|'\n'
name|'deleted_instances'
op|'='
name|'sql'
op|'.'
name|'select'
op|'('
op|'['
name|'instances'
op|'.'
name|'c'
op|'.'
name|'uuid'
op|']'
op|')'
op|'.'
name|'where'
op|'('
name|'instances'
op|'.'
name|'c'
op|'.'
name|'deleted'
op|'!='
name|'instances'
op|'.'
name|'c'
op|'.'
name|'deleted'
op|'.'
name|'default'
op|'.'
name|'arg'
op|')'
newline|'\n'
name|'deleted_actions'
op|'='
name|'sql'
op|'.'
name|'select'
op|'('
op|'['
name|'instance_actions'
op|'.'
name|'c'
op|'.'
name|'id'
op|']'
op|')'
op|'.'
name|'where'
op|'('
name|'instance_actions'
op|'.'
name|'c'
op|'.'
name|'instance_uuid'
op|'.'
name|'in_'
op|'('
name|'deleted_instances'
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'update_statement'
op|'='
name|'table'
op|'.'
name|'update'
op|'('
op|')'
op|'.'
name|'values'
op|'('
name|'deleted'
op|'='
name|'table'
op|'.'
name|'c'
op|'.'
name|'id'
op|')'
op|'.'
name|'where'
op|'('
name|'table'
op|'.'
name|'c'
op|'.'
name|'action_id'
op|'.'
name|'in_'
op|'('
name|'deleted_actions'
op|')'
op|')'
newline|'\n'
nl|'\n'
name|'conn'
op|'.'
name|'execute'
op|'('
name|'update_statement'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'insert'
op|'='
name|'shadow_table'
op|'.'
name|'insert'
op|'('
name|'inline'
op|'='
name|'True'
op|')'
op|'.'
name|'from_select'
op|'('
name|'columns'
op|','
nl|'\n'
name|'sql'
op|'.'
name|'select'
op|'('
op|'['
name|'table'
op|']'
op|','
nl|'\n'
name|'deleted_column'
op|'!='
name|'deleted_column'
op|'.'
name|'default'
op|'.'
name|'arg'
op|')'
op|'.'
nl|'\n'
name|'order_by'
op|'('
name|'column'
op|')'
op|'.'
name|'limit'
op|'('
name|'max_rows'
op|')'
op|')'
newline|'\n'
name|'query_delete'
op|'='
name|'sql'
op|'.'
name|'select'
op|'('
op|'['
name|'column'
op|']'
op|','
nl|'\n'
name|'deleted_column'
op|'!='
name|'deleted_column'
op|'.'
name|'default'
op|'.'
name|'arg'
op|')'
op|'.'
name|'order_by'
op|'('
name|'column'
op|')'
op|'.'
name|'limit'
op|'('
name|'max_rows'
op|')'
newline|'\n'
nl|'\n'
name|'delete_statement'
op|'='
name|'db_utils'
op|'.'
name|'DeleteFromSelect'
op|'('
name|'table'
op|','
name|'query_delete'
op|','
name|'column'
op|')'
newline|'\n'
name|'try'
op|':'
newline|'\n'
comment|'# Group the insert and delete in a transaction.'
nl|'\n'
indent|' '
name|'with'
name|'conn'
op|'.'
name|'begin'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'conn'
op|'.'
name|'execute'
op|'('
name|'insert'
op|')'
newline|'\n'
name|'result_delete'
op|'='
name|'conn'
op|'.'
name|'execute'
op|'('
name|'delete_statement'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBReferenceError'
name|'as'
name|'ex'
op|':'
newline|'\n'
comment|'# A foreign key constraint keeps us from deleting some of'
nl|'\n'
comment|'# these rows until we clean up a dependent table. Just'
nl|'\n'
comment|"# skip this table for now; we'll come back to it later."
nl|'\n'
indent|' '
name|'LOG'
op|'.'
name|'warning'
op|'('
name|'_LW'
op|'('
string|'"IntegrityError detected when archiving table "'
nl|'\n'
string|'"%(tablename)s: %(error)s"'
op|')'
op|','
nl|'\n'
op|'{'
string|"'tablename'"
op|':'
name|'tablename'
op|','
string|"'error'"
op|':'
name|'six'
op|'.'
name|'text_type'
op|'('
name|'ex'
op|')'
op|'}'
op|')'
newline|'\n'
name|'return'
name|'rows_archived'
newline|'\n'
nl|'\n'
dedent|''
name|'rows_archived'
op|'='
name|'result_delete'
op|'.'
name|'rowcount'
newline|'\n'
nl|'\n'
name|'return'
name|'rows_archived'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|archive_deleted_rows
dedent|''
name|'def'
name|'archive_deleted_rows'
op|'('
name|'max_rows'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Move up to max_rows rows from production tables to the corresponding\n shadow tables.\n\n :returns: dict that maps table name to number of rows archived from that\n table, for example:\n\n ::\n\n {\n \'instances\': 5,\n \'block_device_mapping\': 5,\n \'pci_devices\': 2,\n }\n\n """'
newline|'\n'
name|'table_to_rows_archived'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'total_rows_archived'
op|'='
number|'0'
newline|'\n'
name|'meta'
op|'='
name|'MetaData'
op|'('
name|'get_engine'
op|'('
name|'use_slave'
op|'='
name|'True'
op|')'
op|')'
newline|'\n'
name|'meta'
op|'.'
name|'reflect'
op|'('
op|')'
newline|'\n'
comment|'# Reverse sort the tables so we get the leaf nodes first for processing.'
nl|'\n'
name|'for'
name|'table'
name|'in'
name|'reversed'
op|'('
name|'meta'
op|'.'
name|'sorted_tables'
op|')'
op|':'
newline|'\n'
indent|' '
name|'tablename'
op|'='
name|'table'
op|'.'
name|'name'
newline|'\n'
comment|'# skip the special sqlalchemy-migrate migrate_version table and any'
nl|'\n'
comment|'# shadow tables'
nl|'\n'
name|'if'
op|'('
name|'tablename'
op|'=='
string|"'migrate_version'"
name|'or'
nl|'\n'
name|'tablename'
op|'.'
name|'startswith'
op|'('
name|'_SHADOW_TABLE_PREFIX'
op|')'
op|')'
op|':'
newline|'\n'
indent|' '
name|'continue'
newline|'\n'
dedent|''
name|'rows_archived'
op|'='
name|'_archive_deleted_rows_for_table'
op|'('
nl|'\n'
name|'tablename'
op|','
name|'max_rows'
op|'='
name|'max_rows'
op|'-'
name|'total_rows_archived'
op|')'
newline|'\n'
name|'total_rows_archived'
op|'+='
name|'rows_archived'
newline|'\n'
comment|'# Only report results for tables that had updates.'
nl|'\n'
name|'if'
name|'rows_archived'
op|':'
newline|'\n'
indent|' '
name|'table_to_rows_archived'
op|'['
name|'tablename'
op|']'
op|'='
name|'rows_archived'
newline|'\n'
dedent|''
name|'if'
name|'total_rows_archived'
op|'>='
name|'max_rows'
op|':'
newline|'\n'
indent|' '
name|'break'
newline|'\n'
dedent|''
dedent|''
name|'return'
name|'table_to_rows_archived'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|pcidevice_online_data_migration
name|'def'
name|'pcidevice_online_data_migration'
op|'('
name|'context'
op|','
name|'max_count'
op|')'
op|':'
newline|'\n'
indent|' '
name|'from'
name|'nova'
op|'.'
name|'objects'
name|'import'
name|'pci_device'
name|'as'
name|'pci_dev_obj'
newline|'\n'
nl|'\n'
name|'count_all'
op|'='
number|'0'
newline|'\n'
name|'count_hit'
op|'='
number|'0'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'pci_dev_obj'
op|'.'
name|'PciDevice'
op|'.'
name|'should_migrate_data'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'error'
op|'('
name|'_LE'
op|'('
string|'"Data migrations for PciDevice are not safe, likely "'
nl|'\n'
string|'"because not all services that access the DB directly "'
nl|'\n'
string|'"are updated to the latest version"'
op|')'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'results'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'PciDevice'
op|')'
op|'.'
name|'filter_by'
op|'('
nl|'\n'
name|'parent_addr'
op|'='
name|'None'
op|')'
op|'.'
name|'limit'
op|'('
name|'max_count'
op|')'
newline|'\n'
nl|'\n'
name|'for'
name|'db_dict'
name|'in'
name|'results'
op|':'
newline|'\n'
indent|' '
name|'count_all'
op|'+='
number|'1'
newline|'\n'
name|'pci_dev'
op|'='
name|'pci_dev_obj'
op|'.'
name|'PciDevice'
op|'.'
name|'_from_db_object'
op|'('
nl|'\n'
name|'context'
op|','
name|'pci_dev_obj'
op|'.'
name|'PciDevice'
op|'('
op|')'
op|','
name|'db_dict'
op|')'
newline|'\n'
name|'if'
name|'pci_dev'
op|'.'
name|'obj_what_changed'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'pci_dev'
op|'.'
name|'save'
op|'('
op|')'
newline|'\n'
name|'count_hit'
op|'+='
number|'1'
newline|'\n'
dedent|''
dedent|''
dedent|''
name|'return'
name|'count_all'
op|','
name|'count_hit'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|aggregate_uuids_online_data_migration
name|'def'
name|'aggregate_uuids_online_data_migration'
op|'('
name|'context'
op|','
name|'max_count'
op|')'
op|':'
newline|'\n'
indent|' '
name|'from'
name|'nova'
op|'.'
name|'objects'
name|'import'
name|'aggregate'
newline|'\n'
nl|'\n'
name|'count_all'
op|'='
number|'0'
newline|'\n'
name|'count_hit'
op|'='
number|'0'
newline|'\n'
nl|'\n'
name|'results'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'Aggregate'
op|')'
op|'.'
name|'filter_by'
op|'('
nl|'\n'
name|'uuid'
op|'='
name|'None'
op|')'
op|'.'
name|'limit'
op|'('
name|'max_count'
op|')'
newline|'\n'
name|'for'
name|'db_agg'
name|'in'
name|'results'
op|':'
newline|'\n'
indent|' '
name|'count_all'
op|'+='
number|'1'
newline|'\n'
name|'agg'
op|'='
name|'aggregate'
op|'.'
name|'Aggregate'
op|'.'
name|'_from_db_object'
op|'('
name|'context'
op|','
nl|'\n'
name|'aggregate'
op|'.'
name|'Aggregate'
op|'('
op|')'
op|','
nl|'\n'
name|'db_agg'
op|')'
newline|'\n'
name|'if'
string|"'uuid'"
name|'in'
name|'agg'
op|':'
newline|'\n'
indent|' '
name|'count_hit'
op|'+='
number|'1'
newline|'\n'
dedent|''
dedent|''
name|'return'
name|'count_all'
op|','
name|'count_hit'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'####################'
nl|'\n'
nl|'\n'
nl|'\n'
DECL|function|_instance_group_get_query
dedent|''
name|'def'
name|'_instance_group_get_query'
op|'('
name|'context'
op|','
name|'model_class'
op|','
name|'id_field'
op|'='
name|'None'
op|','
name|'id'
op|'='
name|'None'
op|','
nl|'\n'
name|'read_deleted'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'columns_to_join'
op|'='
op|'{'
name|'models'
op|'.'
name|'InstanceGroup'
op|':'
op|'['
string|"'_policies'"
op|','
string|"'_members'"
op|']'
op|'}'
newline|'\n'
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'model_class'
op|','
name|'read_deleted'
op|'='
name|'read_deleted'
op|','
nl|'\n'
name|'project_only'
op|'='
name|'True'
op|')'
newline|'\n'
name|'for'
name|'c'
name|'in'
name|'columns_to_join'
op|'.'
name|'get'
op|'('
name|'model_class'
op|','
op|'['
op|']'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
name|'c'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'id'
name|'and'
name|'id_field'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'id_field'
op|'=='
name|'id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'query'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|instance_group_create
name|'def'
name|'instance_group_create'
op|'('
name|'context'
op|','
name|'values'
op|','
name|'policies'
op|'='
name|'None'
op|','
name|'members'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Create a new group."""'
newline|'\n'
name|'uuid'
op|'='
name|'values'
op|'.'
name|'get'
op|'('
string|"'uuid'"
op|','
name|'None'
op|')'
newline|'\n'
name|'if'
name|'uuid'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'uuid'
op|'='
name|'uuidutils'
op|'.'
name|'generate_uuid'
op|'('
op|')'
newline|'\n'
name|'values'
op|'['
string|"'uuid'"
op|']'
op|'='
name|'uuid'
newline|'\n'
nl|'\n'
dedent|''
name|'try'
op|':'
newline|'\n'
indent|' '
name|'group'
op|'='
name|'models'
op|'.'
name|'InstanceGroup'
op|'('
op|')'
newline|'\n'
name|'group'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'group'
op|'.'
name|'save'
op|'('
name|'context'
op|'.'
name|'session'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceGroupIdExists'
op|'('
name|'group_uuid'
op|'='
name|'uuid'
op|')'
newline|'\n'
nl|'\n'
comment|"# We don't want '_policies' and '_members' attributes to be lazy loaded"
nl|'\n'
comment|'# later. We know there is nothing here since we just created this'
nl|'\n'
comment|'# instance group.'
nl|'\n'
dedent|''
name|'if'
name|'policies'
op|':'
newline|'\n'
indent|' '
name|'_instance_group_policies_add'
op|'('
name|'context'
op|','
name|'group'
op|'.'
name|'id'
op|','
name|'policies'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'group'
op|'.'
name|'_policies'
op|'='
op|'['
op|']'
newline|'\n'
dedent|''
name|'if'
name|'members'
op|':'
newline|'\n'
indent|' '
name|'_instance_group_members_add'
op|'('
name|'context'
op|','
name|'group'
op|'.'
name|'id'
op|','
name|'members'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'group'
op|'.'
name|'_members'
op|'='
op|'['
op|']'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'instance_group_get'
op|'('
name|'context'
op|','
name|'uuid'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|instance_group_get
name|'def'
name|'instance_group_get'
op|'('
name|'context'
op|','
name|'group_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Get a specific group by uuid."""'
newline|'\n'
name|'group'
op|'='
name|'_instance_group_get_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'InstanceGroup'
op|','
nl|'\n'
name|'models'
op|'.'
name|'InstanceGroup'
op|'.'
name|'uuid'
op|','
nl|'\n'
name|'group_uuid'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'group'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceGroupNotFound'
op|'('
name|'group_uuid'
op|'='
name|'group_uuid'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'group'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|instance_group_get_by_instance
name|'def'
name|'instance_group_get_by_instance'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'group_member'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceGroupMember'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_id'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'group_member'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceGroupNotFound'
op|'('
name|'group_uuid'
op|'='
string|"''"
op|')'
newline|'\n'
dedent|''
name|'group'
op|'='
name|'_instance_group_get_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceGroup'
op|','
nl|'\n'
name|'models'
op|'.'
name|'InstanceGroup'
op|'.'
name|'id'
op|','
nl|'\n'
name|'group_member'
op|'.'
name|'group_id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'group'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceGroupNotFound'
op|'('
nl|'\n'
name|'group_uuid'
op|'='
name|'group_member'
op|'.'
name|'group_id'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'group'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|instance_group_update
name|'def'
name|'instance_group_update'
op|'('
name|'context'
op|','
name|'group_uuid'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Update the attributes of a group.\n\n If values contains a metadata key, it updates the group metadata\n too. Similarly for the policies and members.\n """'
newline|'\n'
name|'group'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceGroup'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'uuid'
op|'='
name|'group_uuid'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'group'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceGroupNotFound'
op|'('
name|'group_uuid'
op|'='
name|'group_uuid'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'policies'
op|'='
name|'values'
op|'.'
name|'get'
op|'('
string|"'policies'"
op|')'
newline|'\n'
name|'if'
name|'policies'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'_instance_group_policies_add'
op|'('
name|'context'
op|','
nl|'\n'
name|'group'
op|'.'
name|'id'
op|','
nl|'\n'
name|'values'
op|'.'
name|'pop'
op|'('
string|"'policies'"
op|')'
op|','
nl|'\n'
name|'set_delete'
op|'='
name|'True'
op|')'
newline|'\n'
dedent|''
name|'members'
op|'='
name|'values'
op|'.'
name|'get'
op|'('
string|"'members'"
op|')'
newline|'\n'
name|'if'
name|'members'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'_instance_group_members_add'
op|'('
name|'context'
op|','
nl|'\n'
name|'group'
op|'.'
name|'id'
op|','
nl|'\n'
name|'values'
op|'.'
name|'pop'
op|'('
string|"'members'"
op|')'
op|','
nl|'\n'
name|'set_delete'
op|'='
name|'True'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'group'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'policies'
op|':'
newline|'\n'
indent|' '
name|'values'
op|'['
string|"'policies'"
op|']'
op|'='
name|'policies'
newline|'\n'
dedent|''
name|'if'
name|'members'
op|':'
newline|'\n'
indent|' '
name|'values'
op|'['
string|"'members'"
op|']'
op|'='
name|'members'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|instance_group_delete
name|'def'
name|'instance_group_delete'
op|'('
name|'context'
op|','
name|'group_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Delete a group."""'
newline|'\n'
name|'group_id'
op|'='
name|'_instance_group_id'
op|'('
name|'context'
op|','
name|'group_uuid'
op|')'
newline|'\n'
nl|'\n'
name|'count'
op|'='
name|'_instance_group_get_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'InstanceGroup'
op|','
nl|'\n'
name|'models'
op|'.'
name|'InstanceGroup'
op|'.'
name|'uuid'
op|','
nl|'\n'
name|'group_uuid'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'if'
name|'count'
op|'=='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceGroupNotFound'
op|'('
name|'group_uuid'
op|'='
name|'group_uuid'
op|')'
newline|'\n'
nl|'\n'
comment|'# Delete policies, metadata and members'
nl|'\n'
dedent|''
name|'instance_models'
op|'='
op|'['
name|'models'
op|'.'
name|'InstanceGroupPolicy'
op|','
nl|'\n'
name|'models'
op|'.'
name|'InstanceGroupMember'
op|']'
newline|'\n'
name|'for'
name|'model'
name|'in'
name|'instance_models'
op|':'
newline|'\n'
indent|' '
name|'model_query'
op|'('
name|'context'
op|','
name|'model'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'group_id'
op|'='
name|'group_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|instance_group_get_all
name|'def'
name|'instance_group_get_all'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Get all groups."""'
newline|'\n'
name|'return'
name|'_instance_group_get_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceGroup'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|instance_group_get_all_by_project_id
name|'def'
name|'instance_group_get_all_by_project_id'
op|'('
name|'context'
op|','
name|'project_id'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Get all groups for a specific project_id."""'
newline|'\n'
name|'return'
name|'_instance_group_get_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceGroup'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_instance_group_count_by_project_and_user
dedent|''
name|'def'
name|'_instance_group_count_by_project_and_user'
op|'('
name|'context'
op|','
name|'project_id'
op|','
name|'user_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'InstanceGroup'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'project_id'
op|'='
name|'project_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'user_id'
op|'='
name|'user_id'
op|')'
op|'.'
name|'count'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_instance_group_model_get_query
dedent|''
name|'def'
name|'_instance_group_model_get_query'
op|'('
name|'context'
op|','
name|'model_class'
op|','
name|'group_id'
op|','
nl|'\n'
name|'read_deleted'
op|'='
string|"'no'"
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'model_class'
op|','
nl|'\n'
name|'read_deleted'
op|'='
name|'read_deleted'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'group_id'
op|'='
name|'group_id'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_instance_group_id
dedent|''
name|'def'
name|'_instance_group_id'
op|'('
name|'context'
op|','
name|'group_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Returns the group database ID for the group UUID."""'
newline|'\n'
nl|'\n'
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'InstanceGroup'
op|','
nl|'\n'
op|'('
name|'models'
op|'.'
name|'InstanceGroup'
op|'.'
name|'id'
op|','
op|')'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'uuid'
op|'='
name|'group_uuid'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceGroupNotFound'
op|'('
name|'group_uuid'
op|'='
name|'group_uuid'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'result'
op|'.'
name|'id'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_instance_group_members_add
dedent|''
name|'def'
name|'_instance_group_members_add'
op|'('
name|'context'
op|','
name|'id'
op|','
name|'members'
op|','
name|'set_delete'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
name|'all_members'
op|'='
name|'set'
op|'('
name|'members'
op|')'
newline|'\n'
name|'query'
op|'='
name|'_instance_group_model_get_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'InstanceGroupMember'
op|','
name|'id'
op|')'
newline|'\n'
name|'if'
name|'set_delete'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'.'
name|'filter'
op|'('
op|'~'
name|'models'
op|'.'
name|'InstanceGroupMember'
op|'.'
name|'instance_id'
op|'.'
name|'in_'
op|'('
nl|'\n'
name|'all_members'
op|')'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
nl|'\n'
name|'models'
op|'.'
name|'InstanceGroupMember'
op|'.'
name|'instance_id'
op|'.'
name|'in_'
op|'('
name|'all_members'
op|')'
op|')'
newline|'\n'
name|'already_existing'
op|'='
name|'set'
op|'('
op|')'
newline|'\n'
name|'for'
name|'member_ref'
name|'in'
name|'query'
op|'.'
name|'all'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'already_existing'
op|'.'
name|'add'
op|'('
name|'member_ref'
op|'.'
name|'instance_id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'for'
name|'instance_id'
name|'in'
name|'members'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'instance_id'
name|'in'
name|'already_existing'
op|':'
newline|'\n'
indent|' '
name|'continue'
newline|'\n'
dedent|''
name|'member_ref'
op|'='
name|'models'
op|'.'
name|'InstanceGroupMember'
op|'('
op|')'
newline|'\n'
name|'member_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|"'instance_id'"
op|':'
name|'instance_id'
op|','
nl|'\n'
string|"'group_id'"
op|':'
name|'id'
op|'}'
op|')'
newline|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'add'
op|'('
name|'member_ref'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'members'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|instance_group_members_add
name|'def'
name|'instance_group_members_add'
op|'('
name|'context'
op|','
name|'group_uuid'
op|','
name|'members'
op|','
nl|'\n'
name|'set_delete'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
name|'id'
op|'='
name|'_instance_group_id'
op|'('
name|'context'
op|','
name|'group_uuid'
op|')'
newline|'\n'
name|'return'
name|'_instance_group_members_add'
op|'('
name|'context'
op|','
name|'id'
op|','
name|'members'
op|','
nl|'\n'
name|'set_delete'
op|'='
name|'set_delete'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|instance_group_member_delete
name|'def'
name|'instance_group_member_delete'
op|'('
name|'context'
op|','
name|'group_uuid'
op|','
name|'instance_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'id'
op|'='
name|'_instance_group_id'
op|'('
name|'context'
op|','
name|'group_uuid'
op|')'
newline|'\n'
name|'count'
op|'='
name|'_instance_group_model_get_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'InstanceGroupMember'
op|','
nl|'\n'
name|'id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_id'
op|'='
name|'instance_id'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'if'
name|'count'
op|'=='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceGroupMemberNotFound'
op|'('
name|'group_uuid'
op|'='
name|'group_uuid'
op|','
nl|'\n'
name|'instance_id'
op|'='
name|'instance_id'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|instance_group_members_get
name|'def'
name|'instance_group_members_get'
op|'('
name|'context'
op|','
name|'group_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'id'
op|'='
name|'_instance_group_id'
op|'('
name|'context'
op|','
name|'group_uuid'
op|')'
newline|'\n'
name|'instances'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'InstanceGroupMember'
op|','
nl|'\n'
op|'('
name|'models'
op|'.'
name|'InstanceGroupMember'
op|'.'
name|'instance_id'
op|','
op|')'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'group_id'
op|'='
name|'id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
name|'return'
op|'['
name|'instance'
op|'['
number|'0'
op|']'
name|'for'
name|'instance'
name|'in'
name|'instances'
op|']'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_instance_group_policies_add
dedent|''
name|'def'
name|'_instance_group_policies_add'
op|'('
name|'context'
op|','
name|'id'
op|','
name|'policies'
op|','
name|'set_delete'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
name|'allpols'
op|'='
name|'set'
op|'('
name|'policies'
op|')'
newline|'\n'
name|'query'
op|'='
name|'_instance_group_model_get_query'
op|'('
name|'context'
op|','
nl|'\n'
name|'models'
op|'.'
name|'InstanceGroupPolicy'
op|','
name|'id'
op|')'
newline|'\n'
name|'if'
name|'set_delete'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'.'
name|'filter'
op|'('
op|'~'
name|'models'
op|'.'
name|'InstanceGroupPolicy'
op|'.'
name|'policy'
op|'.'
name|'in_'
op|'('
name|'allpols'
op|')'
op|')'
op|'.'
name|'soft_delete'
op|'('
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'query'
op|'='
name|'query'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'InstanceGroupPolicy'
op|'.'
name|'policy'
op|'.'
name|'in_'
op|'('
name|'allpols'
op|')'
op|')'
newline|'\n'
name|'already_existing'
op|'='
name|'set'
op|'('
op|')'
newline|'\n'
name|'for'
name|'policy_ref'
name|'in'
name|'query'
op|'.'
name|'all'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'already_existing'
op|'.'
name|'add'
op|'('
name|'policy_ref'
op|'.'
name|'policy'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'for'
name|'policy'
name|'in'
name|'policies'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'policy'
name|'in'
name|'already_existing'
op|':'
newline|'\n'
indent|' '
name|'continue'
newline|'\n'
dedent|''
name|'policy_ref'
op|'='
name|'models'
op|'.'
name|'InstanceGroupPolicy'
op|'('
op|')'
newline|'\n'
name|'policy_ref'
op|'.'
name|'update'
op|'('
op|'{'
string|"'policy'"
op|':'
name|'policy'
op|','
nl|'\n'
string|"'group_id'"
op|':'
name|'id'
op|'}'
op|')'
newline|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'add'
op|'('
name|'policy_ref'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'policies'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'####################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|pci_device_get_by_addr
name|'def'
name|'pci_device_get_by_addr'
op|'('
name|'context'
op|','
name|'node_id'
op|','
name|'dev_addr'
op|')'
op|':'
newline|'\n'
indent|' '
name|'pci_dev_ref'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'PciDevice'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'compute_node_id'
op|'='
name|'node_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'address'
op|'='
name|'dev_addr'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'pci_dev_ref'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'PciDeviceNotFound'
op|'('
name|'node_id'
op|'='
name|'node_id'
op|','
name|'address'
op|'='
name|'dev_addr'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'pci_dev_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|pci_device_get_by_id
name|'def'
name|'pci_device_get_by_id'
op|'('
name|'context'
op|','
name|'id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'pci_dev_ref'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'PciDevice'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'id'
op|'='
name|'id'
op|')'
op|'.'
name|'first'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'pci_dev_ref'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'PciDeviceNotFoundById'
op|'('
name|'id'
op|'='
name|'id'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'pci_dev_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|pci_device_get_all_by_node
name|'def'
name|'pci_device_get_all_by_node'
op|'('
name|'context'
op|','
name|'node_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'PciDevice'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'compute_node_id'
op|'='
name|'node_id'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|pci_device_get_all_by_parent_addr
name|'def'
name|'pci_device_get_all_by_parent_addr'
op|'('
name|'context'
op|','
name|'node_id'
op|','
name|'parent_addr'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'PciDevice'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'compute_node_id'
op|'='
name|'node_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'parent_addr'
op|'='
name|'parent_addr'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'require_context'
newline|'\n'
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|pci_device_get_all_by_instance_uuid
name|'def'
name|'pci_device_get_all_by_instance_uuid'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'PciDevice'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'status'
op|'='
string|"'allocated'"
op|')'
op|'.'
name|'filter_by'
op|'('
name|'instance_uuid'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|_instance_pcidevs_get_multi
name|'def'
name|'_instance_pcidevs_get_multi'
op|'('
name|'context'
op|','
name|'instance_uuids'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'not'
name|'instance_uuids'
op|':'
newline|'\n'
indent|' '
name|'return'
op|'['
op|']'
newline|'\n'
dedent|''
name|'return'
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'PciDevice'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'status'
op|'='
string|"'allocated'"
op|')'
op|'.'
name|'filter'
op|'('
name|'models'
op|'.'
name|'PciDevice'
op|'.'
name|'instance_uuid'
op|'.'
name|'in_'
op|'('
name|'instance_uuids'
op|')'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|pci_device_destroy
name|'def'
name|'pci_device_destroy'
op|'('
name|'context'
op|','
name|'node_id'
op|','
name|'address'
op|')'
op|':'
newline|'\n'
indent|' '
name|'result'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'PciDevice'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'compute_node_id'
op|'='
name|'node_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'address'
op|'='
name|'address'
op|')'
op|'.'
name|'soft_delete'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'PciDeviceNotFound'
op|'('
name|'node_id'
op|'='
name|'node_id'
op|','
name|'address'
op|'='
name|'address'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|pci_device_update
name|'def'
name|'pci_device_update'
op|'('
name|'context'
op|','
name|'node_id'
op|','
name|'address'
op|','
name|'values'
op|')'
op|':'
newline|'\n'
indent|' '
name|'query'
op|'='
name|'model_query'
op|'('
name|'context'
op|','
name|'models'
op|'.'
name|'PciDevice'
op|','
name|'read_deleted'
op|'='
string|'"no"'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'compute_node_id'
op|'='
name|'node_id'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'address'
op|'='
name|'address'
op|')'
newline|'\n'
name|'if'
name|'query'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
op|'=='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'device'
op|'='
name|'models'
op|'.'
name|'PciDevice'
op|'('
op|')'
newline|'\n'
name|'device'
op|'.'
name|'update'
op|'('
name|'values'
op|')'
newline|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'add'
op|'('
name|'device'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'query'
op|'.'
name|'one'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'####################'
nl|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|instance_tag_add
name|'def'
name|'instance_tag_add'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'tag'
op|')'
op|':'
newline|'\n'
indent|' '
name|'tag_ref'
op|'='
name|'models'
op|'.'
name|'Tag'
op|'('
op|')'
newline|'\n'
name|'tag_ref'
op|'.'
name|'resource_id'
op|'='
name|'instance_uuid'
newline|'\n'
name|'tag_ref'
op|'.'
name|'tag'
op|'='
name|'tag'
newline|'\n'
nl|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'_check_instance_exists_in_project'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
newline|'\n'
name|'with'
name|'get_context_manager'
op|'('
name|'context'
op|')'
op|'.'
name|'writer'
op|'.'
name|'savepoint'
op|'.'
name|'using'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'context'
op|'.'
name|'session'
op|'.'
name|'add'
op|'('
name|'tag_ref'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBDuplicateEntry'
op|':'
newline|'\n'
comment|'# NOTE(snikitin): We should ignore tags duplicates'
nl|'\n'
indent|' '
name|'pass'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'tag_ref'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|instance_tag_set
name|'def'
name|'instance_tag_set'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'tags'
op|')'
op|':'
newline|'\n'
indent|' '
name|'_check_instance_exists_in_project'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
newline|'\n'
nl|'\n'
name|'existing'
op|'='
name|'context'
op|'.'
name|'session'
op|'.'
name|'query'
op|'('
name|'models'
op|'.'
name|'Tag'
op|'.'
name|'tag'
op|')'
op|'.'
name|'filter_by'
op|'('
nl|'\n'
name|'resource_id'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'existing'
op|'='
name|'set'
op|'('
name|'row'
op|'.'
name|'tag'
name|'for'
name|'row'
name|'in'
name|'existing'
op|')'
newline|'\n'
name|'tags'
op|'='
name|'set'
op|'('
name|'tags'
op|')'
newline|'\n'
name|'to_delete'
op|'='
name|'existing'
op|'-'
name|'tags'
newline|'\n'
name|'to_add'
op|'='
name|'tags'
op|'-'
name|'existing'
newline|'\n'
nl|'\n'
name|'if'
name|'to_delete'
op|':'
newline|'\n'
indent|' '
name|'context'
op|'.'
name|'session'
op|'.'
name|'query'
op|'('
name|'models'
op|'.'
name|'Tag'
op|')'
op|'.'
name|'filter_by'
op|'('
nl|'\n'
name|'resource_id'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'filter'
op|'('
nl|'\n'
name|'models'
op|'.'
name|'Tag'
op|'.'
name|'tag'
op|'.'
name|'in_'
op|'('
name|'to_delete'
op|')'
op|')'
op|'.'
name|'delete'
op|'('
nl|'\n'
name|'synchronize_session'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'to_add'
op|':'
newline|'\n'
indent|' '
name|'data'
op|'='
op|'['
nl|'\n'
op|'{'
string|"'resource_id'"
op|':'
name|'instance_uuid'
op|','
string|"'tag'"
op|':'
name|'tag'
op|'}'
name|'for'
name|'tag'
name|'in'
name|'to_add'
op|']'
newline|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'execute'
op|'('
name|'models'
op|'.'
name|'Tag'
op|'.'
name|'__table__'
op|'.'
name|'insert'
op|'('
op|')'
op|','
name|'data'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'context'
op|'.'
name|'session'
op|'.'
name|'query'
op|'('
name|'models'
op|'.'
name|'Tag'
op|')'
op|'.'
name|'filter_by'
op|'('
nl|'\n'
name|'resource_id'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_reader'
newline|'\n'
DECL|function|instance_tag_get_by_instance_uuid
name|'def'
name|'instance_tag_get_by_instance_uuid'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'_check_instance_exists_in_project'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
newline|'\n'
name|'return'
name|'context'
op|'.'
name|'session'
op|'.'
name|'query'
op|'('
name|'models'
op|'.'
name|'Tag'
op|')'
op|'.'
name|'filter_by'
op|'('
nl|'\n'
name|'resource_id'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|instance_tag_delete
name|'def'
name|'instance_tag_delete'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'tag'
op|')'
op|':'
newline|'\n'
indent|' '
name|'_check_instance_exists_in_project'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
newline|'\n'
name|'result'
op|'='
name|'context'
op|'.'
name|'session'
op|'.'
name|'query'
op|'('
name|'models'
op|'.'
name|'Tag'
op|')'
op|'.'
name|'filter_by'
op|'('
nl|'\n'
name|'resource_id'
op|'='
name|'instance_uuid'
op|','
name|'tag'
op|'='
name|'tag'
op|')'
op|'.'
name|'delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'result'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'InstanceTagNotFound'
op|'('
name|'instance_id'
op|'='
name|'instance_uuid'
op|','
nl|'\n'
name|'tag'
op|'='
name|'tag'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'pick_context_manager_writer'
newline|'\n'
DECL|function|instance_tag_delete_all
name|'def'
name|'instance_tag_delete_all'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
op|':'
newline|'\n'
indent|' '
name|'_check_instance_exists_in_project'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
newline|'\n'
name|'context'
op|'.'
name|'session'
op|'.'
name|'query'
op|'('
name|'models'
op|'.'
name|'Tag'
op|')'
op|'.'
name|'filter_by'
op|'('
nl|'\n'
name|'resource_id'
op|'='
name|'instance_uuid'
op|')'
op|'.'
name|'delete'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'main_context_manager'
op|'.'
name|'reader'
newline|'\n'
DECL|function|instance_tag_exists
name|'def'
name|'instance_tag_exists'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|','
name|'tag'
op|')'
op|':'
newline|'\n'
indent|' '
name|'_check_instance_exists_in_project'
op|'('
name|'context'
op|','
name|'instance_uuid'
op|')'
newline|'\n'
name|'q'
op|'='
name|'context'
op|'.'
name|'session'
op|'.'
name|'query'
op|'('
name|'models'
op|'.'
name|'Tag'
op|')'
op|'.'
name|'filter_by'
op|'('
nl|'\n'
name|'resource_id'
op|'='
name|'instance_uuid'
op|','
name|'tag'
op|'='
name|'tag'
op|')'
newline|'\n'
name|'return'
name|'context'
op|'.'
name|'session'
op|'.'
name|'query'
op|'('
name|'q'
op|'.'
name|'exists'
op|'('
op|')'
op|')'
op|'.'
name|'scalar'
op|'('
op|')'
newline|'\n'
dedent|''
endmarker|''
end_unit
| 13.082571 | 2,387 | 0.613324 | 80,227 | 539,172 | 3.994727 | 0.019233 | 0.170311 | 0.07888 | 0.042161 | 0.892915 | 0.854152 | 0.813255 | 0.772236 | 0.725173 | 0.688609 | 0 | 0.000627 | 0.106074 | 539,172 | 41,212 | 2,388 | 13.082888 | 0.664308 | 0 | 0 | 0.962026 | 0 | 0.000801 | 0.826543 | 0.12843 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.000291 | 0.001359 | null | null | 0.000024 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
69f072158ee68edcbf330fb8370787ae86b52558 | 81 | py | Python | ledfx_frontend/__init__.py | danhartfiction/LedFx | eaf40ab180ef7e8f4f769193b35b3ffd5fe2a340 | [
"MIT"
] | 419 | 2018-07-13T11:45:14.000Z | 2022-02-10T05:23:00.000Z | ledfx_frontend/__init__.py | danhartfiction/LedFx | eaf40ab180ef7e8f4f769193b35b3ffd5fe2a340 | [
"MIT"
] | 131 | 2018-07-20T09:46:26.000Z | 2022-01-09T22:22:30.000Z | ledfx_frontend/__init__.py | danhartfiction/LedFx | eaf40ab180ef7e8f4f769193b35b3ffd5fe2a340 | [
"MIT"
] | 122 | 2018-07-21T17:01:18.000Z | 2022-02-10T17:09:12.000Z | """LedFx Frontend"""
import os
def where():
return os.path.dirname(__file__) | 16.2 | 36 | 0.691358 | 11 | 81 | 4.727273 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148148 | 81 | 5 | 36 | 16.2 | 0.753623 | 0.17284 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
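The `where()` helper in `ledfx_frontend/__init__.py` above returns the package's directory so callers can locate bundled frontend assets. A minimal sketch of the same pattern; the `index.html` filename is only an assumed example, not taken from the source:

```python
import os


def where():
    # Return the directory that contains this module file
    return os.path.dirname(os.path.abspath(__file__))


# Typical usage: resolve a bundled asset relative to the package directory
index_path = os.path.join(where(), "index.html")
print(index_path.endswith("index.html"))
```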
387f30bb166c9d31340441656d99fef51c36ba34 | 3,156 | py | Python | tests/plantcv/hyperspectral/test_analyze_index.py | ygarrot/plantcv | e934a891e0d1bf8987ca6a9f982a4ac1f420bfe7 | [
"MIT"
] | 1 | 2022-02-03T12:08:59.000Z | 2022-02-03T12:08:59.000Z | tests/plantcv/hyperspectral/test_analyze_index.py | HUISTENCOFFEE/plantcv | f38f7de53663522eb770870b70823d5fc46d0c0f | [
"MIT"
] | null | null | null | tests/plantcv/hyperspectral/test_analyze_index.py | HUISTENCOFFEE/plantcv | f38f7de53663522eb770870b70823d5fc46d0c0f | [
"MIT"
] | null | null | null | import pytest
import numpy as np
import cv2
from plantcv.plantcv.hyperspectral import analyze_index
from plantcv.plantcv import outputs
def test_analyze_index(hyperspectral_test_data):
"""Test for PlantCV."""
# Clear previous outputs
outputs.clear()
index_array = hyperspectral_test_data.load_hsi(hyperspectral_test_data.savi_file)
mask_img = np.ones(np.shape(index_array.array_data), dtype=np.uint8) * 255
_ = analyze_index(index_array=index_array, mask=mask_img, histplot=True)
assert outputs.observations['default']['mean_index_savi']['value'] > 0
def test_analyze_index_set_range(hyperspectral_test_data):
"""Test for PlantCV."""
# Clear previous outputs
outputs.clear()
index_array = hyperspectral_test_data.load_hsi(hyperspectral_test_data.savi_file)
mask = np.ones(np.shape(index_array.array_data), dtype=np.uint8) * 255
_ = analyze_index(index_array=index_array, mask=mask, histplot=True, min_bin=0, max_bin=1)
assert outputs.observations['default']['mean_index_savi']['value'] > 0
def test_analyze_index_auto_range(hyperspectral_test_data):
"""Test for PlantCV."""
# Clear previous outputs
outputs.clear()
index_array = hyperspectral_test_data.load_hsi(hyperspectral_test_data.savi_file)
mask = np.ones(np.shape(index_array.array_data), dtype=np.uint8) * 255
_ = analyze_index(index_array=index_array, mask=mask, min_bin="auto", max_bin="auto")
assert outputs.observations['default']['mean_index_savi']['value'] > 0
def test_analyze_index_outside_range_warning(hyperspectral_test_data):
"""Test for PlantCV."""
import io
from contextlib import redirect_stdout
index_array = hyperspectral_test_data.load_hsi(hyperspectral_test_data.savi_file)
mask = np.ones(np.shape(index_array.array_data), dtype=np.uint8) * 255
f = io.StringIO()
with redirect_stdout(f):
_ = analyze_index(index_array=index_array, mask=mask, min_bin=.5, max_bin=.55, label="i")
out = f.getvalue()
assert out[0:10] == 'WARNING!!!'
def test_analyze_index_bad_input_mask(hyperspectral_test_data):
"""Test for PlantCV."""
index_array = hyperspectral_test_data.load_hsi(hyperspectral_test_data.savi_file)
mask = cv2.imread(hyperspectral_test_data.hsi_mask_file)
with pytest.raises(RuntimeError):
_ = analyze_index(index_array=index_array, mask=mask)
def test_analyze_index_bad_input_index(hyperspectral_test_data):
"""Test for PlantCV."""
index_array = hyperspectral_test_data.load_hsi(hyperspectral_test_data.savi_file)
mask = cv2.imread(hyperspectral_test_data.hsi_mask_file, -1)
index_array.array_data = cv2.imread(hyperspectral_test_data.hsi_mask_file)
with pytest.raises(RuntimeError):
_ = analyze_index(index_array=index_array, mask=mask)
def test_analyze_index_bad_input_datatype(hyperspectral_test_data):
"""Test for PlantCV."""
array_data = hyperspectral_test_data.load_hsi(hyperspectral_test_data.hsi_file)
mask = cv2.imread(hyperspectral_test_data.hsi_mask_file, -1)
with pytest.raises(RuntimeError):
_ = analyze_index(index_array=array_data, mask=mask)
| 42.08 | 97 | 0.761407 | 441 | 3,156 | 5.077098 | 0.151927 | 0.189817 | 0.23448 | 0.059402 | 0.824922 | 0.824922 | 0.781599 | 0.777133 | 0.732916 | 0.732916 | 0 | 0.012413 | 0.132129 | 3,156 | 74 | 98 | 42.648649 | 0.805038 | 0.061787 | 0 | 0.44898 | 0 | 0 | 0.034211 | 0 | 0 | 0 | 0 | 0 | 0.081633 | 1 | 0.142857 | false | 0 | 0.142857 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
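`test_analyze_index_outside_range_warning` above verifies a printed warning by capturing stdout with `contextlib.redirect_stdout` and checking the first ten characters. The capture pattern in isolation; the warning text here is illustrative, standing in for whatever `analyze_index` prints:

```python
import io
from contextlib import redirect_stdout


def noisy():
    # Stands in for analyze_index printing a warning to stdout
    print("WARNING!!! Provided min_bin and max_bin values clip the data.")


f = io.StringIO()
with redirect_stdout(f):
    noisy()
out = f.getvalue()
print(out[0:10])  # first ten characters of the captured text
```

Comparing `out[0:10] == 'WARNING!!!'` then succeeds exactly when the function emitted the warning prefix.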
3880cc9665e069236d161823960543bb92ae582a | 4,116 | py | Python | tests/test_slit_coefs.py | mnishida/pymwm | 820d0a9056982fd37972b0e10f5dad9d1697ed2f | [
"MIT"
] | 3 | 2020-04-16T14:55:34.000Z | 2021-08-04T07:03:31.000Z | tests/test_slit_coefs.py | mnishida/pymwm | 820d0a9056982fd37972b0e10f5dad9d1697ed2f | [
"MIT"
] | 1 | 2021-08-13T04:45:50.000Z | 2021-08-18T03:33:08.000Z | tests/test_slit_coefs.py | mnishida/pymwm | 820d0a9056982fd37972b0e10f5dad9d1697ed2f | [
"MIT"
] | 2 | 2021-04-05T07:10:26.000Z | 2021-08-04T03:15:43.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
import unittest
import numpy as np
import numpy.testing as npt
import pymwm
class TestSlitCoefs(unittest.TestCase):
def setUp(self):
self.params = {
"core": {"shape": "slit", "size": 0.3, "fill": {"RI": 1.333}},
"clad": {"book": "Au", "page": "Stewart-DLF"},
"bounds": {"wl_max": 2.0, "wl_min": 1.0, "wl_imag": 50.0},
"modes": {
"wl_max": 2.5,
"wl_min": 1.0,
"wl_imag": 50.0,
"num_n": 6,
"num_m": 1,
"ls": ["h", "v"],
},
}
self.pec = {"PEC": True}
def test_coefs(self):
params = self.params.copy()
wg = pymwm.create(params)
wr = 2.0 * np.pi
wi = -0.002
w = wr + wi * 1j
alpha_all = wg.alpha_all
hs = np.array([wg.beta(w, alpha) for alpha in alpha_all])
As1, Bs1 = wg.coefs_numpy(hs, w)
As2, Bs2 = wg.coefs(hs, w)
print(As1, As2)
npt.assert_allclose(As1, As2)
npt.assert_allclose(Bs1, Bs2)
params["clad"] = self.pec
wg = pymwm.create(params)
alpha_all = wg.alpha_all
hs = np.array([wg.beta(w, alpha) for alpha in alpha_all])
As1, Bs1 = wg.coefs_numpy(hs, w)
As2, Bs2 = wg.coefs(hs, w)
print(As1, As2)
npt.assert_allclose(As1, As2)
npt.assert_allclose(Bs1, Bs2)
def test_ABY(self):
params = self.params.copy()
wg = pymwm.create(params)
wr = 2.0 * np.pi
wi = -0.002
w = wr + wi * 1j
hs = np.array([wg.beta(w, alpha) for alpha in wg.alpha_all])
As1, Bs1 = wg.coefs_numpy(hs, w)
Y1 = wg.Ys(w, hs, As1, Bs1)
As2, Bs2, Y2 = wg.ABY(w, hs)
npt.assert_allclose(As1, As2)
npt.assert_allclose(Bs1, Bs2)
print(wg.alpha_all)
print(Y1, Y2)
npt.assert_allclose(Y1, Y2)
params["clad"] = self.pec
wg = pymwm.create(params)
hs = np.array([wg.beta(w, alpha) for alpha in wg.alpha_all])
As1, Bs1 = wg.coefs_numpy(hs, w)
Y1 = wg.Ys(w, hs, As1, Bs1)
As2, Bs2, Y2 = wg.ABY(w, hs)
npt.assert_allclose(As1, As2)
npt.assert_allclose(Bs1, Bs2)
npt.assert_allclose(Y1, Y2)
def test_hABY(self):
params = self.params.copy()
wg = pymwm.create(params)
wr = 2.0 * np.pi
wi = -0.002
w = wr + wi * 1j
hs1 = np.array([wg.beta(w, alpha) for alpha in wg.alpha_all])
As1, Bs1 = wg.coefs_numpy(hs1, w)
Y1 = wg.Ys(w, hs1, As1, Bs1)
hs2, As2, Bs2, Y2 = wg.hABY(w)
npt.assert_allclose(As1, As2)
npt.assert_allclose(Bs1, Bs2)
npt.assert_allclose(Y1, Y2)
params["clad"] = self.pec
wg = pymwm.create(params)
hs1 = np.array([wg.beta(w, alpha) for alpha in wg.alpha_all])
As1, Bs1 = wg.coefs_numpy(hs1, w)
Y1 = wg.Ys(w, hs1, As1, Bs1)
hs2, As2, Bs2, Y2 = wg.hABY(w)
npt.assert_allclose(As1, As2)
npt.assert_allclose(Bs1, Bs2)
npt.assert_allclose(Y1, Y2)
def test_norm(self):
params = self.params.copy()
wg = pymwm.create(params)
wr = 2.0 * np.pi
wi = -0.002
w = complex(wr, wi)
hs = np.array([wg.beta(w, alpha) for alpha in wg.alpha_all])
As, Bs = wg.coefs_numpy(hs, w)
for h, a, b, s, n, m in zip(hs, As, Bs, wg.s_all, wg.n_all, wg.m_all):
pol = "E" if s == 0 else "M"
norm = wg.norm(w, h, (pol, n, m), a, b)
self.assertAlmostEqual(norm, 1.0)
params["clad"] = self.pec
wg = pymwm.create(params)
hs = np.array([wg.beta(w, alpha) for alpha in wg.alpha_all])
As, Bs = wg.coefs_numpy(hs, w)
for h, a, b, s, n, m in zip(hs, As, Bs, wg.s_all, wg.n_all, wg.m_all):
pol = "E" if s == 0 else "M"
norm = wg.norm(w, h, (pol, n, m), a, b)
self.assertAlmostEqual(norm, 1.0)
if __name__ == "__main__":
unittest.main()
| 32.409449 | 78 | 0.511662 | 632 | 4,116 | 3.232595 | 0.161392 | 0.070485 | 0.133138 | 0.0744 | 0.835047 | 0.835047 | 0.835047 | 0.835047 | 0.801762 | 0.801762 | 0 | 0.051787 | 0.333819 | 4,116 | 126 | 79 | 32.666667 | 0.69329 | 0.010204 | 0 | 0.736364 | 0 | 0 | 0.034872 | 0 | 0 | 0 | 0 | 0 | 0.163636 | 1 | 0.045455 | false | 0 | 0.036364 | 0 | 0.090909 | 0.036364 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
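The pymwm tests above rely on `numpy.testing.assert_allclose`, which compares arrays element-wise within relative/absolute tolerances and raises `AssertionError` on mismatch. A small standalone check with illustrative values:

```python
import numpy as np
import numpy.testing as npt

a = np.array([1.0 + 1e-9, 2.0])
b = np.array([1.0, 2.0])

# Within the default relative tolerance (rtol=1e-7): no exception is raised
npt.assert_allclose(a, b)

# A genuinely different value fails the comparison
try:
    npt.assert_allclose(np.array([1.1]), np.array([1.0]))
    raised = False
except AssertionError:
    raised = True
print(raised)
```

This is why the tests can compare the `coefs_numpy` and `coefs` results directly even though the two code paths may differ in the last floating-point digits.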
38a6c83e279df429f9ddb51f0c398f548290bf5f | 18,127 | py | Python | esperclient/api/subscription_api.py | pallavigopi/esper-client-py | f7e71d3f25a5d91f35628b414e8abe9e6849d316 | [
"Apache-2.0"
] | null | null | null | esperclient/api/subscription_api.py | pallavigopi/esper-client-py | f7e71d3f25a5d91f35628b414e8abe9e6849d316 | [
"Apache-2.0"
] | null | null | null | esperclient/api/subscription_api.py | pallavigopi/esper-client-py | f7e71d3f25a5d91f35628b414e8abe9e6849d316 | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
"""
ESPER API REFERENCE
OpenAPI spec version: 1.0.0
Contact: developer@esper.io
---------------------------------------------------------
Copyright 2019 Shoonya Enterprises Inc.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
from __future__ import absolute_import
import re
# python 2 and python 3 compatibility library
import six
from esperclient.api_client import ApiClient
class SubscriptionApi(object):
"""NOTE: This class is auto generated.
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_subscription(self, data, enterprise_id, **kwargs):
"""Create a Subscription
Returns Subscription instance
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_subscription(data, enterprise_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param EventSubscriptionArgs data: (required)
:param str enterprise_id: A UUID string identifying the enterprise. (required)
:return: EventSubscription
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_subscription_with_http_info(data, enterprise_id, **kwargs)
else:
(data) = self.create_subscription_with_http_info(data, enterprise_id, **kwargs)
return data
def create_subscription_with_http_info(self, data, enterprise_id, **kwargs):
"""Create a Subscription
Returns Subscription instance
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_subscription_with_http_info(data, enterprise_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param EventSubscriptionArgs data: (required)
        :param str enterprise_id: A UUID string identifying the enterprise. (required)
        :return: EventSubscription
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['data', 'enterprise_id']
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_subscription" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'data' is set
        if ('data' not in params or
                params['data'] is None):
            raise ValueError("Missing the required parameter `data` when calling `create_subscription`")
        # verify the required parameter 'enterprise_id' is set
        if ('enterprise_id' not in params or
                params['enterprise_id'] is None):
            raise ValueError("Missing the required parameter `enterprise_id` when calling `create_subscription`")

        collection_formats = {}

        path_params = {}
        if 'enterprise_id' in params:
            path_params['enterprise_id'] = params['enterprise_id']

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'data' in params:
            body_params = params['data']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])

        # Authentication setting
        auth_settings = ['apiKey']

        return self.api_client.call_api(
            '/v0/enterprise/{enterprise_id}/subscription/', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='EventSubscription',
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def delete_subscription(self, subscription_id, enterprise_id, **kwargs):
        """Delete a subscription
        Empty response
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_subscription(subscription_id, enterprise_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str subscription_id: A UUID string identifying the subscription. (required)
        :param str enterprise_id: A UUID string identifying the enterprise. (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_subscription_with_http_info(subscription_id, enterprise_id, **kwargs)
        else:
            (data) = self.delete_subscription_with_http_info(subscription_id, enterprise_id, **kwargs)
            return data
    def delete_subscription_with_http_info(self, subscription_id, enterprise_id, **kwargs):
        """Delete a subscription
        Empty response
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_subscription_with_http_info(subscription_id, enterprise_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str subscription_id: A UUID string identifying the subscription. (required)
        :param str enterprise_id: A UUID string identifying the enterprise. (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['subscription_id', 'enterprise_id']
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_subscription" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'subscription_id' is set
        if ('subscription_id' not in params or
                params['subscription_id'] is None):
            raise ValueError("Missing the required parameter `subscription_id` when calling `delete_subscription`")
        # verify the required parameter 'enterprise_id' is set
        if ('enterprise_id' not in params or
                params['enterprise_id'] is None):
            raise ValueError("Missing the required parameter `enterprise_id` when calling `delete_subscription`")

        collection_formats = {}

        path_params = {}
        if 'subscription_id' in params:
            path_params['subscription_id'] = params['subscription_id']
        if 'enterprise_id' in params:
            path_params['enterprise_id'] = params['enterprise_id']

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])

        # Authentication setting
        auth_settings = ['apiKey']

        return self.api_client.call_api(
            '/v0/enterprise/{enterprise_id}/subscription/{subscription_id}/', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_all_subscriptions(self, enterprise_id, **kwargs):
        """List Subscriptions in Enterprise
        API to view all the subscriptions in an enterprise
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_all_subscriptions(enterprise_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str enterprise_id: A UUID string identifying the enterprise. (required)
        :param int limit: Number of results to return per page.
        :param int offset: The initial index from which to return the results.
        :return: InlineResponse20012
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_all_subscriptions_with_http_info(enterprise_id, **kwargs)
        else:
            (data) = self.get_all_subscriptions_with_http_info(enterprise_id, **kwargs)
            return data
    def get_all_subscriptions_with_http_info(self, enterprise_id, **kwargs):
        """List Subscriptions in Enterprise
        API to view all the subscriptions in an enterprise
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_all_subscriptions_with_http_info(enterprise_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str enterprise_id: A UUID string identifying the enterprise. (required)
        :param int limit: Number of results to return per page.
        :param int offset: The initial index from which to return the results.
        :return: InlineResponse20012
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['enterprise_id', 'limit', 'offset']
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_all_subscriptions" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'enterprise_id' is set
        if ('enterprise_id' not in params or
                params['enterprise_id'] is None):
            raise ValueError("Missing the required parameter `enterprise_id` when calling `get_all_subscriptions`")

        collection_formats = {}

        path_params = {}
        if 'enterprise_id' in params:
            path_params['enterprise_id'] = params['enterprise_id']

        query_params = []
        if 'limit' in params:
            query_params.append(('limit', params['limit']))
        if 'offset' in params:
            query_params.append(('offset', params['offset']))

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])

        # Authentication setting
        auth_settings = ['apiKey']

        return self.api_client.call_api(
            '/v0/enterprise/{enterprise_id}/subscription/', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='InlineResponse20012',
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_subscription(self, subscription_id, enterprise_id, **kwargs):
        """Get subscription information
        Returns subscription instance
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_subscription(subscription_id, enterprise_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str subscription_id: A UUID string identifying the subscription. (required)
        :param str enterprise_id: A UUID string identifying the enterprise. (required)
        :return: EventSubscription
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_subscription_with_http_info(subscription_id, enterprise_id, **kwargs)
        else:
            (data) = self.get_subscription_with_http_info(subscription_id, enterprise_id, **kwargs)
            return data
    def get_subscription_with_http_info(self, subscription_id, enterprise_id, **kwargs):
        """Get subscription information
        Returns subscription instance
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_subscription_with_http_info(subscription_id, enterprise_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str subscription_id: A UUID string identifying the subscription. (required)
        :param str enterprise_id: A UUID string identifying the enterprise. (required)
        :return: EventSubscription
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['subscription_id', 'enterprise_id']
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_subscription" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'subscription_id' is set
        if ('subscription_id' not in params or
                params['subscription_id'] is None):
            raise ValueError("Missing the required parameter `subscription_id` when calling `get_subscription`")
        # verify the required parameter 'enterprise_id' is set
        if ('enterprise_id' not in params or
                params['enterprise_id'] is None):
            raise ValueError("Missing the required parameter `enterprise_id` when calling `get_subscription`")

        collection_formats = {}

        path_params = {}
        if 'subscription_id' in params:
            path_params['subscription_id'] = params['subscription_id']
        if 'enterprise_id' in params:
            path_params['enterprise_id'] = params['enterprise_id']

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])

        # Authentication setting
        auth_settings = ['apiKey']

        return self.api_client.call_api(
            '/v0/enterprise/{enterprise_id}/subscription/{subscription_id}/', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='EventSubscription',
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
] | null | null | null | #!/usr/bin/env python
import sys
import os
import unittest
from mock import patch, MagicMock, ANY, call
import pylacuna.core.map
import ast
from sys import version_info
if version_info.major == 2:
    import __builtin__ as builtins  # pylint:disable=import-error
else:
    import builtins  # pylint:disable=import-error
GET_STAR_MAP_RESPONSE = ast.literal_eval('''
{u'id': 7,
u'jsonrpc': u'2.0',
u'result': {u'stars': [{u'bodies': [{u'id': u'356648',
u'image': u'p18-1',
u'name': u'Kleahien 1',
u'orbit': u'1',
u'ore': {u'anthracite': 1,
u'bauxite': 4200,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 3200,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 2600,
u'zircon': 1},
u'size': u'50',
u'star_id': u'49528',
u'star_name': u'Kleahien',
u'type': u'habitable planet',
u'water': 7600,
u'x': u'416',
u'y': u'-261',
u'zone': u'1|-1'},
{u'id': u'356649',
u'image': u'p28-2',
u'name': u'Kleahien 2',
u'orbit': u'2',
u'ore': {u'anthracite': 1,
u'bauxite': 1500,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 2500,
u'gold': 1,
u'gypsum': 2000,
u'halite': 3500,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1500,
u'zircon': 1},
u'size': u'68',
u'star_id': u'49528',
u'star_name': u'Kleahien',
u'type': u'habitable planet',
u'water': 9230,
u'x': u'417',
u'y': u'-262',
u'zone': u'1|-1'},
{u'empire': {u'alignment': u'hostile',
u'id': u'38995',
u'is_isolationist': u'0',
u'name': u'dev'},
u'id': u'356650',
u'image': u'p38-3',
u'name': u'Polyhedra 3',
u'orbit': u'3',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 3000,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 7000,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'45',
u'star_id': u'49528',
u'star_name': u'Kleahien',
u'type': u'habitable planet',
u'water': 8000,
u'x': u'417',
u'y': u'-264',
u'zone': u'1|-1'},
{u'id': u'356651',
u'image': u'a22-5',
u'name': u'Kleahien 5',
u'orbit': u'5',
u'ore': {u'anthracite': 1,
u'bauxite': 900,
u'beryl': 1,
u'chalcopyrite': 700,
u'chromite': 900,
u'fluorite': 100,
u'galena': 800,
u'goethite': 400,
u'gold': 100,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 500,
u'methane': 1,
u'monazite': 300,
u'rutile': 800,
u'sulfur': 1,
u'trona': 400,
u'uraninite': 1,
u'zircon': 200},
u'size': u'6',
u'star_id': u'49528',
u'star_name': u'Kleahien',
u'type': u'asteroid',
u'x': u'414',
u'y': u'-265',
u'zone': u'1|-1'},
{u'id': u'356652',
u'image': u'p37-6',
u'name': u'Kleahien 6',
u'orbit': u'6',
u'ore': {u'anthracite': 2000,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 2000,
u'magnetite': 2000,
u'methane': 1,
u'monazite': 2000,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 2000,
u'zircon': 1},
u'size': u'60',
u'star_id': u'49528',
u'star_name': u'Kleahien',
u'type': u'habitable planet',
u'water': 6225,
u'x': u'413',
u'y': u'-264',
u'zone': u'1|-1'},
{u'id': u'356653',
u'image': u'p7-8',
u'name': u'Kleahien 8',
u'orbit': u'8',
u'ore': {u'anthracite': 1,
u'bauxite': 1700,
u'beryl': 1000,
u'chalcopyrite': 2800,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 2400,
u'gold': 1,
u'gypsum': 2100,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'57',
u'star_id': u'49528',
u'star_name': u'Kleahien',
u'type': u'habitable planet',
u'water': 5700,
u'x': u'414',
u'y': u'-261',
u'zone': u'1|-1'}],
u'color': u'magenta',
u'id': u'49528',
u'influence': u'0',
u'name': u'Kleahien',
u'x': u'415',
u'y': u'-263',
u'zone': u'1|-1'},
{u'bodies': [{u'id': u'359536',
u'image': u'pg4-1',
u'name': u'Chea Oobixy 1',
u'orbit': u'1',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 2000,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 14000,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 4000,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'105',
u'star_id': u'49928',
u'star_name': u'Chea Oobixy',
u'type': u'gas giant',
u'water': 0,
u'x': u'418',
u'y': u'-251',
u'zone': u'1|-1'},
{u'id': u'359537',
u'image': u'p7-2',
u'name': u'Chea Oobixy 2',
u'orbit': u'2',
u'ore': {u'anthracite': 1,
u'bauxite': 1700,
u'beryl': 1000,
u'chalcopyrite': 2800,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 2400,
u'gold': 1,
u'gypsum': 2100,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'66',
u'star_id': u'49928',
u'star_name': u'Chea Oobixy',
u'type': u'habitable planet',
u'water': 5700,
u'x': u'419',
u'y': u'-252',
u'zone': u'1|-1'},
{u'id': u'359538',
u'image': u'p10-3',
u'name': u'Tau chy 1',
u'orbit': u'3',
u'ore': {u'anthracite': 500,
u'bauxite': 1,
u'beryl': 250,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 250,
u'galena': 1,
u'goethite': 1000,
u'gold': 1,
u'gypsum': 500,
u'halite': 1,
u'kerogen': 500,
u'magnetite': 5000,
u'methane': 500,
u'monazite': 250,
u'rutile': 1,
u'sulfur': 500,
u'trona': 500,
u'uraninite': 1,
u'zircon': 250},
u'size': u'45',
u'star_id': u'49928',
u'star_name': u'Chea Oobixy',
u'type': u'habitable planet',
u'water': 6800,
u'x': u'419',
u'y': u'-254',
u'zone': u'1|-1'},
{u'id': u'359539',
u'image': u'p18-4',
u'name': u'Toppie Terra',
u'orbit': u'4',
u'ore': {u'anthracite': 1,
u'bauxite': 4200,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 3200,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 2600,
u'zircon': 1},
u'size': u'45',
u'star_id': u'49928',
u'star_name': u'Chea Oobixy',
u'type': u'habitable planet',
u'water': 7600,
u'x': u'418',
u'y': u'-255',
u'zone': u'1|-1'},
{u'id': u'359540',
u'image': u'pg4-5',
u'name': u'Chea Oobixy 5',
u'orbit': u'5',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 2000,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 14000,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 4000,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'95',
u'star_id': u'49928',
u'star_name': u'Chea Oobixy',
u'type': u'gas giant',
u'water': 0,
u'x': u'416',
u'y': u'-255',
u'zone': u'1|-1'},
{u'id': u'359541',
u'image': u'p38-6',
u'name': u'Chea Oobixy 6',
u'orbit': u'6',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 3000,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 7000,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'55',
u'star_id': u'49928',
u'star_name': u'Chea Oobixy',
u'type': u'habitable planet',
u'water': 8000,
u'x': u'415',
u'y': u'-254',
u'zone': u'1|-1'},
{u'id': u'359542',
u'image': u'p5-7',
u'name': u'Chea Oobixy 7',
u'orbit': u'7',
u'ore': {u'anthracite': 1,
u'bauxite': 2250,
u'beryl': 1,
u'chalcopyrite': 250,
u'chromite': 1,
u'fluorite': 1,
u'galena': 2250,
u'goethite': 1250,
u'gold': 1,
u'gypsum': 1250,
u'halite': 250,
u'kerogen': 1,
u'magnetite': 250,
u'methane': 250,
u'monazite': 1,
u'rutile': 1250,
u'sulfur': 250,
u'trona': 250,
u'uraninite': 250,
u'zircon': 1},
u'size': u'52',
u'star_id': u'49928',
u'star_name': u'Chea Oobixy',
u'type': u'habitable planet',
u'water': 6200,
u'x': u'415',
u'y': u'-252',
u'zone': u'1|-1'},
{u'id': u'359543',
u'image': u'p37-8',
u'name': u'Chea Oobixy 8',
u'orbit': u'8',
u'ore': {u'anthracite': 2000,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 2000,
u'magnetite': 2000,
u'methane': 1,
u'monazite': 2000,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 2000,
u'zircon': 1},
u'size': u'49',
u'star_id': u'49928',
u'star_name': u'Chea Oobixy',
u'type': u'habitable planet',
u'water': 6225,
u'x': u'416',
u'y': u'-251',
u'zone': u'1|-1'}],
u'color': u'blue',
u'id': u'49928',
u'influence': u'0',
u'name': u'Chea Oobixy',
u'x': u'417',
u'y': u'-253',
u'zone': u'1|-1'},
{u'bodies': [{u'id': u'362428',
u'image': u'a13-1',
u'name': u'Tchia Thowdo Oxy 1',
u'orbit': u'1',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 6574,
u'uraninite': 1,
u'zircon': 2590},
u'size': u'10',
u'star_id': u'50328',
u'star_name': u'Tchia Thowdo Oxy',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'418',
u'y': u'-241',
u'zone': u'1|0'},
{u'id': u'362429',
u'image': u'a12-2',
u'name': u'Tchia Thowdo Oxy 2',
u'orbit': u'2',
u'ore': {u'anthracite': 289,
u'bauxite': 269,
u'beryl': 313,
u'chalcopyrite': 299,
u'chromite': 320,
u'fluorite': 307,
u'galena': 278,
u'goethite': 292,
u'gold': 310,
u'gypsum': 311,
u'halite': 301,
u'kerogen': 284,
u'magnetite': 296,
u'methane': 285,
u'monazite': 319,
u'rutile': 258,
u'sulfur': 324,
u'trona': 293,
u'uraninite': 276,
u'zircon': 275},
u'size': u'6',
u'star_id': u'50328',
u'star_name': u'Tchia Thowdo Oxy',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'419',
u'y': u'-242',
u'zone': u'1|0'},
{u'id': u'362430',
u'image': u'a3-3',
u'name': u'Tchia Thowdo Oxy 3',
u'orbit': u'3',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1000,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1000,
u'zircon': 8000},
u'size': u'9',
u'star_id': u'50328',
u'star_name': u'Tchia Thowdo Oxy',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'419',
u'y': u'-244',
u'zone': u'1|0'},
{u'id': u'362431',
u'image': u'a20-4',
u'name': u'Tchia Thowdo Oxy 4',
u'orbit': u'4',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 6342,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'2',
u'star_id': u'50328',
u'star_name': u'Tchia Thowdo Oxy',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'418',
u'y': u'-245',
u'zone': u'1|0'},
{u'id': u'362432',
u'image': u'a20-5',
u'name': u'Tchia Thowdo Oxy 5',
u'orbit': u'5',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 6342,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'6',
u'star_id': u'50328',
u'star_name': u'Tchia Thowdo Oxy',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'416',
u'y': u'-245',
u'zone': u'1|0'},
{u'id': u'362433',
u'image': u'a14-6',
u'name': u'Tchia Thowdo Oxy 6',
u'orbit': u'6',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 3038,
u'goethite': 2895,
u'gold': 1,
u'gypsum': 2897,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'9',
u'star_id': u'50328',
u'star_name': u'Tchia Thowdo Oxy',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'415',
u'y': u'-244',
u'zone': u'1|0'},
{u'id': u'362434',
u'image': u'a12-7',
u'name': u'Tchia Thowdo Oxy 7',
u'orbit': u'7',
u'ore': {u'anthracite': 289,
u'bauxite': 269,
u'beryl': 313,
u'chalcopyrite': 299,
u'chromite': 320,
u'fluorite': 307,
u'galena': 278,
u'goethite': 292,
u'gold': 310,
u'gypsum': 311,
u'halite': 301,
u'kerogen': 284,
u'magnetite': 296,
u'methane': 285,
u'monazite': 319,
u'rutile': 258,
u'sulfur': 324,
u'trona': 293,
u'uraninite': 276,
u'zircon': 275},
u'size': u'4',
u'star_id': u'50328',
u'star_name': u'Tchia Thowdo Oxy',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'415',
u'y': u'-242',
u'zone': u'1|0'},
{u'id': u'362435',
u'image': u'a2-8',
u'name': u'Tchia Thowdo Oxy 8',
u'orbit': u'8',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 4000,
u'chalcopyrite': 5000,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1000},
u'size': u'2',
u'star_id': u'50328',
u'star_name': u'Tchia Thowdo Oxy',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'416',
u'y': u'-241',
u'zone': u'1|0'}],
u'color': u'white',
u'id': u'50328',
u'influence': u'227',
u'name': u'Tchia Thowdo Oxy',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'x': u'417',
u'y': u'-243',
u'zone': u'1|0'},
{u'bodies': [{u'id': u'360995',
u'image': u'a20-1',
u'name': u'Oum Froassoo Sta 1',
u'orbit': u'1',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 6342,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'5',
u'star_id': u'50129',
u'star_name': u'Oum Froassoo Sta',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'424',
u'y': u'-243',
u'zone': u'1|0'},
{u'id': u'360996',
u'image': u'a14-2',
u'name': u'Oum Froassoo Sta 2',
u'orbit': u'2',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 3038,
u'goethite': 2895,
u'gold': 1,
u'gypsum': 2897,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'8',
u'star_id': u'50129',
u'star_name': u'Oum Froassoo Sta',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'425',
u'y': u'-244',
u'zone': u'1|0'},
{u'id': u'360997',
u'image': u'a3-3',
u'name': u'Oum Froassoo Sta 3',
u'orbit': u'3',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1000,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1000,
u'zircon': 8000},
u'size': u'3',
u'star_id': u'50129',
u'star_name': u'Oum Froassoo Sta',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'425',
u'y': u'-246',
u'zone': u'1|0'},
{u'id': u'360998',
u'image': u'pg3-4',
u'name': u'Oum Froassoo Sta 4',
u'orbit': u'4',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 4000,
u'halite': 14000,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 2000,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'88',
u'star_id': u'50129',
u'star_name': u'Oum Froassoo Sta',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'gas giant',
u'water': 0,
u'x': u'424',
u'y': u'-247',
u'zone': u'1|0'},
{u'id': u'360999',
u'image': u'pg4-5',
u'name': u'Oum Froassoo Sta 5',
u'orbit': u'5',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 2000,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 14000,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 4000,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'107',
u'star_id': u'50129',
u'star_name': u'Oum Froassoo Sta',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'gas giant',
u'water': 0,
u'x': u'422',
u'y': u'-247',
u'zone': u'1|0'},
{u'id': u'361000',
u'image': u'a18-6',
u'name': u'Oum Froassoo Sta 6',
u'orbit': u'6',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 4120,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 3326,
u'uraninite': 1,
u'zircon': 1},
u'size': u'7',
u'star_id': u'50129',
u'star_name': u'Oum Froassoo Sta',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'421',
u'y': u'-246',
u'zone': u'1|0'},
{u'id': u'361001',
u'image': u'a13-7',
u'name': u'Oum Froassoo Sta 7',
u'orbit': u'7',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 6574,
u'uraninite': 1,
u'zircon': 2590},
u'size': u'6',
u'star_id': u'50129',
u'star_name': u'Oum Froassoo Sta',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'421',
u'y': u'-244',
u'zone': u'1|0'},
{u'id': u'361002',
u'image': u'a4-8',
u'name': u'Oum Froassoo Sta 8',
u'orbit': u'8',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1000,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 9000,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'3',
u'star_id': u'50129',
u'star_name': u'Oum Froassoo Sta',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'422',
u'y': u'-243',
u'zone': u'1|0'}],
u'color': u'white',
u'id': u'50129',
u'influence': u'254',
u'name': u'Oum Froassoo Sta',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'x': u'423',
u'y': u'-245',
u'zone': u'1|0'},
{u'bodies': [{u'id': u'358097',
u'image': u'pg3-1',
u'name': u'Ouss Siek 1',
u'orbit': u'1',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 4000,
u'halite': 14000,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 2000,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'104',
u'star_id': u'49729',
u'star_name': u'Ouss Siek',
u'type': u'gas giant',
u'water': 0,
u'x': u'425',
u'y': u'-253',
u'zone': u'1|-1'},
{u'id': u'358098',
u'image': u'p28-2',
u'name': u'Ouss Siek 2',
u'orbit': u'2',
u'ore': {u'anthracite': 1,
u'bauxite': 1500,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 2500,
u'gold': 1,
u'gypsum': 2000,
u'halite': 3500,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1500,
u'zircon': 1},
u'size': u'64',
u'star_id': u'49729',
u'star_name': u'Ouss Siek',
u'type': u'habitable planet',
u'water': 9230,
u'x': u'426',
u'y': u'-254',
u'zone': u'1|-1'},
{u'build_queue_len': 7,
u'build_queue_size': 9,
u'building_count': 43,
u'empire': {u'alignment': u'self',
u'id': u'51819',
u'is_isolationist': u'1',
u'name': u'MikeTwo'},
u'energy_capacity': u'280094',
u'energy_hour': u'3537',
u'energy_stored': u'104737',
u'food_capacity': u'284472',
u'food_hour': 5171,
u'food_stored': 284472,
u'happiness': u'352459',
u'happiness_hour': u'680',
u'id': u'358099',
u'image': u'p38-3',
u'incoming_enemy_ships': [{u'date_arrives': u'06 09 2015 18:33:00 +0000',
u'id': u'51549238',
u'is_ally': 0,
u'is_own': 0}],
u'name': u'Cloraphorm III',
u'needs_surface_refresh': u'0',
u'neutral_entry': u'10 07 2015 07:26:44 +0000',
u'num_incoming_ally': u'0',
u'num_incoming_enemy': u'1',
u'num_incoming_own': u'0',
u'orbit': u'3',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 3000,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 7000,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'ore_capacity': u'295866',
u'ore_hour': 7794,
u'ore_stored': 202664,
u'plots_available': u'2',
u'population': 2880000,
u'propaganda_boost': u'0',
u'size': u'45',
u'star_id': u'49729',
u'star_name': u'Ouss Siek',
u'surface_version': u'358',
u'type': u'habitable planet',
u'waste_capacity': u'289133',
u'waste_hour': u'239',
u'waste_stored': u'288328',
u'water': 8000,
u'water_capacity': u'280094',
u'water_hour': u'5476',
u'water_stored': u'280094',
u'x': u'426',
u'y': u'-256',
u'zone': u'1|-1'},
{u'id': u'358100',
u'image': u'a7-4',
u'name': u'Ouss Siek 4',
u'orbit': u'4',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 3291,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1239,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 2377,
u'zircon': 1},
u'size': u'5',
u'star_id': u'49729',
u'star_name': u'Ouss Siek',
u'type': u'asteroid',
u'x': u'425',
u'y': u'-257',
u'zone': u'1|-1'},
{u'id': u'358101',
u'image': u'p28-5',
u'name': u'Ouss Siek 5',
u'orbit': u'5',
u'ore': {u'anthracite': 1,
u'bauxite': 1500,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 2500,
u'gold': 1,
u'gypsum': 2000,
u'halite': 3500,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1500,
u'zircon': 1},
u'size': u'56',
u'star_id': u'49729',
u'star_name': u'Ouss Siek',
u'type': u'habitable planet',
u'water': 9230,
u'x': u'423',
u'y': u'-257',
u'zone': u'1|-1'},
{u'id': u'358102',
u'image': u'p37-6',
u'name': u'Ouss Siek 6',
u'orbit': u'6',
u'ore': {u'anthracite': 2000,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 2000,
u'magnetite': 2000,
u'methane': 1,
u'monazite': 2000,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 2000,
u'zircon': 1},
u'size': u'53',
u'star_id': u'49729',
u'star_name': u'Ouss Siek',
u'type': u'habitable planet',
u'water': 6225,
u'x': u'422',
u'y': u'-256',
u'zone': u'1|-1'},
{u'id': u'358103',
u'image': u'pg3-7',
u'name': u'Ouss Siek 7',
u'orbit': u'7',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 4000,
u'halite': 14000,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 2000,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'86',
u'star_id': u'49729',
u'star_name': u'Ouss Siek',
u'type': u'gas giant',
u'water': 0,
u'x': u'422',
u'y': u'-254',
u'zone': u'1|-1'},
{u'id': u'358104',
u'image': u'p37-8',
u'name': u'Ouss Siek 8',
u'orbit': u'8',
u'ore': {u'anthracite': 2000,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 2000,
u'magnetite': 2000,
u'methane': 1,
u'monazite': 2000,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 2000,
u'zircon': 1},
u'size': u'69',
u'star_id': u'49729',
u'star_name': u'Ouss Siek',
u'type': u'habitable planet',
u'water': 6225,
u'x': u'423',
u'y': u'-253',
u'zone': u'1|-1'}],
u'color': u'blue',
u'id': u'49729',
u'influence': u'0',
u'name': u'Ouss Siek',
u'x': u'424',
u'y': u'-255',
u'zone': u'1|-1'},
{u'bodies': [{u'id': u'355186',
u'image': u'a12-1',
u'name': u'Iondeuzz 1',
u'orbit': u'1',
u'ore': {u'anthracite': 289,
u'bauxite': 269,
u'beryl': 313,
u'chalcopyrite': 299,
u'chromite': 320,
u'fluorite': 307,
u'galena': 278,
u'goethite': 292,
u'gold': 310,
u'gypsum': 311,
u'halite': 301,
u'kerogen': 284,
u'magnetite': 296,
u'methane': 285,
u'monazite': 319,
u'rutile': 258,
u'sulfur': 324,
u'trona': 293,
u'uraninite': 276,
u'zircon': 275},
u'size': u'8',
u'star_id': u'49329',
u'star_name': u'Iondeuzz',
u'type': u'asteroid',
u'x': u'426',
u'y': u'-265',
u'zone': u'1|-1'},
{u'id': u'355187',
u'image': u'p13-2',
u'name': u'Iondeuzz 2',
u'orbit': u'2',
u'ore': {u'anthracite': 1500,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1300,
u'chromite': 1400,
u'fluorite': 1,
u'galena': 2200,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1500,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 2100,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'66',
u'star_id': u'49329',
u'star_name': u'Iondeuzz',
u'type': u'habitable planet',
u'water': 8300,
u'x': u'427',
u'y': u'-266',
u'zone': u'1|-1'},
{u'id': u'355188',
u'image': u'p10-3',
u'name': u'Iondeuzz 3',
u'orbit': u'3',
u'ore': {u'anthracite': 500,
u'bauxite': 1,
u'beryl': 250,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 250,
u'galena': 1,
u'goethite': 1000,
u'gold': 1,
u'gypsum': 500,
u'halite': 1,
u'kerogen': 500,
u'magnetite': 5000,
u'methane': 500,
u'monazite': 250,
u'rutile': 1,
u'sulfur': 500,
u'trona': 500,
u'uraninite': 1,
u'zircon': 250},
u'size': u'45',
u'star_id': u'49329',
u'star_name': u'Iondeuzz',
u'type': u'habitable planet',
u'water': 6800,
u'x': u'427',
u'y': u'-268',
u'zone': u'1|-1'},
{u'id': u'355189',
u'image': u'p18-4',
u'name': u"Khim'bar",
u'orbit': u'4',
u'ore': {u'anthracite': 1,
u'bauxite': 4200,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 3200,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 2600,
u'zircon': 1},
u'size': u'45',
u'star_id': u'49329',
u'star_name': u'Iondeuzz',
u'type': u'habitable planet',
u'water': 7600,
u'x': u'426',
u'y': u'-269',
u'zone': u'1|-1'},
{u'id': u'355190',
u'image': u'p28-5',
u'name': u'Iondeuzz 5',
u'orbit': u'5',
u'ore': {u'anthracite': 1,
u'bauxite': 1500,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 2500,
u'gold': 1,
u'gypsum': 2000,
u'halite': 3500,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1500,
u'zircon': 1},
u'size': u'58',
u'star_id': u'49329',
u'star_name': u'Iondeuzz',
u'type': u'habitable planet',
u'water': 9230,
u'x': u'424',
u'y': u'-269',
u'zone': u'1|-1'},
{u'id': u'355191',
u'image': u'p38-6',
u'name': u'Iondeuzz 6',
u'orbit': u'6',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 3000,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 7000,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'55',
u'star_id': u'49329',
u'star_name': u'Iondeuzz',
u'type': u'habitable planet',
u'water': 8000,
u'x': u'423',
u'y': u'-268',
u'zone': u'1|-1'},
{u'id': u'355192',
u'image': u'p5-8',
u'name': u'Iondeuzz 8',
u'orbit': u'8',
u'ore': {u'anthracite': 1,
u'bauxite': 2250,
u'beryl': 1,
u'chalcopyrite': 250,
u'chromite': 1,
u'fluorite': 1,
u'galena': 2250,
u'goethite': 1250,
u'gold': 1,
u'gypsum': 1250,
u'halite': 250,
u'kerogen': 1,
u'magnetite': 250,
u'methane': 250,
u'monazite': 1,
u'rutile': 1250,
u'sulfur': 250,
u'trona': 250,
u'uraninite': 250,
u'zircon': 1},
u'size': u'53',
u'star_id': u'49329',
u'star_name': u'Iondeuzz',
u'type': u'habitable planet',
u'water': 6200,
u'x': u'424',
u'y': u'-265',
u'zone': u'1|-1'}],
u'color': u'yellow',
u'id': u'49329',
u'influence': u'0',
u'name': u'Iondeuzz',
u'x': u'425',
u'y': u'-267',
u'zone': u'1|-1'},
{u'bodies': [{u'id': u'301055',
u'image': u'p36-4',
u'name': u'Aegh Oxyeufr 4',
u'orbit': u'4',
u'ore': {u'anthracite': 1,
u'bauxite': 100,
u'beryl': 1,
u'chalcopyrite': 100,
u'chromite': 100,
u'fluorite': 1,
u'galena': 1,
u'goethite': 100,
u'gold': 1,
u'gypsum': 100,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 100,
u'rutile': 100,
u'sulfur': 100,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'45',
u'star_id': u'49529',
u'star_name': u'Ovoahuij',
u'type': u'habitable planet',
u'water': 30000,
u'x': u'431',
u'y': u'-263',
u'zone': u'1|-1'},
{u'empire': {u'alignment': u'hostile',
u'id': u'50180',
u'is_isolationist': u'0',
u'name': u'Cybertronian Defense League'},
u'id': u'332066',
u'image': u'p35-3',
u'name': u'in4',
u'orbit': u'3',
u'ore': {u'anthracite': 500,
u'bauxite': 500,
u'beryl': 500,
u'chalcopyrite': 500,
u'chromite': 500,
u'fluorite': 500,
u'galena': 500,
u'goethite': 500,
u'gold': 500,
u'gypsum': 500,
u'halite': 500,
u'kerogen': 500,
u'magnetite': 500,
u'methane': 500,
u'monazite': 500,
u'rutile': 500,
u'sulfur': 500,
u'trona': 500,
u'uraninite': 500,
u'zircon': 500},
u'size': u'70',
u'star_id': u'49529',
u'star_name': u'Ovoahuij',
u'type': u'habitable planet',
u'water': 9467,
u'x': u'432',
u'y': u'-262',
u'zone': u'1|-1'},
{u'id': u'356654',
u'image': u'p10-1',
u'name': u'Ovoahuij 1',
u'orbit': u'1',
u'ore': {u'anthracite': 500,
u'bauxite': 1,
u'beryl': 250,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 250,
u'galena': 1,
u'goethite': 1000,
u'gold': 1,
u'gypsum': 500,
u'halite': 1,
u'kerogen': 500,
u'magnetite': 5000,
u'methane': 500,
u'monazite': 250,
u'rutile': 1,
u'sulfur': 500,
u'trona': 500,
u'uraninite': 1,
u'zircon': 250},
u'size': u'55',
u'star_id': u'49529',
u'star_name': u'Ovoahuij',
u'type': u'habitable planet',
u'water': 6800,
u'x': u'431',
u'y': u'-259',
u'zone': u'1|-1'},
{u'id': u'356655',
u'image': u'p18-2',
u'name': u'Ovoahuij 2',
u'orbit': u'2',
u'ore': {u'anthracite': 1,
u'bauxite': 4200,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 3200,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 2600,
u'zircon': 1},
u'size': u'52',
u'star_id': u'49529',
u'star_name': u'Ovoahuij',
u'type': u'habitable planet',
u'water': 7600,
u'x': u'432',
u'y': u'-260',
u'zone': u'1|-1'},
{u'id': u'356658',
u'image': u'p18-5',
u'name': u'Ovoahuij 5',
u'orbit': u'5',
u'ore': {u'anthracite': 1,
u'bauxite': 4200,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 3200,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 2600,
u'zircon': 1},
u'size': u'65',
u'star_id': u'49529',
u'star_name': u'Ovoahuij',
u'type': u'habitable planet',
u'water': 7600,
u'x': u'429',
u'y': u'-263',
u'zone': u'1|-1'},
{u'id': u'356659',
u'image': u'p28-6',
u'name': u'Alajuwon',
u'orbit': u'6',
u'ore': {u'anthracite': 1,
u'bauxite': 1500,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 2500,
u'gold': 1,
u'gypsum': 2000,
u'halite': 3500,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1500,
u'zircon': 1},
u'size': u'45',
u'star_id': u'49529',
u'star_name': u'Ovoahuij',
u'type': u'habitable planet',
u'water': 9230,
u'x': u'428',
u'y': u'-262',
u'zone': u'1|-1'},
{u'id': u'356660',
u'image': u'p38-7',
u'name': u'Ovoahuij 7',
u'orbit': u'7',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 3000,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 7000,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'59',
u'star_id': u'49529',
u'star_name': u'Ovoahuij',
u'type': u'habitable planet',
u'water': 8000,
u'x': u'428',
u'y': u'-260',
u'zone': u'1|-1'},
{u'id': u'356661',
u'image': u'p5-8',
u'name': u'Ovoahuij 8',
u'orbit': u'8',
u'ore': {u'anthracite': 1,
u'bauxite': 2250,
u'beryl': 1,
u'chalcopyrite': 250,
u'chromite': 1,
u'fluorite': 1,
u'galena': 2250,
u'goethite': 1250,
u'gold': 1,
u'gypsum': 1250,
u'halite': 250,
u'kerogen': 1,
u'magnetite': 250,
u'methane': 250,
u'monazite': 1,
u'rutile': 1250,
u'sulfur': 250,
u'trona': 250,
u'uraninite': 250,
u'zircon': 1},
u'size': u'56',
u'star_id': u'49529',
u'star_name': u'Ovoahuij',
u'type': u'habitable planet',
u'water': 6200,
u'x': u'429',
u'y': u'-259',
u'zone': u'1|-1'}],
u'color': u'blue',
u'id': u'49529',
u'influence': u'0',
u'name': u'Ovoahuij',
u'x': u'430',
u'y': u'-261',
u'zone': u'1|-1'},
{u'bodies': [{u'id': u'353720',
u'image': u'a7-1',
u'name': u'Plia Osloy Sai 1',
u'orbit': u'1',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 3291,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1239,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 2377,
u'zircon': 1},
u'size': u'4',
u'star_id': u'49129',
u'star_name': u'Plia Osloy Sai',
u'type': u'asteroid',
u'x': u'433',
u'y': u'-268',
u'zone': u'1|-1'},
{u'id': u'353721',
u'image': u'p28-2',
u'name': u'severus',
u'orbit': u'2',
u'ore': {u'anthracite': 1,
u'bauxite': 1500,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 2500,
u'gold': 1,
u'gypsum': 2000,
u'halite': 3500,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1500,
u'zircon': 1},
u'size': u'45',
u'star_id': u'49129',
u'star_name': u'Plia Osloy Sai',
u'type': u'habitable planet',
u'water': 9230,
u'x': u'434',
u'y': u'-269',
u'zone': u'1|-1'},
{u'id': u'353722',
u'image': u'p38-4',
u'name': u'Plia Osloy Sai 4',
u'orbit': u'4',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 3000,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 7000,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'51',
u'star_id': u'49129',
u'star_name': u'Plia Osloy Sai',
u'type': u'habitable planet',
u'water': 8000,
u'x': u'433',
u'y': u'-272',
u'zone': u'1|-1'},
{u'id': u'353723',
u'image': u'p5-5',
u'name': u'Plia Osloy Sai 5',
u'orbit': u'5',
u'ore': {u'anthracite': 1,
u'bauxite': 2250,
u'beryl': 1,
u'chalcopyrite': 250,
u'chromite': 1,
u'fluorite': 1,
u'galena': 2250,
u'goethite': 1250,
u'gold': 1,
u'gypsum': 1250,
u'halite': 250,
u'kerogen': 1,
u'magnetite': 250,
u'methane': 250,
u'monazite': 1,
u'rutile': 1250,
u'sulfur': 250,
u'trona': 250,
u'uraninite': 250,
u'zircon': 1},
u'size': u'45',
u'star_id': u'49129',
u'star_name': u'Plia Osloy Sai',
u'type': u'habitable planet',
u'water': 6200,
u'x': u'431',
u'y': u'-272',
u'zone': u'1|-1'},
{u'id': u'353724',
u'image': u'p37-6',
u'name': u'T8TE Alpha',
u'orbit': u'6',
u'ore': {u'anthracite': 2000,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 2000,
u'magnetite': 2000,
u'methane': 1,
u'monazite': 2000,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 2000,
u'zircon': 1},
u'size': u'45',
u'star_id': u'49129',
u'star_name': u'Plia Osloy Sai',
u'type': u'habitable planet',
u'water': 6225,
u'x': u'430',
u'y': u'-271',
u'zone': u'1|-1'},
{u'id': u'353725',
u'image': u'p37-7',
u'name': u'Plia Osloy Sai 7',
u'orbit': u'7',
u'ore': {u'anthracite': 2000,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 2000,
u'magnetite': 2000,
u'methane': 1,
u'monazite': 2000,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 2000,
u'zircon': 1},
u'size': u'64',
u'star_id': u'49129',
u'star_name': u'Plia Osloy Sai',
u'type': u'habitable planet',
u'water': 6225,
u'x': u'430',
u'y': u'-269',
u'zone': u'1|-1'},
{u'id': u'353726',
u'image': u'p5-8',
u'name': u'Plia Osloy Sai 8',
u'orbit': u'8',
u'ore': {u'anthracite': 1,
u'bauxite': 2250,
u'beryl': 1,
u'chalcopyrite': 250,
u'chromite': 1,
u'fluorite': 1,
u'galena': 2250,
u'goethite': 1250,
u'gold': 1,
u'gypsum': 1250,
u'halite': 250,
u'kerogen': 1,
u'magnetite': 250,
u'methane': 250,
u'monazite': 1,
u'rutile': 1250,
u'sulfur': 250,
u'trona': 250,
u'uraninite': 250,
u'zircon': 1},
u'size': u'61',
u'star_id': u'49129',
u'star_name': u'Plia Osloy Sai',
u'type': u'habitable planet',
u'water': 6200,
u'x': u'431',
u'y': u'-268',
u'zone': u'1|-1'}],
u'color': u'magenta',
u'id': u'49129',
u'influence': u'0',
u'name': u'Plia Osloy Sai',
u'x': u'432',
u'y': u'-270',
u'zone': u'1|-1'},
{u'bodies': [{u'id': u'359544',
u'image': u'p7-1',
u'name': u'Iostr Ufeass 1',
u'orbit': u'1',
u'ore': {u'anthracite': 1,
u'bauxite': 1700,
u'beryl': 1000,
u'chalcopyrite': 2800,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 2400,
u'gold': 1,
u'gypsum': 2100,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'45',
u'star_id': u'49929',
u'star_name': u'Iostr Ufeass',
u'type': u'habitable planet',
u'water': 5700,
u'x': u'433',
u'y': u'-250',
u'zone': u'1|-1'},
{u'id': u'359545',
u'image': u'p10-2',
u'name': u'Coraton',
u'orbit': u'2',
u'ore': {u'anthracite': 500,
u'bauxite': 1,
u'beryl': 250,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 250,
u'galena': 1,
u'goethite': 1000,
u'gold': 1,
u'gypsum': 500,
u'halite': 1,
u'kerogen': 500,
u'magnetite': 5000,
u'methane': 500,
u'monazite': 250,
u'rutile': 1,
u'sulfur': 500,
u'trona': 500,
u'uraninite': 1,
u'zircon': 250},
u'size': u'45',
u'star_id': u'49929',
u'star_name': u'Iostr Ufeass',
u'type': u'habitable planet',
u'water': 6800,
u'x': u'434',
u'y': u'-251',
u'zone': u'1|-1'},
{u'id': u'359546',
u'image': u'p37-3',
u'name': u'Iostr Ufeass 3',
u'orbit': u'3',
u'ore': {u'anthracite': 2000,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 2000,
u'magnetite': 2000,
u'methane': 1,
u'monazite': 2000,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 2000,
u'zircon': 1},
u'size': u'45',
u'star_id': u'49929',
u'star_name': u'Iostr Ufeass',
u'type': u'habitable planet',
u'water': 6225,
u'x': u'434',
u'y': u'-253',
u'zone': u'1|-1'},
{u'id': u'359547',
u'image': u'a16-4',
u'name': u'Iostr Ufeass 4',
u'orbit': u'4',
u'ore': {u'anthracite': 1,
u'bauxite': 1894,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1793,
u'galena': 1,
u'goethite': 1,
u'gold': 2132,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 2018,
u'uraninite': 1,
u'zircon': 1},
u'size': u'5',
u'star_id': u'49929',
u'star_name': u'Iostr Ufeass',
u'type': u'asteroid',
u'x': u'433',
u'y': u'-254',
u'zone': u'1|-1'},
{u'id': u'359548',
u'image': u'p10-5',
u'name': u'Iostr Ufeass 5',
u'orbit': u'5',
u'ore': {u'anthracite': 500,
u'bauxite': 1,
u'beryl': 250,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 250,
u'galena': 1,
u'goethite': 1000,
u'gold': 1,
u'gypsum': 500,
u'halite': 1,
u'kerogen': 500,
u'magnetite': 5000,
u'methane': 500,
u'monazite': 250,
u'rutile': 1,
u'sulfur': 500,
u'trona': 500,
u'uraninite': 1,
u'zircon': 250},
u'size': u'57',
u'star_id': u'49929',
u'star_name': u'Iostr Ufeass',
u'type': u'habitable planet',
u'water': 6800,
u'x': u'431',
u'y': u'-254',
u'zone': u'1|-1'},
{u'id': u'359549',
u'image': u'pg3-6',
u'name': u'Iostr Ufeass 6',
u'orbit': u'6',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 4000,
u'halite': 14000,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 2000,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'90',
u'star_id': u'49929',
u'star_name': u'Iostr Ufeass',
u'type': u'gas giant',
u'water': 0,
u'x': u'430',
u'y': u'-253',
u'zone': u'1|-1'},
{u'id': u'359550',
u'image': u'p28-7',
u'name': u'Iostr Ufeass 7',
u'orbit': u'7',
u'ore': {u'anthracite': 1,
u'bauxite': 1500,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 2500,
u'gold': 1,
u'gypsum': 2000,
u'halite': 3500,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1500,
u'zircon': 1},
u'size': u'51',
u'star_id': u'49929',
u'star_name': u'Iostr Ufeass',
u'type': u'habitable planet',
u'water': 9230,
u'x': u'430',
u'y': u'-251',
u'zone': u'1|-1'},
{u'id': u'359551',
u'image': u'p38-8',
u'name': u'Iostr Ufeass 8',
u'orbit': u'8',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 3000,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 7000,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'49',
u'star_id': u'49929',
u'star_name': u'Iostr Ufeass',
u'type': u'habitable planet',
u'water': 8000,
u'x': u'431',
u'y': u'-250',
u'zone': u'1|-1'}],
u'color': u'white',
u'id': u'49929',
u'influence': u'0',
u'name': u'Iostr Ufeass',
u'x': u'432',
u'y': u'-252',
u'zone': u'1|-1'},
{u'bodies': [{u'id': u'362436',
u'image': u'a1-1',
u'name': u'Vlea Isphern Oaly 1',
u'orbit': u'1',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1000,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 9000,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'7',
u'star_id': u'50329',
u'star_name': u'Vlea Isphern Oaly',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'433',
u'y': u'-241',
u'zone': u'1|0'},
{u'id': u'362437',
u'image': u'a20-2',
u'name': u'Vlea Isphern Oaly 2',
u'orbit': u'2',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 6342,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'1',
u'star_id': u'50329',
u'star_name': u'Vlea Isphern Oaly',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'434',
u'y': u'-242',
u'zone': u'1|0'},
{u'id': u'362438',
u'image': u'a1-3',
u'name': u'Vlea Isphern Oaly 3',
u'orbit': u'3',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1000,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 9000,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'4',
u'star_id': u'50329',
u'star_name': u'Vlea Isphern Oaly',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'434',
u'y': u'-244',
u'zone': u'1|0'},
{u'id': u'362439',
u'image': u'a23-4',
u'name': u'Vlea Isphern Oaly 4',
u'orbit': u'4',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1000,
u'chalcopyrite': 1000,
u'chromite': 1000,
u'fluorite': 1000,
u'galena': 1000,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1000,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1000},
u'size': u'8',
u'star_id': u'50329',
u'star_name': u'Vlea Isphern Oaly',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'433',
u'y': u'-245',
u'zone': u'1|0'},
{u'id': u'362440',
u'image': u'a13-5',
u'name': u'Vlea Isphern Oaly 5',
u'orbit': u'5',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 6574,
u'uraninite': 1,
u'zircon': 2590},
u'size': u'7',
u'star_id': u'50329',
u'star_name': u'Vlea Isphern Oaly',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'431',
u'y': u'-245',
u'zone': u'1|0'},
{u'id': u'362441',
u'image': u'a2-6',
u'name': u'Vlea Isphern Oaly 6',
u'orbit': u'6',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 4000,
u'chalcopyrite': 5000,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1000},
u'size': u'3',
u'star_id': u'50329',
u'star_name': u'Vlea Isphern Oaly',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'430',
u'y': u'-244',
u'zone': u'1|0'},
{u'id': u'362442',
u'image': u'a9-7',
u'name': u'Vlea Isphern Oaly 7',
u'orbit': u'7',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 5500,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'6',
u'star_id': u'50329',
u'star_name': u'Vlea Isphern Oaly',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'430',
u'y': u'-242',
u'zone': u'1|0'},
{u'id': u'362443',
u'image': u'a10-8',
u'name': u'Vlea Isphern Oaly 8',
u'orbit': u'8',
u'ore': {u'anthracite': 6250,
u'bauxite': 108,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 55,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 300,
u'uraninite': 1,
u'zircon': 1},
u'size': u'9',
u'star_id': u'50329',
u'star_name': u'Vlea Isphern Oaly',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'type': u'asteroid',
u'x': u'431',
u'y': u'-241',
u'zone': u'1|0'}],
u'color': u'magenta',
u'id': u'50329',
u'influence': u'276',
u'name': u'Vlea Isphern Oaly',
u'station': {u'id': u'360983',
u'name': u'SASS 5',
u'x': u'424',
u'y': u'-213'},
u'x': u'432',
u'y': u'-243',
u'zone': u'1|0'},
{u'color': u'blue',
u'id': u'50130',
u'influence': u'192',
u'name': u'Aw Ohaeph Vli',
u'x': u'438',
u'y': u'-247',
u'zone': u'1|0'},
{u'bodies': [{u'id': u'358105',
u'image': u'p13-1',
u'name': u'Chou Idow 1',
u'orbit': u'1',
u'ore': {u'anthracite': 1500,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1300,
u'chromite': 1400,
u'fluorite': 1,
u'galena': 2200,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1500,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 2100,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'45',
u'star_id': u'49730',
u'star_name': u'Chou Idow',
u'type': u'habitable planet',
u'water': 8300,
u'x': u'440',
u'y': u'-254',
u'zone': u'1|-1'},
{u'id': u'358106',
u'image': u'p10-2',
u'name': u'Chou Idow 2',
u'orbit': u'2',
u'ore': {u'anthracite': 500,
u'bauxite': 1,
u'beryl': 250,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 250,
u'galena': 1,
u'goethite': 1000,
u'gold': 1,
u'gypsum': 500,
u'halite': 1,
u'kerogen': 500,
u'magnetite': 5000,
u'methane': 500,
u'monazite': 250,
u'rutile': 1,
u'sulfur': 500,
u'trona': 500,
u'uraninite': 1,
u'zircon': 250},
u'size': u'63',
u'star_id': u'49730',
u'star_name': u'Chou Idow',
u'type': u'habitable planet',
u'water': 6800,
u'x': u'441',
u'y': u'-255',
u'zone': u'1|-1'},
{u'empire': {u'alignment': u'hostile',
u'id': u'46258',
u'is_isolationist': u'0',
u'name': u'Last Legion'},
u'id': u'358107',
u'image': u'p18-3',
u'name': u'in1',
u'orbit': u'3',
u'ore': {u'anthracite': 1,
u'bauxite': 4200,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 3200,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 2600,
u'zircon': 1},
u'size': u'70',
u'star_id': u'49730',
u'star_name': u'Chou Idow',
u'type': u'habitable planet',
u'water': 7600,
u'x': u'441',
u'y': u'-257',
u'zone': u'1|-1'},
{u'id': u'358108',
u'image': u'p28-5',
u'name': u'Chou Idow 5',
u'orbit': u'5',
u'ore': {u'anthracite': 1,
u'bauxite': 1500,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 2500,
u'gold': 1,
u'gypsum': 2000,
u'halite': 3500,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 1500,
u'zircon': 1},
u'size': u'58',
u'star_id': u'49730',
u'star_name': u'Chou Idow',
u'type': u'habitable planet',
u'water': 9230,
u'x': u'438',
u'y': u'-258',
u'zone': u'1|-1'},
{u'id': u'358109',
u'image': u'p38-6',
u'name': u'Chou Idow 6',
u'orbit': u'6',
u'ore': {u'anthracite': 1,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 3000,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 1,
u'magnetite': 1,
u'methane': 1,
u'monazite': 1,
u'rutile': 1,
u'sulfur': 7000,
u'trona': 1,
u'uraninite': 1,
u'zircon': 1},
u'size': u'55',
u'star_id': u'49730',
u'star_name': u'Chou Idow',
u'type': u'habitable planet',
u'water': 8000,
u'x': u'437',
u'y': u'-257',
u'zone': u'1|-1'},
{u'id': u'358110',
u'image': u'p5-7',
u'name': u'Chou Idow 7',
u'orbit': u'7',
u'ore': {u'anthracite': 1,
u'bauxite': 2250,
u'beryl': 1,
u'chalcopyrite': 250,
u'chromite': 1,
u'fluorite': 1,
u'galena': 2250,
u'goethite': 1250,
u'gold': 1,
u'gypsum': 1250,
u'halite': 250,
u'kerogen': 1,
u'magnetite': 250,
u'methane': 250,
u'monazite': 1,
u'rutile': 1250,
u'sulfur': 250,
u'trona': 250,
u'uraninite': 250,
u'zircon': 1},
u'size': u'52',
u'star_id': u'49730',
u'star_name': u'Chou Idow',
u'type': u'habitable planet',
u'water': 6200,
u'x': u'437',
u'y': u'-255',
u'zone': u'1|-1'},
{u'id': u'358111',
u'image': u'p37-8',
u'name': u'Chou Idow 8',
u'orbit': u'8',
u'ore': {u'anthracite': 2000,
u'bauxite': 1,
u'beryl': 1,
u'chalcopyrite': 1,
u'chromite': 1,
u'fluorite': 1,
u'galena': 1,
u'goethite': 1,
u'gold': 1,
u'gypsum': 1,
u'halite': 1,
u'kerogen': 2000,
u'magnetite': 2000,
u'methane': 1,
u'monazite': 2000,
u'rutile': 1,
u'sulfur': 1,
u'trona': 1,
u'uraninite': 2000,
u'zircon': 1},
u'size': u'50',
u'star_id': u'49730',
u'star_name': u'Chou Idow',
u'type': u'habitable planet',
u'water': 6225,
u'x': u'438',
u'y': u'-254',
u'zone': u'1|-1'}],
u'color': u'green',
u'id': u'49730',
u'influence': u'0',
u'name': u'Chou Idow',
u'x': u'439',
u'y': u'-256',
u'zone': u'1|-1'},
{u'color': u'magenta',
u'id': u'49330',
u'influence': u'0',
u'name': u'Eckladdee',
u'x': u'440',
u'y': u'-267',
u'zone': u'1|-1'}],
u'status': {u'empire': {u'alliance_id': u'1785',
u'colonies': {u'358099': u'Cloraphorm III'},
u'essentia': 0,
u'has_new_messages': u'0',
u'home_planet_id': u'358099',
u'id': u'51819',
u'insurrect_value': u'100000',
u'is_isolationist': u'1',
u'latest_message_id': u'0',
u'name': u'MikeTwo',
u'next_colony_cost': u'100000',
u'next_colony_srcs': u'100000',
u'next_station_cost': u'108696',
u'planets': {u'358099': u'Cloraphorm III'},
u'primary_embassy_id': u'5048765',
u'rpc_count': 276,
u'self_destruct_active': u'0',
u'self_destruct_date': u'08 06 2015 05:49:38 +0000',
u'stations': {},
u'status_message': u'Just getting started',
u'tech_level': u'9'},
u'server': {u'rpc_limit': 10000,
u'star_map_size': {u'x': [-1500, 1500], u'y': [-1500, 1500]},
u'time': u'10 07 2015 17:41:04 +0000',
u'version': 3.0911}}}}
''')
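The fixture above ends with an empire/server status block. As a quick orientation, here is a standalone sketch of reading such a parsed blob; the values are hand-copied from the fixture and the nesting is assumed to match what the test double returns:

```python
# Minimal stand-in for the parsed status payload above (assumed structure).
status = {'empire': {'name': 'MikeTwo',
                     'rpc_count': 276,
                     'planets': {'358099': 'Cloraphorm III'}},
          'server': {'rpc_limit': 10000}}

# How many RPC calls remain before hitting the server limit.
remaining = status['server']['rpc_limit'] - status['empire']['rpc_count']
```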
def mock_api(route, method, params=None):
    if method == 'view':
        print("MOCK: Returning individual view")
        return MagicMock()
    if method == 'get_buildings':
        print("MOCK: Returning get buildings response")
        return MagicMock()
    print("MOCK: Returning mock")
    return MagicMock()


class testMap(unittest.TestCase):
    def setUp(self):
        # Create session mock
        self.session_mock = MagicMock()
        self.session_mock.call_method_with_session_id.side_effect = mock_api
        # Patch out stdout
        # use 'reload(sys)' in a test to undo
        # patcher = patch('sys.stdout')
        # patcher.start()
        # self.addCleanup(patcher.stop)

    def tearDown(self):
        pass

    def test_init_and_print(self):
        mymap = pylacuna.core.map.Map(self.session_mock)
        print(mymap)
        print(self.session_mock.mock_calls)


if __name__ == '__main__':
    unittest.main()
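The side_effect dispatch wired up in setUp can be exercised on its own. This is a minimal standalone sketch in which plain dicts stand in for the MagicMock return values of the original, so the routing by method name is visible:

```python
from unittest.mock import MagicMock

def mock_api(route, method, params=None):
    # Same dispatch shape as the test double above, with illustrative
    # dict return values instead of MagicMock instances.
    if method == 'view':
        return {'mock': 'view'}
    if method == 'get_buildings':
        return {'mock': 'buildings'}
    return {'mock': 'generic'}

session = MagicMock()
session.call_method_with_session_id.side_effect = mock_api

# Calling the mocked method invokes mock_api with the same arguments.
result = session.call_method_with_session_id('/map', 'view')
```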
# File: autodiff/operator/basics.py (repo: zhuzilin/autodiff, license: MIT)
import numpy as np
from .operator import Operator


class Add(Operator):
    def forward(self, *args, **kwargs):
        assert len(args) == 2, "add opr only needs 2 inputs"
        super().forward(*args, **kwargs)
        return self._inputs[0].item + self._inputs[1].item

    def backward(self, grad_output):
        self._inputs[0].backward(grad_output, end=False)
        self._inputs[1].backward(grad_output, end=False)


class Sub(Operator):
    def forward(self, *args, **kwargs):
        assert len(args) == 2, "sub opr only needs 2 inputs"
        super().forward(*args, **kwargs)
        return self._inputs[0].item - self._inputs[1].item

    def backward(self, grad_output):
        self._inputs[0].backward(grad_output, end=False)
        self._inputs[1].backward(-grad_output, end=False)


class Mul(Operator):
    def forward(self, *args, **kwargs):
        assert len(args) == 2, "mul opr only needs 2 inputs"
        super().forward(*args, **kwargs)
        return self._inputs[0].item * self._inputs[1].item

    def backward(self, grad_output):
        # d(x*y)/dx = y, d(x*y)/dy = x
        self._inputs[0].backward(grad_output * self._inputs[1].item, end=False)
        self._inputs[1].backward(grad_output * self._inputs[0].item, end=False)


class Div(Operator):
    def forward(self, *args, **kwargs):
        assert len(args) == 2, "div opr only needs 2 inputs"
        super().forward(*args, **kwargs)
        return self._inputs[0].item / self._inputs[1].item

    def backward(self, grad_output):
        # d(x/y)/dx = 1/y, d(x/y)/dy = -x/y**2
        self._inputs[0].backward(grad_output / self._inputs[1].item, end=False)
        self._inputs[1].backward(grad_output * self._inputs[0].item / (-self._inputs[1].item ** 2), end=False)


class Mean(Operator):
    def forward(self, *args, **kwargs):
        assert len(args) == 1, "mean opr only needs 1 input"
        super().forward(*args, **kwargs)
        return self._inputs[0].item.mean()

    def backward(self, grad_output):
        # TODO: support axis-wise mean, maybe using np.prod
        grad_input = grad_output * np.ones(self._inputs[0].item.shape) \
            / self._inputs[0].item.size
        self._inputs[0].backward(grad_input, end=False)
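The derivative formulas hard-coded in Mul.backward and Div.backward can be sanity-checked numerically. A small standalone sketch with plain floats, taking grad_output as 1:

```python
# Finite-difference check of the backward formulas:
# d(x*y)/dy = x, and d(x/y)/dy = -x / y**2.
x, y, eps = 3.0, 4.0, 1e-6

num_mul_dy = (x * (y + eps) - x * y) / eps   # numeric d(x*y)/dy
num_div_dy = (x / (y + eps) - x / y) / eps   # numeric d(x/y)/dy

mul_dy = x               # analytic, as in Mul.backward with grad_output == 1
div_dy = x / (-y ** 2)   # analytic, as in Div.backward with grad_output == 1
```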
# File: exrate/tasks.py (repo: ashraffouda/btc_exchange_rate, license: Apache-2.0)
from btc_exchange_rate.celery import app
from .helpers import get_realtime_exchange_rate


@app.task
def get_realtime_exchange():
    get_realtime_exchange_rate()
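To actually run this task on a schedule, a Celery beat entry would point at its dotted task path. A sketch of such a config fragment, shown as a bare dict; the entry name 'refresh-btc-rate' and the 60-second interval are illustrative values, not from the repo:

```python
# Hypothetical beat schedule entry invoking the task above periodically.
beat_schedule = {
    'refresh-btc-rate': {
        'task': 'exrate.tasks.get_realtime_exchange',
        'schedule': 60.0,  # seconds between runs
    },
}
task_path = beat_schedule['refresh-btc-rate']['task']
```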
# File: test/user14_time.py (repo: time-track-tool/time-track-tool, license: MIT)
from roundup import date
def import_data_14 (db, user, dep, olo) :
otp = None
ud = db.user_dynamic.create \
( hours_fri = 7.5
, hours_sun = 0.0
, hours_wed = 7.75
, daily_worktime = 0.0
, vacation_yearly = 25.0
, all_in = 1
, booking_allowed = 1
, durations_allowed = 0
, hours_tue = 7.75
, weekend_allowed = 0
, hours_mon = 7.75
, hours_thu = 7.75
, vacation_day = 1.0
, valid_from = date.Date ("2015-01-01")
, weekly_hours = 38.5
, travel_full = 0
, vacation_month = 1.0
, hours_sat = 0.0
, department = dep
, org_location = olo
, overtime_period = otp
, user = user
, max_flexitime = 5
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-02')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, duration = 7.75
, work_location = '5'
, wp = '44'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-03')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, duration = 7.75
, work_location = '5'
, wp = '44'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-04')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, duration = 7.75
, work_location = '5'
, wp = '44'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-05')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, duration = 7.75
, work_location = '5'
, wp = '44'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-06')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, duration = 7.5
, work_location = '5'
, wp = '44'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-05-26')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, duration = 7.75
, work_location = '5'
, wp = '44'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-02')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, duration = 7.5
, work_location = '5'
, wp = '44'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-03')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-04')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-05')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, duration = 7.75
, work_location = '5'
, wp = '44'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-08-07')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-08-08')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-08-09')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-08-10')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-08-11')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-08-12')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-08-13')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-08-14')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-08-15')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-08-16')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-08-17')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-08-18')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-08-19')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-08-20')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-08-21')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-01')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, duration = 7.75
, work_location = '5'
, wp = '1'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-06')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, duration = 7.75
, work_location = '5'
, wp = '1'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-07')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-08')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-09')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '14:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-10')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-11')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-12')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-13')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:15'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-14')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '07:30'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '18:30'
, end = '21:00'
, time_activity = '1'
, work_location = '6'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-15')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '16:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-16')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '16:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-17')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-18')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-23')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, duration = 7.5
, work_location = '5'
, wp = '44'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-19')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-20')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '16:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-21')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '09:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-22')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '18:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:30'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-24')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-25')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-26')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-27')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '07:30'
, end = '08:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '18:00'
, work_location = '1'
, wp = '5'
)
db.time_record.create \
( daily_record = dr
, start = '08:30'
, end = '12:00'
, work_location = '1'
, wp = '5'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-28')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '07:30'
, end = '08:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:00'
, work_location = '1'
, wp = '5'
)
db.time_record.create \
( daily_record = dr
, start = '08:30'
, end = '12:00'
, work_location = '1'
, wp = '5'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-29')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '18:00'
, work_location = '1'
, wp = '6'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '6'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-30')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '14:00'
, work_location = '1'
, wp = '6'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-01-31')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-01')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-09')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-10')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-11')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '07:15'
, end = '12:00'
, time_activity = '1'
, work_location = '6'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '13:30'
, time_activity = '10'
, work_location = '3'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '13:30'
, end = '16:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-12')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:45'
, end = '17:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '12:15'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-13')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '14:15'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-14')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-15')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-07')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-08')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-16')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-17')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-18')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-19')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '16:15'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-20')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '14:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-21')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-22')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-23')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-24')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '06:30'
, end = '11:30'
, time_activity = '1'
, work_location = '3'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '18:30'
, time_activity = '1'
, work_location = '6'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '11:30'
, end = '12:00'
, time_activity = '1'
, work_location = '6'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-25')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '18:00'
, time_activity = '1'
, work_location = '6'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:30'
, end = '12:00'
, time_activity = '1'
, work_location = '6'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '18:30'
, end = '23:45'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-26')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-27')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '15:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-02-28')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-01')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-02')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-03')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '13:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '13:30'
, end = '14:00'
, time_activity = '10'
, work_location = '3'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '14:00'
, end = '17:00'
, time_activity = '1'
, work_location = '6'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-04')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '18:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-05')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:30'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-06')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '14:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-07')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-08')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-09')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '16:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-10')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '17:45'
, end = '20:00'
, time_activity = '1'
, work_location = '6'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-11')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '05:00'
, end = '10:00'
, time_activity = '11'
, work_location = '3'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:00'
, time_activity = '1'
, work_location = '6'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '17:30'
, end = '23:30'
, time_activity = '11'
, work_location = '3'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '10:30'
, end = '12:00'
, time_activity = '1'
, work_location = '6'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-12')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-13')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '14:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-14')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-15')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-16')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-17')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '16:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-18')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '07:30'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-19')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-20')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '10:45'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '14:15'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-21')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-22')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-23')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-24')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-25')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '18:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '19:00'
, end = '19:30'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-26')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '18:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:15'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-27')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '11:30'
, end = '14:30'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-28')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-29')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-30')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '13:30'
, end = '19:30'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:30'
, end = '13:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-03-31')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '16:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:30'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '18:00'
, end = '19:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-04-01')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '04:30'
, end = '07:45'
, time_activity = '10'
, work_location = '3'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '07:45'
, end = '13:00'
, time_activity = '1'
, work_location = '6'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '13:30'
, end = '17:30'
, time_activity = '1'
, work_location = '6'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '17:30'
, end = '23:45'
, time_activity = '10'
, work_location = '3'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-04-02')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '16:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:30'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-04-03')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '08:30'
, end = '13:30'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-04-04')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-04-05')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-04-07')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, duration = 7.75
, work_location = '5'
, wp = '44'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-04-06')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, duration = 7.75
, work_location = '5'
, wp = '1'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-04-08')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:00'
, work_location = '1'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-04-09')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '12:00'
, time_activity = '1'
, work_location = '6'
, wp = '4'
)
db.time_record.create \
( daily_record = dr
, start = '12:30'
, end = '17:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-04-10')
, weekend_allowed = 0
, required_overtime = 0
)
db.time_record.create \
( daily_record = dr
, start = '08:00'
, end = '14:00'
, work_location = '1'
, wp = '4'
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-04-11')
, weekend_allowed = 0
, required_overtime = 0
)
dr = db.daily_record.create \
( user = user
, date = date.Date ('2015-04-12')
, weekend_allowed = 0
, required_overtime = 0
)
db.user_dynamic.set (ud, valid_to = date.Date ("2015-06-01"))
db.commit ()
# end def import_data_14
# File: app/main/errors.py (repo: sharon0812/News, license: MIT)
from flask import render_template
from . import main
# File: fjord/base/tests/test_auth.py (repo: joshua-s/fjord, license: BSD-3-Clause)
from django.test.client import RequestFactory
from nose.tools import eq_
from fjord.base.browserid import FjordVerify
from fjord.base.tests import BaseTestCase, profile, reverse, user
class TestAuth(BaseTestCase):
def test_new_user(self):
"""Tests that new users get redirected to new_user page"""
# Create a user that has no profile--this is the sign that the
# user is new!
new_user = user(save=True)
self.client_login_user(new_user)
# Now do some ridiculous setup so we can call login_success()
# on the Verify and see if it did what it should be doing.
# FIXME - this can go away post django-browserid 0.9
new_user.backend = 'django_browserid.auth.BrowserIDBackend'
get_request = RequestFactory().get(reverse('dashboard'))
get_request.user = new_user
get_request.session = self.client.session
fv = FjordVerify()
fv.user = new_user
fv.request = get_request
resp = fv.login_success()
eq_(302, resp.status_code)
eq_(resp.get('location'), reverse('new-user-view'))
def test_existing_user(self):
"""Tests that existing users get redirected to right place"""
# Create a user that has no profile--this is the sign that the
# user is new!
new_user = user(save=True)
profile(user=new_user, save=True)
self.client_login_user(new_user)
# Now do some ridiculous setup so we can call login_success()
# on the Verify and see if it did what it should be doing.
# FIXME - this can go away post django-browserid 0.9
new_user.backend = 'django_browserid.auth.BrowserIDBackend'
# First, do it RAW!
get_request = RequestFactory().get(reverse('dashboard'))
get_request.user = new_user
get_request.session = self.client.session
fv = FjordVerify()
fv.user = new_user
fv.request = get_request
resp = fv.login_success()
eq_(302, resp.status_code)
eq_(resp.get('location'), '/')
# Now do it with next!
get_request = RequestFactory().get(reverse('dashboard') + '?next=/foo')
get_request.user = new_user
get_request.session = self.client.session
fv = FjordVerify()
fv.user = new_user
fv.request = get_request
resp = fv.login_success()
eq_(302, resp.status_code)
eq_(resp.get('location'), '/foo')
# File: alerter/test/alerter/factory/test_alerting_factory.py (repo: SimplyVC/panic, license: Apache-2.0)
import logging
import unittest
from datetime import datetime
from datetime import timedelta
from freezegun import freeze_time
from parameterized import parameterized
from src.alerter.alerts.node.chainlink import (
NoChangeInHeightAlert, BlockHeightUpdatedAlert,
TotalErroredJobRunsIncreasedAboveThresholdAlert,
TotalErroredJobRunsDecreasedBelowThresholdAlert,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert,
ChangeInSourceNodeAlert, PrometheusSourceIsDownAlert,
PrometheusSourceBackUpAgainAlert, EthBalanceIncreasedAboveThresholdAlert,
EthBalanceDecreasedBelowThresholdAlert, InvalidUrlAlert, ValidUrlAlert,
NodeWentDownAtAlert, NodeBackUpAgainAlert, NodeStillDownAlert)
from src.alerter.alerts.node.evm import (
BlockHeightDifferenceIncreasedAboveThresholdAlert,
BlockHeightDifferenceDecreasedBelowThresholdAlert
)
from src.alerter.factory.alerting_factory import AlertingFactory
from src.alerter.grouped_alerts_metric_code.node.chainlink_node_metric_code \
import GroupedChainlinkNodeAlertsMetricCode
from src.alerter.grouped_alerts_metric_code.node.evm_node_metric_code \
import GroupedEVMNodeAlertsMetricCode as EVMAlertsMetricCode
from src.configs.alerts.node.chainlink import ChainlinkNodeAlertsConfig
from src.configs.alerts.node.evm import EVMNodeAlertsConfig
from src.utils.configs import parse_alert_time_thresholds
from src.utils.exceptions import InvalidUrlException, MetricNotFoundException
from src.utils.timing import (TimedTaskTracker, TimedTaskLimiter,
OccurrencesInTimePeriodTracker)
"""
We will use some Chainlink and EVM node alerts and configurations for the tests
below. This should not affect the validity and scope of the tests because the
implementation was written to be as general as possible.
"""
class ChainlinkAlertingFactoryInstance(AlertingFactory):
def __init__(self, component_logger: logging.Logger) -> None:
super().__init__(component_logger)
def create_alerting_state(
self, parent_id: str, node_id: str,
cl_node_alerts_config: ChainlinkNodeAlertsConfig) -> None:
"""
This function is a smaller version of the ChainlinkNodeAlertingFactory
create_alerting_state function
"""
if parent_id not in self.alerting_state:
self.alerting_state[parent_id] = {}
if node_id not in self.alerting_state[parent_id]:
warning_sent = {
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value:
False,
GroupedChainlinkNodeAlertsMetricCode.
TotalErroredJobRunsThreshold.value: False,
GroupedChainlinkNodeAlertsMetricCode.
MaxUnconfirmedBlocksThreshold.value: False,
GroupedChainlinkNodeAlertsMetricCode.EthBalanceThreshold.value:
False,
GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value: False,
                GroupedChainlinkNodeAlertsMetricCode.
                    PrometheusSourceIsDown.value: False,
}
critical_sent = {
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value:
False,
GroupedChainlinkNodeAlertsMetricCode.
MaxUnconfirmedBlocksThreshold.value: False,
GroupedChainlinkNodeAlertsMetricCode.
TotalErroredJobRunsThreshold.value: False,
GroupedChainlinkNodeAlertsMetricCode.EthBalanceThreshold.value:
False,
GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value: False,
}
error_sent = {
GroupedChainlinkNodeAlertsMetricCode.InvalidUrl.value: False,
}
current_head_thresholds = parse_alert_time_thresholds(
['warning_threshold', 'critical_threshold', 'critical_repeat'],
cl_node_alerts_config.head_tracker_current_head)
eth_balance_thresholds = parse_alert_time_thresholds(
['critical_repeat'], cl_node_alerts_config.eth_balance_amount
)
node_is_down_thresholds = parse_alert_time_thresholds(
['warning_threshold', 'critical_threshold',
'critical_repeat'], cl_node_alerts_config.node_is_down
)
error_jobs_thresholds = parse_alert_time_thresholds(
['warning_time_window', 'critical_time_window',
'critical_repeat'],
cl_node_alerts_config.run_status_update_total
)
unconfirmed_blocks_thresholds = parse_alert_time_thresholds(
['warning_time_window', 'critical_time_window',
'critical_repeat'],
cl_node_alerts_config.max_unconfirmed_blocks
)
warning_window_timer = {
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value:
TimedTaskTracker(timedelta(
seconds=current_head_thresholds['warning_threshold'])),
GroupedChainlinkNodeAlertsMetricCode.
MaxUnconfirmedBlocksThreshold.value:
TimedTaskTracker(timedelta(
seconds=unconfirmed_blocks_thresholds[
'warning_time_window'])),
GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value:
TimedTaskTracker(timedelta(seconds=node_is_down_thresholds[
'warning_threshold'])),
}
critical_window_timer = {
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value:
TimedTaskTracker(timedelta(
seconds=current_head_thresholds['critical_threshold'])),
GroupedChainlinkNodeAlertsMetricCode.
MaxUnconfirmedBlocksThreshold.value:
TimedTaskTracker(timedelta(
seconds=unconfirmed_blocks_thresholds[
'critical_time_window'])),
GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value:
TimedTaskTracker(timedelta(seconds=node_is_down_thresholds[
'critical_threshold'])),
}
critical_repeat_timer = {
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value:
TimedTaskLimiter(timedelta(
seconds=current_head_thresholds['critical_repeat'])),
GroupedChainlinkNodeAlertsMetricCode.EthBalanceThreshold.value:
TimedTaskLimiter(timedelta(seconds=eth_balance_thresholds[
'critical_repeat'])),
GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value:
TimedTaskLimiter(timedelta(seconds=node_is_down_thresholds[
'critical_repeat'])),
GroupedChainlinkNodeAlertsMetricCode.
MaxUnconfirmedBlocksThreshold.value:
TimedTaskLimiter(timedelta(
seconds=unconfirmed_blocks_thresholds[
'critical_repeat'])),
GroupedChainlinkNodeAlertsMetricCode.
TotalErroredJobRunsThreshold.value:
TimedTaskLimiter(timedelta(seconds=error_jobs_thresholds[
'critical_repeat'])),
}
warning_occurrences_in_period_tracker = {
GroupedChainlinkNodeAlertsMetricCode.
TotalErroredJobRunsThreshold.value:
OccurrencesInTimePeriodTracker(timedelta(
seconds=error_jobs_thresholds[
'warning_time_window'])),
}
critical_occurrences_in_period_tracker = {
GroupedChainlinkNodeAlertsMetricCode.
TotalErroredJobRunsThreshold.value:
OccurrencesInTimePeriodTracker(timedelta(
seconds=error_jobs_thresholds[
'critical_time_window'])),
}
self.alerting_state[parent_id][node_id] = {
'warning_sent': warning_sent,
'critical_sent': critical_sent,
'error_sent': error_sent,
'warning_window_timer': warning_window_timer,
'critical_window_timer': critical_window_timer,
'critical_repeat_timer': critical_repeat_timer,
'warning_occurrences_in_period_tracker':
warning_occurrences_in_period_tracker,
'critical_occurrences_in_period_tracker':
critical_occurrences_in_period_tracker,
}
class EVMAlertingFactoryInstance(AlertingFactory):
def __init__(self, component_logger: logging.Logger) -> None:
super().__init__(component_logger)
def create_alerting_state(
self, parent_id: str, node_id: str,
evm_node_alerts_config: EVMNodeAlertsConfig) -> None:
"""
This function is a smaller version of the EVMNodeAlertingFactory
create_alerting_state function
"""
if parent_id not in self.alerting_state:
self.alerting_state[parent_id] = {}
if node_id not in self.alerting_state[parent_id]:
warning_sent = {
EVMAlertsMetricCode.NoChangeInBlockHeight.value: False,
EVMAlertsMetricCode.BlockHeightDifference.value: False,
EVMAlertsMetricCode.NodeIsDown.value: False
}
critical_sent = {
EVMAlertsMetricCode.NoChangeInBlockHeight.value: False,
EVMAlertsMetricCode.BlockHeightDifference.value: False,
EVMAlertsMetricCode.NodeIsDown.value: False
}
error_sent = {
EVMAlertsMetricCode.InvalidUrl.value: False,
}
evm_node_is_down_thresholds = parse_alert_time_thresholds(
['warning_threshold', 'critical_threshold', 'critical_repeat'],
evm_node_alerts_config.evm_node_is_down)
block_height_difference_thresholds = parse_alert_time_thresholds(
['critical_repeat'],
evm_node_alerts_config.
evm_block_syncing_block_height_difference)
no_change_in_block_height_thresholds = parse_alert_time_thresholds(
['warning_threshold', 'critical_threshold', 'critical_repeat'],
evm_node_alerts_config.
evm_block_syncing_no_change_in_block_height)
warning_window_timer = {
EVMAlertsMetricCode.NoChangeInBlockHeight.value:
TimedTaskTracker(timedelta(
seconds=no_change_in_block_height_thresholds[
'warning_threshold'])),
EVMAlertsMetricCode.NodeIsDown.value:
TimedTaskTracker(timedelta(
seconds=evm_node_is_down_thresholds[
'warning_threshold'])),
}
critical_window_timer = {
EVMAlertsMetricCode.NoChangeInBlockHeight.value:
TimedTaskTracker(timedelta(
seconds=no_change_in_block_height_thresholds[
'critical_threshold'])),
EVMAlertsMetricCode.NodeIsDown.value:
TimedTaskTracker(timedelta(
seconds=evm_node_is_down_thresholds[
'critical_threshold'])),
}
critical_repeat_timer = {
EVMAlertsMetricCode.NoChangeInBlockHeight.value:
TimedTaskLimiter(
timedelta(seconds=no_change_in_block_height_thresholds[
'critical_repeat'])),
EVMAlertsMetricCode.NodeIsDown.value:
TimedTaskLimiter(timedelta(
seconds=evm_node_is_down_thresholds[
'critical_repeat'])),
EVMAlertsMetricCode.BlockHeightDifference.value:
TimedTaskLimiter(timedelta(
seconds=block_height_difference_thresholds[
'critical_repeat']))
}
self.alerting_state[parent_id][node_id] = {
'warning_sent': warning_sent,
'critical_sent': critical_sent,
'error_sent': error_sent,
'warning_window_timer': warning_window_timer,
'critical_window_timer': critical_window_timer,
'critical_repeat_timer': critical_repeat_timer,
'current_height': None,
}
class TestAlertingFactory(unittest.TestCase):
def setUp(self) -> None:
# Some dummy data
self.dummy_logger = logging.getLogger('dummy')
self.test_alerting_state = {
'test_key': 'test_val'
}
self.test_parent_id = 'chain_name_4569u540hg8d0fgd0f8th4050h_3464597'
self.test_node_id = 'node_id34543496346t9345459-34689346h-3463-5'
self.test_node_name = 'test_node_name'
# Dummy test objects
self.head_tracker_current_head = {
'name': 'head_tracker_current_head',
'parent_id': self.test_parent_id,
'enabled': 'true',
'critical_threshold': '7',
'critical_repeat': '5',
'critical_enabled': 'true',
'critical_repeat_enabled': 'true',
'warning_threshold': '3',
'warning_enabled': 'true'
}
self.max_unconfirmed_blocks = {
'name': 'max_unconfirmed_blocks',
'parent_id': self.test_parent_id,
'enabled': 'true',
'critical_threshold': '5',
'critical_repeat': '5',
'critical_enabled': 'true',
'critical_repeat_enabled': 'true',
'warning_threshold': '3',
'warning_enabled': 'true',
'warning_time_window': '3',
'critical_time_window': '7',
}
self.run_status_update_total = {
'name': 'run_status_update_total',
'parent_id': self.test_parent_id,
'enabled': 'true',
'critical_threshold': '5',
'critical_repeat': '5',
'critical_enabled': 'true',
'critical_repeat_enabled': 'true',
'warning_threshold': '3',
'warning_enabled': 'true',
'warning_time_window': '3',
'critical_time_window': '7',
}
self.eth_balance_amount = {
'name': 'eth_balance_amount',
'parent_id': self.test_parent_id,
'enabled': 'true',
'critical_threshold': '5',
'critical_repeat': '5',
'critical_enabled': 'true',
'critical_repeat_enabled': 'true',
'warning_threshold': '10',
'warning_enabled': 'true',
}
self.node_is_down = {
'name': 'node_is_down',
'parent_id': self.test_parent_id,
'enabled': 'true',
'critical_threshold': '5',
'critical_repeat': '5',
'critical_enabled': 'true',
'critical_repeat_enabled': 'true',
'warning_threshold': '3',
'warning_enabled': 'true',
}
self.test_alerts_config = ChainlinkNodeAlertsConfig(
parent_id=self.test_parent_id,
head_tracker_current_head=self.head_tracker_current_head,
head_tracker_heads_received_total={},
max_unconfirmed_blocks=self.max_unconfirmed_blocks,
process_start_time_seconds={},
tx_manager_gas_bump_exceeds_limit_total={},
unconfirmed_transactions={},
run_status_update_total=self.run_status_update_total,
eth_balance_amount=self.eth_balance_amount,
eth_balance_amount_increase={}, node_is_down=self.node_is_down,
)
self.test_factory_instance = ChainlinkAlertingFactoryInstance(
self.dummy_logger)
self.test_factory_instance.create_alerting_state(
self.test_parent_id, self.test_node_id, self.test_alerts_config)
# Create EVM Alerting state
# Construct the configs
metrics_without_time_window = [
'evm_block_syncing_no_change_in_block_height',
'evm_block_syncing_block_height_difference',
'evm_node_is_down'
]
filtered = {}
for metric in metrics_without_time_window:
filtered[metric] = {
'name': metric,
'parent_id': self.test_parent_id,
'enabled': 'true',
'critical_threshold': '7',
'critical_repeat': '5',
'critical_enabled': 'true',
'critical_repeat_enabled': 'true',
'warning_threshold': '3',
'warning_enabled': 'true'
}
self.evm_node_alerts_config = EVMNodeAlertsConfig(
parent_id=self.test_parent_id,
evm_node_is_down=filtered['evm_node_is_down'],
evm_block_syncing_block_height_difference=filtered[
'evm_block_syncing_block_height_difference'],
evm_block_syncing_no_change_in_block_height=filtered[
'evm_block_syncing_no_change_in_block_height']
)
self.test_evm_factory_instance = EVMAlertingFactoryInstance(
self.dummy_logger)
self.test_evm_factory_instance.create_alerting_state(
self.test_parent_id, self.test_node_id, self.evm_node_alerts_config)
def tearDown(self) -> None:
self.dummy_logger = None
self.test_alerts_config = None
self.test_factory_instance = None
self.test_evm_factory_instance = None
def test_alerting_state_returns_alerting_state(self) -> None:
self.test_factory_instance._alerting_state = self.test_alerting_state
self.assertEqual(self.test_alerting_state,
self.test_factory_instance.alerting_state)
def test_component_logger_returns_logger_instance(self) -> None:
self.test_factory_instance._component_logger = self.dummy_logger
self.assertEqual(self.dummy_logger,
self.test_factory_instance.component_logger)
def test_classify_no_change_in_alert_does_nothing_warning_critical_disabled(
self) -> None:
"""
In this test we will check that no alert is raised and no timer is
started whenever both warning and critical alerts are disabled. We will
perform this test only for when current == previous. For an alert to be
raised when current != previous it must be that one of the severities is
enabled.
"""
self.test_alerts_config.head_tracker_current_head[
'warning_enabled'] = 'False'
self.test_alerts_config.head_tracker_current_head[
'critical_enabled'] = 'False'
data_for_alerting = []
self.test_factory_instance.classify_no_change_in_alert(
50, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, datetime.now().timestamp()
)
critical_window_timer = self.test_factory_instance.alerting_state[
self.test_parent_id][self.test_node_id]['critical_window_timer'][
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value]
warning_window_timer = self.test_factory_instance.alerting_state[
self.test_parent_id][self.test_node_id]['warning_window_timer'][
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value]
self.assertEqual([], data_for_alerting)
self.assertFalse(critical_window_timer.timer_started)
self.assertFalse(warning_window_timer.timer_started)
def test_classify_no_change_does_nothing_if_change_and_no_issue_raised(
self) -> None:
"""
        In this test we will check that no alert is raised if the value is
        changing and no issue has already been reported.
"""
data_for_alerting = []
self.test_factory_instance.classify_no_change_in_alert(
51, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual([], data_for_alerting)
@parameterized.expand([
('warning_threshold', 'WARNING',), ('critical_threshold', 'CRITICAL',)
])
@freeze_time("2012-01-01")
def test_classify_no_change_in_alert_raises_alert_if_time_window_elapsed(
self, threshold, severity) -> None:
"""
        In this test we will check that a warning/critical no change in alert is
        raised if the value is not being updated and the warning/critical time
        window elapses. We will also check that no alert is raised the first
        time round (as the timer is only being started), or while the
        warning/critical time window has not yet elapsed.
"""
data_for_alerting = []
# No alert is raised if timer not started yet
self.test_factory_instance.classify_no_change_in_alert(
50, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual([], data_for_alerting)
# No alert is raised if the time window is not elapsed yet
self.test_factory_instance.classify_no_change_in_alert(
50, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual([], data_for_alerting)
# No change in alert is raised if time window elapsed
pad = float(self.test_alerts_config.head_tracker_current_head[
threshold])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_no_change_in_alert(
50, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, alert_timestamp
)
expected_alert = NoChangeInHeightAlert(
self.test_node_name, pad, severity, alert_timestamp,
self.test_parent_id, self.test_node_id, 50)
self.assertEqual(1, len(data_for_alerting))
self.assertEqual(expected_alert.alert_data, data_for_alerting[0])
@freeze_time("2012-01-01")
def test_classify_no_change_in_alert_no_warning_if_warning_already_sent(
self) -> None:
"""
In this test we will check that no warning alert is raised if a warning
alert has already been sent
"""
data_for_alerting = []
# Set the timer
self.test_factory_instance.classify_no_change_in_alert(
50, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, datetime.now().timestamp()
)
# Send warning alert
pad = float(self.test_alerts_config.head_tracker_current_head[
'warning_threshold'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_no_change_in_alert(
50, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, alert_timestamp
)
self.assertEqual(1, len(data_for_alerting))
# Check that no alert is raised even if the warning window elapses again
data_for_alerting.clear()
alert_timestamp = datetime.now().timestamp() + (2 * pad)
self.test_factory_instance.classify_no_change_in_alert(
50, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, alert_timestamp
)
self.assertEqual([], data_for_alerting)
@freeze_time("2012-01-01")
def test_classify_no_change_in_alert_raises_critical_if_repeat_elapsed(
self) -> None:
"""
        In this test we will check that a critical no change in alert is
        re-raised once the critical repeat window elapses. We will also check
        that if the critical repeat window does not elapse, a critical alert
        is not re-raised.
"""
data_for_alerting = []
# Start timer
self.test_factory_instance.classify_no_change_in_alert(
50, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, datetime.now().timestamp()
)
# First CRITICAL no change in alert
pad = float(self.test_alerts_config.head_tracker_current_head[
'critical_threshold'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_no_change_in_alert(
50, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, alert_timestamp
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
# Classify before the repeat time has elapsed to confirm that no
# critical alert is raised.
pad = float(self.test_alerts_config.head_tracker_current_head[
'critical_threshold']) + float(
self.test_alerts_config.head_tracker_current_head[
'critical_repeat']) - 1
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_no_change_in_alert(
50, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, alert_timestamp
)
self.assertEqual([], data_for_alerting)
# Let the repeat time elapse and check that a critical alert is
# re-raised
pad = float(self.test_alerts_config.head_tracker_current_head[
'critical_threshold']) + float(
self.test_alerts_config.head_tracker_current_head[
'critical_repeat'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_no_change_in_alert(
50, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, alert_timestamp
)
expected_alert = NoChangeInHeightAlert(
self.test_node_name, pad, 'CRITICAL', alert_timestamp,
self.test_parent_id, self.test_node_id, 50)
self.assertEqual(1, len(data_for_alerting))
self.assertEqual(expected_alert.alert_data, data_for_alerting[0])
@freeze_time("2012-01-01")
def test_classify_no_change_in_alert_only_1_critical_if_repeat_disabled(
self) -> None:
"""
In this test we will check that if critical_repeat is disabled, a no
change critical alert is not re-raised.
"""
self.test_alerts_config.head_tracker_current_head[
'critical_repeat_enabled'] = "False"
data_for_alerting = []
# Start timer
self.test_factory_instance.classify_no_change_in_alert(
50, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, datetime.now().timestamp()
)
# First CRITICAL no change in alert
pad = float(self.test_alerts_config.head_tracker_current_head[
'critical_threshold'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_no_change_in_alert(
50, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, alert_timestamp
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
# Let the repeat time elapse and check that a critical alert is
# still not re-raised
pad = float(self.test_alerts_config.head_tracker_current_head[
'critical_threshold']) + float(
self.test_alerts_config.head_tracker_current_head[
'critical_repeat'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_no_change_in_alert(
50, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, alert_timestamp
)
self.assertEqual([], data_for_alerting)
@parameterized.expand([
('critical_threshold',), ('warning_threshold',)
])
@freeze_time("2012-01-01")
def test_classify_no_change_alert_raises_info_if_issue_solved(
self, threshold) -> None:
"""
In this test we will check that once the no change problem is solved,
an info alert is raised. We will perform this test both for when a
warning alert has been sent and for when a critical alert has been sent.
"""
data_for_alerting = []
# Start timers
self.test_factory_instance.classify_no_change_in_alert(
50, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual([], data_for_alerting)
# Raise problem alert
pad = float(self.test_alerts_config.head_tracker_current_head[
threshold])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_no_change_in_alert(
50, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, alert_timestamp
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
# Check that an INFO alert is raised
pad = float(self.test_alerts_config.head_tracker_current_head[
threshold])
alert_timestamp = datetime.now().timestamp() + pad + 60
self.test_factory_instance.classify_no_change_in_alert(
51, 50, self.test_alerts_config.head_tracker_current_head,
NoChangeInHeightAlert, BlockHeightUpdatedAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.NoChangeInHeight.value,
self.test_node_name, alert_timestamp
)
expected_alert = BlockHeightUpdatedAlert(
self.test_node_name, 'INFO', alert_timestamp, self.test_parent_id,
self.test_node_id, 51)
self.assertEqual(1, len(data_for_alerting))
self.assertEqual(expected_alert.alert_data, data_for_alerting[0])
def test_classify_thresh_time_win_does_nothing_warning_critical_disabled(
self) -> None:
"""
In this test we will check that no alert is raised and that no timer is
started whenever both warning and critical alerts are disabled. We will
perform this test for both when current >= critical and
current >= warning. For an alert to be raised when current < critical or
current < warning it must be that one of the severities is enabled.
"""
self.test_alerts_config.max_unconfirmed_blocks[
'warning_enabled'] = 'False'
self.test_alerts_config.max_unconfirmed_blocks[
'critical_enabled'] = 'False'
data_for_alerting = []
current = int(self.test_alerts_config.max_unconfirmed_blocks[
'critical_threshold'])
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, datetime.now().timestamp()
)
warning_timer = self.test_factory_instance.alerting_state[
self.test_parent_id][self.test_node_id]['warning_window_timer'][
GroupedChainlinkNodeAlertsMetricCode
.MaxUnconfirmedBlocksThreshold.value]
critical_timer = self.test_factory_instance.alerting_state[
self.test_parent_id][self.test_node_id]['critical_window_timer'][
GroupedChainlinkNodeAlertsMetricCode
.MaxUnconfirmedBlocksThreshold.value]
self.assertEqual([], data_for_alerting)
self.assertFalse(warning_timer.timer_started)
self.assertFalse(critical_timer.timer_started)
@parameterized.expand([
('warning_time_window', 'WARNING',),
('critical_time_window', 'CRITICAL',)
])
@freeze_time("2012-01-01")
def test_classify_thresh_time_win_raises_alert_if_above_thresh_and_elapsed(
self, threshold, severity) -> None:
"""
In this test we will check that a warning/critical above threshold alert
is raised if the time window above warning/critical threshold elapses.
We will also check that no alert is raised the first time round (as the
timer is only being started), and that no alert is raised if the
warning/critical time window has not yet elapsed.
"""
data_for_alerting = []
# No alert is raised if timer not started yet
current = int(self.test_alerts_config.max_unconfirmed_blocks[
'critical_threshold'])
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, datetime.now().timestamp()
)
self.assertEqual([], data_for_alerting)
# No alert is raised if the time window is not elapsed yet
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, datetime.now().timestamp()
)
self.assertEqual([], data_for_alerting)
# Above threshold alert is raised if time window elapsed
pad = float(self.test_alerts_config.max_unconfirmed_blocks[
threshold])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, alert_timestamp
)
expected_alert = MaxUnconfirmedBlocksIncreasedAboveThresholdAlert(
self.test_node_name, current, severity, alert_timestamp, pad,
severity, self.test_parent_id, self.test_node_id)
self.assertEqual(1, len(data_for_alerting))
self.assertEqual(expected_alert.alert_data, data_for_alerting[0])
@freeze_time("2012-01-01")
def test_classify_thresh_time_win_no_warning_if_warning_already_sent(
self) -> None:
"""
In this test we will check that no warning alert is raised if a warning
alert has already been sent.
"""
data_for_alerting = []
# Set the timer
current = int(self.test_alerts_config.max_unconfirmed_blocks[
'critical_threshold'])
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, datetime.now().timestamp()
)
# Send warning alert
pad = float(self.test_alerts_config.max_unconfirmed_blocks[
'warning_time_window'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, alert_timestamp
)
self.assertEqual(1, len(data_for_alerting))
# Check that no alert is raised even if the warning window elapses again
data_for_alerting.clear()
alert_timestamp = datetime.now().timestamp() + (2 * pad)
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, alert_timestamp
)
self.assertEqual([], data_for_alerting)
@freeze_time("2012-01-01")
def test_classify_thresh_time_win_raises_critical_if_repeat_elapsed(
self) -> None:
"""
In this test we will check that a critical above threshold alert is
re-raised if the critical window elapses. We will also check that if the
critical window does not elapse, a critical alert is not re-raised.
"""
data_for_alerting = []
# Start timer
current = int(self.test_alerts_config.max_unconfirmed_blocks[
'critical_threshold'])
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, datetime.now().timestamp()
)
# First CRITICAL above threshold alert
pad = float(self.test_alerts_config.max_unconfirmed_blocks[
'critical_time_window'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, alert_timestamp
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
# Classify before the repeat time has elapsed to confirm that no
# critical alert is raised.
pad = float(self.test_alerts_config.max_unconfirmed_blocks[
'critical_time_window']) + float(
self.test_alerts_config.max_unconfirmed_blocks[
'critical_repeat']) - 1
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, alert_timestamp
)
self.assertEqual([], data_for_alerting)
# Let the repeat time elapse and check that a critical alert is
# re-raised
pad = float(self.test_alerts_config.max_unconfirmed_blocks[
'critical_time_window']) + float(
self.test_alerts_config.max_unconfirmed_blocks[
'critical_repeat'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, alert_timestamp
)
expected_alert = MaxUnconfirmedBlocksIncreasedAboveThresholdAlert(
self.test_node_name, current, 'CRITICAL', alert_timestamp, pad,
'CRITICAL', self.test_parent_id, self.test_node_id)
self.assertEqual(1, len(data_for_alerting))
self.assertEqual(expected_alert.alert_data, data_for_alerting[0])
@freeze_time("2012-01-01")
def test_classify_thresh_time_win_only_1_critical_if_above_and_no_repeat(
self) -> None:
"""
In this test we will check that if critical_repeat is disabled, an
increased above critical alert is not re-raised.
"""
self.test_alerts_config.max_unconfirmed_blocks[
'critical_repeat_enabled'] = "False"
data_for_alerting = []
# Start timer
current = int(self.test_alerts_config.max_unconfirmed_blocks[
'critical_threshold'])
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, datetime.now().timestamp()
)
# First CRITICAL above threshold alert
pad = float(self.test_alerts_config.max_unconfirmed_blocks[
'critical_time_window'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, alert_timestamp
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
# Let the repeat time elapse and check that a critical alert is
# still not re-raised
pad = float(self.test_alerts_config.max_unconfirmed_blocks[
'critical_time_window']) + float(
self.test_alerts_config.max_unconfirmed_blocks[
'critical_repeat'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, alert_timestamp
)
self.assertEqual([], data_for_alerting)
@parameterized.expand([
('critical_threshold', 'critical_time_window', 'CRITICAL',),
('warning_threshold', 'warning_time_window', 'WARNING',)
])
@freeze_time("2012-01-01")
def test_classify_thresh_time_win_info_alert_if_below_thresh_and_alert_sent(
self, threshold, time_window_threshold, threshold_severity) -> None:
"""
In this test we will check that once the current value is less than a
threshold, a decreased below threshold info alert is sent. We will
perform this test for both warning and critical.
"""
data_for_alerting = []
# Start timers
current = int(self.test_alerts_config.max_unconfirmed_blocks[
threshold])
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, datetime.now().timestamp()
)
# First above threshold alert
pad = float(self.test_alerts_config.max_unconfirmed_blocks[
time_window_threshold])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, alert_timestamp
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
# Check that an INFO alert is raised
pad = float(self.test_alerts_config.max_unconfirmed_blocks[
time_window_threshold])
alert_timestamp = datetime.now().timestamp() + pad + 60
self.test_factory_instance.classify_thresholded_time_window_alert(
0,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, alert_timestamp
)
expected_alert = MaxUnconfirmedBlocksDecreasedBelowThresholdAlert(
self.test_node_name, 0, 'INFO', alert_timestamp,
threshold_severity, self.test_parent_id, self.test_node_id)
self.assertEqual(1, len(data_for_alerting))
self.assertEqual(expected_alert.alert_data, data_for_alerting[0])
@freeze_time("2012-01-01")
def test_classify_thresh_time_win_warn_alert_if_below_critical_above_warn(
self) -> None:
"""
In this test we will check that whenever
warning <= current <= critical <= previous, a warning alert is raised to
inform that the current value is still greater than the warning
threshold. Note that we will perform this test for the case where we
first alert warning and then critical, rather than immediately critical,
as otherwise the warning alerting would be trivial.
"""
data_for_alerting = []
# Start timers
current = int(self.test_alerts_config.max_unconfirmed_blocks[
'critical_threshold'])
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, datetime.now().timestamp()
)
# First above warning threshold alert
pad = float(self.test_alerts_config.max_unconfirmed_blocks[
'warning_time_window'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, alert_timestamp
)
self.assertEqual(1, len(data_for_alerting))
# First above critical threshold alert
pad = float(self.test_alerts_config.max_unconfirmed_blocks[
'critical_time_window'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_thresholded_time_window_alert(
current,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, alert_timestamp
)
self.assertEqual(2, len(data_for_alerting))
data_for_alerting.clear()
# Check that 2 alerts are raised, below critical and above warning
pad = float(self.test_alerts_config.max_unconfirmed_blocks[
'critical_time_window'])
alert_timestamp = datetime.now().timestamp() + pad + 60
self.test_factory_instance.classify_thresholded_time_window_alert(
current - 1,
self.test_alerts_config.max_unconfirmed_blocks,
MaxUnconfirmedBlocksIncreasedAboveThresholdAlert,
MaxUnconfirmedBlocksDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.MaxUnconfirmedBlocksThreshold
.value, self.test_node_name, alert_timestamp
)
expected_alert_1 = MaxUnconfirmedBlocksDecreasedBelowThresholdAlert(
self.test_node_name, current - 1, 'INFO', alert_timestamp,
'CRITICAL', self.test_parent_id, self.test_node_id)
expected_alert_2 = MaxUnconfirmedBlocksIncreasedAboveThresholdAlert(
self.test_node_name, current - 1, 'WARNING', alert_timestamp,
pad + 60, 'WARNING', self.test_parent_id, self.test_node_id
)
self.assertEqual(2, len(data_for_alerting))
self.assertEqual(expected_alert_1.alert_data, data_for_alerting[0])
self.assertEqual(expected_alert_2.alert_data, data_for_alerting[1])
def test_classify_thresh_time_period_does_nothing_warning_critical_disabled(
self) -> None:
"""
In this test we will check that no alert is raised whenever both warning
and critical alerts are disabled. We will perform this test for both
when current_occurrences >= critical and current_occurrences >= warning.
For an alert to be raised when current_occurrences < critical or
current_occurrences < warning it must be that one of the severities is
enabled.
"""
self.test_alerts_config.run_status_update_total[
'warning_enabled'] = 'False'
self.test_alerts_config.run_status_update_total[
'critical_enabled'] = 'False'
data_for_alerting = []
self.test_factory_instance.classify_thresholded_in_time_period_alert(
100, 50,
self.test_alerts_config.run_status_update_total,
TotalErroredJobRunsIncreasedAboveThresholdAlert,
TotalErroredJobRunsDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.TotalErroredJobRunsThreshold
.value, self.test_node_name, datetime.now().timestamp()
)
self.assertEqual([], data_for_alerting)
@parameterized.expand([
('warning_time_window', 'WARNING', 'warning_threshold'),
('critical_time_window', 'CRITICAL', 'critical_threshold'),
])
@freeze_time("2012-01-01")
def test_classify_threshold_in_time_period_raises_alert_if_above_threshold(
self, period_var, severity, threshold_var) -> None:
"""
In this test we will check that a warning/critical above threshold alert
is raised if the current value exceeds the warning/critical threshold.
"""
current = int(
self.test_alerts_config.run_status_update_total[
threshold_var])
previous = 0
data_for_alerting = []
self.test_factory_instance.classify_thresholded_in_time_period_alert(
current, previous,
self.test_alerts_config.run_status_update_total,
TotalErroredJobRunsIncreasedAboveThresholdAlert,
TotalErroredJobRunsDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.TotalErroredJobRunsThreshold
.value, self.test_node_name, datetime.now().timestamp()
)
period = float(
self.test_alerts_config.run_status_update_total[
period_var])
expected_alert = TotalErroredJobRunsIncreasedAboveThresholdAlert(
self.test_node_name, current, severity, datetime.now().timestamp(),
period, severity, self.test_parent_id, self.test_node_id)
self.assertEqual(1, len(data_for_alerting))
self.assertEqual(expected_alert.alert_data, data_for_alerting[0])
@freeze_time("2012-01-01")
def test_classify_threshold_time_period_no_warning_if_warning_already_sent(
self) -> None:
"""
In this test we will check that no warning alert is raised if a warning
alert has already been sent.
"""
data_for_alerting = []
# Set the timer
current = int(
self.test_alerts_config.run_status_update_total[
'warning_threshold'])
previous = 0
self.test_factory_instance.classify_thresholded_in_time_period_alert(
current, previous,
self.test_alerts_config.run_status_update_total,
TotalErroredJobRunsIncreasedAboveThresholdAlert,
TotalErroredJobRunsDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.TotalErroredJobRunsThreshold
.value, self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
self.test_factory_instance.classify_thresholded_in_time_period_alert(
current + 1, current,
self.test_alerts_config.run_status_update_total,
TotalErroredJobRunsIncreasedAboveThresholdAlert,
TotalErroredJobRunsDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.TotalErroredJobRunsThreshold
.value, self.test_node_name, datetime.now().timestamp()
)
self.assertEqual([], data_for_alerting)
@freeze_time("2012-01-01")
def test_classify_threshold_time_period_raises_critical_if_repeat_elapsed(
self) -> None:
"""
In this test we will check that a critical above threshold alert is
re-raised if the critical repeat window elapses. We will also check that
if the critical window does not elapse, a critical alert is not
re-raised.
"""
data_for_alerting = []
# First critical above threshold alert
current = int(
self.test_alerts_config.run_status_update_total[
'critical_threshold'])
previous = 0
self.test_factory_instance.classify_thresholded_in_time_period_alert(
current, previous,
self.test_alerts_config.run_status_update_total,
TotalErroredJobRunsIncreasedAboveThresholdAlert,
TotalErroredJobRunsDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.TotalErroredJobRunsThreshold
.value, self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
# Classify before the repeat time has elapsed to confirm that no
# critical alert is raised.
pad = float(
self.test_alerts_config.run_status_update_total[
'critical_repeat']) - 1
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_thresholded_in_time_period_alert(
current, current,
self.test_alerts_config.run_status_update_total,
TotalErroredJobRunsIncreasedAboveThresholdAlert,
TotalErroredJobRunsDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.TotalErroredJobRunsThreshold
.value, self.test_node_name, alert_timestamp
)
self.assertEqual([], data_for_alerting)
# Let the repeat time elapse and check that a critical alert is
# re-raised
pad = float(
self.test_alerts_config.run_status_update_total[
'critical_repeat'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_thresholded_in_time_period_alert(
current, current,
self.test_alerts_config.run_status_update_total,
TotalErroredJobRunsIncreasedAboveThresholdAlert,
TotalErroredJobRunsDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.TotalErroredJobRunsThreshold
.value, self.test_node_name, alert_timestamp
)
period = float(
self.test_alerts_config.run_status_update_total[
'critical_time_window'])
expected_alert = TotalErroredJobRunsIncreasedAboveThresholdAlert(
self.test_node_name, current, "CRITICAL", alert_timestamp, period,
"CRITICAL", self.test_parent_id, self.test_node_id)
self.assertEqual(1, len(data_for_alerting))
self.assertEqual(expected_alert.alert_data, data_for_alerting[0])
@freeze_time("2012-01-01")
def test_classify_threshold_time_per_only_1_critical_if_above_and_no_repeat(
self) -> None:
"""
In this test we will check that if critical_repeat is disabled, an
increased above critical alert is not re-raised.
"""
self.test_alerts_config.run_status_update_total[
'critical_repeat_enabled'] = "False"
data_for_alerting = []
# First critical above threshold alert
current = int(
self.test_alerts_config.run_status_update_total[
'critical_threshold'])
previous = 0
self.test_factory_instance.classify_thresholded_in_time_period_alert(
current, previous,
self.test_alerts_config.run_status_update_total,
TotalErroredJobRunsIncreasedAboveThresholdAlert,
TotalErroredJobRunsDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.TotalErroredJobRunsThreshold
.value, self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
# Let the repeat time elapse and check that a critical alert is not
# re-raised
pad = float(
self.test_alerts_config.run_status_update_total[
'critical_repeat'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_thresholded_in_time_period_alert(
current, current,
self.test_alerts_config.run_status_update_total,
TotalErroredJobRunsIncreasedAboveThresholdAlert,
TotalErroredJobRunsDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.TotalErroredJobRunsThreshold
.value, self.test_node_name, alert_timestamp
)
self.assertEqual([], data_for_alerting)
@parameterized.expand([
('critical_time_window', 'critical_threshold', 'CRITICAL',),
('warning_time_window', 'warning_threshold', 'WARNING',)
])
@freeze_time("2012-01-01")
def test_classify_thresh_time_per_info_alert_if_below_thresh_and_alert_sent(
self, period_var, threshold, threshold_severity) -> None:
"""
In this test we will check that once the current value is less than a
threshold, a decreased below threshold info alert is sent. We will
perform this test for both warning and critical.
"""
data_for_alerting = []
# First above threshold alert
current = int(
self.test_alerts_config.run_status_update_total[
threshold])
previous = 0
self.test_factory_instance.classify_thresholded_in_time_period_alert(
current, previous,
self.test_alerts_config.run_status_update_total,
TotalErroredJobRunsIncreasedAboveThresholdAlert,
TotalErroredJobRunsDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.TotalErroredJobRunsThreshold
.value, self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
# Check that a below threshold INFO alert is raised
period = int(
self.test_alerts_config.run_status_update_total[
period_var])
alert_timestamp = datetime.now().timestamp() + period + 1
self.test_factory_instance.classify_thresholded_in_time_period_alert(
current, current,
self.test_alerts_config.run_status_update_total,
TotalErroredJobRunsIncreasedAboveThresholdAlert,
TotalErroredJobRunsDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.TotalErroredJobRunsThreshold
.value, self.test_node_name, alert_timestamp
)
expected_alert = TotalErroredJobRunsDecreasedBelowThresholdAlert(
self.test_node_name, 0, 'INFO', alert_timestamp, period,
threshold_severity, self.test_parent_id, self.test_node_id)
self.assertEqual(1, len(data_for_alerting))
self.assertEqual(expected_alert.alert_data, data_for_alerting[0])

@freeze_time("2012-01-01")
def test_classify_thresh_time_per_warn_alert_if_below_critical_above_warn(
self) -> None:
"""
In this test we will check that whenever
warning <= current <= critical <= previous, a warning alert is raised to
        inform that the current value is greater than the warning value. Note,
        we will perform this test for the case when we first alert warning and
        then critical, rather than immediately critical, as in the latter case
        the warning alerting would be obvious.
"""
data_for_alerting = []
# Send warning increase above threshold alert
current = int(
self.test_alerts_config.run_status_update_total[
'warning_threshold'])
previous = 0
self.test_factory_instance.classify_thresholded_in_time_period_alert(
current, previous,
self.test_alerts_config.run_status_update_total,
TotalErroredJobRunsIncreasedAboveThresholdAlert,
TotalErroredJobRunsDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.TotalErroredJobRunsThreshold
.value, self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(1, len(data_for_alerting))
# Send critical increase above threshold alert
previous = 0
current = int(
self.test_alerts_config.run_status_update_total[
'critical_threshold'])
self.test_factory_instance.classify_thresholded_in_time_period_alert(
current, previous,
self.test_alerts_config.run_status_update_total,
TotalErroredJobRunsIncreasedAboveThresholdAlert,
TotalErroredJobRunsDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.TotalErroredJobRunsThreshold
.value, self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(2, len(data_for_alerting))
data_for_alerting.clear()
# Check that 2 alerts are raised, below critical and above warning
critical_period = int(
self.test_alerts_config.run_status_update_total[
'critical_time_window'])
warning_period = int(
self.test_alerts_config.run_status_update_total[
'warning_time_window'])
current = int(
self.test_alerts_config.run_status_update_total[
'warning_threshold'])
previous = 0
# Allow a lot of time to pass so that all previous occurrences are
# automatically deleted, and we are thus above warning.
alert_timestamp = datetime.now().timestamp() + critical_period + 100
self.test_factory_instance.classify_thresholded_in_time_period_alert(
current, previous,
self.test_alerts_config.run_status_update_total,
TotalErroredJobRunsIncreasedAboveThresholdAlert,
TotalErroredJobRunsDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.TotalErroredJobRunsThreshold
.value, self.test_node_name, alert_timestamp
)
new_current = int(
self.test_alerts_config.run_status_update_total[
'warning_threshold'])
expected_alert_1 = TotalErroredJobRunsDecreasedBelowThresholdAlert(
self.test_node_name, new_current, 'INFO', alert_timestamp,
critical_period, 'CRITICAL', self.test_parent_id, self.test_node_id)
expected_alert_2 = TotalErroredJobRunsIncreasedAboveThresholdAlert(
self.test_node_name, new_current, "WARNING", alert_timestamp,
warning_period, "WARNING", self.test_parent_id, self.test_node_id)
self.assertEqual(2, len(data_for_alerting))
self.assertEqual(expected_alert_1.alert_data, data_for_alerting[0])
self.assertEqual(expected_alert_2.alert_data, data_for_alerting[1])

@freeze_time("2012-01-01")
def test_classify_conditional_alert_raises_condition_true_alert_if_true(
self) -> None:
"""
Given a true condition, in this test we will check that the
classify_conditional_alert fn calls the condition_true_alert
"""
def condition_function(*args): return True
data_for_alerting = []
self.test_factory_instance.classify_conditional_alert(
ChangeInSourceNodeAlert, condition_function, [], [
self.test_node_name, 'new_source', 'WARNING',
datetime.now().timestamp(), self.test_parent_id,
self.test_node_id
], data_for_alerting
)
expected_alert_1 = ChangeInSourceNodeAlert(
self.test_node_name, 'new_source', 'WARNING',
datetime.now().timestamp(), self.test_parent_id, self.test_node_id)
self.assertEqual(1, len(data_for_alerting))
self.assertEqual(expected_alert_1.alert_data, data_for_alerting[0])

@freeze_time("2012-01-01")
def test_classify_conditional_alert_raises_condition_false_alert_if_false(
self) -> None:
"""
Given a false condition, in this test we will check that the
classify_conditional_alert fn calls the condition_false_alert if it is
not None.
"""
def condition_function(*args): return False
data_for_alerting = []
self.test_factory_instance.classify_conditional_alert(
PrometheusSourceIsDownAlert, condition_function, [], [
self.test_node_name, 'WARNING', datetime.now().timestamp(),
self.test_parent_id, self.test_node_id
], data_for_alerting, PrometheusSourceBackUpAgainAlert,
[self.test_node_name, 'INFO', datetime.now().timestamp(),
self.test_parent_id, self.test_node_id]
)
expected_alert_1 = PrometheusSourceBackUpAgainAlert(
self.test_node_name, 'INFO', datetime.now().timestamp(),
self.test_parent_id, self.test_node_id)
self.assertEqual(1, len(data_for_alerting))
self.assertEqual(expected_alert_1.alert_data, data_for_alerting[0])

@freeze_time("2012-01-01")
def test_classify_conditional_alert_no_alert_if_no_false_alert_and_false(
self) -> None:
"""
Given a false condition and no condition_false_alert, in this test we
will check that no alert is raised by the classify_conditional_alert fn.
"""
def condition_function(*args): return False
data_for_alerting = []
self.test_factory_instance.classify_conditional_alert(
PrometheusSourceIsDownAlert, condition_function, [], [
self.test_node_name, 'WARNING', datetime.now().timestamp(),
self.test_parent_id, self.test_node_id
], data_for_alerting
)
self.assertEqual([], data_for_alerting)

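The dispatch logic the three conditional-alert tests above exercise can be sketched as follows. This is a minimal hypothetical version; the real `classify_conditional_alert` belongs to the factory under test and takes the same shape of arguments:

```python
from typing import Any, Callable, List, Optional, Type


def classify_conditional_alert(
        condition_true_alert: Type, condition: Callable[..., bool],
        condition_args: List[Any], true_alert_args: List[Any],
        data_for_alerting: List[Any],
        condition_false_alert: Optional[Type] = None,
        false_alert_args: Optional[List[Any]] = None) -> None:
    # Raise the "true" alert when the condition holds; otherwise raise the
    # optional "false" alert, or do nothing if none was supplied.
    if condition(*condition_args):
        alert = condition_true_alert(*true_alert_args)
    elif condition_false_alert is not None:
        alert = condition_false_alert(*(false_alert_args or []))
    else:
        return
    data_for_alerting.append(alert.alert_data)
```

The three branches correspond one-to-one to the three tests: condition true, condition false with a false-alert supplied, and condition false with no false-alert.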
def test_classify_thresholded_reverse_does_nothing_warning_critical_disabled(
self) -> None:
"""
In this test we will check that no alert is raised whenever both warning
and critical alerts are disabled. We will perform this test for both
when current <= critical and current <= warning. For an alert to be
        raised when current <= critical or current <= warning it must be that
        one of the severities is enabled.
"""
self.test_alerts_config.eth_balance_amount[
'warning_enabled'] = 'False'
self.test_alerts_config.eth_balance_amount[
'critical_enabled'] = 'False'
data_for_alerting = []
current = float(self.test_alerts_config.eth_balance_amount[
'critical_threshold']) - 1
self.test_factory_instance.classify_thresholded_alert_reverse(
current, self.test_alerts_config.eth_balance_amount,
EthBalanceIncreasedAboveThresholdAlert,
EthBalanceDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.EthBalanceThreshold.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual([], data_for_alerting)

@parameterized.expand([
('WARNING', 'warning_threshold'),
('CRITICAL', 'critical_threshold'),
])
@freeze_time("2012-01-01")
def test_classify_thresholded_reverse_raises_alert_if_below_threshold(
self, severity, threshold_var) -> None:
"""
In this test we will check that a warning/critical below threshold alert
is raised if the current value goes below the warning/critical
threshold.
"""
data_for_alerting = []
current = float(
self.test_alerts_config.eth_balance_amount[threshold_var]) - 1
self.test_factory_instance.classify_thresholded_alert_reverse(
current, self.test_alerts_config.eth_balance_amount,
EthBalanceIncreasedAboveThresholdAlert,
EthBalanceDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.EthBalanceThreshold.value,
self.test_node_name, datetime.now().timestamp()
)
expected_alert = EthBalanceDecreasedBelowThresholdAlert(
self.test_node_name, current, severity, datetime.now().timestamp(),
severity, self.test_parent_id, self.test_node_id)
self.assertEqual(1, len(data_for_alerting))
self.assertEqual(expected_alert.alert_data, data_for_alerting[0])

@freeze_time("2012-01-01")
def test_classify_thresholded_reverse_no_warning_if_warning_already_sent(
self) -> None:
"""
In this test we will check that no warning alert is raised if a warning
        alert has already been sent.
"""
data_for_alerting = []
# Send first warning alert
current = float(
self.test_alerts_config.eth_balance_amount['warning_threshold']) - 1
self.test_factory_instance.classify_thresholded_alert_reverse(
current, self.test_alerts_config.eth_balance_amount,
EthBalanceIncreasedAboveThresholdAlert,
EthBalanceDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.EthBalanceThreshold.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
# Classify again to check if a warning alert is raised
self.test_factory_instance.classify_thresholded_alert_reverse(
current, self.test_alerts_config.eth_balance_amount,
EthBalanceIncreasedAboveThresholdAlert,
EthBalanceDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.EthBalanceThreshold.value,
self.test_node_name, datetime.now().timestamp() + 1
)
self.assertEqual([], data_for_alerting)

@freeze_time("2012-01-01")
def test_classify_thresholded_reverse_raises_critical_if_repeat_elapsed(
self) -> None:
"""
In this test we will check that a critical below threshold alert is
re-raised if the critical repeat window elapses. We will also check that
if the critical window does not elapse, a critical alert is not
re-raised.
"""
data_for_alerting = []
# First critical below threshold alert
current = float(self.test_alerts_config.eth_balance_amount[
'critical_threshold']) - 1
self.test_factory_instance.classify_thresholded_alert_reverse(
current, self.test_alerts_config.eth_balance_amount,
EthBalanceIncreasedAboveThresholdAlert,
EthBalanceDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.EthBalanceThreshold.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
        # Classify before the repeat time elapses to confirm that no critical
        # alert is raised.
pad = float(self.test_alerts_config.eth_balance_amount[
'critical_repeat']) - 1
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_thresholded_alert_reverse(
current, self.test_alerts_config.eth_balance_amount,
EthBalanceIncreasedAboveThresholdAlert,
EthBalanceDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.EthBalanceThreshold.value,
self.test_node_name, alert_timestamp
)
self.assertEqual([], data_for_alerting)
        # Let the repeat time elapse and check that a critical alert is
        # re-raised
pad = float(self.test_alerts_config.eth_balance_amount[
'critical_repeat'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_thresholded_alert_reverse(
current, self.test_alerts_config.eth_balance_amount,
EthBalanceIncreasedAboveThresholdAlert,
EthBalanceDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.EthBalanceThreshold.value,
self.test_node_name, alert_timestamp
)
expected_alert = EthBalanceDecreasedBelowThresholdAlert(
self.test_node_name, current, "CRITICAL", alert_timestamp,
"CRITICAL", self.test_parent_id, self.test_node_id)
self.assertEqual(1, len(data_for_alerting))
self.assertEqual(expected_alert.alert_data, data_for_alerting[0])

@freeze_time("2012-01-01")
def test_classify_threshold_reverse_only_1_critical_if_below_and_no_repeat(
self) -> None:
"""
In this test we will check that if critical_repeat is disabled, a
decreased below critical alert is not re-raised.
"""
self.test_alerts_config.eth_balance_amount[
'critical_repeat_enabled'] = "False"
data_for_alerting = []
# First critical below threshold alert
current = float(self.test_alerts_config.eth_balance_amount[
'critical_threshold']) - 1
self.test_factory_instance.classify_thresholded_alert_reverse(
current, self.test_alerts_config.eth_balance_amount,
EthBalanceIncreasedAboveThresholdAlert,
EthBalanceDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.EthBalanceThreshold.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
        # Let the repeat time elapse and check that a critical alert is not
        # re-raised
pad = float(self.test_alerts_config.eth_balance_amount[
'critical_repeat'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_factory_instance.classify_thresholded_alert_reverse(
current, self.test_alerts_config.eth_balance_amount,
EthBalanceIncreasedAboveThresholdAlert,
EthBalanceDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.EthBalanceThreshold.value,
self.test_node_name, alert_timestamp
)
self.assertEqual([], data_for_alerting)

@parameterized.expand([
('critical_threshold', 'CRITICAL',),
('warning_threshold', 'WARNING',)
])
@freeze_time("2012-01-01")
def test_classify_thresh_reverse_info_alert_if_above_thresh_and_alert_sent(
self, threshold_var, threshold_severity) -> None:
"""
In this test we will check that once the current value is greater than a
threshold, an increased above threshold info alert is sent. We will
perform this test for both warning and critical.
"""
data_for_alerting = []
# First below threshold alert
current = float(self.test_alerts_config.eth_balance_amount[
threshold_var]) - 1
self.test_factory_instance.classify_thresholded_alert_reverse(
current, self.test_alerts_config.eth_balance_amount,
EthBalanceIncreasedAboveThresholdAlert,
EthBalanceDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.EthBalanceThreshold.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
# Check that an above threshold INFO alert is raised. Current is set to
# warning + 1 to not trigger a warning alert as it is expected that
# critical <= warning.
current = float(self.test_alerts_config.eth_balance_amount[
'warning_threshold']) + 1
alert_timestamp = datetime.now().timestamp()
self.test_factory_instance.classify_thresholded_alert_reverse(
current, self.test_alerts_config.eth_balance_amount,
EthBalanceIncreasedAboveThresholdAlert,
EthBalanceDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.EthBalanceThreshold.value,
self.test_node_name, alert_timestamp
)
expected_alert = EthBalanceIncreasedAboveThresholdAlert(
self.test_node_name, current, 'INFO', alert_timestamp,
threshold_severity, self.test_parent_id, self.test_node_id)
self.assertEqual(1, len(data_for_alerting))
self.assertEqual(expected_alert.alert_data, data_for_alerting[0])

@freeze_time("2012-01-01")
def test_classify_thresh_reverse_warn_alert_if_above_critical_below_warn(
self) -> None:
"""
In this test we will check that whenever
warning >= current >= critical >= previous, a warning alert is raised to
        inform that the current value is smaller than the warning value. Note,
        we will perform this test for the case when we first alert warning and
        then critical, rather than immediately critical, as in the latter case
        the warning alerting would be obvious.
"""
data_for_alerting = []
# Send warning decrease below threshold alert
current = float(self.test_alerts_config.eth_balance_amount[
'warning_threshold']) - 1
self.test_factory_instance.classify_thresholded_alert_reverse(
current, self.test_alerts_config.eth_balance_amount,
EthBalanceIncreasedAboveThresholdAlert,
EthBalanceDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.EthBalanceThreshold.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(1, len(data_for_alerting))
# Send critical decrease below threshold alert
current = float(self.test_alerts_config.eth_balance_amount[
'critical_threshold']) - 1
self.test_factory_instance.classify_thresholded_alert_reverse(
current, self.test_alerts_config.eth_balance_amount,
EthBalanceIncreasedAboveThresholdAlert,
EthBalanceDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.EthBalanceThreshold.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(2, len(data_for_alerting))
data_for_alerting.clear()
# Check that 2 alerts are raised, above critical and below warning
current = float(self.test_alerts_config.eth_balance_amount[
'critical_threshold']) + 1
alert_timestamp = datetime.now().timestamp() + 10
self.test_factory_instance.classify_thresholded_alert_reverse(
current, self.test_alerts_config.eth_balance_amount,
EthBalanceIncreasedAboveThresholdAlert,
EthBalanceDecreasedBelowThresholdAlert, data_for_alerting,
self.test_parent_id, self.test_node_id,
GroupedChainlinkNodeAlertsMetricCode.EthBalanceThreshold.value,
self.test_node_name, alert_timestamp
)
expected_alert_1 = EthBalanceIncreasedAboveThresholdAlert(
self.test_node_name, current, 'INFO', alert_timestamp,
'CRITICAL', self.test_parent_id, self.test_node_id)
expected_alert_2 = EthBalanceDecreasedBelowThresholdAlert(
self.test_node_name, current, 'WARNING', alert_timestamp,
'WARNING', self.test_parent_id, self.test_node_id)
self.assertEqual(2, len(data_for_alerting))
self.assertEqual(expected_alert_1.alert_data, data_for_alerting[0])
self.assertEqual(expected_alert_2.alert_data, data_for_alerting[1])

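The "reverse" threshold semantics running through the EthBalance tests above can be summarised by a small classifier. This is a hypothetical sketch only, assuming critical <= warning as the tests do: for reverse metrics the severities fire as the value drops.

```python
def classify_reverse(current: float, warning: float, critical: float) -> str:
    # For reverse metrics (e.g. a balance) a high value is healthy, so
    # CRITICAL fires below the lower threshold and WARNING in between.
    if current <= critical:
        return 'CRITICAL'
    if current <= warning:
        return 'WARNING'
    return 'OK'
```

This is the mirror image of the ordinary thresholded classifier used later for block-height difference, where warning <= critical and severities fire as the value rises.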
def test_classify_thresholded_does_nothing_warning_critical_disabled(
self) -> None:
"""
In this test we will check that no alert is raised whenever both warning
and critical alerts are disabled. We will perform this test for both
        when current >= critical and current >= warning. For an alert to be
        raised when current >= critical or current >= warning it must be that
        one of the severities is enabled.
"""
self.evm_node_alerts_config.evm_block_syncing_block_height_difference[
'warning_enabled'] = 'False'
self.evm_node_alerts_config.evm_block_syncing_block_height_difference[
'critical_enabled'] = 'False'
data_for_alerting = []
current = float(self.evm_node_alerts_config
.evm_block_syncing_block_height_difference[
'critical_threshold']) + 1
self.test_evm_factory_instance.classify_thresholded_alert(
current, self.evm_node_alerts_config
.evm_block_syncing_block_height_difference,
BlockHeightDifferenceIncreasedAboveThresholdAlert,
BlockHeightDifferenceDecreasedBelowThresholdAlert,
data_for_alerting,
self.test_parent_id, self.test_node_id,
EVMAlertsMetricCode.BlockHeightDifference.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual([], data_for_alerting)

@parameterized.expand([
('WARNING', 'warning_threshold'),
('CRITICAL', 'critical_threshold'),
])
@freeze_time("2012-01-01")
def test_classify_thresholded_raises_alert_if_above_threshold(
self, severity, threshold_var) -> None:
"""
In this test we will check that a warning/critical above threshold
alert is raised if the current value goes above the warning/critical
threshold.
"""
data_for_alerting = []
current = int(self.evm_node_alerts_config
.evm_block_syncing_block_height_difference[
threshold_var]) + 1
self.test_evm_factory_instance.classify_thresholded_alert(
current,
self.evm_node_alerts_config
.evm_block_syncing_block_height_difference,
BlockHeightDifferenceIncreasedAboveThresholdAlert,
BlockHeightDifferenceDecreasedBelowThresholdAlert,
data_for_alerting,
self.test_parent_id, self.test_node_id,
EVMAlertsMetricCode.BlockHeightDifference.value,
self.test_node_name, datetime.now().timestamp()
)
expected_alert = BlockHeightDifferenceIncreasedAboveThresholdAlert(
self.test_node_name, current, severity, datetime.now().timestamp(),
severity, self.test_parent_id, self.test_node_id)
self.assertEqual(1, len(data_for_alerting))
self.assertEqual(expected_alert.alert_data, data_for_alerting[0])

@freeze_time("2012-01-01")
def test_classify_thresholded_no_warning_if_warning_already_sent(
self) -> None:
"""
In this test we will check that no warning alert is raised if a warning
        alert has already been sent.
"""
data_for_alerting = []
# Send first warning alert
current = float(self.evm_node_alerts_config
.evm_block_syncing_block_height_difference[
'warning_threshold']) + 1
self.test_evm_factory_instance.classify_thresholded_alert(
current, self.evm_node_alerts_config
.evm_block_syncing_block_height_difference,
BlockHeightDifferenceIncreasedAboveThresholdAlert,
BlockHeightDifferenceDecreasedBelowThresholdAlert,
data_for_alerting, self.test_parent_id, self.test_node_id,
EVMAlertsMetricCode.BlockHeightDifference.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
# Classify again to check if a warning alert is raised
self.test_evm_factory_instance.classify_thresholded_alert(
current,
self.evm_node_alerts_config
.evm_block_syncing_block_height_difference,
BlockHeightDifferenceIncreasedAboveThresholdAlert,
BlockHeightDifferenceDecreasedBelowThresholdAlert,
data_for_alerting, self.test_parent_id, self.test_node_id,
EVMAlertsMetricCode.BlockHeightDifference.value,
self.test_node_name, datetime.now().timestamp() + 1
)
self.assertEqual([], data_for_alerting)

@freeze_time("2012-01-01")
def test_classify_thresholded_raises_critical_if_repeat_elapsed(
self) -> None:
"""
In this test we will check that a critical above threshold alert is
re-raised if the critical repeat window elapses. We will also check that
if the critical window does not elapse, a critical alert is not
re-raised.
"""
data_for_alerting = []
        # First critical above threshold alert
current = int(self.evm_node_alerts_config
.evm_block_syncing_block_height_difference[
'critical_threshold']) + 1
self.test_evm_factory_instance.classify_thresholded_alert(
current,
self.evm_node_alerts_config
.evm_block_syncing_block_height_difference,
BlockHeightDifferenceIncreasedAboveThresholdAlert,
BlockHeightDifferenceDecreasedBelowThresholdAlert,
data_for_alerting, self.test_parent_id, self.test_node_id,
EVMAlertsMetricCode.BlockHeightDifference.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
        # Classify before the repeat time elapses to confirm that no critical
        # alert is raised.
pad = float(self.evm_node_alerts_config
.evm_block_syncing_block_height_difference[
'critical_repeat']) - 1
alert_timestamp = datetime.now().timestamp() + pad
self.test_evm_factory_instance.classify_thresholded_alert(
current,
self.evm_node_alerts_config
.evm_block_syncing_block_height_difference,
BlockHeightDifferenceIncreasedAboveThresholdAlert,
BlockHeightDifferenceDecreasedBelowThresholdAlert,
data_for_alerting, self.test_parent_id, self.test_node_id,
EVMAlertsMetricCode.BlockHeightDifference.value,
self.test_node_name, alert_timestamp
)
self.assertEqual([], data_for_alerting)
        # Let the repeat time elapse and check that a critical alert is
        # re-raised
pad = int(self.evm_node_alerts_config
.evm_block_syncing_block_height_difference['critical_repeat'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_evm_factory_instance.classify_thresholded_alert(
current, self.evm_node_alerts_config
.evm_block_syncing_block_height_difference,
BlockHeightDifferenceIncreasedAboveThresholdAlert,
BlockHeightDifferenceDecreasedBelowThresholdAlert,
data_for_alerting, self.test_parent_id, self.test_node_id,
EVMAlertsMetricCode.BlockHeightDifference.value,
self.test_node_name, alert_timestamp
)
expected_alert = BlockHeightDifferenceIncreasedAboveThresholdAlert(
self.test_node_name, current, "CRITICAL", alert_timestamp,
"CRITICAL", self.test_parent_id, self.test_node_id)
self.assertEqual(1, len(data_for_alerting))
self.assertEqual(expected_alert.alert_data, data_for_alerting[0])

@freeze_time("2012-01-01")
def test_classify_threshold_only_1_critical_if_below_and_no_repeat(
self) -> None:
"""
        In this test we will check that if critical_repeat is disabled, an
        increased above critical alert is not re-raised.
"""
self.evm_node_alerts_config.evm_block_syncing_block_height_difference[
'critical_repeat_enabled'] = "False"
data_for_alerting = []
        # First critical above threshold alert
current = float(self.evm_node_alerts_config
.evm_block_syncing_block_height_difference[
'critical_threshold']) + 1
self.test_evm_factory_instance.classify_thresholded_alert(
current,
self.evm_node_alerts_config
.evm_block_syncing_block_height_difference,
BlockHeightDifferenceIncreasedAboveThresholdAlert,
BlockHeightDifferenceDecreasedBelowThresholdAlert,
data_for_alerting, self.test_parent_id, self.test_node_id,
EVMAlertsMetricCode.BlockHeightDifference.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
        # Let the repeat time elapse and check that a critical alert is not
        # re-raised
        pad = float(self.evm_node_alerts_config
                    .evm_block_syncing_block_height_difference[
                        'critical_repeat'])
alert_timestamp = datetime.now().timestamp() + pad
self.test_evm_factory_instance.classify_thresholded_alert(
current,
self.evm_node_alerts_config
.evm_block_syncing_block_height_difference,
BlockHeightDifferenceIncreasedAboveThresholdAlert,
BlockHeightDifferenceDecreasedBelowThresholdAlert,
data_for_alerting, self.test_parent_id, self.test_node_id,
EVMAlertsMetricCode.BlockHeightDifference.value,
self.test_node_name, alert_timestamp
)
self.assertEqual([], data_for_alerting)

@parameterized.expand([
('critical_threshold', 'CRITICAL',),
('warning_threshold', 'WARNING',)
])
@freeze_time("2012-01-01")
def test_classify_thresh_info_alert_if_below_thresh_and_alert_sent(
self, threshold_var, threshold_severity) -> None:
"""
In this test we will check that once the current value is lower than a
threshold, a decrease below threshold info alert is sent. We will
perform this test for both warning and critical.
"""
data_for_alerting = []
        # First above threshold alert
current = int(self.evm_node_alerts_config
.evm_block_syncing_block_height_difference[
threshold_var]) + 1
self.test_evm_factory_instance.classify_thresholded_alert(
current,
self.evm_node_alerts_config
.evm_block_syncing_block_height_difference,
BlockHeightDifferenceIncreasedAboveThresholdAlert,
BlockHeightDifferenceDecreasedBelowThresholdAlert,
data_for_alerting, self.test_parent_id, self.test_node_id,
EVMAlertsMetricCode.BlockHeightDifference.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(1, len(data_for_alerting))
data_for_alerting.clear()
        # Check that a below threshold INFO alert is raised. Current is set to
        # warning - 1 to not trigger a warning alert as it is expected that
        # warning <= critical.
current = int(
self.evm_node_alerts_config
.evm_block_syncing_block_height_difference[
'warning_threshold']) - 1
alert_timestamp = datetime.now().timestamp()
self.test_evm_factory_instance.classify_thresholded_alert(
current,
self.evm_node_alerts_config
.evm_block_syncing_block_height_difference,
BlockHeightDifferenceIncreasedAboveThresholdAlert,
BlockHeightDifferenceDecreasedBelowThresholdAlert,
data_for_alerting, self.test_parent_id, self.test_node_id,
EVMAlertsMetricCode.BlockHeightDifference.value,
self.test_node_name, alert_timestamp
)
expected_alert = BlockHeightDifferenceDecreasedBelowThresholdAlert(
self.test_node_name, current, 'INFO', alert_timestamp,
threshold_severity, self.test_parent_id, self.test_node_id)
self.assertEqual(1, len(data_for_alerting))
self.assertEqual(expected_alert.alert_data, data_for_alerting[0])

@freeze_time("2012-01-01")
def test_classify_thresh_warn_alert_if_below_critical_above_warn(
self) -> None:
"""
In this test we will check that whenever
warning <= current <= critical <= previous, a warning alert is raised to
        inform that the current value is greater than the warning value. Note,
        we will perform this test for the case when we first alert warning and
        then critical, rather than immediately critical, as in the latter case
        the warning alerting would be obvious.
"""
data_for_alerting = []
        # Send warning increase above threshold alert
current = (float(self.evm_node_alerts_config
.evm_block_syncing_block_height_difference[
'warning_threshold']) + 1)
self.test_evm_factory_instance.classify_thresholded_alert(
current, self.evm_node_alerts_config
.evm_block_syncing_block_height_difference,
BlockHeightDifferenceIncreasedAboveThresholdAlert,
BlockHeightDifferenceDecreasedBelowThresholdAlert,
data_for_alerting, self.test_parent_id, self.test_node_id,
EVMAlertsMetricCode.BlockHeightDifference.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(1, len(data_for_alerting))
        # Send critical increase above threshold alert
current = int(self.evm_node_alerts_config
.evm_block_syncing_block_height_difference[
'critical_threshold']) + 1
self.test_evm_factory_instance.classify_thresholded_alert(
current,
self.evm_node_alerts_config
.evm_block_syncing_block_height_difference,
BlockHeightDifferenceIncreasedAboveThresholdAlert,
BlockHeightDifferenceDecreasedBelowThresholdAlert,
data_for_alerting, self.test_parent_id, self.test_node_id,
EVMAlertsMetricCode.BlockHeightDifference.value,
self.test_node_name, datetime.now().timestamp()
)
self.assertEqual(2, len(data_for_alerting))
data_for_alerting.clear()
# Check that 2 alerts are raised, below critical and above warning
current = int(self.evm_node_alerts_config
.evm_block_syncing_block_height_difference[
'critical_threshold']) - 1
alert_timestamp = datetime.now().timestamp() + 10
self.test_evm_factory_instance.classify_thresholded_alert(
current,
self.evm_node_alerts_config
.evm_block_syncing_block_height_difference,
BlockHeightDifferenceIncreasedAboveThresholdAlert,
BlockHeightDifferenceDecreasedBelowThresholdAlert,
data_for_alerting,
self.test_parent_id, self.test_node_id,
EVMAlertsMetricCode.BlockHeightDifference.value,
self.test_node_name, alert_timestamp
)
expected_alert_1 = BlockHeightDifferenceDecreasedBelowThresholdAlert(
self.test_node_name, current, 'INFO', alert_timestamp,
'CRITICAL', self.test_parent_id, self.test_node_id)
expected_alert_2 = BlockHeightDifferenceIncreasedAboveThresholdAlert(
self.test_node_name, current, 'WARNING', alert_timestamp,
'WARNING', self.test_parent_id, self.test_node_id)
self.assertEqual(2, len(data_for_alerting))
self.assertEqual(expected_alert_1.alert_data, data_for_alerting[0])
self.assertEqual(expected_alert_2.alert_data, data_for_alerting[1])

    @freeze_time("2012-01-01")
    def test_classify_error_alert_raises_error_alert_if_matched_error_codes(
            self) -> None:
        test_err = InvalidUrlException('test_url')
        data_for_alerting = []
        self.test_factory_instance.classify_error_alert(
            test_err.code, InvalidUrlAlert, ValidUrlAlert, data_for_alerting,
            self.test_parent_id, self.test_node_id, self.test_node_name,
            datetime.now().timestamp(),
            GroupedChainlinkNodeAlertsMetricCode.InvalidUrl.value, "error msg",
            "resolved msg", test_err.code
        )
        expected_alert = InvalidUrlAlert(
            self.test_node_name, 'error msg', 'ERROR',
            datetime.now().timestamp(), self.test_parent_id, self.test_node_id)
        self.assertEqual(1, len(data_for_alerting))
        self.assertEqual(expected_alert.alert_data, data_for_alerting[0])

    @freeze_time("2012-01-01")
    def test_classify_error_alert_does_nothing_if_no_err_received_and_no_raised(
            self) -> None:
        test_err = InvalidUrlException('test_url')
        data_for_alerting = []
        self.test_factory_instance.classify_error_alert(
            test_err.code, InvalidUrlAlert, ValidUrlAlert, data_for_alerting,
            self.test_parent_id, self.test_node_id, self.test_node_name,
            datetime.now().timestamp(),
            GroupedChainlinkNodeAlertsMetricCode.InvalidUrl.value, "error msg",
            "resolved msg", None
        )
        self.assertEqual([], data_for_alerting)

    @parameterized.expand([
        (None,), (MetricNotFoundException('test-metric', 'test_url').code,),
    ])
    @freeze_time("2012-01-01")
    def test_classify_error_alert_raises_err_solved_if_alerted_and_no_error(
            self, code) -> None:
        """
        In this test we will check that an error solved alert is raised
        whenever no error is detected or a new error is detected after
        reporting a different error.
        """
        test_err = InvalidUrlException('test_url')
        data_for_alerting = []

        # Generate first error alert
        self.test_factory_instance.classify_error_alert(
            test_err.code, InvalidUrlAlert, ValidUrlAlert, data_for_alerting,
            self.test_parent_id, self.test_node_id, self.test_node_name,
            datetime.now().timestamp(),
            GroupedChainlinkNodeAlertsMetricCode.InvalidUrl.value, "error msg",
            "resolved msg", test_err.code
        )
        self.assertEqual(1, len(data_for_alerting))
        data_for_alerting.clear()

        # Generate solved alert
        alerted_timestamp = datetime.now().timestamp() + 10
        self.test_factory_instance.classify_error_alert(
            test_err.code, InvalidUrlAlert, ValidUrlAlert, data_for_alerting,
            self.test_parent_id, self.test_node_id, self.test_node_name,
            alerted_timestamp,
            GroupedChainlinkNodeAlertsMetricCode.InvalidUrl.value, "error msg",
            "resolved msg", code
        )
        expected_alert = ValidUrlAlert(
            self.test_node_name, 'resolved msg', 'INFO', alerted_timestamp,
            self.test_parent_id, self.test_node_id)
        self.assertEqual(1, len(data_for_alerting))
        self.assertEqual(expected_alert.alert_data, data_for_alerting[0])

    @freeze_time("2012-01-01")
    def test_classify_downtime_alert_does_nothing_warning_critical_disabled(
            self) -> None:
        """
        In this test we will check that no alert is raised and no timer is
        started whenever both warning and critical alerts are disabled. We
        will perform this test for both when downtime >= critical_window and
        downtime >= warning_window.
        """
        self.test_alerts_config.node_is_down['warning_enabled'] = 'False'
        self.test_alerts_config.node_is_down['critical_enabled'] = 'False'
        data_for_alerting = []
        current_went_down = datetime.now().timestamp()
        alert_timestamp = \
            current_went_down + float(self.test_alerts_config.node_is_down[
                'critical_threshold'])
        self.test_factory_instance.classify_downtime_alert(
            current_went_down, self.test_alerts_config.node_is_down,
            NodeWentDownAtAlert, NodeStillDownAlert, NodeBackUpAgainAlert,
            data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, alert_timestamp
        )
        critical_window_timer = self.test_factory_instance.alerting_state[
            self.test_parent_id][self.test_node_id]['critical_window_timer'][
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value]
        warning_window_timer = self.test_factory_instance.alerting_state[
            self.test_parent_id][self.test_node_id]['warning_window_timer'][
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value]
        self.assertEqual([], data_for_alerting)
        self.assertFalse(critical_window_timer.timer_started)
        self.assertFalse(warning_window_timer.timer_started)

    @parameterized.expand([
        ('WARNING', 'warning_threshold'),
        ('CRITICAL', 'critical_threshold'),
    ])
    @freeze_time("2012-01-01")
    def test_classify_downtime_alert_raises_alert_if_above_time_window(
            self, severity, threshold_var) -> None:
        """
        In this test we will check that a warning/critical downtime alert is
        raised if downtime exceeds the warning/critical window. We will also
        check that no alert is raised if the timer is not started or not
        elapsed.
        """
        data_for_alerting = []
        current_went_down = datetime.now().timestamp()
        alert_timestamp = \
            current_went_down + float(self.test_alerts_config.node_is_down[
                threshold_var])

        # Start timer, no alert is raised
        self.test_factory_instance.classify_downtime_alert(
            current_went_down, self.test_alerts_config.node_is_down,
            NodeWentDownAtAlert, NodeStillDownAlert, NodeBackUpAgainAlert,
            data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, current_went_down
        )
        self.assertEqual([], data_for_alerting)

        # No alert is raised if the time window is not elapsed yet
        self.test_factory_instance.classify_downtime_alert(
            current_went_down, self.test_alerts_config.node_is_down,
            NodeWentDownAtAlert, NodeStillDownAlert, NodeBackUpAgainAlert,
            data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, current_went_down
        )
        self.assertEqual([], data_for_alerting)

        # A critical/warning downtime alert is now raised
        self.test_factory_instance.classify_downtime_alert(
            current_went_down, self.test_alerts_config.node_is_down,
            NodeWentDownAtAlert, NodeStillDownAlert, NodeBackUpAgainAlert,
            data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, alert_timestamp
        )
        expected_alert = NodeWentDownAtAlert(
            self.test_node_name, severity, alert_timestamp,
            self.test_parent_id, self.test_node_id)
        self.assertEqual(1, len(data_for_alerting))
        self.assertEqual(expected_alert.alert_data, data_for_alerting[0])

    @freeze_time("2012-01-01")
    def test_classify_downtime_alert_no_warning_if_warning_already_sent(
            self) -> None:
        """
        In this test we will check that no warning alert is raised if a
        warning alert has already been sent.
        """
        data_for_alerting = []
        current_went_down = datetime.now().timestamp()
        alert_timestamp = \
            current_went_down + float(self.test_alerts_config.node_is_down[
                'warning_threshold'])

        # Start timer, no alert is raised
        self.test_factory_instance.classify_downtime_alert(
            current_went_down, self.test_alerts_config.node_is_down,
            NodeWentDownAtAlert, NodeStillDownAlert, NodeBackUpAgainAlert,
            data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, current_went_down
        )

        # A warning downtime alert is now raised
        self.test_factory_instance.classify_downtime_alert(
            current_went_down, self.test_alerts_config.node_is_down,
            NodeWentDownAtAlert, NodeStillDownAlert, NodeBackUpAgainAlert,
            data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, alert_timestamp
        )
        self.assertEqual(1, len(data_for_alerting))
        data_for_alerting.clear()

        # Try to generate another warning alert. Confirm that none was raised.
        self.test_factory_instance.classify_downtime_alert(
            current_went_down, self.test_alerts_config.node_is_down,
            NodeWentDownAtAlert, NodeStillDownAlert, NodeBackUpAgainAlert,
            data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, alert_timestamp + 10
        )
        self.assertEqual([], data_for_alerting)

    @freeze_time("2012-01-01")
    def test_classify_downtime_alert_raises_critical_if_repeat_elapsed(
            self) -> None:
        """
        In this test we will check that a critical downtime alert is re-raised
        if the critical window elapses. We will also check that if the
        critical window does not elapse, a critical alert is not re-raised.
        """
        data_for_alerting = []
        current_went_down = datetime.now().timestamp()
        alert_timestamp = \
            current_went_down + float(self.test_alerts_config.node_is_down[
                'critical_threshold'])

        # Start timer, no alert is raised
        self.test_factory_instance.classify_downtime_alert(
            current_went_down, self.test_alerts_config.node_is_down,
            NodeWentDownAtAlert, NodeStillDownAlert, NodeBackUpAgainAlert,
            data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, current_went_down
        )

        # A first critical/warning downtime alert is now raised
        self.test_factory_instance.classify_downtime_alert(
            current_went_down, self.test_alerts_config.node_is_down,
            NodeWentDownAtAlert, NodeStillDownAlert, NodeBackUpAgainAlert,
            data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, alert_timestamp
        )
        self.assertEqual(1, len(data_for_alerting))
        data_for_alerting.clear()

        # No alert is re-raised if the repeat time is not elapsed
        alert_timestamp = \
            alert_timestamp + float(self.test_alerts_config.node_is_down[
                'critical_repeat'])
        self.test_factory_instance.classify_downtime_alert(
            current_went_down, self.test_alerts_config.node_is_down,
            NodeWentDownAtAlert, NodeStillDownAlert, NodeBackUpAgainAlert,
            data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, alert_timestamp - 1
        )
        self.assertEqual([], data_for_alerting)

        # Critical alert is re-raised if the repeat time elapsed.
        self.test_factory_instance.classify_downtime_alert(
            current_went_down, self.test_alerts_config.node_is_down,
            NodeWentDownAtAlert, NodeStillDownAlert, NodeBackUpAgainAlert,
            data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, alert_timestamp
        )
        difference = alert_timestamp - current_went_down
        expected_alert = NodeStillDownAlert(
            self.test_node_name, difference, 'CRITICAL', alert_timestamp,
            self.test_parent_id, self.test_node_id)
        self.assertEqual(1, len(data_for_alerting))
        self.assertEqual(expected_alert.alert_data, data_for_alerting[0])

    @freeze_time("2012-01-01")
    def test_classify_downtime_alert_only_1_critical_if_repeat_disabled(
            self) -> None:
        """
        In this test we will check that if critical_repeat is disabled, a
        critical downtime alert is not re-raised.
        """
        self.test_alerts_config.node_is_down[
            'critical_repeat_enabled'] = "False"
        data_for_alerting = []
        current_went_down = datetime.now().timestamp()
        alert_timestamp = \
            current_went_down + float(self.test_alerts_config.node_is_down[
                'critical_threshold'])

        # Start timer, no alert is raised
        self.test_factory_instance.classify_downtime_alert(
            current_went_down, self.test_alerts_config.node_is_down,
            NodeWentDownAtAlert, NodeStillDownAlert, NodeBackUpAgainAlert,
            data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, current_went_down
        )

        # A first critical/warning downtime alert is now raised
        self.test_factory_instance.classify_downtime_alert(
            current_went_down, self.test_alerts_config.node_is_down,
            NodeWentDownAtAlert, NodeStillDownAlert, NodeBackUpAgainAlert,
            data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, alert_timestamp
        )
        self.assertEqual(1, len(data_for_alerting))
        data_for_alerting.clear()

        # Critical alert is not re-raised even if the repeat time elapsed.
        alert_timestamp = \
            alert_timestamp + float(self.test_alerts_config.node_is_down[
                'critical_repeat'])
        self.test_factory_instance.classify_downtime_alert(
            current_went_down, self.test_alerts_config.node_is_down,
            NodeWentDownAtAlert, NodeStillDownAlert, NodeBackUpAgainAlert,
            data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, alert_timestamp
        )
        self.assertEqual([], data_for_alerting)

    @parameterized.expand([
        ('warning_threshold',), ('critical_threshold',),
    ])
    @freeze_time("2012-01-01")
    def test_classify_downtime_alert_raises_info_if_node_is_back_up(
            self, threshold_var) -> None:
        """
        In this test we will check that an info alert is raised whenever a
        node is no longer down after it has been reported that it is down. We
        will perform this test for both critical and warning.
        """
        data_for_alerting = []
        current_went_down = datetime.now().timestamp()
        alert_timestamp = \
            current_went_down + float(self.test_alerts_config.node_is_down[
                threshold_var])

        # Start timer, no alert is raised
        self.test_factory_instance.classify_downtime_alert(
            current_went_down, self.test_alerts_config.node_is_down,
            NodeWentDownAtAlert, NodeStillDownAlert, NodeBackUpAgainAlert,
            data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, current_went_down
        )

        # A first critical/warning downtime alert is now raised
        self.test_factory_instance.classify_downtime_alert(
            current_went_down, self.test_alerts_config.node_is_down,
            NodeWentDownAtAlert, NodeStillDownAlert, NodeBackUpAgainAlert,
            data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, alert_timestamp
        )
        self.assertEqual(1, len(data_for_alerting))
        data_for_alerting.clear()

        # Info back up again alert is raised if node is no longer down.
        alert_timestamp = alert_timestamp + 10
        self.test_factory_instance.classify_downtime_alert(
            None, self.test_alerts_config.node_is_down, NodeWentDownAtAlert,
            NodeStillDownAlert, NodeBackUpAgainAlert, data_for_alerting,
            self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, alert_timestamp
        )
        expected_alert = NodeBackUpAgainAlert(
            self.test_node_name, 'INFO', alert_timestamp, self.test_parent_id,
            self.test_node_id)
        self.assertEqual(1, len(data_for_alerting))
        self.assertEqual(expected_alert.alert_data, data_for_alerting[0])

    def test_classify_downtime_alert_does_nothing_if_node_is_never_down(
            self) -> None:
        """
        In this test we will check that no timer is started and no alert is
        raised if the node was not and is not down.
        """
        data_for_alerting = []

        # Send data indicating that the node is not down, and check that no
        # alert is raised and that no timer is started.
        self.test_factory_instance.classify_downtime_alert(
            None, self.test_alerts_config.node_is_down, NodeWentDownAtAlert,
            NodeStillDownAlert, NodeBackUpAgainAlert, data_for_alerting,
            self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value,
            self.test_node_name, datetime.now().timestamp()
        )
        critical_window_timer = self.test_factory_instance.alerting_state[
            self.test_parent_id][self.test_node_id]['critical_window_timer'][
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value]
        warning_window_timer = self.test_factory_instance.alerting_state[
            self.test_parent_id][self.test_node_id]['warning_window_timer'][
            GroupedChainlinkNodeAlertsMetricCode.NodeIsDown.value]
        self.assertEqual([], data_for_alerting)
        self.assertFalse(critical_window_timer.timer_started)
        self.assertFalse(warning_window_timer.timer_started)

    @freeze_time("2012-01-01")
    def test_classify_source_downtime_alert_raises_condition_true_alert_if_true(
            self) -> None:
        """
        Given a true condition, in this test we will check that the
        classify_source_downtime_alert fn calls the condition_true_alert.
        """

        def condition_function(*args): return True

        data_for_alerting = []
        self.test_factory_instance.classify_source_downtime_alert(
            PrometheusSourceIsDownAlert, condition_function, [], [
                self.test_node_name, 'WARNING', datetime.now().timestamp(),
                self.test_parent_id, self.test_node_id
            ], data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.PrometheusSourceIsDown.value,
            PrometheusSourceBackUpAgainAlert
        )
        expected_alert_1 = PrometheusSourceIsDownAlert(
            self.test_node_name, 'WARNING', datetime.now().timestamp(),
            self.test_parent_id, self.test_node_id)
        self.assertEqual(1, len(data_for_alerting))
        self.assertEqual(expected_alert_1.alert_data, data_for_alerting[0])

    @freeze_time("2012-01-01")
    def test_classify_source_downtime_alert_raises_condition_false_alert_if_false_and_warning_sent(
            self) -> None:
        """
        Given a false condition, in this test we will check that the
        classify_source_downtime_alert fn calls the condition_false_alert if
        it is not None and a warning alert notifying the problem has already
        been sent.
        """

        def condition_function_true(*args): return True

        def condition_function_false(*args): return not condition_function_true(
            args)

        data_for_alerting = []

        # Send the warning alert first
        self.test_factory_instance.classify_source_downtime_alert(
            PrometheusSourceIsDownAlert, condition_function_true, [], [
                self.test_node_name, 'WARNING', datetime.now().timestamp(),
                self.test_parent_id, self.test_node_id
            ], data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.PrometheusSourceIsDown.value,
        )
        self.assertEqual(1, len(data_for_alerting))
        data_for_alerting.clear()

        self.test_factory_instance.classify_source_downtime_alert(
            PrometheusSourceIsDownAlert, condition_function_false, [], [
                self.test_node_name, 'WARNING', datetime.now().timestamp() + 1,
                self.test_parent_id, self.test_node_id
            ], data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.PrometheusSourceIsDown.value,
            PrometheusSourceBackUpAgainAlert, [
                self.test_node_name, 'INFO', datetime.now().timestamp() + 1,
                self.test_parent_id, self.test_node_id
            ]
        )
        expected_alert_1 = PrometheusSourceBackUpAgainAlert(
            self.test_node_name, 'INFO', datetime.now().timestamp() + 1,
            self.test_parent_id, self.test_node_id)
        self.assertEqual(1, len(data_for_alerting))
        self.assertEqual(expected_alert_1.alert_data, data_for_alerting[0])

    @freeze_time("2012-01-01")
    def test_classify_source_down_alert_no_alert_if_no_false_alert_and_false(
            self) -> None:
        """
        Given a false condition and no condition_false_alert, in this test we
        will check that no alert is raised by the
        classify_source_downtime_alert fn.
        """

        def condition_function_true(*args): return True

        def condition_function_false(*args): return not condition_function_true(
            args)

        data_for_alerting = []

        # Send the warning alert first
        self.test_factory_instance.classify_source_downtime_alert(
            PrometheusSourceIsDownAlert, condition_function_true, [], [
                self.test_node_name, 'WARNING', datetime.now().timestamp(),
                self.test_parent_id, self.test_node_id
            ], data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.PrometheusSourceIsDown.value,
        )
        self.assertEqual(1, len(data_for_alerting))
        data_for_alerting.clear()

        self.test_factory_instance.classify_source_downtime_alert(
            PrometheusSourceIsDownAlert, condition_function_false, [], [
                self.test_node_name, 'INFO', datetime.now().timestamp() + 1,
                self.test_parent_id, self.test_node_id
            ], data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.PrometheusSourceIsDown.value,
        )
        self.assertEqual([], data_for_alerting)

    @freeze_time("2012-01-01")
    def test_classify_source_down_alert_no_alert_if_warning_not_sent_and_false(
            self) -> None:
        """
        Given a false condition and that no warning alert has been sent, in
        this test we will check that no alert is raised by the
        classify_source_downtime_alert fn.
        """

        def condition_function_true(*args): return True

        def condition_function_false(*args): return not condition_function_true(
            args)

        data_for_alerting = []
        self.test_factory_instance.classify_source_downtime_alert(
            PrometheusSourceIsDownAlert, condition_function_false, [], [
                self.test_node_name, 'WARNING', datetime.now().timestamp() + 1,
                self.test_parent_id, self.test_node_id
            ], data_for_alerting, self.test_parent_id, self.test_node_id,
            GroupedChainlinkNodeAlertsMetricCode.PrometheusSourceIsDown.value,
            PrometheusSourceBackUpAgainAlert, [
                self.test_node_name, 'INFO', datetime.now().timestamp() + 1,
                self.test_parent_id, self.test_node_id
            ]
        )
        self.assertEqual([], data_for_alerting)

# seleniumbase/__init__.py (from aoruilin/SeleniumBase, MIT license)
from seleniumbase.fixtures.base_case import BaseCase  # noqa
from seleniumbase.masterqa.master_qa import MasterQA  # noqa
from seleniumbase.common import decorators  # noqa
from seleniumbase.common import encryption  # noqa

# conanguide/ui/dialog/edit/name/edit_name_ui.py (from afri-bit/conan-guide, MIT license)
# -*- coding: utf-8 -*-
################################################################################
## Form generated from reading UI file 'edit_name.ui'
##
## Created by: Qt User Interface Compiler version 5.15.2
##
## WARNING! All changes made in this file will be lost when recompiling UI file!
################################################################################
from PySide2.QtCore import *
from PySide2.QtGui import *
from PySide2.QtWidgets import *
from conanguide.ui.res import resources_rc

class Ui_DialogEditName(object):
    def setupUi(self, DialogEditName):
        if not DialogEditName.objectName():
            DialogEditName.setObjectName(u"DialogEditName")
        DialogEditName.setWindowModality(Qt.NonModal)
        DialogEditName.resize(400, 63)
        sizePolicy = QSizePolicy(QSizePolicy.Preferred, QSizePolicy.Preferred)
        sizePolicy.setHorizontalStretch(0)
        sizePolicy.setVerticalStretch(0)
        sizePolicy.setHeightForWidth(DialogEditName.sizePolicy().hasHeightForWidth())
        DialogEditName.setSizePolicy(sizePolicy)
        DialogEditName.setMinimumSize(QSize(400, 63))
        DialogEditName.setMaximumSize(QSize(400, 64))
        DialogEditName.setBaseSize(QSize(350, 63))
        icon = QIcon()
        icon.addFile(u":/general/icon/conan_guide_icon.png", QSize(), QIcon.Normal, QIcon.Off)
        DialogEditName.setWindowIcon(icon)
        DialogEditName.setSizeGripEnabled(False)
        DialogEditName.setModal(False)
        self.verticalLayout = QVBoxLayout(DialogEditName)
        self.verticalLayout.setSpacing(2)
        self.verticalLayout.setObjectName(u"verticalLayout")
        self.verticalLayout.setContentsMargins(5, 5, 5, 5)
        self.lineEdit = QLineEdit(DialogEditName)
        self.lineEdit.setObjectName(u"lineEdit")
        self.lineEdit.setMinimumSize(QSize(0, 25))

        self.verticalLayout.addWidget(self.lineEdit)

        self.frame = QFrame(DialogEditName)
        self.frame.setObjectName(u"frame")
        sizePolicy.setHeightForWidth(self.frame.sizePolicy().hasHeightForWidth())
        self.frame.setSizePolicy(sizePolicy)
        self.frame.setFrameShape(QFrame.StyledPanel)
        self.frame.setFrameShadow(QFrame.Raised)
        self.horizontalLayout = QHBoxLayout(self.frame)
        self.horizontalLayout.setSpacing(0)
        self.horizontalLayout.setObjectName(u"horizontalLayout")
        self.horizontalLayout.setSizeConstraint(QLayout.SetDefaultConstraint)
        self.horizontalLayout.setContentsMargins(0, 0, 0, 0)
        self.labelInfo = QLabel(self.frame)
        self.labelInfo.setObjectName(u"labelInfo")
        self.labelInfo.setMinimumSize(QSize(0, 22))
        self.labelInfo.setMaximumSize(QSize(16777215, 22))

        self.horizontalLayout.addWidget(self.labelInfo)

        self.btnOK = QPushButton(self.frame)
        self.btnOK.setObjectName(u"btnOK")
        self.btnOK.setMinimumSize(QSize(0, 25))
        self.btnOK.setMaximumSize(QSize(75, 16777215))

        self.horizontalLayout.addWidget(self.btnOK)

        self.btnCancel = QPushButton(self.frame)
        self.btnCancel.setObjectName(u"btnCancel")
        self.btnCancel.setMinimumSize(QSize(0, 25))
        self.btnCancel.setMaximumSize(QSize(75, 16777215))

        self.horizontalLayout.addWidget(self.btnCancel)

        self.verticalLayout.addWidget(self.frame)

        self.retranslateUi(DialogEditName)

        QMetaObject.connectSlotsByName(DialogEditName)
    # setupUi

    def retranslateUi(self, DialogEditName):
        DialogEditName.setWindowTitle(QCoreApplication.translate("DialogEditName", u"Dialog", None))
        self.labelInfo.setText("")
        self.btnOK.setText(QCoreApplication.translate("DialogEditName", u"OK", None))
        self.btnCancel.setText(QCoreApplication.translate("DialogEditName", u"Cancel", None))
    # retranslateUi

# src/graph_transpiler/webdnn/backend/interface/__init__.py (from steerapi/webdnn, MIT license)
from webdnn.backend.interface import generator
from webdnn.backend.interface import graph_descriptor
9310c916f1bfaa177afd5eced2691ed61e265643 | 298 | py | Python | src/nfvlib/nfv_actions.py | harpratap/nfv-mpls | bd7cb779a0ddf613f112fae860d149b7f8f0972f | [
"MIT"
] | null | null | null | src/nfvlib/nfv_actions.py | harpratap/nfv-mpls | bd7cb779a0ddf613f112fae860d149b7f8f0972f | [
"MIT"
] | null | null | null | src/nfvlib/nfv_actions.py | harpratap/nfv-mpls | bd7cb779a0ddf613f112fae860d149b7f8f0972f | [
"MIT"
] | null | null | null | class SDNActions:
    def addReplaceMPLSLabel(self, label):
        # FIXME: implement
        return None

    def addPushMPLSLabel(self, label):
        # FIXME: implement
        return None

    def addPopMPLSLabel(self):
        # FIXME: implement
        return None

    def addForwardToPorts(self, ports):
        # FIXME: implement
        return None
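The four SDNActions stubs above all return None pending implementation. A hypothetical sketch of what they might produce once filled in; the OpenFlow-style action dicts are my assumption, not part of the nfv-mpls code:

```python
# Hypothetical sketch only: nothing below is the project's real API.
class SDNActions:
    def addPushMPLSLabel(self, label):
        # Push a new MPLS shim header carrying `label`.
        return [{"type": "PUSH_MPLS", "ethertype": 0x8847},
                {"type": "SET_FIELD", "field": "mpls_label", "value": label}]

    def addPopMPLSLabel(self):
        # Remove the outermost MPLS shim header (back to IPv4).
        return [{"type": "POP_MPLS", "ethertype": 0x0800}]

    def addForwardToPorts(self, ports):
        # Emit one OUTPUT action per destination port.
        return [{"type": "OUTPUT", "port": p} for p in ports]
```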
| 16.555556 | 33 | 0.687919 | 29 | 298 | 7.068966 | 0.448276 | 0.273171 | 0.390244 | 0.468293 | 0.443902 | 0.312195 | 0 | 0 | 0 | 0 | 0 | 0 | 0.234899 | 298 | 17 | 34 | 17.529412 | 0.899123 | 0.214765 | 0 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.058824 | 0 | 1 | 0.444444 | false | 0 | 0 | 0.444444 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
9320550b5d994f3dcd043d1591527916fc757b9f | 93 | py | Python | meta_agents/utils/__init__.py | zhanpenghe/meta_agents | b3b4df70bab1ebe621d48eebb4c886b85c1d8323 | [
"MIT"
] | 3 | 2020-09-26T16:17:52.000Z | 2021-04-23T08:56:04.000Z | meta_agents/utils/__init__.py | zhanpenghe/meta_agents | b3b4df70bab1ebe621d48eebb4c886b85c1d8323 | [
"MIT"
] | 1 | 2019-09-03T19:57:40.000Z | 2019-09-03T19:57:40.000Z | meta_agents/utils/__init__.py | zhanpenghe/meta_agents | b3b4df70bab1ebe621d48eebb4c886b85c1d8323 | [
"MIT"
] | 1 | 2020-12-09T03:06:48.000Z | 2020-12-09T03:06:48.000Z | from meta_agents.utils.serializable import Serializable
from meta_agents.utils.utils import * | 46.5 | 55 | 0.870968 | 13 | 93 | 6.076923 | 0.461538 | 0.202532 | 0.35443 | 0.481013 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075269 | 93 | 2 | 56 | 46.5 | 0.918605 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
932549f5957027f7eacb1597cfbf8f895bac240c | 612 | py | Python | Aufgabe 2 Bootcamp Python.py | lacharogerio10/lacharogerio10.github.io | 325dcb6f64e3a80764e85849ac2f40cc72f06f2b | [
"CC0-1.0"
] | null | null | null | Aufgabe 2 Bootcamp Python.py | lacharogerio10/lacharogerio10.github.io | 325dcb6f64e3a80764e85849ac2f40cc72f06f2b | [
"CC0-1.0"
] | null | null | null | Aufgabe 2 Bootcamp Python.py | lacharogerio10/lacharogerio10.github.io | 325dcb6f64e3a80764e85849ac2f40cc72f06f2b | [
"CC0-1.0"
] | null | null | null | import random

counter = 0
for _ in range(6):
    Würfel = random.randint(1, 6)
    counter += 1
    print("Anzahl counter benutzt", counter)
    print(Würfel)
| 24.48 | 41 | 0.764706 | 89 | 612 | 5.258427 | 0.11236 | 0.153846 | 0.24359 | 0.25641 | 0.974359 | 0.974359 | 0.929487 | 0.929487 | 0.929487 | 0.929487 | 0 | 0.032787 | 0.102941 | 612 | 24 | 42 | 25.5 | 0.819672 | 0 | 0 | 0.916667 | 0 | 0 | 0.22449 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.041667 | 0 | 0.041667 | 0.458333 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 10 |
9349e7e5ad20ff1b74af8c5dc94d78832c8a5963 | 2,274 | py | Python | archeutils/configs.py | acdh-oeaw/4dpuzzle | 7856bbd82c7dfa8da1d5f1ad40593219a35b3cfe | [
"MIT"
] | null | null | null | archeutils/configs.py | acdh-oeaw/4dpuzzle | 7856bbd82c7dfa8da1d5f1ad40593219a35b3cfe | [
"MIT"
] | 6 | 2020-06-05T18:32:02.000Z | 2022-02-10T07:22:24.000Z | archeutils/configs.py | acdh-oeaw/4dpuzzle | 7856bbd82c7dfa8da1d5f1ad40593219a35b3cfe | [
"MIT"
] | 1 | 2020-06-30T13:52:41.000Z | 2020-06-30T13:52:41.000Z | EXTENSION_HAS_CATEGORY_MAPPING = {
'zip': 'https://vocabs.acdh.oeaw.ac.at/archecategory/collection',
'tif': 'https://vocabs.acdh.oeaw.ac.at/archecategory/image',
'tiff': 'https://vocabs.acdh.oeaw.ac.at/archecategory/image',
'jpg': 'https://vocabs.acdh.oeaw.ac.at/archecategory/image',
'png': 'https://vocabs.acdh.oeaw.ac.at/archecategory/image',
'svg': 'https://vocabs.acdh.oeaw.ac.at/archecategory/image',
'mtl': 'https://vocabs.acdh.oeaw.ac.at/archecategory/3dData',
'xyzi': 'https://vocabs.acdh.oeaw.ac.at/archecategory/3dData',
'obj': 'https://vocabs.acdh.oeaw.ac.at/archecategory/3dData',
'ply': 'https://vocabs.acdh.oeaw.ac.at/archecategory/3dData',
'x3d': 'https://vocabs.acdh.oeaw.ac.at/archecategory/3dData',
'pdf': 'https://vocabs.acdh.oeaw.ac.at/archecategory/text/pdf',
'txt': 'https://vocabs.acdh.oeaw.ac.at/archecategory/text',
'html': 'https://vocabs.acdh.oeaw.ac.at/archecategory/text/html',
'md': 'https://vocabs.acdh.oeaw.ac.at/archecategory/text/markdown',
'csv': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset/tabular',
'gfs': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset',
'gml': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset',
'tab': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset',
'kml': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset/kml',
'geojson': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset/geojson',
'kmz': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset',
'osm': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset',
'gpx': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset',
'tfw': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset',
'prj': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset',
'xml': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset/xml',
'tfwx': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset',
'points': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset',
'qpj': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset',
'asc': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset',
'qgs': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset'
}
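A consumer of this mapping still has to normalize the file extension and handle unknown ones. A minimal lookup sketch; the subset of the table and the fallback category URI are illustrative choices, not part of the original config:

```python
# Subset of the extension-to-category table above, for illustration.
EXTENSION_HAS_CATEGORY_MAPPING = {
    'tif': 'https://vocabs.acdh.oeaw.ac.at/archecategory/image',
    'csv': 'https://vocabs.acdh.oeaw.ac.at/archecategory/dataset/tabular',
}
# Assumed fallback URI (my choice, not from the original config).
DEFAULT_CATEGORY = 'https://vocabs.acdh.oeaw.ac.at/archecategory/resource'

def category_for(filename):
    # Lower-case the extension so 'scan.TIF' resolves like 'scan.tif'.
    ext = filename.rsplit('.', 1)[-1].lower() if '.' in filename else ''
    return EXTENSION_HAS_CATEGORY_MAPPING.get(ext, DEFAULT_CATEGORY)
```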
| 63.166667 | 78 | 0.689534 | 308 | 2,274 | 5.081169 | 0.162338 | 0.231949 | 0.316294 | 0.400639 | 0.900958 | 0.900958 | 0.900958 | 0.877955 | 0.23131 | 0 | 0 | 0.003378 | 0.08883 | 2,274 | 35 | 79 | 64.971429 | 0.751931 | 0 | 0 | 0.057143 | 0 | 0 | 0.810026 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
935eb9451b3ce6878026956201fd1eb370e61f37 | 451 | py | Python | src/meterpreter_traffic_parser/exceptions/__init__.py | SpartaEN/meterpreter-traffic-parser | 5681c3973ac8fc3bbd478c82fdc27de4ab6c447a | [
"MIT"
] | 1 | 2021-12-22T15:14:54.000Z | 2021-12-22T15:14:54.000Z | src/meterpreter_traffic_parser/exceptions/__init__.py | SpartaEN/meterpreter-traffic-parser | 5681c3973ac8fc3bbd478c82fdc27de4ab6c447a | [
"MIT"
] | null | null | null | src/meterpreter_traffic_parser/exceptions/__init__.py | SpartaEN/meterpreter-traffic-parser | 5681c3973ac8fc3bbd478c82fdc27de4ab6c447a | [
"MIT"
] | null | null | null | class BadPacketHeader(Exception):
    pass


class IncompletePayload(Exception):
    pass


class UnknowType(Exception):
    pass


class AESKeyError(Exception):
    pass
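A sketch of how a parser might use exception classes like those above. BadPacketHeader is redefined here to stay self-contained, and `parse_header` plus the 32-byte header length are my own illustrative choices, not the parser's real API:

```python
class BadPacketHeader(Exception):
    pass

HEADER_LEN = 32  # assumed fixed-size header, for illustration only

def parse_header(data: bytes) -> bytes:
    # Reject buffers too short to contain a full header.
    if len(data) < HEADER_LEN:
        raise BadPacketHeader(
            "need %d header bytes, got %d" % (HEADER_LEN, len(data)))
    return data[:HEADER_LEN]
```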
| 23.736842 | 46 | 0.649667 | 48 | 451 | 5.4375 | 0.270833 | 0.183908 | 0.245211 | 0.306513 | 0.777778 | 0.777778 | 0.777778 | 0.777778 | 0.777778 | 0.777778 | 0 | 0 | 0.199557 | 451 | 18 | 47 | 25.055556 | 0.722992 | 0 | 0 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 9 |
fa895f7c319f725a5dd2624e204d7bff2277f407 | 121 | py | Python | OnePy/builtin_module/data_readers/__init__.py | Chandlercjy/OnePyfx | 9bd43b721d3f7352495b6ccab76bd533a3d2e8f2 | [
"MIT"
] | 321 | 2017-07-09T09:25:45.000Z | 2022-03-29T16:51:35.000Z | OnePy/builtin_module/data_readers/__init__.py | sunzhouhong/OnePy | 4e225945de297ba1211035a7b95b5094cdddc2a7 | [
"MIT"
] | 7 | 2017-08-23T12:10:29.000Z | 2020-03-26T12:56:09.000Z | OnePy/builtin_module/data_readers/__init__.py | sunzhouhong/OnePy | 4e225945de297ba1211035a7b95b5094cdddc2a7 | [
"MIT"
] | 134 | 2017-07-26T22:29:18.000Z | 2022-03-23T09:22:10.000Z | from OnePy.builtin_module.data_readers.csvreader import *
from OnePy.builtin_module.data_readers.mongodb_reader import *
| 40.333333 | 62 | 0.867769 | 17 | 121 | 5.882353 | 0.588235 | 0.18 | 0.32 | 0.44 | 0.66 | 0.66 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066116 | 121 | 2 | 63 | 60.5 | 0.884956 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
4f0de0345e930948933818a0c016fd0bed15956d | 68,279 | py | Python | FeatureClass.py | ivan-bilan/author-profiling-pan-2016 | c1bde09cd6f7b42238138dd8d46f50b21e1df534 | [
"MIT"
] | 6 | 2019-02-22T01:46:05.000Z | 2019-07-31T19:43:02.000Z | FeatureClass.py | ivan-bilan/author-profiling-pan-2016 | c1bde09cd6f7b42238138dd8d46f50b21e1df534 | [
"MIT"
] | null | null | null | FeatureClass.py | ivan-bilan/author-profiling-pan-2016 | c1bde09cd6f7b42238138dd8d46f50b21e1df534 | [
"MIT"
] | 2 | 2019-07-31T19:48:34.000Z | 2020-02-01T20:25:54.000Z | # -*- coding: utf-8 -*-
__version__ = "1.0"
__date__ = "24.07.2016"
__author__ = "Ivan Bilan"
import nltk
import os
from nltk.tokenize import word_tokenize
from nltk.corpus import stopwords
from string import punctuation
from time import sleep
import codecs
import re
import numpy
class FeatureClass(object):
    def __init__(self, lang):
        # load resources; language-specific word lists live under Resources/<lang>/
        # (os.path.join keeps the paths portable instead of hard-coded backslashes)
        self.language = lang
        resource_dir = os.path.join(os.path.dirname(__file__), 'Resources', lang)
        # connective words: http://www.grammarbank.com/connectives-list.html
        self.connective_words_list = self.extract_bow(os.path.join(resource_dir, 'connective_words.txt'))
        self.slang_words_list = self.extract_bow(os.path.join(resource_dir, 'slang_words.txt'))
        # emotion words, list taken from https://gist.github.com/ryanlewis/a37739d710ccdb4b406d
        self.emotion_words_list = self.extract_bow(os.path.join(resource_dir, 'emotion_words.txt'))
        # swear word lists are not loaded here (kept for future work)
        self.abbreviation_list = self.extract_bow(os.path.join(resource_dir, 'abbreviation.txt'))
        self.contractions_list = self.extract_bow(os.path.join(resource_dir, 'contractions.txt'))
        stopword_language = {"en": "english", "nl": "dutch", "es": "spanish"}
        self.cachedStopWords = stopwords.words(stopword_language[lang])
# emoticon regex
        self.m_emoticons_comp = re.compile(r"[\;\:\=]\-*[\)\(\]\>\/]")
# pre-scaling text levels
calc = list()
self.calc_words = self.recursive_addition(calc, 0, 13, 10450)
calc2 = list()
self.calc_chars = self.recursive_addition(calc2, 0, 80, 64800)
def recursive_addition(self, list_inner, val, step, limit):
val+=step
list_inner.append(val)
if val < limit:
return self.recursive_addition(list_inner, val, step, limit)
else:
return list_inner
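recursive_addition builds the arithmetic series of pre-scaling boundaries by recursing once per step, which runs roughly 800 frames deep for the word-level table. An equivalent iterative sketch (the name `addition_steps` is mine):

```python
def addition_steps(step, limit):
    # Iterative equivalent of recursive_addition([], 0, step, limit):
    # multiples of `step`, stopping once `limit` is reached or passed.
    out, val = [], 0
    while True:
        val += step
        out.append(val)
        if val >= limit:
            return out
```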
def regex_str(self, items):
"""
Create a regex out of a list
"""
new_list = list()
for x in items:
# print x
new_list.append(re.escape(x))
fulls_joined = '|'.join(new_list)
my_regex = r'(^|\b)('+fulls_joined+r')(\s|$)' # repr(fulls_joined)
# print my_regex
return my_regex
def extract_bow(self, filename):
"""
Extract tokens from txt file
"""
input_file = codecs.open(filename, encoding = "utf_8", mode='r')
lines = input_file.readlines()
bow_words = []
for x in lines:
x = x.replace('\r\n', '')
bow_words.append(x.strip().lower())
input_file.close()
return list(set(bow_words))
def catch_url(self, untagged, average_value = 0):
"""
Get the usage of linked content
"""
if average_value == 3:
sum_vector = 0
for single_text in untagged:
tokens = single_text.split()
                result = re.findall(r"\[URL\]", single_text)
if (len(result) > 0) and (len(tokens) > 1):
average = float(len(result)) / len(tokens)
sum_vector += average
else:
average = 0
sum_vector += average
if len(untagged) > 0:
average_return = sum_vector / len(untagged)
else:
average_return = sum_vector
return average_return
else:
tokens = untagged.split()
result = re.findall(r"url", untagged, re.IGNORECASE)
if average_value == 0:
return len(result)
elif average_value == 1:
if (len(result) > 0) and (len(tokens) > 0):
average = float(len(result)) / len(tokens)
else:
average = 0
return average
elif average_value == 5:
return self.counter_pre_scaling(len(result), len(word_tokenize(untagged)))
def contractions(self, untagged, average_value = 0):
# count the amount of contractions in the text
# list of words taken from http://www.textfixer.com/resources/english-contractions-list.php
return self.word_counter_feature(self.contractions_list, untagged, average_value)
'''
# not included, may be used in future work
# this function counts the amount of swear words in the blog posts
def swear_words(self, untagged, average_value = 5):
# list taken from https://gist.github.com/ryanlewis/a37739d710ccdb4b406d
return self.word_counter_feature(self.swear_words_list, untagged, average_value)
'''
def emotion_words(self, untagged, average_value = 5):
# this function counts the amount of emotion words ( e.g. disgusted, hurt, aggressive )
# http://www.psychpage.com/learning/library/assess/feelings.html
return self.word_counter_feature(self.emotion_words_list, untagged, average_value)
def slang_words(self, untagged, average_value = 5):
# this function counts the amount of slang words
return self.word_counter_feature(self.slang_words_list, untagged, average_value)
def connective_words(self, untagged, average_value = 5):
#### checked
# feature suggested here : http://www.uni-weimar.de/medien/webis/research/events/pan-13/pan13-talks/pan13-author-profiling/meina13-poster.pdf
# these feature looks for all connective words (words that provide stylistic connection between sentences, paragraphs etc.)
# http://www.grammarbank.com/connectives-list.html
return self.word_counter_feature(self.connective_words_list, untagged, average_value)
def get_abbreviations(self, untagged, average_value = 5):
# list taken from http://www.textfixer.com/resources/english-contractions-list.php
return self.word_counter_feature(self.abbreviation_list, untagged, average_value)
# get all emoticons
# not used, use for future work
def get_emoticons(self, untagged, average_value = 0):
if average_value == 3:
sum_vector = 0
for single_text in untagged:
counter = 0
# tokens = word_tokenize(untagged)
tokens = single_text.split()
for token in tokens:
m_emoticons = self.m_emoticons_comp.match(token)
if m_emoticons:
counter += 1
#print token
if (counter > 0) and (len(tokens) > 1):
average = float(counter) / len(tokens)
sum_vector += average
else:
average = 0
sum_vector += average
if len(untagged) > 0:
average_return = sum_vector / len(untagged)
else:
average_return = sum_vector
return average_return
else:
counter = 0
tokens = untagged.split()
for token in tokens:
m_emoticons = self.m_emoticons_comp.match(token)
if m_emoticons:
counter += 1
if average_value == 0:
return counter
elif average_value == 1:
if (counter > 0) and (len(tokens) > 1):
average = float(counter) / len(tokens)
else:
average = 0
return average
# get all emoticons
def positive_emoticons(self, untagged, average_value=0):
# not used
# this function counts the amount of positive emoticons
if average_value == 3:
sum_vector = 0
for single_text in untagged:
# print single_text, len(single_text)
counter = 0
# tokens = word_tokenize(untagged)
tokens = single_text.split()
list_of_unicode_smilies = [u'😀', u'😁', u'😂', u'😃',u'😄',u'😅',u'😆', u'😉',u'😊',u'😋', u'😎',u'🙋',u'😸', u'😛',u'?']
for token in tokens:
# m = re.match("(\:|\;)(\=|c|\-\o)*(\]|\)|\D|\*)*", token)
m = re.match("((?::|;|<)(?:-|,)?(?:\)|D|3))", token)
if m: # or m2
#print token
counter += 1
if token in list_of_unicode_smilies:
#print token
counter += 1
if (counter > 0) and (len(tokens) > 1):
average = float(counter) / len(tokens)
sum_vector += average
else:
average = 0
sum_vector += average
if len(untagged) > 0:
average_return = sum_vector / len(untagged)
else:
average_return = sum_vector
# print average_return
return average_return
else:
counter = 0
tokens = untagged.split()
list_of_unicode_smilies = [u'😀', u'😁', u'😂', u'😃',u'😄',u'😅',u'😆', u'😉',u'😊',u'😋', u'😎',u'🙋',u'😸', u'😛',u'?']
for token in tokens:
# m = re.match("(\:|\;)(\=|c|\-\o)*(\]|\)|\D|\*)*", token)
m = re.match("((?::|;|<)(?:-|,)?(?:\)|D|3))", token)
if m: # or m2
#print token
counter += 1
if token in list_of_unicode_smilies:
#print token
counter += 1
if average_value == 0:
return counter
elif average_value == 1:
if (counter > 0) and (len(tokens) > 3):
average = float(counter) / len(tokens)
else:
average = 0
return average
# not used
# this function counts the amount of negative emoticons
def negative_emoticons(self, untagged, average_value = 0):
if average_value == 3:
sum_vector = 0
for single_text in untagged:
counter = 0
tokens = single_text.split()
#print tokens[0]
list_of_unicode_smilies = [u'😒', u'😕', u'😟', u'😠', u'😞', u'😢', u'😦', u'😧', u'😬', u'😿', u'🙎']
for token in tokens:
# m = re.match("(\:|\;)(\=|c|\-\o)*(\]|\)|\D)*", token)
#m = re.match("((?::|;|=|D)(?:-)?(?:\(|\\|x|X|8|c|\[|\:))", token) # |\:
m = re.match("((?::|;|=|D)(?:-)?(?:\(|\\|x|X|8|c|C|\[|\:))", token) # |\:
#m2 = re.match("\<\3*", token)
if m : # or m2
#print token
counter += 1
if token in list_of_unicode_smilies:
#print token
counter += 1
if (counter > 0) and (len(tokens) > 1):
average = float(counter) / len(tokens)
sum_vector += average
else:
average = 0
sum_vector += average
if len(untagged) > 0:
average_return = sum_vector / len(untagged)
else:
average_return = sum_vector
return average_return
else:
counter = 0
tokens = untagged.split()
#print tokens[0]
list_of_unicode_smilies = [u'😒', u'😕', u'😟', u'😠', u'😞', u'😢', u'😦', u'😧', u'😬', u'😿', u'🙎']
for token in tokens:
# m = re.match("(\:|\;)(\=|c|\-\o)*(\]|\)|\D)*", token)
#m = re.match("((?::|;|=|D)(?:-)?(?:\(|\\|x|X|8|c|\[|\:))", token) # |\:
m = re.match("((?::|;|=|D)(?:-)?(?:\(|\\|x|X|8|c|C|\[|\:))", token) # |\:
#m2 = re.match("\<\3*", token)
if m : # or m2
#print token
counter += 1
if token in list_of_unicode_smilies:
#print token
counter += 1
if average_value == 0:
return counter
elif average_value == 1:
if (counter > 0) and (len(tokens) > 3):
average = float(counter) / len(tokens)
else:
average = 0
return average
# not used
# this function counts the amount of neutral emoticons
def neutral_emoticons(self, untagged, average_value = 0):
if average_value == 3:
sum_vector = 0
for single_text in untagged:
counter = 0
list_of_unicode_smilies = [u'😐', u'😑', u'😶']
tokens = single_text.split()
#print tokens[0]
for token in tokens:
# m = re.match("(\:|\;)(\=|c|\-\o)*(\]|\)|\D|\*)*", token)
m = re.match("((?::|=|<)(?:-)?(?:\||o|O))", token)
# m2 = re.match("\<\3*", token)
if m:
counter += 1
if token in list_of_unicode_smilies:
counter += 1
if (counter > 0) and (len(tokens) > 1):
average = float(counter) / len(tokens)
sum_vector += average
else:
average = 0
sum_vector += average
if len(untagged) > 0:
average_return = sum_vector / len(untagged)
else:
average_return = sum_vector
return average_return
else:
counter = 0
list_of_unicode_smilies = [u'😐', u'😑', u'😶']
tokens = untagged.split()
#print tokens[0]
for token in tokens:
# m = re.match("(\:|\;)(\=|c|\-\o)*(\]|\)|\D|\*)*", token)
m = re.match("((?::|=|<)(?:-)?(?:\||o|O))", token)
# m2 = re.match("\<\3*", token)
if m:
counter += 1
if token in list_of_unicode_smilies:
counter += 1
if average_value == 0:
return counter
elif average_value == 1:
if (counter > 0) and (len(tokens) > 3):
average = float(counter) / len(tokens)
else:
average = 0
return average
# not used
    def quotation(self, untagged):
counter = 0
# regex to find all quoted content
prog = re.compile(r'''".*?"''')
# count all occurrences of quoted content
counter = len(prog.findall(untagged))
# test
#if counter != 0 :
# print prog.findall(conversation)
# print counter
return counter
# count all punctuation marks, without special characters
# we use a list of unicode characters because this
# solution works better than with 'string.punctuation'
def general_punctuation_new(self, untagged, average_value=0):
if average_value == 3:
pass
else:
counter = 0
#print conversation
count = lambda l1,l2: sum([1 for x in l1 if x in l2])
counter = count(untagged,set(punctuation))
'''
if counter > 2:
print untagged
print counter
'''
if average_value == 0:
# print counter
return counter
elif average_value == 1:
if len(untagged) > 0:
result = float(counter) / len(untagged.split())
# print result
return result
else:
return 0
elif average_value == 5:
return self.counter_pre_scaling_char(counter, len(untagged))
def general_punctuation(self, untagged, average_value=0):
if average_value == 3:
pass
else:
list_of_punct = [u"\u0021", # exclamation mark
u"\u002E", # fullstop
u"\u002D", # hyphen
u"\u003B", # semicolon
u"\u0337", u"\u0338", u"\u002F",u"\u005C", # slash, solidus
u"\u003F" , # question mark
u"\u005B" ,u"\u005D" ,u"\u007B",u"\u007D",u"\u0028",u"\u0029", #brackets
u"\u0084" ,u"\u0022", u"\u00BB", u"\u00AB" # quotation marks
u"\u2024" ,u"\u2025" ,u"\u2026", # ellipsis
u"\u2012" ,u"\u2013" ,u"\u2014" ,u"\u2015" , # dash
u"\u2018" ,u"\u2019" ,u"\u2020" ,u"\u201A" ,u"\u201B" , # quotation line 1
u"\u201C" ,u"\u201D" ,u"\u201E" ,u"\u201F" ,# quotation
u"\u003A" , # colon
u"\u002C", u"\u02BD",u"\u02BB", u"\u0312",u"\u0313",u"\u0314",u"\u0315",# comma
u"\u02BC" ,u"\u0027",# apostrophe
u"\u02EE" , #double apostrophe
# special characters
u"\u2010" , u"\u2011" , u"\u2012", u"\u2013", u"\u2014",u"\u2015",
u"\u2016", u"\u2017",u"\u2018",u"\u2020",u"\u201A", u"\u201B",
u"\u201C", u"\u201D", u"\u201E", u"\u201F", u"\u2020", u"\u2021",
u"\u2022", u"\u2023", u"\u2024", u"\u2025", u"\u2026", u"\u2027",
u"\u2028", u"\u2029", u"\u2030", u"\u2031", u"\u2032", u"\u2033",
u"\u2034", u"\u2035", u"\u2036", u"\u2037", u"\u2038", u"\u2039",
u"\u2040", u"\u2041", u"\u2042", u"\u2043", u"\u2044", u"\u2045",
u"\u2046", u"\u2047", u"\u2048", u"\u2049", u"\u2050", u"\u2051",
u"\u2052", u"\u2053", u"\u2054", u"\u2055", u"\u2056", u"\u2057",
u"\u2058", u"\u2059", u"\u2060", u"\u2061", u"\u2062", u"\u2063",
u"\u2064", u"\u2065", u"\u2066", u"\u2067", u"\u2068", u"\u2069",
u"\u206A", u"\u206B", u"\u206C", u"\u206D", u"\u206E", u"\u206F",
u"\u205A", u"\u205B", u"\u205C", u"\u205D", u"\u205E", u"\u205F",
u"\u204A", u"\u204B", u"\u204C", u"\u204D", u"\u204E", u"\u204F",
u"\u203A", u"\u203B", u"\u203C", u"\u203D", u"\u203E", u"\u203F",
u"\u202A", u"\u202B", u"\u202C", u"\u202D", u"\u202E", u"\u202F"
]
counter = 0
for x in untagged:
for char in x:
# print char
if char in list_of_punct:
counter += 1
if average_value == 0:
# print counter
return counter
elif average_value == 1:
if len(untagged) > 0:
result = float(counter) / len(untagged)
# print result
return result
else:
return 0
def special_characters(self, untagged):
# get all special characters, that are not included in the general_punctuation feature
list_of_punct = [u"\u2010" , u"\u2011" , u"\u2012", u"\u2013", u"\u2014",u"\u2015",
u"\u2016", u"\u2017",u"\u2018",u"\u2020",u"\u201A", u"\u201B",
u"\u201C", u"\u201D", u"\u201E", u"\u201F", u"\u2020", u"\u2021",
u"\u2022", u"\u2023", u"\u2024", u"\u2025", u"\u2026", u"\u2027",
u"\u2028", u"\u2029", u"\u2030", u"\u2031", u"\u2032", u"\u2033",
u"\u2034", u"\u2035", u"\u2036", u"\u2037", u"\u2038", u"\u2039",
u"\u2040", u"\u2041", u"\u2042", u"\u2043", u"\u2044", u"\u2045",
u"\u2046", u"\u2047", u"\u2048", u"\u2049", u"\u2050", u"\u2051",
u"\u2052", u"\u2053", u"\u2054", u"\u2055", u"\u2056", u"\u2057",
u"\u2058", u"\u2059", u"\u2060", u"\u2061", u"\u2062", u"\u2063",
u"\u2064", u"\u2065", u"\u2066", u"\u2067", u"\u2068", u"\u2069",
u"\u206A", u"\u206B", u"\u206C", u"\u206D", u"\u206E", u"\u206F",
u"\u205A", u"\u205B", u"\u205C", u"\u205D", u"\u205E", u"\u205F",
u"\u204A", u"\u204B", u"\u204C", u"\u204D", u"\u204E", u"\u204F",
u"\u203A", u"\u203B", u"\u203C", u"\u203D", u"\u203E", u"\u203F",
u"\u202A", u"\u202B", u"\u202C", u"\u202D", u"\u202E", u"\u202F"
]
counter = 0
for x in untagged:
if x in list_of_punct:
counter += 1
#print x
#print counter
return counter
##################### Gender Preferential Features START #############################
def stylistic_ending_custom(self, untagged, average_value = 0, custom_ending=None):
# count all words that end with -able, -ful, -al, -ible, -ic, -ive, -less, -ous
## these features are introduced in http://www.aclweb.org/anthology/D10-1021
if average_value == 3:
sum_value = 0
for text in untagged:
# tokens = word_tokenize(untagged)
tokens = text.split()
counter = 0
for x in tokens:
if x.endswith(custom_ending):
counter += 1
sum_value += counter
if len(untagged) > 0:
average_inner = sum_value / len(untagged)
else:
average_inner = sum_value
return average_inner
else:
tokens = word_tokenize(untagged)
'''
counter = 0
for x in tokens:
if x.endswith(custom_ending):
counter += 1
'''
my_regex = r'(^|\b)(\w+'+custom_ending+r')(\s|$)'
custom_regex = re.compile(my_regex, re.IGNORECASE)
counter = len(custom_regex.findall(untagged))
if average_value == 0:
# print counter
return counter
elif average_value == 1:
if (counter > 0) and (len(tokens) > 0):
average = float(counter) / len(tokens)
else:
average = 0
# print average
return average
elif average_value == 5:
return self.counter_pre_scaling(counter, len(tokens))
##################### Gender Preferential Features END #############################
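The suffix-counting regex in stylistic_ending_custom can be sketched in isolation; `count_suffix` is my own name, and `re.escape` guards the suffix, which the method above interpolates unescaped:

```python
import re

def count_suffix(text, suffix):
    # Count word tokens ending in `suffix` (e.g. 'able', 'ful'),
    # mirroring the regex built in stylistic_ending_custom.
    pattern = re.compile(r'\b\w+' + re.escape(suffix) + r'\b', re.IGNORECASE)
    return len(pattern.findall(text))
```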
# calculate token/type ratio
def type_token_ratio(self, untagged, average_value = 0):
if average_value == 3:
sum_value = 0
for text in untagged:
# tokenize the conversation
# tokens = word_tokenize(text)
tokens = text.split()
relevant_tokens = []
for x in tokens:
if len(x) > 2:
relevant_tokens.append(x)
types = set(relevant_tokens)
if (len(types)) != 0:
ratio = float(float(len(types)) / float(len(relevant_tokens)) * 100)
else:
ratio = 0
sum_value += ratio
if len(untagged) > 0:
average_inner = sum_value / len(untagged)
return average_inner
else:
return sum_value
else:
# tokenize the conversation
tokens = word_tokenize(untagged)
relevant_tokens = []
for x in tokens:
if len(x) > 2:
relevant_tokens.append(x)
types = set(relevant_tokens)
if (len(types)) != 0:
ratio = float(float(len(types)) / float(len(relevant_tokens)) * 100)
else:
ratio = 0
return ratio
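The type/token ratio computed above can be sketched compactly; this standalone version is mine and uses a plain whitespace split instead of NLTK's word_tokenize:

```python
def type_token_ratio(text, min_len=3):
    # Unique tokens as a percentage of all tokens of at least min_len
    # characters (mirrors the method above: tokens of length <= 2 are
    # dropped before computing the ratio).
    tokens = [t for t in text.split() if len(t) >= min_len]
    if not tokens:
        return 0.0
    return 100.0 * len(set(tokens)) / len(tokens)
```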
def amount_of_tokens(self, untagged):
splited_conversation = untagged.split()
words = []
for x in splited_conversation:
x = re.sub('[\.!\?\,\:\'\"]', '', x)
if (len(x) >= 1):
words.append(x)
#print len(splited_conversation)
return len(words)
def amount_of_types(self, untagged):
splited_conversation = untagged.split()
words = []
for x in splited_conversation:
x = re.sub('[\.!\?\,\:\'\"]', '', x)
if (len(x) >= 1):
words.append(x)
#print len(splited_conversation)
set_words = set(words)
return len(set_words)
# not used
# this feature was suggested in:
# 'Forensic Psycholinguistics. Using Language Analysis for Identifying and Assessing Offenders'
# http://diogenesllc.com/statementlinguistics.pdf
def amount_sorryWords(self, untagged, average_value=0):
if average_value == 3:
pass
else:
tokens = untagged.split()
result = re.findall(r"sorry", untagged)
if average_value == 0:
# print len(result)
return len(result)
elif average_value == 1:
if (len(result) > 0) and (len(tokens) > 0):
average = float(len(result)) / len(tokens)
else:
average = 0
# print average
return average
def average_wordlength(self, untagged, average_value=0):
# calculate average word length
if average_value == 3:
# add funct for PAN
pass
else:
if len(untagged.split()) == 0:
average_word_length = 0
else:
average_word_length = numpy.mean([len(word) for word in untagged.split()])
if isinstance(average_word_length, float):
pass
else:
average_word_length = 0
return average_word_length
def words_capitalized(self, untagged, average_value = 5):
# count capitalized words
if average_value == 3:
sum_value = 0
for text in untagged:
# tokenize the conversation
# tokens = word_tokenize(untagged)
tokens = text.split()
# amount_words = len(tokens)
counter = 0
for word in tokens:
if word[0].isupper():
counter += 1
sum_value += counter
if len(untagged) > 0:
average_inner = sum_value / len(untagged)
else:
average_inner = sum_value
return average_inner
else:
# tokenize the conversation
tokens = untagged.split()
counter = 0
for index, word in enumerate(tokens):
# print word
if word[0].isupper() and word != 'URL' and index != 0:
counter += 1
if average_value == 0:
# print counter
return counter
elif average_value == 1:
if (counter > 0) and (len(tokens) > 0):
average = float(counter) / len(tokens)
else:
average = 0
# print average
return average
elif average_value == 5:
return self.counter_pre_scaling(counter, len(tokens))
def AllCaps(self, untagged, average_value=5):
# counts words with all capital letters
# tokenize the conversation
tokens = word_tokenize(untagged)
counter = 0
for word in tokens:
if word.isupper() and not word == 'URL' and not word == 'USER': # or [URL]
counter += 1
#print word
if average_value == 0:
# print counter
return counter
elif average_value == 1:
if (counter > 0) and (len(tokens) > 0):
average = float(counter) / len(tokens)
else:
average = 0
# print average
return average
elif average_value == 5:
return self.counter_pre_scaling(counter, len(tokens))
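The AllCaps check can likewise be sketched as a standalone helper; `count_all_caps` is my own name, and it uses a plain split rather than word_tokenize:

```python
def count_all_caps(text, exclude=("URL", "USER")):
    # Count fully upper-case tokens, skipping the placeholder tags
    # that the corpus substitutes for links and mentions.
    return sum(1 for w in text.split()
               if w.isupper() and w not in exclude)
```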
############## Features that require Pos_tags START #########################
def count_stop_words(self, untagged, average_value = 0):
# import nltk
# nltk.download()
# print len(stopwords.words('english'))
return self.word_counter_feature(self.cachedStopWords, untagged, average_value)
    def each_part_of_speech(self, tagged, average_value=0, custom_pos=None):
        ### all POS tags are based on TreeTagger tags
        ### in TreeTagger, each language has a different tag set
        pos_tags = {
            "en": {
                "nouns": ["NN", "NNP", "NNPS", "NNS"],
                "adjectives": ["JJ", "JJR", "JJS"],
                "determiner": ["DT", "WDT", "PDT"],
                "conjunctions": ["CC", "IN"],
                "pronouns": ["PRP", "PRP$", "WP", "WP$"],
                "verbs": ["VB", "VBD", "VBG", "VBN", "VBP", "VBZ"],
                "adverbs": ["RB", "RBR", "RBS"],
                "modals": ["MD"],
                "interjections": ["UH"],
                "to_pos": ["TO"],
                "cardinal_num": ["CD"],
            },
            "nl": {
                "nouns": ["noun*kop", "nounabbr", "nounpl", "nounprop", "nounsg"],
                "adjectives": ["adj", "adj*kop", "adjabbr"],
                "determiner": ["det__demo", "prondemo", "det__art"],
                "conjunctions": ["conjcoord", "conjsubo"],
                "pronouns": ["pronindef", "pronpers", "pronposs", "pronquest", "pronrefl", "pronrel"],
                "verbs": ["verbinf", "verbpapa", "verbpastpl", "verbpastsg", "verbpresp", "verbprespl", "verbpressg"],
                "adverbs": ["adv", "advabbr", "pronadv"],
                # no modal tag in the Dutch tag set; mapped to the particle "te"
                "modals": ["partte"],
                "interjections": ["int"],
                # mapped to prepositions
                "to_pos": ["prep", "prepabbr"],
                "cardinal_num": ["num__card", "num__ord"],
            },
            "es": {
                "nouns": ["NN", "NP", "NC", "NMEA", "PAL", "PDEL", "PE"],
                "adjectives": ["ADJ"],
                "determiner": ["DM"],
                "conjunctions": ["CC", "CCAD", "CQUE", "CSUBF", "CSUBI", "CSUBX"],
                "pronouns": ["DM", "INT", "PPC", "PPO", "PPX", "REL"],
                "verbs": ["VCLIger", "VCLIinf", "VCLIfin", "VEadj", "VEfin", "VEger", "VEinf",
                          "VHadj", "VHfin", "VHger", "VHinf", "VLadj", "VLfin", "VLger", "VLinf",
                          "VSadj", "VSfin", "VSger", "VSinf"],
                "adverbs": ["ADV"],
                "modals": ["VMadj", "VMfin", "VMger", "VMinf"],
                "interjections": ["ITJN"],
                # mapped to prepositions
                "to_pos": ["PREP", "PREP/DEL"],
                "cardinal_num": ["CARD"],
            },
        }
        if self.language not in pos_tags or average_value == 3:
            return None
        splitted_text = tagged.split()
        list_to_find = pos_tags[self.language][custom_pos]
        custom_regex = re.compile(self.regex_str(list_to_find), re.IGNORECASE)
        pos_counter = len(custom_regex.findall(tagged))
        if average_value == 0:
            return pos_counter
        elif average_value == 1:
            if len(splitted_text) > 0:
                return float(pos_counter) / len(splitted_text)
            return 0
        elif average_value == 5:
            return self.counter_pre_scaling(pos_counter, len(splitted_text))
    def counter_pre_scaling_char(self, counter, text_length):
        '''
        scales a raw counter by a text-length bucket, in characters
        tweets: min 6, mean 80, max 320
        review: min 5, mean 410, max 16673
        blogs: min 1, mean 3058, max 64790
        '''
        for index, element in enumerate(self.calc_chars):
            if text_length < element:
                return float(counter) / (index + 1)
        # text is at least as long as the largest threshold
        if len(self.calc_chars) > 0:
            return float(counter) / len(self.calc_chars)
        return float(counter)

    def counter_pre_scaling(self, counter, text_length):
        '''
        scales a raw counter by a text-length bucket, in words
        tweets: min 2, mean 13, max 33
        review: min 3, mean 75, max 2955
        blogs: min 1, mean 508, max 10424
        '''
        for index, element in enumerate(self.calc_words):
            if text_length < element:
                return float(counter) / (index + 1)
        # text is at least as long as the largest threshold
        if len(self.calc_words) > 0:
            return float(counter) / len(self.calc_words)
        return float(counter)
    def word_counter_feature(self, word_list, text_sample, average_value):
        # feature to count all occurrences of given words in a text sample
        connective = re.compile(self.regex_str(word_list), re.IGNORECASE)
        if average_value == 3:
            # average over all texts per author; text_sample is a list of texts here
            sum_vector = 0
            for single_text in text_sample:
                tokens = word_tokenize(single_text)
                counter = len(connective.findall(single_text))
                if (counter > 0) and (len(tokens) > 1):
                    sum_vector += float(counter) / len(tokens)
            if len(text_sample) > 0:
                return sum_vector / len(text_sample)
            return sum_vector
        elif average_value == 4:
            # binary indicator vector: 1 where a token matches a word from the list
            feature_vector = list()
            tokens = word_tokenize(text_sample)
            for token1 in tokens:
                for token2 in word_list:
                    feature_vector.append(1 if token1 == token2 else 0)
            return feature_vector
        else:
            counter = len(connective.findall(text_sample))
            if average_value == 0:
                return counter
            elif average_value == 1:
                tokens = text_sample.split()
                if (counter > 0) and (len(tokens) > 1):
                    return float(counter) / len(tokens)
                return 0
            elif average_value == 5:
                return self.counter_pre_scaling(counter, len(text_sample.split()))
    def lexical_Fmeasure_new(self, tagged):
        # This feature calculates how implicit or explicit the text is.
        # It is a unitary measure of the text's relative contextuality in opposition to its formality.
        # The feature is suggested in this paper: http://www.aclweb.org/anthology/D10-1021
        # list of all Penn Treebank tags: http://www.ling.upenn.edu/courses/Fall_2003/ling001/penn_treebank_pos.html
        f_measure_tags = {
            "en": {
                "nouns": ["NN", "NNP", "NNPS", "NNS"],
                "adjectives": ["JJ", "JJR", "JJS"],
                "articles": ["DT"],
                "prepositions": ["IN"],
                "pronouns": ["PRP", "PRP$", "WP", "WP$"],
                "verbs": ["VB", "VBD", "VBG", "VBN", "VBP", "VBZ"],
                "adverbs": ["RB", "RBR", "RBS"],
                "interjections": ["UH"],
            },
            "nl": {
                "nouns": ["noun*kop", "nounabbr", "nounpl", "nounprop", "nounsg"],
                "adjectives": ["adj", "adj*kop", "adjabbr"],
                "articles": ["det__demo", "prondemo", "det__art"],
                "prepositions": ["prep", "prepabbr"],
                "pronouns": ["pronindef", "pronpers", "pronposs", "pronquest", "pronrefl", "pronrel"],
                "verbs": ["verbinf", "verbpapa", "verbpastpl", "verbpastsg", "verbpresp", "verbprespl", "verbpressg"],
                "adverbs": ["adv", "advabbr", "pronadv"],
                "interjections": ["int"],
            },
            "es": {
                "nouns": ["NN", "NP", "NC", "NMEA", "PAL", "PDEL", "PE"],
                "adjectives": ["ADJ"],
                "articles": ["DM"],
                "prepositions": ["PREP", "PREP/DEL"],
                "pronouns": ["DM", "INT", "PPC", "PPO", "PPX", "REL"],
                "verbs": ["VCLIger", "VCLIinf", "VCLIfin", "VEadj", "VEfin", "VEger", "VEinf",
                          "VHadj", "VHfin", "VHger", "VHinf", "VLadj", "VLfin", "VLger", "VLinf",
                          "VSadj", "VSfin", "VSger", "VSinf", "VMadj", "VMfin", "VMger", "VMinf"],
                "adverbs": ["ADV"],
                "interjections": ["ITJN"],
            },
        }
        if self.language not in f_measure_tags:
            return None
        splitted_text = tagged.split()
        counts = {}
        for category, tag_list in f_measure_tags[self.language].items():
            custom_regex = re.compile(self.regex_str(tag_list), re.IGNORECASE)
            counts[category] = self.counter_pre_scaling(len(custom_regex.findall(tagged)), len(splitted_text))
        F = 0.5 * ((counts["nouns"] + counts["adjectives"] + counts["prepositions"] + counts["articles"])
                   - (counts["pronouns"] + counts["verbs"] + counts["adverbs"] + counts["interjections"]) + 100)
        return F
    def plurality(self, tagged, average_value=0):
        # This feature calculates the amount of plural nouns
        # suggested in http://ischool.syr.edu/media/documents/2013/2/beiyujan2013.pdf
        plural_tags = {
            "en": ["NNS", "NNPS"],
            "nl": ["nounpl", "verbpastpl", "verbprespl"],
            # for Spanish modified to:
            # ALFP  plural letter of the alphabet (As, bs)
            # SE    se (as particle)
            "es": ["ALFP", "SE"],
        }
        if self.language not in plural_tags or average_value == 3:
            return None
        splitted_text = tagged.split()
        custom_regex = re.compile(self.regex_str(plural_tags[self.language]), re.IGNORECASE)
        counter = len(custom_regex.findall(tagged))
        if average_value == 0:
            return counter
        elif average_value == 1:
            if (len(splitted_text) > 0) and (counter > 0):
                return float(counter) / len(splitted_text)
            return 0
        elif average_value == 5:
            return self.counter_pre_scaling(counter, len(splitted_text))
    # count all vowels
    def find_vowels(self, untagged):
        from collections import Counter
        vowels = "aeiouAEIOU"
        c = Counter(untagged)
        return sum(c[v] for v in vowels)
import numpy as np
import random

# game-over check for a single 3x3 tic-tac-toe board
def game_over(state):
    for symbol in "XO":
        if (state == symbol).all(axis=0).any():
            return (True, symbol)
        if (state == symbol).all(axis=1).any():
            return (True, symbol)
        if np.diag(state == symbol).all():
            return (True, symbol)
        if np.diag(np.rot90(state) == symbol).all():
            return (True, symbol)
    if not (state == "_").any():
        return (True, symbol)
    return (False, symbol)
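# A quick self-contained illustration of the win-detection idiom used above,
# on a hypothetical sample board (not part of the game flow): comparing the
# whole board to one symbol gives a boolean array, .all(axis=0) checks for a
# complete column, and np.diag checks the main diagonal.
_demo_board = np.array([["X", "O", "_"],
                        ["X", "O", "_"],
                        ["X", "_", "_"]])
assert (_demo_board == "X").all(axis=0).any()      # X holds a full column
assert not (_demo_board == "O").all(axis=0).any()  # O's column is incomplete
assert not np.diag(_demo_board == "X").all()       # no X win on the diagonal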
# game-over check for the whole (3x3 of 3x3) tic-tac-toe board
def game_over_out(state):
    # ROW
    for i in range(3):
        a = (state[i] == 'O').all(axis=1)
        b = (state[i] == 'O').all(axis=2)
        if (not (a == False).any()) and (not (b == False).any()):
            return True
    for i in range(3):
        a = (state[i] == 'X').all(axis=1)
        b = (state[i] == 'X').all(axis=2)
        if (not (a == False).any()) and (not (b == False).any()):
            return True
    # COL
    a = (state == 'X').all(axis=3)
    b = (a == True).all(axis=2)
    if (b == True).all(axis=0).any():
        return True
    a = (state == 'O').all(axis=3)
    b = (a == True).all(axis=2)
    if (b == True).all(axis=0).any():
        return True
    # DIAG
    if (not (state[0][0] != state[1][1]).any()) and (not (state[1][1] != state[2][2]).any()):
        if state[0][0][0][0] != '_':
            return True
    if (not (state[0][2] != state[1][1]).any()) and (not (state[1][1] != state[2][0]).any()):
        if state[0][2][0][0] != '_':
            return True
    # TIE
    if not (state == "_").any():
        return True
    return False
# returns the score of a board: -1 for an X win, +1 for an O win, 0 otherwise
def score(state):
    for (symbol, sign) in zip("XO", [-1, +1]):
        if (state == symbol).all(axis=0).any(): return sign
        if (state == symbol).all(axis=1).any(): return sign
        if np.diag(state == symbol).all(): return sign
        if np.diag(np.rot90(state == symbol)).all(): return sign
    return 0
# AI X's move action choice
def ai_2_turn(state):
    max_score = float("-inf")
    best_action = (0, 0)
    nodes_num = 0
    min_depth = -float('inf')
    for i in range(3):
        for j in range(3):
            if state[i][j] == "_":
                state[i][j] = "X"
                value, nodes_num, depth = minimax_ai2(state, False, float("-inf"), float("+inf"), nodes_num, 0)
                unique, counts = np.unique(state, return_counts=True)
                xo_num = dict(zip(unique, counts))
                if 'O' in xo_num and xo_num['O'] > xo_num['X']:
                    state[i][j] = "_"
                    if value >= max_score and depth > min_depth:
                        max_score = value
                        min_depth = depth
                        best_action = (i, j)
                else:
                    state[i][j] = "_"
                    if value > max_score:
                        max_score = value
                        best_action = (i, j)
    state[best_action[0]][best_action[1]] = "X"
    return nodes_num, best_action
# returns the score from X's perspective: -1 for an O win, +1 for an X win, 0 otherwise
def score_ai2(state):
    for (symbol, sign) in zip("OX", [-1, +1]):
        if (state == symbol).all(axis=0).any(): return sign
        if (state == symbol).all(axis=1).any(): return sign
        if np.diag(state == symbol).all(): return sign
        if np.diag(np.rot90(state == symbol)).all(): return sign
    return 0
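# Small sanity demo of the sign convention (illustrative only): score_ai2
# walks zip("OX", [-1, +1]), so an O win maps to -1 and an X win maps to +1,
# mirroring score() above, which assigns the signs the other way around.
_signs_x = dict(zip("OX", [-1, +1]))
_signs_o = dict(zip("XO", [-1, +1]))
assert _signs_x["X"] == +1 and _signs_x["O"] == -1
assert _signs_o["O"] == +1 and _signs_o["X"] == -1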
# minimax algorithm with alpha-beta pruning, scoring from X's perspective
def minimax_ai2(state, max_turn, alpha, beta, nodes_num, depth):
    if game_over(state)[0]:
        nodes_num += 1
        return (score_ai2(state), nodes_num, depth)
    if max_turn == True:
        max_score = float("-inf")
        for i in range(3):
            for j in range(3):
                if state[i][j] == "_":
                    state[i][j] = "X"
                    value, nodes_num, depth = minimax_ai2(state, False, alpha, beta, nodes_num, depth + 1)
                    state[i][j] = "_"
                    max_score = max(value, max_score)
                    alpha = max(max_score, alpha)
                    if max_score >= beta:
                        return (max_score, nodes_num, depth)
        return (max_score, nodes_num, depth)
    else:
        min_score = float("inf")
        for i in range(3):
            for j in range(3):
                if state[i][j] == "_":
                    state[i][j] = "O"
                    value, nodes_num, depth = minimax_ai2(state, True, alpha, beta, nodes_num, depth + 1)
                    state[i][j] = "_"
                    min_score = min(value, min_score)
                    beta = min(min_score, beta)
                    if min_score <= alpha:
                        return (min_score, nodes_num, depth)
        return (min_score, nodes_num, depth)
# AI O's move action choice
def ai_1_turn(state):
    max_score = float("-inf")
    best_action = (0, 0)
    nodes_num = 0
    min_depth = -float('inf')
    for i in range(3):
        for j in range(3):
            if state[i][j] == "_":
                state[i][j] = "O"
                value, nodes_num, depth = minimax(state, False, float("-inf"), float("+inf"), nodes_num, 0)
                unique, counts = np.unique(state, return_counts=True)
                xo_num = dict(zip(unique, counts))
                if 'X' in xo_num and xo_num['X'] > xo_num['O']:
                    state[i][j] = "_"
                    if value >= max_score and depth > min_depth:
                        max_score = value
                        min_depth = depth
                        best_action = (i, j)
                else:
                    state[i][j] = "_"
                    if value > max_score:
                        max_score = value
                        best_action = (i, j)
    state[best_action[0]][best_action[1]] = "O"
    return nodes_num, best_action
# AI's move/action choice (O), breaking score ties in favor of deeper search depth
def ai_turn(state):
    max_score = float("-inf")
    best_action = (0, 0)
    nodes_num = 0
    min_depth = -float('inf')
    for i in range(3):
        for j in range(3):
            if state[i][j] == "_":
                state[i][j] = "O"
                value, nodes_num, depth = minimax(state, False, float("-inf"), float("+inf"), nodes_num, 0)
                unique, counts = np.unique(state, return_counts=True)
                xo_num = dict(zip(unique, counts))
                if 'O' not in xo_num:
                    if 'X' in xo_num and (xo_num['X'] > 1):
                        state[i][j] = "_"
                        if value >= max_score and depth > min_depth:
                            max_score = value
                            min_depth = depth
                            best_action = (i, j)
                    else:
                        state[i][j] = "_"
                        if value > max_score:
                            max_score = value
                            best_action = (i, j)
                else:
                    if 'X' in xo_num and xo_num['X'] > xo_num['O']:
                        state[i][j] = "_"
                        if value >= max_score and depth > min_depth:
                            max_score = value
                            min_depth = depth
                            best_action = (i, j)
                    else:
                        state[i][j] = "_"
                        if value > max_score:
                            max_score = value
                            best_action = (i, j)
    state[best_action[0]][best_action[1]] = "O"
    return nodes_num
# Minimax algorithm with alpha-beta pruning (AI-1 maximizes as O)
def minimax(state, max_turn, alpha, beta, nodes_num, depth):
    if game_over(state)[0]:
        nodes_num += 1
        return (score(state), nodes_num, depth)
    if max_turn:
        max_score = float("-inf")
        for i in range(3):
            for j in range(3):
                if state[i][j] == "_":
                    state[i][j] = "O"
                    value, nodes_num, depth = minimax(state, False, alpha, beta, nodes_num, depth + 1)
                    state[i][j] = "_"
                    max_score = max(value, max_score)
                    alpha = max(max_score, alpha)
                    if max_score >= beta:
                        return (max_score, nodes_num, depth)
        return (max_score, nodes_num, depth)
    else:
        min_score = float("inf")
        for i in range(3):
            for j in range(3):
                if state[i][j] == "_":
                    state[i][j] = "X"
                    value, nodes_num, depth = minimax(state, True, alpha, beta, nodes_num, depth + 1)
                    state[i][j] = "_"
                    min_score = min(value, min_score)
                    beta = min(min_score, beta)
                    if min_score <= alpha:
                        return (min_score, nodes_num, depth)
        return (min_score, nodes_num, depth)
# Pretty-print the 3x3 grid of 3x3 sub-boards, one large row at a time
def print_func(state):
    for i in range(3):
        print('{0} | {1} | {2}\n'.format(state[0][0][i], state[0][1][i], state[0][2][i]))
    print('--------------------------------------------')
    for i in range(3):
        print('{0} | {1} | {2}\n'.format(state[1][0][i], state[1][1][i], state[1][2][i]))
    print('--------------------------------------------')
    for i in range(3):
        print('{0} | {1} | {2}\n'.format(state[2][0][i], state[2][1][i], state[2][2][i]))
    print('--------------------------------------------')
if __name__ == "__main__":
    # @yxiao09
    # AI is O, player is X
    choice_ai = input('''Please choose the difficulty of the AI:
3. tree-based AI vs tree-based AI
4. tree-based AI vs baseline AI
''')
    while True:
        try:
            choice_ai = int(choice_ai)
            break
        except ValueError:
            choice_ai = input('Please input an integer between 1 and 4.\n'
                              'Please choose the difficulty of the AI:\n'
                              ' 1. tree-based one\n'
                              ' 2. "baseline" AI\n'
                              ' 3. tree-based AI vs tree-based AI\n'
                              ' 4. tree-based AI vs baseline AI\n')
    if choice_ai == 3:
        print('============================================')
        print('tree-based AI vs tree-based AI')
        print('=============================================')
    elif choice_ai == 4:
        print('============================================')
        print('tree-based AI vs baseline AI')
        print('=============================================')
    ai1, ai2 = 0, 0
    base_ai, tree_ai = 0, 0
    # Random obstacles
    obs_nums = input('\nPlease input the number of obstacles you want, as an integer between 0 and 4\n'
                     '0 - no obstacles, 1 - one obstacle, 2 - two obstacles, 3 - three obstacles, 4 - four obstacles:\n')
    obs_nums = int(obs_nums)
    '''
    random_out = []
    random_inner = []
    for i in range(obs_nums):
        random_out.append((random.randint(0, 2), random.randint(0, 2)))
        random_inner.append((random.randint(0, 2), random.randint(0, 2)))
    '''
    count = input("Please input the number of games you want the AI to play:\n")
    count = int(count)
    while (choice_ai == 3 and count > 0) or (choice_ai == 4 and count > 0):
        if choice_ai == 3:
            state = np.array([
                [[["_" for _ in range(3)] for _ in range(3)]
                 for large_column in range(3)]
                for large_row in range(3)])
            # Random obstacles
            state[random.randint(0, 1)][random.randint(0, 2)][random.randint(0, 2)][random.randint(0, 2)] = 'P'
            state[random.randint(0, 2)][random.randint(0, 2)][random.randint(0, 2)][random.randint(0, 2)] = 'P'
            state[random.randint(0, 2)][random.randint(0, 2)][random.randint(0, 2)][random.randint(0, 2)] = 'P'
            state[random.randint(0, 2)][random.randint(0, 2)][random.randint(0, 2)][random.randint(0, 2)] = 'P'
            count -= 1
            state[1][1][0][0] = "_"
            a, b, c, d = (random.randint(0, 2), random.randint(0, 2), random.randint(0, 2), random.randint(0, 2))
            print('ai-1 move')
            state[a][b][c][d] = "X"
            cur_turn = 'player'
        elif choice_ai == 4:
            state = np.array([
                [[["_" for _ in range(3)] for _ in range(3)]
                 for large_column in range(3)]
                for large_row in range(3)])
            for i in range(obs_nums):
                state[random.randint(0, 2)][random.randint(0, 2)][random.randint(0, 2)][random.randint(0, 2)] = 'P'
            count -= 1
            a, b, c, d = (random.randint(0, 2), random.randint(0, 2), random.randint(0, 2), random.randint(0, 2))
            print('\nbaseline AI (X) move\n')
            state[a][b][c][d] = "X"
            cur_turn = 'player'
        # Keep playing until the board fills or someone wins
        while (state == "_").any() and not game_over_out(state):
            # tree-based AI vs tree-based AI
            if choice_ai == 3:
                # ========================= AI-1 (O) =====================
                print('======================AI-1(O)=====================')
                unique, counts = np.unique(state[a][b], return_counts=True)
                xo_num = dict(zip(unique, counts))
                unique_sub, counts_sub = np.unique(state[c][d], return_counts=True)
                xo_num_sub = dict(zip(unique_sub, counts_sub))
                nodes_num = 0
                # AI turn
                if (state[c][d] == "_").any():
                    if (('X' in xo_num and 'O' in xo_num and xo_num['X'] > xo_num['O']) or
                            ('X' in xo_num and 'O' not in xo_num and xo_num['X'] > 0)):
                        if '_' in xo_num_sub and xo_num_sub['_'] > 1 and state[c][d][a][b] == '_':
                            state[c][d][a][b] = 'P'
                            nodes_num, best_action = ai_1_turn(state[c][d])
                            state[c][d][a][b] = '_'
                        else:
                            nodes_num, best_action = ai_1_turn(state[c][d])
                    else:
                        nodes_num, best_action = ai_1_turn(state[c][d])
                    if game_over(state[c][d])[0]:
                        state[c][d] = game_over(state[c][d])[1]
                    cur_turn = "ai"
                else:
                    # Find a sub-board that can be won by placing only one more move
                    for i in range(3):
                        for j in range(3):
                            if (state[i][j] == "_").any():
                                state_temp = state.copy()
                                nodes_num_temp, best_action = ai_1_turn(state_temp[i][j])
                                if game_over(state_temp[i][j])[0]:
                                    nodes_num, best_action = ai_1_turn(state[i][j])
                                    print("random pick")
                                    state[i][j] = game_over(state_temp[i][j])[1]
                                    cur_turn = "ai"
                                    break
                        else:
                            continue
                        break
                    # If no such move was found, randomly pick one grid
                    if cur_turn != "ai":
                        ramd_move = (random.randint(0, 2), random.randint(0, 2))
                        if (state[ramd_move[0]][ramd_move[1]] == "_").any():
                            nodes_num, best_action = ai_1_turn(state[ramd_move[0]][ramd_move[1]])
                            print("random pick")
                            if game_over(state[ramd_move[0]][ramd_move[1]])[0]:
                                state[ramd_move[0]][ramd_move[1]] = "O"
                            cur_turn = "ai"
                        else:
                            for i in range(3):
                                for j in range(3):
                                    if (state[i][j] == "_").any():
                                        nodes_num, best_action = ai_1_turn(state[i][j])
                                        print("random pick")
                                        if game_over(state[i][j])[0]:
                                            state[i][j] = game_over(state[i][j])[1]
                                        cur_turn = "ai"
                                        break
                                else:
                                    continue
                                break
                print('the number of tree nodes that')
                print('the AI processed before selecting its action: \n', nodes_num)
                print('\n')
                if game_over_out(state):
                    print('================Game Over====================')
                    ai1 += 1
                    print('\nAI-1 WIN!\n')
                    print('=============================================\n')
                    break
                print('===================Below is AI-1\'s move=====================\n')
                print_func(state)
                print('===================Now it is AI-2\'s turn====================\n\n')
                a, b, c, d = c, d, best_action[0], best_action[1]
                # ====================== AI-2 (X) ======================
                print('======================AI-2(X)======================')
                unique, counts = np.unique(state[a][b], return_counts=True)
                xo_num = dict(zip(unique, counts))
                unique_sub, counts_sub = np.unique(state[c][d], return_counts=True)
                xo_num_sub = dict(zip(unique_sub, counts_sub))
                nodes_num = 0
                # AI turn
                if (state[c][d] == "_").any():
                    if (('O' in xo_num and 'X' in xo_num and xo_num['O'] > xo_num['X']) or
                            ('O' in xo_num and 'X' not in xo_num and xo_num['O'] > 0)):
                        if '_' in xo_num_sub and xo_num_sub['_'] > 1 and state[c][d][a][b] == '_':
                            state[c][d][a][b] = 'P'
                            nodes_num, best_action = ai_2_turn(state[c][d])
                            state[c][d][a][b] = '_'
                        else:
                            nodes_num, best_action = ai_2_turn(state[c][d])
                    else:
                        nodes_num, best_action = ai_2_turn(state[c][d])
                    if game_over(state[c][d])[0]:
                        state[c][d] = game_over(state[c][d])[1]
                    cur_turn = "player"
                else:
                    # Find a sub-board that can be won by placing only one more move
                    for i in range(3):
                        for j in range(3):
                            if (state[i][j] == "_").any():
                                state_temp = state.copy()
                                nodes_num_temp, best_action_temp = ai_2_turn(state_temp[i][j])
                                if game_over(state_temp[i][j])[0]:
                                    nodes_num, best_action = ai_2_turn(state[i][j])
                                    print("random pick")
                                    state[i][j] = game_over(state_temp[i][j])[1]
                                    cur_turn = "player"
                                    break
                        else:
                            continue
                        break
                    # If no such move was found, randomly pick one grid
                    if cur_turn != "player":
                        ramd_move = (random.randint(0, 2), random.randint(0, 2))
                        if (state[ramd_move[0]][ramd_move[1]] == "_").any():
                            nodes_num, best_action = ai_2_turn(state[ramd_move[0]][ramd_move[1]])
                            print("random pick")
                            if game_over(state[ramd_move[0]][ramd_move[1]])[0]:
                                state[ramd_move[0]][ramd_move[1]] = game_over(state[ramd_move[0]][ramd_move[1]])[1]
                            cur_turn = "player"
                        else:
                            for i in range(3):
                                for j in range(3):
                                    if (state[i][j] == "_").any():
                                        nodes_num, best_action = ai_2_turn(state[i][j])
                                        print("random pick")
                                        if game_over(state[i][j])[0]:
                                            state[i][j] = game_over(state[i][j])[1]
                                        cur_turn = "player"
                                        break
                                else:
                                    continue
                                break
                print('the number of tree nodes that')
                print('the AI processed before selecting its action: \n', nodes_num)
                print('\n')
                if game_over_out(state):
                    print('================Game Over====================')
                    print('\nAI-2 Win!\n')
                    ai2 += 1
                    print('=============================================\n')
                    break
                print('===================Below is AI-2\'s move=====================\n')
                print_func(state)
                print('===================Now it is AI-1\'s turn====================\n\n')
                a, b, c, d = c, d, best_action[0], best_action[1]
            # ================== option 4: baseline random AI vs tree-based AI ==================
            # tree-based AI vs baseline AI
            elif choice_ai == 4:
                # ========================= AI-2 (O) =====================
                print('======================AI-2(O)=====================')
                unique, counts = np.unique(state[a][b], return_counts=True)
                xo_num = dict(zip(unique, counts))
                unique_sub, counts_sub = np.unique(state[c][d], return_counts=True)
                xo_num_sub = dict(zip(unique_sub, counts_sub))
                nodes_num = 0
                # AI turn
                if (state[c][d] == "_").any():
                    if (('X' in xo_num and 'O' in xo_num and xo_num['X'] > xo_num['O']) or
                            ('X' in xo_num and 'O' not in xo_num and xo_num['X'] > 0)):
                        if '_' in xo_num_sub and xo_num_sub['_'] > 1 and state[c][d][a][b] == '_':
                            state[c][d][a][b] = 'P'
                            nodes_num, best_action = ai_1_turn(state[c][d])
                            state[c][d][a][b] = '_'
                        else:
                            nodes_num, best_action = ai_1_turn(state[c][d])
                    else:
                        nodes_num, best_action = ai_1_turn(state[c][d])
                    if game_over(state[c][d])[0]:
                        state[c][d] = game_over(state[c][d])[1]
                    cur_turn = "ai"
                else:
                    # Find a sub-board that can be won by placing only one more move
                    for i in range(3):
                        for j in range(3):
                            if (state[i][j] == "_").any():
                                state_temp = state.copy()
                                nodes_num_temp, best_action = ai_1_turn(state_temp[i][j])
                                if game_over(state_temp[i][j])[0]:
                                    nodes_num, best_action = ai_1_turn(state[i][j])
                                    print("random pick")
                                    state[i][j] = game_over(state_temp[i][j])[1]
                                    cur_turn = "ai"
                                    break
                        else:
                            continue
                        break
                    # If no such move was found, fall back step by step to a random grid
                    if cur_turn != "ai":
                        for i in range(3):
                            for j in range(3):
                                if (state[i][j] == "_").any() and (state[i][j] == "O").any():
                                    nodes_num, best_action = ai_1_turn(state[i][j])
                                    print("random pick2")
                                    if game_over(state[i][j])[0]:
                                        state[i][j] = game_over(state[i][j])[1]
                                    cur_turn = "ai"
                                    break
                            else:
                                continue
                            break
                    if cur_turn != "ai":
                        ramd_move = (random.randint(0, 2), random.randint(0, 2))
                        if (state[ramd_move[0]][ramd_move[1]] == "_").any():
                            nodes_num, best_action = ai_1_turn(state[ramd_move[0]][ramd_move[1]])
                            print("random pick3")
                            if game_over(state[ramd_move[0]][ramd_move[1]])[0]:
                                state[ramd_move[0]][ramd_move[1]] = game_over(state[ramd_move[0]][ramd_move[1]])[1]
                            cur_turn = "ai"
                    if cur_turn != "ai":
                        for i in range(3):
                            for j in range(3):
                                if (state[i][j] == "_").any():
                                    nodes_num = ai_turn(state[i][j])
                                    print("random pick")
                                    if game_over(state[i][j])[0]:
                                        state[i][j] = game_over(state[i][j])[1]
                                    cur_turn = "ai"
                                    break
                            else:
                                continue
                            break
                print('the number of tree nodes that')
                print('the AI processed before selecting its action: \n', nodes_num)
                print('\n')
                if game_over_out(state):
                    print('================Game Over====================')
                    print('AI-TREE WIN')
                    tree_ai += 1
                    print('=============================================\n')
                    break
                print('===================Below is AI-1\'s move=====================\n')
                print_func(state)
                print('===================Now it is the baseline AI\'s turn====================\n\n')
                a, b, c, d = c, d, best_action[0], best_action[1]
                # ============================== baseline AI (X) ===============================
                if (state[c][d] == "_").any():
                    while True:
                        ramd_move_sub = (random.randint(0, 2), random.randint(0, 2))
                        if state[c][d][ramd_move_sub[0]][ramd_move_sub[1]] == "_":
                            state[c][d][ramd_move_sub[0]][ramd_move_sub[1]] = 'X'
                            break
                    if game_over(state[c][d])[0]:
                        state[c][d] = game_over(state[c][d])[1]
                    cur_turn = "player"
                else:
                    while True:
                        ramd_move = (random.randint(0, 2), random.randint(0, 2))
                        ramd_move_sub = (random.randint(0, 2), random.randint(0, 2))
                        if state[ramd_move[0]][ramd_move[1]][ramd_move_sub[0]][ramd_move_sub[1]] == "_":
                            state[ramd_move[0]][ramd_move[1]][ramd_move_sub[0]][ramd_move_sub[1]] = 'X'
                            break
                    if game_over(state[ramd_move[0]][ramd_move[1]])[0]:
                        state[ramd_move[0]][ramd_move[1]] = game_over(state[ramd_move[0]][ramd_move[1]])[1]
                    cur_turn = 'player'
                if game_over_out(state):
                    print('================Game Over====================')
                    print('Baseline AI wins')
                    base_ai += 1
                    print('=============================================\n')
                    break
                print('===================Below is the baseline AI\'s move=====================\n')
                print_func(state)
                print('===================Now it is the tree-based AI\'s turn====================\n\n')
                a, b, c, d = c, d, ramd_move_sub[0], ramd_move_sub[1]
    if choice_ai == 3:
        print('ai1 wins: {0} games\nai2 wins: {1} games'.format(ai1, ai2))
    elif choice_ai == 4:
        print('tree AI wins: {0} games\nbase AI wins: {1} games'.format(tree_ai, base_ai))
    # Final board
    print_func(state)
| 42.012857 | 186 | 0.397837 | 3,380 | 29,409 | 3.295858 | 0.050888 | 0.013106 | 0.035189 | 0.057899 | 0.8693 | 0.854578 | 0.823878 | 0.804399 | 0.787882 | 0.767684 | 0 | 0.025597 | 0.430106 | 29,409 | 699 | 187 | 42.072961 | 0.639081 | 0.040226 | 0 | 0.793037 | 0 | 0.011605 | 0.090749 | 0.037892 | 0 | 0 | 0 | 0 | 0 | 1 | 0.019342 | false | 0 | 0.003868 | 0 | 0.075435 | 0.125725 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
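The script above searches the game tree with minimax plus alpha-beta pruning, undoing each trial move in place. As a standalone illustration of the same idea, here is a minimal sketch for a single 3x3 tic-tac-toe board using plain Python lists. The function names (`score`, `game_over`, `minimax`) mirror the script's, but the implementation is illustrative, not the original code.

```python
import math

def score(state):
    # +1 if X has a completed line, -1 if O has one, 0 otherwise
    lines = [state[i] for i in range(3)]                            # rows
    lines += [[state[r][c] for r in range(3)] for c in range(3)]    # columns
    lines += [[state[i][i] for i in range(3)],
              [state[i][2 - i] for i in range(3)]]                  # diagonals
    for symbol, sign in (("X", 1), ("O", -1)):
        if any(all(cell == symbol for cell in line) for line in lines):
            return sign
    return 0

def game_over(state):
    # Terminal when someone has a line or the board is full
    return score(state) != 0 or all(cell != "_" for row in state for cell in row)

def minimax(state, max_turn, alpha, beta):
    if game_over(state):
        return score(state)
    best = -math.inf if max_turn else math.inf
    for i in range(3):
        for j in range(3):
            if state[i][j] != "_":
                continue
            state[i][j] = "X" if max_turn else "O"   # try a move
            value = minimax(state, not max_turn, alpha, beta)
            state[i][j] = "_"                        # undo it
            if max_turn:
                best = max(best, value)
                alpha = max(alpha, best)
                if best >= beta:                     # beta cutoff
                    return best
            else:
                best = min(best, value)
                beta = min(beta, best)
                if best <= alpha:                    # alpha cutoff
                    return best
    return best
```

From an empty board with both sides playing perfectly, the value is 0 (a draw), which matches the well-known result for tic-tac-toe; the alpha-beta cutoffs only prune branches that cannot change the minimax value.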
87a6131c0a945e5a44e88fd355212d63f2796333 | 130 | py | Python | roboball2d/rendering/__init__.py | dtrb/roboball2d | 9e49ec89d3a8ce2b3b249dc8f3a975e9a412977a | [
"BSD-3-Clause"
] | 2 | 2020-04-17T19:52:32.000Z | 2020-06-14T13:27:42.000Z | roboball2d/rendering/__init__.py | dtrb/roboball2d | 9e49ec89d3a8ce2b3b249dc8f3a975e9a412977a | [
"BSD-3-Clause"
] | 4 | 2020-05-04T11:27:03.000Z | 2020-07-08T15:40:54.000Z | roboball2d/rendering/__init__.py | dtrb/roboball2d | 9e49ec89d3a8ce2b3b249dc8f3a975e9a412977a | [
"BSD-3-Clause"
] | 4 | 2020-04-09T13:21:24.000Z | 2021-06-14T11:23:09.000Z | from roboball2d.rendering.pyglet_renderer import PygletRenderer
from roboball2d.rendering.rendering_config import RenderingConfig
| 43.333333 | 65 | 0.907692 | 14 | 130 | 8.285714 | 0.642857 | 0.241379 | 0.396552 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016393 | 0.061538 | 130 | 2 | 66 | 65 | 0.934426 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
e217011490a2fbb976cf9879485d6dccb63a7a11 | 185 | py | Python | integration_tests/__init__.py | CanopySimulations/canopy-python | 9ec37e674e65d6fbef0402ac0c612c163d55631e | [
"MIT"
] | null | null | null | integration_tests/__init__.py | CanopySimulations/canopy-python | 9ec37e674e65d6fbef0402ac0c612c163d55631e | [
"MIT"
] | 1 | 2022-01-31T10:18:08.000Z | 2022-01-31T10:18:08.000Z | integration_tests/__init__.py | CanopySimulations/canopy-python | 9ec37e674e65d6fbef0402ac0c612c163d55631e | [
"MIT"
] | null | null | null |
from integration_tests.environment_data import EnvironmentData
from integration_tests.environment_loader import EnvironmentLoader
from integration_tests.environment import Environment
| 37 | 66 | 0.913514 | 20 | 185 | 8.2 | 0.45 | 0.27439 | 0.365854 | 0.567073 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.07027 | 185 | 4 | 67 | 46.25 | 0.953488 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
356169cfdcd9a7f3ee28e779c49f5a334d858f75 | 11,167 | py | Python | tests/test_util_methods.py | si4141/scraper_for_economy_watcher | 92423ed2198263998a6593ce9cb800a10c5d0943 | [
"MIT"
] | 1 | 2018-12-18T08:21:36.000Z | 2018-12-18T08:21:36.000Z | tests/test_util_methods.py | si4141/scraper_for_economy_watcher | 92423ed2198263998a6593ce9cb800a10c5d0943 | [
"MIT"
] | null | null | null | tests/test_util_methods.py | si4141/scraper_for_economy_watcher | 92423ed2198263998a6593ce9cb800a10c5d0943 | [
"MIT"
] | null | null | null | import unittest
from econ_watcher_reader.settings import TOP_MENU_PAGE, WatcherType, TOKYO_FLAG_VALUE_IN_RAW_DATA
import re
import numpy as np
from econ_watcher_reader.scraper import get_watcher_directory, get_watcher_file, \
    get_publish_date_from_url
import econ_watcher_reader.parser as parser
import logging
logging.basicConfig()
logging.getLogger("econ_watcher_reader").setLevel(level=logging.DEBUG)
class TestScraper(unittest.TestCase):

    def test_get_watcher_directory(self):
        links_ = get_watcher_directory(TOP_MENU_PAGE)
        for link_ in links_:
            self.assertTrue('watcher' in link_)
            self.assertFalse('menu' in link_)
            self.assertTrue(len(re.findall(r'\d{4}', link_)))

    def test_get_watcher_file(self):
        links_ = get_watcher_directory(TOP_MENU_PAGE)
        data = get_watcher_file(links_[0], WatcherType.Current.file_name)
        self.assertTrue(data.size > 0)

    def test_get_publish_date_from_url(self):
        links_ = get_watcher_directory(TOP_MENU_PAGE)
        self.assertIsNotNone(get_publish_date_from_url(links_[0]))
class TestParserCurrent(unittest.TestCase):

    @classmethod
    def setUpClass(cls):
        cls.links_ = get_watcher_directory(TOP_MENU_PAGE)
        cls.watcher_type = WatcherType.Current
        cls.data = get_watcher_file(cls.links_[0], cls.watcher_type.file_name)

    def test_eliminate_newline_code(self):
        eliminated = parser.eliminate_newline_code(self.data)
        self.assertFalse(eliminated.apply(
            lambda x: x.str.contains('\n').any() if x.dtype == object else False
        ).any())
        self.assertFalse(eliminated.apply(
            lambda x: x.str.contains('\r').any() if x.dtype == object else False
        ).any())

    def test_eliminate_rows_with_na_in_economic_status(self):
        eliminated = parser.eliminate_rows_with_na_in_economic_status(self.data,
                                                                      self.watcher_type.iloc_economic_status_score)
        self.assertGreater(eliminated.shape[0], 0)

    def test_build_is_tokyo_flag(self):
        eliminated = parser.eliminate_rows_with_na_in_economic_status(self.data,
                                                                      self.watcher_type.iloc_economic_status_score)
        flagged = parser.build_is_tokyo_flag(eliminated, self.watcher_type.iloc_is_tokyo_flag)
        self.assertTrue(
            flagged[flagged.is_tokyo].iloc[:, self.watcher_type.iloc_is_tokyo_flag].str.contains(TOKYO_FLAG_VALUE_IN_RAW_DATA).all()
        )

    def test_make_field_column(self):
        eliminated = parser.eliminate_newline_code(self.data)
        eliminated = parser.eliminate_rows_with_na_in_economic_status(eliminated,
                                                                      self.watcher_type.iloc_economic_status_score)
        data_with_field = parser.make_field_column(eliminated, self.watcher_type.iloc_field)
        self.assertFalse(data_with_field.field.isnull().any())

    def test_clean_field_column(self):
        eliminated = parser.eliminate_newline_code(self.data)
        eliminated = parser.eliminate_rows_with_na_in_economic_status(eliminated,
                                                                      self.watcher_type.iloc_economic_status_score)
        data_with_field = parser.make_field_column(eliminated, self.watcher_type.iloc_field)
        cleaned = parser.clean_field_column(data_with_field)
        # TODO: stricter assertion; for now, cleaning must not drop every row
        self.assertGreater(cleaned.shape[0], 0)

    def test_make_region_column(self):
        eliminated = parser.eliminate_rows_with_na_in_economic_status(self.data,
                                                                      self.watcher_type.iloc_economic_status_score)
        data_with_field = parser.make_field_column(eliminated, self.watcher_type.iloc_field)
        data_with_region = parser.make_region_column(data_with_field)
        # TODO: stricter assertion; for now, the result must not be empty
        self.assertGreater(data_with_region.shape[0], 0)

    def test_convert_economic_state_score_into_integer(self):
        eliminated = parser.eliminate_rows_with_na_in_economic_status(self.data,
                                                                      self.watcher_type.iloc_economic_status_score)
        converted = parser.convert_economic_state_score_into_integer(
            eliminated,
            self.watcher_type.iloc_economic_status_score,
            self.watcher_type.score_map
        )
        self.assertTrue(converted.score.dropna().dtype == np.float64)

    def test_eliminate_rows_without_sentence(self):
        eliminated = parser.eliminate_rows_with_na_in_economic_status(self.data,
                                                                      self.watcher_type.iloc_economic_status_score)
        data_with_integer_score = parser.convert_economic_state_score_into_integer(
            eliminated,
            self.watcher_type.iloc_economic_status_score,
            self.watcher_type.score_map
        )
        eliminated = parser.eliminate_rows_without_sentence(data_with_integer_score,
                                                            self.watcher_type.iloc_reason_sentence)
        self.assertTrue((eliminated.iloc[:, self.watcher_type.iloc_reason_sentence].apply(len) > 1).all())

    def test_clean_sentence_reason(self):
        eliminated = parser.eliminate_rows_with_na_in_economic_status(self.data,
                                                                      self.watcher_type.iloc_economic_status_score)
        data_with_integer_score = parser.convert_economic_state_score_into_integer(
            eliminated,
            self.watcher_type.iloc_economic_status_score,
            self.watcher_type.score_map
        )
        eliminated = parser.eliminate_rows_without_sentence(data_with_integer_score,
                                                            self.watcher_type.iloc_reason_sentence)
        cleaned = parser.clean_sentence_reason(eliminated, self.watcher_type.iloc_reason_sentence)
        # No value should start with a center dot `・`
        self.assertFalse((cleaned.reason_sentence.str.find('・') == 0).any())
class TestParserFuture(unittest.TestCase):

    @classmethod
    def setUpClass(cls):
        cls.links_ = get_watcher_directory(TOP_MENU_PAGE)
        cls.watcher_type = WatcherType.Future
        cls.data = get_watcher_file(cls.links_[0], cls.watcher_type.file_name)

    def test_eliminate_newline_code(self):
        eliminated = parser.eliminate_newline_code(self.data)
        self.assertFalse(eliminated.apply(
            lambda x: x.str.contains('\n').any() if x.dtype == object else False
        ).any())
        self.assertFalse(eliminated.apply(
            lambda x: x.str.contains('\r').any() if x.dtype == object else False
        ).any())

    def test_eliminate_rows_with_na_in_economic_status(self):
        eliminated = parser.eliminate_rows_with_na_in_economic_status(self.data,
                                                                      self.watcher_type.iloc_economic_status_score)
        self.assertGreater(eliminated.shape[0], 0)

    def test_build_is_tokyo_flag(self):
        eliminated = parser.eliminate_rows_with_na_in_economic_status(self.data,
                                                                      self.watcher_type.iloc_economic_status_score)
        flagged = parser.build_is_tokyo_flag(eliminated, self.watcher_type.iloc_is_tokyo_flag)
        self.assertTrue(
            flagged[flagged.is_tokyo].iloc[:, self.watcher_type.iloc_is_tokyo_flag].str.contains(TOKYO_FLAG_VALUE_IN_RAW_DATA).all()
        )

    def test_make_field_column(self):
        eliminated = parser.eliminate_newline_code(self.data)
        eliminated = parser.eliminate_rows_with_na_in_economic_status(eliminated,
                                                                      self.watcher_type.iloc_economic_status_score)
        data_with_field = parser.make_field_column(eliminated, self.watcher_type.iloc_field)
        self.assertFalse(data_with_field.field.isnull().any())

    def test_clean_field_column(self):
        eliminated = parser.eliminate_newline_code(self.data)
        eliminated = parser.eliminate_rows_with_na_in_economic_status(eliminated,
                                                                      self.watcher_type.iloc_economic_status_score)
        data_with_field = parser.make_field_column(eliminated, self.watcher_type.iloc_field)
        cleaned = parser.clean_field_column(data_with_field)
        # TODO: stricter assertion; for now, cleaning must not drop every row
        self.assertGreater(cleaned.shape[0], 0)

    def test_make_region_column(self):
        eliminated = parser.eliminate_rows_with_na_in_economic_status(self.data,
                                                                      self.watcher_type.iloc_economic_status_score)
        data_with_field = parser.make_field_column(eliminated, self.watcher_type.iloc_field)
        data_with_region = parser.make_region_column(data_with_field)
        # TODO: stricter assertion; for now, the result must not be empty
        self.assertGreater(data_with_region.shape[0], 0)

    def test_convert_economic_state_score_into_integer(self):
        eliminated = parser.eliminate_rows_with_na_in_economic_status(self.data,
                                                                      self.watcher_type.iloc_economic_status_score)
        converted = parser.convert_economic_state_score_into_integer(
            eliminated,
            self.watcher_type.iloc_economic_status_score,
            self.watcher_type.score_map
        )
        self.assertTrue(converted.score.dropna().dtype == np.float64)

    def test_eliminate_rows_without_sentence(self):
        eliminated = parser.eliminate_rows_with_na_in_economic_status(self.data,
                                                                      self.watcher_type.iloc_economic_status_score)
        data_with_integer_score = parser.convert_economic_state_score_into_integer(
            eliminated,
            self.watcher_type.iloc_economic_status_score,
            self.watcher_type.score_map
        )
        eliminated = parser.eliminate_rows_without_sentence(data_with_integer_score,
                                                            self.watcher_type.iloc_reason_sentence)
        self.assertTrue((eliminated.iloc[:, self.watcher_type.iloc_reason_sentence].apply(len) > 1).all())

    def test_clean_sentence_reason(self):
        eliminated = parser.eliminate_rows_with_na_in_economic_status(self.data,
                                                                      self.watcher_type.iloc_economic_status_score)
        data_with_integer_score = parser.convert_economic_state_score_into_integer(
            eliminated,
            self.watcher_type.iloc_economic_status_score,
            self.watcher_type.score_map
        )
        eliminated = parser.eliminate_rows_without_sentence(data_with_integer_score,
                                                            self.watcher_type.iloc_reason_sentence)
        cleaned = parser.clean_sentence_reason(eliminated, self.watcher_type.iloc_reason_sentence)
        # No value should start with a center dot `・`
        self.assertFalse((cleaned.reason_sentence.str.find('・') == 0).any())
if __name__ == '__main__':
    unittest.main()
| 48.341991 | 132 | 0.656757 | 1,286 | 11,167 | 5.267496 | 0.095645 | 0.081193 | 0.10186 | 0.112194 | 0.902864 | 0.893564 | 0.890168 | 0.889135 | 0.871863 | 0.871863 | 0 | 0.001719 | 0.270798 | 11,167 | 230 | 133 | 48.552174 | 0.82967 | 0.011552 | 0 | 0.794286 | 0 | 0 | 0.004805 | 0 | 0 | 0 | 0 | 0.004348 | 0.12 | 1 | 0.131429 | false | 0 | 0.04 | 0 | 0.188571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
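`TestParserCurrent` and `TestParserFuture` above duplicate every test body; the classes differ only in the `watcher_type` set in `setUpClass`. A common way to remove that duplication is a shared mixin that holds the test bodies plus thin subclasses that pick the watcher type. The sketch below is illustrative (the class names and the string stand-ins for `WatcherType.Current`/`WatcherType.Future` are assumptions, not the project's code):

```python
import unittest

class ParserTestMixin:
    """Shared test bodies; deliberately NOT a TestCase, so the loader
    only collects the concrete subclasses below."""
    watcher_type = None  # each subclass sets this

    def test_has_watcher_type(self):
        # Stand-in for the real parser tests, which would all read
        # self.watcher_type exactly as the original methods do.
        self.assertIsNotNone(self.watcher_type)

class TestParserCurrent(ParserTestMixin, unittest.TestCase):
    watcher_type = "current"   # stand-in for WatcherType.Current

class TestParserFuture(ParserTestMixin, unittest.TestCase):
    watcher_type = "future"    # stand-in for WatcherType.Future
```

Because the mixin does not inherit from `unittest.TestCase`, `unittest.main()` discovers only the two concrete classes, and each inherited test method runs once per watcher type.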
3562f7b618b1595d652c5aa527a1a309bd67fe62 | 22,675 | py | Python | pgoapi/protos/pogoprotos/networking/platform/telemetry/distribution_pb2.py | SkOODaT/Pgoapi | a0021998eb75f6b1a867d7348acb4fcf1298987d | [
"MIT"
] | 1 | 2017-12-12T10:05:57.000Z | 2017-12-12T10:05:57.000Z | pgoapi/protos/pogoprotos/networking/platform/telemetry/distribution_pb2.py | SkOODaT/Pgoapi | a0021998eb75f6b1a867d7348acb4fcf1298987d | [
"MIT"
] | null | null | null | pgoapi/protos/pogoprotos/networking/platform/telemetry/distribution_pb2.py | SkOODaT/Pgoapi | a0021998eb75f6b1a867d7348acb4fcf1298987d | [
"MIT"
] | null | null | null | # Generated by the protocol buffer compiler. DO NOT EDIT!
# source: pogoprotos/networking/platform/telemetry/distribution.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
from google.protobuf import descriptor_pb2
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
  name='pogoprotos/networking/platform/telemetry/distribution.proto',
  package='pogoprotos.networking.platform.telemetry',
  syntax='proto3',
serialized_pb=_b('\n;pogoprotos/networking/platform/telemetry/distribution.proto\x12(pogoprotos.networking.platform.telemetry\"\xd5\x08\n\x0c\x44istribution\x12\r\n\x05\x63ount\x18\x01 \x01(\x03\x12\x0c\n\x04mean\x18\x02 \x01(\x02\x12 \n\x18sum_of_squared_deviation\x18\x03 \x01(\x01\x12K\n\x05range\x18\x04 \x01(\x0b\x32<.pogoprotos.networking.platform.telemetry.Distribution.Range\x12\\\n\x0e\x62ucket_options\x18\x05 \x01(\x0b\x32\x44.pogoprotos.networking.platform.telemetry.Distribution.BucketOptions\x12\x15\n\rbucket_counts\x18\x06 \x03(\x03\x1a\xfe\x03\n\rBucketOptions\x12\\\n\x0elinear_buckets\x18\x01 \x01(\x0b\x32\x44.pogoprotos.networking.platform.telemetry.Distribution.LinearBuckets\x12\x66\n\x13\x65xponential_buckets\x18\x02 \x01(\x0b\x32I.pogoprotos.networking.platform.telemetry.Distribution.ExponentialBuckets\x12`\n\x10\x65xplicit_buckets\x18\x03 \x01(\x0b\x32\x46.pogoprotos.networking.platform.telemetry.Distribution.ExplicitBuckets\x1a!\n\x0f\x45xplicitBuckets\x12\x0e\n\x06\x62ounds\x18\x01 \x03(\x03\x1aV\n\x12\x45xponentialBuckets\x12\x1a\n\x12num_finite_buckets\x18\x01 \x01(\x03\x12\x15\n\rgrowth_factor\x18\x02 \x01(\x02\x12\r\n\x05scale\x18\x03 \x01(\x02\x1aJ\n\rLinearBuckets\x12\x1a\n\x12num_finite_buckets\x18\x01 \x01(\x03\x12\r\n\x05width\x18\x02 \x01(\x03\x12\x0e\n\x06offset\x18\x03 \x01(\x03\x1a!\n\x0f\x45xplicitBuckets\x12\x0e\n\x06\x62ounds\x18\x01 \x03(\x03\x1aV\n\x12\x45xponentialBuckets\x12\x1a\n\x12num_finite_buckets\x18\x01 \x01(\x03\x12\x15\n\rgrowth_factor\x18\x02 \x01(\x02\x12\r\n\x05scale\x18\x03 \x01(\x02\x1aJ\n\rLinearBuckets\x12\x1a\n\x12num_finite_buckets\x18\x01 \x01(\x03\x12\r\n\x05width\x18\x02 \x01(\x03\x12\x0e\n\x06offset\x18\x03 \x01(\x03\x1a!\n\x05Range\x12\x0b\n\x03min\x18\x01 \x01(\x03\x12\x0b\n\x03max\x18\x02 \x01(\x03\"Y\n\nBucketType\x12\x08\n\x04none\x10\x00\x12\x12\n\x0elinear_buckets\x10\x01\x12\x17\n\x13\x65xponential_buckets\x10\x02\x12\x14\n\x10\x65xplicit_buckets\x10\x03\x62\x06proto3')
)
_DISTRIBUTION_BUCKETTYPE = _descriptor.EnumDescriptor(
  name='BucketType',
  full_name='pogoprotos.networking.platform.telemetry.Distribution.BucketType',
  filename=None,
  file=DESCRIPTOR,
  values=[
    _descriptor.EnumValueDescriptor(
      name='none', index=0, number=0,
      options=None,
      type=None),
    _descriptor.EnumValueDescriptor(
      name='linear_buckets', index=1, number=1,
      options=None,
      type=None),
    _descriptor.EnumValueDescriptor(
      name='exponential_buckets', index=2, number=2,
      options=None,
      type=None),
    _descriptor.EnumValueDescriptor(
      name='explicit_buckets', index=3, number=3,
      options=None,
      type=None),
  ],
  containing_type=None,
  options=None,
  serialized_start=1126,
  serialized_end=1215,
)
_sym_db.RegisterEnumDescriptor(_DISTRIBUTION_BUCKETTYPE)
_DISTRIBUTION_BUCKETOPTIONS_EXPLICITBUCKETS = _descriptor.Descriptor(
  name='ExplicitBuckets',
  full_name='pogoprotos.networking.platform.telemetry.Distribution.BucketOptions.ExplicitBuckets',
  filename=None,
  file=DESCRIPTOR,
  containing_type=None,
  fields=[
    _descriptor.FieldDescriptor(
      name='bounds', full_name='pogoprotos.networking.platform.telemetry.Distribution.BucketOptions.ExplicitBuckets.bounds', index=0,
      number=1, type=3, cpp_type=2, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      options=None),
  ],
  extensions=[
  ],
  nested_types=[],
  enum_types=[
  ],
  options=None,
  is_extendable=False,
  syntax='proto3',
  extension_ranges=[],
  oneofs=[
  ],
  serialized_start=693,
  serialized_end=726,
)
_DISTRIBUTION_BUCKETOPTIONS_EXPONENTIALBUCKETS = _descriptor.Descriptor(
name='ExponentialBuckets',
full_name='pogoprotos.networking.platform.telemetry.Distribution.BucketOptions.ExponentialBuckets',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='num_finite_buckets', full_name='pogoprotos.networking.platform.telemetry.Distribution.BucketOptions.ExponentialBuckets.num_finite_buckets', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='growth_factor', full_name='pogoprotos.networking.platform.telemetry.Distribution.BucketOptions.ExponentialBuckets.growth_factor', index=1,
number=2, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='scale', full_name='pogoprotos.networking.platform.telemetry.Distribution.BucketOptions.ExponentialBuckets.scale', index=2,
number=3, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=728,
serialized_end=814,
)
_DISTRIBUTION_BUCKETOPTIONS_LINEARBUCKETS = _descriptor.Descriptor(
name='LinearBuckets',
full_name='pogoprotos.networking.platform.telemetry.Distribution.BucketOptions.LinearBuckets',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='num_finite_buckets', full_name='pogoprotos.networking.platform.telemetry.Distribution.BucketOptions.LinearBuckets.num_finite_buckets', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='width', full_name='pogoprotos.networking.platform.telemetry.Distribution.BucketOptions.LinearBuckets.width', index=1,
number=2, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='offset', full_name='pogoprotos.networking.platform.telemetry.Distribution.BucketOptions.LinearBuckets.offset', index=2,
number=3, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=816,
serialized_end=890,
)
_DISTRIBUTION_BUCKETOPTIONS = _descriptor.Descriptor(
name='BucketOptions',
full_name='pogoprotos.networking.platform.telemetry.Distribution.BucketOptions',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='linear_buckets', full_name='pogoprotos.networking.platform.telemetry.Distribution.BucketOptions.linear_buckets', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='exponential_buckets', full_name='pogoprotos.networking.platform.telemetry.Distribution.BucketOptions.exponential_buckets', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='explicit_buckets', full_name='pogoprotos.networking.platform.telemetry.Distribution.BucketOptions.explicit_buckets', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[_DISTRIBUTION_BUCKETOPTIONS_EXPLICITBUCKETS, _DISTRIBUTION_BUCKETOPTIONS_EXPONENTIALBUCKETS, _DISTRIBUTION_BUCKETOPTIONS_LINEARBUCKETS, ],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=380,
serialized_end=890,
)
_DISTRIBUTION_EXPLICITBUCKETS = _descriptor.Descriptor(
name='ExplicitBuckets',
full_name='pogoprotos.networking.platform.telemetry.Distribution.ExplicitBuckets',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='bounds', full_name='pogoprotos.networking.platform.telemetry.Distribution.ExplicitBuckets.bounds', index=0,
number=1, type=3, cpp_type=2, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=693,
serialized_end=726,
)
_DISTRIBUTION_EXPONENTIALBUCKETS = _descriptor.Descriptor(
name='ExponentialBuckets',
full_name='pogoprotos.networking.platform.telemetry.Distribution.ExponentialBuckets',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='num_finite_buckets', full_name='pogoprotos.networking.platform.telemetry.Distribution.ExponentialBuckets.num_finite_buckets', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='growth_factor', full_name='pogoprotos.networking.platform.telemetry.Distribution.ExponentialBuckets.growth_factor', index=1,
number=2, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='scale', full_name='pogoprotos.networking.platform.telemetry.Distribution.ExponentialBuckets.scale', index=2,
number=3, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=728,
serialized_end=814,
)
_DISTRIBUTION_LINEARBUCKETS = _descriptor.Descriptor(
name='LinearBuckets',
full_name='pogoprotos.networking.platform.telemetry.Distribution.LinearBuckets',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='num_finite_buckets', full_name='pogoprotos.networking.platform.telemetry.Distribution.LinearBuckets.num_finite_buckets', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='width', full_name='pogoprotos.networking.platform.telemetry.Distribution.LinearBuckets.width', index=1,
number=2, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='offset', full_name='pogoprotos.networking.platform.telemetry.Distribution.LinearBuckets.offset', index=2,
number=3, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=816,
serialized_end=890,
)
_DISTRIBUTION_RANGE = _descriptor.Descriptor(
name='Range',
full_name='pogoprotos.networking.platform.telemetry.Distribution.Range',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='min', full_name='pogoprotos.networking.platform.telemetry.Distribution.Range.min', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='max', full_name='pogoprotos.networking.platform.telemetry.Distribution.Range.max', index=1,
number=2, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1091,
serialized_end=1124,
)
_DISTRIBUTION = _descriptor.Descriptor(
name='Distribution',
full_name='pogoprotos.networking.platform.telemetry.Distribution',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='count', full_name='pogoprotos.networking.platform.telemetry.Distribution.count', index=0,
number=1, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='mean', full_name='pogoprotos.networking.platform.telemetry.Distribution.mean', index=1,
number=2, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='sum_of_squared_deviation', full_name='pogoprotos.networking.platform.telemetry.Distribution.sum_of_squared_deviation', index=2,
number=3, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='range', full_name='pogoprotos.networking.platform.telemetry.Distribution.range', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='bucket_options', full_name='pogoprotos.networking.platform.telemetry.Distribution.bucket_options', index=4,
number=5, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='bucket_counts', full_name='pogoprotos.networking.platform.telemetry.Distribution.bucket_counts', index=5,
number=6, type=3, cpp_type=2, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[_DISTRIBUTION_BUCKETOPTIONS, _DISTRIBUTION_EXPLICITBUCKETS, _DISTRIBUTION_EXPONENTIALBUCKETS, _DISTRIBUTION_LINEARBUCKETS, _DISTRIBUTION_RANGE, ],
enum_types=[
_DISTRIBUTION_BUCKETTYPE,
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=106,
serialized_end=1215,
)
_DISTRIBUTION_BUCKETOPTIONS_EXPLICITBUCKETS.containing_type = _DISTRIBUTION_BUCKETOPTIONS
_DISTRIBUTION_BUCKETOPTIONS_EXPONENTIALBUCKETS.containing_type = _DISTRIBUTION_BUCKETOPTIONS
_DISTRIBUTION_BUCKETOPTIONS_LINEARBUCKETS.containing_type = _DISTRIBUTION_BUCKETOPTIONS
_DISTRIBUTION_BUCKETOPTIONS.fields_by_name['linear_buckets'].message_type = _DISTRIBUTION_LINEARBUCKETS
_DISTRIBUTION_BUCKETOPTIONS.fields_by_name['exponential_buckets'].message_type = _DISTRIBUTION_EXPONENTIALBUCKETS
_DISTRIBUTION_BUCKETOPTIONS.fields_by_name['explicit_buckets'].message_type = _DISTRIBUTION_EXPLICITBUCKETS
_DISTRIBUTION_BUCKETOPTIONS.containing_type = _DISTRIBUTION
_DISTRIBUTION_EXPLICITBUCKETS.containing_type = _DISTRIBUTION
_DISTRIBUTION_EXPONENTIALBUCKETS.containing_type = _DISTRIBUTION
_DISTRIBUTION_LINEARBUCKETS.containing_type = _DISTRIBUTION
_DISTRIBUTION_RANGE.containing_type = _DISTRIBUTION
_DISTRIBUTION.fields_by_name['range'].message_type = _DISTRIBUTION_RANGE
_DISTRIBUTION.fields_by_name['bucket_options'].message_type = _DISTRIBUTION_BUCKETOPTIONS
_DISTRIBUTION_BUCKETTYPE.containing_type = _DISTRIBUTION
DESCRIPTOR.message_types_by_name['Distribution'] = _DISTRIBUTION
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
Distribution = _reflection.GeneratedProtocolMessageType('Distribution', (_message.Message,), dict(
BucketOptions = _reflection.GeneratedProtocolMessageType('BucketOptions', (_message.Message,), dict(
ExplicitBuckets = _reflection.GeneratedProtocolMessageType('ExplicitBuckets', (_message.Message,), dict(
DESCRIPTOR = _DISTRIBUTION_BUCKETOPTIONS_EXPLICITBUCKETS,
__module__ = 'pogoprotos.networking.platform.telemetry.distribution_pb2'
# @@protoc_insertion_point(class_scope:pogoprotos.networking.platform.telemetry.Distribution.BucketOptions.ExplicitBuckets)
))
,
ExponentialBuckets = _reflection.GeneratedProtocolMessageType('ExponentialBuckets', (_message.Message,), dict(
DESCRIPTOR = _DISTRIBUTION_BUCKETOPTIONS_EXPONENTIALBUCKETS,
__module__ = 'pogoprotos.networking.platform.telemetry.distribution_pb2'
# @@protoc_insertion_point(class_scope:pogoprotos.networking.platform.telemetry.Distribution.BucketOptions.ExponentialBuckets)
))
,
LinearBuckets = _reflection.GeneratedProtocolMessageType('LinearBuckets', (_message.Message,), dict(
DESCRIPTOR = _DISTRIBUTION_BUCKETOPTIONS_LINEARBUCKETS,
__module__ = 'pogoprotos.networking.platform.telemetry.distribution_pb2'
# @@protoc_insertion_point(class_scope:pogoprotos.networking.platform.telemetry.Distribution.BucketOptions.LinearBuckets)
))
,
DESCRIPTOR = _DISTRIBUTION_BUCKETOPTIONS,
__module__ = 'pogoprotos.networking.platform.telemetry.distribution_pb2'
# @@protoc_insertion_point(class_scope:pogoprotos.networking.platform.telemetry.Distribution.BucketOptions)
))
,
ExplicitBuckets = _reflection.GeneratedProtocolMessageType('ExplicitBuckets', (_message.Message,), dict(
DESCRIPTOR = _DISTRIBUTION_EXPLICITBUCKETS,
__module__ = 'pogoprotos.networking.platform.telemetry.distribution_pb2'
# @@protoc_insertion_point(class_scope:pogoprotos.networking.platform.telemetry.Distribution.ExplicitBuckets)
))
,
ExponentialBuckets = _reflection.GeneratedProtocolMessageType('ExponentialBuckets', (_message.Message,), dict(
DESCRIPTOR = _DISTRIBUTION_EXPONENTIALBUCKETS,
__module__ = 'pogoprotos.networking.platform.telemetry.distribution_pb2'
# @@protoc_insertion_point(class_scope:pogoprotos.networking.platform.telemetry.Distribution.ExponentialBuckets)
))
,
LinearBuckets = _reflection.GeneratedProtocolMessageType('LinearBuckets', (_message.Message,), dict(
DESCRIPTOR = _DISTRIBUTION_LINEARBUCKETS,
__module__ = 'pogoprotos.networking.platform.telemetry.distribution_pb2'
# @@protoc_insertion_point(class_scope:pogoprotos.networking.platform.telemetry.Distribution.LinearBuckets)
))
,
Range = _reflection.GeneratedProtocolMessageType('Range', (_message.Message,), dict(
DESCRIPTOR = _DISTRIBUTION_RANGE,
__module__ = 'pogoprotos.networking.platform.telemetry.distribution_pb2'
# @@protoc_insertion_point(class_scope:pogoprotos.networking.platform.telemetry.Distribution.Range)
))
,
DESCRIPTOR = _DISTRIBUTION,
__module__ = 'pogoprotos.networking.platform.telemetry.distribution_pb2'
# @@protoc_insertion_point(class_scope:pogoprotos.networking.platform.telemetry.Distribution)
))
_sym_db.RegisterMessage(Distribution)
_sym_db.RegisterMessage(Distribution.BucketOptions)
_sym_db.RegisterMessage(Distribution.BucketOptions.ExplicitBuckets)
_sym_db.RegisterMessage(Distribution.BucketOptions.ExponentialBuckets)
_sym_db.RegisterMessage(Distribution.BucketOptions.LinearBuckets)
_sym_db.RegisterMessage(Distribution.ExplicitBuckets)
_sym_db.RegisterMessage(Distribution.ExponentialBuckets)
_sym_db.RegisterMessage(Distribution.LinearBuckets)
_sym_db.RegisterMessage(Distribution.Range)
# @@protoc_insertion_point(module_scope)
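# The BucketOptions variants above (LinearBuckets, ExponentialBuckets,
# ExplicitBuckets) only declare parameters; nothing in this generated module
# interprets them. As a self-contained sketch (not part of the generated
# output), here is how such options are conventionally expanded into bucket
# boundaries and how a sample is mapped to a bucket index, following the
# Google monitoring Distribution convention -- an assumption, since the
# .proto carries no doc comments:

```python
import bisect


def linear_bounds(num_finite_buckets, width, offset):
    # Finite bucket i (1-based) spans [offset + (i-1)*width, offset + i*width).
    # N finite buckets need N + 1 boundaries.
    return [offset + i * width for i in range(num_finite_buckets + 1)]


def exponential_bounds(num_finite_buckets, growth_factor, scale):
    # Finite bucket i (1-based) has lower bound scale * growth_factor**(i-1).
    return [scale * growth_factor ** i for i in range(num_finite_buckets + 1)]


def bucket_index(value, bounds):
    """Map a sample to a bucket index given sorted finite-bucket boundaries.

    Index 0 is the underflow bucket (value below bounds[0]); index
    len(bounds) is the overflow bucket; lower bounds are inclusive.
    """
    return bisect.bisect_right(bounds, value)
```

# Explicit buckets simply pass their `bounds` list straight to bucket_index.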
# postprocess/tubesSummitCdb.py (cinemascienceworkflows/warpx, BSD-3-Clause)
# trace generated using paraview version 5.9.0
#### import the simple module from the paraview
from paraview.simple import *
#### disable automatic camera reset on 'Show'
paraview.simple._DisableFirstRenderCameraReset()
# get active view
renderView1 = GetActiveViewOrCreate('RenderView')
# destroy renderView1
Delete(renderView1)
del renderView1
# load state
LoadState('<PvSkript>', DataDirectory='<DataDirectory>',
          AMReXBoxLibGridReader3FileNames=['<AMReXBoxLibGridReader3FileNames>'])
print('script')
# find view
renderView1 = FindViewOrCreate('RenderView1', viewtype='RenderView')
# set active view
SetActiveView(renderView1)
# find source
aMReXBoxLibGridReader3 = FindSource('AMReXBoxLibGridReader3')
# set active source
SetActiveSource(aMReXBoxLibGridReader3)
# Properties modified on aMReXBoxLibGridReader3
aMReXBoxLibGridReader3.CellArrayStatus = []
# update the view to ensure updated data information
renderView1.Update()
# Properties modified on aMReXBoxLibGridReader3
aMReXBoxLibGridReader3.CellArrayStatus = ['Bx', 'By', 'Bz', 'Ex', 'Ey', 'Ez', 'jx', 'jy', 'jz', 'rho']
# update the view to ensure updated data information
renderView1.Update()
# find source
resampleToImage1 = FindSource('ResampleToImage1')
# set active source
SetActiveSource(resampleToImage1)
# get display properties
resampleToImage1Display = GetDisplayProperties(resampleToImage1, view=renderView1)
# toggle 3D widget visibility (only when running from the GUI)
Show3DWidgets(proxy=resampleToImage1Display)
# toggle 3D widget visibility (only when running from the GUI)
Hide3DWidgets(proxy=resampleToImage1Display.SliceFunction)
# toggle 3D widget visibility (only when running from the GUI)
Hide3DWidgets(proxy=resampleToImage1Display)
# reset view to fit data
renderView1.ResetCamera()
# create extractor
pNG1 = CreateExtractor('PNG', renderView1, registrationName='PNG1')
# trace defaults for the extractor.
# init the 'PNG' selected for 'Writer'
pNG1.Writer.FileName = 'RenderView1_%.6ts%cm.png'
pNG1.Writer.ImageResolution = [500, 500]
pNG1.Writer.Format = 'PNG'
# Properties modified on pNG1.Writer
pNG1.Writer.CameraMode = 'Phi-Theta'
# Properties modified on pNG1.Trigger
pNG1.Trigger.UseStartTimeStep = 1
# update the view to ensure updated data information
renderView1.Update()
# Properties modified on pNG1.Trigger
pNG1.Trigger.UseEndTimeStep = 1
# get layout
layout1 = GetLayout()
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [0.00010591363255556098, -2.7970035577731125e-05, 5.460277445390548e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [0.44762183573101705, 0.5471248639415862, -0.7073111588499755]
renderView1.CameraParallelScale = 5.436905525073885e-05
# save extracts
SaveExtracts(ExtractsOutputDirectory='<CDB>',
             GenerateCinemaSpecification=1)
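# The long run of camera placements recorded below is the Phi-Theta sweep:
# the PNG extractor orbits the camera over a sphere centred on the focal
# point. A rough self-contained sketch of that geometry (the axis
# conventions here are an assumption for illustration, not taken from
# ParaView's implementation):

```python
import math


def phi_theta_camera(focal, radius, phi_deg, theta_deg):
    """Camera position on a sphere of the given radius around the focal point.

    phi sweeps azimuth around the vertical axis, theta sweeps elevation;
    at phi = theta = 0 the camera sits directly along +z from the focus.
    """
    phi = math.radians(phi_deg)
    theta = math.radians(theta_deg)
    return (
        focal[0] + radius * math.cos(theta) * math.sin(phi),
        focal[1] + radius * math.sin(theta),
        focal[2] + radius * math.cos(theta) * math.cos(phi),
    )
```

# Each CameraPosition/CameraViewUp pair below corresponds to one such
# (phi, theta) sample; the focal point [0, 0, 9.21e-06] stays fixed.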
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [0.00010591363255556098, -2.7970035577731125e-05, 5.460277445390548e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [0.44762183573101705, 0.5471248639415862, -0.7073111588499755]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [0.0, 0.0, 0.00012778766088545343]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [0.0, 0.00010269045465793091, 6.849929924549199e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [0.0, 0.5000000000000002, -0.8660254037844386]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [0.0, 0.00010269045465793095, -5.0077424034430956e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [0.0, -0.49999999999999967, -0.8660254037844388]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [0.0, 6.945670167485263e-20, -0.00010936578567439248]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [0.0, -1.0, -5.857532553913371e-16]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [0.0, -0.00010269045465793089, -5.0077424034431085e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [0.0, -0.5000000000000008, 0.8660254037844383]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [0.0, -0.00010269045465793101, 6.849929924549188e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [0.0, 0.49999999999999917, 0.8660254037844392]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [0.00010269045465793091, 0.0, 6.849929924549199e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [5.134522732896548e-05, 0.00010269045465793093, 3.885511842551125e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [-0.7499999999999999, 0.5000000000000002, -0.43301270189221946]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [-5.134522732896543e-05, 0.00010269045465793097, -2.0433243214450247e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [-0.7500000000000002, -0.4999999999999997, -0.4330127018922196]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [-0.00010269045465793095, 7.030373462210693e-20, -5.007742403443102e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [-5.389951109126102e-16, -1.0, -2.5222625841645313e-16]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [-5.134522732896556e-05, -0.00010269045465793091, -2.0433243214450294e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [0.7499999999999997, -0.5000000000000007, 0.4330127018922193]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [5.1345227328965375e-05, -0.00010269045465793104, 3.885511842551121e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [0.7500000000000006, 0.4999999999999991, 0.4330127018922195]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [0.00010269045465793094, 0.0, -5.0077424034430956e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [5.13452273289655e-05, 0.00010269045465793091, -2.043324321445024e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [-0.75, 0.5000000000000002, 0.433012701892219]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [-5.134522732896544e-05, 0.00010269045465793097, 3.8855118425511205e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [-0.7500000000000003, -0.4999999999999997, 0.4330127018922192]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [-0.00010269045465793098, 4.828087799349512e-20, 6.849929924549196e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [-4.2919644476521296e-16, -1.0, 7.094984288033657e-17]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [-5.134522732896556e-05, -0.00010269045465793093, 3.885511842551124e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [0.7499999999999998, -0.5000000000000004, -0.43301270189221913]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [5.134522732896541e-05, -0.00010269045465793102, -2.0433243214450233e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [0.7500000000000004, 0.49999999999999933, -0.4330127018922193]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [1.4521460461813908e-20, 0.0, -0.00010936578567439246]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [7.260730230906957e-21, 0.0001026904546579309, -5.007742403443101e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [-1.0605752387249068e-16, 0.5000000000000003, 0.8660254037844386]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [-7.260730230906948e-21, 0.00010269045465793094, 6.849929924549192e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [-1.0605752387249072e-16, -0.49999999999999967, 0.8660254037844388]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [-1.4521460461813908e-20, 7.284483346386983e-20, 0.00012778766088545343]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [-8.737044059982583e-32, -1.0, 6.14326584922622e-16]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [-7.260730230906968e-21, -0.00010269045465793087, 6.849929924549206e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [1.0605752387249063e-16, -0.5000000000000009, -0.8660254037844383]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [7.260730230906939e-21, -0.000102690454657931, -5.007742403443088e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [1.0605752387249078e-16, 0.49999999999999906, -0.8660254037844393]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [-0.0001026904546579309, 0.0, -5.0077424034431044e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [-5.134522732896548e-05, 0.00010269045465793091, -2.0433243214450294e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [0.7499999999999997, 0.5000000000000002, 0.43301270189221963]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [5.13452273289654e-05, 0.00010269045465793095, 3.8855118425511225e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [0.7500000000000001, -0.49999999999999956, 0.43301270189221985]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [0.00010269045465793089, 7.623296525288703e-20, 6.849929924549202e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [5.168779601716475e-16, -1.0, 3.9054094057795733e-16]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [5.134522732896552e-05, -0.00010269045465793085, 3.885511842551131e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [-0.7499999999999993, -0.5000000000000009, -0.4330127018922194]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [-5.134522732896533e-05, -0.00010269045465793097, -2.04332432144502e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [-0.7500000000000002, 0.49999999999999895, -0.43301270189222013]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [-0.00010269045465793091, 0.0, 6.849929924549199e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [-5.134522732896548e-05, 0.00010269045465793093, 3.885511842551125e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [0.7499999999999999, 0.5000000000000002, -0.43301270189221946]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [5.134522732896543e-05, 0.00010269045465793097, -2.0433243214450247e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [0.7500000000000002, -0.4999999999999997, -0.4330127018922196]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [0.00010269045465793095, 7.030373462210693e-20, -5.007742403443102e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [5.389951109126102e-16, -1.0, -2.5222625841645313e-16]
renderView1.CameraParallelScale = 5.436905525073885e-05
# layout/tab size in pixels
layout1.SetSize(850, 862)
# current camera placement for renderView1
renderView1.CameraPosition = [5.134522732896556e-05, -0.00010269045465793091, -2.0433243214450294e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [-0.7499999999999997, -0.5000000000000007, 0.4330127018922193]
renderView1.CameraParallelScale = 5.436905525073885e-05
#================================================================
# addendum: following script captures some of the application
# state to faithfully reproduce the visualization during playback
#================================================================
#--------------------------------
# saving layout sizes for layouts
# layout/tab size in pixels
layout1.SetSize(850, 862)
#-----------------------------------
# saving camera placements for views
# current camera placement for renderView1
renderView1.CameraPosition = [-5.1345227328965375e-05, -0.00010269045465793104, 3.885511842551121e-05]
renderView1.CameraFocalPoint = [0.0, 0.0, 9.21093760553049e-06]
renderView1.CameraViewUp = [-0.7500000000000006, 0.4999999999999991, 0.4330127018922195]
renderView1.CameraParallelScale = 5.436905525073885e-05
# ---- main/views.py (waterwoodwind/QA_web, MIT) ----
# -*- coding: utf-8 -*-
import sys
reload(sys)
sys.setdefaultencoding('utf-8')
from django.shortcuts import render
from django.http import HttpResponse
from main.models import qa_info
from django.core import serializers
import json
import pandas as pd
import arrow
import re
import pickle
#time count
import time
# import save and load data function
from save_load_func import list_all_data
def timeit(func):
def wrapper(*args, **args2):
start = time.clock()
back = func(*args, **args2)
end =time.clock()
print "@%.3fs taken for {%s}" % (end - start, func.__name__)
return back
return wrapper
# Create functions here.
def make_scrutator_json(df_data, department, source):
df_da = df_data[(df_data[u"受检单位"] == department)& \
(df_data[u"信息来源"] == source)]
df_person = df_da[u'责任人']
#chinese_name = u'([/u4e00-/u9fa5]+)'
#pattern = re.compile(chinese_name)
res_dict = {}
for item in df_person.values:
#print item
results = re.findall(ur"[\u4e00-\u9fa5]+", item)
for result in results:
#print result
res_dict[result] = res_dict.get(result, 0) + 1
    # count occurrences in the 检查者 (inspector) column
df_scrutator = df_da[u"检查者"]
    res_dict = {}  # note: this reset discards the 责任人 counts computed above
for item in df_scrutator.values:
# print item
results = re.findall(ur"[\u4e00-\u9fa5\d]+", item)
for result in results:
# print result
res_dict[result] = res_dict.get(result, 0) + 1
scrutator_count_list = res_dict.items()
json_list = []
for item in scrutator_count_list:
single_dict = {}
        # skip the placeholder "无" (none) in the name list
if item[0] == u'无':
print item[0], item[1]
continue
single_dict[u'检查者'] = item[0]
single_dict[u'次数'] = item[1]
json_list.append(single_dict)
json_scrutator = json.dumps(json_list)
return json_scrutator
def make_month_count_json():
df_data = pd.read_hdf('data.h5', 'df')
df_da = pd.DataFrame(list_all_data(), index=df_data[u'日期'])
string_index = df_data[u'日期']
    # compute the start and end months
start_day = string_index.min()
end_day = string_index.max()
start_ar = arrow.get(start_day)
end_ar = arrow.get(end_day)
print end_ar
if start_ar.day >= 26:
number_month = start_ar.month + 1
else:
number_month = start_ar.month
start_month = start_ar.replace(month=number_month)
if end_ar.day >= 26:
number_month = end_ar.month + 1
else:
number_month = end_ar.month + 1
end_month = end_ar.replace(months=number_month)
print end_month
list_month = []
list_month_count = []
for r in arrow.Arrow.range('month', start_month, end_month):
year_month = r.format("YYYY-MM")
end = arrow.get(r)
end = end.replace(day=25)
start = end.replace(months=-1)
start = start.replace(day=26)
list_a_month = []
for r in arrow.Arrow.range('day', start, end):
a_day = r.format('YYYY-MM-DD')
list_a_month.append(a_day)
try:
df_month = df_da.loc[list_a_month]
list_month.append(year_month)
list_month_count.append(df_month[u'日期'].count())
except:
continue
json_month = json.dumps(list_month)
json_count = json.dumps(list_month_count)
return json_month, json_count
# Create your views here.
@timeit
def home(request):
if request.method == 'POST':
post_data = request.POST
date_range = post_data["date_range"]
date_start = date_range.split(' to ')[0]
date_end = date_range.split(' to ')[1]
print date_start,date_end
df_data = pd.DataFrame(date_range_df_chinese_data(date_start,date_end))
else:
df_data = pd.read_hdf('data.h5', 'df')
if df_data.empty:
return HttpResponse(u"该时间范围内无数据,请返回上一页")
air1_dep = make_scrutator_json(df_data, u"航线一", u"车间监管")
air1_team = make_scrutator_json(df_data, u"航线一", u"班组自查")
air2_dep = make_scrutator_json(df_data, u"航线二", u"车间监管")
air2_team = make_scrutator_json(df_data, u"航线二", u"班组自查")
air3_dep = make_scrutator_json(df_data, u"航线三", u"车间监管")
air3_team = make_scrutator_json(df_data, u"航线三", u"班组自查")
certain_dep = make_scrutator_json(df_data, u"定检", u"车间监管")
certain_team = make_scrutator_json(df_data, u"定检", u"班组自查")
#
json_month, json_count = make_month_count_json()
return render(request, "home.html", {'air1_dep': air1_dep,
'air1_team': air1_team,
'air2_dep':air2_dep,
'air2_team':air2_team,
'air3_dep': air3_dep,
'air3_team': air3_team,
'certain_dep':certain_dep,
'certain_team':certain_team,
"json_month":json_month,
"json_count":json_count})
@timeit
def information(request):
upload_data = json.dumps(list_all_data())
return render(request, 'information.html',{'json_data': upload_data})
@timeit
def df_chinese_data():
exclude_list = []
query_data = qa_info.objects.all().order_by('-data')
json_data = serializers.serialize("json", query_data, use_natural_foreign_keys=True)
list_data = json.loads(json_data)
dict_name_verbose_name = {}
columns_set = []
colheaders = []
dataSchema = {}
for field in qa_info._meta.fields:
dict_name_verbose_name[field.name] = field.verbose_name
if not field.verbose_name in exclude_list:
print field.verbose_name
colheaders.append(field.verbose_name.encode("utf8"))
dataSchema[field.verbose_name] = ''
columns_item = {
u"title": field.verbose_name,
u"field": field.verbose_name,
# u"sortable": u"true",
}
if field.verbose_name == u"问题描述":
columns_item[u"width"] = u"20%"
columns_item[u"title"] = u"问题描述"
elif field.verbose_name == u"整改措施":
columns_item[u"width"] = u"20%"
columns_item[u"title"] = u"整改措施"
elif field.verbose_name == u"处理意见":
columns_item[u"width"] = u"6%"
columns_item[u"title"] = u"处理意见"
else:
split_list = list(field.verbose_name)
# every two word add
title_str = ""
for i in range(len(split_list)):
title_str = title_str + split_list[i]
if (i + 1) % 2 == 0:
title_str = title_str + u"<br>"
if field.verbose_name == u"相关附件":
columns_item[u'formatter'] = "attachment"
columns_item[u"title"] = title_str
columns_item[u"width"] = u"2%"
columns_set.append(columns_item)
json_columns = json.dumps(columns_set)
upload_data = []
for item in list_data:
single_data = item['fields']
single_data[u'id'] = item['pk']
upload_data.append(single_data)
# print upload_data
chinese_updata = []
for item in upload_data:
dict_updata = {}
for key, value in item.items():
dict_updata[dict_name_verbose_name[key]] = value
# print chinese_updata
chinese_updata.append(dict_updata)
#save list
    with open('data_all.pkl', 'wb') as file_1:
        pickle.dump(chinese_updata, file_1, True)
#save pd file
df_data = pd.DataFrame(chinese_updata)
df_data.to_hdf('data.h5', 'df')
return chinese_updata
def return_df_chinese_data():
exclude_list = []
query_data = qa_info.objects.all().order_by('-data')
json_data = serializers.serialize("json", query_data, use_natural_foreign_keys=True)
list_data = json.loads(json_data)
dict_name_verbose_name = {}
columns_set = []
colheaders = []
dataSchema = {}
for field in qa_info._meta.fields:
dict_name_verbose_name[field.name] = field.verbose_name
if not field.verbose_name in exclude_list:
print field.verbose_name
colheaders.append(field.verbose_name.encode("utf8"))
dataSchema[field.verbose_name] = ''
columns_item = {
u"title": field.verbose_name,
u"field": field.verbose_name,
# u"sortable": u"true",
}
if field.verbose_name == u"问题描述":
columns_item[u"width"] = u"20%"
columns_item[u"title"] = u"问题描述"
elif field.verbose_name == u"整改措施":
columns_item[u"width"] = u"20%"
columns_item[u"title"] = u"整改措施"
elif field.verbose_name == u"处理意见":
columns_item[u"width"] = u"6%"
columns_item[u"title"] = u"处理意见"
else:
split_list = list(field.verbose_name)
# every two word add
title_str = ""
for i in range(len(split_list)):
title_str = title_str + split_list[i]
if (i + 1) % 2 == 0:
title_str = title_str + u"<br>"
if field.verbose_name == u"相关附件":
columns_item[u'formatter'] = "attachment"
columns_item[u"title"] = title_str
columns_item[u"width"] = u"2%"
columns_set.append(columns_item)
json_columns = json.dumps(columns_set)
upload_data = []
for item in list_data:
single_data = item['fields']
single_data[u'id'] = item['pk']
upload_data.append(single_data)
# print upload_data
chinese_updata = []
for item in upload_data:
dict_updata = {}
for key, value in item.items():
dict_updata[dict_name_verbose_name[key]] = value
# print chinese_updata
chinese_updata.append(dict_updata)
return chinese_updata
@timeit
def background(request):
upload_data = json.dumps(return_df_chinese_data())
return render(request, 'background.html', {'json_data': upload_data})
@timeit
def source(request):
df_data = pd.read_hdf('data.h5', 'df')
source = df_data[u"信息来源"].value_counts().to_json()
title = u'汇总'
return render(request, 'source.html',{'title': title, 'source': source})
@timeit
def source_month(request):
df_data = pd.read_hdf('data.h5', 'df')
df_da = pd.DataFrame(list_all_data(), index=df_data[u'日期'])
year_month = request.GET.get('value_conf', None)
end = arrow.get(year_month)
end = end.replace(day=25)
start = end.replace(months=-1)
start = start.replace(day=26)
print end
list_a_month = []
for r in arrow.Arrow.range('day', start, end):
a_day = r.format('YYYY-MM-DD')
list_a_month.append(a_day)
df_month = df_da.loc[list_a_month]
source = df_month[u"信息来源"].value_counts().to_json()
print source
print type(source)
return HttpResponse(source)
@timeit
def month_count(request):
df_data = pd.read_hdf('data.h5', 'df')
df_da = pd.DataFrame(list_all_data(), index=df_data[u'日期'])
string_index = df_data[u'日期']
    # compute the start and end months
start_day = string_index.min()
end_day = string_index.max()
start_ar = arrow.get(start_day)
end_ar = arrow.get(end_day)
print end_ar
if start_ar.day >= 26:
number_month = start_ar.month + 1
else:
number_month = start_ar.month
start_month = start_ar.replace(month=number_month)
if end_ar.day >= 26:
number_month = end_ar.month
else:
number_month = end_ar.month
end_month = end_ar.replace(months=number_month)
print end_month
list_month = []
list_month_count = []
for r in arrow.Arrow.range('month', start_month, end_month):
year_month = r.format("YYYY-MM")
end = arrow.get(r)
end = end.replace(day=25)
start = end.replace(months=-1)
start = start.replace(day=26)
list_a_month = []
for r in arrow.Arrow.range('day', start, end):
a_day = r.format('YYYY-MM-DD')
list_a_month.append(a_day)
try:
df_month = df_da.loc[list_a_month]
list_month.append(year_month)
list_month_count.append(df_month[u'日期'].count())
except:
continue
json_month = json.dumps(list_month)
json_count = json.dumps(list_month_count)
return render(request, "month_count.html",{"json_month":json_month,
"json_count":json_count})
def classification(request):
df_data = pd.read_hdf('data.h5', 'df')
df_da = pd.DataFrame(list_all_data(), index=df_data[u'日期'])
string_index = df_data[u'日期']
    # compute the start and end months
start_day = string_index.min()
end_day = string_index.max()
start_ar = arrow.get(start_day)
end_ar = arrow.get(end_day)
if start_ar.day >= 26:
start_ar_shift = start_ar.shift(months = 1)
number_month = start_ar_shift.month
else:
number_month = start_ar.month
start_month = start_ar.replace(month=number_month)
end_ar = end_ar.shift(months = 1)
end_month = end_ar
print "end_month"
print end_month
list_month = []
list_month_cl_count = []
for r in arrow.Arrow.range('month', start_month, end_month):
print r
year_month = r.format("YYYY-MM").encode("utf-8")
end = arrow.get(r)
end = end.replace(day=25)
start = end.replace(months=-1)
start = start.replace(day=26)
list_a_month = []
for r in arrow.Arrow.range('day', start, end):
a_day = r.format('YYYY-MM-DD')
list_a_month.append(a_day)
try:
df_month = df_da.loc[list_a_month]
df_month_group = df_month.groupby(u'问题分类')
df_cl = df_month_group[u"时间"].count()
dict_cl = df_cl.to_dict()
list_cl_name = [u'程序执行', u'工卡执行', u'工具设备', u"维护作风", u"现场管理", u"维修记录", u"生产组织", u"器材管理", u"其它"]
for item in list_cl_name:
dict_cl.setdefault(item, 0)
single_month = [dict_cl[u'程序执行'], dict_cl[u'工卡执行'], dict_cl[u'工具设备'], dict_cl[u"维护作风"], dict_cl[u"现场管理"],
dict_cl[u"维修记录"], dict_cl[u"生产组织"], dict_cl[u"器材管理"], dict_cl[u"其它"]]
single_month = map(lambda x: int(x), single_month)
list_month.append(year_month)
list_month_cl_count.append({year_month : single_month})
except:
continue
series_single_orignal = {
"name": '2016-05',
"type": 'bar',
"data":[21,33,56,89,44,55,66],
"itemStyle": {
"normal":{
"label":{
"show": "true",
"formatter": '{c}'
}
}
}
}
all_series = []
for list_item in list_month_cl_count:
for item, value in list_item.items():
#print item, value
series_single = series_single_orignal.copy()
series_single["name"] = item
series_single["data"] = value
#print series_single["name"], series_single["data"]
all_series.append(series_single)
#print all_series
    # use the last month's entry to control which series is shown by default
dict_selected = {}
for index, list_item in enumerate(list_month_cl_count):
for item, value in list_item.items():
dict_selected[item] = False
if index == (len(list_month_cl_count) - 1):
dict_selected[item] = True
json_month = json.dumps(list_month)
json_count = json.dumps(dict_selected)
json_series = json.dumps(all_series)
return render(request, "classification.html",{"json_month":json_month,
"json_count":json_count,
"json_series": json_series})
@timeit
def person_count(request):
if request.method == 'POST':
post_data = request.POST
date_range = post_data["date_range"]
date_start = date_range.split(' to ')[0]
date_end = date_range.split(' to ')[1]
print date_start,date_end
df_data = pd.DataFrame(date_range_df_chinese_data(date_start,date_end))
else:
df_data = pd.read_hdf('data.h5', 'df')
if df_data.empty:
return HttpResponse(u"该时间范围内无数据,请返回上一页")
df_da = df_data
df_person = df_da[u'责任人']
#chinese_name = u'([/u4e00-/u9fa5]+)'
#pattern = re.compile(chinese_name)
res_dict = {}
for item in df_person.values:
#print item
results = re.findall(ur"[\u4e00-\u9fa5]+", item)
for result in results:
#print result
res_dict[result] = res_dict.get(result, 0) + 1
person_count_list = res_dict.items()
json_list = []
for item in person_count_list:
single_dict = {}
        # skip the placeholder "无" (none) in the name list
if item[0] == u'无':
print item[0], item[1]
continue
single_dict[u'责任人'] = item[0]
single_dict[u'次数'] = item[1]
json_list.append(single_dict)
json_person = json.dumps(json_list)
#检查者
df_scrutator = df_da[u"检查者"]
res_dict = {}
for item in df_scrutator.values:
# print item
results = re.findall(ur"[\u4e00-\u9fa5\d]+", item)
for result in results:
# print result
res_dict[result] = res_dict.get(result, 0) + 1
scrutator_count_list = res_dict.items()
json_list = []
for item in scrutator_count_list:
single_dict = {}
        # skip the placeholder "无" (none) in the name list
if item[0] == u'无':
print item[0], item[1]
continue
single_dict[u'检查者'] = item[0]
single_dict[u'次数'] = item[1]
json_list.append(single_dict)
json_scrutator = json.dumps(json_list)
return render(request, "person_count.html", {'json_person': json_person,
'json_scrutator':json_scrutator})
@timeit
def date_range_df_chinese_data(date_start, date_end):
exclude_list = []
query_data = qa_info.objects.filter(data__range=[date_start, date_end]).order_by('-data')
json_data = serializers.serialize("json", query_data, use_natural_foreign_keys=True)
list_data = json.loads(json_data)
dict_name_verbose_name = {}
columns_set = []
colheaders = []
dataSchema = {}
for field in qa_info._meta.fields:
dict_name_verbose_name[field.name] = field.verbose_name
if not field.verbose_name in exclude_list:
print field.verbose_name
colheaders.append(field.verbose_name.encode("utf8"))
dataSchema[field.verbose_name] = ''
columns_item = {
u"title": field.verbose_name,
u"field": field.verbose_name,
# u"sortable": u"true",
}
if field.verbose_name == u"问题描述":
columns_item[u"width"] = u"20%"
columns_item[u"title"] = u"问题描述"
elif field.verbose_name == u"整改措施":
columns_item[u"width"] = u"20%"
columns_item[u"title"] = u"整改措施"
elif field.verbose_name == u"处理意见":
columns_item[u"width"] = u"6%"
columns_item[u"title"] = u"处理意见"
else:
split_list = list(field.verbose_name)
# every two word add
title_str = ""
for i in range(len(split_list)):
title_str = title_str + split_list[i]
if (i + 1) % 2 == 0:
title_str = title_str + u"<br>"
if field.verbose_name == u"相关附件":
columns_item[u'formatter'] = "attachment"
columns_item[u"title"] = title_str
columns_item[u"width"] = u"2%"
columns_set.append(columns_item)
json_columns = json.dumps(columns_set)
upload_data = []
for item in list_data:
single_data = item['fields']
single_data[u'id'] = item['pk']
upload_data.append(single_data)
# print upload_data
chinese_updata = []
for item in upload_data:
dict_updata = {}
for key, value in item.items():
dict_updata[dict_name_verbose_name[key]] = value
# print chinese_updata
chinese_updata.append(dict_updata)
return chinese_updata
@timeit
def month_count_group_by_source(request):
df_data = pd.read_hdf('data.h5', 'df')
df_da = pd.DataFrame(list_all_data(), index=df_data[u'日期'])
string_index = df_data[u'日期']
    # compute the start and end months
start_day = string_index.min()
end_day = string_index.max()
start_ar = arrow.get(start_day)
end_ar = arrow.get(end_day)
print end_ar
if start_ar.day >= 26:
number_month = start_ar.month + 1
else:
number_month = start_ar.month
start_month = start_ar.replace(month=number_month)
if end_ar.day >= 26:
number_month = end_ar.month + 1
else:
number_month = end_ar.month + 1
end_month = end_ar.replace(months=number_month)
print end_month
list_month = []
list_month_count_quality = []
list_month_count_workshop = []
list_month_count_team = []
for r in arrow.Arrow.range('month', start_month, end_month):
year_month = r.format("YYYY-MM")
end = arrow.get(r)
end = end.replace(day=25)
start = end.replace(months=-1)
start = start.replace(day=26)
list_a_month = []
for r in arrow.Arrow.range('day', start, end):
a_day = r.format('YYYY-MM-DD')
list_a_month.append(a_day)
try:
df_month = df_da.loc[list_a_month]
list_month.append(year_month)
list_month_count_quality.append(df_month[u'信息来源'][df_month[u'信息来源']==u"质量监管"].count())
list_month_count_workshop.append(df_month[u'信息来源'][df_month[u'信息来源']==u"车间监管"].count())
list_month_count_team.append(df_month[u'信息来源'][df_month[u'信息来源']==u"班组自查"].count())
except:
continue
json_month = json.dumps(list_month)
json_count_quality = json.dumps(list_month_count_quality)
json_count_workshop = json.dumps(list_month_count_workshop)
json_count_team = json.dumps(list_month_count_team)
return render(request, "month_count_group_by_source.html",{"json_month":json_month,
"json_count_quality":json_count_quality,
"json_count_workshop":json_count_workshop,
"json_count_team":json_count_team})
@timeit
def month_count_group_by_department(request):
df_data = pd.read_hdf('data.h5', 'df')
df_da = pd.DataFrame(list_all_data(), index=df_data[u'日期'])
string_index = df_data[u'日期']
    # compute the start and end months
start_day = string_index.min()
end_day = string_index.max()
start_ar = arrow.get(start_day)
end_ar = arrow.get(end_day)
print end_ar
if start_ar.day >= 26:
number_month = start_ar.month + 1
else:
number_month = start_ar.month
start_month = start_ar.replace(month=number_month)
if end_ar.day >= 26:
number_month = end_ar.month + 1
else:
number_month = end_ar.month + 1
end_month = end_ar.replace(months=number_month)
print end_month
list_month = []
list_month_count_scheduled = []
list_month_count_airline1 = []
list_month_count_airline2 = []
list_month_count_airline3 = []
for r in arrow.Arrow.range('month', start_month, end_month):
year_month = r.format("YYYY-MM")
end = arrow.get(r)
end = end.replace(day=25)
start = end.replace(months=-1)
start = start.replace(day=26)
list_a_month = []
for r in arrow.Arrow.range('day', start, end):
a_day = r.format('YYYY-MM-DD')
list_a_month.append(a_day)
try:
df_month = df_da.loc[list_a_month]
list_month.append(year_month)
list_month_count_scheduled.append(df_month[u'受检单位'][df_month[u'受检单位']==u"定检"].count())
list_month_count_airline1.append(df_month[u'受检单位'][df_month[u'受检单位']==u"航线一"].count())
list_month_count_airline2.append(df_month[u'受检单位'][df_month[u'受检单位']==u"航线二"].count())
list_month_count_airline3.append(df_month[u'受检单位'][df_month[u'受检单位'] == u"航线三"].count())
except:
continue
json_month = json.dumps(list_month)
json_count_scheduled = json.dumps(list_month_count_scheduled)
json_count_airline1 = json.dumps(list_month_count_airline1)
json_count_airline2 = json.dumps(list_month_count_airline2)
json_count_airline3 = json.dumps(list_month_count_airline3)
return render(request, "month_count_group_by_department.html",{"json_month":json_month,
"json_count_scheduled":json_count_scheduled,
"json_count_airline1":json_count_airline1,
"json_count_airline2":json_count_airline2,
                                                                    "json_count_airline3": json_count_airline3})
# ---- tests/test_lib_regressors.py (mkowiel/restraintlib, BSD-3-Clause) ----
# -*- coding: utf-8 -*-
from unittest import TestCase
import sklearn
from restraintlib.restraints import load_function
class LibRegressorsTestCase(TestCase):
def assertFunctional(self, function_name, x, expected_value, expected_sigma):
regressor = load_function(function_name)
value, sigma = regressor.predict([[x]], return_std=True)
print('self.assertFunctional("%s",%s %.1f, %.4f, %.4f)' % (function_name, (42-len(function_name))*" ", x, value[0], sigma[0]))
self.assertAlmostEqual(value[0], expected_value, places=4)
self.assertAlmostEqual(sigma[0], expected_sigma, places=4)
def test_pickle(self):
if sklearn.__version__ < "0.22.0":
self.assertFunctional("All-C1'-N1 or C1'-N9.pickle", 10.0, 1.4874497, 0.011119879)
self.assertFunctional("All-C1'-O4'.pickle", 10.0, 1.4095869, 0.0092497524)
self.assertFunctional("deoxyribose-C2'-endo-C1'-C2'-C3'.pickle", 10.0, 108.46958654, 0.705017644)
self.assertFunctional("deoxyribose-C2'-endo-C1'-O4'-C4'.pickle", 10.0, 113.33767771, 0.654150846)
self.assertFunctional("deoxyribose-C2'-endo-C2'-C3'-C4'.pickle", 10.0, 107.58851579, 0.853301519)
self.assertFunctional("deoxyribose-C2'-endo-C3'-C4'-O4'.pickle", 10.0, 107.49313064, 0.867046559)
self.assertFunctional("deoxyribose-C3'-endo-C1'-C2'-C3'.pickle", 10.0, 109.23909162, 0.981169342)
self.assertFunctional("deoxyribose-C3'-endo-C1'-O4'-C4'.pickle", 10.0, 112.37972166, 0.946601366)
self.assertFunctional("deoxyribose-C3'-endo-C2'-C3'-C4'.pickle", 10.0, 106.43252974, 0.842729834)
self.assertFunctional("deoxyribose-C3'-endo-C3'-C4'-O4'.pickle", 10.0, 109.46664887, 0.696293795)
self.assertFunctional("deoxyribose-Other-C1'-C2'-C3'.pickle", 10.0, 107.0025446, 1.4534300232)
self.assertFunctional("deoxyribose-Other-C1'-O4'-C4'.pickle", 10.0, 112.9288726, 1.3327459941)
self.assertFunctional("deoxyribose-Other-C2'-C3'-C4'.pickle", 10.0, 106.9423613, 1.2974987305)
self.assertFunctional("deoxyribose-Other-C3'-C4'-O4'.pickle", 10.0, 109.3494080, 1.7140458693)
self.assertFunctional("purine-C1'-N1-C2 or C1'-N9-C4.pickle", 10.0, 126.9829216, 1.5048922785)
self.assertFunctional("purine-C1'-N1-C6 or C1'-N9-C8.pickle", 10.0, 127.1839575, 1.6464197451)
self.assertFunctional("purine-N1-C1'-C2' or N9-C1'-C2'.pickle", 10.0, 113.6571028, 1.2607817385)
self.assertFunctional("purine-N1-C1'-O4' or N9-C1'-O4'.pickle", 10.0, 107.3662676, 1.1939858801)
self.assertFunctional("pyrimidine-C1'-N1-C2 or C1'-N9-C4.pickle", 10.0, 117.9132730, 1.3865739592)
self.assertFunctional("pyrimidine-C1'-N1-C6 or C1'-N9-C8.pickle", 10.0, 120.7343454, 1.2841006880)
self.assertFunctional("pyrimidine-N1-C1'-C2' or N9-C1'-C2'.pickle", 10.0, 113.5856496, 1.3802839915)
self.assertFunctional("pyrimidine-N1-C1'-O4' or N9-C1'-O4'.pickle", 10.0, 108.0619242, 1.1917431290)
self.assertFunctional("ribose-C2'-endo-C1'-C2'-C3'.pickle", 10.0, 107.6786200, 0.8340928962)
self.assertFunctional("ribose-C2'-endo-C1'-O4'-C4'.pickle", 10.0, 113.6515472, 0.8514767462)
self.assertFunctional("ribose-C2'-endo-C2'-C3'-C4'.pickle", 10.0, 106.5254955, 0.8538892370)
self.assertFunctional("ribose-C2'-endo-C3'-C4'-O4'.pickle", 10.0, 109.0108704, 0.8917961243)
self.assertFunctional("ribose-C3'-endo-C1'-C2'-C3'.pickle", 10.0, 106.5080838, 1.1667549808)
self.assertFunctional("ribose-C3'-endo-C1'-O4'-C4'.pickle", 10.0, 114.2986980, 0.7747160588)
self.assertFunctional("ribose-C3'-endo-C2'-C3'-C4'.pickle", 10.0, 108.9975249, 0.9434856130)
self.assertFunctional("ribose-C3'-endo-C3'-C4'-O4'.pickle", 10.0, 109.3762512, 0.8818958553)
self.assertFunctional("ribose-Other-C1'-C2'-C3'.pickle", 10.0, 108.1068697, 2.3693564847)
self.assertFunctional("ribose-Other-C1'-O4'-C4'.pickle", 10.0, 116.0915706, 1.6088767710)
self.assertFunctional("ribose-Other-C2'-C3'-C4'.pickle", 10.0, 106.1877587, 1.4164389140)
self.assertFunctional("ribose-Other-C3'-C4'-O4'.pickle", 10.0, 104.7131548, 1.0766946431)
else:
self.assertFunctional("All-C1'-N1 or C1'-N9.pickle", 10.0, 1.4868, 0.0111)
self.assertFunctional("All-C1'-O4'.pickle", 10.0, 1.4096, 0.0093)
self.assertFunctional("deoxyribose-C2'-endo-C1'-C2'-C3'.pickle", 10.0, 108.4696, 0.7050)
self.assertFunctional("deoxyribose-C2'-endo-C1'-O4'-C4'.pickle", 10.0, 113.3377, 0.6542)
self.assertFunctional("deoxyribose-C2'-endo-C2'-C3'-C4'.pickle", 10.0, 107.5885, 0.8533)
self.assertFunctional("deoxyribose-C2'-endo-C3'-C4'-O4'.pickle", 10.0, 107.4932, 0.8670)
self.assertFunctional("deoxyribose-C3'-endo-C1'-C2'-C3'.pickle", 10.0, 109.2391, 0.9812)
self.assertFunctional("deoxyribose-C3'-endo-C1'-O4'-C4'.pickle", 10.0, 112.3798, 0.9466)
self.assertFunctional("deoxyribose-C3'-endo-C2'-C3'-C4'.pickle", 10.0, 106.4325, 0.8427)
self.assertFunctional("deoxyribose-C3'-endo-C3'-C4'-O4'.pickle", 10.0, 109.4667, 0.6963)
self.assertFunctional("deoxyribose-Other-C1'-C2'-C3'.pickle", 10.0, 107.0026, 1.4534)
self.assertFunctional("deoxyribose-Other-C1'-O4'-C4'.pickle", 10.0, 112.9289, 1.3327)
self.assertFunctional("deoxyribose-Other-C2'-C3'-C4'.pickle", 10.0, 106.9424, 1.2975)
self.assertFunctional("deoxyribose-Other-C3'-C4'-O4'.pickle", 10.0, 109.3494, 1.7140)
self.assertFunctional("purine-C1'-N1-C2 or C1'-N9-C4.pickle", 10.0, 127.0279, 1.6484)
self.assertFunctional("purine-C1'-N1-C6 or C1'-N9-C8.pickle", 10.0, 127.4152, 1.7406)
self.assertFunctional("purine-N1-C1'-C2' or N9-C1'-C2'.pickle", 10.0, 113.4415, 1.2873)
self.assertFunctional("purine-N1-C1'-O4' or N9-C1'-O4'.pickle", 10.0, 107.3712, 1.1930)
self.assertFunctional("pyrimidine-C1'-N1-C2 or C1'-N9-C4.pickle", 10.0, 117.9546, 1.7695)
self.assertFunctional("pyrimidine-C1'-N1-C6 or C1'-N9-C8.pickle", 10.0, 120.7348, 1.6394)
self.assertFunctional("pyrimidine-N1-C1'-C2' or N9-C1'-C2'.pickle", 10.0, 113.5880, 1.5892)
self.assertFunctional("pyrimidine-N1-C1'-O4' or N9-C1'-O4'.pickle", 10.0, 108.0620, 0.9913)
self.assertFunctional("ribose-C2'-endo-C1'-C2'-C3'.pickle", 10.0, 107.6786, 0.8341)
self.assertFunctional("ribose-C2'-endo-C1'-O4'-C4'.pickle", 10.0, 113.6515, 0.8515)
self.assertFunctional("ribose-C2'-endo-C2'-C3'-C4'.pickle", 10.0, 106.5255, 0.8539)
self.assertFunctional("ribose-C2'-endo-C3'-C4'-O4'.pickle", 10.0, 109.0109, 0.8918)
self.assertFunctional("ribose-C3'-endo-C1'-C2'-C3'.pickle", 10.0, 106.5081, 1.1668)
self.assertFunctional("ribose-C3'-endo-C1'-O4'-C4'.pickle", 10.0, 114.2987, 0.7747)
self.assertFunctional("ribose-C3'-endo-C2'-C3'-C4'.pickle", 10.0, 108.9975, 0.9435)
self.assertFunctional("ribose-C3'-endo-C3'-C4'-O4'.pickle", 10.0, 109.3763, 0.8819)
self.assertFunctional("ribose-Other-C1'-C2'-C3'.pickle", 10.0, 108.1072, 2.3693)
self.assertFunctional("ribose-Other-C1'-O4'-C4'.pickle", 10.0, 116.0916, 1.6089)
self.assertFunctional("ribose-Other-C2'-C3'-C4'.pickle", 10.0, 106.1879, 1.4164)
self.assertFunctional("ribose-Other-C3'-C4'-O4'.pickle", 10.0, 104.7146, 1.0767)
| 89.340909 | 134 | 0.627448 | 1,117 | 7,862 | 4.40197 | 0.189794 | 0.280659 | 0.124466 | 0.06264 | 0.71629 | 0.71629 | 0.71629 | 0.71629 | 0.71629 | 0.701241 | 0 | 0.252854 | 0.186594 | 7,862 | 87 | 135 | 90.367816 | 0.516028 | 0.002671 | 0 | 0 | 0 | 0 | 0.315346 | 0.23179 | 0 | 0 | 0 | 0 | 0.888889 | 1 | 0.024691 | false | 0 | 0.037037 | 0 | 0.074074 | 0.012346 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
ea9060ce0538ada7431990c8c7cd72fbd3c39853 | 4,691 | py | Python | maintain_ner_position/my_insert.py | wmc1992/maintain_ner_position | 910622c715b1009039ec4a5c2d7909d180b888fd | [
"MIT"
] | null | null | null | maintain_ner_position/my_insert.py | wmc1992/maintain_ner_position | 910622c715b1009039ec4a5c2d7909d180b888fd | [
"MIT"
] | null | null | null | maintain_ner_position/my_insert.py | wmc1992/maintain_ner_position | 910622c715b1009039ec4a5c2d7909d180b888fd | [
"MIT"
] | null | null | null | from maintain_ner_position.my_utils import check_entity_position, \
convert_entity_object_type, reconvert_entity_object_type
def insert_entity(content, entity_list, insert_idx, insert_cont, insert_type):
"""
插入一个新实体,并维持实体列表中所有的索引值正常;若新插入的文本刚好处于某个实体的边界,新插入的文本不
会作为实体,原实体保持不变。
举例如下:
原文本:小明去招商银行。
原实体:{"type": "机构", "start": 3, "end": 7, "value": "招商银行"}
插入文本:中国
插入索引:3
新文本:小明去中国招商银行。
新实体:{"type": "机构", "start": 5, "end": 9, "value": "招商银行"}
:param content: 原始文本
:param entity_list: 原始文本对应的实体列表
:param insert_idx: 插入的位置索引
:param insert_cont: 待插入的文本
:param insert_type: 新插入的文本对应的实体类型
:return:
"""
entity_list, entity_object_type = convert_entity_object_type(entity_list)
content, entity_list = insert_content(content, entity_list, insert_idx, insert_cont)
entity_list.append({
"type": insert_type,
"start": insert_idx,
"end": insert_idx + len(insert_cont),
"value": content[insert_idx:insert_idx + len(insert_cont)],
})
entity_list = reconvert_entity_object_type(entity_list, entity_object_type)
return content, entity_list
def insert_content(content, entity_list, insert_idx, insert_cont):
"""
插入一段文本,并维持实体列表中所有的索引值正常;若新插入的文本刚好处于某个实体的边界,新插入的文本不
会作为实体,原实体保持不变。
举例如下:
原文本:小明去招商银行。
原实体:{"type": "机构", "start": 3, "end": 7, "value": "招商银行"}
插入文本:中国
插入索引:3
新文本:小明去中国招商银行。
新实体:{"type": "机构", "start": 5, "end": 9, "value": "招商银行"}
:param content: 原始文本
:param entity_list: 原始文本对应的实体列表
:param insert_idx: 插入的位置索引
:param insert_cont: 待插入的文本
:return:
"""
entity_list, entity_object_type = convert_entity_object_type(entity_list)
check_entity_position(content, entity_list)
if insert_idx > len(content):
raise RuntimeError(f"插入的位置索引超过了文本长度,位置索引:{insert_idx},文本长度:{len(content)}")
for e in entity_list:
if e["start"] >= insert_idx:
e["start"] += len(insert_cont)
if e["end"] > insert_idx:
e["end"] += len(insert_cont)
content = content[:insert_idx] + insert_cont + content[insert_idx:]
for e in entity_list:
e["value"] = content[e["start"]:e["end"]]
entity_list = reconvert_entity_object_type(entity_list, entity_object_type)
return content, entity_list
def insert_entity_extend_entity(content, entity_list, insert_idx, insert_cont, insert_type):
"""
插入一个新实体,并维持实体列表中所有的索引值正常;若新插入的文本刚好处于某个实体的边界,会扩展该实体的
边界,将新插入的文本也作为实体的一部分。
举例如下:
原文本:小明去招商银行。
原实体:{"type": "机构", "start": 3, "end": 7, "value": "招商银行"}
插入文本:中国
插入索引:3
新文本:小明去中国招商银行。
新实体:{"type": "机构", "start": 3, "end": 9, "value": "中国招商银行"}
:param content: 原始文本
:param entity_list: 原始文本对应的实体列表
:param insert_idx: 插入的位置索引
:param insert_cont: 待插入的文本
:param insert_type: 新插入的文本对应的实体类型
:return:
"""
entity_list, entity_object_type = convert_entity_object_type(entity_list)
content, entity_list = insert_content_extend_entity(content, entity_list, insert_idx, insert_cont)
entity_list.append({
"type": insert_type,
"start": insert_idx,
"end": insert_idx + len(insert_cont),
"value": content[insert_idx:insert_idx + len(insert_cont)],
})
entity_list = reconvert_entity_object_type(entity_list, entity_object_type)
return content, entity_list
def insert_content_extend_entity(content, entity_list, insert_idx, insert_cont):
"""
插入一段文本,并维持实体列表中所有的索引值正常;若新插入的文本刚好处于某个实体的边界,会扩展该实体的
边界,将新插入的文本也作为实体的一部分。
举例如下:
原文本:小明去招商银行。
原实体:{"type": "机构", "start": 3, "end": 7, "value": "招商银行"}
插入文本:中国
插入索引:3
新文本:小明去中国招商银行。
新实体:{"type": "机构", "start": 3, "end": 9, "value": "中国招商银行"}
:param content: 原始文本
:param entity_list: 原始文本对应的实体列表
:param insert_idx: 插入的位置索引
:param insert_cont: 待插入的文本
:return:
"""
entity_list, entity_object_type = convert_entity_object_type(entity_list)
check_entity_position(content, entity_list)
if insert_idx > len(content):
raise RuntimeError(f"插入的位置索引超过了文本长度,位置索引:{insert_idx},文本长度:{len(content)}")
for e in entity_list:
if e["start"] > insert_idx:
e["start"] += len(insert_cont)
if e["end"] >= insert_idx:
e["end"] += len(insert_cont)
content = content[:insert_idx] + insert_cont + content[insert_idx:]
for e in entity_list:
e["value"] = content[e["start"]:e["end"]]
entity_list = reconvert_entity_object_type(entity_list, entity_object_type)
return content, entity_list
| 31.911565 | 102 | 0.655297 | 582 | 4,691 | 5.001718 | 0.118557 | 0.13741 | 0.098935 | 0.063209 | 0.959464 | 0.959464 | 0.959464 | 0.959464 | 0.959464 | 0.957403 | 0 | 0.005464 | 0.219783 | 4,691 | 146 | 103 | 32.130137 | 0.789891 | 0.336176 | 0 | 0.777778 | 0 | 0 | 0.069087 | 0.036658 | 0 | 0 | 0 | 0 | 0 | 1 | 0.074074 | false | 0 | 0.018519 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
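The boundary rule documented in `my_insert.py` can be sketched standalone. The following minimal re-implementation of the index-shifting core (the `my_utils` conversion/validation helpers are omitted, and `shift_entities` is a hypothetical name, not part of the package) reproduces both behaviors on the docstring example:

```python
def shift_entities(content, entities, idx, text, extend=False):
    """Insert `text` at `idx` and shift entity spans accordingly.

    extend=False: an entity touching the insert point keeps its original text.
    extend=True:  an entity touching the insert point absorbs the inserted text.
    """
    for e in entities:
        # Strict vs. non-strict comparisons at the boundary decide whether the
        # inserted text ends up inside or outside an adjacent entity.
        if (e["start"] > idx) if extend else (e["start"] >= idx):
            e["start"] += len(text)
        if (e["end"] >= idx) if extend else (e["end"] > idx):
            e["end"] += len(text)
    content = content[:idx] + text + content[idx:]
    for e in entities:
        e["value"] = content[e["start"]:e["end"]]
    return content, entities

# Boundary insert at the entity start (index 3), non-extending:
c1, e1 = shift_entities("小明去招商银行。", [{"start": 3, "end": 7}], 3, "中国")
assert (e1[0]["start"], e1[0]["end"], e1[0]["value"]) == (5, 9, "招商银行")

# Same insert, extending the entity to absorb the new text:
c2, e2 = shift_entities("小明去招商银行。", [{"start": 3, "end": 7}], 3, "中国", extend=True)
assert (e2[0]["start"], e2[0]["end"], e2[0]["value"]) == (3, 9, "中国招商银行")
```

The only difference between the two modes is which comparisons are strict at the insertion index, which matches the `>=`/`>` pairs in `insert_content` versus `insert_content_extend_entity`.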
575e5fab108ff6e2b86f703507bf9901750a6a8d | 2,203 | py | Python | reshape_checkpoints.py | saralpatel/mmdetection | ad7889cf2d5c3ce82fbfc895cbbc38c416085fe2 | [
"Apache-2.0"
] | null | null | null | reshape_checkpoints.py | saralpatel/mmdetection | ad7889cf2d5c3ce82fbfc895cbbc38c416085fe2 | [
"Apache-2.0"
] | null | null | null | reshape_checkpoints.py | saralpatel/mmdetection | ad7889cf2d5c3ce82fbfc895cbbc38c416085fe2 | [
"Apache-2.0"
] | null | null | null | import torch
pretrained_weights = torch.load("./experiments/try/ssd512_coco_vgg16_caffe_120e_20181221-d48b0be8.pth")
num_class = 2
# Disabled alternative: reshaping a fully-connected head (fc_cls/fc_reg) instead
# of the SSD convolutional head below.
'''
pretrained_weights['state_dict']['bbox_head.0.fc_cls.weight'].resize_([num_class, 1024])
pretrained_weights['state_dict']['bbox_head.0.fc_cls.bias'].resize_([num_class])
pretrained_weights['state_dict']['bbox_head.1.fc_cls.weight'].resize_([num_class, 1024])
pretrained_weights['state_dict']['bbox_head.1.fc_cls.bias'].resize_([num_class])
pretrained_weights['state_dict']['bbox_head.2.fc_cls.weight'].resize_([num_class, 1024])
pretrained_weights['state_dict']['bbox_head.2.fc_cls.bias'].resize_([num_class])
pretrained_weights['state_dict']['bbox_head.fc_cls.weight'].resize_([num_class, 1024])
pretrained_weights['state_dict']['bbox_head.fc_cls.bias'].resize_([num_class])
pretrained_weights['state_dict']['bbox_head.fc_reg.weight'].resize_([8, 1024])
pretrained_weights['state_dict']['bbox_head.fc_reg.bias'].resize_([8])'''
pretrained_weights['state_dict']['bbox_head.cls_convs.0.weight'].resize_([8, 512, 3, 3])
pretrained_weights['state_dict']['bbox_head.cls_convs.0.bias'].resize_([8])
pretrained_weights['state_dict']['bbox_head.cls_convs.1.weight'].resize_([12, 1024, 3, 3])
pretrained_weights['state_dict']['bbox_head.cls_convs.1.bias'].resize_([12])
pretrained_weights['state_dict']['bbox_head.cls_convs.2.weight'].resize_([12, 512, 3, 3])
pretrained_weights['state_dict']['bbox_head.cls_convs.2.bias'].resize_([12])
pretrained_weights['state_dict']['bbox_head.cls_convs.3.weight'].resize_([12, 256, 3, 3])
pretrained_weights['state_dict']['bbox_head.cls_convs.3.bias'].resize_([12])
pretrained_weights['state_dict']['bbox_head.cls_convs.4.weight'].resize_([12, 256, 3, 3])
pretrained_weights['state_dict']['bbox_head.cls_convs.4.bias'].resize_([12])
pretrained_weights['state_dict']['bbox_head.cls_convs.5.weight'].resize_([8, 256, 3, 3])
pretrained_weights['state_dict']['bbox_head.cls_convs.5.bias'].resize_([8])
pretrained_weights['state_dict']['bbox_head.cls_convs.6.weight'].resize_([8, 256, 3, 3])
pretrained_weights['state_dict']['bbox_head.cls_convs.6.bias'].resize_([8])
torch.save(pretrained_weights, "./experiments/try/Ssd_512_1class.pth") | 66.757576 | 104 | 0.772583 | 347 | 2,203 | 4.507205 | 0.123919 | 0.282609 | 0.337596 | 0.398977 | 0.854859 | 0.854859 | 0.854859 | 0.854859 | 0.825448 | 0.790921 | 0 | 0.058466 | 0.029505 | 2,203 | 33 | 105 | 66.757576 | 0.673059 | 0 | 0 | 0 | 0 | 0 | 0.453683 | 0.351568 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.055556 | 0 | 0.055556 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
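The bookkeeping in the script above can be sketched without a torch dependency. In this hypothetical sketch, nested lists stand in for tensors, and slicing models what `resize_` to a smaller shape effectively does (keep the leading storage elements); the key names and shapes are illustrative, not taken from the real checkpoint:

```python
# Hypothetical torch-free sketch of shrinking a classifier head to a new class count.
num_class = 2

def shrink_head(weight, bias, n):
    """Keep the weights/biases for the first n output channels only."""
    return weight[:n], bias[:n]

state = {
    "cls.weight": [[0.0] * 4 for _ in range(8)],  # pretend [8, 4] weight tensor
    "cls.bias": list(range(8)),                   # pretend [8] bias tensor
}
state["cls.weight"], state["cls.bias"] = shrink_head(
    state["cls.weight"], state["cls.bias"], num_class
)
assert len(state["cls.weight"]) == num_class
assert state["cls.bias"] == [0, 1]
```

Note that truncating a pretrained head like this keeps arbitrary leading class channels; the resulting checkpoint is only a shape-compatible initialization and still needs fine-tuning on the new classes.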
57aa4d52451da463382535fbd95239620f09ce4f | 611 | py | Python | src/utils/error.py | shubranshugupta/Lung-Cancer-Segmentation | 7e6eccb8fe055a107cdd50a9e419ca01f4a72d51 | [
"MIT"
] | 1 | 2021-11-18T20:34:07.000Z | 2021-11-18T20:34:07.000Z | src/utils/error.py | shubranshugupta/Lung-Cancer-Segmentation | 7e6eccb8fe055a107cdd50a9e419ca01f4a72d51 | [
"MIT"
] | null | null | null | src/utils/error.py | shubranshugupta/Lung-Cancer-Segmentation | 7e6eccb8fe055a107cdd50a9e419ca01f4a72d51 | [
"MIT"
] | null | null | null | class FolderNotFoundError(Exception):
def __init__(self, message):
self.message = message
def __str__(self):
return self.message
class AWSCredentialError(Exception):
def __init__(self, message):
self.message = message
def __str__(self):
return self.message
class DownloadDataError(Exception):
def __init__(self, message):
self.message = message
def __str__(self):
return self.message
class MyNoCredentialError(Exception):
def __init__(self, message):
self.message = message
def __str__(self):
return self.message | 26.565217 | 37 | 0.675941 | 64 | 611 | 5.953125 | 0.1875 | 0.346457 | 0.167979 | 0.209974 | 0.795276 | 0.795276 | 0.795276 | 0.795276 | 0.795276 | 0.795276 | 0 | 0 | 0.238953 | 611 | 23 | 38 | 26.565217 | 0.819355 | 0 | 0 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0 | 0.2 | 0.8 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 11 |
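A quick usage sketch of the message-carrying exception pattern in `error.py` (the class is re-declared here so the snippet is self-contained, and the folder path is hypothetical):

```python
class FolderNotFoundError(Exception):
    """Raised when an expected folder is missing."""
    def __init__(self, message):
        self.message = message

    def __str__(self):
        return self.message

try:
    raise FolderNotFoundError("missing folder: ./data")  # hypothetical path
except FolderNotFoundError as err:
    caught = str(err)  # __str__ yields the stored message

assert caught == "missing folder: ./data"
```

Since `Exception.__str__` already renders the constructor arguments, these subclasses could also be written as bare `class FolderNotFoundError(Exception): pass`; the explicit `message` attribute is only needed if callers access `err.message` directly.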
aa0bebb91293262f80f6745e6586a6a912f584ca | 458 | py | Python | orb_simulator/sprites_and_graph_ent/__init__.py | dmguezjaviersnet/IA-Sim-Comp-Project | 8165b9546efc45f98091a3774e2dae4f45942048 | [
"MIT"
] | 1 | 2022-01-19T22:49:09.000Z | 2022-01-19T22:49:09.000Z | orb_simulator/sprites_and_graph_ent/__init__.py | dmguezjaviersnet/IA-Sim-Comp-Project | 8165b9546efc45f98091a3774e2dae4f45942048 | [
"MIT"
] | 15 | 2021-11-10T14:25:02.000Z | 2022-02-12T19:17:11.000Z | orb_simulator/sprites_and_graph_ent/__init__.py | dmguezjaviersnet/IA-Sim-Comp-Project | 8165b9546efc45f98091a3774e2dae4f45942048 | [
"MIT"
] | null | null | null | from sprites_and_graph_ent.earth import Sphere
from sprites_and_graph_ent.eliptic_orbit import ElipticOrbit
from sprites_and_graph_ent.space_debris import SpaceDebris
from sprites_and_graph_ent.orbit_obj import OrbitObj
from sprites_and_graph_ent.space_obj import SpaceObj
from sprites_and_graph_ent.satellite import Satellite
from sprites_and_graph_ent.space_debris_collector import SpaceDebrisCollector
from sprites_and_graph_ent.launchpad import Launchpad | 57.25 | 77 | 0.914847 | 70 | 458 | 5.557143 | 0.3 | 0.226221 | 0.287918 | 0.390746 | 0.521851 | 0.239075 | 0.169666 | 0 | 0 | 0 | 0 | 0 | 0.067686 | 458 | 8 | 78 | 57.25 | 0.911007 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
aa41415c1099d7b00c98fb7f65b516e1707abbcc | 80 | py | Python | thseq/nn/__init__.py | DeepLearnXMU/ABDNMT-RNMT | c3b20e4afdbfee5741e95a42bbd31329bb9bb93d | [
"MIT"
] | 12 | 2019-08-17T15:40:11.000Z | 2022-02-04T16:22:18.000Z | thseq/nn/__init__.py | DeepLearnXMU/ABDNMT-RNMT | c3b20e4afdbfee5741e95a42bbd31329bb9bb93d | [
"MIT"
] | null | null | null | thseq/nn/__init__.py | DeepLearnXMU/ABDNMT-RNMT | c3b20e4afdbfee5741e95a42bbd31329bb9bb93d | [
"MIT"
] | 3 | 2019-06-04T08:39:56.000Z | 2020-01-10T06:52:04.000Z | from thseq.nn.ff import PositionWiseFeedForward
from thseq.nn.embedding import * | 40 | 47 | 0.85 | 11 | 80 | 6.181818 | 0.636364 | 0.264706 | 0.323529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0875 | 80 | 2 | 48 | 40 | 0.931507 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |