hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
76ef154ef0896b64fbb0e2d8933b02a3671e6b96 | 2,285 | py | Python | app/game_api.py | pitzer42/mini-magic | e12d27034bff433b453daac7ab4e8f920cc27c14 | [
"MIT"
] | null | null | null | app/game_api.py | pitzer42/mini-magic | e12d27034bff433b453daac7ab4e8f920cc27c14 | [
"MIT"
] | 1 | 2021-06-01T22:26:08.000Z | 2021-06-01T22:26:08.000Z | app/game_api.py | pitzer42/mini-magic | e12d27034bff433b453daac7ab4e8f920cc27c14 | [
"MIT"
] | null | null | null | import actions
from actions import IllegalOperation
from app.validations import inject_json_fields
from flask import abort, Response
@inject_json_fields('player_id', 'deck_id')
def join(match_id, player_id, deck_id):
try:
actions.join(match_id, player_id, deck_id)
except IllegalOperation as e:
message = str(e)
print(message)
abort(400, description=message)
return Response(status=200)
def play(match_id, player_index, card_index):
player_index -= 1
card_index -= 1
try:
actions.play_card(match_id, player_index, card_index)
except IndexError as error:
error_message = str(error)
print(error_message)
abort(404, description=error_message)
except IllegalOperation as error:
error_message = str(error)
print(error_message)
abort(400, description=error_message)
return Response(status=200)
def use(match_id, player_index, card_index):
player_index -= 1
card_index -= 1
try:
actions.use_card(match_id, player_index, card_index)
except IndexError as error:
error_message = str(error)
print(error_message)
abort(404, description=error_message)
except IllegalOperation as error:
error_message = str(error)
print(error_message)
abort(400, description=error_message)
return Response(status=200)
def yield_play(match_id, player_index):
player_index -= 1
try:
actions.yield_play(match_id, player_index)
except IndexError as error:
error_message = str(error)
print(error_message)
abort(404, description=error_message)
except IllegalOperation as error:
error_message = str(error)
print(error_message)
abort(400, description=error_message)
return Response(status=200)
def end_turn(match_id, player_index):
player_index -= 1
try:
actions.end_turn(match_id, player_index)
except IndexError as error:
error_message = str(error)
print(error_message)
abort(404, description=error_message)
except IllegalOperation as error:
error_message = str(error)
print(error_message)
abort(400, description=error_message)
return Response(status=200)
| 29.294872 | 61 | 0.68709 | 287 | 2,285 | 5.233449 | 0.142857 | 0.191744 | 0.086551 | 0.095872 | 0.844208 | 0.841545 | 0.798269 | 0.76498 | 0.76498 | 0.711718 | 0 | 0.02746 | 0.235011 | 2,285 | 77 | 62 | 29.675325 | 0.831808 | 0 | 0 | 0.716418 | 0 | 0 | 0.007002 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.074627 | false | 0 | 0.059701 | 0 | 0.208955 | 0.134328 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
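The handlers in `app/game_api.py` above repeat one try/except pattern five times. A sketch of how that pattern could be factored into a decorator is below. This is illustrative only: the real handlers use Flask's `abort()` and `Response`, and `http_errors` is a hypothetical helper, so the example maps exceptions to plain `(status, message)` pairs to stay framework-free.

```python
# Hypothetical refactor of game_api.py's repeated try/except blocks.
# IllegalOperation stands in for actions.IllegalOperation.

class IllegalOperation(Exception):
    """Stand-in for actions.IllegalOperation."""

def http_errors(func):
    def wrapper(*args, **kwargs):
        try:
            func(*args, **kwargs)
        except IndexError as error:
            return 404, str(error)        # unknown match/player/card
        except IllegalOperation as error:
            return 400, str(error)        # known entity, illegal move
        return 200, ""
    return wrapper

@http_errors
def play(match_id, player_index, card_index):
    # Toy body standing in for actions.play_card.
    if card_index < 0:
        raise IndexError("no such card")
    if match_id is None:
        raise IllegalOperation("match not started")

print(play("m1", 0, 0))    # (200, '')
print(play("m1", 0, -1))   # (404, 'no such card')
```

With a decorator like this, each handler shrinks to its domain call plus the one-based-to-zero-based index adjustment.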
6a9223fe01a74a0445c17b84de925d603fb5f4e6 | 14,380 | py | Python | hcanet/rl/replay.py | douglasrizzo/hcanet | e8d434e58b62f5afc0085529e9db012e07669705 | [
"Apache-2.0"
] | null | null | null | hcanet/rl/replay.py | douglasrizzo/hcanet | e8d434e58b62f5afc0085529e9db012e07669705 | [
"Apache-2.0"
] | null | null | null | hcanet/rl/replay.py | douglasrizzo/hcanet | e8d434e58b62f5afc0085529e9db012e07669705 | [
"Apache-2.0"
] | 1 | 2021-10-07T12:19:28.000Z | 2021-10-07T12:19:28.000Z | import random
from collections import namedtuple
from .segment_tree import MinSegmentTree, SumSegmentTree
# the original algorithm used a namedtuple, but those can't be documented.
# I learned a hack on how to document them here https://stackoverflow.com/a/1606478
Transition_ = namedtuple("Transition", ("state", "action", "reward", "done", "av_actions"))
class Transition(Transition_):
"""A named tuple to store a state transition (s, a, s', r)"""
pass
class ReplayBuffer:
def __init__(self, size):
"""Create Replay buffer.
Parameters
----------
size: int
Max number of transitions to store in the buffer. When the buffer
overflows the old memories are dropped.
"""
self._storage = []
self._maxsize = size
self._next_idx = 0
self.is_prioritized = False
def __len__(self):
return len(self._storage)
def can_sample(self, batch_size):
return len(self._storage) >= batch_size
def add(self, obs_t, action, reward, obs_tp1, done):
data = Transition(obs_t, action, reward, obs_tp1, done)
if self._next_idx >= len(self._storage):
self._storage.append(data)
else:
self._storage[self._next_idx] = data
self._next_idx = (self._next_idx + 1) % self._maxsize
def extend(self, obs_t, action, reward, obs_tp1, done):
"""
add a new batch of transitions to the buffer
:param obs_t: (Union[Tuple[Union[np.ndarray, int]], np.ndarray]) the last batch of observations
:param action: (Union[Tuple[Union[np.ndarray, int]]], np.ndarray]) the batch of actions
:param reward: (Union[Tuple[float], np.ndarray]) the batch of the rewards of the transition
:param obs_tp1: (Union[Tuple[Union[np.ndarray, int]], np.ndarray]) the current batch of observations
:param done: (Union[Tuple[bool], np.ndarray]) terminal status of the batch
Note: uses the same names as .add to keep compatibility with named argument passing
but expects iterables and arrays with more than 1 dimensions
"""
for data in zip(obs_t, action, reward, obs_tp1, done):
if self._next_idx >= len(self._storage):
self._storage.append(data)
else:
self._storage[self._next_idx] = data
self._next_idx = (self._next_idx + 1) % self._maxsize
def sample(self, batch_size):
"""Sample a batch of experiences.
Parameters
----------
batch_size: int
How many transitions to sample.
Returns
-------
transitions: [Transition]
batch of transitions
"""
return random.sample(self._storage, batch_size)
class PrioritizedReplayBuffer(ReplayBuffer):
def __init__(self, size, alpha=0.6):
"""Create Prioritized Replay buffer.
Parameters
----------
size: int
Max number of transitions to store in the buffer. When the buffer
overflows the old memories are dropped.
alpha: float
how much prioritization is used
(0 - no prioritization, 1 - full prioritization)
See Also
--------
ReplayBuffer.__init__
"""
super().__init__(size)
assert alpha >= 0
self._alpha = alpha
it_capacity = 1
while it_capacity < size:
it_capacity *= 2
self._it_sum = SumSegmentTree(it_capacity)
self._it_min = MinSegmentTree(it_capacity)
self._max_priority = 1.0
self.is_prioritized = True
def add(self, obs_t, action, reward, obs_tp1, done):
"""See ReplayBuffer.store_effect"""
idx = self._next_idx
super().add(obs_t, action, reward, obs_tp1, done)
self._it_sum[idx] = self._max_priority**self._alpha
self._it_min[idx] = self._max_priority**self._alpha
def extend(self, obs_t, action, reward, obs_tp1, done):
"""
add a new batch of transitions to the buffer
:param obs_t: (Union[Tuple[Union[np.ndarray, int]], np.ndarray]) the last batch of observations
:param action: (Union[Tuple[Union[np.ndarray, int]]], np.ndarray]) the batch of actions
:param reward: (Union[Tuple[float], np.ndarray]) the batch of the rewards of the transition
:param obs_tp1: (Union[Tuple[Union[np.ndarray, int]], np.ndarray]) the current batch of observations
:param done: (Union[Tuple[bool], np.ndarray]) terminal status of the batch
Note: uses the same names as .add to keep compatibility with named argument passing
but expects iterables and arrays with more than 1 dimensions
"""
idx = self._next_idx
super().extend(obs_t, action, reward, obs_tp1, done)
while idx != self._next_idx:
self._it_sum[idx] = self._max_priority**self._alpha
self._it_min[idx] = self._max_priority**self._alpha
idx = (idx + 1) % self._maxsize
def _sample_proportional(self, batch_size):
res = []
p_total = self._it_sum.sum(0, len(self._storage) - 1)
every_range_len = p_total / batch_size
for i in range(batch_size):
mass = random.random() * every_range_len + i * every_range_len
idx = self._it_sum.find_prefixsum_idx(mass)
res.append(idx)
return res
def sample(self, batch_size, beta=0.4):
"""Sample a batch of experiences.
compared to ReplayBuffer.sample
it also returns importance weights and idxes
of sampled experiences.
Parameters
----------
batch_size: int
How many transitions to sample.
beta: float
To what degree to use importance weights
(0 - no corrections, 1 - full correction)
Returns
-------
transitions: [Transition]
batch of transitions
weights: [float]
List of size (batch_size) and dtype float
denoting importance weight of each sampled transition
idxes: [int]
List of size (batch_size) and dtype int
indexes in buffer of sampled experiences
"""
assert beta > 0
idxes = self._sample_proportional(batch_size)
weights = []
p_min = self._it_min.min() / self._it_sum.sum()
max_weight = (p_min * len(self._storage))**(-beta)
for idx in idxes:
p_sample = self._it_sum[idx] / self._it_sum.sum()
weight = (p_sample * len(self._storage))**(-beta)
weights.append(weight / max_weight)
transitions = [self._storage[idx] for idx in idxes]
return transitions, weights, idxes
def update_priorities(self, idxes, priorities):
"""Update priorities of sampled transitions.
sets priority of transition at index idxes[i] in buffer
to priorities[i].
Parameters
----------
idxes: [int]
List of idxes of sampled transitions
priorities: [float]
List of updated priorities corresponding to
transitions at the sampled idxes denoted by
variable `idxes`.
"""
assert len(idxes) == len(priorities)
assert all(p > 0 for p in priorities)
assert all(0 <= idx < len(self._storage) for idx in idxes)
for idx, priority in zip(idxes, priorities):
self._it_sum[idx] = priority**self._alpha
self._it_min[idx] = priority**self._alpha
self._max_priority = max([self._max_priority] + priorities)
def copy(self, sample_size: int):
"""
:param sample_size: number of items to include in the new object
:type sample_size: int
:return: a new replay buffer retaining a prioritized sample of the current contents
:rtype: PrioritizedReplayBuffer
"""
newb = PrioritizedReplayBuffer(self._maxsize, self._alpha)
transitions, weights, idxes = self.sample(sample_size)
newb._storage = transitions
newb._next_idx = len(transitions)
for i in range(sample_size):
newb._it_sum[i] = self._it_sum[idxes[i]]
newb._it_min[i] = self._it_min[idxes[i]]
newb._max_priority = max(self._max_priority, newb._it_sum[i])
return newb
class EpisodeReplayBuffer:
def __init__(self, size):
"""Create Replay buffer.
Parameters
----------
size: int
Max number of episodes to store in the buffer. When the buffer
overflows the old memories are dropped.
"""
self._storage = []
self._maxsize = size
self._next_idx = 0
self.is_prioritized = False
def __len__(self):
return len(self._storage)
def can_sample(self, batch_size):
return len(self._storage) >= batch_size
def add(self, data: list):
if self._next_idx >= len(self._storage):
self._storage.append(data)
else:
self._storage[self._next_idx] = data
self._next_idx = (self._next_idx + 1) % self._maxsize
def sample(self, batch_size):
"""Sample a sequential batch of episodes.
Parameters
----------
batch_size: int
How many episodes to sample.
Returns
-------
transitions: list
batch of episodes
"""
return random.sample(self._storage, batch_size)
def extend(self, datas):
"""
add a new batch of episodes to the buffer
:param datas: a list of episodes
"""
for data in datas:
if self._next_idx >= len(self._storage):
self._storage.append(data)
else:
self._storage[self._next_idx] = data
self._next_idx = (self._next_idx + 1) % self._maxsize
class PrioritizedEpisodeReplayBuffer(EpisodeReplayBuffer):
def __init__(self, size, alpha=0.6):
"""Create Prioritized Replay buffer.
Parameters
----------
size: int
Max number of episodes to store in the buffer. When the buffer
overflows the old memories are dropped.
alpha: float
how much prioritization is used
(0 - no prioritization, 1 - full prioritization)
See Also
--------
EpisodeReplayBuffer.__init__
"""
super().__init__(size)
assert alpha >= 0
self._alpha = alpha
it_capacity = 1
while it_capacity < size:
it_capacity *= 2
self._it_sum = SumSegmentTree(it_capacity)
self._it_min = MinSegmentTree(it_capacity)
self._max_priority = 1.0
self.is_prioritized = True
def add(self, data):
"""See ReplayBuffer.store_effect"""
idx = self._next_idx
super().add(data)
self._it_sum[idx] = self._max_priority**self._alpha
self._it_min[idx] = self._max_priority**self._alpha
def extend(self, datas):
"""
add a new batch of episodes to the buffer
:param datas: a list of episodes
"""
idx = self._next_idx
super().extend(datas)
while idx != self._next_idx:
self._it_sum[idx] = self._max_priority**self._alpha
self._it_min[idx] = self._max_priority**self._alpha
idx = (idx + 1) % self._maxsize
def _sample_proportional(self, batch_size):
res = []
p_total = self._it_sum.sum(0, len(self._storage) - 1)
every_range_len = p_total / batch_size
for i in range(batch_size):
mass = random.random() * every_range_len + i * every_range_len
idx = self._it_sum.find_prefixsum_idx(mass)
res.append(idx)
return res
def sample(self, batch_size, beta=0.4):
"""Sample a batch of experiences.
compared to EpisodeReplayBuffer.sample
it also returns importance weights and idxes
of sampled experiences.
Parameters
----------
batch_size: int
How many transitions to sample.
beta: float
To what degree to use importance weights
(0 - no corrections, 1 - full correction)
Returns
-------
transitions: list
batch of episodes
weights: [float]
List of size (batch_size) and dtype float
denoting importance weight of each sampled episode
idxes: [int]
List of size (batch_size) and dtype int
indexes in buffer of sampled experiences
"""
assert beta > 0
idxes = self._sample_proportional(batch_size)
weights = []
p_min = self._it_min.min() / self._it_sum.sum()
max_weight = (p_min * len(self._storage))**(-beta)
for idx in idxes:
p_sample = self._it_sum[idx] / self._it_sum.sum()
weight = (p_sample * len(self._storage))**(-beta)
weights.append(weight / max_weight)
transitions = [self._storage[idx] for idx in idxes]
return transitions, weights, idxes
def update_priorities(self, idxes, priorities):
"""Update priorities of sampled transitions.
sets priority of transition at index idxes[i] in buffer
to priorities[i].
Parameters
----------
idxes: [int]
List of idxes of sampled transitions
priorities: [float]
List of updated priorities corresponding to
transitions at the sampled idxes denoted by
variable `idxes`.
"""
assert len(idxes) == len(priorities)
assert all(p > 0 for p in priorities)
assert all(0 <= idx < len(self._storage) for idx in idxes)
for idx, priority in zip(idxes, priorities):
self._it_sum[idx] = priority**self._alpha
self._it_min[idx] = priority**self._alpha
self._max_priority = max([self._max_priority] + priorities)
def copy(self, sample_size: int):
"""
:param sample_size: number of items to include in the new object
:type sample_size: int
:return: a new replay buffer retaining a prioritized sample of the current contents
:rtype: PrioritizedEpisodeReplayBuffer
"""
newb = PrioritizedEpisodeReplayBuffer(self._maxsize, self._alpha)
transitions, weights, idxes = self.sample(sample_size)
newb._storage = transitions
newb._next_idx = len(transitions)
for i in range(sample_size):
newb._it_sum[i] = self._it_sum[idxes[i]]
newb._it_min[i] = self._it_min[idxes[i]]
newb._max_priority = max(self._max_priority, newb._it_sum[i])
return newb
| 34.319809 | 108 | 0.616203 | 1,810 | 14,380 | 4.68453 | 0.108287 | 0.022644 | 0.031136 | 0.016511 | 0.92334 | 0.915202 | 0.890553 | 0.875457 | 0.875457 | 0.875457 | 0 | 0.006123 | 0.284492 | 14,380 | 418 | 109 | 34.401914 | 0.817961 | 0.362517 | 0 | 0.885246 | 0 | 0 | 0.005235 | 0 | 0 | 0 | 0 | 0 | 0.054645 | 1 | 0.142077 | false | 0.005464 | 0.016393 | 0.021858 | 0.251366 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
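The `_sample_proportional` method in `hcanet/rl/replay.py` above draws each transition with probability proportional to `priority**alpha`, using a `SumSegmentTree` for O(log n) prefix-sum lookups. The mechanics can be sketched with a plain cumulative sum; this is a simplified stand-in, not the buffer's actual implementation, and `sample_proportional` here is a hypothetical free function.

```python
import bisect
import itertools
import random

def sample_proportional(priorities, batch_size, alpha=0.6, seed=0):
    """Stratified proportional sampling: index i is drawn with probability
    roughly priorities[i]**alpha / sum_j priorities[j]**alpha."""
    rng = random.Random(seed)
    scaled = [p ** alpha for p in priorities]
    prefix = list(itertools.accumulate(scaled))   # prefix[i] = sum(scaled[:i+1])
    every_range_len = prefix[-1] / batch_size
    idxes = []
    for i in range(batch_size):
        # Same stratified mass computation as the buffer code, but the
        # prefix-sum lookup is a bisect instead of a segment-tree walk.
        mass = rng.random() * every_range_len + i * every_range_len
        idxes.append(bisect.bisect_left(prefix, mass))
    return idxes

idxes = sample_proportional([0.1, 0.1, 5.0, 0.1], batch_size=3)
print(idxes)  # the high-priority index 2 dominates the sample
```

The segment tree earns its keep only when the buffer is large and priorities change every step; for a fixed small list, the cumulative-sum version above behaves the same way.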
6aa1dc7f07e4115a8869be17a86abcef612606f1 | 177 | py | Python | python/writeIO.py | youran1024/iOS-MindNode | e227a6e993c9a487e88ee15ffe6920a84871cc2c | [
"MIT"
] | null | null | null | python/writeIO.py | youran1024/iOS-MindNode | e227a6e993c9a487e88ee15ffe6920a84871cc2c | [
"MIT"
] | null | null | null | python/writeIO.py | youran1024/iOS-MindNode | e227a6e993c9a487e88ee15ffe6920a84871cc2c | [
"MIT"
] | null | null | null |
f = open('/Users/MrYang/Desktop/python/1.py', 'r')
print(f.read())
try:
f = open('/Users/MrYang/Desktop/python/1.py', 'r')
print(f.read())
finally:
if f:
f.close()
| 11.8 | 51 | 0.59322 | 30 | 177 | 3.5 | 0.5 | 0.095238 | 0.190476 | 0.304762 | 0.819048 | 0.819048 | 0.819048 | 0.819048 | 0.819048 | 0.819048 | 0 | 0.013423 | 0.158192 | 177 | 14 | 52 | 12.642857 | 0.691275 | 0 | 0 | 0.5 | 0 | 0 | 0.390805 | 0.37931 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.25 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
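The `writeIO.py` snippet above opens the file twice and guards only the second read with try/finally. The idiomatic equivalent is a `with` block, which closes the file automatically even if `read()` raises. A minimal sketch follows; it writes to a temp file, since the original hard-coded `/Users/MrYang/...` path only exists on that author's machine.

```python
import os
import tempfile

# Create a small file to read, replacing the machine-specific path.
path = os.path.join(tempfile.mkdtemp(), "1.py")
with open(path, "w") as f:
    f.write("print('hello')\n")

# The with statement replaces the manual try/finally + f.close() dance:
# the file is closed on block exit, whether or not read() raises.
with open(path, "r") as f:
    contents = f.read()

print(contents.strip())  # print('hello')
```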
0a779fcee00f470b69eec84c52ee1f7073fb9eb1 | 235 | py | Python | tests/data/fmtskip6.py | henrikhorluck/black | 5379d4f3f460ec9b7063dd1cc10f437b0edf9ae3 | [
"MIT"
] | 16,110 | 2019-07-22T21:54:54.000Z | 2022-03-31T22:52:39.000Z | tests/data/fmtskip6.py | marnixah/black-but-usable | 83b83d3066d1d857983bfa1a666a409e7255d79d | [
"MIT"
] | 1,981 | 2019-07-22T21:26:16.000Z | 2022-03-31T23:14:35.000Z | tests/data/fmtskip6.py | marnixah/black-but-usable | 83b83d3066d1d857983bfa1a666a409e7255d79d | [
"MIT"
] | 1,762 | 2019-07-22T21:23:00.000Z | 2022-03-31T06:10:22.000Z | class A:
def f(self):
for line in range(10):
if True:
pass # fmt: skip
# output
class A:
def f(self):
for line in range(10):
if True:
pass # fmt: skip
| 16.785714 | 33 | 0.425532 | 31 | 235 | 3.225806 | 0.516129 | 0.12 | 0.18 | 0.2 | 0.94 | 0.94 | 0.94 | 0.94 | 0.94 | 0.94 | 0 | 0.033058 | 0.485106 | 235 | 13 | 34 | 18.076923 | 0.793388 | 0.110638 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0.2 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 9 |
0a89f806d0120ac137b5534140f37e581e804f90 | 30,852 | py | Python | opus/application/apps/tools/test_file_utils.py | fyellin/pds-opus | 96705b6736458f9944599d44f86ec54ad1d74e38 | [
"AFL-3.0"
] | 10 | 2018-04-16T21:10:53.000Z | 2021-05-05T12:07:25.000Z | opus/application/apps/tools/test_file_utils.py | fyellin/pds-opus | 96705b6736458f9944599d44f86ec54ad1d74e38 | [
"AFL-3.0"
] | 777 | 2018-02-01T21:28:55.000Z | 2022-03-31T16:59:06.000Z | opus/application/apps/tools/test_file_utils.py | dstopp/pds-opus | 3be5688ff75c38e92966a576d585b235049744dc | [
"AFL-3.0"
] | 7 | 2018-02-01T20:38:51.000Z | 2019-04-02T01:48:27.000Z | # tools/test_file_utils.py
from collections import OrderedDict
import logging
from unittest import TestCase
from django.core.cache import cache
from tools.file_utils import get_pds_products
import settings
class fileUtilsTests(TestCase):
def setUp(self):
self.maxDiff = None
settings.OPUS_FAKE_API_DELAYS = 0
settings.OPUS_FAKE_SERVER_ERROR404_PROBABILITY = 0
settings.OPUS_FAKE_SERVER_ERROR500_PROBABILITY = 0
logging.disable(logging.ERROR)
cache.clear()
def tearDown(self):
logging.disable(logging.NOTSET)
###############################################
######### get_pds_products UNIT TESTS #########
###############################################
def test__get_pds_products_ib4v21gcq_opusid_url(self):
"[test_file_utils.py] get_pds_products: no versions opusid hst-11559-wfc3-ib4v21gcq url"
opus_id = 'hst-11559-wfc3-ib4v21gcq'
ret = get_pds_products(opus_id)
expected = OrderedDict([('hst-11559-wfc3-ib4v21gcq', OrderedDict([('Current', OrderedDict([(('HST', 10, 'hst_text', 'FITS Header Text'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ.ASC', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ.LBL']), (('HST', 20, 'hst_tiff', 'Raw Data Preview (lossless)'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ_RAW.TIF', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ.LBL']), (('HST', 30, 'hst_raw', 'Raw Data Preview'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ_RAW.JPG', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ.LBL']), (('HST', 40, 'hst_calib', 'Calibrated Data Preview'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ_FLT.JPG', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ.LBL']), (('HST', 70, 'hst_drizzled', 'Calibrated Geometrically Corrected Preview'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ_DRZ.JPG', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ.LBL']), (('metadata', 5, 'rms_index', 'RMS Node Augmented Index'), ['https://opus.pds-rings.seti.org/holdings/metadata/HSTIx_xxxx/HSTI1_1559/HSTI1_1559_index.tab', 'https://opus.pds-rings.seti.org/holdings/metadata/HSTIx_xxxx/HSTI1_1559/HSTI1_1559_index.lbl']), (('metadata', 6, 'hstfiles_index', 'HST Files Associations Index'), ['https://opus.pds-rings.seti.org/holdings/metadata/HSTIx_xxxx/HSTI1_1559/HSTI1_1559_hstfiles.tab', 'https://opus.pds-rings.seti.org/holdings/metadata/HSTIx_xxxx/HSTI1_1559/HSTI1_1559_hstfiles.lbl']), (('browse', 10, 'browse_thumb', 'Browse Image (thumbnail)'), ['https://opus.pds-rings.seti.org/holdings/previews/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ_thumb.jpg']), (('browse', 20, 'browse_small', 'Browse Image (small)'), ['https://opus.pds-rings.seti.org/holdings/previews/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ_small.jpg']), (('browse', 30, 'browse_medium', 'Browse Image (medium)'), ['https://opus.pds-rings.seti.org/holdings/previews/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ_med.jpg']), (('browse', 40, 'browse_full', 'Browse Image (full)'), ['https://opus.pds-rings.seti.org/holdings/previews/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ_full.jpg'])]))]))])
print('Got:')
print(ret)
print('Expected:')
print(expected)
self.assertEqual(dict(ret), dict(expected))
def test__get_pds_products_ib4v19r3q_opusid_url(self):
"[test_file_utils.py] get_pds_products: versions opusid hst-11559-wfc3-ib4v19r3q url"
opus_id = 'hst-11559-wfc3-ib4v19r3q'
ret = get_pds_products(opus_id)
expected = OrderedDict([('hst-11559-wfc3-ib4v19r3q', OrderedDict([('Current', OrderedDict([(('HST', 10, 'hst_text', 'FITS Header Text'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.ASC', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 20, 'hst_tiff', 'Raw Data Preview (lossless)'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_RAW.TIF', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 30, 'hst_raw', 'Raw Data Preview'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_RAW.JPG', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 40, 'hst_calib', 'Calibrated Data Preview'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_FLT.JPG', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 70, 'hst_drizzled', 'Calibrated Geometrically Corrected Preview'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_DRZ.JPG', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('metadata', 5, 'rms_index', 'RMS Node Augmented Index'), ['https://opus.pds-rings.seti.org/holdings/metadata/HSTIx_xxxx/HSTI1_1559/HSTI1_1559_index.tab', 'https://opus.pds-rings.seti.org/holdings/metadata/HSTIx_xxxx/HSTI1_1559/HSTI1_1559_index.lbl']), (('metadata', 6, 'hstfiles_index', 'HST Files Associations Index'), ['https://opus.pds-rings.seti.org/holdings/metadata/HSTIx_xxxx/HSTI1_1559/HSTI1_1559_hstfiles.tab', 'https://opus.pds-rings.seti.org/holdings/metadata/HSTIx_xxxx/HSTI1_1559/HSTI1_1559_hstfiles.lbl']), (('browse', 10, 'browse_thumb', 'Browse Image (thumbnail)'), ['https://opus.pds-rings.seti.org/holdings/previews/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_thumb.jpg']), (('browse', 20, 'browse_small', 'Browse Image (small)'), ['https://opus.pds-rings.seti.org/holdings/previews/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_small.jpg']), (('browse', 30, 'browse_medium', 'Browse Image (medium)'), ['https://opus.pds-rings.seti.org/holdings/previews/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_med.jpg']), (('browse', 40, 'browse_full', 'Browse Image (full)'), ['https://opus.pds-rings.seti.org/holdings/previews/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_full.jpg'])])), ('1.1', OrderedDict([(('HST', 10, 'hst_text', 'FITS Header Text'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.ASC', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 20, 'hst_tiff', 'Raw Data Preview (lossless)'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_RAW.TIF', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 30, 'hst_raw', 'Raw Data Preview'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_RAW.JPG', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 40, 'hst_calib', 'Calibrated Data Preview'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_FLT.JPG', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 70, 'hst_drizzled', 'Calibrated Geometrically Corrected Preview'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_DRZ.JPG', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL'])])), ('1.0', OrderedDict([(('HST', 10, 'hst_text', 'FITS Header Text'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.ASC', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 20, 'hst_tiff', 'Raw Data Preview (lossless)'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_RAW.TIF', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 30, 'hst_raw', 'Raw Data Preview'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_RAW.JPG', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 40, 'hst_calib', 'Calibrated Data Preview'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_FLT.JPG', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 70, 'hst_drizzled', 'Calibrated Geometrically Corrected Preview'), ['https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_DRZ.JPG', 'https://opus.pds-rings.seti.org/holdings/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL'])]))]))])
        print('Got:')
        print(ret)
        print('Expected:')
        print(expected)
        self.assertEqual(dict(ret), dict(expected))

    def test__get_pds_products_ib4v21gcq_opusid_path(self):
        "[test_file_utils.py] get_pds_products: no versions opusid hst-11559-wfc3-ib4v21gcq path"
        opus_id = 'hst-11559-wfc3-ib4v21gcq'
        ret = get_pds_products(opus_id, loc_type='path')
expected = OrderedDict([('hst-11559-wfc3-ib4v21gcq', OrderedDict([('Current', OrderedDict([(('HST', 10, 'hst_text', 'FITS Header Text'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ.ASC', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ.LBL']), (('HST', 20, 'hst_tiff', 'Raw Data Preview (lossless)'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ_RAW.TIF', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ.LBL']), (('HST', 30, 'hst_raw', 'Raw Data Preview'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ_RAW.JPG', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ.LBL']), (('HST', 40, 'hst_calib', 'Calibrated Data Preview'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ_FLT.JPG', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ.LBL']), (('HST', 70, 'hst_drizzled', 'Calibrated Geometrically Corrected Preview'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ_DRZ.JPG', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ.LBL']), (('metadata', 5, 'rms_index', 'RMS Node Augmented Index'), [settings.PDS_DATA_DIR+'/metadata/HSTIx_xxxx/HSTI1_1559/HSTI1_1559_index.tab', settings.PDS_DATA_DIR+'/metadata/HSTIx_xxxx/HSTI1_1559/HSTI1_1559_index.lbl']), (('metadata', 6, 'hstfiles_index', 'HST Files Associations Index'), [settings.PDS_DATA_DIR+'/metadata/HSTIx_xxxx/HSTI1_1559/HSTI1_1559_hstfiles.tab', settings.PDS_DATA_DIR+'/metadata/HSTIx_xxxx/HSTI1_1559/HSTI1_1559_hstfiles.lbl']), (('browse', 10, 'browse_thumb', 'Browse Image (thumbnail)'), [settings.PDS_DATA_DIR+'/previews/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ_thumb.jpg']), (('browse', 20, 'browse_small', 'Browse Image (small)'), [settings.PDS_DATA_DIR+'/previews/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ_small.jpg']), 
(('browse', 30, 'browse_medium', 'Browse Image (medium)'), [settings.PDS_DATA_DIR+'/previews/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ_med.jpg']), (('browse', 40, 'browse_full', 'Browse Image (full)'), [settings.PDS_DATA_DIR+'/previews/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_21/IB4V21GCQ_full.jpg'])]))]))])
        print('Got:')
        print(ret)
        print('Expected:')
        print(expected)
        self.assertEqual(dict(ret), dict(expected))

    def test__get_pds_products_ib4v19r3q_opusid_path(self):
        "[test_file_utils.py] get_pds_products: versions opusid hst-11559-wfc3-ib4v19r3q path"
        opus_id = 'hst-11559-wfc3-ib4v19r3q'
        ret = get_pds_products(opus_id, loc_type='path')
expected = OrderedDict([('hst-11559-wfc3-ib4v19r3q', OrderedDict([('Current', OrderedDict([(('HST', 10, 'hst_text', 'FITS Header Text'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.ASC', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 20, 'hst_tiff', 'Raw Data Preview (lossless)'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_RAW.TIF', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 30, 'hst_raw', 'Raw Data Preview'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_RAW.JPG', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 40, 'hst_calib', 'Calibrated Data Preview'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_FLT.JPG', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 70, 'hst_drizzled', 'Calibrated Geometrically Corrected Preview'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_DRZ.JPG', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('metadata', 5, 'rms_index', 'RMS Node Augmented Index'), [settings.PDS_DATA_DIR+'/metadata/HSTIx_xxxx/HSTI1_1559/HSTI1_1559_index.tab', settings.PDS_DATA_DIR+'/metadata/HSTIx_xxxx/HSTI1_1559/HSTI1_1559_index.lbl']), (('metadata', 6, 'hstfiles_index', 'HST Files Associations Index'), [settings.PDS_DATA_DIR+'/metadata/HSTIx_xxxx/HSTI1_1559/HSTI1_1559_hstfiles.tab', settings.PDS_DATA_DIR+'/metadata/HSTIx_xxxx/HSTI1_1559/HSTI1_1559_hstfiles.lbl']), (('browse', 10, 'browse_thumb', 'Browse Image (thumbnail)'), [settings.PDS_DATA_DIR+'/previews/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_thumb.jpg']), (('browse', 20, 'browse_small', 'Browse Image (small)'), [settings.PDS_DATA_DIR+'/previews/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_small.jpg']), 
(('browse', 30, 'browse_medium', 'Browse Image (medium)'), [settings.PDS_DATA_DIR+'/previews/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_med.jpg']), (('browse', 40, 'browse_full', 'Browse Image (full)'), [settings.PDS_DATA_DIR+'/previews/HSTIx_xxxx/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_full.jpg'])])), ('1.1', OrderedDict([(('HST', 10, 'hst_text', 'FITS Header Text'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.ASC', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 20, 'hst_tiff', 'Raw Data Preview (lossless)'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_RAW.TIF', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 30, 'hst_raw', 'Raw Data Preview'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_RAW.JPG', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 40, 'hst_calib', 'Calibrated Data Preview'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_FLT.JPG', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 70, 'hst_drizzled', 'Calibrated Geometrically Corrected Preview'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_DRZ.JPG', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.1/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL'])])), ('1.0', OrderedDict([(('HST', 10, 'hst_text', 'FITS Header Text'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.ASC', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 20, 'hst_tiff', 'Raw Data Preview (lossless)'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_RAW.TIF', 
settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 30, 'hst_raw', 'Raw Data Preview'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_RAW.JPG', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 40, 'hst_calib', 'Calibrated Data Preview'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_FLT.JPG', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL']), (('HST', 70, 'hst_drizzled', 'Calibrated Geometrically Corrected Preview'), [settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q_DRZ.JPG', settings.PDS_DATA_DIR+'/volumes/HSTIx_xxxx_v1.0/HSTI1_1559/DATA/VISIT_19/IB4V19R3Q.LBL'])]))]))])
        print('Got:')
        print(ret)
        print('Expected:')
        print(expected)
        self.assertEqual(dict(ret), dict(expected))

    def test__get_pds_products_multiple_opusid(self):
        "[test_file_utils.py] get_pds_products: versions multiple opusids path"
        opus_id_list = ['co-iss-n1460960868',
                        'co-uvis-euv2001_001_02_12',
                        'co-vims-v1484504505_ir',
                        'vg-iss-2-s-c4360048']
        ret = get_pds_products(opus_id_list, loc_type='path')
expected = OrderedDict([('co-iss-n1460960868', OrderedDict([('Current', OrderedDict([(('Cassini ISS', 0, 'coiss_raw', 'Raw Image'), [settings.PDS_DATA_DIR+'/volumes/COISS_2xxx/COISS_2002/data/1460960653_1461048959/N1460960868_1.IMG', settings.PDS_DATA_DIR+'/volumes/COISS_2xxx/COISS_2002/data/1460960653_1461048959/N1460960868_1.LBL', settings.PDS_DATA_DIR+'/volumes/COISS_2xxx/COISS_2002/label/prefix2.fmt', settings.PDS_DATA_DIR+'/volumes/COISS_2xxx/COISS_2002/label/tlmtab.fmt']), (('Cassini ISS', 10, 'coiss_calib', 'Calibrated Image'), [settings.PDS_DATA_DIR+'/calibrated/COISS_2xxx/COISS_2002/data/1460960653_1461048959/N1460960868_1_CALIB.IMG', settings.PDS_DATA_DIR+'/calibrated/COISS_2xxx/COISS_2002/data/1460960653_1461048959/N1460960868_1_CALIB.LBL']), (('Cassini ISS', 110, 'coiss_thumb', 'Extra Preview (thumbnail)'), [settings.PDS_DATA_DIR+'/volumes/COISS_2xxx/COISS_2002/extras/thumbnail/1460960653_1461048959/N1460960868_1.IMG.jpeg_small']), (('Cassini ISS', 120, 'coiss_medium', 'Extra Preview (medium)'), [settings.PDS_DATA_DIR+'/volumes/COISS_2xxx/COISS_2002/extras/browse/1460960653_1461048959/N1460960868_1.IMG.jpeg']), (('Cassini ISS', 130, 'coiss_full', 'Extra Preview (full)'), [settings.PDS_DATA_DIR+'/volumes/COISS_2xxx/COISS_2002/extras/full/1460960653_1461048959/N1460960868_1.IMG.png']), (('metadata', 5, 'rms_index', 'RMS Node Augmented Index'), [settings.PDS_DATA_DIR+'/metadata/COISS_2xxx/COISS_2002/COISS_2002_index.tab', settings.PDS_DATA_DIR+'/metadata/COISS_2xxx/COISS_2002/COISS_2002_index.lbl']), (('metadata', 10, 'inventory', 'Target Body Inventory'), [settings.PDS_DATA_DIR+'/metadata/COISS_2xxx/COISS_2002/COISS_2002_inventory.csv', settings.PDS_DATA_DIR+'/metadata/COISS_2xxx/COISS_2002/COISS_2002_inventory.lbl']), (('metadata', 20, 'planet_geometry', 'Planet Geometry Index'), [settings.PDS_DATA_DIR+'/metadata/COISS_2xxx/COISS_2002/COISS_2002_saturn_summary.tab', settings.PDS_DATA_DIR+'/metadata/COISS_2xxx/COISS_2002/COISS_2002_saturn_summary.lbl']), 
(('metadata', 30, 'moon_geometry', 'Moon Geometry Index'), [settings.PDS_DATA_DIR+'/metadata/COISS_2xxx/COISS_2002/COISS_2002_moon_summary.tab', settings.PDS_DATA_DIR+'/metadata/COISS_2xxx/COISS_2002/COISS_2002_moon_summary.lbl']), (('metadata', 40, 'ring_geometry', 'Ring Geometry Index'), [settings.PDS_DATA_DIR+'/metadata/COISS_2xxx/COISS_2002/COISS_2002_ring_summary.tab', settings.PDS_DATA_DIR+'/metadata/COISS_2xxx/COISS_2002/COISS_2002_ring_summary.lbl']), (('browse', 10, 'browse_thumb', 'Browse Image (thumbnail)'), [settings.PDS_DATA_DIR+'/previews/COISS_2xxx/COISS_2002/data/1460960653_1461048959/N1460960868_1_thumb.jpg']), (('browse', 20, 'browse_small', 'Browse Image (small)'), [settings.PDS_DATA_DIR+'/previews/COISS_2xxx/COISS_2002/data/1460960653_1461048959/N1460960868_1_small.jpg']), (('browse', 30, 'browse_medium', 'Browse Image (medium)'), [settings.PDS_DATA_DIR+'/previews/COISS_2xxx/COISS_2002/data/1460960653_1461048959/N1460960868_1_med.jpg']), (('browse', 40, 'browse_full', 'Browse Image (full)'), [settings.PDS_DATA_DIR+'/previews/COISS_2xxx/COISS_2002/data/1460960653_1461048959/N1460960868_1_full.png'])])), ('2', OrderedDict([(('Cassini ISS', 10, 'coiss_calib', 'Calibrated Image'), [settings.PDS_DATA_DIR+'/calibrated/COISS_2xxx_v2/COISS_2002/data/1460960653_1461048959/N1460960868_1_CALIB.IMG', settings.PDS_DATA_DIR+'/calibrated/COISS_2xxx_v2/COISS_2002/data/1460960653_1461048959/N1460960868_1_CALIB.LBL'])])), ('1', OrderedDict([(('Cassini ISS', 10, 'coiss_calib', 'Calibrated Image'), [settings.PDS_DATA_DIR+'/calibrated/COISS_2xxx_v1/COISS_2002/data/1460960653_1461048959/N1460960868_1_CALIB.IMG', settings.PDS_DATA_DIR+'/calibrated/COISS_2xxx_v1/COISS_2002/data/1460960653_1461048959/N1460960868_1_CALIB.LBL'])]))])), ('co-uvis-euv2001_001_02_12', OrderedDict([('Current', OrderedDict([(('Cassini UVIS', 10, 'couvis_raw', 'Raw Data'), [settings.PDS_DATA_DIR+'/volumes/COUVIS_0xxx/COUVIS_0002/DATA/D2001_001/EUV2001_001_02_12.DAT', 
settings.PDS_DATA_DIR+'/volumes/COUVIS_0xxx/COUVIS_0002/DATA/D2001_001/EUV2001_001_02_12.LBL']), (('Cassini UVIS', 20, 'couvis_calib_corr', 'Calibration Data'), [settings.PDS_DATA_DIR+'/volumes/COUVIS_0xxx/COUVIS_0002/CALIB/VERSION_3/D2001_001/EUV2001_001_02_12_CAL_3.DAT', settings.PDS_DATA_DIR+'/volumes/COUVIS_0xxx/COUVIS_0002/CALIB/VERSION_3/D2001_001/EUV2001_001_02_12_CAL_3.LBL']), (('metadata', 5, 'rms_index', 'RMS Node Augmented Index'), [settings.PDS_DATA_DIR+'/metadata/COUVIS_0xxx/COUVIS_0002/COUVIS_0002_index.tab', settings.PDS_DATA_DIR+'/metadata/COUVIS_0xxx/COUVIS_0002/COUVIS_0002_index.lbl']), (('metadata', 9, 'supplemental_index', 'Supplemental Index'), [settings.PDS_DATA_DIR+'/metadata/COUVIS_0xxx/COUVIS_0002/COUVIS_0002_supplemental_index.tab', settings.PDS_DATA_DIR+'/metadata/COUVIS_0xxx/COUVIS_0002/COUVIS_0002_supplemental_index.lbl']), (('browse', 10, 'browse_thumb', 'Browse Image (thumbnail)'), [settings.PDS_DATA_DIR+'/previews/COUVIS_0xxx/COUVIS_0002/DATA/D2001_001/EUV2001_001_02_12_thumb.png']), (('browse', 20, 'browse_small', 'Browse Image (small)'), [settings.PDS_DATA_DIR+'/previews/COUVIS_0xxx/COUVIS_0002/DATA/D2001_001/EUV2001_001_02_12_small.png']), (('browse', 30, 'browse_medium', 'Browse Image (medium)'), [settings.PDS_DATA_DIR+'/previews/COUVIS_0xxx/COUVIS_0002/DATA/D2001_001/EUV2001_001_02_12_med.png']), (('browse', 40, 'browse_full', 'Browse Image (full)'), [settings.PDS_DATA_DIR+'/previews/COUVIS_0xxx/COUVIS_0002/DATA/D2001_001/EUV2001_001_02_12_full.png'])]))])), ('co-vims-v1484504505_ir', OrderedDict([('Current', OrderedDict([(('Cassini VIMS', 0, 'covims_raw', 'Raw Cube'), [settings.PDS_DATA_DIR+'/volumes/COVIMS_0xxx/COVIMS_0006/data/2005015T175855_2005016T184233/v1484504505_4.qub', settings.PDS_DATA_DIR+'/volumes/COVIMS_0xxx/COVIMS_0006/data/2005015T175855_2005016T184233/v1484504505_4.lbl', settings.PDS_DATA_DIR+'/volumes/COVIMS_0xxx/COVIMS_0006/label/band_bin_center.fmt', 
settings.PDS_DATA_DIR+'/volumes/COVIMS_0xxx/COVIMS_0006/label/core_description.fmt', settings.PDS_DATA_DIR+'/volumes/COVIMS_0xxx/COVIMS_0006/label/suffix_description.fmt']), (('Cassini VIMS', 110, 'covims_thumb', 'Extra Preview (thumbnail)'), [settings.PDS_DATA_DIR+'/volumes/COVIMS_0xxx/COVIMS_0006/extras/thumbnail/2005015T175855_2005016T184233/v1484504505_4.qub.jpeg_small']), (('Cassini VIMS', 120, 'covims_medium', 'Extra Preview (medium)'), [settings.PDS_DATA_DIR+'/volumes/COVIMS_0xxx/COVIMS_0006/extras/browse/2005015T175855_2005016T184233/v1484504505_4.qub.jpeg']), (('Cassini VIMS', 130, 'covims_full', 'Extra Preview (full)'), [settings.PDS_DATA_DIR+'/volumes/COVIMS_0xxx/COVIMS_0006/extras/tiff/2005015T175855_2005016T184233/v1484504505_4.qub.tiff']), (('metadata', 5, 'rms_index', 'RMS Node Augmented Index'), [settings.PDS_DATA_DIR+'/metadata/COVIMS_0xxx/COVIMS_0006/COVIMS_0006_index.tab', settings.PDS_DATA_DIR+'/metadata/COVIMS_0xxx/COVIMS_0006/COVIMS_0006_index.lbl']), (('metadata', 9, 'supplemental_index', 'Supplemental Index'), [settings.PDS_DATA_DIR+'/metadata/COVIMS_0xxx/COVIMS_0006/COVIMS_0006_supplemental_index.tab', settings.PDS_DATA_DIR+'/metadata/COVIMS_0xxx/COVIMS_0006/COVIMS_0006_supplemental_index.lbl']), (('metadata', 10, 'inventory', 'Target Body Inventory'), [settings.PDS_DATA_DIR+'/metadata/COVIMS_0xxx/COVIMS_0006/COVIMS_0006_inventory.csv', settings.PDS_DATA_DIR+'/metadata/COVIMS_0xxx/COVIMS_0006/COVIMS_0006_inventory.lbl']), (('browse', 10, 'browse_thumb', 'Browse Image (thumbnail)'), [settings.PDS_DATA_DIR+'/previews/COVIMS_0xxx/COVIMS_0006/data/2005015T175855_2005016T184233/v1484504505_4_thumb.png']), (('browse', 20, 'browse_small', 'Browse Image (small)'), [settings.PDS_DATA_DIR+'/previews/COVIMS_0xxx/COVIMS_0006/data/2005015T175855_2005016T184233/v1484504505_4_small.png']), (('browse', 30, 'browse_medium', 'Browse Image (medium)'), 
[settings.PDS_DATA_DIR+'/previews/COVIMS_0xxx/COVIMS_0006/data/2005015T175855_2005016T184233/v1484504505_4_med.png']), (('browse', 40, 'browse_full', 'Browse Image (full)'), [settings.PDS_DATA_DIR+'/previews/COVIMS_0xxx/COVIMS_0006/data/2005015T175855_2005016T184233/v1484504505_4_full.png'])]))])), ('vg-iss-2-s-c4360048', OrderedDict([('Current', OrderedDict([(('Voyager ISS', 0, 'vgiss_raw', 'Raw Image'), [settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_RAW.IMG', settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_RAW.LBL']), (('Voyager ISS', 10, 'vgiss_cleaned', 'Cleaned Image'), [settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_CLEANED.IMG', settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_CLEANED.LBL']), (('Voyager ISS', 20, 'vgiss_calib', 'Calibrated Image'), [settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_CALIB.IMG', settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_CALIB.LBL']), (('Voyager ISS', 30, 'vgiss_geomed', 'Geometrically Corrected Image'), [settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_GEOMED.IMG', settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_GEOMED.LBL']), (('Voyager ISS', 40, 'vgiss_resloc', 'Reseau Table'), [settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_RESLOC.TAB', settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_RESLOC.DAT', settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_RESLOC.LBL']), (('Voyager ISS', 50, 'vgiss_geoma', 'Geometric Tiepoint Table'), [settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_GEOMA.TAB', settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_GEOMA.DAT', settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_GEOMA.LBL']), (('Voyager ISS', 
60, 'vgiss_raw_browse', 'Extra Preview (raw)'), [settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/BROWSE/C43600XX/C4360048_RAW.JPG', settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/BROWSE/C43600XX/C4360048_RAW.LBL']), (('Voyager ISS', 70, 'vgiss_cleaned_browse', 'Extra Preview (cleaned)'), [settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/BROWSE/C43600XX/C4360048_CLEANED.JPG', settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/BROWSE/C43600XX/C4360048_CLEANED.LBL']), (('Voyager ISS', 80, 'vgiss_calib_browse', 'Extra Preview (calibrated)'), [settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/BROWSE/C43600XX/C4360048_CALIB.JPG', settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/BROWSE/C43600XX/C4360048_CALIB.LBL']), (('Voyager ISS', 90, 'vgiss_geomed_browse', 'Extra Preview (geometrically corrected)'), [settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/BROWSE/C43600XX/C4360048_GEOMED.JPG', settings.PDS_DATA_DIR+'/volumes/VGISS_6xxx/VGISS_6210/BROWSE/C43600XX/C4360048_GEOMED.LBL']), (('metadata', 5, 'rms_index', 'RMS Node Augmented Index'), [settings.PDS_DATA_DIR+'/metadata/VGISS_6xxx/VGISS_6210/VGISS_6210_index.tab', settings.PDS_DATA_DIR+'/metadata/VGISS_6xxx/VGISS_6210/VGISS_6210_index.lbl']), (('metadata', 7, 'raw_image_index', 'Raw Image Index'), [settings.PDS_DATA_DIR+'/metadata/VGISS_6xxx/VGISS_6210/VGISS_6210_raw_image_index.tab', settings.PDS_DATA_DIR+'/metadata/VGISS_6xxx/VGISS_6210/VGISS_6210_raw_image_index.lbl']), (('metadata', 9, 'supplemental_index', 'Supplemental Index'), [settings.PDS_DATA_DIR+'/metadata/VGISS_6xxx/VGISS_6210/VGISS_6210_supplemental_index.tab', settings.PDS_DATA_DIR+'/metadata/VGISS_6xxx/VGISS_6210/VGISS_6210_supplemental_index.lbl']), (('metadata', 10, 'inventory', 'Target Body Inventory'), [settings.PDS_DATA_DIR+'/metadata/VGISS_6xxx/VGISS_6210/VGISS_6210_inventory.csv', settings.PDS_DATA_DIR+'/metadata/VGISS_6xxx/VGISS_6210/VGISS_6210_inventory.lbl']), (('metadata', 20, 'planet_geometry', 'Planet 
Geometry Index'), [settings.PDS_DATA_DIR+'/metadata/VGISS_6xxx/VGISS_6210/VGISS_6210_saturn_summary.tab', settings.PDS_DATA_DIR+'/metadata/VGISS_6xxx/VGISS_6210/VGISS_6210_saturn_summary.lbl']), (('metadata', 40, 'ring_geometry', 'Ring Geometry Index'), [settings.PDS_DATA_DIR+'/metadata/VGISS_6xxx/VGISS_6210/VGISS_6210_ring_summary.tab', settings.PDS_DATA_DIR+'/metadata/VGISS_6xxx/VGISS_6210/VGISS_6210_ring_summary.lbl']), (('browse', 10, 'browse_thumb', 'Browse Image (thumbnail)'), [settings.PDS_DATA_DIR+'/previews/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_thumb.jpg']), (('browse', 20, 'browse_small', 'Browse Image (small)'), [settings.PDS_DATA_DIR+'/previews/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_small.jpg']), (('browse', 30, 'browse_medium', 'Browse Image (medium)'), [settings.PDS_DATA_DIR+'/previews/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_med.jpg']), (('browse', 40, 'browse_full', 'Browse Image (full)'), [settings.PDS_DATA_DIR+'/previews/VGISS_6xxx/VGISS_6210/DATA/C43600XX/C4360048_full.jpg'])]))]))])
        print('Got:')
        print(ret)
        print('Expected:')
        print(expected)
        self.assertEqual(dict(ret), dict(expected))
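The tests above compare `dict(ret)` against `dict(expected)` rather than comparing the `OrderedDict` objects directly. A minimal sketch of why that matters (standard CPython semantics, not code from this test suite): `OrderedDict` equality is order-sensitive, while plain `dict` equality compares contents only, so converting first makes the assertion insensitive to key insertion order.

```python
from collections import OrderedDict

# Same key/value pairs, different insertion order.
a = OrderedDict([('x', 1), ('y', 2)])
b = OrderedDict([('y', 2), ('x', 1)])

# OrderedDict-to-OrderedDict comparison considers order, so these differ.
assert a != b

# Plain dict comparison ignores order, so the converted views are equal.
assert dict(a) == dict(b)
```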
# File: sdk/lusid/api/relation_definitions_api.py
# Repo: mneedham/lusid-sdk-python (MIT license)

# coding: utf-8
"""
LUSID API
FINBOURNE Technology # noqa: E501
The version of the OpenAPI document: 0.11.2808
Contact: info@finbourne.com
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import

import re  # noqa: F401

# python 2 and python 3 compatibility library
import six

from lusid.api_client import ApiClient
from lusid.exceptions import (
    ApiTypeError,
    ApiValueError
)

class RelationDefinitionsApi(object):
    """NOTE: This class is auto generated by OpenAPI Generator

    Ref: https://openapi-generator.tech
    Do not edit the class manually.
    """

    def __init__(self, api_client=None):
        if api_client is None:
            api_client = ApiClient()
        self.api_client = api_client
    def create_relation_definition(self, create_relation_definition_request, **kwargs):  # noqa: E501
        """[DEPRECATED] Create a relation definition  # noqa: E501

        Define a new relation.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_relation_definition(create_relation_definition_request, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param CreateRelationDefinitionRequest create_relation_definition_request: The definition of the new relation. (required)
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: RelationDefinition
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        return self.create_relation_definition_with_http_info(create_relation_definition_request, **kwargs)  # noqa: E501
    def create_relation_definition_with_http_info(self, create_relation_definition_request, **kwargs):  # noqa: E501
        """[DEPRECATED] Create a relation definition  # noqa: E501

        Define a new relation.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_relation_definition_with_http_info(create_relation_definition_request, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param CreateRelationDefinitionRequest create_relation_definition_request: The definition of the new relation. (required)
        :param _return_http_data_only: response data without head status code
                                       and headers
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: tuple(RelationDefinition, status_code(int), headers(HTTPHeaderDict))
                 If the method is called asynchronously,
                 returns the request thread.
        """

        local_var_params = locals()

        all_params = ['create_relation_definition_request']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise ApiTypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_relation_definition" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']
        # verify the required parameter 'create_relation_definition_request' is set
        if ('create_relation_definition_request' not in local_var_params or
                local_var_params['create_relation_definition_request'] is None):
            raise ApiValueError("Missing the required parameter `create_relation_definition_request` when calling `create_relation_definition`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'create_relation_definition_request' in local_var_params:
            body_params = local_var_params['create_relation_definition_request']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['text/plain', 'application/json', 'text/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json-patch+json', 'application/json', 'text/json', 'application/*+json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        # set the LUSID header
        header_params['X-LUSID-SDK-Language'] = 'Python'
        header_params['X-LUSID-SDK-Version'] = '0.11.2808'

        return self.api_client.call_api(
            '/api/relationdefinitions', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='RelationDefinition',  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)
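The `get_relation_definition_with_http_info` method below validates its `scope` and `code` path parameters before issuing the request: each must be 1 to 64 characters long and match `^[a-zA-Z0-9\-_]+$`. A standalone sketch of that check (`is_valid_identifier` is a hypothetical name; the bounds and pattern are copied from the generated validations):

```python
import re

def is_valid_identifier(value):
    # Length bounds taken from the generated checks: 1..64 characters.
    if not 1 <= len(value) <= 64:
        return False
    # Pattern taken from the generated checks: letters, digits, '-' and '_'.
    return re.search(r'^[a-zA-Z0-9\-_]+$', value) is not None

assert is_valid_identifier('My-Scope_01')
assert not is_valid_identifier('bad scope!')   # space is not allowed
assert not is_valid_identifier('')             # too short
assert not is_valid_identifier('x' * 65)       # too long
```

Failing either check in the real client raises `ApiValueError` before any HTTP call is made, so malformed identifiers never reach the server.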
    def get_relation_definition(self, scope, code, **kwargs):  # noqa: E501
        """[DEPRECATED] Get relation definition  # noqa: E501

        Retrieve the definition of a specified relation.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_relation_definition(scope, code, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the specified relation. (required)
        :param str code: The code of the specified relation. Together with the domain and scope this uniquely identifies the relation. (required)
        :param datetime as_at: The asAt datetime at which to retrieve the relation definition. Defaults to return the latest version of the definition if not specified.
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: RelationDefinition
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        return self.get_relation_definition_with_http_info(scope, code, **kwargs)  # noqa: E501
    def get_relation_definition_with_http_info(self, scope, code, **kwargs):  # noqa: E501
        """[DEPRECATED] Get relation definition  # noqa: E501

        Retrieve the definition of a specified relation.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_relation_definition_with_http_info(scope, code, async_req=True)
        >>> result = thread.get()

        :param async_req bool: execute request asynchronously
        :param str scope: The scope of the specified relation. (required)
        :param str code: The code of the specified relation. Together with the domain and scope this uniquely identifies the relation. (required)
        :param datetime as_at: The asAt datetime at which to retrieve the relation definition. Defaults to return the latest version of the definition if not specified.
        :param _return_http_data_only: response data without head status code
                                       and headers
        :param _preload_content: if False, the urllib3.HTTPResponse object will
                                 be returned without reading/decoding response
                                 data. Default is True.
        :param _request_timeout: timeout setting for this request. If one
                                 number provided, it will be total request
                                 timeout. It can also be a pair (tuple) of
                                 (connection, read) timeouts.
        :return: tuple(RelationDefinition, status_code(int), headers(HTTPHeaderDict))
                 If the method is called asynchronously,
                 returns the request thread.
        """

        local_var_params = locals()

        all_params = ['scope', 'code', 'as_at']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        for key, val in six.iteritems(local_var_params['kwargs']):
            if key not in all_params:
                raise ApiTypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_relation_definition" % key
                )
            local_var_params[key] = val
        del local_var_params['kwargs']

        if ('scope' in local_var_params and
                len(local_var_params['scope']) > 64):
            raise ApiValueError("Invalid value for parameter `scope` when calling `get_relation_definition`, length must be less than or equal to `64`")  # noqa: E501
        if ('scope' in local_var_params and
                len(local_var_params['scope']) < 1):
            raise ApiValueError("Invalid value for parameter `scope` when calling `get_relation_definition`, length must be greater than or equal to `1`")  # noqa: E501
        if 'scope' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['scope']):  # noqa: E501
            raise ApiValueError("Invalid value for parameter `scope` when calling `get_relation_definition`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`")  # noqa: E501
        if ('code' in local_var_params and
                len(local_var_params['code']) > 64):
            raise ApiValueError("Invalid value for parameter `code` when calling `get_relation_definition`, length must be less than or equal to `64`")  # noqa: E501
        if ('code' in local_var_params and
                len(local_var_params['code']) < 1):
            raise ApiValueError("Invalid value for parameter `code` when calling `get_relation_definition`, length must be greater than or equal to `1`")  # noqa: E501
        if 'code' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['code']):  # noqa: E501
            raise ApiValueError("Invalid value for parameter `code` when calling `get_relation_definition`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'scope' in local_var_params:
            path_params['scope'] = local_var_params['scope']  # noqa: E501
        if 'code' in local_var_params:
            path_params['code'] = local_var_params['code']  # noqa: E501

        query_params = []
        if 'as_at' in local_var_params:
            query_params.append(('asAt', local_var_params['as_at']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['text/plain', 'application/json', 'text/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        # set the LUSID header
        header_params['X-LUSID-SDK-Language'] = 'Python'
        header_params['X-LUSID-SDK-Version'] = '0.11.2808'

        return self.api_client.call_api(
            '/api/relationdefinitions/{scope}/{code}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='RelationDefinition',  # noqa: E501
            auth_settings=auth_settings,
            async_req=local_var_params.get('async_req'),
            _return_http_data_only=local_var_params.get('_return_http_data_only'),  # noqa: E501
            _preload_content=local_var_params.get('_preload_content', True),
            _request_timeout=local_var_params.get('_request_timeout'),
            collection_formats=collection_formats)
# estoria_app/tests.py (estoriadigital/Estoria-Admin, MIT license)
from .tasks import reader_xml, estoria_xml, critical_edition_first, bake_chapters
from .apps import EstoriaAppConfig
from django.apps import apps
from django.test import TestCase, override_settings
from django.urls import reverse
from django.core.files.base import ContentFile
from django.conf import settings
from testfixtures import log_capture
from unittest.mock import patch
from selenium.webdriver import FirefoxOptions
import selenium
import os
def mocked_check_call(args, cwd):
    """
    mock subprocess.check_call
    """
    return True
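The helper above stands in for `subprocess.check_call` so the Celery tasks can be exercised without spawning real processes. A minimal, self-contained sketch of the same `side_effect` pattern (plain `unittest.mock`, no Django or Celery required; the `fake_check_call` name is illustrative, not part of the project):

```python
import subprocess
from unittest.mock import patch


def fake_check_call(args, cwd=None):
    """Stand-in for subprocess.check_call: succeed without running anything."""
    return True


with patch('subprocess.check_call', side_effect=fake_check_call) as mocked:
    # Code under test that shells out now hits the fake instead of the shell.
    result = subprocess.check_call(['make', 'build'], cwd='/tmp')

# The side_effect's return value is passed through, and the mock records
# the call so the test can assert on it afterwards.
assert result is True
assert mocked.called
```

Because the decorated test methods receive the mock object as an argument, they can make the same `mocked.called`-style assertions seen in the view tests below.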
class TestEstoriaApp(TestCase):
    """
    Test apps.py
    """
    def test_apps(self):
        self.assertEqual(EstoriaAppConfig.name, 'estoria_app')
        self.assertEqual(apps.get_app_config('estoria_app').name, 'estoria_app')
class Test1EstoriaXml(TestCase):
    """
    Test estoria_xml Celery task
    """
    @log_capture('estoria_app.tasks')
    @patch('subprocess.check_call', side_effect=mocked_check_call)
    def test_estoria_xml_run_task(self, mocked_check_call, capture):
        """
        test the estoria_xml Celery task
        """
        data_path = settings.ESTORIA_DATA_PATH
        scripts_path = os.path.join(settings.ESTORIA_BASE_LOCATION, 'estoria-digital/editions/src/assets/scripts')
        self.task = estoria_xml.apply(args=[data_path, scripts_path])
        self.results = self.task.get()
        self.assertEqual(self.task.state, 'SUCCESS')
        capture.check(
            ('estoria_app.tasks', 'INFO', '{}: estoria_xml task started'.format(self.task.id)),
            ('estoria_app.tasks', 'DEBUG',
             '{}: Data path: {}'.format(self.task.id, settings.ESTORIA_DATA_PATH)),
            ('estoria_app.tasks', 'DEBUG',
             '{}: Scripts location: {}'.format(self.task.id,
                                               os.path.join(settings.ESTORIA_BASE_LOCATION,
                                                            'estoria-digital/editions/src/assets/scripts'))),
            ('estoria_app.tasks', 'DEBUG', '{}: run make_paginated_json.py'.format(self.task.id)),
            ('estoria_app.tasks', 'DEBUG', '{}: run add_html_to_paginated_json.py'.format(self.task.id)),
            ('estoria_app.tasks', 'DEBUG', '{}: run make_chapter_index_json.py'.format(self.task.id)),
            ('estoria_app.tasks', 'INFO', '{}: complete'.format(self.task.id)),
        )
class Test2ReaderXml(TestCase):
    """
    Test reader_xml Celery task
    """
    @log_capture('estoria_app.tasks')
    @patch('subprocess.check_call', side_effect=mocked_check_call)
    def test_reader_xml_run_task(self, mocked_check_call, capture):
        """
        Test reader_xml Celery task
        """
        data_path = settings.ESTORIA_DATA_PATH
        scripts_path = os.path.join(settings.ESTORIA_BASE_LOCATION, 'estoria-digital/editions/src/assets/scripts')
        self.task = reader_xml.apply(args=[data_path, scripts_path])
        self.results = self.task.get()
        self.assertEqual(self.task.state, 'SUCCESS')
        capture.check(
            ('estoria_app.tasks', 'INFO', '{}: reader_xml task started'.format(self.task.id)),
            ('estoria_app.tasks', 'DEBUG',
             '{}: Data path: {}'.format(self.task.id, settings.ESTORIA_DATA_PATH)),
            ('estoria_app.tasks', 'DEBUG',
             '{}: Scripts location: {}'.format(self.task.id,
                                               os.path.join(settings.ESTORIA_BASE_LOCATION,
                                                            'estoria-digital/editions/src/assets/scripts'))),
            ('estoria_app.tasks', 'DEBUG', '{}: run make_reader.py'.format(self.task.id)),
            ('estoria_app.tasks', 'INFO', '{}: complete'.format(self.task.id)),
        )
class Test3CriticalEditionFirst(TestCase):
    """
    Test the critical_edition_first Celery task
    """
    @log_capture('estoria_app.tasks')
    @patch('subprocess.check_call', side_effect=mocked_check_call)
    def test_critical_edition_first_run_task(self, mocked_check_call, capture):
        """
        Test the critical_edition_first Celery task
        """
        data_path = settings.ESTORIA_DATA_PATH
        scripts_path = os.path.join(settings.ESTORIA_BASE_LOCATION, 'estoria-digital/editions/src/assets/scripts')
        self.task = critical_edition_first.apply(args=[data_path, scripts_path])
        self.results = self.task.get()
        self.assertEqual(self.task.state, 'SUCCESS')
        capture.check(
            ('estoria_app.tasks', 'INFO', '{}: critical_edition_first task started'.format(self.task.id)),
            ('estoria_app.tasks', 'DEBUG',
             '{}: Data path: {}'.format(self.task.id, settings.ESTORIA_DATA_PATH)),
            ('estoria_app.tasks', 'DEBUG',
             '{}: Scripts location: {}'.format(self.task.id,
                                               os.path.join(settings.ESTORIA_BASE_LOCATION,
                                                            'estoria-digital/editions/src/assets/scripts'))),
            ('estoria_app.tasks', 'DEBUG', '{}: run make_critical_chapter_verse_json.py'.format(self.task.id)),
            ('estoria_app.tasks', 'DEBUG', '{}: run make_verse_page_index_json.py'.format(self.task.id)),
            ('estoria_app.tasks', 'INFO', '{}: complete'.format(self.task.id)),
        )
class Test4BakeChapters(TestCase):
    """
    Test bake_chapters Celery task
    """
    @patch.object(selenium.webdriver.FirefoxOptions, 'add_argument')
    @patch('selenium.webdriver.Firefox')
    @patch('builtins.open')
    @log_capture('estoria_app.tasks')
    def test_bake_chapters_run_task(self, capture, mocked_open, mocked_firefox):
        """
        Test bake_chapters Celery task
        """
        baking_url = os.path.join(settings.ADMIN_TOOLS_LOCATION, 'apparatus/estoria-digital')
        data_path = settings.ESTORIA_DATA_PATH
        self.task = bake_chapters.apply(args=(101, 101, baking_url, data_path))
        self.results = self.task.get()
        self.assertEqual(self.task.state, 'SUCCESS')
        self.assertTrue(mocked_open.called)
        self.assertTrue(mocked_firefox.called)
        capture.check(
            ('estoria_app.tasks', 'INFO', '{}: bake_chapters task started'.format(self.task.id)),
            ('estoria_app.tasks', 'DEBUG', '{}: Baking chapters: 101 to 101'.format(self.task.id)),
            ('estoria_app.tasks', 'DEBUG', '{}: Bake chapter: 101 at {}'.format(self.task.id, baking_url)),
            ('estoria_app.tasks', 'INFO', '{}: complete'.format(self.task.id)),
        )
class TestIndexView(TestCase):
    """
    Test Index Views
    """
    def test_index_empty_get(self):
        """
        Test the simple index page is returned
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('index')
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<p>Management tools for the website.</p>')

    def test_index_nonempty_get(self):
        """
        Test the simple index page is returned
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('index')
        response = self.client.get(url, {'nonsense': 'aaa'})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<p>Management tools for the website.</p>')


# TODO add tests for selection of project if none selected
class TestTranscriptionsView(TestCase):
    """
    Test Transcriptions Views
    """
    def test_transcriptions_empty_get(self):
        """
        Empty GET request of the transcriptions page
        should get a form
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('transcriptions')
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>Transcriptions</h1>')
        self.assertContains(response, '<form method="post" enctype="multipart/form-data">')

    def test_transcriptions_nonsense_get(self):
        """
        Nonsense GET request of the transcriptions page
        should get a form
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('transcriptions')
        response = self.client.get(url, {'nonsense': 'aaa'})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>Transcriptions</h1>')
        self.assertContains(response, '<form method="post" enctype="multipart/form-data">')

    def test_transcriptions_nonsense_post(self):
        """
        Nonsense POST request of the transcriptions page
        should get a form
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('transcriptions')
        response = self.client.post(url, {'nonsense': 'aaa'})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>Transcriptions</h1>')
        self.assertContains(response, '<form method="post" enctype="multipart/form-data">')

    def test_transcriptions_job_get(self):
        """
        'job' GET request of the transcriptions page
        should get the job status check page
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('transcriptions')
        response = self.client.get(url, {'job': 'aaa'})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>Transcriptions</h1>')
        self.assertContains(response, '<p id="user-count">Checking the server for the task.</p>')

    @patch('estoria_app.tasks.estoria_xml.delay')
    def test_transcriptions_rebuild_post(self, mocked_task):
        """
        'rebuild' POST request of the transcriptions page
        should set off the task and push the user to the job status page
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('transcriptions')
        response = self.client.post(url, {'rebuild': 'Rebuild'})
        self.assertEqual(response.status_code, 302)
        self.assertTrue(response['Location'].startswith('?job='))
        self.assertTrue(mocked_task.called)

    def test_transcriptions_upload_empty_post(self):
        """
        'upload' POST request of the transcriptions page, without a file
        should get an error message
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('transcriptions')
        response = self.client.post(url, {'upload': 'Upload'})
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.context['message'], 'There was a problem with the file upload')

    def test_transcriptions_upload_xmlfile_post(self):
        """
        'upload' POST request of the transcriptions page, with a valid XML file
        should upload the file and report success
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('transcriptions')
        faked_file = ContentFile('<a><b></b></a>')
        faked_file.name = 'test.xml'
        response = self.client.post(url, {'upload': 'Upload', 'xmlfile': faked_file})
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.context['message'], 'File uploaded')
        os.remove(os.path.join(settings.ESTORIA_BASE_LOCATION, session['project'], 'transcriptions/manuscripts/test.xml'))

    def test_transcriptions_upload_nonxmlfile_post(self):
        """
        'upload' POST request of the transcriptions page, with an invalid XML file
        should get an error message
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('transcriptions')
        faked_file = ContentFile('hello world')
        faked_file.name = 'test.xml'
        response = self.client.post(url, {'upload': 'Upload', 'xmlfile': faked_file})
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.context['message'], 'The uploaded file is not valid XML')
class TestReaderxmlView(TestCase):
    """
    Test Reader XML Views
    """
    def test_readerxml_empty_get(self):
        """
        Empty GET request of the readerxml page
        should get a form
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('readerxml')
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>ReaderXML</h1>')
        self.assertContains(response, '<form method="post" enctype="multipart/form-data">')

    def test_readerxml_nonsense_get(self):
        """
        Nonsense GET request of the readerxml page
        should get a form
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('readerxml')
        response = self.client.get(url, {'nonsense': 'aaa'})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>ReaderXML</h1>')
        self.assertContains(response, '<form method="post" enctype="multipart/form-data">')

    def test_readerxml_nonsense_post(self):
        """
        Nonsense POST request of the readerxml page
        should get a form
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('readerxml')
        response = self.client.post(url, {'nonsense': 'aaa'})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>ReaderXML</h1>')
        self.assertContains(response, '<form method="post" enctype="multipart/form-data">')

    def test_readerxml_job_get(self):
        """
        'job' GET request of the readerxml page
        should get the job status check page
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('readerxml')
        response = self.client.get(url, {'job': 'aaa'})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>ReaderXML</h1>')
        self.assertContains(response, '<p id="user-count">Checking the server for the task.</p>')

    @patch('estoria_app.tasks.reader_xml.delay')
    def test_readerxml_rebuild_post(self, mocked_task):
        """
        'rebuild' POST request of the readerxml page
        should set off the task and push the user to the job status page
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('readerxml')
        response = self.client.post(url, {'rebuild': 'Rebuild'})
        self.assertEqual(response.status_code, 302)
        self.assertTrue(response['Location'].startswith('?job='))
        self.assertTrue(mocked_task.called)

    def test_readerxml_upload_empty_post(self):
        """
        'upload' POST request of the readerxml page, without a file
        should get an error message
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('readerxml')
        response = self.client.post(url, {'upload': 'Upload'})
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.context['message'], 'There was a problem with the file upload')

    def test_readerxml_upload_xmlfile_post(self):
        """
        'upload' POST request of the readerxml page, with a valid XML file
        should upload the file and report success
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('readerxml')
        faked_file = ContentFile('<a><b></b></a>')
        faked_file.name = 'test.xml'
        response = self.client.post(url, {'upload': 'Upload', 'xmlfile': faked_file})
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.context['message'], 'File uploaded')
        os.remove(os.path.join(settings.ESTORIA_BASE_LOCATION, session['project'], 'transcriptions/readerXML/test.xml'))

    def test_readerxml_upload_nonxmlfile_post(self):
        """
        'upload' POST request of the readerxml page, with an invalid XML file
        should get an error message
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('readerxml')
        faked_file = ContentFile('hello world')
        faked_file.name = 'test.xml'
        response = self.client.post(url, {'upload': 'Upload', 'xmlfile': faked_file})
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.context['message'], 'The uploaded file is not valid XML')
class TestBakingView(TestCase):
    """
    Test Baking Views
    """
    def test_baking_empty_get(self):
        """
        Empty GET request of the baking page
        should get a form
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('apparatus')
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>Baking</h1>')
        self.assertContains(response, '<form method="post">')

    def test_baking_nonsense_get(self):
        """
        Nonsense GET request of the baking page
        should get a form
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('apparatus')
        response = self.client.get(url, {'nonsense': 'aaa'})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>Baking</h1>')
        self.assertContains(response, '<form method="post">')

    def test_baking_nonsense_post(self):
        """
        Nonsense POST request of the baking page
        should get a form
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('apparatus')
        response = self.client.post(url, {'nonsense': 'aaa'})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>Baking</h1>')
        self.assertContains(response, '<form method="post">')

    def test_baking_job_get(self):
        """
        'job' GET request of the baking page
        should get the job status check page
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('apparatus')
        response = self.client.get(url, {'job': 'aaa'})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>Baking Chapters</h1>')
        self.assertContains(response, '<p id="user-count">Checking the server for the task.</p>')

    @patch('estoria_app.tasks.bake_chapters.delay')
    def test_baking_sensible_range_post(self, mocked_task):
        """
        'range' POST request of the baking page, with sensible input
        should set off the task and push the user to the job status page
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('apparatus')
        response = self.client.post(url, {'range': 'Bake', 'start_chapter': 9, 'stop_chapter': 10})
        self.assertEqual(response.status_code, 302)
        self.assertTrue(response['Location'].startswith('?job='))
        self.assertTrue(mocked_task.called)

    def test_baking_backwards_range_post(self):
        """
        'range' POST request of the baking page, with impossible input
        should get an error message
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('apparatus')
        response = self.client.post(url, {'range': 'Bake', 'start_chapter': 10, 'stop_chapter': 1})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>Baking</h1>')
        self.assertContains(response, '<form method="post">')
        self.assertEqual(response.context['message'], 'There was a problem with the supplied chapters to bake!')

    def test_baking_nonnumerical_range_post(self):
        """
        'range' POST request of the baking page, with non-numeric input
        should get an error message
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('apparatus')
        response = self.client.post(url, {'range': 'Bake', 'start_chapter': 'aaa', 'stop_chapter': 'bbb'})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>Baking</h1>')
        self.assertContains(response, '<form method="post">')
        self.assertEqual(response.context['message'], 'There was a problem with the supplied chapters to bake!')

    @patch('estoria_app.tasks.bake_chapters.delay')
    def test_baking_sensible_one_post(self, mocked_task):
        """
        'one' POST request of the baking page, with sensible input
        should set off the task and push the user to the job status page
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('apparatus')
        response = self.client.post(url, {'one': 'Bake', 'chapter': 5})
        self.assertEqual(response.status_code, 302)
        self.assertTrue(response['Location'].startswith('?job='))
        self.assertTrue(mocked_task.called)

    def test_baking_nonnumerical_one_post(self):
        """
        'one' POST request of the baking page, with non-numerical input
        should get an error message
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('apparatus')
        response = self.client.post(url, {'one': 'Bake', 'chapter': 'aaa'})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>Baking</h1>')
        self.assertContains(response, '<form method="post">')
        self.assertEqual(response.context['message'], 'There was a problem with the supplied chapters to bake!')

    def test_baking_above_maximum_one_post(self):
        """
        'one' POST request of the baking page, with above maximum possible chapter input
        should get an error message
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('apparatus')
        response = self.client.post(url, {'one': 'Bake', 'chapter': 1000000})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>Baking</h1>')
        self.assertContains(response, '<form method="post">')
        self.assertEqual(response.context['message'], 'There was a problem with the supplied chapters to bake!')
class TestCriticalView(TestCase):
    """
    Test Critical Edition Views
    """
    def test_critical_empty_get(self):
        """
        Empty GET request of the critical edition page
        should get a form
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('critical')
        response = self.client.get(url)
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>Critical Edition</h1>')
        self.assertContains(response, '<form method="post">')

    def test_critical_nonsense_get(self):
        """
        Nonsense GET request of the critical edition page
        should get a form
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('critical')
        response = self.client.get(url, {'nonsense': 'aaa'})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>Critical Edition</h1>')
        self.assertContains(response, '<form method="post">')

    def test_critical_nonsense_post(self):
        """
        Nonsense POST request of the critical edition page
        should get a form
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('critical')
        response = self.client.post(url, {'nonsense': 'aaa'})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>Critical Edition</h1>')
        self.assertContains(response, '<form method="post">')

    def test_critical_job_get(self):
        """
        'job' GET request of the critical edition page
        should get the job status check page
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('critical')
        response = self.client.get(url, {'job': 'aaa'})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>Critical Edition</h1>')
        self.assertContains(response, '<p id="user-count">Checking the server for the task.</p>')

    @patch('estoria_app.tasks.critical_edition_first.delay')
    def test_critical_rebuild_first_post(self, mocked_task):
        """
        'rebuildfirst' POST request of the critical edition page
        should set off the task and push the user to the job status page
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        url = reverse('critical')
        response = self.client.post(url, {'rebuildfirst': 'Rebuild First Part'})
        self.assertEqual(response.status_code, 302)
        self.assertTrue(response['Location'].startswith('?job='))
        self.assertTrue(mocked_task.called)

    def test_critical_rebuild_first_with_non_existent_page_post(self):
        """
        'rebuildfirst' POST request of the critical edition page when the collation index does not exist on the server
        (the problem with the index is done by messing, temporarily, with the file locations)
        should get an error message
        """
        session = self.client.session
        session['project'] = 'estoria-digital'
        session.save()
        estoria_location = settings.ESTORIA_BASE_LOCATION
        settings.ESTORIA_BASE_LOCATION = 'this_does_not_exist'
        url = reverse('critical')
        response = self.client.post(url, {'rebuildfirst': 'Rebuild First Part'})
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, '<h1>Critical Edition</h1>')
        self.assertContains(response, '<form method="post">')
        self.assertEqual(response.context['message'], 'There is a problem. The collation does not appear to exist.')
        settings.ESTORIA_BASE_LOCATION = estoria_location
# devilry/devilry_admin/tests/cradminextensions/test_multiselect2/test_multiselect2_relatedstudent.py
# (devilry/devilry-django, BSD-3-Clause license)
import htmls
import mock
from django import test
from django import forms
from model_bakery import baker
from devilry.apps.core.models import RelatedStudent
from devilry.devilry_admin.cradminextensions.multiselect2 import multiselect2_relatedstudent
class TestSelectedItem(test.TestCase):
    def test_title_without_fullname(self):
        relatedstudent = baker.make('core.RelatedStudent',
                                    user__shortname='test@example.com',
                                    user__fullname='')
        relatedstudent = RelatedStudent.objects.prefetch_syncsystemtag_objects().get(id=relatedstudent.id)
        selector = htmls.S(multiselect2_relatedstudent.SelectedItem(value=relatedstudent).render())
        self.assertEqual(
            'test@example.com',
            selector.one('.cradmin-legacy-multiselect2-target-selected-item-title').alltext_normalized)

    def test_title_with_fullname(self):
        relatedstudent = baker.make('core.RelatedStudent',
                                    user__fullname='Test User',
                                    user__shortname='test@example.com')
        relatedstudent = RelatedStudent.objects.prefetch_syncsystemtag_objects().get(id=relatedstudent.id)
        selector = htmls.S(multiselect2_relatedstudent.SelectedItem(value=relatedstudent).render())
        self.assertEqual(
            'Test User(test@example.com)',
            selector.one('.cradmin-legacy-multiselect2-target-selected-item-title').alltext_normalized)

    def test_description_without_tags(self):
        relatedstudent = baker.make('core.RelatedStudent')
        relatedstudent = RelatedStudent.objects.prefetch_syncsystemtag_objects().get(id=relatedstudent.id)
        selector = htmls.S(multiselect2_relatedstudent.SelectedItem(value=relatedstudent).render())
        self.assertFalse(
            selector.exists('.cradmin-legacy-multiselect2-target-selected-item-description'))

    # def test_description_with_tags(self):
    #     relatedstudent = baker.make('core.RelatedStudent')
    #     baker.make('core.RelatedStudentTag', tag='a', relatedstudent=relatedstudent)
    #     baker.make('core.RelatedStudentTag', tag='b', relatedstudent=relatedstudent)
    #     relatedstudent = RelatedStudent.objects.prefetch_syncsystemtag_objects().get(id=relatedstudent.id)
    #     selector = htmls.S(multiselect2_relatedstudent.SelectedItem(value=relatedstudent).render())
    #     self.assertEqual(
    #         'a, b',
    #         selector.one('.cradmin-legacy-multiselect2-target-selected-item-description').alltext_normalized)

    def test_description_with_tags(self):
        testperiod = baker.make('core.Period')
        testperiodtag1 = baker.make('core.PeriodTag', period=testperiod, tag='a')
        testperiodtag2 = baker.make('core.PeriodTag', period=testperiod, tag='b')
        relatedstudent = baker.make('core.RelatedStudent', period=testperiod)
        testperiodtag1.relatedstudents.add(relatedstudent)
        testperiodtag2.relatedstudents.add(relatedstudent)
        relatedstudent = RelatedStudent.objects.prefetch_syncsystemtag_objects().get(id=relatedstudent.id)
        selector = htmls.S(multiselect2_relatedstudent.ItemValue(value=relatedstudent).render())
        self.assertEqual(
            'a, b',
            selector.one('.cradmin-legacy-listbuilder-itemvalue-titledescription-description').alltext_normalized)
class TestItemValue(test.TestCase):
    def test_title_without_fullname(self):
        relatedstudent = baker.make('core.RelatedStudent',
                                    user__shortname='test@example.com',
                                    user__fullname='')
        relatedstudent = RelatedStudent.objects.prefetch_syncsystemtag_objects().get(id=relatedstudent.id)
        selector = htmls.S(multiselect2_relatedstudent.ItemValue(value=relatedstudent).render())
        self.assertEqual(
            'test@example.com',
            selector.one('.cradmin-legacy-listbuilder-itemvalue-titledescription-title').alltext_normalized)

    def test_title_with_fullname(self):
        relatedstudent = baker.make('core.RelatedStudent',
                                    user__fullname='Test User',
                                    user__shortname='test@example.com')
        relatedstudent = RelatedStudent.objects.prefetch_syncsystemtag_objects().get(id=relatedstudent.id)
        selector = htmls.S(multiselect2_relatedstudent.ItemValue(value=relatedstudent).render())
        self.assertEqual(
            'Test User(test@example.com)',
            selector.one('.cradmin-legacy-listbuilder-itemvalue-titledescription-title').alltext_normalized)

    def test_description_without_tags(self):
        relatedstudent = baker.make('core.RelatedStudent')
        relatedstudent = RelatedStudent.objects.prefetch_syncsystemtag_objects().get(id=relatedstudent.id)
        selector = htmls.S(multiselect2_relatedstudent.ItemValue(value=relatedstudent).render())
        self.assertFalse(
            selector.exists('.cradmin-legacy-listbuilder-itemvalue-titledescription-description'))

    def test_description_with_tags(self):
        testperiod = baker.make('core.Period')
        testperiodtag1 = baker.make('core.PeriodTag', period=testperiod, tag='a')
        testperiodtag2 = baker.make('core.PeriodTag', period=testperiod, tag='b')
        relatedstudent = baker.make('core.RelatedStudent', period=testperiod)
        testperiodtag1.relatedstudents.add(relatedstudent)
        testperiodtag2.relatedstudents.add(relatedstudent)
        relatedstudent = RelatedStudent.objects.prefetch_syncsystemtag_objects().get(id=relatedstudent.id)
        selector = htmls.S(multiselect2_relatedstudent.ItemValue(value=relatedstudent).render())
        self.assertEqual(
            'a, b',
            selector.one('.cradmin-legacy-listbuilder-itemvalue-titledescription-description').alltext_normalized)
class TestTarget(test.TestCase):
    def test_with_items_title(self):
        selector = htmls.S(multiselect2_relatedstudent.Target(form=forms.Form()).render(request=mock.MagicMock()))
        self.assertEqual(
            'Selected students',
            selector.one('.cradmin-legacy-multiselect2-target-title').alltext_normalized)

    def test_without_items_text(self):
        selector = htmls.S(multiselect2_relatedstudent.Target(form=forms.Form()).render(request=mock.MagicMock()))
        self.assertEqual(
            'No students selected',
            selector.one('.cradmin-legacy-multiselect2-target-without-items-content').alltext_normalized)
# moa/trace/exec.py (Security-Banana-Group/MoA, MIT license)
from moa.trace.command import run_trace
def exec_trace():
    run_trace()
# test/test_loadTable.py (dolphindb/api_python3, Apache-2.0 license)
import unittest
import dolphindb as ddb
from numpy import repeat
from numpy.testing import assert_array_equal
from pandas.testing import assert_frame_equal
from setup import HOST, PORT, WORK_DIR
class DBInfo:
    dfsDBName = 'dfs://testLoadTable'
    diskDBName = WORK_DIR + '/testLoadTable'
    table1 = 'tb1'
    table2 = 'tb2'
def create_dfs_dimension_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db=database(dbPath,RANGE,1..10)
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createTable(tdata,`{tb1}).append!(tdata)
    db.createTable(tdata,`{tb2}).append!(tdata)
    '''.format(db=DBInfo.dfsDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_dfs_range_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db=database(dbPath,RANGE,0..10*10000+1)
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, 1..n as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`id).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`id).append!(tdata)
    '''.format(db=DBInfo.dfsDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_dfs_hash_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db=database(dbPath,HASH,[INT,10])
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`id).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`id).append!(tdata)
    '''.format(db=DBInfo.dfsDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_dfs_value_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db=database(dbPath,VALUE,2010.01.01..2010.01.30)
    n=100000
    tdata=table(sort(take(2010.01.01..2010.01.30, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`date).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`date).append!(tdata)
    '''.format(db=DBInfo.dfsDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_dfs_list_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db=database(dbPath,LIST,[`AMD`QWE`CES,`DOP`ASZ,`FSD`BBVC,`AWQ`DS])
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`sym).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`sym).append!(tdata)
    '''.format(db=DBInfo.dfsDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_dfs_compo_range_range_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db1=database('',RANGE,2010.01M+0..12)
    db2=database('',RANGE,1 3 5 7 9 11)
    db=database(dbPath,COMPO,[db1,db2])
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`date`id).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`date`id).append!(tdata)
    '''.format(db=DBInfo.dfsDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_dfs_compo_range_hash_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db1=database('',RANGE,2010.01M+0..12)
    db2=database('',HASH,[INT,10])
    db=database(dbPath,COMPO,[db1,db2])
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`date`id).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`date`id).append!(tdata)
    '''.format(db=DBInfo.dfsDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_dfs_compo_range_value_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db1=database('',RANGE,2010.01M+0..12)
    db2=database('',VALUE,1..10)
    db=database(dbPath,COMPO,[db1,db2])
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`date`id).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`date`id).append!(tdata)
    '''.format(db=DBInfo.dfsDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_dfs_compo_range_list_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db1=database('',RANGE,2010.01M+0..12)
    db2=database('',LIST,[`AMD`QWE`CES,`DOP`ASZ,`FSD`BBVC,`AWQ`DS])
    db=database(dbPath,COMPO,[db1,db2])
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`date`sym).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`date`sym).append!(tdata)
    '''.format(db=DBInfo.dfsDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_dfs_compo_range_hash_list_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db1=database('',RANGE,2010.01M+0..12)
    db2=database('',HASH,[INT,10])
    db3=database('',LIST,[`AMD`QWE`CES,`DOP`ASZ,`FSD`BBVC,`AWQ`DS])
    db=database(dbPath,COMPO,[db1,db2,db3])
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`date`id`sym).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`date`id`sym).append!(tdata)
    '''.format(db=DBInfo.dfsDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_dfs_compo_range_value_list_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db1=database('',RANGE,2010.01M+0..12)
    db2=database('',VALUE,1..10)
    db3=database('',LIST,[`AMD`QWE`CES,`DOP`ASZ,`FSD`BBVC,`AWQ`DS])
    db=database(dbPath,COMPO,[db1,db2,db3])
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`date`id`sym).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`date`id`sym).append!(tdata)
    '''.format(db=DBInfo.dfsDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()
def create_disk_unpartitioned_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(exists(dbPath))
        dropDatabase(dbPath)
    db=database(dbPath)
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, 1..n as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    saveTable(db,tdata,`{tb1})
    saveTable(db,tdata,`{tb2})
    '''.format(db=DBInfo.diskDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_disk_range_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db=database(dbPath,RANGE,0..10*10000+1)
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, 1..n as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`id).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`id).append!(tdata)
    '''.format(db=DBInfo.diskDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_disk_hash_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db=database(dbPath,HASH,[INT,10])
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`id).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`id).append!(tdata)
    '''.format(db=DBInfo.diskDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_disk_value_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db=database(dbPath,VALUE,2010.01.01..2010.01.30)
    n=100000
    tdata=table(sort(take(2010.01.01..2010.01.30, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`date).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`date).append!(tdata)
    '''.format(db=DBInfo.diskDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_disk_list_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db=database(dbPath,LIST,[`AMD`QWE`CES,`DOP`ASZ,`FSD`BBVC,`AWQ`DS])
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`sym).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`sym).append!(tdata)
    '''.format(db=DBInfo.diskDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_disk_compo_range_range_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db1=database('',RANGE,2010.01M+0..12)
    db2=database('',RANGE,1 3 5 7 9 11)
    db=database(dbPath,COMPO,[db1,db2])
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`date`id).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`date`id).append!(tdata)
    '''.format(db=DBInfo.diskDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_disk_compo_range_hash_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db1=database('',RANGE,2010.01M+0..12)
    db2=database('',HASH,[INT,10])
    db=database(dbPath,COMPO,[db1,db2])
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`date`id).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`date`id).append!(tdata)
    '''.format(db=DBInfo.diskDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_disk_compo_range_value_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db1=database('',RANGE,2010.01M+0..12)
    db2=database('',VALUE,1..10)
    db=database(dbPath,COMPO,[db1,db2])
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`date`id).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`date`id).append!(tdata)
    '''.format(db=DBInfo.diskDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_disk_compo_range_list_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db1=database('',RANGE,2010.01M+0..12)
    db2=database('',LIST,[`AMD`QWE`CES,`DOP`ASZ,`FSD`BBVC,`AWQ`DS])
    db=database(dbPath,COMPO,[db1,db2])
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`date`sym).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`date`sym).append!(tdata)
    '''.format(db=DBInfo.diskDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_disk_compo_range_hash_list_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db1=database('',RANGE,2010.01M+0..12)
    db2=database('',HASH,[INT,10])
    db3=database('',LIST,[`AMD`QWE`CES,`DOP`ASZ,`FSD`BBVC,`AWQ`DS])
    db=database(dbPath,COMPO,[db1,db2,db3])
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`date`id`sym).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`date`id`sym).append!(tdata)
    '''.format(db=DBInfo.diskDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()


def create_disk_compo_range_value_list_db():
    s = ddb.session()
    s.connect(HOST, PORT, "admin", "123456")
    ddb_script = '''
    login('admin','123456')
    dbPath='{db}'
    if(existsDatabase(dbPath))
        dropDatabase(dbPath)
    db1=database('',RANGE,2010.01M+0..12)
    db2=database('',VALUE,1..10)
    db3=database('',LIST,[`AMD`QWE`CES,`DOP`ASZ,`FSD`BBVC,`AWQ`DS])
    db=database(dbPath,COMPO,[db1,db2,db3])
    n=100000
    tdata=table(sort(take(2010.01.01..2010.12.31, n)) as date, take(1..10,n) as id,take(`AMD`QWE`CES`DOP`ASZ`FSD`BBVC`AWQ`DS, n) as sym,rand(100,n) as val)
    db.createPartitionedTable(tdata,`{tb1},`date`id`sym).append!(tdata)
    db.createPartitionedTable(tdata,`{tb2},`date`id`sym).append!(tdata)
    '''.format(db=DBInfo.diskDBName, tb1=DBInfo.table1, tb2=DBInfo.table2)
    s.run(ddb_script)
    s.close()
class LoadTableTest(unittest.TestCase):
@classmethod
def setUp(cls):
cls.s = ddb.session()
cls.s.connect(HOST, PORT, "admin", "123456")
dbPaths = [DBInfo.dfsDBName, DBInfo.diskDBName]
for dbPath in dbPaths:
script = """
if(existsDatabase('{dbPath}'))
dropDatabase('{dbPath}')
if(exists('{dbPath}'))
rmdir('{dbPath}', true)
""".format(dbPath=dbPath)
cls.s.run(script)
@classmethod
def tearDown(cls):
cls.s = ddb.session()
cls.s.connect(HOST, PORT, "admin", "123456")
dbPaths = [DBInfo.dfsDBName, DBInfo.diskDBName]
for dbPath in dbPaths:
script = """
if(existsDatabase('{dbPath}'))
dropDatabase('{dbPath}')
if(exists('{dbPath}'))
rmdir('{dbPath}', true)
""".format(dbPath=dbPath)
cls.s.run(script)
def test_loadTable_dfs_dimension(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_dimension_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_dfs_range(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_range_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_dfs_range_param_partitions(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_range_db()
self.assertRaises(RuntimeError, self.s.loadTable, tbName1, dbPath, [5000, 15000])
def test_loadTable_dfs_range_param_memoryMode(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_range_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
def test_loadTable_dfs_hash(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_hash_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_dfs_hash_param_partitions(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_hash_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=[1, 2])
def test_loadTable_dfs_hash_param_memoryMode(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_hash_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
def test_loadTable_dfs_value(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_value_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_dfs_value_param_partitions(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_value_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=["2010.01.01", "2010.01.30"])
def test_loadTable_dfs_value_param_memoryMode(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_value_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
def test_loadTable_dfs_list(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_list_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_dfs_list_param_partitions(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_list_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=["`DOP", "`BBVC"])
def test_loadTable_dfs_list_param_memoryMode(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_list_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
def test_loadTable_dfs_compo_range_range(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_range_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_dfs_compo_range_range_param_partitions(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_range_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=["2010.01.01", "2010.01.30"])
def test_loadTable_dfs_compo_range_range_param_memoryMode(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_range_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
def test_loadTable_dfs_compo_range_hash(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_hash_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_dfs_compo_range_hash_param_partitions(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_hash_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=["2010.01.01", "2010.01.30"])
def test_loadTable_dfs_compo_range_hash_param_memoryMode(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_hash_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
def test_loadTable_dfs_compo_range_value(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_value_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_dfs_compo_range_value_param_partitions(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_value_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=["2010.01.01", "2010.01.30"])
def test_loadTable_dfs_compo_range_value_param_memoryMode(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_value_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
def test_loadTable_dfs_compo_range_list(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_list_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_dfs_compo_range_list_param_partitions(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_list_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=["2010.01.01", "2010.01.30"])
def test_loadTable_dfs_compo_range_list_param_memoryMode(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_list_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
def test_loadTable_dfs_compo_range_hash_list(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_hash_list_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_dfs_compo_range_hash_list_param_partitions(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_hash_list_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=["2010.01.01", "2010.01.30"])
def test_loadTable_dfs_compo_range_hash_list_param_memoryMode(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_hash_list_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
def test_loadTable_dfs_compo_range_value_list(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_value_list_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_dfs_compo_range_value_list_param_partitions(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_value_list_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=["2010.01.01", "2010.01.30"])
def test_loadTable_dfs_compo_range_value_list_param_memoryMode(self):
dbPath = DBInfo.dfsDBName
tbName1 = DBInfo.table1
create_dfs_compo_range_value_list_db()
with self.assertRaises(RuntimeError):
self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
def test_loadTable_disk_unpartitioned(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_unpartitioned_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_range(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_range_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_range_param_partitions(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_range_db()
rs = self.s.run("select * from loadTable('{db}','{tb}') where id<20001".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=[5000, 15000])
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_range_param_memoryMode(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_range_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
before = list(self.s.run("exec memSize from getSessionMemoryStat()"))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
after = list(self.s.run("exec memSize from getSessionMemoryStat()"))
assert_frame_equal(tmp.toDF(), rs)
assert_array_equal(after >= before, repeat(True, 4))
def test_loadTable_disk_hash(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_hash_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_hash_param_partitions(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_hash_db()
rs = self.s.run("select * from loadTable('{db}','{tb}') where id in [1,3,5]".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=[1, 3, 5])
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_hash_param_memoryMode(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_hash_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
before = list(self.s.run("exec memSize from getSessionMemoryStat()"))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
after = list(self.s.run("exec memSize from getSessionMemoryStat()"))
assert_frame_equal(tmp.toDF(), rs)
assert_array_equal(after >= before, repeat(True, 4))
def test_loadTable_disk_value(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_value_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_value_param_partitions(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_value_db()
rs = self.s.run("select * from loadTable('{db}','{tb}') where date in [2010.01.01, 2010.01.30]".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=["2010.01.01", "2010.01.30"])
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_value_param_memoryMode(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_value_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
before = list(self.s.run("exec memSize from getSessionMemoryStat()"))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
after = list(self.s.run("exec memSize from getSessionMemoryStat()"))
assert_frame_equal(tmp.toDF(), rs)
assert_array_equal(after >= before, repeat(True, 4))
def test_loadTable_disk_list(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_list_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_list_param_partitions(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_list_db()
rs = self.s.run("select * from loadTable('{db}','{tb}') where sym in `DOP`ASZ`FSD`BBVC`AWQ`DS".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=["DOP", "FSD", "AWQ"])
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_list_param_memoryMode(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_list_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
before = list(self.s.run("exec memSize from getSessionMemoryStat()"))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
after = list(self.s.run("exec memSize from getSessionMemoryStat()"))
assert_frame_equal(tmp.toDF(), rs)
assert_array_equal(after >= before, repeat(True, 4))
def test_loadTable_disk_compo_range_range(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_range_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_compo_range_range_param_partitions(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_range_db()
rs = self.s.run("select * from loadTable('{db}','{tb}') where "
"date between 2010.01.01:2010.01.31 "
"or date between 2010.04.01:2010.04.30".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=["2010.01.01", "2010.04.25"])
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_compo_range_range_param_memoryMode(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_range_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
before = list(self.s.run("exec memSize from getSessionMemoryStat()"))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
after = list(self.s.run("exec memSize from getSessionMemoryStat()"))
assert_frame_equal(tmp.toDF(), rs)
assert_array_equal([a >= b for a, b in zip(after, before)], repeat(True, 4))
def test_loadTable_disk_compo_range_hash(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_hash_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_compo_range_hash_param_partitions(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_hash_db()
rs = self.s.run("select * from loadTable('{db}','{tb}') where "
"date between 2010.01.01:2010.01.31 "
"or date between 2010.04.01:2010.04.30".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=["2010.01.01", "2010.04.25"])
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_compo_range_hash_param_memoryMode(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_hash_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
before = list(self.s.run("exec memSize from getSessionMemoryStat()"))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
after = list(self.s.run("exec memSize from getSessionMemoryStat()"))
assert_frame_equal(tmp.toDF(), rs)
assert_array_equal([a >= b for a, b in zip(after, before)], repeat(True, 4))
def test_loadTable_disk_compo_range_value(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_value_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_compo_range_value_param_partitions(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_value_db()
rs = self.s.run("select * from loadTable('{db}','{tb}') where "
"date between 2010.01.01:2010.01.31 "
"or date between 2010.04.01:2010.04.30".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=["2010.01.01", "2010.04.25"])
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_compo_range_value_param_memoryMode(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_value_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
before = list(self.s.run("exec memSize from getSessionMemoryStat()"))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
after = list(self.s.run("exec memSize from getSessionMemoryStat()"))
assert_frame_equal(tmp.toDF(), rs)
assert_array_equal([a >= b for a, b in zip(after, before)], repeat(True, 4))
def test_loadTable_disk_compo_range_list(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_list_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_compo_range_list_param_partitions(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_list_db()
rs = self.s.run("select * from loadTable('{db}','{tb}') where "
"date between 2010.01.01:2010.01.31 "
"or date between 2010.04.01:2010.04.30".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=["2010.01.01", "2010.04.25"])
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_compo_range_list_param_memoryMode(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_list_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
before = list(self.s.run("exec memSize from getSessionMemoryStat()"))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
after = list(self.s.run("exec memSize from getSessionMemoryStat()"))
assert_frame_equal(tmp.toDF(), rs)
assert_array_equal([a >= b for a, b in zip(after, before)], repeat(True, 4))
def test_loadTable_disk_compo_range_hash_list(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_hash_list_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_compo_range_hash_list_param_partitions(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_hash_list_db()
rs = self.s.run("select * from loadTable('{db}','{tb}') where "
"date between 2010.01.01:2010.01.31 "
"or date between 2010.04.01:2010.04.30".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=["2010.01.01", "2010.04.25"])
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_compo_range_hash_list_param_memoryMode(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_hash_list_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
before = list(self.s.run("exec memSize from getSessionMemoryStat()"))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
after = list(self.s.run("exec memSize from getSessionMemoryStat()"))
assert_frame_equal(tmp.toDF(), rs)
assert_array_equal([a >= b for a, b in zip(after, before)], repeat(True, 4))
def test_loadTable_disk_compo_range_value_list(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_value_list_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath)
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_compo_range_value_list_param_partitions(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_value_list_db()
rs = self.s.run("select * from loadTable('{db}','{tb}') where "
"date between 2010.01.01:2010.01.31 "
"or date between 2010.04.01:2010.04.30".format(db=dbPath, tb=tbName1))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, partitions=["2010.01.01", "2010.04.25"])
assert_frame_equal(tmp.toDF(), rs)
def test_loadTable_disk_compo_range_value_list_param_memoryMode(self):
dbPath = DBInfo.diskDBName
tbName1 = DBInfo.table1
create_disk_compo_range_value_list_db()
rs = self.s.run("select * from loadTable('{db}','{tb}')".format(db=dbPath, tb=tbName1))
before = list(self.s.run("exec memSize from getSessionMemoryStat()"))
tmp = self.s.loadTable(tableName=tbName1, dbPath=dbPath, memoryMode=True)
after = list(self.s.run("exec memSize from getSessionMemoryStat()"))
assert_frame_equal(tmp.toDF(), rs)
assert_array_equal([a >= b for a, b in zip(after, before)], repeat(True, 4))
def test_loadTable_disk_value_partition_string_scalar(self):
myDBName = WORK_DIR + "/db1"
script = '''
login("admin","123456")
if(exists("{dbName}"))
dropDatabase("{dbName}")
db=database("{dbName}", VALUE, ["AAA", "BBB", "CCC"])
t=table(take(["AAA", "BBB", "CCC"], 1000) as sym, rand(100.0, 1000) as val)
db.createPartitionedTable(t, "pt", "sym").append!(t)
'''.format(dbName=myDBName)
self.s.run(script)
res = self.s.loadTable(tableName="pt", dbPath=myDBName, partitions="AAA", memoryMode=True).toDF()
expected = self.s.run("select * from loadTable('{dbName}', 'pt') where sym='AAA'".format(dbName=myDBName))
assert_frame_equal(res, expected)
def test_loadTable_disk_value_partition_string_vector(self):
myDBName = WORK_DIR + "/db1"
script = '''
login("admin","123456")
if(exists("{dbName}"))
dropDatabase("{dbName}")
db=database("{dbName}", VALUE, ["AAA", "BBB", "CCC"])
t=table(take(["AAA", "BBB", "CCC"], 1000) as sym, rand(100.0, 1000) as val)
db.createPartitionedTable(t, "pt", "sym").append!(t)
'''.format(dbName=myDBName)
self.s.run(script)
res = self.s.loadTable(tableName="pt", dbPath=myDBName, partitions=["AAA", "BBB"], memoryMode=True).toDF()
expected = self.s.run("select * from loadTable('{dbName}', 'pt') where sym='AAA' or sym='BBB'".format(dbName=myDBName))
assert_frame_equal(res, expected)
if __name__ == '__main__':
unittest.main()
#################################################################################
# The Institute for the Design of Advanced Energy Systems Integrated Platform
# Framework (IDAES IP) was produced under the DOE Institute for the
# Design of Advanced Energy Systems (IDAES), and is copyright (c) 2018-2021
# by the software owners: The Regents of the University of California, through
# Lawrence Berkeley National Laboratory, National Technology & Engineering
# Solutions of Sandia, LLC, Carnegie Mellon University, West Virginia University
# Research Corporation, et al. All rights reserved.
#
# Please see the files COPYRIGHT.md and LICENSE.md for full copyright and
# license information.
#################################################################################
"""
Tests for constructing and using component lists in electrolyte systems
"""
# Import Python libraries
import pytest
# Import Pyomo units
from pyomo.environ import (ConcreteModel,
Constraint,
TerminationCondition,
Set,
SolverStatus,
value,
Var,
units as pyunits)
# Import IDAES cores
from idaes.core import AqueousPhase, VaporPhase
from idaes.core.components import (
    Anion, Apparent, Cation, Component, Solute, Solvent)
from idaes.generic_models.properties.core.state_definitions import FcPh
from idaes.generic_models.properties.core.eos.ideal import Ideal
from idaes.generic_models.properties.core.generic.tests.dummy_eos import DummyEoS
from idaes.generic_models.properties.core.reactions.dh_rxn import \
constant_dh_rxn
from idaes.generic_models.properties.core.reactions.equilibrium_constant import \
van_t_hoff
from idaes.generic_models.properties.core.reactions.equilibrium_forms import \
power_law_equil
from idaes.generic_models.properties.core.generic.generic_reaction import (
ConcentrationForm)
from idaes.core import FlowsheetBlock
from idaes.generic_models.properties.core.generic.generic_property import (
GenericParameterBlock, StateIndex)
from idaes.core.util.model_statistics import degrees_of_freedom
from idaes.core.util import get_solver
def dummy_method(b, *args, **kwargs):
return 42
# -----------------------------------------------------------------------------
class TestApparentSpeciesBasisNoInherent():
config = {
# Specifying components
"components": {
'H2O': {"type": Solvent,
"enth_mol_ig_comp": dummy_method,
"parameter_data": {
"mw": (18E-3, pyunits.kg/pyunits.mol)}},
'CO2': {"type": Solute,
"enth_mol_ig_comp": dummy_method,
"parameter_data": {
"mw": (44E-3, pyunits.kg/pyunits.mol)}},
'KHCO3': {"type": Apparent,
"dissociation_species": {"K+": 1, "HCO3-": 1},
"enth_mol_ig_comp": dummy_method,
"parameter_data": {
"mw": (100.1E-3, pyunits.kg/pyunits.mol)}},
'K+': {"type": Cation,
"charge": +1,
"parameter_data": {
"mw": (39.1E-3, pyunits.kg/pyunits.mol)}},
'HCO3-': {"type": Anion,
"charge": -1,
"parameter_data": {
"mw": (61E-3, pyunits.kg/pyunits.mol)}},
'N2': {"type": Component,
"enth_mol_ig_comp": dummy_method,
"parameter_data": {
"mw": (28E-3, pyunits.kg/pyunits.mol)}}},
# Specifying phases
"phases": {'Liq': {"type": AqueousPhase,
"equation_of_state": DummyEoS,
"equation_of_state_options": {
"pH_range": "basic"}},
'Vap': {"type": VaporPhase,
"equation_of_state": Ideal}},
# Set base units of measurement
"base_units": {"time": pyunits.s,
"length": pyunits.m,
"mass": pyunits.kg,
"amount": pyunits.mol,
"temperature": pyunits.K},
# Specifying state definition
"state_definition": FcPh,
"state_bounds": {"flow_mol_comp": (0, 100, 1000, pyunits.mol/pyunits.s),
"enth_mol": (0, 10000, 1e8, pyunits.J/pyunits.mol),
"temperature": (273.15, 300, 500, pyunits.K),
"pressure": (5e4, 1e5, 1e6, pyunits.Pa)},
"state_components": StateIndex.apparent,
"pressure_ref": (101325, pyunits.Pa),
"temperature_ref": (298.15, pyunits.K)}
@pytest.fixture(scope="class")
def frame(self):
m = ConcreteModel()
m.fs = FlowsheetBlock(default={'dynamic': False})
m.fs.props = GenericParameterBlock(
default=TestApparentSpeciesBasisNoInherent.config)
m.fs.state = m.fs.props.build_state_block(
[1],
default={"defined_state": True})
return m
@pytest.mark.unit
def test_vars_and_constraints(self, frame):
m = frame
assert m.fs.state[1].component_list is m.fs.props.apparent_species_set
assert m.fs.state[1].phase_component_set is \
m.fs.props.apparent_phase_component_set
assert isinstance(m.fs.state[1].flow_mol_comp, Var)
assert len(m.fs.state[1].flow_mol_comp) == 4
assert isinstance(m.fs.state[1].pressure, Var)
assert len(m.fs.state[1].pressure) == 1
assert isinstance(m.fs.state[1].temperature, Var)
assert len(m.fs.state[1].temperature) == 1
assert isinstance(m.fs.state[1].flow_mol_phase, Var)
assert len(m.fs.state[1].flow_mol_phase) == 2
for p in m.fs.state[1].flow_mol_phase:
assert p in ["Liq", "Vap"]
assert isinstance(m.fs.state[1].phase_frac, Var)
assert len(m.fs.state[1].phase_frac) == 2
for p in m.fs.state[1].phase_frac:
assert p in ["Liq", "Vap"]
assert isinstance(m.fs.state[1].mole_frac_comp, Var)
assert len(m.fs.state[1].mole_frac_comp) == 4
for j in m.fs.state[1].mole_frac_comp:
assert j in ["KHCO3", "H2O", "CO2", "N2"]
assert isinstance(m.fs.state[1].total_flow_balance, Constraint)
assert len(m.fs.state[1].total_flow_balance) == 1
assert isinstance(m.fs.state[1].phase_fraction_constraint, Constraint)
assert len(m.fs.state[1].phase_fraction_constraint) == 2
assert isinstance(m.fs.state[1].component_flow_balances, Constraint)
assert len(m.fs.state[1].component_flow_balances) == 4
for j in m.fs.state[1].component_flow_balances:
assert j in ["KHCO3", "H2O", "CO2", "N2"]
assert isinstance(m.fs.state[1].mole_frac_phase_comp, Var)
assert len(m.fs.state[1].mole_frac_phase_comp) == 7
for j in m.fs.state[1].mole_frac_phase_comp:
assert j in [("Liq", "H2O"), ("Liq", "CO2"),
("Liq", "KHCO3"),
("Vap", "H2O"), ("Vap", "CO2"),
("Vap", "KHCO3"), ("Vap", "N2")]
# Check references to base state variables
assert m.fs.state[1].flow_mol_apparent is m.fs.state[1].flow_mol
for i in m.fs.state[1].flow_mol_phase_apparent:
assert m.fs.state[1].flow_mol_phase_apparent[i] is \
m.fs.state[1].flow_mol_phase[i]
assert i in m.fs.props.phase_list
for i in m.fs.state[1].flow_mol_phase_comp_apparent:
assert m.fs.state[1].flow_mol_phase_comp_apparent[i] is \
m.fs.state[1].flow_mol_phase_comp[i]
assert i in m.fs.props.apparent_phase_component_set
for i in m.fs.state[1].mole_frac_phase_comp_apparent:
assert m.fs.state[1].mole_frac_phase_comp_apparent[i] is \
m.fs.state[1].mole_frac_phase_comp[i]
assert i in m.fs.props.apparent_phase_component_set
# Check for true species components
assert isinstance(m.fs.state[1].flow_mol_phase_comp_true, Var)
assert len(m.fs.state[1].flow_mol_phase_comp_true) == 8
assert isinstance(m.fs.state[1].appr_to_true_species, Constraint)
assert len(m.fs.state[1].appr_to_true_species) == 8
@pytest.mark.component
def test_solve_for_true_species(self, frame):
m = frame
m.fs.state[1].phase_frac["Liq"].fix(0.5)
m.fs.state[1].temperature.fix(300)
m.fs.state[1].pressure.fix(1e5)
m.fs.state[1].flow_mol_comp["H2O"].fix(0.5)
m.fs.state[1].flow_mol_comp["CO2"].fix(0.5)
m.fs.state[1].flow_mol_comp["KHCO3"].fix(0.5)
m.fs.state[1].flow_mol_comp["N2"].fix(0.5)
m.fs.state[1].mole_frac_phase_comp["Vap", "KHCO3"].fix(1/6)
m.fs.state[1].mole_frac_phase_comp["Vap", "CO2"].fix(1/6)
assert degrees_of_freedom(m.fs) == 0
solver = get_solver()
res = solver.solve(m.fs, tee=True)
# Check for optimal solution
assert res.solver.termination_condition == \
TerminationCondition.optimal
assert res.solver.status == SolverStatus.ok
# Check true species flowrates
assert (value(m.fs.state[1].flow_mol_phase_comp_true["Vap", "H2O"]) ==
pytest.approx(1/6, rel=1e-5))
assert (value(m.fs.state[1].flow_mol_phase_comp_true["Vap", "CO2"]) ==
pytest.approx(1/6, rel=1e-5))
assert (value(
m.fs.state[1].flow_mol_phase_comp_true["Vap", "KHCO3"]) ==
pytest.approx(1/6, rel=1e-5))
assert (value(m.fs.state[1].flow_mol_phase_comp_true["Vap", "N2"]) ==
pytest.approx(0.5, rel=1e-5))
assert (value(m.fs.state[1].flow_mol_phase_comp_true["Liq", "H2O"]) ==
pytest.approx(1/3, rel=1e-5))
assert (value(m.fs.state[1].flow_mol_phase_comp_true["Liq", "CO2"]) ==
pytest.approx(1/3, rel=1e-5))
assert (value(m.fs.state[1].flow_mol_phase_comp_true["Liq", "K+"]) ==
pytest.approx(1/3, rel=1e-5))
assert (value(
m.fs.state[1].flow_mol_phase_comp_true["Liq", "HCO3-"]) ==
pytest.approx(1/3, rel=1e-5))
# -----------------------------------------------------------------------------
def dens_mol_H2O(*args, **kwargs):
return 55e3
class TestApparentSpeciesBasisInherent():
config = {
# Specifying components
"components": {
'H2O': {"type": Solvent,
"dens_mol_liq_comp": dens_mol_H2O,
"parameter_data": {
"mw": (18E-3, pyunits.kg/pyunits.mol)}},
'KHCO3': {"type": Apparent,
"dissociation_species": {"K+": 1, "HCO3-": 1},
"parameter_data": {
"mw": (100.1E-3, pyunits.kg/pyunits.mol)}},
'K2CO3': {"type": Apparent,
"dissociation_species": {"K+": 2, "CO3--": 1},
"parameter_data": {
"mw": (138.2E-3, pyunits.kg/pyunits.mol)}},
'KOH': {"type": Apparent,
"dissociation_species": {"K+": 1, "OH-": 1},
"parameter_data": {
"mw": (56.1E-3, pyunits.kg/pyunits.mol)}},
'H+': {"type": Cation,
"charge": +1,
"dens_mol_liq_comp": dens_mol_H2O,
"parameter_data": {
"mw": (1E-3, pyunits.kg/pyunits.mol)}},
'K+': {"type": Cation,
"charge": +1,
"dens_mol_liq_comp": dens_mol_H2O,
"parameter_data": {
"mw": (39.1E-3, pyunits.kg/pyunits.mol)}},
'OH-': {"type": Anion,
"charge": -1,
"dens_mol_liq_comp": dens_mol_H2O,
"parameter_data": {
"mw": (17E-3, pyunits.kg/pyunits.mol)}},
'HCO3-': {"type": Anion,
"charge": -1,
"dens_mol_liq_comp": dens_mol_H2O,
"parameter_data": {
"mw": (61E-3, pyunits.kg/pyunits.mol)}},
'CO3--': {"type": Anion,
"charge": -2,
"dens_mol_liq_comp": dens_mol_H2O,
"parameter_data": {
"mw": (60E-3, pyunits.kg/pyunits.mol)}}},
# Specifying phases
"phases": {'Liq': {"type": AqueousPhase,
"equation_of_state": DummyEoS,
"equation_of_state_options": {
"pH_range": "basic"}}},
# Set base units of measurement
"base_units": {"time": pyunits.s,
"length": pyunits.m,
"mass": pyunits.kg,
"amount": pyunits.mol,
"temperature": pyunits.K},
# Specifying state definition
"state_definition": FcPh,
"state_bounds": {"flow_mol_comp": (0, 100, 1000, pyunits.mol/pyunits.s),
"enth_mol": (0, 10000, 1e8, pyunits.J/pyunits.mol),
"temperature": (273.15, 300, 500, pyunits.K),
"pressure": (5e4, 1e5, 1e6, pyunits.Pa)},
"state_components": StateIndex.apparent,
"pressure_ref": (101325, pyunits.Pa),
"temperature_ref": (298.15, pyunits.K),
"inherent_reactions": {
"H2O_si": {"stoichiometry": {("Liq", "H2O"): -1,
("Liq", "H+"): 1,
("Liq", "OH-"): 1},
"heat_of_reaction": constant_dh_rxn,
"equilibrium_constant": van_t_hoff,
"equilibrium_form": power_law_equil,
"concentration_form": ConcentrationForm.molarity,
"parameter_data": {
"reaction_order": {("Liq", "H+"): 1,
("Liq", "OH-"): 1},
"dh_rxn_ref": 1,
"k_eq_ref": 1e-14,
"T_eq_ref": 350}},
"co3_hco3": {"stoichiometry": {("Liq", "CO3--"): -1,
("Liq", "H2O"): -1,
("Liq", "HCO3-"): 1,
("Liq", "OH-"): 1},
"heat_of_reaction": constant_dh_rxn,
"equilibrium_constant": van_t_hoff,
"equilibrium_form": power_law_equil,
"concentration_form": ConcentrationForm.molarity,
"parameter_data": {
"reaction_order": {("Liq", "CO3--"): -1,
("Liq", "HCO3-"): 1,
("Liq", "OH-"): 1},
"dh_rxn_ref": 1,
"k_eq_ref": 5e-11,
"T_eq_ref": 350}}}}
@pytest.fixture(scope="class")
def frame(self):
m = ConcreteModel()
m.fs = FlowsheetBlock(default={'dynamic': False})
m.fs.props = GenericParameterBlock(
default=TestApparentSpeciesBasisInherent.config)
m.fs.state = m.fs.props.build_state_block(
[1],
default={"defined_state": True})
m.fs.state[1].calculate_scaling_factors()
return m
@pytest.mark.unit
def test_vars_and_constraints(self, frame):
m = frame
assert m.fs.state[1].component_list is m.fs.props.apparent_species_set
assert m.fs.state[1].phase_component_set is \
m.fs.props.apparent_phase_component_set
assert isinstance(m.fs.state[1].flow_mol_comp, Var)
assert len(m.fs.state[1].flow_mol_comp) == 4
assert isinstance(m.fs.state[1].pressure, Var)
assert len(m.fs.state[1].pressure) == 1
assert isinstance(m.fs.state[1].temperature, Var)
assert len(m.fs.state[1].temperature) == 1
assert isinstance(m.fs.state[1].flow_mol_phase, Var)
assert len(m.fs.state[1].flow_mol_phase) == 1
for p in m.fs.state[1].flow_mol_phase:
assert p in ["Liq"]
assert isinstance(m.fs.state[1].phase_frac, Var)
assert len(m.fs.state[1].phase_frac) == 1
for p in m.fs.state[1].phase_frac:
assert p in ["Liq"]
assert isinstance(m.fs.state[1].mole_frac_comp, Var)
assert len(m.fs.state[1].mole_frac_comp) == 4
for j in m.fs.state[1].mole_frac_comp:
assert j in ["KHCO3", "H2O", "K2CO3", "KOH"]
assert isinstance(m.fs.state[1].total_flow_balance, Constraint)
assert len(m.fs.state[1].total_flow_balance) == 1
assert isinstance(m.fs.state[1].phase_fraction_constraint, Constraint)
assert len(m.fs.state[1].phase_fraction_constraint) == 1
assert isinstance(m.fs.state[1].component_flow_balances, Constraint)
assert len(m.fs.state[1].component_flow_balances) == 4
for j in m.fs.state[1].component_flow_balances:
assert j in ["KHCO3", "H2O", "K2CO3", "KOH"]
assert isinstance(m.fs.state[1].mole_frac_phase_comp, Var)
assert len(m.fs.state[1].mole_frac_phase_comp) == 4
for j in m.fs.state[1].mole_frac_phase_comp:
assert j in [("Liq", "H2O"), ("Liq", "K2CO3"),
("Liq", "KHCO3"), ("Liq", "KOH")]
# Check references to base state variables
assert m.fs.state[1].flow_mol_apparent is m.fs.state[1].flow_mol
for i in m.fs.state[1].flow_mol_phase_apparent:
assert m.fs.state[1].flow_mol_phase_apparent[i] is \
m.fs.state[1].flow_mol_phase[i]
assert i in m.fs.props.phase_list
for i in m.fs.state[1].flow_mol_phase_comp_apparent:
assert m.fs.state[1].flow_mol_phase_comp_apparent[i] is \
m.fs.state[1].flow_mol_phase_comp[i]
assert i in m.fs.props.apparent_phase_component_set
for i in m.fs.state[1].mole_frac_phase_comp_apparent:
assert m.fs.state[1].mole_frac_phase_comp_apparent[i] is \
m.fs.state[1].mole_frac_phase_comp[i]
assert i in m.fs.props.apparent_phase_component_set
# Check for true species components
assert isinstance(m.fs.state[1].flow_mol_phase_comp_true, Var)
assert len(m.fs.state[1].flow_mol_phase_comp_true) == 6
assert isinstance(m.fs.state[1].mole_frac_phase_comp_true, Var)
assert len(m.fs.state[1].mole_frac_phase_comp_true) == 6
assert isinstance(m.fs.state[1].appr_to_true_species, Constraint)
assert len(m.fs.state[1].appr_to_true_species) == 6
assert isinstance(m.fs.state[1].true_mole_frac_constraint, Constraint)
assert len(m.fs.state[1].true_mole_frac_constraint) == 6
# Check for inherent reactions
assert m.fs.state[1].has_inherent_reactions
assert not m.fs.state[1].include_inherent_reactions
assert isinstance(m.fs.state[1].params.inherent_reaction_idx, Set)
assert len(m.fs.state[1].params.inherent_reaction_idx) == 2
for i in m.fs.state[1].params.inherent_reaction_idx:
assert i in ["H2O_si", "co3_hco3"]
assert isinstance(m.fs.state[1].apparent_inherent_reaction_extent, Var)
assert len(m.fs.state[1].apparent_inherent_reaction_extent) == 2
@pytest.mark.component
def test_solve_for_true_species(self, frame):
m = frame
m.fs.state[1].enth_mol.fix(35000)
m.fs.state[1].temperature.set_value(350)
m.fs.state[1].pressure.fix(1e5)
m.fs.state[1].flow_mol_comp["H2O"].fix(0.8*2)
m.fs.state[1].flow_mol_comp["K2CO3"].fix(0.2*2)
m.fs.state[1].flow_mol_comp["KHCO3"].fix(0)
m.fs.state[1].flow_mol_comp["KOH"].fix(0)
assert degrees_of_freedom(m.fs) == 0
m.fs.state.initialize()
solver = get_solver()
res = solver.solve(m.fs)
# Check for optimal solution
assert res.solver.termination_condition == \
TerminationCondition.optimal
assert res.solver.status == SolverStatus.ok
# Check apparent species flowrates
for j in m.fs.state[1].mole_frac_comp:
assert (
value(m.fs.state[1].flow_mol_phase_comp_apparent["Liq", j]) ==
pytest.approx(value(m.fs.state[1].flow_mol *
m.fs.state[1].mole_frac_comp[j]),
rel=1e-5))
# Check element balances
assert(value(
m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "K2CO3"]*2 +
m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "KHCO3"] +
m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "KOH"]) ==
pytest.approx(value(
m.fs.state[1].flow_mol_phase_comp_true["Liq", "K+"]),
rel=1e-5))
assert(value(
m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "K2CO3"] +
m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "KHCO3"]) ==
pytest.approx(value(
m.fs.state[1].flow_mol_phase_comp_true["Liq", "CO3--"] +
m.fs.state[1].flow_mol_phase_comp_true["Liq", "HCO3-"]),
rel=1e-5))
assert(value(
m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "K2CO3"]*3 +
m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "KHCO3"]*3 +
m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "KOH"] +
m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "H2O"]) ==
pytest.approx(value(
m.fs.state[1].flow_mol_phase_comp_true["Liq", "CO3--"]*3 +
m.fs.state[1].flow_mol_phase_comp_true["Liq", "HCO3-"]*3 +
m.fs.state[1].flow_mol_phase_comp_true["Liq", "OH-"] +
m.fs.state[1].flow_mol_phase_comp_true["Liq", "H2O"]),
rel=1e-5))
assert(value(
m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "KHCO3"] +
m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "KOH"] +
m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "H2O"]*2) ==
pytest.approx(value(
m.fs.state[1].flow_mol_phase_comp_true["Liq", "HCO3-"] +
m.fs.state[1].flow_mol_phase_comp_true["Liq", "OH-"] +
m.fs.state[1].flow_mol_phase_comp_true["Liq", "H+"] +
m.fs.state[1].flow_mol_phase_comp_true["Liq", "H2O"]*2),
rel=1e-5))
# Check true species mole fractions
assert(value(
m.fs.state[1].mole_frac_phase_comp_true["Liq", "CO3--"]) ==
pytest.approx(0.142857, rel=1e-5))
assert(value(
m.fs.state[1].mole_frac_phase_comp_true["Liq", "H+"]) ==
pytest.approx(2.90081e-16, rel=1e-5))
assert(value(
m.fs.state[1].mole_frac_phase_comp_true["Liq", "H2O"]) ==
pytest.approx(0.571429, rel=1e-5))
assert(value(
m.fs.state[1].mole_frac_phase_comp_true["Liq", "HCO3-"]) ==
pytest.approx(1.13961e-08, rel=1e-5))
assert(value(
m.fs.state[1].mole_frac_phase_comp_true["Liq", "K+"]) ==
pytest.approx(0.285714, rel=1e-5))
assert(value(
m.fs.state[1].mole_frac_phase_comp_true["Liq", "OH-"]) ==
pytest.approx(1.139606e-08, rel=1e-5))
# -----------------------------------------------------------------------------
class TestTrueSpeciesBasisNoInherent():
config = {
# Specifying components
"components": {
'H2O': {"type": Solvent,
"enth_mol_ig_comp": dummy_method,
"parameter_data": {
"mw": (18E-3, pyunits.kg/pyunits.mol)}},
'CO2': {"type": Solute,
"enth_mol_ig_comp": dummy_method,
"parameter_data": {
"mw": (44E-3, pyunits.kg/pyunits.mol)}},
'KHCO3': {"type": Apparent,
"enth_mol_ig_comp": dummy_method,
"dissociation_species": {"K+": 1, "HCO3-": 1},
"parameter_data": {
"mw": (100.1E-3, pyunits.kg/pyunits.mol)}},
'K+': {"type": Cation,
"charge": +1,
"parameter_data": {
"mw": (39.1E-3, pyunits.kg/pyunits.mol)}},
'HCO3-': {"type": Anion,
"charge": -1,
"parameter_data": {
"mw": (61E-3, pyunits.kg/pyunits.mol)}},
'N2': {"type": Component,
"enth_mol_ig_comp": dummy_method,
"parameter_data": {
"mw": (28E-3, pyunits.kg/pyunits.mol)}}},
# Specifying phases
"phases": {'Liq': {"type": AqueousPhase,
"equation_of_state": DummyEoS,
"equation_of_state_options": {
"pH_range": "basic"}},
'Vap': {"type": VaporPhase,
"equation_of_state": Ideal}},
# Set base units of measurement
"base_units": {"time": pyunits.s,
"length": pyunits.m,
"mass": pyunits.kg,
"amount": pyunits.mol,
"temperature": pyunits.K},
# Specifying state definition
"state_definition": FcPh,
"state_bounds": {"flow_mol_comp": (0, 100, 1000, pyunits.mol/pyunits.s),
"enth_mol": (0, 10000, 1e8, pyunits.J/pyunits.mol),
"temperature": (273.15, 300, 500, pyunits.K),
"pressure": (5e4, 1e5, 1e6, pyunits.Pa)},
"state_components": StateIndex.true,
"pressure_ref": (101325, pyunits.Pa),
"temperature_ref": (298.15, pyunits.K)}
@pytest.fixture(scope="class")
def frame(self):
m = ConcreteModel()
m.fs = FlowsheetBlock(default={'dynamic': False})
m.fs.props = GenericParameterBlock(
default=TestTrueSpeciesBasisNoInherent.config)
m.fs.state = m.fs.props.build_state_block(
[1],
default={"defined_state": True})
return m
@pytest.mark.unit
def test_vars_and_constraints(self, frame):
m = frame
assert m.fs.state[1].component_list is m.fs.props.true_species_set
        assert m.fs.state[1].phase_component_set is \
            m.fs.props.true_phase_component_set

        assert isinstance(m.fs.state[1].flow_mol_comp, Var)
        assert len(m.fs.state[1].flow_mol_comp) == 5

        assert isinstance(m.fs.state[1].pressure, Var)
        assert len(m.fs.state[1].pressure) == 1
        assert isinstance(m.fs.state[1].temperature, Var)
        assert len(m.fs.state[1].temperature) == 1

        assert isinstance(m.fs.state[1].flow_mol_phase, Var)
        assert len(m.fs.state[1].flow_mol_phase) == 2
        for p in m.fs.state[1].flow_mol_phase:
            assert p in ["Liq", "Vap"]

        assert isinstance(m.fs.state[1].phase_frac, Var)
        assert len(m.fs.state[1].phase_frac) == 2
        for p in m.fs.state[1].phase_frac:
            assert p in ["Liq", "Vap"]

        assert isinstance(m.fs.state[1].mole_frac_comp, Var)
        assert len(m.fs.state[1].mole_frac_comp) == 5
        for j in m.fs.state[1].mole_frac_comp:
            assert j in ["K+", "HCO3-", "H2O", "CO2", "N2"]

        assert isinstance(m.fs.state[1].total_flow_balance, Constraint)
        assert len(m.fs.state[1].total_flow_balance) == 1

        assert isinstance(m.fs.state[1].phase_fraction_constraint, Constraint)
        assert len(m.fs.state[1].phase_fraction_constraint) == 2

        assert isinstance(m.fs.state[1].component_flow_balances, Constraint)
        assert len(m.fs.state[1].component_flow_balances) == 5
        for j in m.fs.state[1].component_flow_balances:
            assert j in ["K+", "HCO3-", "H2O", "CO2", "N2"]

        assert isinstance(m.fs.state[1].mole_frac_phase_comp, Var)
        assert len(m.fs.state[1].mole_frac_phase_comp) == 8
        for j in m.fs.state[1].mole_frac_phase_comp:
            assert j in [("Liq", "H2O"), ("Liq", "CO2"),
                         ("Liq", "K+"), ("Liq", "HCO3-"),
                         ("Vap", "H2O"), ("Vap", "CO2"),
                         ("Vap", "KHCO3"), ("Vap", "N2")]

        # Check references to base state variables
        assert m.fs.state[1].flow_mol_true is m.fs.state[1].flow_mol
        for i in m.fs.state[1].flow_mol_phase_true:
            assert m.fs.state[1].flow_mol_phase_true[i] is \
                m.fs.state[1].flow_mol_phase[i]
            assert i in m.fs.props.phase_list
        for i in m.fs.state[1].flow_mol_phase_comp_true:
            assert m.fs.state[1].flow_mol_phase_comp_true[i] is \
                m.fs.state[1].flow_mol_phase_comp[i]
            assert i in m.fs.props.true_phase_component_set
        for i in m.fs.state[1].mole_frac_phase_comp_true:
            assert m.fs.state[1].mole_frac_phase_comp_true[i] is \
                m.fs.state[1].mole_frac_phase_comp[i]
            assert i in m.fs.props.true_phase_component_set

        # Check for apparent species components
        assert isinstance(m.fs.state[1].flow_mol_phase_comp_apparent, Var)
        assert len(m.fs.state[1].flow_mol_phase_comp_apparent) == 7
        assert isinstance(m.fs.state[1].true_to_appr_species, Constraint)
        assert len(m.fs.state[1].true_to_appr_species) == 7
# -----------------------------------------------------------------------------
class TestTrueSpeciesBasisInherent():
    config = {
        # Specifying components
        "components": {
            'H2O': {"type": Solvent,
                    "dens_mol_liq_comp": dens_mol_H2O,
                    "parameter_data": {
                        "mw": (18E-3, pyunits.kg/pyunits.mol)}},
            'KHCO3': {"type": Apparent,
                      "dissociation_species": {"K+": 1, "HCO3-": 1},
                      "parameter_data": {
                          "mw": (100.1E-3, pyunits.kg/pyunits.mol)}},
            'K2CO3': {"type": Apparent,
                      "dissociation_species": {"K+": 2, "CO3--": 1},
                      "parameter_data": {
                          "mw": (138.2E-3, pyunits.kg/pyunits.mol)}},
            'KOH': {"type": Apparent,
                    "dissociation_species": {"K+": 1, "OH-": 1},
                    "parameter_data": {
                        "mw": (56.1E-3, pyunits.kg/pyunits.mol)}},
            'H+': {"type": Cation,
                   "charge": +1,
                   "dens_mol_liq_comp": dens_mol_H2O,
                   "parameter_data": {
                       "mw": (1E-3, pyunits.kg/pyunits.mol)}},
            'K+': {"type": Cation,
                   "charge": +1,
                   "dens_mol_liq_comp": dens_mol_H2O,
                   "parameter_data": {
                       "mw": (39.1E-3, pyunits.kg/pyunits.mol)}},
            'OH-': {"type": Anion,
                    "charge": -1,
                    "dens_mol_liq_comp": dens_mol_H2O,
                    "parameter_data": {
                        "mw": (17E-3, pyunits.kg/pyunits.mol)}},
            'HCO3-': {"type": Anion,
                      "charge": -1,
                      "dens_mol_liq_comp": dens_mol_H2O,
                      "parameter_data": {
                          "mw": (61E-3, pyunits.kg/pyunits.mol)}},
            'CO3--': {"type": Anion,
                      "charge": -2,
                      "dens_mol_liq_comp": dens_mol_H2O,
                      "parameter_data": {
                          "mw": (60E-3, pyunits.kg/pyunits.mol)}}},
        # Specifying phases
        "phases": {'Liq': {"type": AqueousPhase,
                           "equation_of_state": DummyEoS,
                           "equation_of_state_options": {
                               "pH_range": "basic"}}},
        # Set base units of measurement
        "base_units": {"time": pyunits.s,
                       "length": pyunits.m,
                       "mass": pyunits.kg,
                       "amount": pyunits.mol,
                       "temperature": pyunits.K},
        # Specifying state definition
        "state_definition": FcPh,
        "state_bounds": {"flow_mol_comp": (0, 100, 1000, pyunits.mol/pyunits.s),
                         "enth_mol": (0, 10000, 1e8, pyunits.J/pyunits.mol),
                         "temperature": (273.15, 300, 500, pyunits.K),
                         "pressure": (5e4, 1e5, 1e6, pyunits.Pa)},
        "state_components": StateIndex.true,
        "pressure_ref": (101325, pyunits.Pa),
        "temperature_ref": (298.15, pyunits.K),
        "inherent_reactions": {
            "H2O_si": {"stoichiometry": {("Liq", "H2O"): -1,
                                         ("Liq", "H+"): 1,
                                         ("Liq", "OH-"): 1},
                       "heat_of_reaction": constant_dh_rxn,
                       "equilibrium_constant": van_t_hoff,
                       "equilibrium_form": power_law_equil,
                       "concentration_form": ConcentrationForm.molarity,
                       "parameter_data": {
                           "reaction_order": {("Liq", "H+"): 1,
                                              ("Liq", "OH-"): 1},
                           "dh_rxn_ref": 1,
                           "k_eq_ref": 1e-14,
                           "T_eq_ref": 350}},
            "co3_hco3": {"stoichiometry": {("Liq", "CO3--"): -1,
                                           ("Liq", "H2O"): -1,
                                           ("Liq", "HCO3-"): 1,
                                           ("Liq", "OH-"): 1},
                         "heat_of_reaction": constant_dh_rxn,
                         "equilibrium_constant": van_t_hoff,
                         "equilibrium_form": power_law_equil,
                         "concentration_form": ConcentrationForm.molarity,
                         "parameter_data": {
                             "reaction_order": {("Liq", "CO3--"): -1,
                                                ("Liq", "HCO3-"): 1,
                                                ("Liq", "OH-"): 1},
                             "dh_rxn_ref": 1,
                             "k_eq_ref": 5e-11,
                             "T_eq_ref": 350}}},
        "default_scaling_factors": {("mole_frac_comp", "OH-"): 1e8,
                                    ("mole_frac_comp", "HCO3-"): 1e8,
                                    ("mole_frac_comp", "H+"): 1e16}}
    @pytest.fixture(scope="class")
    def frame(self):
        m = ConcreteModel()
        m.fs = FlowsheetBlock(default={'dynamic': False})

        m.fs.props = GenericParameterBlock(
            default=TestTrueSpeciesBasisInherent.config)

        m.fs.state = m.fs.props.build_state_block(
            [1],
            default={"defined_state": True})

        m.fs.state[1].calculate_scaling_factors()

        return m
    @pytest.mark.unit
    def test_vars_and_constraints(self, frame):
        m = frame

        assert m.fs.state[1].component_list is m.fs.props.true_species_set
        assert m.fs.state[1].phase_component_set is \
            m.fs.props.true_phase_component_set

        assert isinstance(m.fs.state[1].flow_mol_comp, Var)
        assert len(m.fs.state[1].flow_mol_comp) == 6

        assert isinstance(m.fs.state[1].pressure, Var)
        assert len(m.fs.state[1].pressure) == 1
        assert isinstance(m.fs.state[1].temperature, Var)
        assert len(m.fs.state[1].temperature) == 1

        assert isinstance(m.fs.state[1].flow_mol_phase, Var)
        assert len(m.fs.state[1].flow_mol_phase) == 1
        for p in m.fs.state[1].flow_mol_phase:
            assert p in ["Liq"]

        assert isinstance(m.fs.state[1].phase_frac, Var)
        assert len(m.fs.state[1].phase_frac) == 1
        for p in m.fs.state[1].phase_frac:
            assert p in ["Liq"]

        assert isinstance(m.fs.state[1].mole_frac_comp, Var)
        assert len(m.fs.state[1].mole_frac_comp) == 6
        for j in m.fs.state[1].mole_frac_comp:
            assert j in ["H+", "K+", "HCO3-", "CO3--", "OH-", "H2O"]

        assert isinstance(m.fs.state[1].total_flow_balance, Constraint)
        assert len(m.fs.state[1].total_flow_balance) == 1

        assert isinstance(m.fs.state[1].phase_fraction_constraint, Constraint)
        assert len(m.fs.state[1].phase_fraction_constraint) == 1

        assert isinstance(m.fs.state[1].component_flow_balances, Constraint)
        assert len(m.fs.state[1].component_flow_balances) == 6
        for j in m.fs.state[1].component_flow_balances:
            assert j in ["H+", "K+", "HCO3-", "CO3--", "OH-", "H2O"]

        assert isinstance(m.fs.state[1].mole_frac_phase_comp, Var)
        assert len(m.fs.state[1].mole_frac_phase_comp) == 6
        for j in m.fs.state[1].mole_frac_phase_comp:
            assert j in [("Liq", "H+"), ("Liq", "K+"), ("Liq", "HCO3-"),
                         ("Liq", "CO3--"), ("Liq", "OH-"), ("Liq", "H2O")]

        # Check references to base state variables
        assert m.fs.state[1].flow_mol_true is m.fs.state[1].flow_mol
        for i in m.fs.state[1].flow_mol_phase_true:
            assert m.fs.state[1].flow_mol_phase_true[i] is \
                m.fs.state[1].flow_mol_phase[i]
            assert i in m.fs.props.phase_list
        for i in m.fs.state[1].flow_mol_phase_comp_true:
            assert m.fs.state[1].flow_mol_phase_comp_true[i] is \
                m.fs.state[1].flow_mol_phase_comp[i]
            assert i in m.fs.props.true_phase_component_set
        for i in m.fs.state[1].mole_frac_phase_comp_true:
            assert m.fs.state[1].mole_frac_phase_comp_true[i] is \
                m.fs.state[1].mole_frac_phase_comp[i]
            assert i in m.fs.props.true_phase_component_set

        # Check for apparent species components
        assert isinstance(m.fs.state[1].flow_mol_phase_comp_apparent, Var)
        assert len(m.fs.state[1].flow_mol_phase_comp_apparent) == 4
        assert isinstance(m.fs.state[1].mole_frac_phase_comp_apparent, Var)
        assert len(m.fs.state[1].mole_frac_phase_comp_apparent) == 4
        assert isinstance(m.fs.state[1].true_to_appr_species, Constraint)
        assert len(m.fs.state[1].true_to_appr_species) == 4
        assert isinstance(m.fs.state[1].appr_mole_frac_constraint,
                          Constraint)
        assert len(m.fs.state[1].appr_mole_frac_constraint) == 4

        # Check for inherent reactions
        assert m.fs.state[1].has_inherent_reactions
        assert m.fs.state[1].include_inherent_reactions
        assert isinstance(m.fs.state[1].params.inherent_reaction_idx, Set)
        assert len(m.fs.state[1].params.inherent_reaction_idx) == 2
        for i in m.fs.state[1].params.inherent_reaction_idx:
            assert i in ["H2O_si", "co3_hco3"]
        assert not hasattr(m.fs.state[1], "apparent_inherent_reaction_extent")
    @pytest.mark.component
    def test_solve_for_true_species(self, frame):
        m = frame

        m.fs.state[1].enth_mol.fix(35000)
        m.fs.state[1].temperature.set_value(350)
        m.fs.state[1].pressure.fix(1e5)

        m.fs.state[1].flow_mol_comp["H2O"].fix(0.5716*2)
        m.fs.state[1].flow_mol_comp["K+"].fix(0.2858*2)
        m.fs.state[1].flow_mol_comp["H+"].fix(2.9e-16*2)
        m.fs.state[1].flow_mol_comp["HCO3-"].fix(1.14e-8*2)
        m.fs.state[1].flow_mol_comp["CO3--"].fix(0.1429*2)
        m.fs.state[1].flow_mol_comp["OH-"].fix(1.14e-8*2)

        assert degrees_of_freedom(m.fs) == 0

        m.fs.state.initialize()

        solver = get_solver()
        res = solver.solve(m.fs)

        # Check for optimal solution
        assert res.solver.termination_condition == \
            TerminationCondition.optimal
        assert res.solver.status == SolverStatus.ok

        assert m.fs.state[1].temperature.value == 350

        # Check true species flowrates
        for j in m.fs.state[1].mole_frac_comp:
            assert (
                value(m.fs.state[1].flow_mol_phase_comp_true["Liq", j]) ==
                pytest.approx(value(m.fs.state[1].flow_mol *
                                    m.fs.state[1].mole_frac_comp[j]),
                              rel=1e-5))

        # Check element balances
        assert(value(
            m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "K2CO3"]*2 +
            m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "KHCO3"] +
            m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "KOH"]) ==
            pytest.approx(value(
                m.fs.state[1].flow_mol_phase_comp_true["Liq", "K+"]),
                rel=1e-5))
        assert(value(
            m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "K2CO3"] +
            m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "KHCO3"]) ==
            pytest.approx(value(
                m.fs.state[1].flow_mol_phase_comp_true["Liq", "CO3--"] +
                m.fs.state[1].flow_mol_phase_comp_true["Liq", "HCO3-"]),
                rel=1e-5))
        assert(value(
            m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "K2CO3"]*3 +
            m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "KHCO3"]*3 +
            m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "KOH"] +
            m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "H2O"]) ==
            pytest.approx(value(
                m.fs.state[1].flow_mol_phase_comp_true["Liq", "CO3--"]*3 +
                m.fs.state[1].flow_mol_phase_comp_true["Liq", "HCO3-"]*3 +
                m.fs.state[1].flow_mol_phase_comp_true["Liq", "OH-"] +
                m.fs.state[1].flow_mol_phase_comp_true["Liq", "H2O"]),
                rel=1e-5))
        assert(value(
            m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "KHCO3"] +
            m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "KOH"] +
            m.fs.state[1].flow_mol_phase_comp_apparent["Liq", "H2O"]*2) ==
            pytest.approx(value(
                m.fs.state[1].flow_mol_phase_comp_true["Liq", "HCO3-"] +
                m.fs.state[1].flow_mol_phase_comp_true["Liq", "OH-"] +
                m.fs.state[1].flow_mol_phase_comp_true["Liq", "H+"] +
                m.fs.state[1].flow_mol_phase_comp_true["Liq", "H2O"]*2),
                rel=1e-5))

        # Check apparent species mole fractions
        assert(value(
            m.fs.state[1].mole_frac_phase_comp_apparent["Liq", "K2CO3"]) ==
            pytest.approx(0.2, rel=1e-5))
        assert(value(
            m.fs.state[1].mole_frac_phase_comp_apparent["Liq", "H2O"]) ==
            pytest.approx(0.8, rel=1e-5))
        assert(value(
            m.fs.state[1].mole_frac_phase_comp_apparent["Liq", "KHCO3"]) ==
            pytest.approx(1.6e-8, abs=1e-9))
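# The tolerance checks above rely on pytest.approx(expected, rel=..., abs=...),
# which passes when the two floats agree to within the given relative or
# absolute tolerance. A minimal stand-alone sketch of that comparison rule
# (the `approx_eq` helper below is illustrative only, not pytest's actual
# implementation):

```python
def approx_eq(actual, expected, rel=1e-5, abs_tol=1e-12):
    # True when |actual - expected| <= max(rel * |expected|, abs_tol),
    # mirroring how pytest.approx(expected, rel=...) compares values.
    return abs(actual - expected) <= max(rel * abs(expected), abs_tol)


print(approx_eq(0.200001, 0.2))  # True: 1e-6 error is within 1e-5 relative
print(approx_eq(0.21, 0.2))      # False: 0.01 error is far outside tolerance
```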
# -----------------------------------------------------------------------------
# modules/models/bert_models.py
# (repo: VarnithChordia/Multlingual_Punctuation_restoration, MIT license)
# -----------------------------------------------------------------------------
from modules.layers.decoders import *
from modules.layers.embedders import *
from modules.layers.layers import BiLSTM, MultiHeadAttention
from torch import nn
import abc


class BERTPunctModel(nn.Module, metaclass=abc.ABCMeta):
    """Base class for all BERT Models"""

    @abc.abstractmethod
    def forward(self, batch):
        raise NotImplementedError("abstract method forward must be implemented")

    @abc.abstractmethod
    def score(self, batch):
        raise NotImplementedError("abstract method score must be implemented")

    @abc.abstractmethod
    def create(self, *args, **kwargs):
        raise NotImplementedError("abstract method create must be implemented")

    def get_n_trainable_params(self):
        # Total number of elements across all parameters with requires_grad=True.
        pp = 0
        for p in list(self.parameters()):
            if p.requires_grad:
                num = 1
                for s in list(p.size()):
                    num = num * s
                pp += num
        return pp
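# The count returned by get_n_trainable_params above is just the product of
# each trainable tensor's dimensions, summed over all parameters. A minimal
# torch-free sketch of that logic (the `count_params` helper and the example
# shapes are illustrative, not part of this module):

```python
from functools import reduce
from operator import mul


def count_params(shapes):
    # Each entry in `shapes` stands in for one trainable tensor's p.size();
    # multiply the dims of each shape, then sum over all tensors.
    return sum(reduce(mul, shape, 1) for shape in shapes)


# A 768x512 weight matrix plus its 512-element bias:
print(count_params([(768, 512), (512,)]))  # 393728
```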
class BERTBiLSTMCRF(BERTPunctModel):

    def __init__(self, embeddings, lstm, crf, device="cuda"):
        super(BERTBiLSTMCRF, self).__init__()
        self.embeddings = embeddings
        self.lstm = lstm
        self.crf = crf
        self.to(device)

    def forward(self, batch):
        input_, labels_mask, input_type_ids = batch[:3]
        input_embeddings = self.embeddings(batch)
        output, _ = self.lstm.forward(input_embeddings, labels_mask)
        return self.crf.forward(output, labels_mask)

    def score(self, batch):
        input_, labels_mask, input_type_ids, labels = batch
        input_embeddings = self.embeddings(batch)
        output, _ = self.lstm.forward(input_embeddings, labels_mask)
        return self.crf.score(output, labels_mask, labels)

    @classmethod
    def create(cls,
               label_size,
               # BertEmbedder params
               model_name='bert-base-multilingual-cased', mode="weighted", is_freeze=True,
               # BiLSTM params
               embedding_size=768, hidden_dim=512, rnn_layers=1, lstm_dropout=0.3,
               # CRFDecoder params
               crf_dropout=0.5,
               # Global params
               device="cuda"):
        embeddings = BERTEmbedder.create(model_name=model_name, device=device, mode=mode, is_freeze=is_freeze)
        lstm = BiLSTM.create(
            embedding_size=embedding_size, hidden_dim=hidden_dim, rnn_layers=rnn_layers, dropout=lstm_dropout)
        crf = CRFDecoder.create(label_size, hidden_dim, crf_dropout)
        return cls(embeddings, lstm, crf, device)
class BERTBiLSTMNCRF(BERTPunctModel):

    def __init__(self, embeddings, lstm, crf, device="cuda"):
        super(BERTBiLSTMNCRF, self).__init__()
        self.embeddings = embeddings
        self.lstm = lstm
        self.crf = crf
        self.to(device)

    def forward(self, batch):
        input_, labels_mask, input_type_ids = batch[:3]
        input_embeddings = self.embeddings(batch)
        output, _ = self.lstm.forward(input_embeddings, labels_mask)
        return self.crf.forward(output, labels_mask)

    def score(self, batch):
        input_, labels_mask, input_type_ids, labels = batch
        input_embeddings = self.embeddings(batch)
        output, _ = self.lstm.forward(input_embeddings, labels_mask)
        return self.crf.score(output, labels_mask, labels)

    @classmethod
    def create(cls,
               label_size,
               # BertEmbedder params
               model_name='bert-base-multilingual-cased', mode="weighted", is_freeze=True,
               # BiLSTM params
               embedding_size=768, hidden_dim=512, rnn_layers=1, lstm_dropout=0.3,
               # NCRFDecoder params
               crf_dropout=0.5, nbest=1,
               # Global params
               device="cuda"):
        embeddings = BERTEmbedder.create(model_name=model_name, device=device, mode=mode, is_freeze=is_freeze)
        lstm = BiLSTM.create(
            embedding_size=embedding_size, hidden_dim=hidden_dim, rnn_layers=rnn_layers, dropout=lstm_dropout)
        crf = NCRFDecoder.create(
            label_size, hidden_dim, crf_dropout, nbest, device=device)
        return cls(embeddings, lstm, crf, device)
class BERTAttnCRF(BERTPunctModel):

    def __init__(self, embeddings, attn, crf, device="cuda"):
        super(BERTAttnCRF, self).__init__()
        self.embeddings = embeddings
        self.attn = attn
        self.crf = crf
        self.to(device)

    def forward(self, batch):
        input_, labels_mask, input_type_ids = batch[:3]
        input_embeddings = self.embeddings(batch)
        output, _ = self.attn(input_embeddings, input_embeddings, input_embeddings, None)
        return self.crf.forward(output, labels_mask)

    def score(self, batch):
        input_, labels_mask, input_type_ids, labels = batch
        input_embeddings = self.embeddings(batch)
        output, _ = self.attn(input_embeddings, input_embeddings, input_embeddings, None)
        return self.crf.score(output, labels_mask, labels)

    @classmethod
    def create(cls,
               label_size,
               # BertEmbedder params
               model_name='bert-base-multilingual-cased', mode="weighted", is_freeze=True,
               # Attn params
               embedding_size=768, key_dim=64, val_dim=64, num_heads=3, attn_dropout=0.3,
               # CRFDecoder params
               crf_dropout=0.5,
               # Global params
               device="cuda"):
        embeddings = BERTEmbedder.create(model_name=model_name, device=device, mode=mode, is_freeze=is_freeze)
        attn = MultiHeadAttention(key_dim, val_dim, embedding_size, num_heads, attn_dropout)
        crf = CRFDecoder.create(
            label_size, embedding_size, crf_dropout)
        return cls(embeddings, attn, crf, device)
class BERTAttnNCRF(BERTPunctModel):

    def __init__(self, embeddings, attn, crf, device="cuda"):
        super(BERTAttnNCRF, self).__init__()
        self.embeddings = embeddings
        self.attn = attn
        self.crf = crf
        self.to(device)

    def forward(self, batch):
        input_, labels_mask, input_type_ids = batch[:3]
        input_embeddings = self.embeddings(batch)
        output, _ = self.attn(input_embeddings, input_embeddings, input_embeddings, None)
        return self.crf.forward(output, labels_mask)

    def score(self, batch):
        input_, labels_mask, input_type_ids, labels = batch
        input_embeddings = self.embeddings(batch)
        output, _ = self.attn(input_embeddings, input_embeddings, input_embeddings, None)
        return self.crf.score(output, labels_mask, labels)

    @classmethod
    def create(cls,
               label_size,
               # BertEmbedder params
               model_name='bert-base-multilingual-cased', mode="weighted", is_freeze=True,
               # Attn params
               embedding_size=768, key_dim=64, val_dim=64, num_heads=3, attn_dropout=0.3,
               # NCRFDecoder params
               crf_dropout=0.5, nbest=1,
               # Global params
               device="cuda"):
        embeddings = BERTEmbedder.create(model_name=model_name, device=device, mode=mode, is_freeze=is_freeze)
        attn = MultiHeadAttention(key_dim, val_dim, embedding_size, num_heads, attn_dropout)
        crf = NCRFDecoder.create(
            label_size, embedding_size, crf_dropout, nbest=nbest, device=device)
        return cls(embeddings, attn, crf, device)
class BERTBiLSTMAttnCRF(BERTPunctModel):

    def __init__(self, embeddings, lstm, attn, crf, device="cuda"):
        super(BERTBiLSTMAttnCRF, self).__init__()
        self.embeddings = embeddings
        self.lstm = lstm
        self.attn = attn
        self.crf = crf
        self.to(device)

    def forward(self, batch):
        input_, labels_mask, input_type_ids = batch[:3]
        input_embeddings = self.embeddings(batch)
        output, _ = self.lstm.forward(input_embeddings, labels_mask)
        output, _ = self.attn(output, output, output, None)
        return self.crf.forward(output, labels_mask)

    def score(self, batch):
        input_, labels_mask, input_type_ids, labels = batch
        input_embeddings = self.embeddings(batch)
        output, _ = self.lstm.forward(input_embeddings, labels_mask)
        output, _ = self.attn(output, output, output, None)
        return self.crf.score(output, labels_mask, labels)

    @classmethod
    def create(cls,
               label_size,
               # BertEmbedder params
               model_name='bert-base-multilingual-cased', mode="weighted", is_freeze=True,
               # BiLSTM
               hidden_dim=512, rnn_layers=1, lstm_dropout=0.3,
               # Attn params
               embedding_size=768, key_dim=64, val_dim=64, num_heads=3, attn_dropout=0.3,
               # CRFDecoder params
               crf_dropout=0.5,
               # Global params
               device="cuda"):
        embeddings = BERTEmbedder.create(model_name=model_name, device=device, mode=mode, is_freeze=is_freeze)
        lstm = BiLSTM.create(
            embedding_size=embedding_size, hidden_dim=hidden_dim, rnn_layers=rnn_layers, dropout=lstm_dropout)
        attn = MultiHeadAttention(key_dim, val_dim, hidden_dim, num_heads, attn_dropout)
        crf = CRFDecoder.create(
            label_size, hidden_dim, crf_dropout)
        return cls(embeddings, lstm, attn, crf, device)
class BERTBiLSTMAttnNCRF(BERTPunctModel):

    def __init__(self, embeddings, lstm, attn, crf, device="cuda"):
        super(BERTBiLSTMAttnNCRF, self).__init__()
        self.embeddings = embeddings
        self.lstm = lstm
        self.attn = attn
        self.crf = crf
        self.to(device)

    def forward(self, batch):
        input_, labels_mask, input_type_ids = batch[:3]
        input_embeddings = self.embeddings(batch)
        output, _ = self.lstm.forward(input_embeddings, labels_mask)
        output, _ = self.attn(output, output, output, None)
        return self.crf.forward(output, labels_mask)

    def score(self, batch):
        input_, labels_mask, input_type_ids, labels = batch
        input_embeddings = self.embeddings(batch)
        output, _ = self.lstm.forward(input_embeddings, labels_mask)
        output, _ = self.attn(output, output, output, None)
        return self.crf.score(output, labels_mask, labels)

    @classmethod
    def create(cls,
               label_size,
               # BertEmbedder params
               model_name='bert-base-multilingual-cased', mode="weighted", is_freeze=True,
               # BiLSTM
               hidden_dim=512, rnn_layers=1, lstm_dropout=0.3,
               # Attn params
               embedding_size=768, key_dim=64, val_dim=64, num_heads=3, attn_dropout=0.3,
               # NCRFDecoder params
               crf_dropout=0.5, nbest=1,
               # Global params
               device="cuda"):
        embeddings = BERTEmbedder.create(model_name=model_name, device=device, mode=mode, is_freeze=is_freeze)
        lstm = BiLSTM.create(
            embedding_size=embedding_size, hidden_dim=hidden_dim, rnn_layers=rnn_layers, dropout=lstm_dropout)
        attn = MultiHeadAttention(key_dim, val_dim, hidden_dim, num_heads, attn_dropout)
        crf = NCRFDecoder.create(
            label_size, hidden_dim, crf_dropout, nbest=nbest, device=device)
        return cls(embeddings, lstm, attn, crf, device)
class BERTBiLSTMAttnNCRFJoint(BERTPunctModel):

    def __init__(self, embeddings, lstm, attn, crf, clf, mode, device="cuda"):
        super(BERTBiLSTMAttnNCRFJoint, self).__init__()
        self.embeddings = embeddings
        self.lstm = lstm
        self.attn = attn
        self.crf = crf
        self.clf = clf
        self.mode = mode
        self.to(device)

    def forward(self, batch):
        input_, labels_mask, input_type_ids = batch[:3]
        input_embeddings = self.embeddings(batch)
        output, _ = self.lstm.forward(input_embeddings, labels_mask)
        output, _ = self.attn(output, output, output, None)
        return self.crf.forward(output, labels_mask), self.clf(output), self.mode(output)

    def score(self, batch):
        input_, labels_mask, input_type_ids, labels, cls_ids, mode_ids = batch
        input_embeddings = self.embeddings(batch)
        output, _ = self.lstm.forward(input_embeddings, labels_mask)
        output, _ = self.attn(output, output, output, None)
        return [self.crf.score(output, labels_mask, labels),
                self.clf.score(output, cls_ids),
                self.mode.score(output, mode_ids)]

    @classmethod
    def create(cls,
               label_size, intent_size, mode_size,
               # BertEmbedder params
               model_name='bert-base-multilingual-cased', mode="weighted", is_freeze=True,
               # BiLSTM
               hidden_dim=512, rnn_layers=1, lstm_dropout=0.3,
               # Attn params
               embedding_size=768, key_dim=64, val_dim=64, num_heads=3, attn_dropout=0.3,
               # NCRFDecoder params
               crf_dropout=0.5, nbest=1,
               # Clf params
               clf_dropout=0.3,
               # Global params
               device="cuda"):
        embeddings = BERTEmbedder.create(model_name=model_name, device=device, mode=mode, is_freeze=is_freeze)
        lstm = BiLSTM.create(
            embedding_size=embedding_size, hidden_dim=hidden_dim, rnn_layers=rnn_layers, dropout=lstm_dropout)
        attn = MultiHeadAttention(key_dim, val_dim, hidden_dim, num_heads, attn_dropout)
        crf = NCRFDecoder.create(
            label_size, hidden_dim, crf_dropout, nbest=nbest, device=device)
        clf = ClassDecoder(intent_size, hidden_dim, clf_dropout)
        mode_tr = ClassDecoder(mode_size, hidden_dim, clf_dropout)
        return cls(embeddings, lstm, attn, crf, clf, mode_tr, device)
class BERTBiLSTMAttnCRFJoint(BERTPunctModel):

    def __init__(self, embeddings, lstm, attn, crf, clf, mode, device="cuda"):
        super(BERTBiLSTMAttnCRFJoint, self).__init__()
        self.embeddings = embeddings
        self.lstm = lstm
        self.attn = attn
        self.crf = crf
        self.mode = mode
        self.clf = clf
        self.to(device)

    def forward(self, batch):
        input_, labels_mask, input_type_ids = batch[:3]
        input_embeddings = self.embeddings(batch)
        output, _ = self.lstm.forward(input_embeddings, labels_mask)
        output, _ = self.attn(output, output, output, None)
        return self.crf.forward(output, labels_mask), self.clf(output), self.mode(output)

    def score(self, batch):
        input_, labels_mask, input_type_ids, labels, cls_ids, mode_ids = batch
        input_embeddings = self.embeddings(batch)
        output, _ = self.lstm.forward(input_embeddings, labels_mask)
        output, _ = self.attn(output, output, output, None)
        return [self.crf.score(output, labels_mask, labels),
                self.clf.score(output, cls_ids),
                self.mode.score(output, mode_ids)]

    @classmethod
    def create(cls,
               label_size, intent_size, mode_size,
               # BertEmbedder params
               model_name='bert-base-multilingual-cased', mode="weighted", is_freeze=True,
               # BiLSTM
               hidden_dim=512, rnn_layers=1, lstm_dropout=0.3,
               # Attn params
               embedding_size=768, key_dim=64, val_dim=64, num_heads=3, attn_dropout=0.3,
               # CRFDecoder params
               crf_dropout=0.5,
               # Clf params
               clf_dropout=0.3,
               # Global params
               device="cuda"):
        embeddings = BERTEmbedder.create(model_name=model_name, device=device, mode=mode, is_freeze=is_freeze)
        lstm = BiLSTM.create(
            embedding_size=embedding_size, hidden_dim=hidden_dim, rnn_layers=rnn_layers, dropout=lstm_dropout)
        attn = MultiHeadAttention(key_dim, val_dim, hidden_dim, num_heads, attn_dropout)
        crf = CRFDecoder.create(
            label_size, hidden_dim, crf_dropout)
        clf = ClassDecoder(intent_size, hidden_dim, clf_dropout)
        mode_tr = ClassDecoder(mode_size, hidden_dim, clf_dropout)
        return cls(embeddings, lstm, attn, crf, clf, mode_tr, device)
class BERTBiLSTMCRFJoint(BERTPunctModel):

    def __init__(self, embeddings, lstm, crf, clf, mode, device="cuda"):
        super(BERTBiLSTMCRFJoint, self).__init__()
        self.embeddings = embeddings
        self.lstm = lstm
        self.crf = crf
        self.clf = clf
        self.mode = mode
        self.to(device)

    def forward(self, batch):
        input_, labels_mask, input_type_ids = batch[:3]
        input_embeddings = self.embeddings(batch)
        output, _ = self.lstm.forward(input_embeddings, labels_mask)
        return self.crf.forward(output, labels_mask), self.clf(output), self.mode(output)

    def score(self, batch):
        input_, labels_mask, input_type_ids, labels, cls_ids, mode_ids = batch
        input_embeddings = self.embeddings(batch)
        output, _ = self.lstm.forward(input_embeddings, labels_mask)
        return [self.crf.score(output, labels_mask, labels),
                self.clf.score(output, cls_ids),
                self.mode.score(output, mode_ids)]

    @classmethod
    def create(cls,
               label_size, intent_size, mode_size,
               # BertEmbedder params
               model_name='bert-base-multilingual-cased', mode="weighted", is_freeze=True,
               # BiLSTM params
               embedding_size=768, hidden_dim=512, rnn_layers=1, lstm_dropout=0.3,
               # CRFDecoder params
               crf_dropout=0.5,
               # Clf params
               clf_dropout=0.3,
               # Global params
               device="cuda"):
        embeddings = BERTEmbedder.create(model_name=model_name, device=device, mode=mode, is_freeze=is_freeze)
        lstm = BiLSTM.create(
            embedding_size=embedding_size, hidden_dim=hidden_dim, rnn_layers=rnn_layers, dropout=lstm_dropout)
        crf = CRFDecoder.create(label_size, hidden_dim, crf_dropout)
        clf = ClassDecoder(intent_size, hidden_dim, clf_dropout)
        mode_tr = ClassDecoder(mode_size, hidden_dim, clf_dropout)
        return cls(embeddings, lstm, crf, clf, mode_tr, device)
class BERTBiLSTMNCRFJoint(BERTPunctModel):

    def __init__(self, embeddings, lstm, crf, clf, mode, device="cuda"):
        super(BERTBiLSTMNCRFJoint, self).__init__()
        self.embeddings = embeddings
        self.lstm = lstm
        self.crf = crf
        self.clf = clf
        self.mode = mode
        self.to(device)

    def forward(self, batch):
        input_, labels_mask, input_type_ids = batch[:3]
        input_embeddings = self.embeddings(batch)
        output, _ = self.lstm.forward(input_embeddings, labels_mask)
        return self.crf.forward(output, labels_mask), self.clf(output), self.mode(output)

    def score(self, batch):
        input_, labels_mask, input_type_ids, labels, cls_ids, mode_ids = batch
        input_embeddings = self.embeddings(batch)
        output, _ = self.lstm.forward(input_embeddings, labels_mask)
        return [self.crf.score(output, labels_mask, labels),
                self.clf.score(output, cls_ids),
                self.mode.score(output, mode_ids)]

    @classmethod
    def create(cls,
               label_size, intent_size, mode_size,
               # BertEmbedder params
               model_name='bert-base-multilingual-cased', mode="weighted", is_freeze=True,
               # BiLSTM params
               embedding_size=768, hidden_dim=512, rnn_layers=1, lstm_dropout=0.3,
               # NCRFDecoder params
               crf_dropout=0.5, nbest=1,
               # Clf params
               clf_dropout=0.3,
               # Global params
               device="cuda"):
        embeddings = BERTEmbedder.create(model_name=model_name, device=device, mode=mode, is_freeze=is_freeze)
        lstm = BiLSTM.create(
            embedding_size=embedding_size, hidden_dim=hidden_dim, rnn_layers=rnn_layers, dropout=lstm_dropout)
        crf = NCRFDecoder.create(label_size, hidden_dim, crf_dropout, nbest=nbest, device=device)
        clf = ClassDecoder(intent_size, hidden_dim, clf_dropout)
        mode_tr = ClassDecoder(mode_size, hidden_dim, clf_dropout)
        return cls(embeddings, lstm, crf, clf, mode_tr, device)
class BERTAttnCRFJoint(BERTPunctModel):

    def __init__(self, embeddings, attn, crf, clf, mode, device="cuda"):
        super(BERTAttnCRFJoint, self).__init__()
        self.embeddings = embeddings
        self.attn = attn
        self.crf = crf
        self.clf = clf
        self.mode = mode
        self.to(device)

    def forward(self, batch):
        input_, labels_mask, input_type_ids = batch[:3]
        input_embeddings = self.embeddings(batch)
        output, _ = self.attn(input_embeddings, input_embeddings, input_embeddings, None)
        return self.crf.forward(output, labels_mask), self.clf(output), self.mode(output)

    def score(self, batch):
        input_, labels_mask, input_type_ids, labels, cls_ids, mode_ids = batch
        input_embeddings = self.embeddings(batch)
        output, _ = self.attn(input_embeddings, input_embeddings, input_embeddings, None)
        return [self.crf.score(output, labels_mask, labels),
                self.clf.score(output, cls_ids),
                self.mode.score(output, mode_ids)]

    @classmethod
    def create(cls,
               label_size, intent_size, mode_size,
               # BertEmbedder params
               model_name='bert-base-multilingual-cased', mode="weighted", is_freeze=True,
               # BiLSTM
               hidden_dim=512, rnn_layers=1, lstm_dropout=0.3,
               # Attn params
               embedding_size=768, key_dim=64, val_dim=64, num_heads=3, attn_dropout=0.3,
               # CRFDecoder params
               crf_dropout=0.5,
               # Clf params
               clf_dropout=0.3,
               # Global params
               device="cuda"):
        embeddings = BERTEmbedder.create(model_name=model_name, device=device, mode=mode, is_freeze=is_freeze)
        attn = MultiHeadAttention(key_dim, val_dim, hidden_dim, num_heads, attn_dropout)
        crf = CRFDecoder.create(
            label_size, hidden_dim, crf_dropout)
        clf = ClassDecoder(intent_size, hidden_dim, clf_dropout)
        mode_tr = ClassDecoder(mode_size, hidden_dim, clf_dropout)
        return cls(embeddings, attn, crf, clf, mode_tr, device)
class BERTAttnNCRFJoint(BERTPunctModel):

    def __init__(self, embeddings, attn, crf, clf, mode, device="cuda"):
        super(BERTAttnNCRFJoint, self).__init__()
        self.embeddings = embeddings
        self.attn = attn
        self.crf = crf
        self.clf = clf
        self.mode = mode
        self.to(device)

    def forward(self, batch):
        input_, labels_mask, input_type_ids = batch[:3]
        input_embeddings = self.embeddings(batch)
        output, _ = self.attn(input_embeddings, input_embeddings, input_embeddings, None)
        return self.crf.forward(output, labels_mask), self.clf(output), self.mode(output)

    def score(self, batch):
        input_, labels_mask, input_type_ids, labels, cls_ids, mode_ids = batch
        input_embeddings = self.embeddings(batch)
        output, _ = self.attn(input_embeddings, input_embeddings, input_embeddings, None)
        return [self.crf.score(output, labels_mask, labels),
                self.clf.score(output, cls_ids),
                self.mode.score(output, mode_ids)]

    @classmethod
    def create(cls,
               label_size, intent_size, mode_size,
               # BertEmbedder params
               model_name='bert-base-multilingual-cased', mode="weighted", is_freeze=True,
               # BiLSTM
               hidden_dim=512, rnn_layers=1, lstm_dropout=0.3,
               # Attn params
               embedding_size=768, key_dim=64, val_dim=64, num_heads=3, attn_dropout=0.3,
               # NCRFDecoder params
               crf_dropout=0.5, nbest=1,
               # Clf params
               clf_dropout=0.3,
               # Global params
               device="cuda"):
        embeddings = BERTEmbedder.create(model_name=model_name, device=device, mode=mode, is_freeze=is_freeze)
        attn = MultiHeadAttention(key_dim, val_dim, hidden_dim, num_heads, attn_dropout)
        crf = NCRFDecoder.create(
            label_size, hidden_dim, crf_dropout, nbest=nbest, device=device)
        clf = ClassDecoder(intent_size, hidden_dim, clf_dropout)
        mode_tr = ClassDecoder(mode_size, hidden_dim, clf_dropout)
        return cls(embeddings, attn, crf, clf, mode_tr, device)
class BERTNCRF(BERTPunctModel):

    def __init__(self, embeddings, crf, device="cuda"):
        super(BERTNCRF, self).__init__()
        self.embeddings = embeddings
        self.crf = crf
        self.to(device)

    def forward(self, batch):
        input_, labels_mask, input_type_ids = batch[:3]
        input_embeddings = self.embeddings(batch)
        return self.crf.forward(input_embeddings, labels_mask)

    def score(self, batch):
        input_, labels_mask, input_type_ids, labels = batch
        input_embeddings = self.embeddings(batch)
        return self.crf.score(input_embeddings, labels_mask, labels)

    @classmethod
    def create(cls,
               label_size,
               # BertEmbedder params
               model_name='bert-base-multilingual-cased', mode="weighted", is_freeze=True,
               embedding_size=768,
               # NCRFDecoder params
               crf_dropout=0.5, nbest=1,
               # Global params
               device="cuda"):
        embeddings = BERTEmbedder.create(model_name=model_name, device=device, mode=mode, is_freeze=is_freeze)
        crf = NCRFDecoder.create(
            label_size, embedding_size, crf_dropout, nbest=nbest, device=device)
        return cls(embeddings, crf, device)
class BERTCRF(BERTPunctModel):
    def __init__(self, embeddings, crf, device="cuda"):
        super(BERTCRF, self).__init__()
        self.embeddings = embeddings
        self.crf = crf
        self.to(device)

    def forward(self, batch):
        input_, labels_mask, input_type_ids = batch[:3]
        input_embeddings = self.embeddings(batch)
        return self.crf.forward(input_embeddings, labels_mask)

    def score(self, batch):
        input_, labels_mask, input_type_ids, labels = batch
        input_embeddings = self.embeddings(batch)
        return self.crf.score(input_embeddings, labels_mask, labels)

    @classmethod
    def create(cls,
               label_size,
               # BertEmbedder params
               model_name='bert-base-multilingual-cased', mode="weighted", is_freeze=True,
               embedding_size=768,
               # CRFDecoder params
               crf_dropout=0.5,
               # Global params
               device="cuda"):
        embeddings = BERTEmbedder.create(model_name=model_name, device=device, mode=mode, is_freeze=is_freeze)
        crf = CRFDecoder.create(
            label_size, embedding_size, crf_dropout)
        return cls(embeddings, crf, device)

# File: tests/test_signup.py (repo: priyapower/practice-api-movies, license: RSA-MD)
import json
from tests.BaseCase import BaseCase


class TestUserSignup(BaseCase):
    def test_successful_signup(self):
        # Given
        payload = json.dumps({
            "email": "paurakh011@gmail.com",
            "password": "mycoolpassword"
        })

        # When
        response = self.app.post('/api/v1/auth/signup', headers={"Content-Type": "application/json"}, data=payload)

        # Then
        self.assertEqual(str, type(response.json['id']))
        self.assertEqual(200, response.status_code)

    def test_signup_with_non_existing_field(self):
        # Given
        payload = json.dumps({
            "username": "mycoolusername",
            "email": "paurakh011@gmail.com",
            "password": "mycoolpassword"
        })

        # When
        response = self.app.post('/api/v1/auth/signup', headers={"Content-Type": "application/json"}, data=payload)

        # Then
        self.assertEqual('Request is missing required fields', response.json['message'])
        self.assertEqual(400, response.status_code)

    def test_signup_without_email(self):
        # Given
        payload = json.dumps({
            "password": "mycoolpassword",
        })

        # When
        response = self.app.post('/api/v1/auth/signup', headers={"Content-Type": "application/json"}, data=payload)

        # Then
        self.assertEqual('Something went wrong', response.json['message'])
        self.assertEqual(500, response.status_code)

    def test_signup_without_password(self):
        # Given
        payload = json.dumps({
            "email": "paurakh011@gmail.com",
        })

        # When
        response = self.app.post('/api/v1/auth/signup', headers={"Content-Type": "application/json"}, data=payload)

        # Then
        self.assertEqual('Something went wrong', response.json['message'])
        self.assertEqual(500, response.status_code)

    def test_creating_already_existing_user(self):
        # Given
        payload = json.dumps({
            "email": "paurakh011@gmail.com",
            "password": "mycoolpassword"
        })
        response = self.app.post('/api/v1/auth/signup', headers={"Content-Type": "application/json"}, data=payload)

        # When
        response = self.app.post('/api/v1/auth/signup', headers={"Content-Type": "application/json"}, data=payload)

        # Then
        self.assertEqual('User with given email address already exists', response.json['message'])
        self.assertEqual(400, response.status_code)
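Every test above builds its request the same way: serialize the payload with `json.dumps` and post it with an explicit JSON content type. A small self-contained illustration of that pattern (the endpoint path and header come from the tests; the helper name is ours):

```python
import json

SIGNUP_URL = '/api/v1/auth/signup'
JSON_HEADERS = {"Content-Type": "application/json"}

def build_signup_request(fields):
    # Serialize the payload exactly as the tests do before posting it.
    return SIGNUP_URL, JSON_HEADERS, json.dumps(fields)

url, headers, body = build_signup_request({"email": "paurakh011@gmail.com",
                                           "password": "mycoolpassword"})
assert url == '/api/v1/auth/signup'
assert json.loads(body)["password"] == "mycoolpassword"
```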

# File: superset/migrations/versions/3305c4127d6a_add_paid_column_to_solarbislice.py
# (repo: webdev778/SolarBI, license: Apache-2.0)
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
"""Add paid column to SolarBISlice
Revision ID: 3305c4127d6a
Revises: 9743b15bcdf3
Create Date: 2019-09-25 14:49:52.970429
"""
# revision identifiers, used by Alembic.
revision = '3305c4127d6a'
down_revision = '9743b15bcdf3'
from alembic import op
import sqlalchemy as sa
def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    pass
# with op.batch_alter_table('annotation', schema=None) as batch_op:
# batch_op.alter_column('layer_id',
# existing_type=sa.INTEGER(),
# nullable=False)
#
# with op.batch_alter_table('clusters', schema=None) as batch_op:
# batch_op.add_column(sa.Column('broker_pass', sqlalchemy_utils.types.encrypted.encrypted_type.EncryptedType(), nullable=True))
# batch_op.add_column(sa.Column('broker_user', sa.String(length=255), nullable=True))
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.create_unique_constraint(None, ['verbose_name'])
#
# with op.batch_alter_table('columns', schema=None) as batch_op:
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.alter_column('column_name',
# existing_type=sa.VARCHAR(length=255),
# nullable=False)
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=True)
#
# with op.batch_alter_table('css_templates', schema=None) as batch_op:
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=True)
#
# with op.batch_alter_table('dashboard_slices', schema=None) as batch_op:
# batch_op.create_unique_constraint(None, ['dashboard_id', 'slice_id'])
#
# with op.batch_alter_table('dashboards', schema=None) as batch_op:
# batch_op.add_column(sa.Column('published', sa.Boolean(), nullable=True))
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.create_unique_constraint(None, ['slug'])
#
# with op.batch_alter_table('datasources', schema=None) as batch_op:
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.alter_column('datasource_name',
# existing_type=sa.VARCHAR(length=255),
# nullable=False)
# batch_op.create_foreign_key(None, 'ab_user', ['created_by_fk'], ['id'])
#
# with op.batch_alter_table('dbs', schema=None) as batch_op:
# batch_op.alter_column('allow_csv_upload',
# existing_type=sa.BOOLEAN(),
# nullable=True,
# existing_server_default=sa.text('1'))
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.create_unique_constraint(None, ['verbose_name'])
#
# with op.batch_alter_table('metrics', schema=None) as batch_op:
# batch_op.alter_column('json',
# existing_type=sa.TEXT(),
# nullable=False)
# batch_op.alter_column('metric_name',
# existing_type=sa.VARCHAR(length=255),
# nullable=False)
# batch_op.create_foreign_key(None, 'ab_user', ['created_by_fk'], ['id'])
# batch_op.create_foreign_key(None, 'ab_user', ['changed_by_fk'], ['id'])
#
# with op.batch_alter_table('query', schema=None) as batch_op:
# batch_op.drop_column('limit_used')
#
# with op.batch_alter_table('slices', schema=None) as batch_op:
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=True)
    with op.batch_alter_table('solarbi_slices', schema=None) as batch_op:
        batch_op.add_column(sa.Column('paid', sa.Boolean(), nullable=True))
        batch_op.create_foreign_key(None, 'ab_user', ['changed_by_fk'], ['id'])
        batch_op.create_foreign_key(None, 'ab_user', ['created_by_fk'], ['id'])
# with op.batch_alter_table('sql_metrics', schema=None) as batch_op:
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.alter_column('expression',
# existing_type=sa.TEXT(),
# nullable=False)
# batch_op.alter_column('metric_name',
# existing_type=sa.VARCHAR(length=512),
# nullable=False)
# batch_op.create_unique_constraint(None, ['table_id', 'metric_name'])
#
# with op.batch_alter_table('table_columns', schema=None) as batch_op:
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.alter_column('column_name',
# existing_type=sa.VARCHAR(length=255),
# nullable=False)
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.create_unique_constraint(None, ['table_id', 'column_name'])
# batch_op.drop_column('database_expression')
#
# with op.batch_alter_table('tables', schema=None) as batch_op:
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.drop_constraint('uq_table_in_db_schema', type_='unique')
# batch_op.create_unique_constraint(None, ['database_id', 'table_name'])
#
# with op.batch_alter_table('url', schema=None) as batch_op:
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=True)
# ### end Alembic commands ###
def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    pass
# with op.batch_alter_table('url', schema=None) as batch_op:
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=False)
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=False)
#
# with op.batch_alter_table('tables', schema=None) as batch_op:
# batch_op.drop_constraint(None, type_='unique')
# batch_op.create_unique_constraint('uq_table_in_db_schema', ['database_id', 'schema', 'table_name'])
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=False)
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=False)
#
# with op.batch_alter_table('table_columns', schema=None) as batch_op:
# batch_op.add_column(sa.Column('database_expression', sa.VARCHAR(length=255), nullable=True))
# batch_op.drop_constraint(None, type_='unique')
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=False)
# batch_op.alter_column('column_name',
# existing_type=sa.VARCHAR(length=255),
# nullable=True)
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=False)
#
# with op.batch_alter_table('sql_metrics', schema=None) as batch_op:
# batch_op.drop_constraint(None, type_='unique')
# batch_op.alter_column('metric_name',
# existing_type=sa.VARCHAR(length=512),
# nullable=True)
# batch_op.alter_column('expression',
# existing_type=sa.TEXT(),
# nullable=True)
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=False)
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=False)
# with op.batch_alter_table('solarbi_slices', schema=None) as batch_op:
# batch_op.drop_constraint(None, type_='foreignkey')
# batch_op.drop_constraint(None, type_='foreignkey')
# batch_op.drop_column('paid')
# with op.batch_alter_table('slices', schema=None) as batch_op:
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=False)
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=False)
#
# with op.batch_alter_table('query', schema=None) as batch_op:
# batch_op.add_column(sa.Column('limit_used', sa.BOOLEAN(), nullable=True))
#
# with op.batch_alter_table('metrics', schema=None) as batch_op:
# batch_op.drop_constraint(None, type_='foreignkey')
# batch_op.drop_constraint(None, type_='foreignkey')
# batch_op.alter_column('metric_name',
# existing_type=sa.VARCHAR(length=255),
# nullable=True)
# batch_op.alter_column('json',
# existing_type=sa.TEXT(),
# nullable=True)
#
# with op.batch_alter_table('dbs', schema=None) as batch_op:
# batch_op.drop_constraint(None, type_='unique')
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=False)
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=False)
# batch_op.alter_column('allow_csv_upload',
# existing_type=sa.BOOLEAN(),
# nullable=False,
# existing_server_default=sa.text('1'))
#
# with op.batch_alter_table('datasources', schema=None) as batch_op:
# batch_op.drop_constraint(None, type_='foreignkey')
# batch_op.alter_column('datasource_name',
# existing_type=sa.VARCHAR(length=255),
# nullable=True)
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=False)
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=False)
#
# with op.batch_alter_table('dashboards', schema=None) as batch_op:
# batch_op.drop_constraint(None, type_='unique')
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=False)
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=False)
# batch_op.drop_column('published')
#
# with op.batch_alter_table('dashboard_slices', schema=None) as batch_op:
# batch_op.drop_constraint(None, type_='unique')
#
# with op.batch_alter_table('css_templates', schema=None) as batch_op:
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=False)
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=False)
#
# with op.batch_alter_table('columns', schema=None) as batch_op:
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=False)
# batch_op.alter_column('column_name',
# existing_type=sa.VARCHAR(length=255),
# nullable=True)
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=False)
#
# with op.batch_alter_table('clusters', schema=None) as batch_op:
# batch_op.drop_constraint(None, type_='unique')
# batch_op.alter_column('created_on',
# existing_type=sa.DATETIME(),
# nullable=False)
# batch_op.alter_column('changed_on',
# existing_type=sa.DATETIME(),
# nullable=False)
# batch_op.drop_column('broker_user')
# batch_op.drop_column('broker_pass')
#
# with op.batch_alter_table('annotation', schema=None) as batch_op:
# batch_op.alter_column('layer_id',
# existing_type=sa.INTEGER(),
# nullable=True)
# ### end Alembic commands ###
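The `revision`/`down_revision` identifiers at the top of this migration are how Alembic orders migrations: each revision names its parent, forming a chain walked during upgrade/downgrade. A minimal sketch of that traversal — only the first link below comes from this file; treating `9743b15bcdf3` as the base is an assumption for illustration:

```python
# Map each revision to its down_revision (parent); None marks the base.
revisions = {
    "3305c4127d6a": "9743b15bcdf3",  # this migration -> its parent
    "9743b15bcdf3": None,            # assumed base for this sketch
}

def history(head):
    # Walk the parent links from a head revision down to the base,
    # the same ordering Alembic uses to apply or revert migrations.
    chain = []
    while head is not None:
        chain.append(head)
        head = revisions[head]
    return chain

assert history("3305c4127d6a") == ["3305c4127d6a", "9743b15bcdf3"]
```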

# File: 0x07-Session_authentication/main_1.py
# (repo: JoseAVallejo12/holbertonschool-web_back_end, license: MIT)
#!/usr/bin/env python3
""" Main 1
"""
from api.v1.auth.session_auth import SessionAuth
sa = SessionAuth()
print("{}: {}".format(type(sa.user_id_by_session_id), sa.user_id_by_session_id))
user_id = None
session = sa.create_session(user_id)
print("{} => {}: {}".format(user_id, session, sa.user_id_by_session_id))
user_id = 89
session = sa.create_session(user_id)
print("{} => {}: {}".format(user_id, session, sa.user_id_by_session_id))
user_id = "abcde"
session = sa.create_session(user_id)
print("{} => {}: {}".format(user_id, session, sa.user_id_by_session_id))
user_id = "fghij"
session = sa.create_session(user_id)
print("{} => {}: {}".format(user_id, session, sa.user_id_by_session_id))
user_id = "abcde"
session = sa.create_session(user_id)
print("{} => {}: {}".format(user_id, session, sa.user_id_by_session_id))

# File: conda-recipes/openmdao.examples.bar3simulation/run_test.py
# (repo: mjfwest/OpenMDAO-Framework, license: Apache-2.0)
import openmdao.examples.bar3simulation
import openmdao.examples.bar3simulation.bar3
from openmdao.examples.bar3simulation.bar3 import forces
from openmdao.examples.bar3simulation.bar3 import runbar3truss

# File: PyBench/MessageGenerator.py (repo: kit-tm/gcmi2, license: BSD-2-Clause)
from pyof.v0x01.asynchronous.packet_in import PacketIn
from pyof.v0x01.asynchronous.packet_in import PacketInReason
from pyof.v0x01.controller2switch.flow_mod import FlowMod, FlowModCommand
from pyof.v0x01.controller2switch.get_config_reply import GetConfigReply
from pyof.v0x01.asynchronous.error_msg import ErrorMsg
from pyof.v0x01.asynchronous.error_msg import ErrorType
from pyof.v0x01.symmetric.echo_reply import EchoReply
from pyof.v0x01.symmetric.echo_request import EchoRequest
from pyof.v0x01.common.utils import unpack_message
from pyof.v0x01.common.flow_match import Match, FlowWildCards
from pyof.foundation.basic_types import IPAddress, HWAddress, UBInt32
from random import randint
import os
TOTAL_MAC_ADDRESSES = 10000
class MessageGenerator():
    def __init__(self):
        self.packet_in_message = PacketIn()
        self.packet_in_message.in_port = 1
        self.packet_in_message.buffer_id = 1
        self.packet_in_message.pad = 0
        self.packet_in_message.reason = PacketInReason.OFPR_NO_MATCH

        # from cbench
        data = b"\x04\x06\x00\xe0\x04\x01\x00\x00\x00\x00\x76\xa9\xd4\x0d\x25\x48\x00\x00\x01\x00\x02\x00\x00\x00\x00\x00\x00\x00\x00\x00\x07\xff\x00\x01\x1a\xc1\x51\xff\xef\x8a\x76\x65\x74\x68\x31\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xc0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\xce\x2f\xa2\x87\xf6\x70\x76\x65\x74\x68\x33\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xc0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x03\xca\x8a\x1e\xf3\x77\xef\x76\x65\x74\x68\x35\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xc0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x04\xfa\xbc\x77\x8d\x7e\x0b\x76\x65\x74\x68\x37\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xc0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00"
        self.features_reply_message = unpack_message(data)

        # for packet_in messages
        self.current_src_mac = 1

    def generate_packet_in_message(self, xid, data_length):
        self.packet_in_message.header.xid = xid
        self.packet_in_message.total_len = data_length + 52
        src_mac_hex = hex(self.current_src_mac % TOTAL_MAC_ADDRESSES)[2:].zfill(12)
        self.current_src_mac += 1
        ethernet_data = \
            "800000000001" + \
            src_mac_hex + \
            "0800"
        ip_in_ethernet_frame = "450000320000000040fff72cc0a80028c0a80128"
        data = os.urandom(data_length)
        self.packet_in_message.data = bytes.fromhex(ethernet_data + ip_in_ethernet_frame) + data
        return self.packet_in_message.pack()
    def generate_benchmark_message(self, xid):
        echo_or_packet_in = randint(0, 1)
        if echo_or_packet_in:
            return self.generate_echo_request_message(xid)
        packet_in_size = randint(0, 2)
        return self.generate_packet_in_message(xid, pow(2, packet_in_size) * 32)

    def generate_features_reply_message(self, xid, datapath_id):
        self.features_reply_message.header.xid = xid
        self.features_reply_message.datapath_id = datapath_id
        return self.features_reply_message.pack()

    def generate_flow_mod_message(self, xid, srcIpAddress):
        match = Match(dl_type=2048,
                      nw_src=IPAddress(srcIpAddress))
        message = FlowMod(xid=xid, match=match, command=FlowModCommand.OFPFC_ADD)
        return message.pack()

    def generate_config_reply_message(self, xid, flags, miss_send_len):
        message = GetConfigReply()
        message.header.xid = xid
        message.flags = flags
        message.miss_send_len = miss_send_len
        return message.pack()

    def generate_echo_reply_message(self, xid):
        message = EchoReply(xid=xid)
        message.header.xid = xid
        return message.pack()

    def generate_echo_request_message(self, xid):
        message = EchoRequest(xid=xid)
        message.header.xid = xid
        return message.pack()
    def generate_stats_reply_message(self):
        data = b"\x01\x11\x04\x2c\xff\xff\xff\xfc\x00\x00\x00\x00\x43\x62\x65\x6e" \
b"\x63\x68\x20\x2d\x20\x63\x6f\x6e\x74\x72\x6f\x6c\x6c\x65\x72\x20" \
b"\x49\x2f\x4f\x20\x62\x65\x6e\x63\x68\x6d\x61\x72\x6b\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x74\x68\x69\x73" \
b"\x20\x69\x73\x20\x61\x63\x74\x75\x61\x6c\x6c\x79\x20\x73\x6f\x66" \
b"\x74\x77\x61\x72\x65\x2e\x2e\x2e\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x76\x65\x72\x73" \
b"\x69\x6f\x6e\x20\x30\x2e\x30\x31\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x6e\x6f\x6e\x65" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x6e\x6f\x6e\x65" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00"
return data
def generate_vendor_reply_message(self, xid):
message = ErrorMsg(xid=xid)
message.error_type = ErrorType.OFPET_BAD_REQUEST
message.code = 0
return message.pack()
| 56.305882 | 914 | 0.669453 | 1,804 | 9,572 | 3.47949 | 0.099224 | 1.010355 | 1.406564 | 1.732038 | 0.652063 | 0.609845 | 0.59487 | 0.570018 | 0.570018 | 0.56476 | 0 | 0.32124 | 0.147618 | 9,572 | 169 | 915 | 56.639053 | 0.448094 | 0.003552 | 0 | 0.471429 | 0 | 0.485714 | 0.547876 | 0.546198 | 0 | 1 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0.092857 | 0 | 0.242857 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
863cebcbb40624ab2656068929b7321b2842e605 | 36,559 | py | Python | Keras_tensorflow_nightly/source2.7/tensorflow/contrib/boosted_trees/python/ops/gen_prediction_ops.py | Con-Mi/lambda-packs | b23a8464abdd88050b83310e1d0e99c54dac28ab | [
"MIT"
] | 1 | 2019-06-27T12:09:44.000Z | 2019-06-27T12:09:44.000Z | Keras_tensorflow_nightly/source2.7/tensorflow/contrib/boosted_trees/python/ops/gen_prediction_ops.py | Con-Mi/lambda-packs | b23a8464abdd88050b83310e1d0e99c54dac28ab | [
"MIT"
] | null | null | null | Keras_tensorflow_nightly/source2.7/tensorflow/contrib/boosted_trees/python/ops/gen_prediction_ops.py | Con-Mi/lambda-packs | b23a8464abdd88050b83310e1d0e99c54dac28ab | [
"MIT"
] | null | null | null | """Python wrappers around TensorFlow ops.
This file is MACHINE GENERATED! Do not edit.
Original C++ source file: gen_prediction_ops_py.cc
"""
import collections as _collections
import six as _six
from tensorflow.python import pywrap_tensorflow as _pywrap_tensorflow
from tensorflow.python.eager import context as _context
from tensorflow.python.eager import core as _core
from tensorflow.python.eager import execute as _execute
from tensorflow.python.framework import dtypes as _dtypes
from tensorflow.python.framework import errors as _errors
from tensorflow.python.framework import tensor_shape as _tensor_shape
from tensorflow.core.framework import op_def_pb2 as _op_def_pb2
# Needed to trigger the call to _set_call_cpp_shape_fn.
from tensorflow.python.framework import common_shapes as _common_shapes
from tensorflow.python.framework import op_def_registry as _op_def_registry
from tensorflow.python.framework import ops as _ops
from tensorflow.python.framework import op_def_library as _op_def_library
from tensorflow.python.util.tf_export import tf_export
@tf_export('gradient_trees_partition_examples')
def gradient_trees_partition_examples(tree_ensemble_handle, dense_float_features, sparse_float_feature_indices, sparse_float_feature_values, sparse_float_feature_shapes, sparse_int_feature_indices, sparse_int_feature_values, sparse_int_feature_shapes, use_locking=False, name=None):
r"""Splits input examples into the leaves of the tree.
Args:
tree_ensemble_handle: A `Tensor` of type `resource`.
The handle to the tree ensemble.
dense_float_features: A list of `Tensor` objects with type `float32`.
Rank 2 Tensors containing dense float feature values.
sparse_float_feature_indices: A list of `Tensor` objects with type `int64`.
Rank 2 Tensors containing sparse float indices.
sparse_float_feature_values: A list with the same length as `sparse_float_feature_indices` of `Tensor` objects with type `float32`.
Rank 1 Tensors containing sparse float values.
sparse_float_feature_shapes: A list with the same length as `sparse_float_feature_indices` of `Tensor` objects with type `int64`.
Rank 1 Tensors containing sparse float shapes.
sparse_int_feature_indices: A list of `Tensor` objects with type `int64`.
Rank 2 Tensors containing sparse int indices.
sparse_int_feature_values: A list with the same length as `sparse_int_feature_indices` of `Tensor` objects with type `int64`.
Rank 1 Tensors containing sparse int values.
sparse_int_feature_shapes: A list with the same length as `sparse_int_feature_indices` of `Tensor` objects with type `int64`.
Rank 1 Tensors containing sparse int shapes.
use_locking: An optional `bool`. Defaults to `False`.
Whether to use locking.
name: A name for the operation (optional).
Returns:
A `Tensor` of type `int32`.
Rank 1 Tensor containing partition ids per example.
"""
_ctx = _context._context
if _ctx is None or not _ctx._eager_context.is_eager:
if not isinstance(dense_float_features, (list, tuple)):
raise TypeError(
"Expected list for 'dense_float_features' argument to "
"'gradient_trees_partition_examples' Op, not %r." % dense_float_features)
_attr_num_dense_float_features = len(dense_float_features)
if not isinstance(sparse_float_feature_indices, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_float_feature_indices' argument to "
"'gradient_trees_partition_examples' Op, not %r." % sparse_float_feature_indices)
_attr_num_sparse_float_features = len(sparse_float_feature_indices)
if not isinstance(sparse_float_feature_values, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_float_feature_values' argument to "
"'gradient_trees_partition_examples' Op, not %r." % sparse_float_feature_values)
if len(sparse_float_feature_values) != _attr_num_sparse_float_features:
raise ValueError(
"List argument 'sparse_float_feature_values' to 'gradient_trees_partition_examples' Op with length %d "
"must match length %d of argument 'sparse_float_feature_indices'." %
(len(sparse_float_feature_values), _attr_num_sparse_float_features))
if not isinstance(sparse_float_feature_shapes, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_float_feature_shapes' argument to "
"'gradient_trees_partition_examples' Op, not %r." % sparse_float_feature_shapes)
if len(sparse_float_feature_shapes) != _attr_num_sparse_float_features:
raise ValueError(
"List argument 'sparse_float_feature_shapes' to 'gradient_trees_partition_examples' Op with length %d "
"must match length %d of argument 'sparse_float_feature_indices'." %
(len(sparse_float_feature_shapes), _attr_num_sparse_float_features))
if not isinstance(sparse_int_feature_indices, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_int_feature_indices' argument to "
"'gradient_trees_partition_examples' Op, not %r." % sparse_int_feature_indices)
_attr_num_sparse_int_features = len(sparse_int_feature_indices)
if not isinstance(sparse_int_feature_values, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_int_feature_values' argument to "
"'gradient_trees_partition_examples' Op, not %r." % sparse_int_feature_values)
if len(sparse_int_feature_values) != _attr_num_sparse_int_features:
raise ValueError(
"List argument 'sparse_int_feature_values' to 'gradient_trees_partition_examples' Op with length %d "
"must match length %d of argument 'sparse_int_feature_indices'." %
(len(sparse_int_feature_values), _attr_num_sparse_int_features))
if not isinstance(sparse_int_feature_shapes, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_int_feature_shapes' argument to "
"'gradient_trees_partition_examples' Op, not %r." % sparse_int_feature_shapes)
if len(sparse_int_feature_shapes) != _attr_num_sparse_int_features:
raise ValueError(
"List argument 'sparse_int_feature_shapes' to 'gradient_trees_partition_examples' Op with length %d "
"must match length %d of argument 'sparse_int_feature_indices'." %
(len(sparse_int_feature_shapes), _attr_num_sparse_int_features))
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_, _, _op = _op_def_lib._apply_op_helper(
"GradientTreesPartitionExamples",
tree_ensemble_handle=tree_ensemble_handle,
dense_float_features=dense_float_features,
sparse_float_feature_indices=sparse_float_feature_indices,
sparse_float_feature_values=sparse_float_feature_values,
sparse_float_feature_shapes=sparse_float_feature_shapes,
sparse_int_feature_indices=sparse_int_feature_indices,
sparse_int_feature_values=sparse_int_feature_values,
sparse_int_feature_shapes=sparse_int_feature_shapes,
use_locking=use_locking, name=name)
_result = _op.outputs[:]
_inputs_flat = _op.inputs
_attrs = ("num_dense_float_features",
_op.get_attr("num_dense_float_features"),
"num_sparse_float_features",
_op.get_attr("num_sparse_float_features"),
"num_sparse_int_features",
_op.get_attr("num_sparse_int_features"), "use_locking",
_op.get_attr("use_locking"))
_execute.record_gradient(
"GradientTreesPartitionExamples", _inputs_flat, _attrs, _result, name)
_result, = _result
return _result
else:
try:
_result = _pywrap_tensorflow.TFE_Py_FastPathExecute(
_ctx._context_handle, _ctx._eager_context.device_name,
"GradientTreesPartitionExamples", name,
_ctx._post_execution_callbacks, tree_ensemble_handle,
dense_float_features, sparse_float_feature_indices,
sparse_float_feature_values, sparse_float_feature_shapes,
sparse_int_feature_indices, sparse_int_feature_values,
sparse_int_feature_shapes, "use_locking", use_locking)
return _result
except _core._FallbackException:
return gradient_trees_partition_examples_eager_fallback(
tree_ensemble_handle, dense_float_features,
sparse_float_feature_indices, sparse_float_feature_values,
sparse_float_feature_shapes, sparse_int_feature_indices,
sparse_int_feature_values, sparse_int_feature_shapes,
use_locking=use_locking, name=name, ctx=_ctx)
except _core._NotOkStatusException as e:
if name is not None:
message = e.message + " name: " + name
else:
message = e.message
_six.raise_from(_core._status_to_exception(e.code, message), None)
def gradient_trees_partition_examples_eager_fallback(tree_ensemble_handle, dense_float_features, sparse_float_feature_indices, sparse_float_feature_values, sparse_float_feature_shapes, sparse_int_feature_indices, sparse_int_feature_values, sparse_int_feature_shapes, use_locking=False, name=None, ctx=None):
  r"""Slow-path function for eager mode.

  Fallback implementation for gradient_trees_partition_examples.
  """
_ctx = ctx if ctx else _context.context()
if not isinstance(dense_float_features, (list, tuple)):
raise TypeError(
"Expected list for 'dense_float_features' argument to "
"'gradient_trees_partition_examples' Op, not %r." % dense_float_features)
_attr_num_dense_float_features = len(dense_float_features)
if not isinstance(sparse_float_feature_indices, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_float_feature_indices' argument to "
"'gradient_trees_partition_examples' Op, not %r." % sparse_float_feature_indices)
_attr_num_sparse_float_features = len(sparse_float_feature_indices)
if not isinstance(sparse_float_feature_values, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_float_feature_values' argument to "
"'gradient_trees_partition_examples' Op, not %r." % sparse_float_feature_values)
if len(sparse_float_feature_values) != _attr_num_sparse_float_features:
raise ValueError(
"List argument 'sparse_float_feature_values' to 'gradient_trees_partition_examples' Op with length %d "
"must match length %d of argument 'sparse_float_feature_indices'." %
(len(sparse_float_feature_values), _attr_num_sparse_float_features))
if not isinstance(sparse_float_feature_shapes, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_float_feature_shapes' argument to "
"'gradient_trees_partition_examples' Op, not %r." % sparse_float_feature_shapes)
if len(sparse_float_feature_shapes) != _attr_num_sparse_float_features:
raise ValueError(
"List argument 'sparse_float_feature_shapes' to 'gradient_trees_partition_examples' Op with length %d "
"must match length %d of argument 'sparse_float_feature_indices'." %
(len(sparse_float_feature_shapes), _attr_num_sparse_float_features))
if not isinstance(sparse_int_feature_indices, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_int_feature_indices' argument to "
"'gradient_trees_partition_examples' Op, not %r." % sparse_int_feature_indices)
_attr_num_sparse_int_features = len(sparse_int_feature_indices)
if not isinstance(sparse_int_feature_values, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_int_feature_values' argument to "
"'gradient_trees_partition_examples' Op, not %r." % sparse_int_feature_values)
if len(sparse_int_feature_values) != _attr_num_sparse_int_features:
raise ValueError(
"List argument 'sparse_int_feature_values' to 'gradient_trees_partition_examples' Op with length %d "
"must match length %d of argument 'sparse_int_feature_indices'." %
(len(sparse_int_feature_values), _attr_num_sparse_int_features))
if not isinstance(sparse_int_feature_shapes, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_int_feature_shapes' argument to "
"'gradient_trees_partition_examples' Op, not %r." % sparse_int_feature_shapes)
if len(sparse_int_feature_shapes) != _attr_num_sparse_int_features:
raise ValueError(
"List argument 'sparse_int_feature_shapes' to 'gradient_trees_partition_examples' Op with length %d "
"must match length %d of argument 'sparse_int_feature_indices'." %
(len(sparse_int_feature_shapes), _attr_num_sparse_int_features))
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
tree_ensemble_handle = _ops.convert_to_tensor(tree_ensemble_handle, _dtypes.resource)
dense_float_features = _ops.convert_n_to_tensor(dense_float_features, _dtypes.float32)
sparse_float_feature_indices = _ops.convert_n_to_tensor(sparse_float_feature_indices, _dtypes.int64)
sparse_float_feature_values = _ops.convert_n_to_tensor(sparse_float_feature_values, _dtypes.float32)
sparse_float_feature_shapes = _ops.convert_n_to_tensor(sparse_float_feature_shapes, _dtypes.int64)
sparse_int_feature_indices = _ops.convert_n_to_tensor(sparse_int_feature_indices, _dtypes.int64)
sparse_int_feature_values = _ops.convert_n_to_tensor(sparse_int_feature_values, _dtypes.int64)
sparse_int_feature_shapes = _ops.convert_n_to_tensor(sparse_int_feature_shapes, _dtypes.int64)
_inputs_flat = [tree_ensemble_handle] + list(dense_float_features) + list(sparse_float_feature_indices) + list(sparse_float_feature_values) + list(sparse_float_feature_shapes) + list(sparse_int_feature_indices) + list(sparse_int_feature_values) + list(sparse_int_feature_shapes)
_attrs = ("num_dense_float_features", _attr_num_dense_float_features,
"num_sparse_float_features", _attr_num_sparse_float_features,
"num_sparse_int_features", _attr_num_sparse_int_features, "use_locking",
use_locking)
_result = _execute.execute(b"GradientTreesPartitionExamples", 1,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
_execute.record_gradient(
"GradientTreesPartitionExamples", _inputs_flat, _attrs, _result, name)
_result, = _result
return _result
_ops.RegisterShape("GradientTreesPartitionExamples")(None)
_gradient_trees_prediction_outputs = ["predictions",
"drop_out_tree_indices_weights"]
_GradientTreesPredictionOutput = _collections.namedtuple(
"GradientTreesPrediction", _gradient_trees_prediction_outputs)
@tf_export('gradient_trees_prediction')
def gradient_trees_prediction(tree_ensemble_handle, seed, dense_float_features, sparse_float_feature_indices, sparse_float_feature_values, sparse_float_feature_shapes, sparse_int_feature_indices, sparse_int_feature_values, sparse_int_feature_shapes, learner_config, apply_dropout, apply_averaging, center_bias, reduce_dim, use_locking=False, name=None):
  r"""Runs multiple additive regression forest predictors on input instances
  and computes the final prediction for each class.
Args:
tree_ensemble_handle: A `Tensor` of type `resource`.
The handle to the tree ensemble.
    seed: A `Tensor` of type `int64`. Random seed to be used for dropout.
dense_float_features: A list of `Tensor` objects with type `float32`.
Rank 2 Tensors containing dense float feature values.
sparse_float_feature_indices: A list of `Tensor` objects with type `int64`.
Rank 2 Tensors containing sparse float indices.
sparse_float_feature_values: A list with the same length as `sparse_float_feature_indices` of `Tensor` objects with type `float32`.
Rank 1 Tensors containing sparse float values.
sparse_float_feature_shapes: A list with the same length as `sparse_float_feature_indices` of `Tensor` objects with type `int64`.
Rank 1 Tensors containing sparse float shapes.
sparse_int_feature_indices: A list of `Tensor` objects with type `int64`.
Rank 2 Tensors containing sparse int indices.
sparse_int_feature_values: A list with the same length as `sparse_int_feature_indices` of `Tensor` objects with type `int64`.
Rank 1 Tensors containing sparse int values.
sparse_int_feature_shapes: A list with the same length as `sparse_int_feature_indices` of `Tensor` objects with type `int64`.
Rank 1 Tensors containing sparse int shapes.
    learner_config: A `string`.
      Serialized LearnerConfig proto. For now, the prediction ops use only the
      LearningRateDropoutDrivenConfig config from the learner.
    apply_dropout: A `bool`. Whether to apply dropout during prediction.
    apply_averaging: A `bool`.
      Whether averaging of tree ensembles should take place. If set to true,
      averaging is based on the AveragingConfig from learner_config.
center_bias: A `bool`.
    reduce_dim: A `bool`.
      Whether to reduce the dimension (legacy implementation) or not.
use_locking: An optional `bool`. Defaults to `False`.
Whether to use locking.
name: A name for the operation (optional).
Returns:
A tuple of `Tensor` objects (predictions, drop_out_tree_indices_weights).
predictions: A `Tensor` of type `float32`. Rank 2 Tensor containing predictions per example per class.
    drop_out_tree_indices_weights: A `Tensor` of type `float32`. Rank 2 Tensor
      containing the indices of dropped trees and the original weights of those
      trees during prediction.
"""
_ctx = _context._context
if _ctx is None or not _ctx._eager_context.is_eager:
if not isinstance(dense_float_features, (list, tuple)):
raise TypeError(
"Expected list for 'dense_float_features' argument to "
"'gradient_trees_prediction' Op, not %r." % dense_float_features)
_attr_num_dense_float_features = len(dense_float_features)
if not isinstance(sparse_float_feature_indices, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_float_feature_indices' argument to "
"'gradient_trees_prediction' Op, not %r." % sparse_float_feature_indices)
_attr_num_sparse_float_features = len(sparse_float_feature_indices)
if not isinstance(sparse_float_feature_values, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_float_feature_values' argument to "
"'gradient_trees_prediction' Op, not %r." % sparse_float_feature_values)
if len(sparse_float_feature_values) != _attr_num_sparse_float_features:
raise ValueError(
"List argument 'sparse_float_feature_values' to 'gradient_trees_prediction' Op with length %d "
"must match length %d of argument 'sparse_float_feature_indices'." %
(len(sparse_float_feature_values), _attr_num_sparse_float_features))
if not isinstance(sparse_float_feature_shapes, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_float_feature_shapes' argument to "
"'gradient_trees_prediction' Op, not %r." % sparse_float_feature_shapes)
if len(sparse_float_feature_shapes) != _attr_num_sparse_float_features:
raise ValueError(
"List argument 'sparse_float_feature_shapes' to 'gradient_trees_prediction' Op with length %d "
"must match length %d of argument 'sparse_float_feature_indices'." %
(len(sparse_float_feature_shapes), _attr_num_sparse_float_features))
if not isinstance(sparse_int_feature_indices, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_int_feature_indices' argument to "
"'gradient_trees_prediction' Op, not %r." % sparse_int_feature_indices)
_attr_num_sparse_int_features = len(sparse_int_feature_indices)
if not isinstance(sparse_int_feature_values, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_int_feature_values' argument to "
"'gradient_trees_prediction' Op, not %r." % sparse_int_feature_values)
if len(sparse_int_feature_values) != _attr_num_sparse_int_features:
raise ValueError(
"List argument 'sparse_int_feature_values' to 'gradient_trees_prediction' Op with length %d "
"must match length %d of argument 'sparse_int_feature_indices'." %
(len(sparse_int_feature_values), _attr_num_sparse_int_features))
if not isinstance(sparse_int_feature_shapes, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_int_feature_shapes' argument to "
"'gradient_trees_prediction' Op, not %r." % sparse_int_feature_shapes)
if len(sparse_int_feature_shapes) != _attr_num_sparse_int_features:
raise ValueError(
"List argument 'sparse_int_feature_shapes' to 'gradient_trees_prediction' Op with length %d "
"must match length %d of argument 'sparse_int_feature_indices'." %
(len(sparse_int_feature_shapes), _attr_num_sparse_int_features))
learner_config = _execute.make_str(learner_config, "learner_config")
apply_dropout = _execute.make_bool(apply_dropout, "apply_dropout")
apply_averaging = _execute.make_bool(apply_averaging, "apply_averaging")
center_bias = _execute.make_bool(center_bias, "center_bias")
reduce_dim = _execute.make_bool(reduce_dim, "reduce_dim")
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
_, _, _op = _op_def_lib._apply_op_helper(
"GradientTreesPrediction", tree_ensemble_handle=tree_ensemble_handle,
seed=seed, dense_float_features=dense_float_features,
sparse_float_feature_indices=sparse_float_feature_indices,
sparse_float_feature_values=sparse_float_feature_values,
sparse_float_feature_shapes=sparse_float_feature_shapes,
sparse_int_feature_indices=sparse_int_feature_indices,
sparse_int_feature_values=sparse_int_feature_values,
sparse_int_feature_shapes=sparse_int_feature_shapes,
learner_config=learner_config, apply_dropout=apply_dropout,
apply_averaging=apply_averaging, center_bias=center_bias,
reduce_dim=reduce_dim, use_locking=use_locking, name=name)
_result = _op.outputs[:]
_inputs_flat = _op.inputs
_attrs = ("learner_config", _op.get_attr("learner_config"),
"num_dense_float_features",
_op.get_attr("num_dense_float_features"),
"num_sparse_float_features",
_op.get_attr("num_sparse_float_features"),
"num_sparse_int_features",
_op.get_attr("num_sparse_int_features"), "use_locking",
_op.get_attr("use_locking"), "apply_dropout",
_op.get_attr("apply_dropout"), "apply_averaging",
_op.get_attr("apply_averaging"), "center_bias",
_op.get_attr("center_bias"), "reduce_dim",
_op.get_attr("reduce_dim"))
_execute.record_gradient(
"GradientTreesPrediction", _inputs_flat, _attrs, _result, name)
_result = _GradientTreesPredictionOutput._make(_result)
return _result
else:
try:
_result = _pywrap_tensorflow.TFE_Py_FastPathExecute(
_ctx._context_handle, _ctx._eager_context.device_name,
"GradientTreesPrediction", name, _ctx._post_execution_callbacks,
tree_ensemble_handle, seed, dense_float_features,
sparse_float_feature_indices, sparse_float_feature_values,
sparse_float_feature_shapes, sparse_int_feature_indices,
sparse_int_feature_values, sparse_int_feature_shapes,
"learner_config", learner_config, "use_locking", use_locking,
"apply_dropout", apply_dropout, "apply_averaging", apply_averaging,
"center_bias", center_bias, "reduce_dim", reduce_dim)
_result = _GradientTreesPredictionOutput._make(_result)
return _result
except _core._FallbackException:
return gradient_trees_prediction_eager_fallback(
tree_ensemble_handle, seed, dense_float_features,
sparse_float_feature_indices, sparse_float_feature_values,
sparse_float_feature_shapes, sparse_int_feature_indices,
sparse_int_feature_values, sparse_int_feature_shapes,
learner_config=learner_config, use_locking=use_locking,
apply_dropout=apply_dropout, apply_averaging=apply_averaging,
center_bias=center_bias, reduce_dim=reduce_dim, name=name, ctx=_ctx)
except _core._NotOkStatusException as e:
if name is not None:
message = e.message + " name: " + name
else:
message = e.message
_six.raise_from(_core._status_to_exception(e.code, message), None)
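Unlike the single-output partition op, `GradientTreesPrediction` returns two tensors, which the wrapper packs into the `_GradientTreesPredictionOutput` namedtuple via `_make` so callers can use field access instead of positional indexing. A small sketch of that packing pattern, using placeholder values in place of real op outputs:

```python
import collections

# Same field names as _gradient_trees_prediction_outputs above; the values
# below are placeholders standing in for the op's two output tensors.
GradientTreesPredictionOutput = collections.namedtuple(
    "GradientTreesPrediction",
    ["predictions", "drop_out_tree_indices_weights"])

raw_result = [[0.1, 0.9], [[0, 0.5]]]  # flat list, one entry per op output
result = GradientTreesPredictionOutput._make(raw_result)

# Callers can now write result.predictions rather than result[0].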
def gradient_trees_prediction_eager_fallback(tree_ensemble_handle, seed, dense_float_features, sparse_float_feature_indices, sparse_float_feature_values, sparse_float_feature_shapes, sparse_int_feature_indices, sparse_int_feature_values, sparse_int_feature_shapes, learner_config, apply_dropout, apply_averaging, center_bias, reduce_dim, use_locking=False, name=None, ctx=None):
  r"""Slow-path function for eager mode.

  Fallback implementation for gradient_trees_prediction.
  """
_ctx = ctx if ctx else _context.context()
if not isinstance(dense_float_features, (list, tuple)):
raise TypeError(
"Expected list for 'dense_float_features' argument to "
"'gradient_trees_prediction' Op, not %r." % dense_float_features)
_attr_num_dense_float_features = len(dense_float_features)
if not isinstance(sparse_float_feature_indices, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_float_feature_indices' argument to "
"'gradient_trees_prediction' Op, not %r." % sparse_float_feature_indices)
_attr_num_sparse_float_features = len(sparse_float_feature_indices)
if not isinstance(sparse_float_feature_values, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_float_feature_values' argument to "
"'gradient_trees_prediction' Op, not %r." % sparse_float_feature_values)
if len(sparse_float_feature_values) != _attr_num_sparse_float_features:
raise ValueError(
"List argument 'sparse_float_feature_values' to 'gradient_trees_prediction' Op with length %d "
"must match length %d of argument 'sparse_float_feature_indices'." %
(len(sparse_float_feature_values), _attr_num_sparse_float_features))
if not isinstance(sparse_float_feature_shapes, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_float_feature_shapes' argument to "
"'gradient_trees_prediction' Op, not %r." % sparse_float_feature_shapes)
if len(sparse_float_feature_shapes) != _attr_num_sparse_float_features:
raise ValueError(
"List argument 'sparse_float_feature_shapes' to 'gradient_trees_prediction' Op with length %d "
"must match length %d of argument 'sparse_float_feature_indices'." %
(len(sparse_float_feature_shapes), _attr_num_sparse_float_features))
if not isinstance(sparse_int_feature_indices, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_int_feature_indices' argument to "
"'gradient_trees_prediction' Op, not %r." % sparse_int_feature_indices)
_attr_num_sparse_int_features = len(sparse_int_feature_indices)
if not isinstance(sparse_int_feature_values, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_int_feature_values' argument to "
"'gradient_trees_prediction' Op, not %r." % sparse_int_feature_values)
if len(sparse_int_feature_values) != _attr_num_sparse_int_features:
raise ValueError(
"List argument 'sparse_int_feature_values' to 'gradient_trees_prediction' Op with length %d "
"must match length %d of argument 'sparse_int_feature_indices'." %
(len(sparse_int_feature_values), _attr_num_sparse_int_features))
if not isinstance(sparse_int_feature_shapes, (list, tuple)):
raise TypeError(
"Expected list for 'sparse_int_feature_shapes' argument to "
"'gradient_trees_prediction' Op, not %r." % sparse_int_feature_shapes)
if len(sparse_int_feature_shapes) != _attr_num_sparse_int_features:
raise ValueError(
"List argument 'sparse_int_feature_shapes' to 'gradient_trees_prediction' Op with length %d "
"must match length %d of argument 'sparse_int_feature_indices'." %
(len(sparse_int_feature_shapes), _attr_num_sparse_int_features))
learner_config = _execute.make_str(learner_config, "learner_config")
apply_dropout = _execute.make_bool(apply_dropout, "apply_dropout")
apply_averaging = _execute.make_bool(apply_averaging, "apply_averaging")
center_bias = _execute.make_bool(center_bias, "center_bias")
reduce_dim = _execute.make_bool(reduce_dim, "reduce_dim")
if use_locking is None:
use_locking = False
use_locking = _execute.make_bool(use_locking, "use_locking")
tree_ensemble_handle = _ops.convert_to_tensor(tree_ensemble_handle, _dtypes.resource)
seed = _ops.convert_to_tensor(seed, _dtypes.int64)
dense_float_features = _ops.convert_n_to_tensor(dense_float_features, _dtypes.float32)
sparse_float_feature_indices = _ops.convert_n_to_tensor(sparse_float_feature_indices, _dtypes.int64)
sparse_float_feature_values = _ops.convert_n_to_tensor(sparse_float_feature_values, _dtypes.float32)
sparse_float_feature_shapes = _ops.convert_n_to_tensor(sparse_float_feature_shapes, _dtypes.int64)
sparse_int_feature_indices = _ops.convert_n_to_tensor(sparse_int_feature_indices, _dtypes.int64)
sparse_int_feature_values = _ops.convert_n_to_tensor(sparse_int_feature_values, _dtypes.int64)
sparse_int_feature_shapes = _ops.convert_n_to_tensor(sparse_int_feature_shapes, _dtypes.int64)
_inputs_flat = [tree_ensemble_handle, seed] + list(dense_float_features) + list(sparse_float_feature_indices) + list(sparse_float_feature_values) + list(sparse_float_feature_shapes) + list(sparse_int_feature_indices) + list(sparse_int_feature_values) + list(sparse_int_feature_shapes)
_attrs = ("learner_config", learner_config, "num_dense_float_features",
_attr_num_dense_float_features, "num_sparse_float_features",
_attr_num_sparse_float_features, "num_sparse_int_features",
_attr_num_sparse_int_features, "use_locking", use_locking, "apply_dropout",
apply_dropout, "apply_averaging", apply_averaging, "center_bias",
center_bias, "reduce_dim", reduce_dim)
_result = _execute.execute(b"GradientTreesPrediction", 2,
inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,
name=name)
_execute.record_gradient(
"GradientTreesPrediction", _inputs_flat, _attrs, _result, name)
_result = _GradientTreesPredictionOutput._make(_result)
return _result
_ops.RegisterShape("GradientTreesPrediction")(None)
def _InitOpDefLibrary(op_list_proto_bytes):
op_list = _op_def_pb2.OpList()
op_list.ParseFromString(op_list_proto_bytes)
_op_def_registry.register_op_list(op_list)
op_def_lib = _op_def_library.OpDefLibrary()
op_def_lib.add_op_list(op_list)
return op_def_lib
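`_InitOpDefLibrary` parses the serialized `OpList` proto at module load time, registers every op definition, and returns a library handle the wrappers use to build graph nodes. A simplified sketch of that register-then-lookup flow, using a plain dict in place of TensorFlow's protobuf-backed registry (all names here are hypothetical):

```python
class ToyOpDefRegistry:
    """Toy stand-in for TensorFlow's op-def registry."""

    def __init__(self):
        self._ops = {}

    def register(self, op_name, op_def):
        # Real TF validates and rejects duplicate registrations; omitted here.
        self._ops[op_name] = op_def

    def lookup(self, op_name):
        return self._ops[op_name]


def init_op_def_library(op_defs):
    """Analogous to _InitOpDefLibrary: register every op, return a handle."""
    registry = ToyOpDefRegistry()
    for name, op_def in op_defs.items():
        registry.register(name, op_def)
    return registry
```

In the generated module the op definitions arrive as one serialized `OpList` byte string (see the call at the bottom of the file) rather than a dict, but the lifecycle is the same: parse once at import, register, and reuse for every wrapper call.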
# op {
# name: "GradientTreesPartitionExamples"
# input_arg {
# name: "tree_ensemble_handle"
# type: DT_RESOURCE
# }
# input_arg {
# name: "dense_float_features"
# type: DT_FLOAT
# number_attr: "num_dense_float_features"
# }
# input_arg {
# name: "sparse_float_feature_indices"
# type: DT_INT64
# number_attr: "num_sparse_float_features"
# }
# input_arg {
# name: "sparse_float_feature_values"
# type: DT_FLOAT
# number_attr: "num_sparse_float_features"
# }
# input_arg {
# name: "sparse_float_feature_shapes"
# type: DT_INT64
# number_attr: "num_sparse_float_features"
# }
# input_arg {
# name: "sparse_int_feature_indices"
# type: DT_INT64
# number_attr: "num_sparse_int_features"
# }
# input_arg {
# name: "sparse_int_feature_values"
# type: DT_INT64
# number_attr: "num_sparse_int_features"
# }
# input_arg {
# name: "sparse_int_feature_shapes"
# type: DT_INT64
# number_attr: "num_sparse_int_features"
# }
# output_arg {
# name: "partition_ids"
# type: DT_INT32
# }
# attr {
# name: "num_dense_float_features"
# type: "int"
# has_minimum: true
# }
# attr {
# name: "num_sparse_float_features"
# type: "int"
# has_minimum: true
# }
# attr {
# name: "num_sparse_int_features"
# type: "int"
# has_minimum: true
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# is_stateful: true
# }
# op {
# name: "GradientTreesPrediction"
# input_arg {
# name: "tree_ensemble_handle"
# type: DT_RESOURCE
# }
# input_arg {
# name: "seed"
# type: DT_INT64
# }
# input_arg {
# name: "dense_float_features"
# type: DT_FLOAT
# number_attr: "num_dense_float_features"
# }
# input_arg {
# name: "sparse_float_feature_indices"
# type: DT_INT64
# number_attr: "num_sparse_float_features"
# }
# input_arg {
# name: "sparse_float_feature_values"
# type: DT_FLOAT
# number_attr: "num_sparse_float_features"
# }
# input_arg {
# name: "sparse_float_feature_shapes"
# type: DT_INT64
# number_attr: "num_sparse_float_features"
# }
# input_arg {
# name: "sparse_int_feature_indices"
# type: DT_INT64
# number_attr: "num_sparse_int_features"
# }
# input_arg {
# name: "sparse_int_feature_values"
# type: DT_INT64
# number_attr: "num_sparse_int_features"
# }
# input_arg {
# name: "sparse_int_feature_shapes"
# type: DT_INT64
# number_attr: "num_sparse_int_features"
# }
# output_arg {
# name: "predictions"
# type: DT_FLOAT
# }
# output_arg {
# name: "drop_out_tree_indices_weights"
# type: DT_FLOAT
# }
# attr {
# name: "learner_config"
# type: "string"
# }
# attr {
# name: "num_dense_float_features"
# type: "int"
# has_minimum: true
# }
# attr {
# name: "num_sparse_float_features"
# type: "int"
# has_minimum: true
# }
# attr {
# name: "num_sparse_int_features"
# type: "int"
# has_minimum: true
# }
# attr {
# name: "use_locking"
# type: "bool"
# default_value {
# b: false
# }
# }
# attr {
# name: "apply_dropout"
# type: "bool"
# }
# attr {
# name: "apply_averaging"
# type: "bool"
# }
# attr {
# name: "center_bias"
# type: "bool"
# }
# attr {
# name: "reduce_dim"
# type: "bool"
# }
# is_stateful: true
# }
_op_def_lib = _InitOpDefLibrary(b"\n\344\004\n\036GradientTreesPartitionExamples\022\030\n\024tree_ensemble_handle\030\024\0222\n\024dense_float_features\030\001*\030num_dense_float_features\022;\n\034sparse_float_feature_indices\030\t*\031num_sparse_float_features\022:\n\033sparse_float_feature_values\030\001*\031num_sparse_float_features\022:\n\033sparse_float_feature_shapes\030\t*\031num_sparse_float_features\0227\n\032sparse_int_feature_indices\030\t*\027num_sparse_int_features\0226\n\031sparse_int_feature_values\030\t*\027num_sparse_int_features\0226\n\031sparse_int_feature_shapes\030\t*\027num_sparse_int_features\032\021\n\rpartition_ids\030\003\"!\n\030num_dense_float_features\022\003int(\001\"\"\n\031num_sparse_float_features\022\003int(\001\" \n\027num_sparse_int_features\022\003int(\001\"\027\n\013use_locking\022\004bool\032\002(\000\210\001\001\n\373\005\n\027GradientTreesPrediction\022\030\n\024tree_ensemble_handle\030\024\022\010\n\004seed\030\t\0222\n\024dense_float_features\030\001*\030num_dense_float_features\022;\n\034sparse_float_feature_indices\030\t*\031num_sparse_float_features\022:\n\033sparse_float_feature_values\030\001*\031num_sparse_float_features\022:\n\033sparse_float_feature_shapes\030\t*\031num_sparse_float_features\0227\n\032sparse_int_feature_indices\030\t*\027num_sparse_int_features\0226\n\031sparse_int_feature_values\030\t*\027num_sparse_int_features\0226\n\031sparse_int_feature_shapes\030\t*\027num_sparse_int_features\032\017\n\013predictions\030\001\032!\n\035drop_out_tree_indices_weights\030\001\"\030\n\016learner_config\022\006string\"!\n\030num_dense_float_features\022\003int(\001\"\"\n\031num_sparse_float_features\022\003int(\001\" \n\027num_sparse_int_features\022\003int(\001\"\027\n\013use_locking\022\004bool\032\002(\000\"\025\n\rapply_dropout\022\004bool\"\027\n\017apply_averaging\022\004bool\"\023\n\013center_bias\022\004bool\"\022\n\nreduce_dim\022\004bool\210\001\001")
86b7c2f5e53cb69e34955acf9d0f4d02412f3c4c | 39,297 | py | Python | tests/core/test_WindowLooper.py | gilbertohasnofb/auxjad | 553b7fe97221b6f378a93ade6262f024e3cbc678 | [
"MIT"
] | 6 | 2020-05-18T09:28:29.000Z | 2021-12-22T00:40:54.000Z | tests/core/test_WindowLooper.py | gilbertohasnofb/auxjad | 553b7fe97221b6f378a93ade6262f024e3cbc678 | [
"MIT"
] | 1 | 2021-04-21T20:29:38.000Z | 2021-04-22T19:44:54.000Z | tests/core/test_WindowLooper.py | gilbertohasnofb/auxjad | 553b7fe97221b6f378a93ade6262f024e3cbc678 | [
"MIT"
] | 1 | 2021-04-21T18:54:46.000Z | 2021-04-21T18:54:46.000Z | import random
import abjad
import pytest
import auxjad
def test_WindowLooper_01():
container = abjad.Container(r"c'4 d'2 e'4 f'2 ~ f'8 g'4.")
looper = auxjad.WindowLooper(container,
window_size=(4, 4),
step_size=(1, 16),
)
assert abjad.lilypond(looper) == abjad.String.normalize(
r"""
{
c'4
d'2
e'4
f'2
~
f'8
g'4.
}
"""
)
notes = looper()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 4/4
c'4
d'2
e'4
}
"""
)
notes = looper()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 4/4
c'8.
d'16
~
d'4
~
d'8.
e'16
~
e'8.
f'16
}
"""
)
notes = looper.current_window
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 4/4
c'8.
d'16
~
d'4
~
d'8.
e'16
~
e'8.
f'16
}
"""
)
def test_WindowLooper_02():
container = abjad.Container(r"c'4 d'2 e'4 f'2 ~ f'8 g'4.")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 8),
)
notes = looper()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'4
d'2
}
"""
)
notes = looper()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'8
d'8
~
d'4.
e'8
}
"""
)
def test_WindowLooper_03():
container = abjad.Container(r"c'4 d'2 e'4")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 8),
)
notes = looper.__next__()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'4
d'2
}
"""
)
notes = looper.__next__()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'8
d'8
~
d'4.
e'8
}
"""
)
notes = looper.__next__()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
d'2
e'4
}
"""
)
notes = looper.__next__()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
d'4.
e'4
r8
}
"""
)
notes = looper.__next__()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
d'4
e'4
r4
}
"""
)
notes = looper.__next__()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
d'8
e'4
r4.
}
"""
)
notes = looper.__next__()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
e'4
r2
}
"""
)
notes = looper.__next__()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
e'8
r8
r2
}
"""
)
with pytest.raises(StopIteration):
notes = looper.__next__() # noqa: F841
def test_WindowLooper_04():
container = abjad.Container(r"c'4 d'2 e'4 f'2 ~ f'8 g'4.")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(5, 8),
max_steps=2,
repetition_chance=0.25,
forward_bias=0.2,
head_position=(2, 8),
omit_time_signatures=False,
fill_with_rests=False,
boundary_depth=0,
maximum_dot_count=1,
rewrite_tuplets=False,
process_on_first_call=True,
after_rest=(1, 8),
after_rest_in_new_measure=True,
use_multimeasure_rests=False,
)
assert looper.window_size == abjad.Meter((3, 4))
assert looper.step_size == abjad.Duration((5, 8))
assert looper.max_steps == 2
assert looper.repetition_chance == 0.25
assert looper.forward_bias == 0.2
assert looper.head_position == abjad.Duration((1, 4))
assert not looper.omit_time_signatures
assert not looper.fill_with_rests
assert looper.boundary_depth == 0
assert looper.maximum_dot_count == 1
assert not looper.rewrite_tuplets
assert looper.process_on_first_call
assert looper.after_rest == abjad.Duration((1, 8))
assert looper.after_rest_in_new_measure
assert not looper.use_multimeasure_rests
looper.window_size = (5, 4)
looper.step_size = (1, 4)
looper.max_steps = 3
looper.repetition_chance = 0.1
looper.forward_bias = 0.8
looper.head_position = 0
looper.omit_time_signatures = True
looper.fill_with_rests = True
looper.boundary_depth = 1
looper.maximum_dot_count = 2
looper.rewrite_tuplets = True
looper.process_on_first_call = False
looper.after_rest = 0
looper.after_rest_in_new_measure = False
looper.use_multimeasure_rests = True
assert looper.window_size == abjad.Meter((5, 4))
assert looper.step_size == abjad.Duration((1, 4))
assert looper.max_steps == 3
assert looper.repetition_chance == 0.1
assert looper.forward_bias == 0.8
assert looper.head_position == abjad.Duration(0)
assert looper.omit_time_signatures
assert looper.fill_with_rests
assert looper.boundary_depth == 1
assert looper.maximum_dot_count == 2
assert looper.rewrite_tuplets
assert not looper.process_on_first_call
assert looper.after_rest == abjad.Duration(0)
assert not looper.after_rest_in_new_measure
assert looper.use_multimeasure_rests
def test_WindowLooper_05():
container = abjad.Container(r"c'4 d'4 e'4 f'4")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 4),
)
notes = looper.output_all()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'4
d'4
e'4
d'4
e'4
f'4
e'4
f'4
r4
f'4
r2
}
"""
)
def test_WindowLooper_06():
container = abjad.Container(r"c'4 d'2 e'4 f'2 ~ f'8 g'4.")
looper = auxjad.WindowLooper(container,
window_size=(4, 4),
step_size=(1, 16),
)
notes = looper.__next__()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 4/4
c'4
d'2
e'4
}
"""
)
notes = looper.__next__()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 4/4
c'8.
d'16
~
d'4
~
d'8.
e'16
~
e'8.
f'16
}
"""
)
notes = looper.__next__()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 4/4
c'8
d'8
~
d'4
~
d'8
e'4
f'8
}
"""
)
looper.window_size = (3, 8)
notes = looper.__next__()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/8
c'16
d'16
~
d'4
}
"""
)
notes = looper.__next__()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/8
d'4.
}
"""
)
notes = looper.__next__()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/8
d'4.
}
"""
)
def test_WindowLooper_07():
container = abjad.Container(r"\times 2/3 {c'8 d'8 e'8} d'2.")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 16),
)
notes = looper.output_n(3)
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\times 2/3
{
\time 3/4
c'8
d'8
e'8
}
d'2
\times 2/3
{
c'32
d'16
~
d'16
e'8
}
d'16
~
d'2
\times 2/3
{
d'16
e'8
}
d'8
~
d'2
}
"""
)
def test_WindowLooper_08():
container = abjad.Container(r"c'4 d'2 e'4 f'2 ~ f'8 g'4.")
looper = auxjad.WindowLooper(container,
window_size=(4, 4),
step_size=(1, 16),
omit_time_signatures=True,
)
notes = looper()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
c'4
d'2
e'4
}
"""
)
def test_WindowLooper_09():
wrong_type_input = 'foobar'
container = abjad.Container(r"c'4 d'2 e'4 f'2 ~ f'8 g'4.")
with pytest.raises(TypeError):
looper = auxjad.WindowLooper(wrong_type_input, # noqa: F841
window_size=(4, 4),
step_size=(1, 16),
)
looper = auxjad.WindowLooper(container, # noqa: F841
window_size=17j,
step_size=(1, 16),
)
looper = auxjad.WindowLooper(container, # noqa: F841
window_size=(4, 4),
step_size=True,
)
looper = auxjad.WindowLooper(container, # noqa: F841
window_size=(4, 4),
step_size=(1, 16),
max_steps='foo',
)
looper = auxjad.WindowLooper(container, # noqa: F841
window_size=(4, 4),
step_size=(1, 16),
repetition_chance='bar',
)
looper = auxjad.WindowLooper(container, # noqa: F841
window_size=(4, 4),
step_size=(1, 16),
forward_bias=False,
)
looper = auxjad.WindowLooper(container, # noqa: F841
window_size=(4, 4),
step_size=(1, 16),
head_position=62.3j,
)
looper = auxjad.WindowLooper(container, # noqa: F841
window_size=(4, 4),
step_size=(1, 16),
omit_time_signatures='xyz',
)
looper = auxjad.WindowLooper(container, # noqa: F841
window_size=(4, 4),
step_size=(1, 16),
fill_with_rests=1.2,
)
with pytest.raises(ValueError):
looper = auxjad.WindowLooper(container, # noqa: F841
window_size=(100, 1),
step_size=(1, 16),
)
looper = auxjad.WindowLooper(container, # noqa: F841
window_size=(4, 4),
step_size=(1, 16),
max_steps=-1,
)
looper = auxjad.WindowLooper(container, # noqa: F841
window_size=(4, 4),
step_size=(1, 16),
repetition_chance=-0.3,
)
looper = auxjad.WindowLooper(container, # noqa: F841
window_size=(4, 4),
step_size=(1, 16),
repetition_chance=1.4,
)
looper = auxjad.WindowLooper(container, # noqa: F841
window_size=(4, 4),
step_size=(1, 16),
forward_bias=-0.3,
)
looper = auxjad.WindowLooper(container, # noqa: F841
window_size=(4, 4),
step_size=(1, 16),
forward_bias=1.4,
)
looper = auxjad.WindowLooper(container, # noqa: F841
window_size=(4, 4),
step_size=(1, 16),
head_position=(100, 1),
)
def test_WindowLooper_10():
container = abjad.Container(r"c'4 e'2 d'2 f'4")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 4),
)
notes = looper.output_all(tie_identical_pitches=True)
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'4
e'2
~
e'2
d'4
e'4
d'2
~
d'2
f'4
d'4
f'4
r4
f'4
r2
}
"""
)
def test_WindowLooper_11():
container = abjad.Container(r"c'4 <e' f' g'>2 r4 f'2.")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 4),
)
notes = looper.output_all(tie_identical_pitches=True)
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'4
<e' f' g'>2
~
<e' f' g'>2
r4
<e' f' g'>4
r4
f'4
r4
f'2
~
f'2.
~
f'2
r4
f'4
r2
}
"""
)
def test_WindowLooper_12():
container = abjad.Container(r"c'4 d'4 e'4 f'4")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 4),
)
notes = looper.output_n(2)
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'4
d'4
e'4
d'4
e'4
f'4
}
"""
)
def test_WindowLooper_13():
container = abjad.Container(r"c'4 d'2 e'4 f'4")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 4),
)
notes = looper.output_n(2, tie_identical_pitches=True)
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'4
d'2
~
d'2
e'4
}
"""
)
def test_WindowLooper_14():
container = abjad.Container(r"c'4 d'4 e'4 f'4")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 4),
)
with pytest.raises(RuntimeError):
notes = looper.output_n(100) # noqa: F841
def test_WindowLooper_15():
container = abjad.Container(r"c'4 d'4 e'4 f'4 g'4 a'4")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 4),
head_position=(3, 4),
forward_bias=0.0,
)
notes = looper.output_n(3)
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
f'4
g'4
a'4
e'4
f'4
g'4
d'4
e'4
f'4
}
"""
)
def test_WindowLooper_16():
container = abjad.Container(r"c'4 d'4 e'4 f'4 g'4 a'4")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 4),
head_position=(3, 4),
forward_bias=0.0,
)
notes = looper.output_all()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
f'4
g'4
a'4
e'4
f'4
g'4
d'4
e'4
f'4
c'4
d'4
e'4
}
"""
)
def test_WindowLooper_17():
container = abjad.Container(r"c'4 d'4 e'4 f'4 g'4 a'4")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 4),
forward_bias=0.0,
)
notes = looper.output_all()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'4
d'4
e'4
}
"""
)
def test_WindowLooper_18():
container = abjad.Container(r"c'4 d'2 e'4 f'2 ~ f'8 g'4.")
looper = auxjad.WindowLooper(container,
window_size=(4, 4),
step_size=(1, 16),
process_on_first_call=True,
)
notes = looper()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 4/4
c'8.
d'16
~
d'4
~
d'8.
e'16
~
e'8.
f'16
}
"""
)
def test_WindowLooper_19():
container = abjad.Container(r"c'4-.\p\< d'2--\f e'4->\ppp f'2 ~ f'8")
looper = auxjad.WindowLooper(container,
window_size=(4, 4),
step_size=(1, 16),
)
notes = looper.output_n(2)
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 4/4
c'4
\p
- \staccato
\<
d'2
\f
- \tenuto
e'4
\ppp
- \accent
c'8.
\p
- \staccato
\<
d'16
\f
- \tenuto
~
d'4
~
d'8.
e'16
\ppp
- \accent
~
e'8.
f'16
}
"""
)
def test_WindowLooper_20():
container = abjad.Container(r"c'4 d'2 e'4 f'2 ~ f'8 g'4.")
looper = auxjad.WindowLooper(container,
window_size=(4, 4),
step_size=(1, 16),
)
notes = looper()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 4/4
c'4
d'2
e'4
}
"""
)
notes = looper()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 4/4
c'8.
d'16
~
d'4
~
d'8.
e'16
~
e'8.
f'16
}
"""
)
looper.contents = abjad.Container(r"c'16 d'16 e'16 f'16 g'2. | a'1")
notes = looper()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 4/4
d'16
e'16
f'16
g'16
~
g'2
~
g'8.
a'16
}
"""
)
looper.head_position = 0
notes = looper()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 4/4
c'16
d'16
e'16
f'16
g'2.
}
"""
)
def test_WindowLooper_21():
container = abjad.Container(r"c'1")
looper = auxjad.WindowLooper(container,
window_size=(4, 4),
step_size=(1, 16),
)
assert len(looper) == 16
container = abjad.Container(r"c'1")
looper = auxjad.WindowLooper(container,
window_size=(4, 4),
step_size=(1, 4),
)
assert len(looper) == 4
container = abjad.Container(r"c'2..")
looper = auxjad.WindowLooper(container,
window_size=(2, 4),
step_size=(1, 4),
)
assert len(looper) == 4
def test_WindowLooper_22():
container = abjad.Container(r"c'4 d'4 e'4 f'4")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 4),
fill_with_rests=False,
)
notes = looper.output_n(2)
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'4
d'4
e'4
d'4
e'4
f'4
}
"""
)
def test_WindowLooper_23():
random.seed(43271)
container = abjad.Container(r"c'4 d'4 e'4 f'4 g'4 a'4 b'4 c''4")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 4),
head_position=(3, 4),
forward_bias=0.5,
)
notes = looper.output_n(5)
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
f'4
g'4
a'4
e'4
f'4
g'4
d'4
e'4
f'4
e'4
f'4
g'4
d'4
e'4
f'4
}
"""
)
def test_WindowLooper_24():
random.seed(81723)
container = abjad.Container(
r"c'4 d'4 e'4 f'4 g'4 a'4 b'4 c''4 d''4 e''4 f''4"
)
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 4),
max_steps=4,
)
notes = looper.output_n(4)
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'4
d'4
e'4
e'4
f'4
g'4
f'4
g'4
a'4
c''4
d''4
e''4
}
"""
)
def test_WindowLooper_25():
container = abjad.Container(r"c'4. d'8 e'2")
looper = auxjad.WindowLooper(container,
window_size=(4, 4),
step_size=(1, 16),
)
notes = looper()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 4/4
c'4.
d'8
e'2
}
"""
)
looper = auxjad.WindowLooper(container,
window_size=(4, 4),
step_size=(1, 16),
boundary_depth=1,
)
notes = looper()
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 4/4
c'4
~
c'8
d'8
e'2
}
"""
)
def test_WindowLooper_26():
container = abjad.Container(r"c'4 d'2 e'4")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 8),
)
staff = abjad.Staff()
for window in looper:
staff.append(window)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'4
d'2
\time 3/4
c'8
d'8
~
d'4.
e'8
\time 3/4
d'2
e'4
\time 3/4
d'4.
e'4
r8
\time 3/4
d'4
e'4
r4
\time 3/4
d'8
e'4
r4.
\time 3/4
e'4
r2
\time 3/4
e'8
r8
r2
}
"""
)
auxjad.mutate.remove_repeated_time_signatures(staff[:])
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'4
d'2
c'8
d'8
~
d'4.
e'8
d'2
e'4
d'4.
e'4
r8
d'4
e'4
r4
d'8
e'4
r4.
e'4
r2
e'8
r8
r2
}
"""
)
def test_WindowLooper_27():
container = abjad.Container(r"c'4 d'4 e'4 f'4")
looper = auxjad.WindowLooper(container,
window_size=(4, 4),
step_size=(1, 16),
)
assert isinstance(looper(), abjad.Selection)
tuplet = abjad.Tuplet('3:2', r"c'2 d'2 e'2")
looper = auxjad.WindowLooper(tuplet,
window_size=(4, 4),
step_size=(1, 16),
)
assert isinstance(looper(), abjad.Selection)
voice = abjad.Voice(r"c'4 d'4 e'4 f'4")
looper = auxjad.WindowLooper(voice,
window_size=(4, 4),
step_size=(1, 16),
)
assert isinstance(looper(), abjad.Selection)
staff = abjad.Staff(r"c'4 d'4 e'4 f'4")
looper = auxjad.WindowLooper(staff,
window_size=(4, 4),
step_size=(1, 16),
)
assert isinstance(looper(), abjad.Selection)
score = abjad.Score([abjad.Staff(r"c'4 d'4 e'4 f'4")])
looper = auxjad.WindowLooper(score,
window_size=(4, 4),
step_size=(1, 16),
)
assert isinstance(looper(), abjad.Selection)
voice = abjad.Voice(r"c'4 d'4 e'4 f'4")
staff = abjad.Staff([voice])
looper = auxjad.WindowLooper(staff,
window_size=(4, 4),
step_size=(1, 16),
)
assert isinstance(looper(), abjad.Selection)
staff = abjad.Staff(r"c'4 d'4 e'4 f'4")
score = abjad.Score([staff])
looper = auxjad.WindowLooper(score,
window_size=(4, 4),
step_size=(1, 16),
)
assert isinstance(looper(), abjad.Selection)
voice1 = abjad.Voice(r"c'4 d'4 e'4 f'4")
voice2 = abjad.Voice(r"g2 f2")
staff = abjad.Staff([voice1, voice2], simultaneous=True)
with pytest.raises(ValueError):
looper = auxjad.WindowLooper(staff, # noqa: F841
window_size=(4, 4),
step_size=(1, 16),
)
staff1 = abjad.Staff(r"c'4 d'4 e'4 f'4")
staff2 = abjad.Staff(r"g2 f2")
score = abjad.Score([staff1, staff2])
with pytest.raises(ValueError):
looper = auxjad.WindowLooper(score, # noqa: F841
window_size=(4, 4),
step_size=(1, 16),
)
def test_WindowLooper_28():
container = abjad.Container(r"c'4.\p( d'8 e'8\f) f'4.\p( ~ f'4 g'1\pp)")
looper = auxjad.WindowLooper(container,
window_size=(4, 4),
step_size=(1, 4),
)
notes = looper.output_n(6)
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 4/4
c'4.
\p
(
d'8
e'8
\f
)
f'4.
\p
c'8
(
d'8
e'8
\f
)
f'8
\p
~
f'2
e'8
\f
f'8
\p
(
~
f'2
g'4
\pp
)
f'2
\p
(
g'2
\pp
)
f'4
\p
(
g'2.
\pp
)
g'1
}
"""
)
def test_WindowLooper_29():
container = abjad.Container(r"c'4 d'4 e'4 f'4 g'4")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 16),
)
notes = looper.output_n(2)
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'4
d'4
e'4
c'8.
d'16
~
d'8.
e'16
~
e'8.
f'16
}
"""
)
container = abjad.Container(r"c'4 d'4 e'4 f'4 g'4")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 16),
disable_rewrite_meter=True,
)
notes = looper.output_n(2)
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'4
d'4
e'4
c'8.
d'4
e'4
f'16
}
"""
)
def test_WindowLooper_30():
container = abjad.Container(r"c'4 d'2 e'4 f'2 ~ f'8 g'4.")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 16),
after_rest=(1, 4),
)
notes = looper.output_n(3)
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 4/4
c'4
d'2
r4
c'8.
d'16
~
d'4
~
d'8.
e'16
r4
c'8
d'8
~
d'4
~
d'8
e'8
r4
}
"""
)
def test_WindowLooper_31():
container = abjad.Container(r"c'4 d'2 e'4 f'2 ~ f'8 g'4.")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 16),
after_rest=(1, 4),
after_rest_in_new_measure=True,
)
notes = looper.output_n(3)
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'4
d'2
\time 1/4
R1 * 1/4
\time 3/4
c'8.
d'16
~
d'4..
e'16
\time 1/4
R1 * 1/4
\time 3/4
c'8
d'8
~
d'4.
e'8
\time 1/4
R1 * 1/4
}
"""
)
def test_WindowLooper_32():
container = abjad.Container(r"c'4 d'2 e'4 f'2 ~ f'8 g'4.")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 16),
after_rest=(1, 4),
after_rest_in_new_measure=True,
use_multimeasure_rests=False,
)
notes = looper.output_n(3)
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
\time 3/4
c'4
d'2
\time 1/4
r4
\time 3/4
c'8.
d'16
~
d'4..
e'16
\time 1/4
r4
\time 3/4
c'8
d'8
~
d'4.
e'8
\time 1/4
r4
}
"""
)
def test_WindowLooper_33():
container = abjad.Container(r"c'4 d'2 e'4 f'2 ~ f'8 g'4.")
looper = auxjad.WindowLooper(container,
window_size=(3, 4),
step_size=(1, 16),
after_rest=(1, 4),
after_rest_in_new_measure=True,
use_multimeasure_rests=True,
omit_time_signatures=True,
)
notes = looper.output_n(3)
staff = abjad.Staff(notes)
assert abjad.lilypond(staff) == abjad.String.normalize(
r"""
\new Staff
{
c'4
d'2
r4
c'8.
d'16
~
d'4..
e'16
r4
c'8
d'8
~
d'4.
e'8
r4
}
"""
)
| 26.232977 | 76 | 0.374507 | 3,936 | 39,297 | 3.629573 | 0.041921 | 0.071399 | 0.0126 | 0.039199 | 0.862383 | 0.821504 | 0.804424 | 0.767675 | 0.748145 | 0.741915 | 0 | 0.071199 | 0.524646 | 39,297 | 1,497 | 77 | 26.250501 | 0.693576 | 0.005573 | 0 | 0.545324 | 0 | 0.005755 | 0.034708 | 0 | 0 | 0 | 0 | 0 | 0.129496 | 1 | 0.047482 | false | 0 | 0.005755 | 0 | 0.053237 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
86d0a769c5f02c3ac6fb3789be24048df4a5e671 | 11,175 | py | Python | src/ufc-2.0.5/src/utils/python/ufc_utils/integrals.py | szmurlor/fiver | 083251420eb934d860c99dcf1eb07ae5b8ba7e8c | [
"Apache-2.0"
] | null | null | null | src/ufc-2.0.5/src/utils/python/ufc_utils/integrals.py | szmurlor/fiver | 083251420eb934d860c99dcf1eb07ae5b8ba7e8c | [
"Apache-2.0"
] | null | null | null | src/ufc-2.0.5/src/utils/python/ufc_utils/integrals.py | szmurlor/fiver | 083251420eb934d860c99dcf1eb07ae5b8ba7e8c | [
"Apache-2.0"
] | null | null | null | # Code generation format strings for UFC (Unified Form-assembly Code) v. 2.0.5.
# This code is released into the public domain.
#
# The FEniCS Project (http://www.fenicsproject.org/) 2006-2011.
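The header above describes this module as a collection of code-generation format strings: each template below is a plain `%`-style Python format string whose `%(name)s` fields are filled from a dict of generated C++ fragments. A minimal sketch of that substitution (the template and class name here are invented stand-ins, not actual UFC templates):

```python
# Illustrative template in the same style as the ones in this module:
# every "%(field)s" placeholder is replaced from a dict of code fragments.
template = """\
/// Constructor
%(classname)s::%(classname)s(%(constructor_arguments)s)
{
%(constructor)s
}
"""

# Fill the template; a real generator would supply tabulate_tensor bodies etc.
code = template % {
    "classname": "poisson_cell_integral_0",   # hypothetical generated name
    "constructor_arguments": "",
    "constructor": "    // Do nothing",
}
print(code)
```

Note that a repeated field such as `%(classname)s` is substituted at every occurrence, which is why the real templates can reuse `%(classname)s` for both the class declaration and its constructor/destructor definitions.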
cell_integral_combined = """\
/// This class defines the interface for the tabulation of the cell
/// tensor corresponding to the local contribution to a form from
/// the integral over a cell.
class %(classname)s: public ufc::cell_integral
{%(members)s
public:
/// Constructor
%(classname)s(%(constructor_arguments)s) : ufc::cell_integral()%(initializer_list)s
{
%(constructor)s
}
/// Destructor
virtual ~%(classname)s()
{
%(destructor)s
}
/// Tabulate the tensor for the contribution from a local cell
virtual void tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c) const
{
%(tabulate_tensor)s
}
/// Tabulate the tensor for the contribution from a local cell
/// using the specified reference cell quadrature points/weights
virtual void tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c,
unsigned int num_quadrature_points,
const double * const * quadrature_points,
const double* quadrature_weights) const
{
%(tabulate_tensor_quadrature)s
}
};
"""
cell_integral_header = """\
/// This class defines the interface for the tabulation of the cell
/// tensor corresponding to the local contribution to a form from
/// the integral over a cell.
class %(classname)s: public ufc::cell_integral
{%(members)s
public:
/// Constructor
%(classname)s(%(constructor_arguments)s);
/// Destructor
virtual ~%(classname)s();
/// Tabulate the tensor for the contribution from a local cell
virtual void tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c) const;
/// Tabulate the tensor for the contribution from a local cell
/// using the specified reference cell quadrature points/weights
virtual void tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c,
unsigned int num_quadrature_points,
const double * const * quadrature_points,
const double* quadrature_weights) const;
};
"""
cell_integral_implementation = """\
/// Constructor
%(classname)s::%(classname)s(%(constructor_arguments)s) : ufc::cell_integral()%(initializer_list)s
{
%(constructor)s
}
/// Destructor
%(classname)s::~%(classname)s()
{
%(destructor)s
}
/// Tabulate the tensor for the contribution from a local cell
void %(classname)s::tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c) const
{
%(tabulate_tensor)s
}
/// Tabulate the tensor for the contribution from a local cell
/// using the specified reference cell quadrature points/weights
void %(classname)s::tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c,
unsigned int num_quadrature_points,
const double * const * quadrature_points,
const double* quadrature_weights) const
{
%(tabulate_tensor_quadrature)s
}
"""
exterior_facet_integral_combined = """\
/// This class defines the interface for the tabulation of the
/// exterior facet tensor corresponding to the local contribution to
/// a form from the integral over an exterior facet.
class %(classname)s: public ufc::exterior_facet_integral
{%(members)s
public:
/// Constructor
%(classname)s(%(constructor_arguments)s) : ufc::exterior_facet_integral()%(initializer_list)s
{
%(constructor)s
}
/// Destructor
virtual ~%(classname)s()
{
%(destructor)s
}
/// Tabulate the tensor for the contribution from a local exterior facet
virtual void tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c,
unsigned int facet) const
{
%(tabulate_tensor)s
}
/// Tabulate the tensor for the contribution from a local exterior facet
/// using the specified reference cell quadrature points/weights
virtual void tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c,
unsigned int num_quadrature_points,
const double * const * quadrature_points,
const double* quadrature_weights) const
{
%(tabulate_tensor_quadrature)s
}
};
"""
exterior_facet_integral_header = """\
/// This class defines the interface for the tabulation of the
/// exterior facet tensor corresponding to the local contribution to
/// a form from the integral over an exterior facet.
class %(classname)s: public ufc::exterior_facet_integral
{%(members)s
public:
/// Constructor
%(classname)s(%(constructor_arguments)s);
/// Destructor
virtual ~%(classname)s();
/// Tabulate the tensor for the contribution from a local exterior facet
virtual void tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c,
unsigned int facet) const;
/// Tabulate the tensor for the contribution from a local exterior facet
/// using the specified reference cell quadrature points/weights
virtual void tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c,
unsigned int num_quadrature_points,
const double * const * quadrature_points,
const double* quadrature_weights) const;
};
"""
exterior_facet_integral_implementation = """\
/// Constructor
%(classname)s::%(classname)s(%(constructor_arguments)s) : ufc::exterior_facet_integral()%(initializer_list)s
{
%(constructor)s
}
/// Destructor
%(classname)s::~%(classname)s()
{
%(destructor)s
}
/// Tabulate the tensor for the contribution from a local exterior facet
void %(classname)s::tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c,
unsigned int facet) const
{
%(tabulate_tensor)s
}
/// Tabulate the tensor for the contribution from a local exterior facet
/// using the specified reference cell quadrature points/weights
void %(classname)s::tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c,
unsigned int num_quadrature_points,
const double * const * quadrature_points,
const double* quadrature_weights) const
{
%(tabulate_tensor_quadrature)s
}
"""
interior_facet_integral_combined = """\
/// This class defines the interface for the tabulation of the
/// interior facet tensor corresponding to the local contribution to
/// a form from the integral over an interior facet.
class %(classname)s: public ufc::interior_facet_integral
{%(members)s
public:
/// Constructor
%(classname)s(%(constructor_arguments)s) : ufc::interior_facet_integral()%(initializer_list)s
{
%(constructor)s
}
/// Destructor
virtual ~%(classname)s()
{
%(destructor)s
}
/// Tabulate the tensor for the contribution from a local interior facet
virtual void tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c0,
const ufc::cell& c1,
unsigned int facet0,
unsigned int facet1) const
{
%(tabulate_tensor)s
}
/// Tabulate the tensor for the contribution from a local interior facet
/// using the specified reference cell quadrature points/weights
virtual void tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c,
unsigned int num_quadrature_points,
const double * const * quadrature_points,
const double* quadrature_weights) const
{
%(tabulate_tensor_quadrature)s
}
};
"""
interior_facet_integral_header = """\
/// This class defines the interface for the tabulation of the
/// interior facet tensor corresponding to the local contribution to
/// a form from the integral over an interior facet.
class %(classname)s: public ufc::interior_facet_integral
{%(members)s
public:
/// Constructor
%(classname)s(%(constructor_arguments)s);
/// Destructor
virtual ~%(classname)s();
/// Tabulate the tensor for the contribution from a local interior facet
virtual void tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c0,
const ufc::cell& c1,
unsigned int facet0,
unsigned int facet1) const;
/// Tabulate the tensor for the contribution from a local interior facet
/// using the specified reference cell quadrature points/weights
virtual void tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c,
unsigned int num_quadrature_points,
const double * const * quadrature_points,
const double* quadrature_weights) const;
};
"""
interior_facet_integral_implementation = """\
/// Constructor
%(classname)s::%(classname)s(%(constructor_arguments)s) : ufc::interior_facet_integral()%(initializer_list)s
{
%(constructor)s
}
/// Destructor
%(classname)s::~%(classname)s()
{
%(destructor)s
}
/// Tabulate the tensor for the contribution from a local interior facet
void %(classname)s::tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c0,
const ufc::cell& c1,
unsigned int facet0,
unsigned int facet1) const
{
%(tabulate_tensor)s
}
/// Tabulate the tensor for the contribution from a local interior facet
/// using the specified reference cell quadrature points/weights
void %(classname)s::tabulate_tensor(double* A,
const double * const * w,
const ufc::cell& c,
unsigned int num_quadrature_points,
const double * const * quadrature_points,
const double* quadrature_weights) const
{
%(tabulate_tensor_quadrature)s
}
"""
# hidt/networks/generators/__init__.py

from .gen_base import *
from .gen_content_style import *
from .gen_content_style_unet import * | 31.333333 | 37 | 0.819149 | 15 | 94 | 4.733333 | 0.466667 | 0.295775 | 0.366197 | 0.56338 | 0.704225 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117021 | 94 | 3 | 37 | 31.333333 | 0.855422 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
# metricbeat/module/nats/test_nats.py

import metricbeat
import os
import sys
import unittest
NATS_FIELDS = metricbeat.COMMON_FIELDS + ["nats"]

@metricbeat.parameterized_with_supported_versions
class TestNats(metricbeat.BaseTest):
    COMPOSE_SERVICES = ['nats', 'nats-routes']

    @unittest.skipUnless(metricbeat.INTEGRATION_TESTS, "integration test")
    def test_stats(self):
        """
        nats stats test
        """
        self.render_config_template(modules=[{
            "name": "nats",
            "metricsets": ["stats"],
            "hosts": self.get_hosts(),
            "period": "5s",
            "stats.metrics_path": "/varz"
        }])
        proc = self.start_beat()
        self.wait_until(lambda: self.output_lines() > 0)
        proc.check_kill_and_wait()
        self.assert_no_logged_warnings()

        output = self.read_output_json()
        self.assertEqual(len(output), 1)
        evt = output[0]

        self.assertCountEqual(self.de_dot(NATS_FIELDS), evt.keys(), evt)

        self.assert_fields_are_documented(evt)

    @unittest.skipUnless(metricbeat.INTEGRATION_TESTS, "integration test")
    def test_connections(self):
        """
        nats connections test
        """
        self.render_config_template(modules=[{
            "name": "nats",
            "metricsets": ["connections"],
            "hosts": self.get_hosts(),
            "period": "5s",
            "connections.metrics_path": "/connz"
        }])
        proc = self.start_beat()
        self.wait_until(lambda: self.output_lines() > 0)
        proc.check_kill_and_wait()
        self.assert_no_logged_warnings()

        output = self.read_output_json()
        self.assertEqual(len(output), 1)
        evt = output[0]

        self.assertCountEqual(self.de_dot(NATS_FIELDS), evt.keys(), evt)

        self.assert_fields_are_documented(evt)

    @unittest.skipUnless(metricbeat.INTEGRATION_TESTS, "integration test")
    def test_connection(self):
        """
        nats connection test
        """
        self.render_config_template(modules=[{
            "name": "nats",
            "metricsets": ["connection"],
            "hosts": self.get_hosts(),
            "period": "5s",
            "connections.metrics_path": "/connz"
        }])
        proc = self.start_beat()
        self.wait_until(lambda: self.output_lines() > 0)
        proc.check_kill_and_wait()
        self.assert_no_logged_warnings()

        output = self.read_output_json()
        self.assertEqual(len(output), 1)
        evt = output[0]

        self.assertCountEqual(self.de_dot(NATS_FIELDS), evt.keys(), evt)

        self.assert_fields_are_documented(evt)

    @unittest.skipUnless(metricbeat.INTEGRATION_TESTS, "integration test")
    def test_routes(self):
        """
        nats routes test
        """
        self.render_config_template(modules=[{
            "name": "nats",
            "metricsets": ["routes"],
            "hosts": self.get_hosts(),
            "period": "5s",
            "routes.metrics_path": "/routez"
        }])
        proc = self.start_beat()
        self.wait_until(lambda: self.output_lines() > 0)
        proc.check_kill_and_wait()
        self.assert_no_logged_warnings()

        output = self.read_output_json()
        self.assertEqual(len(output), 1)
        evt = output[0]

        self.assertCountEqual(self.de_dot(NATS_FIELDS), evt.keys(), evt)

        self.assert_fields_are_documented(evt)

    @unittest.skipUnless(metricbeat.INTEGRATION_TESTS, "integration test")
    def test_route(self):
        """
        nats route test
        """
        self.render_config_template(modules=[{
            "name": "nats",
            "metricsets": ["route"],
            "hosts": self.get_hosts(),
            "period": "5s",
            "routes.metrics_path": "/routez"
        }])
        proc = self.start_beat()
        self.wait_until(lambda: self.output_lines() > 0)
        proc.check_kill_and_wait()
        self.assert_no_logged_warnings()

        output = self.read_output_json()
        self.assertEqual(len(output), 1)
        evt = output[0]

        self.assertCountEqual(self.de_dot(NATS_FIELDS), evt.keys(), evt)

        self.assert_fields_are_documented(evt)

    @unittest.skipUnless(metricbeat.INTEGRATION_TESTS, "integration test")
    def test_subscriptions(self):
        """
        nats subscriptions test
        """
        self.render_config_template(modules=[{
            "name": "nats",
            "metricsets": ["subscriptions"],
            "hosts": self.get_hosts(),
            "period": "5s",
            "subscriptions.metrics_path": "/subsz"
        }])
        proc = self.start_beat()
        self.wait_until(lambda: self.output_lines() > 0)
        proc.check_kill_and_wait()
        self.assert_no_logged_warnings()

        output = self.read_output_json()
        self.assertEqual(len(output), 1)
        evt = output[0]

        self.assertCountEqual(self.de_dot(NATS_FIELDS), evt.keys(), evt)

        self.assert_fields_are_documented(evt)

    def get_hosts(self):
        return [self.compose_host("nats")]
# haiku/_src/conv.py

# Copyright 2019 DeepMind Technologies Limited. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Convolutional Haiku modules."""
import types
from typing import Optional, Sequence, Union, Tuple
from haiku._src import base
from haiku._src import initializers
from haiku._src import module
from haiku._src import pad
from haiku._src import utils
from jax import lax
import jax.numpy as jnp
import numpy as np
# If you are forking replace this with `import haiku as hk`.
hk = types.ModuleType("haiku")
hk.initializers = initializers
hk.pad = pad
hk.get_parameter = base.get_parameter
hk.Module = module.Module
del base, module, initializers, pad


def to_dimension_numbers(
    num_spatial_dims: int,
    channels_last: bool,
    transpose: bool,
) -> lax.ConvDimensionNumbers:
  """Create a `lax.ConvDimensionNumbers` for the given inputs."""
  num_dims = num_spatial_dims + 2

  if channels_last:
    spatial_dims = tuple(range(1, num_dims - 1))
    image_dn = (0, num_dims - 1) + spatial_dims
  else:
    spatial_dims = tuple(range(2, num_dims))
    image_dn = (0, 1) + spatial_dims

  if transpose:
    kernel_dn = (num_dims - 2, num_dims - 1) + tuple(range(num_dims - 2))
  else:
    kernel_dn = (num_dims - 1, num_dims - 2) + tuple(range(num_dims - 2))

  return lax.ConvDimensionNumbers(lhs_spec=image_dn, rhs_spec=kernel_dn,
                                  out_spec=image_dn)
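The index arithmetic above can be checked without JAX. The sketch below is a plain-tuple restatement of the same logic (the `lax.ConvDimensionNumbers` wrapper and the `dimension_specs` name are illustrative, not part of the library surface): for a 2-D channels-last convolution it yields an NHWC-style image spec and an OIHW-style kernel spec.

```python
def dimension_specs(num_spatial_dims, channels_last, transpose):
    # Plain-tuple restatement of to_dimension_numbers' index arithmetic.
    num_dims = num_spatial_dims + 2
    if channels_last:
        spatial_dims = tuple(range(1, num_dims - 1))
        image_dn = (0, num_dims - 1) + spatial_dims  # batch, channel, spatial...
    else:
        spatial_dims = tuple(range(2, num_dims))
        image_dn = (0, 1) + spatial_dims
    if transpose:
        kernel_dn = (num_dims - 2, num_dims - 1) + tuple(range(num_dims - 2))
    else:
        kernel_dn = (num_dims - 1, num_dims - 2) + tuple(range(num_dims - 2))
    return image_dn, kernel_dn

# 2-D, channels-last (NHWC), forward convolution.
image_dn, kernel_dn = dimension_specs(2, channels_last=True, transpose=False)
print(image_dn)   # (0, 3, 1, 2): batch, channel, height, width
print(kernel_dn)  # (3, 2, 0, 1): output-channel, input-channel, spatial dims
```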


class ConvND(hk.Module):
  """General N-dimensional convolutional."""

  def __init__(
      self,
      num_spatial_dims: int,
      output_channels: int,
      kernel_shape: Union[int, Sequence[int]],
      stride: Union[int, Sequence[int]] = 1,
      rate: Union[int, Sequence[int]] = 1,
      padding: Union[str, Sequence[Tuple[int, int]], hk.pad.PadFn,
                     Sequence[hk.pad.PadFn]] = "SAME",
      with_bias: bool = True,
      w_init: Optional[hk.initializers.Initializer] = None,
      b_init: Optional[hk.initializers.Initializer] = None,
      data_format: str = "channels_last",
      mask: Optional[jnp.ndarray] = None,
      feature_group_count: int = 1,
      name: Optional[str] = None,
  ):
    """Initializes the module.

    Args:
      num_spatial_dims: The number of spatial dimensions of the input.
      output_channels: Number of output channels.
      kernel_shape: The shape of the kernel. Either an integer or a sequence of
        length ``num_spatial_dims``.
      stride: Optional stride for the kernel. Either an integer or a sequence of
        length ``num_spatial_dims``. Defaults to 1.
      rate: Optional kernel dilation rate. Either an integer or a sequence of
        length ``num_spatial_dims``. 1 corresponds to standard ND convolution,
        ``rate > 1`` corresponds to dilated convolution. Defaults to 1.
      padding: Optional padding algorithm. Either ``VALID`` or ``SAME`` or a
        sequence of n ``(low, high)`` integer pairs that give the padding to
        apply before and after each spatial dimension, or a callable or sequence
        of callables of size ``num_spatial_dims``. Any callables must take a
        single integer argument equal to the effective kernel size and return a
        sequence of two integers representing the padding before and after. See
        ``haiku.pad.*`` for more details and example functions. Defaults to
        ``SAME``. See:
        https://www.tensorflow.org/xla/operation_semantics#conv_convolution.
      with_bias: Whether to add a bias. By default, true.
      w_init: Optional weight initialization. By default, truncated normal.
      b_init: Optional bias initialization. By default, zeros.
      data_format: The data format of the input. Can be either
        ``channels_first``, ``channels_last``, ``N...C`` or ``NC...``. By
        default, ``channels_last``.
      mask: Optional mask of the weights.
      feature_group_count: Optional number of groups in group convolution.
        Default value of 1 corresponds to normal dense convolution. If a higher
        value is used, convolutions are applied separately to that many groups,
        then stacked together. This reduces the number of parameters
        and possibly the compute for a given ``output_channels``. See:
        https://www.tensorflow.org/xla/operation_semantics#conv_convolution.
      name: The name of the module.
    """
    super().__init__(name=name)
    if num_spatial_dims <= 0:
      raise ValueError(
          "We only support convolution operations for `num_spatial_dims` "
          f"greater than 0, received num_spatial_dims={num_spatial_dims}.")

    self.num_spatial_dims = num_spatial_dims
    self.output_channels = output_channels
    self.kernel_shape = (
        utils.replicate(kernel_shape, num_spatial_dims, "kernel_shape"))
    self.with_bias = with_bias
    self.stride = utils.replicate(stride, num_spatial_dims, "strides")
    self.w_init = w_init
    self.b_init = b_init or jnp.zeros
    self.mask = mask
    self.feature_group_count = feature_group_count
    self.lhs_dilation = utils.replicate(1, num_spatial_dims, "lhs_dilation")
    self.kernel_dilation = (
        utils.replicate(rate, num_spatial_dims, "kernel_dilation"))
    self.data_format = data_format
    self.channel_index = utils.get_channel_index(data_format)
    self.dimension_numbers = to_dimension_numbers(
        num_spatial_dims, channels_last=(self.channel_index == -1),
        transpose=False)

    if isinstance(padding, str):
      self.padding = padding.upper()
    elif hk.pad.is_padfn(padding):
      self.padding = hk.pad.create_from_padfn(padding=padding,
                                              kernel=self.kernel_shape,
                                              rate=self.kernel_dilation,
                                              n=self.num_spatial_dims)
    else:
      self.padding = hk.pad.create_from_tuple(padding, self.num_spatial_dims)

  def __call__(
      self,
      inputs: jnp.ndarray,
      *,
      precision: Optional[lax.Precision] = None,
  ) -> jnp.ndarray:
    """Connects ``ConvND`` layer.

    Args:
      inputs: An array of shape ``[spatial_dims, C]`` and rank-N+1 if unbatched,
        or an array of shape ``[N, spatial_dims, C]`` and rank-N+2 if batched.
      precision: Optional :class:`jax.lax.Precision` to pass to
        :func:`jax.lax.conv_general_dilated`.

    Returns:
      An array of shape ``[spatial_dims, output_channels]`` and rank-N+1 if
      unbatched, or an array of shape ``[N, spatial_dims, output_channels]``
      and rank-N+2 if batched.
    """
    unbatched_rank = self.num_spatial_dims + 1
    allowed_ranks = [unbatched_rank, unbatched_rank + 1]
    if inputs.ndim not in allowed_ranks:
      raise ValueError(f"Input to ConvND needs to have rank in {allowed_ranks},"
                       f" but input has shape {inputs.shape}.")

    unbatched = inputs.ndim == unbatched_rank
    if unbatched:
      inputs = jnp.expand_dims(inputs, axis=0)

    if inputs.shape[self.channel_index] % self.feature_group_count != 0:
      raise ValueError(f"Inputs channels {inputs.shape[self.channel_index]} "
                       f"should be a multiple of feature_group_count "
                       f"{self.feature_group_count}")
    w_shape = self.kernel_shape + (
        inputs.shape[self.channel_index] // self.feature_group_count,
        self.output_channels)

    if self.mask is not None and self.mask.shape != w_shape:
      raise ValueError("Mask needs to have the same shape as weights. "
                       f"Shapes are: {self.mask.shape}, {w_shape}")

    w_init = self.w_init
    if w_init is None:
      fan_in_shape = np.prod(w_shape[:-1])
      stddev = 1. / np.sqrt(fan_in_shape)
      w_init = hk.initializers.TruncatedNormal(stddev=stddev)
    w = hk.get_parameter("w", w_shape, inputs.dtype, init=w_init)

    if self.mask is not None:
      w *= self.mask

    out = lax.conv_general_dilated(inputs,
                                   w,
                                   window_strides=self.stride,
                                   padding=self.padding,
                                   lhs_dilation=self.lhs_dilation,
                                   rhs_dilation=self.kernel_dilation,
                                   dimension_numbers=self.dimension_numbers,
                                   feature_group_count=self.feature_group_count,
                                   precision=precision)

    if self.with_bias:
      if self.channel_index == -1:
        bias_shape = (self.output_channels,)
      else:
        bias_shape = (self.output_channels,) + (1,) * self.num_spatial_dims
      b = hk.get_parameter("b", bias_shape, inputs.dtype, init=self.b_init)
      b = jnp.broadcast_to(b, out.shape)
      out = out + b

    if unbatched:
      out = jnp.squeeze(out, axis=0)
    return out
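The weight-shape rule in `__call__` above (spatial kernel dims, then input channels divided by the group count, then output channels) can be sketched in isolation. The helper name and the concrete channel counts below are illustrative, not taken from the library:

```python
def conv_weight_shape(kernel_shape, input_channels, output_channels,
                      feature_group_count=1):
    # Mirrors ConvND.__call__'s w_shape computation: each group only sees
    # input_channels // feature_group_count input channels.
    if input_channels % feature_group_count != 0:
        raise ValueError("input channels must be a multiple of the group count")
    return tuple(kernel_shape) + (input_channels // feature_group_count,
                                  output_channels)

# Dense 3x3 conv, 8 -> 16 channels.
print(conv_weight_shape((3, 3), 8, 16))                         # (3, 3, 8, 16)
# Grouped conv with 2 groups: each filter spans only 4 input channels,
# halving the parameter count.
print(conv_weight_shape((3, 3), 8, 16, feature_group_count=2))  # (3, 3, 4, 16)
```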


class Conv1D(ConvND):
  """One dimensional convolution."""

  def __init__(
      self,
      output_channels: int,
      kernel_shape: Union[int, Sequence[int]],
      stride: Union[int, Sequence[int]] = 1,
      rate: Union[int, Sequence[int]] = 1,
      padding: Union[str, Sequence[Tuple[int, int]], hk.pad.PadFn,
                     Sequence[hk.pad.PadFn]] = "SAME",
      with_bias: bool = True,
      w_init: Optional[hk.initializers.Initializer] = None,
      b_init: Optional[hk.initializers.Initializer] = None,
      data_format: str = "NWC",
      mask: Optional[jnp.ndarray] = None,
      feature_group_count: int = 1,
      name: Optional[str] = None,
  ):
    """Initializes the module.

    Args:
      output_channels: Number of output channels.
      kernel_shape: The shape of the kernel. Either an integer or a sequence of
        length 1.
      stride: Optional stride for the kernel. Either an integer or a sequence of
        length 1. Defaults to 1.
      rate: Optional kernel dilation rate. Either an integer or a sequence of
        length 1. 1 corresponds to standard ND convolution,
        ``rate > 1`` corresponds to dilated convolution. Defaults to 1.
      padding: Optional padding algorithm. Either ``VALID`` or ``SAME`` or
        a callable or sequence of callables of length 1. Any callables must
        take a single integer argument equal to the effective kernel size and
        return a list of two integers representing the padding before and after.
        See haiku.pad.* for more details and example functions.
        Defaults to ``SAME``. See:
        https://www.tensorflow.org/xla/operation_semantics#conv_convolution.
      with_bias: Whether to add a bias. By default, true.
      w_init: Optional weight initialization. By default, truncated normal.
      b_init: Optional bias initialization. By default, zeros.
      data_format: The data format of the input. Either ``NWC`` or ``NCW``. By
        default, ``NWC``.
      mask: Optional mask of the weights.
      feature_group_count: Optional number of groups in group convolution.
        Default value of 1 corresponds to normal dense convolution. If a higher
        value is used, convolutions are applied separately to that many groups,
        then stacked together. This reduces the number of parameters
        and possibly the compute for a given ``output_channels``. See:
        https://www.tensorflow.org/xla/operation_semantics#conv_convolution.
      name: The name of the module.
    """
    super().__init__(
        num_spatial_dims=1,
        output_channels=output_channels,
        kernel_shape=kernel_shape,
        stride=stride,
        rate=rate,
        padding=padding,
        with_bias=with_bias,
        w_init=w_init,
        b_init=b_init,
        data_format=data_format,
        mask=mask,
        feature_group_count=feature_group_count,
        name=name)


class Conv2D(ConvND):
  """Two dimensional convolution."""

  def __init__(
      self,
      output_channels: int,
      kernel_shape: Union[int, Sequence[int]],
      stride: Union[int, Sequence[int]] = 1,
      rate: Union[int, Sequence[int]] = 1,
      padding: Union[str, Sequence[Tuple[int, int]], hk.pad.PadFn,
                     Sequence[hk.pad.PadFn]] = "SAME",
      with_bias: bool = True,
      w_init: Optional[hk.initializers.Initializer] = None,
      b_init: Optional[hk.initializers.Initializer] = None,
      data_format: str = "NHWC",
      mask: Optional[jnp.ndarray] = None,
      feature_group_count: int = 1,
      name: Optional[str] = None,
  ):
    """Initializes the module.

    Args:
      output_channels: Number of output channels.
      kernel_shape: The shape of the kernel. Either an integer or a sequence of
        length 2.
      stride: Optional stride for the kernel. Either an integer or a sequence of
        length 2. Defaults to 1.
      rate: Optional kernel dilation rate. Either an integer or a sequence of
        length 2. 1 corresponds to standard ND convolution,
        ``rate > 1`` corresponds to dilated convolution. Defaults to 1.
      padding: Optional padding algorithm. Either ``VALID`` or ``SAME`` or
        a callable or sequence of callables of length 2. Any callables must
        take a single integer argument equal to the effective kernel size and
        return a list of two integers representing the padding before and after.
        See haiku.pad.* for more details and example functions.
        Defaults to ``SAME``. See:
        https://www.tensorflow.org/xla/operation_semantics#conv_convolution.
      with_bias: Whether to add a bias. By default, true.
      w_init: Optional weight initialization. By default, truncated normal.
      b_init: Optional bias initialization. By default, zeros.
      data_format: The data format of the input. Either ``NHWC`` or ``NCHW``. By
        default, ``NHWC``.
      mask: Optional mask of the weights.
      feature_group_count: Optional number of groups in group convolution.
        Default value of 1 corresponds to normal dense convolution. If a higher
        value is used, convolutions are applied separately to that many groups,
        then stacked together. This reduces the number of parameters
        and possibly the compute for a given ``output_channels``. See:
        https://www.tensorflow.org/xla/operation_semantics#conv_convolution.
      name: The name of the module.
    """
    super().__init__(
        num_spatial_dims=2,
        output_channels=output_channels,
        kernel_shape=kernel_shape,
        stride=stride,
        rate=rate,
        padding=padding,
        with_bias=with_bias,
        w_init=w_init,
        b_init=b_init,
        data_format=data_format,
        mask=mask,
        feature_group_count=feature_group_count,
        name=name)


class Conv3D(ConvND):
  """Three dimensional convolution."""

  def __init__(
      self,
      output_channels: int,
      kernel_shape: Union[int, Sequence[int]],
      stride: Union[int, Sequence[int]] = 1,
      rate: Union[int, Sequence[int]] = 1,
      padding: Union[str, Sequence[Tuple[int, int]], hk.pad.PadFn,
                     Sequence[hk.pad.PadFn]] = "SAME",
      with_bias: bool = True,
      w_init: Optional[hk.initializers.Initializer] = None,
      b_init: Optional[hk.initializers.Initializer] = None,
      data_format: str = "NDHWC",
      mask: Optional[jnp.ndarray] = None,
      feature_group_count: int = 1,
      name: Optional[str] = None,
  ):
    """Initializes the module.

    Args:
      output_channels: Number of output channels.
      kernel_shape: The shape of the kernel. Either an integer or a sequence of
        length 3.
      stride: Optional stride for the kernel. Either an integer or a sequence of
        length 3. Defaults to 1.
      rate: Optional kernel dilation rate. Either an integer or a sequence of
        length 3. 1 corresponds to standard ND convolution,
        ``rate > 1`` corresponds to dilated convolution. Defaults to 1.
      padding: Optional padding algorithm. Either ``VALID`` or ``SAME`` or
        a callable or sequence of callables of length 3. Any callables must
        take a single integer argument equal to the effective kernel size and
        return a list of two integers representing the padding before and after.
        See haiku.pad.* for more details and example functions.
        Defaults to ``SAME``. See:
        https://www.tensorflow.org/xla/operation_semantics#conv_convolution.
      with_bias: Whether to add a bias. By default, true.
      w_init: Optional weight initialization. By default, truncated normal.
      b_init: Optional bias initialization. By default, zeros.
      data_format: The data format of the input. Either ``NDHWC`` or ``NCDHW``.
        By default, ``NDHWC``.
      mask: Optional mask of the weights.
      feature_group_count: Optional number of groups in group convolution.
        Default value of 1 corresponds to normal dense convolution. If a higher
        value is used, convolutions are applied separately to that many groups,
        then stacked together. This reduces the number of parameters
        and possibly the compute for a given ``output_channels``. See:
        https://www.tensorflow.org/xla/operation_semantics#conv_convolution.
      name: The name of the module.
    """
    super().__init__(
        num_spatial_dims=3,
        output_channels=output_channels,
        kernel_shape=kernel_shape,
        stride=stride,
        rate=rate,
        padding=padding,
        with_bias=with_bias,
        w_init=w_init,
        b_init=b_init,
        data_format=data_format,
        mask=mask,
        feature_group_count=feature_group_count,
        name=name)


def compute_adjusted_padding(
    input_size: int,
    output_size: int,
    kernel_size: int,
    stride: int,
    padding: str,
    dilation: int = 1,
) -> Tuple[int, int]:
  """Computes adjusted padding for desired ConvTranspose `output_size`."""
  kernel_size = (kernel_size - 1) * dilation + 1
  if padding == "VALID":
    expected_input_size = (output_size - kernel_size + stride) // stride
    if input_size != expected_input_size:
      raise ValueError(f"The expected input size with the current set of input "
                       f"parameters is {expected_input_size} which doesn't "
                       f"match the actual input size {input_size}.")
    padding_before = 0
  elif padding == "SAME":
    expected_input_size = (output_size + stride - 1) // stride
    if input_size != expected_input_size:
      raise ValueError(f"The expected input size with the current set of input "
                       f"parameters is {expected_input_size} which doesn't "
                       f"match the actual input size {input_size}.")
    padding_needed = max(0,
                         (input_size - 1) * stride + kernel_size - output_size)
    padding_before = padding_needed // 2
  else:
    raise ValueError(f"`padding` must be 'VALID' or 'SAME'. Passed: {padding}.")

  expanded_input_size = (input_size - 1) * stride + 1
  padded_out_size = output_size + kernel_size - 1
  pad_before = kernel_size - 1 - padding_before
  pad_after = padded_out_size - expanded_input_size - pad_before
  return (pad_before, pad_after)
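A quick numeric check of the arithmetic above. The helper below restates only the ``SAME`` branch in plain Python (the name `adjusted_padding_same` and the sample sizes are illustrative): upsampling a length-4 input to length 8 with a stride-2, kernel-3 transpose convolution.

```python
def adjusted_padding_same(input_size, output_size, kernel_size, stride,
                          dilation=1):
    # Restates compute_adjusted_padding's "SAME" branch in isolation.
    kernel_size = (kernel_size - 1) * dilation + 1
    expected_input_size = (output_size + stride - 1) // stride
    assert input_size == expected_input_size, "inconsistent sizes"
    padding_needed = max(0, (input_size - 1) * stride + kernel_size - output_size)
    padding_before = padding_needed // 2
    expanded_input_size = (input_size - 1) * stride + 1
    padded_out_size = output_size + kernel_size - 1
    pad_before = kernel_size - 1 - padding_before
    pad_after = padded_out_size - expanded_input_size - pad_before
    return (pad_before, pad_after)

# Length 4 -> 8 with stride 2, kernel 3: pad 2 before, 1 after the dilated input.
print(adjusted_padding_same(4, 8, 3, 2))  # (2, 1)
# Length-preserving case: 5 -> 5 with stride 1, kernel 3.
print(adjusted_padding_same(5, 5, 3, 1))  # (1, 1)
```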


class ConvNDTranspose(hk.Module):
  """General n-dimensional transposed convolution (aka. deconvolution)."""

  def __init__(
      self,
      num_spatial_dims: int,
      output_channels: int,
      kernel_shape: Union[int, Sequence[int]],
      stride: Union[int, Sequence[int]] = 1,
      output_shape: Optional[Union[int, Sequence[int]]] = None,
      padding: Union[str, Sequence[Tuple[int, int]]] = "SAME",
      with_bias: bool = True,
      w_init: Optional[hk.initializers.Initializer] = None,
      b_init: Optional[hk.initializers.Initializer] = None,
      data_format: str = "channels_last",
      mask: Optional[jnp.ndarray] = None,
      name: Optional[str] = None,
  ):
    """Initializes the module.

    Args:
      num_spatial_dims: The number of spatial dimensions of the input.
      output_channels: Number of output channels.
      kernel_shape: The shape of the kernel. Either an integer or a sequence of
        length ``num_spatial_dims``.
      stride: Optional stride for the kernel. Either an integer or a sequence of
        length ``num_spatial_dims``. Defaults to 1.
      output_shape: Output shape of the spatial dimensions of a transpose
        convolution. Can be either an integer or an iterable of integers. If a
        `None` value is given, a default shape is automatically calculated.
      padding: Optional padding algorithm. Either "VALID" or "SAME".
        Defaults to "SAME". See:
        https://www.tensorflow.org/xla/operation_semantics#conv_convolution.
      with_bias: Whether to add a bias. By default, true.
      w_init: Optional weight initialization. By default, truncated normal.
      b_init: Optional bias initialization. By default, zeros.
      data_format: The data format of the input. Can be either
        ``channels_first``, ``channels_last``, ``N...C`` or ``NC...``. By
        default, ``channels_last``.
      mask: Optional mask of the weights.
      name: The name of the module.
    """
    super().__init__(name=name)

    if num_spatial_dims <= 0:
      raise ValueError(
          "We only support convolution operations for `num_spatial_dims` "
          f"greater than 0, received num_spatial_dims={num_spatial_dims}.")

    self.num_spatial_dims = num_spatial_dims
    self.output_channels = output_channels
    self.kernel_shape = (
        utils.replicate(kernel_shape, num_spatial_dims, "kernel_shape"))
    self.output_shape = output_shape
    if self.output_shape is not None:
      self.output_shape = (
          utils.replicate(output_shape, num_spatial_dims, "output_shape"))
      if not isinstance(padding, str):
        raise ValueError("When specifying `output_shape`, ensure that padding "
                         "is 'VALID' or 'SAME'.")
    self.with_bias = with_bias
    self.stride = utils.replicate(stride, num_spatial_dims, "strides")
    self.w_init = w_init
    self.b_init = b_init or jnp.zeros
    self.mask = mask
    # TODO(tomhennigan) Make use of hk.pad.create_from_tuple here?
    self.padding = padding
    self.data_format = data_format
    self.channel_index = utils.get_channel_index(data_format)
    self.dimension_numbers = to_dimension_numbers(
        num_spatial_dims, channels_last=(self.channel_index == -1),
        transpose=True)

  def __call__(
      self,
      inputs: jnp.ndarray,
      *,
      precision: Optional[lax.Precision] = None,
  ) -> jnp.ndarray:
    """Computes the transposed convolution of the input.

    Args:
      inputs: An array of shape ``[spatial_dims, C]`` and rank-N+1 if unbatched,
        or an array of shape ``[N, spatial_dims, C]`` and rank-N+2 if batched.
      precision: Optional :class:`jax.lax.Precision` to pass to
        :func:`jax.lax.conv_transpose`.

    Returns:
      An array of shape ``[spatial_dims, output_channels]`` and rank-N+1 if
      unbatched, or an array of shape ``[N, spatial_dims, output_channels]``
      and rank-N+2 if batched.
    """
    unbatched_rank = self.num_spatial_dims + 1
    allowed_ranks = [unbatched_rank, unbatched_rank + 1]
    if inputs.ndim not in allowed_ranks:
      raise ValueError(f"Input to ConvNDTranspose needs to have rank in "
                       f"{allowed_ranks}, but input has shape {inputs.shape}.")

    unbatched = inputs.ndim == unbatched_rank
    if unbatched:
      inputs = jnp.expand_dims(inputs, axis=0)

    input_channels = inputs.shape[self.channel_index]
    w_shape = self.kernel_shape + (self.output_channels, input_channels)

    if self.mask is not None and self.mask.shape != w_shape:
      raise ValueError("Mask needs to have the same shape as weights. "
                       f"Shapes are: {self.mask.shape}, {w_shape}")

    w_init = self.w_init
    if w_init is None:
      fan_in_shape = self.kernel_shape + (input_channels,)
      stddev = 1. / np.sqrt(np.prod(fan_in_shape))
      w_init = hk.initializers.TruncatedNormal(stddev=stddev)
    w = hk.get_parameter("w", w_shape, inputs.dtype, init=w_init)

    if self.mask is not None:
      w = w * self.mask

    padding = self.padding
    if self.output_shape is not None:
      input_shape = (
          inputs.shape[2:] if self.channel_index == 1 else inputs.shape[1:-1])
      padding = tuple(map(
          lambda i, o, k, s: compute_adjusted_padding(i, o, k, s, self.padding),
          input_shape, self.output_shape, self.kernel_shape, self.stride))

    out = lax.conv_transpose(inputs,
                             w,
                             strides=self.stride,
                             padding=padding,
                             dimension_numbers=self.dimension_numbers,
                             precision=precision)

    if self.with_bias:
      if self.channel_index == -1:
        bias_shape = (self.output_channels,)
      else:
        bias_shape = (self.output_channels,) + (1,) * self.num_spatial_dims
      b = hk.get_parameter("b", bias_shape, inputs.dtype, init=self.b_init)
      b = jnp.broadcast_to(b, out.shape)
      out = out + b

    if unbatched:
      out = jnp.squeeze(out, axis=0)
    return out


class Conv1DTranspose(ConvNDTranspose):
  """One dimensional transposed convolution (aka. deconvolution)."""

  def __init__(
      self,
      output_channels: int,
      kernel_shape: Union[int, Sequence[int]],
      stride: Union[int, Sequence[int]] = 1,
      output_shape: Optional[Union[int, Sequence[int]]] = None,
      padding: Union[str, Sequence[Tuple[int, int]]] = "SAME",
      with_bias: bool = True,
      w_init: Optional[hk.initializers.Initializer] = None,
      b_init: Optional[hk.initializers.Initializer] = None,
      data_format: str = "NWC",
      mask: Optional[jnp.ndarray] = None,
      name: Optional[str] = None,
  ):
    """Initializes the module.

    Args:
      output_channels: Number of output channels.
      kernel_shape: The shape of the kernel. Either an integer or a sequence of
        length 1.
      stride: Optional stride for the kernel. Either an integer or a sequence of
        length 1. Defaults to 1.
      output_shape: Output shape of the spatial dimensions of a transpose
        convolution. Can be either an integer or an iterable of integers. If a
        `None` value is given, a default shape is automatically calculated.
      padding: Optional padding algorithm. Either ``VALID`` or ``SAME``.
        Defaults to ``SAME``. See:
        https://www.tensorflow.org/xla/operation_semantics#conv_convolution.
      with_bias: Whether to add a bias. By default, true.
      w_init: Optional weight initialization. By default, truncated normal.
      b_init: Optional bias initialization. By default, zeros.
      data_format: The data format of the input. Either ``NWC`` or ``NCW``. By
        default, ``NWC``.
      mask: Optional mask of the weights.
      name: The name of the module.
    """
    super().__init__(
        num_spatial_dims=1,
        output_channels=output_channels,
        kernel_shape=kernel_shape,
        output_shape=output_shape,
        stride=stride,
        padding=padding,
        with_bias=with_bias,
        w_init=w_init,
        b_init=b_init,
        data_format=data_format,
        mask=mask,
        name=name)


class Conv2DTranspose(ConvNDTranspose):
  """Two dimensional transposed convolution (aka. deconvolution)."""

  def __init__(
      self,
      output_channels: int,
      kernel_shape: Union[int, Sequence[int]],
      stride: Union[int, Sequence[int]] = 1,
      output_shape: Optional[Union[int, Sequence[int]]] = None,
      padding: Union[str, Sequence[Tuple[int, int]]] = "SAME",
      with_bias: bool = True,
      w_init: Optional[hk.initializers.Initializer] = None,
      b_init: Optional[hk.initializers.Initializer] = None,
      data_format: str = "NHWC",
      mask: Optional[jnp.ndarray] = None,
      name: Optional[str] = None,
  ):
    """Initializes the module.

    Args:
      output_channels: Number of output channels.
      kernel_shape: The shape of the kernel. Either an integer or a sequence of
        length 2.
      stride: Optional stride for the kernel. Either an integer or a sequence of
        length 2. Defaults to 1.
      output_shape: Output shape of the spatial dimensions of a transpose
        convolution. Can be either an integer or an iterable of integers. If a
        `None` value is given, a default shape is automatically calculated.
      padding: Optional padding algorithm. Either ``VALID`` or ``SAME``.
        Defaults to ``SAME``. See:
        https://www.tensorflow.org/xla/operation_semantics#conv_convolution.
      with_bias: Whether to add a bias. By default, true.
      w_init: Optional weight initialization. By default, truncated normal.
      b_init: Optional bias initialization. By default, zeros.
      data_format: The data format of the input. Either ``NHWC`` or ``NCHW``. By
        default, ``NHWC``.
      mask: Optional mask of the weights.
      name: The name of the module.
    """
    super().__init__(
        num_spatial_dims=2,
        output_channels=output_channels,
        kernel_shape=kernel_shape,
        stride=stride,
        output_shape=output_shape,
        padding=padding,
        with_bias=with_bias,
        w_init=w_init,
        b_init=b_init,
        data_format=data_format,
        mask=mask,
        name=name)
class Conv3DTranspose(ConvNDTranspose):
"""Three dimensional transposed convolution (aka. deconvolution)."""
def __init__(
self,
output_channels: int,
kernel_shape: Union[int, Sequence[int]],
stride: Union[int, Sequence[int]] = 1,
output_shape: Optional[Union[int, Sequence[int]]] = None,
padding: Union[str, Sequence[Tuple[int, int]]] = "SAME",
with_bias: bool = True,
w_init: Optional[hk.initializers.Initializer] = None,
b_init: Optional[hk.initializers.Initializer] = None,
data_format: str = "NDHWC",
mask: Optional[jnp.ndarray] = None,
name: Optional[str] = None,
):
"""Initializes the module.
Args:
output_channels: Number of output channels.
kernel_shape: The shape of the kernel. Either an integer or a sequence of
length 3.
stride: Optional stride for the kernel. Either an integer or a sequence of
length 3. Defaults to 1.
output_shape: Output shape of the spatial dimensions of a transpose
convolution. Can be either an integer or an iterable of integers. If a
`None` value is given, a default shape is automatically calculated.
padding: Optional padding algorithm. Either ``VALID`` or ``SAME``.
Defaults to ``SAME``. See:
https://www.tensorflow.org/xla/operation_semantics#conv_convolution.
with_bias: Whether to add a bias. By default, true.
w_init: Optional weight initialization. By default, truncated normal.
b_init: Optional bias initialization. By default, zeros.
data_format: The data format of the input. Either ``NDHWC`` or ``NCDHW``.
By default, ``NDHWC``.
mask: Optional mask of the weights.
name: The name of the module.
"""
super().__init__(
num_spatial_dims=3,
output_channels=output_channels,
kernel_shape=kernel_shape,
stride=stride,
output_shape=output_shape,
padding=padding,
with_bias=with_bias,
w_init=w_init,
b_init=b_init,
data_format=data_format,
mask=mask,
name=name)
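The `output_shape` docstrings above note that when `None` is given, "a default shape is automatically calculated." A small sketch of the usual convention for that calculation (this formula is an assumption based on the XLA/TF-style ``SAME``/``VALID`` padding the docstrings link to, not code from this module):

```python
# Sketch of the default spatial output size a transposed convolution yields
# when output_shape=None. This mirrors the common XLA/TF padding convention;
# it is an illustration, not this module's actual implementation.
def default_transpose_output(input_size: int, kernel: int, stride: int,
                             padding: str) -> int:
    if padding == "SAME":
        return input_size * stride
    if padding == "VALID":
        return input_size * stride + max(kernel - stride, 0)
    raise ValueError(f"unknown padding: {padding}")

# e.g. an 8-wide input with kernel_shape=3 and stride=2:
same_out = default_transpose_output(8, 3, 2, "SAME")    # 16
valid_out = default_transpose_output(8, 3, 2, "VALID")  # 17
```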
# stst/__init__.py
from stst.parser.corenlp_utils import *
from stst.features import *
from stst.metrics.evaluation import *
from stst.metrics.record import *
from stst.modules.classifier import *
from stst.modules.model import *
from stst.data.data_utils import *
# tsx/db/__init__.py
from tsx.db.connect import get_session
from tsx.db.models import *
# app/endpoints/__init__.py
from . import ping
from . import sitemaps
# django_eve_data/esi_api/characters.py
# coding: utf-8
"""
Autogenerated Template File
"""
from .base import EsiRequestObject
class CharactersDetailBookmarksFolders(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/bookmarks/folders/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_bookmarks_folders_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_bookmarks_folders_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_bookmarks_folders_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_bookmarks_folders_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-bookmarks.read_character_bookmarks.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'items': {'type': 'object', 'properties': {'owner_id': {'format': 'int32', 'type': 'integer', 'description': 'owner_id integer', 'title': 'get_characters_character_id_bookmarks_folders_owner_id'}, 'folder_id': {'format': 'int32', 'type': 'integer', 'description': 'folder_id integer', 'title': 'get_characters_character_id_bookmarks_folders_folder_id'}, 'name': {'type': 'string', 'description': 'name string', 'title': 'get_characters_character_id_bookmarks_folders_name'}}, 'description': '200 ok object', 'title': 'get_characters_character_id_bookmarks_folders_200_ok'}, 'type': 'array', 'description': '200 ok array', 'title': 'get_characters_character_id_bookmarks_folders_ok'}, 'examples': {'application/json': [{'owner_id': 90000001, 'folder_id': 5, 'name': 'Icecream'}]}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 
'description': 'List of bookmark folders'}}
def get(self, character_id, datasource="tranquility", **kwargs):
"""
List your character's personal bookmark folders
---
Alternate route: `/v1/characters/{character_id}/bookmarks/folders/`
Alternate route: `/legacy/characters/{character_id}/bookmarks/folders/`
Alternate route: `/dev/characters/{character_id}/bookmarks/folders/`
---
This route is cached for up to 3600 seconds
:type character_id: int
:param character_id: An EVE character ID
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
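Each generated class stores its endpoint as a `base_url` template and forwards the collected kwargs to `EsiRequestObject`. The internals of `EsiRequestObject` are not part of this file, so the following is only a sketch of the path-parameter substitution step it presumably performs:

```python
# Sketch (assumption): filling the {character_id} placeholder in a class's
# base_url template. EsiRequestObject's real implementation is not shown here.
def resolve_url(template: str, **path_params) -> str:
    return template.format(**path_params)

url = resolve_url(
    "https://esi.tech.ccp.is/latest/characters/{character_id}/bookmarks/folders/",
    character_id=90000001,
)
# url == "https://esi.tech.ccp.is/latest/characters/90000001/bookmarks/folders/"
```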
class CharactersDetail(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '200': {'schema': {'title': 'get_characters_character_id_ok', 'type': 'object', 'properties': {'ancestry_id': {'format': 'int32', 'type': 'integer', 'description': 'ancestry_id integer', 'title': 'get_characters_character_id_ancestry_id'}, 'corporation_id': {'format': 'int32', 'type': 'integer', 'description': "The character's corporation ID", 'title': 'get_characters_character_id_corporation_id'}, 'alliance_id': {'format': 'int32', 'type': 'integer', 'description': "The character's alliance ID", 'title': 'get_characters_character_id_alliance_id'}, 'security_status': {'maximum': 10, 'format': 'float', 'minimum': -10, 'type': 'number', 'description': 'security_status number', 'title': 'get_characters_character_id_security_status'}, 'birthday': {'format': 'date-time', 'type': 'string', 'description': 'Creation date of the character', 'title': 'get_characters_character_id_birthday'}, 'race_id': {'format': 'int32', 'type': 'integer', 'description': 'race_id integer', 'title': 'get_characters_character_id_race_id'}, 'bloodline_id': {'format': 'int32', 'type': 'integer', 'description': 'bloodline_id integer', 'title': 'get_characters_character_id_bloodline_id'}, 'name': {'type': 'string', 'description': 'name string', 'title': 'get_characters_character_id_name'}, 'description': {'type': 'string', 'description': 'description string', 'title': 'get_characters_character_id_description'}, 'gender': {'enum': ['female', 'male'], 'type': 'string', 'description': 'gender string', 'title': 'get_characters_character_id_gender'}}, 
'description': '200 ok object', 'required': ['corporation_id', 'birthday', 'name', 'gender', 'race_id', 'bloodline_id']}, 'examples': {'application/json': {'ancestry_id': 19, 'corporation_id': 109299958, 'birthday': '2015-03-24T11:37:00Z', 'race_id': 2, 'bloodline_id': 3, 'name': 'CCP Bartender', 'description': '', 'gender': 'male'}}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'Public data for the given character'}, '404': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Not found message', 'title': 'get_characters_character_id_404_not_found'}}, 'description': 'Not found', 'title': 'get_characters_character_id_not_found'}, 'examples': {'application/json': {'error': 'Not found message'}}, 'description': 'Character not found'}}
def get(self, character_id, datasource="tranquility", **kwargs):
"""
Public information about a character
---
Alternate route: `/v4/characters/{character_id}/`
Alternate route: `/dev/characters/{character_id}/`
---
This route is cached for up to 3600 seconds
:type character_id: int
:param character_id: An EVE character ID
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
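Parameters that do not appear in the URL template, such as `datasource` or an auth `token`, would normally travel as query-string parameters. A hedged sketch of that serialization (the actual handling lives inside `EsiRequestObject`, which is not shown in this file):

```python
from urllib.parse import urlencode

# Sketch: encode non-path parameters as a query string, omitting unset ones
# so optional arguments are simply left out of the request.
def build_query(params: dict) -> str:
    return urlencode({k: v for k, v in params.items() if v is not None})

qs = build_query({"datasource": "tranquility", "token": None})
# qs == "datasource=tranquility"
```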
class CharactersDetailContactsLabels(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/contacts/labels/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_contacts_labels_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_contacts_labels_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_contacts_labels_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_contacts_labels_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-characters.read_contacts.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'items': {'title': 'get_characters_character_id_contacts_labels_200_ok', 'type': 'object', 'properties': {'label_id': {'format': 'int64', 'type': 'integer', 'description': 'label_id integer', 'title': 'get_characters_character_id_contacts_labels_label_id'}, 'label_name': {'type': 'string', 'description': 'label_name string', 'title': 'get_characters_character_id_contacts_labels_label_name'}}, 'description': '200 ok object', 'required': ['label_id', 'label_name']}, 'type': 'array', 'description': '200 ok array', 'title': 'get_characters_character_id_contacts_labels_ok'}, 'examples': {'application/json': [{'label_id': 123, 'label_name': 'Friends'}]}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'A list of contact labels'}}
def get(self, character_id, datasource="tranquility", **kwargs):
"""
Return custom labels for contacts the character defined
---
Alternate route: `/v1/characters/{character_id}/contacts/labels/`
Alternate route: `/legacy/characters/{character_id}/contacts/labels/`
Alternate route: `/dev/characters/{character_id}/contacts/labels/`
---
This route is cached for up to 300 seconds
:type character_id: int
:param character_id: ID for a character
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
class CharactersDetailMailLabelsDetail(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/mail/labels/{label_id}/"
delete_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'delete_characters_character_id_mail_labels_label_id_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'delete_characters_character_id_mail_labels_label_id_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'delete_characters_character_id_mail_labels_label_id_403_forbidden'}}, 'description': 'Forbidden', 'title': 'delete_characters_character_id_mail_labels_label_id_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-mail.organize_mail.v1'}}, 'description': 'Forbidden'}, '204': {'description': 'Label deleted'}, '422': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Unprocessable entity message', 'title': 'delete_characters_character_id_mail_labels_label_id_422_unprocessable_entity'}}, 'description': 'Unprocessable entity', 'title': 'delete_characters_character_id_mail_labels_label_id_unprocessable_entity'}, 'examples': {'application/json': {'error': 'Unprocessable entity message'}}, 'description': 'Default labels cannot be deleted'}}
def delete(self, character_id, label_id, datasource="tranquility", **kwargs):
"""
Delete a mail label
---
Alternate route: `/v1/characters/{character_id}/mail/labels/{label_id}/`
Alternate route: `/legacy/characters/{character_id}/mail/labels/{label_id}/`
Alternate route: `/dev/characters/{character_id}/mail/labels/{label_id}/`
:type character_id: int
:param character_id: An EVE character ID
:type label_id: int
:param label_id: An EVE label id
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "label_id": label_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.delete_responses) \
.delete(**kwargs_dict)
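The `*_responses` dicts key the Swagger response metadata by HTTP status code, stored as strings. A caller of the delete endpoint above could translate a returned status into its documented meaning; the `delete_responses` literal below is a trimmed-down stand-in for the full generated dict:

```python
# Trimmed stand-in for the generated delete_responses mapping; only the
# per-status descriptions are kept for illustration.
delete_responses = {
    "204": {"description": "Label deleted"},
    "422": {"description": "Default labels cannot be deleted"},
}

def describe_status(responses: dict, status: int) -> str:
    # Status codes are string keys in the generated dicts, hence str(status).
    return responses.get(str(status), {}).get("description", "undocumented status")

message = describe_status(delete_responses, 422)
# message == "Default labels cannot be deleted"
```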
class CharactersDetailSkillqueue(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/skillqueue/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_skillqueue_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_skillqueue_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_skillqueue_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_skillqueue_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-skills.read_skillqueue.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'items': {'title': 'get_characters_character_id_skillqueue_200_ok', 'type': 'object', 'properties': {'start_date': {'format': 'date-time', 'type': 'string', 'description': 'start_date string', 'title': 'get_characters_character_id_skillqueue_start_date'}, 'finish_date': {'format': 'date-time', 'type': 'string', 'description': 'finish_date string', 'title': 'get_characters_character_id_skillqueue_finish_date'}, 'skill_id': {'format': 'int32', 'type': 'integer', 'description': 'skill_id integer', 'title': 'get_characters_character_id_skillqueue_skill_id'}, 'finished_level': {'maximum': 5, 'format': 'int32', 'minimum': 0, 'type': 'integer', 'description': 'finished_level integer', 'title': 'get_characters_character_id_skillqueue_finished_level'}, 'queue_position': {'format': 'int32', 'type': 'integer', 'description': 'queue_position integer', 'title': 'get_characters_character_id_skillqueue_queue_position'}, 'level_start_sp': {'format': 'int32', 'type': 'integer', 'description': "Amount of SP that was in the skill when it started training it's current level. 
Used to calculate % of current level complete.", 'title': 'get_characters_character_id_skillqueue_level_start_sp'}, 'level_end_sp': {'format': 'int32', 'type': 'integer', 'description': 'level_end_sp integer', 'title': 'get_characters_character_id_skillqueue_level_end_sp'}, 'training_start_sp': {'format': 'int32', 'type': 'integer', 'description': 'training_start_sp integer', 'title': 'get_characters_character_id_skillqueue_training_start_sp'}}, 'description': '200 ok object', 'required': ['skill_id', 'finished_level', 'queue_position']}, 'type': 'array', 'description': '200 ok array', 'title': 'get_characters_character_id_skillqueue_ok'}, 'examples': {'application/json': [{'queue_position': 0, 'finish_date': '2016-06-29T10:47:00Z', 'start_date': '2016-06-29T10:46:00Z', 'skill_id': 1, 'finished_level': 3}, {'queue_position': 1, 'finish_date': '2016-07-15T10:47:00Z', 'start_date': '2016-06-29T10:47:00Z', 'skill_id': 1, 'finished_level': 4}, {'queue_position': 2, 'finish_date': '2016-08-30T10:47:00Z', 'start_date': '2016-07-15T10:47:00Z', 'skill_id': 2, 'finished_level': 2}]}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'The current skill queue, sorted ascending by finishing time'}}
def get(self, character_id, datasource="tranquility", **kwargs):
"""
List the configured skill queue for the given character
---
Alternate route: `/v2/characters/{character_id}/skillqueue/`
Alternate route: `/legacy/characters/{character_id}/skillqueue/`
Alternate route: `/dev/characters/{character_id}/skillqueue/`
---
This route is cached for up to 120 seconds
:type character_id: int
:param character_id: Character id of the target character
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
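The skillqueue schema notes that `level_start_sp` is "used to calculate % of current level complete". A sketch of that calculation; `current_sp` here is a hypothetical caller-supplied value (e.g. from a skills endpoint), not a field returned by this route:

```python
# Sketch: fraction of the current skill level trained, per the schema note
# that level_start_sp supports a percent-complete calculation. current_sp is
# a hypothetical input, not part of this endpoint's response.
def level_progress(level_start_sp: int, level_end_sp: int, current_sp: int) -> float:
    span = level_end_sp - level_start_sp
    if span <= 0:
        # Degenerate queue entry: treat the level as fully trained.
        return 1.0
    return (current_sp - level_start_sp) / span

progress = level_progress(1000, 5000, 3000)
# progress == 0.5, i.e. halfway through the current level
```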
class CharactersDetailFittings(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/fittings/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_fittings_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_fittings_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_fittings_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_fittings_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-fittings.read_fittings.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'items': {'title': 'get_characters_character_id_fittings_200_ok', 'type': 'object', 'properties': {'ship_type_id': {'format': 'int32', 'type': 'integer', 'description': 'ship_type_id integer', 'title': 'get_characters_character_id_fittings_ship_type_id'}, 'fitting_id': {'format': 'int32', 'type': 'integer', 'description': 'fitting_id integer', 'title': 'get_characters_character_id_fittings_fitting_id'}, 'name': {'type': 'string', 'description': 'name string', 'title': 'get_characters_character_id_fittings_name'}, 'description': {'type': 'string', 'description': 'description string', 'title': 'get_characters_character_id_fittings_description'}, 'items': {'items': {'title': 'get_characters_character_id_fittings_item', 'type': 'object', 'properties': {'flag': {'format': 'int32', 'type': 'integer', 'description': 'flag integer', 'title': 'get_characters_character_id_fittings_flag'}, 'quantity': {'format': 'int32', 'type': 'integer', 'description': 'quantity integer', 'title': 'get_characters_character_id_fittings_quantity'}, 'type_id': {'format': 'int32', 'type': 'integer', 'description': 
'type_id integer', 'title': 'get_characters_character_id_fittings_type_id'}}, 'description': 'item object', 'required': ['type_id', 'flag', 'quantity']}, 'type': 'array', 'description': 'items array', 'title': 'get_characters_character_id_fittings_items'}}, 'description': '200 ok object', 'required': ['fitting_id', 'name', 'description', 'ship_type_id', 'items']}, 'type': 'array', 'description': '200 ok array', 'title': 'get_characters_character_id_fittings_ok'}, 'examples': {'application/json': [{'ship_type_id': 123, 'fitting_id': 1, 'name': 'Best Vindicator', 'description': 'Awesome Vindi fitting', 'items': [{'flag': 12, 'quantity': 1, 'type_id': 1234}]}]}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'A list of fittings'}}
def get(self, character_id, datasource="tranquility", **kwargs):
"""
Return fittings of a character
---
Alternate route: `/v1/characters/{character_id}/fittings/`
Alternate route: `/legacy/characters/{character_id}/fittings/`
Alternate route: `/dev/characters/{character_id}/fittings/`
---
This route is cached for up to 300 seconds
:type character_id: int
:param character_id: ID for a character
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
post_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'post_characters_character_id_fittings_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'post_characters_character_id_fittings_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'post_characters_character_id_fittings_403_forbidden'}}, 'description': 'Forbidden', 'title': 'post_characters_character_id_fittings_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-fittings.write_fittings.v1'}}, 'description': 'Forbidden'}, '201': {'schema': {'title': 'post_characters_character_id_fittings_created', 'type': 'object', 'properties': {'fitting_id': {'format': 'int32', 'type': 'integer', 'description': 'fitting_id integer', 'title': 'post_characters_character_id_fittings_fitting_id'}}, 'description': '201 created object', 'required': ['fitting_id']}, 'examples': {'application/json': {'fitting_id': 2}}, 'description': 'A list of fittings'}}
def post(self, character_id, datasource="tranquility", **kwargs):
"""
Save a new fitting for a character
---
Alternate route: `/v1/characters/{character_id}/fittings/`
Alternate route: `/legacy/characters/{character_id}/fittings/`
Alternate route: `/dev/characters/{character_id}/fittings/`
:type character_id: int
:param character_id: ID for a character
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: fitting, token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.post_responses) \
.post(**kwargs_dict)
class CharactersDetailSearch(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/search/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_search_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_search_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_search_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_search_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-search.search_structures.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'type': 'object', 'properties': {'faction': {'items': {'format': 'int32', 'type': 'integer', 'description': 'faction integer', 'title': 'get_characters_character_id_search_faction'}, 'type': 'array', 'description': 'faction array', 'title': 'get_characters_character_id_search_faction'}, 'constellation': {'items': {'format': 'int32', 'type': 'integer', 'description': 'constellation integer', 'title': 'get_characters_character_id_search_constellation'}, 'type': 'array', 'description': 'constellation array', 'title': 'get_characters_character_id_search_constellation'}, 'inventorytype': {'items': {'format': 'int32', 'type': 'integer', 'description': 'inventorytype integer', 'title': 'get_characters_character_id_search_inventorytype'}, 'type': 'array', 'description': 'inventorytype array', 'title': 'get_characters_character_id_search_inventorytype'}, 'wormhole': {'items': {'format': 'int32', 'type': 'integer', 'description': 'wormhole integer', 'title': 'get_characters_character_id_search_wormhole'}, 'type': 'array', 'description': 'wormhole array', 'title': 
'get_characters_character_id_search_wormhole'}, 'corporation': {'items': {'format': 'int32', 'type': 'integer', 'description': 'corporation integer', 'title': 'get_characters_character_id_search_corporation'}, 'type': 'array', 'description': 'corporation array', 'title': 'get_characters_character_id_search_corporation'}, 'region': {'items': {'format': 'int32', 'type': 'integer', 'description': 'region integer', 'title': 'get_characters_character_id_search_region'}, 'type': 'array', 'description': 'region array', 'title': 'get_characters_character_id_search_region'}, 'alliance': {'items': {'format': 'int32', 'type': 'integer', 'description': 'alliance integer', 'title': 'get_characters_character_id_search_alliance'}, 'type': 'array', 'description': 'alliance array', 'title': 'get_characters_character_id_search_alliance'}, 'character': {'items': {'format': 'int32', 'type': 'integer', 'description': 'character integer', 'title': 'get_characters_character_id_search_character'}, 'type': 'array', 'description': 'character array', 'title': 'get_characters_character_id_search_character'}, 'structure': {'items': {'format': 'int64', 'type': 'integer', 'description': 'structure integer', 'title': 'get_characters_character_id_search_structure'}, 'type': 'array', 'description': 'structure array', 'title': 'get_characters_character_id_search_structure'}, 'agent': {'items': {'format': 'int32', 'type': 'integer', 'description': 'agent integer', 'title': 'get_characters_character_id_search_agent'}, 'type': 'array', 'description': 'agent array', 'title': 'get_characters_character_id_search_agent'}, 'solarsystem': {'items': {'format': 'int32', 'type': 'integer', 'description': 'solarsystem integer', 'title': 'get_characters_character_id_search_solarsystem'}, 'type': 'array', 'description': 'solarsystem array', 'title': 'get_characters_character_id_search_solarsystem'}, 'station': {'items': {'format': 'int32', 'type': 'integer', 'description': 'station integer', 'title': 
'get_characters_character_id_search_station'}, 'type': 'array', 'description': 'station array', 'title': 'get_characters_character_id_search_station'}}, 'description': '200 ok object', 'title': 'get_characters_character_id_search_ok'}, 'examples': {'application/json': {'solarsystem': [30002510], 'station': [60004588, 60004594, 60005725, 60009106, 60012721, 60012724, 60012727]}}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'A list of search results'}}
def get(self, categories, character_id, search, datasource="tranquility", language="en-us", strict=False, **kwargs):
"""
Search for entities that match a given sub-string.
---
Alternate route: `/v2/characters/{character_id}/search/`
---
This route is cached for up to 3600 seconds
:type categories: list
:param categories: Type of entities to search for
:type character_id: int
:param character_id: An EVE character ID
:type search: str
:param search: The string to search on
:type datasource: str
:param datasource: The server name you would like data from
:type language: str
:param language: Search locale
:type strict: bool
:param strict: Whether the search should be a strict match
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"categories": categories, "character_id": character_id, "search": search, "datasource": datasource, "language": language, "strict": strict,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
class CharactersNames(object):
base_url = "https://esi.tech.ccp.is/latest/characters/names/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_names_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_names_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '200': {'schema': {'items': {'title': 'get_characters_names_200_ok', 'type': 'object', 'properties': {'character_id': {'format': 'int64', 'type': 'integer', 'description': 'character_id integer', 'title': 'get_characters_names_character_id'}, 'character_name': {'type': 'string', 'description': 'character_name string', 'title': 'get_characters_names_character_name'}}, 'description': '200 ok object', 'required': ['character_id', 'character_name']}, 'type': 'array', 'description': '200 ok array', 'title': 'get_characters_names_ok'}, 'examples': {'application/json': [{'character_id': 95465499, 'character_name': 'CCP Bartender'}]}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'List of id/name associations'}}
def get(self, character_ids, datasource="tranquility", **kwargs):
"""
Resolve a set of character IDs to character names
---
Alternate route: `/v1/characters/names/`
Alternate route: `/legacy/characters/names/`
Alternate route: `/dev/characters/names/`
---
This route is cached for up to 3600 seconds
:type character_ids: list
:param character_ids: A comma separated list of character IDs
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: user_agent, X-User-Agent
"""
kwargs_dict = {
"character_ids": character_ids, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
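# Usage sketch (not part of the generated source): resolving character IDs to
# names via CharactersNames. This assumes EsiRequestObject performs the actual
# HTTP GET, so live network access to the ESI endpoint would be required.
#
#     names = CharactersNames().get(character_ids="95465499,90000001")
#     # On success the response matches the '200' schema in get_responses
#     # above: a list of {'character_id': ..., 'character_name': ...} objects.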
class CharactersDetailPlanets(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/planets/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_planets_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_planets_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_planets_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_planets_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-planets.manage_planets.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'items': {'title': 'get_characters_character_id_planets_200_ok', 'type': 'object', 'properties': {'planet_type': {'enum': ['temperate', 'barren', 'oceanic', 'ice', 'gas', 'lava', 'storm', 'plasma'], 'type': 'string', 'description': 'planet_type string', 'title': 'get_characters_character_id_planets_planet_type'}, 'owner_id': {'format': 'int32', 'type': 'integer', 'description': 'owner_id integer', 'title': 'get_characters_character_id_planets_owner_id'}, 'upgrade_level': {'maximum': 5, 'format': 'int32', 'minimum': 0, 'type': 'integer', 'description': 'upgrade_level integer', 'title': 'get_characters_character_id_planets_upgrade_level'}, 'num_pins': {'format': 'int32', 'minimum': 1, 'type': 'integer', 'description': 'num_pins integer', 'title': 'get_characters_character_id_planets_num_pins'}, 'solar_system_id': {'format': 'int32', 'type': 'integer', 'description': 'solar_system_id integer', 'title': 'get_characters_character_id_planets_solar_system_id'}, 'last_update': {'format': 'date-time', 'type': 'string', 'description': 'last_update string', 'title': 
'get_characters_character_id_planets_last_update'}, 'planet_id': {'format': 'int32', 'type': 'integer', 'description': 'planet_id integer', 'title': 'get_characters_character_id_planets_planet_id'}}, 'description': '200 ok object', 'required': ['solar_system_id', 'planet_id', 'planet_type', 'owner_id', 'last_update', 'upgrade_level', 'num_pins']}, 'type': 'array', 'description': '200 ok array', 'title': 'get_characters_character_id_planets_ok'}, 'examples': {'application/json': [{'planet_type': 'plasma', 'owner_id': 90000001, 'upgrade_level': 0, 'num_pins': 1, 'solar_system_id': 30000379, 'last_update': '2016-11-28T16:42:51Z', 'planet_id': 40023691}, {'planet_type': 'barren', 'owner_id': 90000001, 'upgrade_level': 0, 'num_pins': 1, 'solar_system_id': 30000379, 'last_update': '2016-11-28T16:41:54Z', 'planet_id': 40023697}]}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'List of colonies'}}
def get(self, character_id, datasource="tranquility", **kwargs):
"""
Returns a list of all planetary colonies owned by a character.
---
Alternate route: `/v1/characters/{character_id}/planets/`
Alternate route: `/legacy/characters/{character_id}/planets/`
Alternate route: `/dev/characters/{character_id}/planets/`
---
This route is cached for up to 600 seconds
:type character_id: int
:param character_id: Character id of the target character
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
class CharactersDetailBookmarks(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/bookmarks/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_bookmarks_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_bookmarks_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_bookmarks_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_bookmarks_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-bookmarks.read_character_bookmarks.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'items': {'title': 'get_characters_character_id_bookmarks_200_ok', 'type': 'object', 'properties': {'create_date': {'format': 'date-time', 'type': 'string', 'description': 'create_date string', 'title': 'get_characters_character_id_bookmarks_create_date'}, 'memo': {'type': 'string', 'description': 'memo string', 'title': 'get_characters_character_id_bookmarks_memo'}, 'owner_id': {'format': 'int32', 'type': 'integer', 'description': 'owner_id integer', 'title': 'get_characters_character_id_bookmarks_owner_id'}, 'creator_id': {'format': 'int32', 'type': 'integer', 'description': 'creator_id integer', 'title': 'get_characters_character_id_bookmarks_creator_id'}, 'folder_id': {'format': 'int32', 'type': 'integer', 'description': 'folder_id integer', 'title': 'get_characters_character_id_bookmarks_folder_id'}, 'target': {'title': 'get_characters_character_id_bookmarks_target', 'type': 'object', 'properties': {'item': {'title': 'get_characters_character_id_bookmarks_item', 'type': 'object', 'properties': {'item_id': {'format': 'int64', 'type': 'integer', 'description': 
'item_id integer', 'title': 'get_characters_character_id_bookmarks_item_id'}, 'type_id': {'format': 'int32', 'type': 'integer', 'description': 'type_id integer', 'title': 'get_characters_character_id_bookmarks_type_id'}}, 'description': 'item object', 'required': ['item_id', 'type_id']}, 'coordinates': {'title': 'get_characters_character_id_bookmarks_coordinates', 'type': 'object', 'properties': {'x': {'format': 'double', 'type': 'number', 'description': 'x number', 'title': 'get_characters_character_id_bookmarks_x'}, 'y': {'format': 'double', 'type': 'number', 'description': 'y number', 'title': 'get_characters_character_id_bookmarks_y'}, 'z': {'format': 'double', 'type': 'number', 'description': 'z number', 'title': 'get_characters_character_id_bookmarks_z'}}, 'description': 'coordinates object', 'required': ['x', 'y', 'z']}, 'location_id': {'format': 'int64', 'type': 'integer', 'description': 'location_id integer', 'title': 'get_characters_character_id_bookmarks_location_id'}}, 'description': 'target object', 'required': ['location_id']}, 'note': {'type': 'string', 'description': 'note string', 'title': 'get_characters_character_id_bookmarks_note'}, 'bookmark_id': {'format': 'int64', 'type': 'integer', 'description': 'bookmark_id integer', 'title': 'get_characters_character_id_bookmarks_bookmark_id'}}, 'description': '200 ok object', 'required': ['bookmark_id', 'creator_id', 'owner_id', 'create_date', 'memo', 'note', 'target']}, 'type': 'array', 'description': '200 ok array', 'title': 'get_characters_character_id_bookmarks_ok'}, 'examples': {'application/json': [{'create_date': '2016-08-09T11:57:47Z', 'memo': 'aoeu ( Citadel )', 'owner_id': 90000001, 'creator_id': 90000001, 'folder_id': 5, 'target': {'item': {'item_id': 1000000012668, 'type_id': 35832}, 'location_id': 30000005}, 'note': '', 'bookmark_id': 32}]}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The 
caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'A list of bookmarks'}}
def get(self, character_id, datasource="tranquility", **kwargs):
"""
List your character's personal bookmarks
---
Alternate route: `/v1/characters/{character_id}/bookmarks/`
Alternate route: `/legacy/characters/{character_id}/bookmarks/`
Alternate route: `/dev/characters/{character_id}/bookmarks/`
---
This route is cached for up to 3600 seconds
:type character_id: int
:param character_id: An EVE character ID
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
class CharactersDetailSkills(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/skills/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_skills_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_skills_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_skills_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_skills_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-skills.read_skills.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'type': 'object', 'properties': {'skills': {'items': {'type': 'object', 'properties': {'current_skill_level': {'format': 'int32', 'type': 'integer', 'description': 'current_skill_level integer', 'title': 'get_characters_character_id_skills_current_skill_level'}, 'skillpoints_in_skill': {'format': 'int64', 'type': 'integer', 'description': 'skillpoints_in_skill integer', 'title': 'get_characters_character_id_skills_skillpoints_in_skill'}, 'skill_id': {'format': 'int32', 'type': 'integer', 'description': 'skill_id integer', 'title': 'get_characters_character_id_skills_skill_id'}}, 'description': 'skill object', 'title': 'get_characters_character_id_skills_skill'}, 'type': 'array', 'description': 'skills array', 'title': 'get_characters_character_id_skills_skills'}, 'total_sp': {'format': 'int64', 'type': 'integer', 'description': 'total_sp integer', 'title': 'get_characters_character_id_skills_total_sp'}}, 'description': '200 ok object', 'title': 'get_characters_character_id_skills_ok'}, 'examples': {'application/json': {'skills': [{'current_skill_level': 1, 'skillpoints_in_skill': 10000, 'skill_id': 
1}, {'current_skill_level': 1, 'skillpoints_in_skill': 10000, 'skill_id': 2}], 'total_sp': 20000}}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'Known skills for the character'}}
def get(self, character_id, datasource="tranquility", **kwargs):
"""
List all trained skills for the given character
---
Alternate route: `/v3/characters/{character_id}/skills/`
Alternate route: `/dev/characters/{character_id}/skills/`
---
This route is cached for up to 120 seconds
:type character_id: int
:param character_id: An EVE character ID
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
class CharactersDetailWallets(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/wallets/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_wallets_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_wallets_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_wallets_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_wallets_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-wallet.read_character_wallet.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'items': {'type': 'object', 'properties': {'wallet_id': {'format': 'int32', 'type': 'integer', 'description': 'wallet_id integer', 'title': 'get_characters_character_id_wallets_wallet_id'}, 'balance': {'format': 'int64', 'type': 'integer', 'description': "Wallet's balance in ISK hundredths.", 'title': 'get_characters_character_id_wallets_balance'}}, 'description': '200 ok object', 'title': 'get_characters_character_id_wallets_200_ok'}, 'type': 'array', 'description': '200 ok array', 'title': 'get_characters_character_id_wallets_ok'}, 'examples': {'application/json': [{'wallet_id': 1000, 'balance': 295000}]}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'Wallet data for selected user'}}
def get(self, character_id, datasource="tranquility", **kwargs):
"""
List your wallets and their balances. Characters typically have only one wallet, with wallet_id 1000 being the master wallet.
---
Alternate route: `/v1/characters/{character_id}/wallets/`
Alternate route: `/legacy/characters/{character_id}/wallets/`
Alternate route: `/dev/characters/{character_id}/wallets/`
---
This route is cached for up to 120 seconds
:type character_id: int
:param character_id: An EVE character ID
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
class CharactersDetailFittingsDetail(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/fittings/{fitting_id}/"
delete_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'delete_characters_character_id_fittings_fitting_id_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'delete_characters_character_id_fittings_fitting_id_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'delete_characters_character_id_fittings_fitting_id_403_forbidden'}}, 'description': 'Forbidden', 'title': 'delete_characters_character_id_fittings_fitting_id_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-fittings.write_fittings.v1'}}, 'description': 'Forbidden'}, '204': {'description': 'Fitting deleted'}}
def delete(self, character_id, fitting_id, datasource="tranquility", **kwargs):
"""
Delete a fitting from a character
---
Alternate route: `/v1/characters/{character_id}/fittings/{fitting_id}/`
Alternate route: `/legacy/characters/{character_id}/fittings/{fitting_id}/`
Alternate route: `/dev/characters/{character_id}/fittings/{fitting_id}/`
:type character_id: int
:param character_id: ID for a character
:type fitting_id: int
:param fitting_id: ID for a fitting of this character
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "fitting_id": fitting_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.delete_responses) \
.delete(**kwargs_dict)
class CharactersDetailCspa(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/cspa/"
post_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'post_characters_character_id_cspa_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'post_characters_character_id_cspa_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'post_characters_character_id_cspa_403_forbidden'}}, 'description': 'Forbidden', 'title': 'post_characters_character_id_cspa_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-characters.read_contacts.v1'}}, 'description': 'Forbidden'}, '201': {'schema': {'type': 'object', 'properties': {'cost': {'format': 'int64', 'type': 'integer', 'description': 'cost integer', 'title': 'post_characters_character_id_cspa_cost'}}, 'description': '201 created object', 'title': 'post_characters_character_id_cspa_created'}, 'examples': {'application/json': {'cost': 295000}}, 'description': 'Aggregate cost of sending a mail from the source character to the target characters, in ISK hundredths'}}
def post(self, character_id, characters, datasource="tranquility", **kwargs):
"""
Takes a source character ID in the URL and a set of target character IDs in the body, and returns a CSPA charge cost
---
Alternate route: `/v3/characters/{character_id}/cspa/`
Alternate route: `/legacy/characters/{character_id}/cspa/`
Alternate route: `/dev/characters/{character_id}/cspa/`
:type character_id: int
:param character_id: An EVE character ID
:type characters: list
:param characters: The target characters to calculate the charge for
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "characters": characters, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.post_responses) \
.post(**kwargs_dict)
class CharactersDetailPlanetsDetail(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/planets/{planet_id}/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_planets_planet_id_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_planets_planet_id_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_planets_planet_id_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_planets_planet_id_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-planets.manage_planets.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'title': 'get_characters_character_id_planets_planet_id_ok', 'type': 'object', 'properties': {'links': {'items': {'title': 'get_characters_character_id_planets_planet_id_link', 'type': 'object', 'properties': {'destination_pin_id': {'format': 'int64', 'type': 'integer', 'description': 'destination_pin_id integer', 'title': 'get_characters_character_id_planets_planet_id_destination_pin_id'}, 'link_level': {'maximum': 10, 'format': 'int32', 'minimum': 0, 'type': 'integer', 'description': 'link_level integer', 'title': 'get_characters_character_id_planets_planet_id_link_level'}, 'source_pin_id': {'format': 'int64', 'type': 'integer', 'description': 'source_pin_id integer', 'title': 'get_characters_character_id_planets_planet_id_source_pin_id'}}, 'description': 'link object', 'required': ['source_pin_id', 'destination_pin_id', 'link_level']}, 'type': 'array', 'description': 'links array', 'title': 'get_characters_character_id_planets_planet_id_links'}, 'pins': {'items': {'title': 'get_characters_character_id_planets_planet_id_pin', 'type': 'object', 
'properties': {'factory_details': {'title': 'get_characters_character_id_planets_planet_id_factory_details', 'type': 'object', 'properties': {'schematic_id': {'format': 'int32', 'type': 'integer', 'description': 'schematic_id integer', 'title': 'get_characters_character_id_planets_planet_id_schematic_id'}}, 'description': 'factory_details object', 'required': ['schematic_id']}, 'expiry_time': {'format': 'date-time', 'type': 'string', 'description': 'expiry_time string', 'title': 'get_characters_character_id_planets_planet_id_expiry_time'}, 'schematic_id': {'format': 'int32', 'type': 'integer', 'description': 'schematic_id integer', 'title': 'get_characters_character_id_planets_planet_id_schematic_id'}, 'latitude': {'format': 'float', 'type': 'number', 'description': 'latitude number', 'title': 'get_characters_character_id_planets_planet_id_latitude'}, 'pin_id': {'format': 'int64', 'type': 'integer', 'description': 'pin_id integer', 'title': 'get_characters_character_id_planets_planet_id_pin_id'}, 'longitude': {'format': 'float', 'type': 'number', 'description': 'longitude number', 'title': 'get_characters_character_id_planets_planet_id_longitude'}, 'install_time': {'format': 'date-time', 'type': 'string', 'description': 'install_time string', 'title': 'get_characters_character_id_planets_planet_id_install_time'}, 'last_cycle_start': {'format': 'date-time', 'type': 'string', 'description': 'last_cycle_start string', 'title': 'get_characters_character_id_planets_planet_id_last_cycle_start'}, 'type_id': {'format': 'int32', 'type': 'integer', 'description': 'type_id integer', 'title': 'get_characters_character_id_planets_planet_id_type_id'}, 'extractor_details': {'title': 'get_characters_character_id_planets_planet_id_extractor_details', 'type': 'object', 'properties': {'product_type_id': {'format': 'int32', 'type': 'integer', 'description': 'product_type_id integer', 'title': 'get_characters_character_id_planets_planet_id_product_type_id'}, 'cycle_time': {'format': 
'int32', 'type': 'integer', 'description': 'in seconds', 'title': 'get_characters_character_id_planets_planet_id_cycle_time'}, 'qty_per_cycle': {'format': 'int32', 'type': 'integer', 'description': 'qty_per_cycle integer', 'title': 'get_characters_character_id_planets_planet_id_qty_per_cycle'}, 'head_radius': {'format': 'float', 'type': 'number', 'description': 'head_radius number', 'title': 'get_characters_character_id_planets_planet_id_head_radius'}, 'heads': {'items': {'title': 'get_characters_character_id_planets_planet_id_head', 'type': 'object', 'properties': {'head_id': {'maximum': 9, 'format': 'int32', 'minimum': 0, 'type': 'integer', 'description': 'head_id integer', 'title': 'get_characters_character_id_planets_planet_id_head_id'}, 'longitude': {'format': 'float', 'type': 'number', 'description': 'longitude number', 'title': 'get_characters_character_id_planets_planet_id_longitude'}, 'latitude': {'format': 'float', 'type': 'number', 'description': 'latitude number', 'title': 'get_characters_character_id_planets_planet_id_latitude'}}, 'description': 'head object', 'required': ['head_id', 'latitude', 'longitude']}, 'type': 'array', 'description': 'heads array', 'title': 'get_characters_character_id_planets_planet_id_heads'}}, 'description': 'extractor_details object', 'required': ['heads']}}, 'description': 'pin object', 'required': ['pin_id', 'type_id', 'latitude', 'longitude']}, 'type': 'array', 'description': 'pins array', 'title': 'get_characters_character_id_planets_planet_id_pins'}, 'routes': {'items': {'title': 'get_characters_character_id_planets_planet_id_route', 'type': 'object', 'properties': {'content_type_id': {'format': 'int32', 'type': 'integer', 'description': 'content_type_id integer', 'title': 'get_characters_character_id_planets_planet_id_content_type_id'}, 'source_pin_id': {'format': 'int64', 'type': 'integer', 'description': 'source_pin_id integer', 'title': 'get_characters_character_id_planets_planet_id_source_pin_id'}, 'waypoints': 
{'items': {'title': 'get_characters_character_id_planets_planet_id_waypoint', 'type': 'object', 'properties': {'order': {'maximum': 5, 'format': 'int32', 'minimum': 1, 'type': 'integer', 'description': 'order integer', 'title': 'get_characters_character_id_planets_planet_id_order'}, 'pin_id': {'format': 'int64', 'type': 'integer', 'description': 'pin_id integer', 'title': 'get_characters_character_id_planets_planet_id_pin_id'}}, 'description': 'waypoint object', 'required': ['pin_id', 'order']}, 'type': 'array', 'description': 'waypoints array', 'title': 'get_characters_character_id_planets_planet_id_waypoints'}, 'destination_pin_id': {'format': 'int64', 'type': 'integer', 'description': 'destination_pin_id integer', 'title': 'get_characters_character_id_planets_planet_id_destination_pin_id'}, 'route_id': {'format': 'int64', 'type': 'integer', 'description': 'route_id integer', 'title': 'get_characters_character_id_planets_planet_id_route_id'}, 'quantity': {'format': 'float', 'type': 'number', 'description': 'quantity number', 'title': 'get_characters_character_id_planets_planet_id_quantity'}}, 'description': 'route object', 'required': ['route_id', 'source_pin_id', 'destination_pin_id', 'content_type_id', 'quantity']}, 'type': 'array', 'description': 'routes array', 'title': 'get_characters_character_id_planets_planet_id_routes'}}, 'description': '200 ok object', 'required': ['links', 'pins', 'routes']}, 'examples': {'application/json': {'links': [{'destination_pin_id': 1000000017022, 'link_level': 0, 'source_pin_id': 1000000017021}], 'pins': [{'type_id': 2254, 'longitude': 0.717145933308, 'latitude': 1.55087844973, 'is_running': True, 'pin_id': 1000000017021}, {'type_id': 2256, 'longitude': 0.709775584394, 'latitude': 1.53360639935, 'is_running': True, 'pin_id': 1000000017022}], 'routes': [{'destination_pin_id': 1000000017030, 'content_type_id': 2393, 'route_id': 4, 'quantity': 20, 'source_pin_id': 1000000017029}]}}, 'headers': {'Expires': {'type': 'string', 
'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'Colony layout'}, '404': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'error message', 'title': 'get_characters_character_id_planets_planet_id_error'}}, 'description': 'Colony not found', 'title': 'get_characters_character_id_planets_planet_id_not_found'}, 'examples': {'application/json': {'error': 'Colony not found'}}, 'description': 'Colony not found'}}
def get(self, character_id, planet_id, datasource="tranquility", **kwargs):
"""
Returns full details on the layout of a single planetary colony, including links, pins and routes. Note: Planetary information is only recalculated when the colony is viewed through the client. Information on this endpoint will not update until this criterion is met.
---
Alternate route: `/v2/characters/{character_id}/planets/{planet_id}/`
Alternate route: `/dev/characters/{character_id}/planets/{planet_id}/`
---
This route is cached for up to 600 seconds
:type character_id: int
:param character_id: Character id of the target character
:type planet_id: int
:param planet_id: Planet id of the target planet
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "planet_id": planet_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
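As a sketch of what the wrapper above assembles before dispatching (the `EsiRequestObject` internals are not part of this module, so the helper below is hypothetical), the path parameters are substituted into `base_url` and any optional keyword arguments such as `token` are merged into the parameter dict:

```python
# Hypothetical sketch of the wrapper's parameter handling; the real
# request dispatch happens inside EsiRequestObject, which is not shown here.
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/planets/{planet_id}/"

def build_request(character_id, planet_id, datasource="tranquility", **kwargs):
    params = {"character_id": character_id, "planet_id": planet_id, "datasource": datasource}
    params.update(kwargs)  # e.g. token, user_agent, X-User-Agent
    url = base_url.format(character_id=character_id, planet_id=planet_id)
    return url, params

url, params = build_request(90000001, 40001234, token="hypothetical-token")
```

The same merge-then-dispatch pattern repeats in every method of this module.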
class CharactersDetailMailLists(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/mail/lists/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_mail_lists_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_mail_lists_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_mail_lists_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_mail_lists_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-mail.read_mail.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'items': {'title': 'get_characters_character_id_mail_lists_200_ok', 'type': 'object', 'properties': {'name': {'type': 'string', 'description': 'name string', 'title': 'get_characters_character_id_mail_lists_name'}, 'mailing_list_id': {'format': 'int32', 'type': 'integer', 'description': 'Mailing list ID', 'title': 'get_characters_character_id_mail_lists_mailing_list_id'}}, 'description': '200 ok object', 'required': ['mailing_list_id', 'name']}, 'type': 'array', 'description': '200 ok array', 'title': 'get_characters_character_id_mail_lists_ok'}, 'examples': {'application/json': [{'name': 'test_mailing_list', 'mailing_list_id': 1}]}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'Mailing lists'}}
def get(self, character_id, datasource="tranquility", **kwargs):
"""
Return all mailing lists that the character is subscribed to
---
Alternate route: `/v1/characters/{character_id}/mail/lists/`
Alternate route: `/legacy/characters/{character_id}/mail/lists/`
Alternate route: `/dev/characters/{character_id}/mail/lists/`
---
This route is cached for up to 120 seconds
:type character_id: int
:param character_id: An EVE character ID
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
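The documented 200 response above is a flat array of `{mailing_list_id, name}` objects; callers often want it as a lookup table. A minimal sketch using the example payload from `get_responses`:

```python
# Reshape the documented example response into a mailing_list_id -> name lookup.
example = [{'name': 'test_mailing_list', 'mailing_list_id': 1}]
lists_by_id = {row['mailing_list_id']: row['name'] for row in example}
```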
class CharactersDetailLoyaltyPoints(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/loyalty/points/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_loyalty_points_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_loyalty_points_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_loyalty_points_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_loyalty_points_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-characters.read_loyalty.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'items': {'title': 'get_characters_character_id_loyalty_points_200_ok', 'type': 'object', 'properties': {'loyalty_points': {'format': 'int32', 'type': 'integer', 'description': 'loyalty_points integer', 'title': 'get_characters_character_id_loyalty_points_loyalty_points'}, 'corporation_id': {'format': 'int32', 'type': 'integer', 'description': 'corporation_id integer', 'title': 'get_characters_character_id_loyalty_points_corporation_id'}}, 'description': '200 ok object', 'required': ['corporation_id', 'loyalty_points']}, 'type': 'array', 'description': '200 ok array', 'title': 'get_characters_character_id_loyalty_points_ok'}, 'examples': {'application/json': [{'loyalty_points': 100, 'corporation_id': 123}]}, 'description': 'A list of loyalty points'}}
def get(self, character_id, datasource="tranquility", **kwargs):
"""
Return a list of loyalty points for all corporations the character has worked for
---
Alternate route: `/v1/characters/{character_id}/loyalty/points/`
Alternate route: `/legacy/characters/{character_id}/loyalty/points/`
Alternate route: `/dev/characters/{character_id}/loyalty/points/`
:type character_id: int
:param character_id: ID for a character
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
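Each row in the documented 200 body pairs a `corporation_id` with a `loyalty_points` total. A small sketch, using the example payload from `get_responses`, of turning the list into a per-corporation lookup:

```python
# Index the documented example response by corporation.
example = [{'loyalty_points': 100, 'corporation_id': 123}]
points_by_corp = {row['corporation_id']: row['loyalty_points'] for row in example}
total_points = sum(points_by_corp.values())
```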
class CharactersDetailMailDetail(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/mail/{mail_id}/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_mail_mail_id_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_mail_mail_id_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_mail_mail_id_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_mail_mail_id_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-mail.read_mail.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'type': 'object', 'properties': {'subject': {'type': 'string', 'description': 'Mail subject', 'title': 'get_characters_character_id_mail_mail_id_subject'}, 'body': {'type': 'string', 'description': "Mail's body", 'title': 'get_characters_character_id_mail_mail_id_body'}, 'read': {'type': 'boolean', 'description': 'Whether the mail is flagged as read', 'title': 'get_characters_character_id_mail_mail_id_read'}, 'from': {'format': 'int32', 'type': 'integer', 'description': 'From whom the mail was sent', 'title': 'get_characters_character_id_mail_mail_id_from'}, 'timestamp': {'format': 'date-time', 'type': 'string', 'description': 'When the mail was sent', 'title': 'get_characters_character_id_mail_mail_id_timestamp'}, 'labels': {'items': {'format': 'int64', 'minimum': 0, 'type': 'integer', 'uniqueItems': True, 'description': 'label integer', 'title': 'get_characters_character_id_mail_mail_id_label'}, 'type': 'array', 'description': 'Labels attached to the mail', 'title': 'get_characters_character_id_mail_mail_id_labels'}, 'recipients': {'minItems': 1, 'maxItems': 50, 'items': 
{'title': 'get_characters_character_id_mail_mail_id_recipient', 'type': 'object', 'properties': {'recipient_type': {'enum': ['alliance', 'character', 'corporation', 'mailing_list'], 'type': 'string', 'description': 'recipient_type string', 'title': 'get_characters_character_id_mail_mail_id_recipient_type'}, 'recipient_id': {'format': 'int32', 'type': 'integer', 'description': 'recipient_id integer', 'title': 'get_characters_character_id_mail_mail_id_recipient_id'}}, 'description': 'recipient object', 'required': ['recipient_type', 'recipient_id']}, 'uniqueItems': True, 'type': 'array', 'description': 'Recipients of the mail', 'title': 'get_characters_character_id_mail_mail_id_recipients'}}, 'description': '200 ok object', 'title': 'get_characters_character_id_mail_mail_id_ok'}, 'examples': {'application/json': {'subject': 'test', 'body': 'blah blah blah', 'read': False, 'from': 90000001, 'timestamp': '2015-09-30T16:07:00Z', 'labels': [2, 32]}}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'Contents of a mail'}, '404': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Not found message', 'title': 'get_characters_character_id_mail_mail_id_404_not_found'}}, 'description': 'Not found', 'title': 'get_characters_character_id_mail_mail_id_not_found'}, 'examples': {'application/json': {'error': 'Not found message'}}, 'description': 'Mail not found'}}
def get(self, character_id, mail_id, datasource="tranquility", **kwargs):
"""
Return the contents of an EVE mail
---
Alternate route: `/v1/characters/{character_id}/mail/{mail_id}/`
Alternate route: `/legacy/characters/{character_id}/mail/{mail_id}/`
Alternate route: `/dev/characters/{character_id}/mail/{mail_id}/`
---
This route is cached for up to 30 seconds
:type character_id: int
:param character_id: An EVE character ID
:type mail_id: int
:param mail_id: An EVE mail ID
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "mail_id": mail_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
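One wrinkle when consuming the 200 body above: the sender field is literally named `from`, which is a Python keyword, so it can only be reached via dictionary access, never attribute access. A sketch using the documented example payload:

```python
from datetime import datetime

example = {'subject': 'test', 'body': 'blah blah blah', 'read': False,
           'from': 90000001, 'timestamp': '2015-09-30T16:07:00Z', 'labels': [2, 32]}

# 'from' is a reserved word in Python; example['from'] works, example.from would not.
sender = example['from']
sent_at = datetime.strptime(example['timestamp'], '%Y-%m-%dT%H:%M:%SZ')
```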
delete_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'delete_characters_character_id_mail_mail_id_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'delete_characters_character_id_mail_mail_id_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'delete_characters_character_id_mail_mail_id_403_forbidden'}}, 'description': 'Forbidden', 'title': 'delete_characters_character_id_mail_mail_id_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-mail.organize_mail.v1'}}, 'description': 'Forbidden'}, '204': {'description': 'Mail deleted'}}
def delete(self, character_id, mail_id, datasource="tranquility", **kwargs):
"""
Delete a mail
---
Alternate route: `/v1/characters/{character_id}/mail/{mail_id}/`
Alternate route: `/legacy/characters/{character_id}/mail/{mail_id}/`
Alternate route: `/dev/characters/{character_id}/mail/{mail_id}/`
:type character_id: int
:param character_id: An EVE character ID
:type mail_id: int
:param mail_id: An EVE mail ID
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "mail_id": mail_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.delete_responses) \
.delete(**kwargs_dict)
put_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'put_characters_character_id_mail_mail_id_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'put_characters_character_id_mail_mail_id_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '400': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Bad request message', 'title': 'put_characters_character_id_mail_mail_id_400_bad_request'}}, 'description': 'Bad request', 'title': 'put_characters_character_id_mail_mail_id_bad_request'}, 'examples': {'application/json': {'error': 'Bad request message'}}, 'description': 'Invalid label ID; or No parameters in body -- nothing to do'}, '204': {'description': 'Mail updated'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'put_characters_character_id_mail_mail_id_403_forbidden'}}, 'description': 'Forbidden', 'title': 'put_characters_character_id_mail_mail_id_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-mail.organize_mail.v1'}}, 'description': 'Forbidden'}}
def put(self, character_id, contents, mail_id, datasource="tranquility", **kwargs):
"""
Update metadata about a mail
---
Alternate route: `/v1/characters/{character_id}/mail/{mail_id}/`
Alternate route: `/legacy/characters/{character_id}/mail/{mail_id}/`
Alternate route: `/dev/characters/{character_id}/mail/{mail_id}/`
:type character_id: int
:param character_id: An EVE character ID
:type contents: None
:param contents: Data used to update the mail
:type mail_id: int
:param mail_id: An EVE mail ID
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "contents": contents, "mail_id": mail_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.put_responses) \
.put(**kwargs_dict)
class CharactersDetailAssets(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/assets/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_assets_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_assets_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_assets_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_assets_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-assets.read_assets.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'items': {'title': 'get_characters_character_id_assets_200_ok', 'type': 'object', 'properties': {'location_id': {'format': 'int64', 'type': 'integer', 'description': 'location_id integer', 'title': 'get_characters_character_id_assets_location_id'}, 'location_type': {'enum': ['station', 'solar_system', 'other'], 'type': 'string', 'description': 'location_type string', 'title': 'get_characters_character_id_assets_location_type'}, 'is_singleton': {'type': 'boolean', 'description': 'is_singleton boolean', 'title': 'get_characters_character_id_assets_is_singleton'}, 'quantity': {'format': 'int32', 'type': 'integer', 'description': 'quantity integer', 'title': 'get_characters_character_id_assets_quantity'}, 'location_flag': {'enum': ['AutoFit', 'Cargo', 'CorpseBay', 'DroneBay', 'FleetHangar', 'Deliveries', 'HiddenModifiers', 'Hangar', 'HangarAll', 'LoSlot0', 'LoSlot1', 'LoSlot2', 'LoSlot3', 'LoSlot4', 'LoSlot5', 'LoSlot6', 'LoSlot7', 'MedSlot0', 'MedSlot1', 'MedSlot2', 'MedSlot3', 'MedSlot4', 'MedSlot5', 'MedSlot6', 'MedSlot7', 'HiSlot0', 'HiSlot1', 'HiSlot2', 'HiSlot3', 'HiSlot4', 'HiSlot5', 'HiSlot6', 
'HiSlot7', 'AssetSafety', 'Locked', 'Unlocked', 'Implant', 'QuafeBay', 'RigSlot0', 'RigSlot1', 'RigSlot2', 'RigSlot3', 'RigSlot4', 'RigSlot5', 'RigSlot6', 'RigSlot7', 'ShipHangar', 'SpecializedFuelBay', 'SpecializedOreHold', 'SpecializedGasHold', 'SpecializedMineralHold', 'SpecializedSalvageHold', 'SpecializedShipHold', 'SpecializedSmallShipHold', 'SpecializedMediumShipHold', 'SpecializedLargeShipHold', 'SpecializedIndustrialShipHold', 'SpecializedAmmoHold', 'SpecializedCommandCenterHold', 'SpecializedPlanetaryCommoditiesHold', 'SpecializedMaterialBay', 'SubSystemSlot0', 'SubSystemSlot1', 'SubSystemSlot2', 'SubSystemSlot3', 'SubSystemSlot4', 'SubSystemSlot5', 'SubSystemSlot6', 'SubSystemSlot7', 'FighterBay', 'FighterTube0', 'FighterTube1', 'FighterTube2', 'FighterTube3', 'FighterTube4', 'Module'], 'type': 'string', 'description': 'location_flag string', 'title': 'get_characters_character_id_assets_location_flag'}, 'item_id': {'format': 'int64', 'type': 'integer', 'description': 'item_id integer', 'title': 'get_characters_character_id_assets_item_id'}, 'type_id': {'format': 'int32', 'type': 'integer', 'description': 'type_id integer', 'title': 'get_characters_character_id_assets_type_id'}}, 'description': '200 ok object', 'required': ['type_id', 'location_id', 'location_type', 'item_id', 'location_flag', 'is_singleton']}, 'type': 'array', 'description': '200 ok array', 'title': 'get_characters_character_id_assets_ok'}, 'examples': {'application/json': [{'location_id': 60002959, 'location_type': 'station', 'is_singleton': True, 'location_flag': 'Hangar', 'item_id': 1000000016835, 'type_id': 3516}]}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'A flat list of the users assets'}}
def get(self, character_id, datasource="tranquility", **kwargs):
"""
Return a list of the character's assets
---
Alternate route: `/v1/characters/{character_id}/assets/`
Alternate route: `/legacy/characters/{character_id}/assets/`
Alternate route: `/dev/characters/{character_id}/assets/`
---
This route is cached for up to 3600 seconds
:type character_id: int
:param character_id: Character id of the target character
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
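The 200 body above is deliberately flat, with each item's position encoded in `location_flag` (one of the long enum of hangar, slot, and hold names in the schema). A sketch, using the documented example payload, of grouping items back into per-flag buckets:

```python
from collections import defaultdict

# Group the documented example response's items by their location_flag.
example = [{'location_id': 60002959, 'location_type': 'station', 'is_singleton': True,
            'location_flag': 'Hangar', 'item_id': 1000000016835, 'type_id': 3516}]
by_flag = defaultdict(list)
for asset in example:
    by_flag[asset['location_flag']].append(asset['item_id'])
```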
class CharactersDetailMail(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/mail/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_mail_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_mail_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_mail_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_mail_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-mail.read_mail.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'items': {'type': 'object', 'properties': {'subject': {'type': 'string', 'description': 'Mail subject', 'title': 'get_characters_character_id_mail_subject'}, 'labels': {'minimum': 0, 'type': 'array', 'maxItems': 25, 'items': {'format': 'int64', 'type': 'integer', 'description': 'label integer', 'title': 'get_characters_character_id_mail_label'}, 'uniqueItems': True, 'description': 'labels array', 'title': 'get_characters_character_id_mail_labels'}, 'is_read': {'type': 'boolean', 'description': 'is_read boolean', 'title': 'get_characters_character_id_mail_is_read'}, 'mail_id': {'format': 'int64', 'type': 'integer', 'description': 'mail_id integer', 'title': 'get_characters_character_id_mail_mail_id'}, 'from': {'format': 'int32', 'type': 'integer', 'description': 'From whom the mail was sent', 'title': 'get_characters_character_id_mail_from'}, 'timestamp': {'format': 'date-time', 'type': 'string', 'description': 'When the mail was sent', 'title': 'get_characters_character_id_mail_timestamp'}, 'recipients': {'minItems': 1, 'maxItems': 50, 'items': {'title': 'get_characters_character_id_mail_recipient', 'type': 
'object', 'properties': {'recipient_type': {'enum': ['alliance', 'character', 'corporation', 'mailing_list'], 'type': 'string', 'description': 'recipient_type string', 'title': 'get_characters_character_id_mail_recipient_type'}, 'recipient_id': {'format': 'int32', 'type': 'integer', 'description': 'recipient_id integer', 'title': 'get_characters_character_id_mail_recipient_id'}}, 'description': 'recipient object', 'required': ['recipient_type', 'recipient_id']}, 'uniqueItems': True, 'type': 'array', 'description': 'Recipients of the mail', 'title': 'get_characters_character_id_mail_recipients'}}, 'description': '200 ok object', 'title': 'get_characters_character_id_mail_200_ok'}, 'type': 'array', 'description': '200 ok array', 'title': 'get_characters_character_id_mail_ok'}, 'examples': {'application/json': [{'subject': 'Title for EVE Mail', 'labels': [3], 'is_read': True, 'mail_id': 7, 'from': 90000001, 'timestamp': '2015-09-30T16:07:00Z', 'recipients': [{'recipient_type': 'character', 'recipient_id': 90000002}]}]}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'The requested mail'}}
def get(self, character_id, datasource="tranquility", **kwargs):
"""
Return the 50 most recent mail headers belonging to the character that match the query criteria. Queries can be filtered by label, and last_mail_id can be used to paginate backwards.
---
Alternate route: `/v1/characters/{character_id}/mail/`
Alternate route: `/legacy/characters/{character_id}/mail/`
Alternate route: `/dev/characters/{character_id}/mail/`
---
This route is cached for up to 30 seconds
:type character_id: int
:param character_id: An EVE character ID
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: labels, last_mail_id, token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
post_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'post_characters_character_id_mail_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'post_characters_character_id_mail_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'post_characters_character_id_mail_403_forbidden'}}, 'description': 'Forbidden', 'title': 'post_characters_character_id_mail_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-mail.send_mail.v1'}}, 'description': 'Forbidden'}, '400': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Bad request message', 'title': 'post_characters_character_id_mail_400_bad_request'}}, 'description': 'Bad request', 'title': 'post_characters_character_id_mail_bad_request'}, 'examples': {'application/json': {'error': 'Bad request message'}}, 'description': 'Only one corporation, alliance, or mailing list can be the\nrecipient of a mail\n'}, '201': {'schema': {'format': 'int32', 'type': 'integer', 'description': 'Mail ID', 'title': 'post_characters_character_id_mail_created'}, 'examples': {'application/json': 13}, 'description': 'Mail created'}}
def post(self, character_id, mail, datasource="tranquility", **kwargs):
"""
Create and send a new mail
---
Alternate route: `/v1/characters/{character_id}/mail/`
Alternate route: `/legacy/characters/{character_id}/mail/`
Alternate route: `/dev/characters/{character_id}/mail/`
:type character_id: int
:param character_id: The sender's character ID
:type mail: None
:param mail: The mail to send
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "mail": mail, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.post_responses) \
.post(**kwargs_dict)
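The mail body schema itself is not reproduced in this module, but two constraints are visible in the responses above: the documented 400 says only one corporation, alliance, or mailing list may be a recipient, and the read-side schema caps recipients at 1-50. A hypothetical pre-flight check built on those assumptions (recipient shape taken from the mail read schema earlier in this module):

```python
def check_recipients(recipients):
    """Hypothetical pre-flight check mirroring the documented 400 rule:
    1-50 recipients, and at most one corporation, alliance, or mailing list
    among them. Not part of the generated client itself."""
    if not 1 <= len(recipients) <= 50:
        return False
    broadcast = [r for r in recipients
                 if r['recipient_type'] in ('corporation', 'alliance', 'mailing_list')]
    return len(broadcast) <= 1

ok = check_recipients([{'recipient_type': 'character', 'recipient_id': 90000002}])
bad = check_recipients([{'recipient_type': 'corporation', 'recipient_id': 1},
                        {'recipient_type': 'alliance', 'recipient_id': 2}])
```

Checking locally avoids burning a request on a payload the server will reject with a 400.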
class CharactersDetailClones(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/clones/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_clones_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_clones_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_clones_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_clones_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-clones.read_clones.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'title': 'get_characters_character_id_clones_ok', 'type': 'object', 'properties': {'home_location': {'type': 'object', 'properties': {'location_id': {'format': 'int64', 'type': 'integer', 'description': 'location_id integer', 'title': 'get_characters_character_id_clones_location_id'}, 'location_type': {'enum': ['station', 'structure'], 'type': 'string', 'description': 'location_type string', 'title': 'get_characters_character_id_clones_location_type'}}, 'description': 'home_location object', 'title': 'get_characters_character_id_clones_home_location'}, 'jump_clones': {'items': {'type': 'object', 'properties': {'implants': {'items': {'format': 'int32', 'type': 'integer', 'description': 'implant integer', 'title': 'get_characters_character_id_clones_implant'}, 'type': 'array', 'description': 'implants array', 'title': 'get_characters_character_id_clones_implants'}, 'location_id': {'format': 'int64', 'type': 'integer', 'description': 'location_id integer', 'title': 'get_characters_character_id_clones_location_id'}, 'location_type': {'enum': ['station', 'structure'], 'type': 'string', 'description': 
'location_type string', 'title': 'get_characters_character_id_clones_location_type'}}, 'description': 'jump_clone object', 'title': 'get_characters_character_id_clones_jump_clone'}, 'type': 'array', 'description': 'jump_clones array', 'title': 'get_characters_character_id_clones_jump_clones'}, 'last_jump_date': {'format': 'date-time', 'type': 'string', 'description': 'last_jump_date string', 'title': 'get_characters_character_id_clones_last_jump_date'}}, 'description': '200 ok object', 'required': ['jump_clones']}, 'examples': {'application/json': {'home_location': {'location_id': 1021348135816, 'location_type': 'structure'}, 'jump_clones': [{'implants': [22118], 'location_id': 60003463, 'location_type': 'station'}, {'implants': [], 'location_id': 1021348135816, 'location_type': 'structure'}]}}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'Clone information for the given character'}}
def get(self, character_id, datasource="tranquility", **kwargs):
"""
A list of the character's clones
---
Alternate route: `/v2/characters/{character_id}/clones/`
Alternate route: `/dev/characters/{character_id}/clones/`
---
This route is cached for up to 120 seconds
:type character_id: int
:param character_id: An EVE character ID
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
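Unlike most endpoints in this module, the clones 200 body is a single object: a `home_location`, a required `jump_clones` array, and an optional `last_jump_date`. A sketch of walking the nested structure, using the documented example payload:

```python
# Flatten the documented example response's nested jump_clones array.
example = {'home_location': {'location_id': 1021348135816, 'location_type': 'structure'},
           'jump_clones': [{'implants': [22118], 'location_id': 60003463,
                            'location_type': 'station'},
                           {'implants': [], 'location_id': 1021348135816,
                            'location_type': 'structure'}]}
clone_locations = [c['location_id'] for c in example['jump_clones']]
all_implants = [i for c in example['jump_clones'] for i in c['implants']]
```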
class CharactersDetailContacts(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/contacts/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_contacts_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_contacts_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_contacts_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_contacts_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-characters.read_contacts.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'items': {'title': 'get_characters_character_id_contacts_200_ok', 'type': 'object', 'properties': {'label_id': {'format': 'int64', 'type': 'integer', 'description': 'Custom label of the contact', 'title': 'get_characters_character_id_contacts_label_id'}, 'standing': {'format': 'float', 'type': 'number', 'description': 'Standing of the contact', 'title': 'get_characters_character_id_contacts_standing'}, 'is_watched': {'type': 'boolean', 'description': 'Whether this contact is being watched', 'title': 'get_characters_character_id_contacts_is_watched'}, 'contact_type': {'enum': ['character', 'corporation', 'alliance', 'faction'], 'type': 'string', 'description': 'contact_type string', 'title': 'get_characters_character_id_contacts_contact_type'}, 'is_blocked': {'type': 'boolean', 'description': 'Whether this contact is in the blocked list. Note a missing value denotes unknown, not true or false', 'title': 'get_characters_character_id_contacts_is_blocked'}, 'contact_id': {'format': 'int32', 'type': 'integer', 'description': 'contact_id integer', 'title': 'get_characters_character_id_contacts_contact_id'}}, 'description': '200 ok object', 'required': ['standing', 'contact_type', 'contact_id']}, 'type': 'array', 'description': '200 ok array', 'title': 'get_characters_character_id_contacts_ok'}, 'examples': {'application/json': [{'is_blocked': False, 'standing': 10.0, 'is_watched': True, 'contact_id': 123, 'contact_type': 'character'}]}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'A list of contacts'}}
def get(self, character_id, datasource="tranquility",page=1,**kwargs):
"""
Return contacts of a character
---
Alternate route: `/v1/characters/{character_id}/contacts/`
Alternate route: `/legacy/characters/{character_id}/contacts/`
Alternate route: `/dev/characters/{character_id}/contacts/`
---
This route is cached for up to 300 seconds
:type character_id: int
:param character_id: ID for a character
:type datasource: str
:param datasource: The server name you would like data from
:type page: int
:param page: Which page of results to return
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict ={
"character_id" : character_id, "datasource" : datasource, "page" : page,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
post_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'post_characters_character_id_contacts_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'post_characters_character_id_contacts_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'post_characters_character_id_contacts_403_forbidden'}}, 'description': 'Forbidden', 'title': 'post_characters_character_id_contacts_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-characters.write_contacts.v1'}}, 'description': 'Forbidden'}, '201': {'schema': {'items': {'format': 'int32', 'type': 'integer', 'description': '201 created integer', 'title': 'post_characters_character_id_contacts_201_created'}, 'type': 'array', 'description': '201 created array', 'title': 'post_characters_character_id_contacts_created'}, 'examples': {'application/json': [123, 456]}, 'description': 'A list of contact ids that successfully created'}}
def post(self, character_id, contact_ids, datasource="tranquility",label_id=0,standing=-10.0,watched=False,**kwargs):
"""
Bulk add contacts with same settings
---
Alternate route: `/v1/characters/{character_id}/contacts/`
Alternate route: `/legacy/characters/{character_id}/contacts/`
Alternate route: `/dev/characters/{character_id}/contacts/`
:type character_id: int
:param character_id: ID for a character
:type contact_ids: list
:param contact_ids: A list of contacts to add
:type datasource: str
:param datasource: The server name you would like data from
:type label_id: int
:param label_id: Add a custom label to the new contact
:type standing: number
:param standing: Standing for the new contact
:type watched: boolean
:param watched: Whether the new contact should be watched, note this is only effective on characters
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict ={
"character_id" : character_id, "contact_ids" : contact_ids, "datasource" : datasource, "label_id" : label_id, "standing" : standing, "watched" : watched,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.post_responses) \
.post(**kwargs_dict)
delete_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'delete_characters_character_id_contacts_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'delete_characters_character_id_contacts_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'delete_characters_character_id_contacts_403_forbidden'}}, 'description': 'Forbidden', 'title': 'delete_characters_character_id_contacts_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-characters.write_contacts.v1'}}, 'description': 'Forbidden'}, '204': {'description': 'Contacts deleted'}}
def delete(self, character_id, contact_ids, datasource="tranquility",**kwargs):
"""
Bulk delete contacts
---
Alternate route: `/v1/characters/{character_id}/contacts/`
Alternate route: `/legacy/characters/{character_id}/contacts/`
Alternate route: `/dev/characters/{character_id}/contacts/`
:type character_id: int
:param character_id: ID for a character
:type contact_ids: list
:param contact_ids: A list of contacts to edit
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict ={
"character_id" : character_id, "contact_ids" : contact_ids, "datasource" : datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.delete_responses) \
.delete(**kwargs_dict)
put_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'put_characters_character_id_contacts_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'put_characters_character_id_contacts_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'put_characters_character_id_contacts_403_forbidden'}}, 'description': 'Forbidden', 'title': 'put_characters_character_id_contacts_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-characters.write_contacts.v1'}}, 'description': 'Forbidden'}, '204': {'description': 'Contacts updated'}}
def put(self, character_id, contact_ids, datasource="tranquility",label_id=0,standing=-10.0,watched=False,**kwargs):
"""
Bulk edit contacts with same settings
---
Alternate route: `/v1/characters/{character_id}/contacts/`
Alternate route: `/legacy/characters/{character_id}/contacts/`
Alternate route: `/dev/characters/{character_id}/contacts/`
:type character_id: int
:param character_id: ID for a character
:type contact_ids: list
:param contact_ids: A list of contacts to edit
:type datasource: str
:param datasource: The server name you would like data from
:type label_id: int
:param label_id: Add a custom label to the contact, use 0 for clearing label
:type standing: number
:param standing: Standing for the contact
:type watched: boolean
:param watched: Whether the contact should be watched, note this is only effective on characters
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict ={
"character_id" : character_id, "contact_ids" : contact_ids, "datasource" : datasource, "label_id" : label_id, "standing" : standing, "watched" : watched,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.put_responses) \
.put(**kwargs_dict)
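Each method above merges its path and query parameters into one dict before handing it to `EsiRequestObject`, whose `base_url` carries `{character_id}`-style placeholders. A minimal, self-contained sketch of how such a request object might expand the path template and separate out the remaining query parameters (the helper name `expand_path` is hypothetical, not part of this client):

```python
import string

def expand_path(template, params):
    """Fill {name} path placeholders from params; return the expanded URL
    plus the leftover parameters (which would become query-string args)."""
    # Collect the placeholder names that appear in the URL template.
    fields = {f for _, f, _, _ in string.Formatter().parse(template) if f}
    url = template.format(**{k: params[k] for k in fields})
    leftover = {k: v for k, v in params.items() if k not in fields}
    return url, leftover

url, query = expand_path(
    "https://esi.tech.ccp.is/latest/characters/{character_id}/contacts/",
    {"character_id": 123, "datasource": "tranquility", "page": 1},
)
# url fills in the character_id; datasource and page remain as query params
```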
class CharactersDetailLocation(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/location/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_location_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_location_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_location_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_location_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-location.read_location.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'title': 'get_characters_character_id_location_ok', 'type': 'object', 'properties': {'solar_system_id': {'format': 'int32', 'type': 'integer', 'description': 'solar_system_id integer', 'title': 'get_characters_character_id_location_solar_system_id'}, 'station_id': {'format': 'int32', 'type': 'integer', 'description': 'station_id integer', 'title': 'get_characters_character_id_location_station_id'}, 'structure_id': {'format': 'int64', 'type': 'integer', 'description': 'structure_id integer', 'title': 'get_characters_character_id_location_structure_id'}}, 'description': '200 ok object', 'required': ['solar_system_id']}, 'examples': {'application/json': {'solar_system_id': 30002505, 'structure_id': 1000000016989}}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'Information about the characters current location. Returns the current solar system id, and also the current station or structure ID if applicable.'}}
def get(self, character_id, datasource="tranquility",**kwargs):
"""
Information about the character's current location. Returns the current solar system id, and also the current station or structure ID if applicable.
---
Alternate route: `/v1/characters/{character_id}/location/`
Alternate route: `/legacy/characters/{character_id}/location/`
Alternate route: `/dev/characters/{character_id}/location/`
---
This route is cached for up to 5 seconds
:type character_id: int
:param character_id: An EVE character ID
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict ={
"character_id" : character_id, "datasource" : datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
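The `get_responses` schemas above mark some keys as `required` (here, only `solar_system_id`); `station_id` and `structure_id` may be absent when the character is undocked or not in a structure. A small sketch of checking a decoded payload against a schema's required list (the helper `missing_required` is illustrative, not part of this client):

```python
def missing_required(schema, payload):
    """Return the schema's required keys that are absent from a decoded
    JSON payload (empty list means the payload satisfies the schema)."""
    return [k for k in schema.get("required", []) if k not in payload]

# Trimmed-down copy of the 200 schema for the location endpoint.
schema = {"required": ["solar_system_id"],
          "properties": {"solar_system_id": {}, "station_id": {}, "structure_id": {}}}
payload = {"solar_system_id": 30002505, "structure_id": 1000000016989}
# station_id is optional, so this payload passes the check
```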
class CharactersDetailCalendarDetail(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/calendar/{event_id}/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_calendar_event_id_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_calendar_event_id_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_calendar_event_id_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_calendar_event_id_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-calendar.read_calendar_events.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'title': 'get_characters_character_id_calendar_event_id_ok', 'type': 'object', 'properties': {'importance': {'format': 'int32', 'type': 'integer', 'description': 'importance integer', 'title': 'get_characters_character_id_calendar_event_id_importance'}, 'duration': {'format': 'int32', 'type': 'integer', 'description': 'Length in minutes', 'title': 'get_characters_character_id_calendar_event_id_duration'}, 'date': {'format': 'date-time', 'type': 'string', 'description': 'date string', 'title': 'get_characters_character_id_calendar_event_id_date'}, 'event_id': {'format': 'int32', 'type': 'integer', 'description': 'event_id integer', 'title': 'get_characters_character_id_calendar_event_id_event_id'}, 'owner_type': {'enum': ['eve_server', 'corporation', 'faction', 'character', 'alliance'], 'type': 'string', 'description': 'owner_type string', 'title': 'get_characters_character_id_calendar_event_id_owner_type'}, 'response': {'type': 'string', 'description': 'response string', 'title': 'get_characters_character_id_calendar_event_id_response'}, 'text': {'type': 'string', 'description': 'text string', 'title': 'get_characters_character_id_calendar_event_id_text'}, 'owner_name': {'type': 'string', 'description': 'owner_name string', 'title': 'get_characters_character_id_calendar_event_id_owner_name'}, 'owner_id': {'format': 'int32', 'type': 'integer', 'description': 'owner_id integer', 'title': 'get_characters_character_id_calendar_event_id_owner_id'}, 'title': {'type': 'string', 'description': 'title string', 'title': 'get_characters_character_id_calendar_event_id_title'}}, 'description': 'Full details of a specific event', 'required': ['event_id', 'owner_id', 'owner_name', 'date', 'title', 'duration', 'importance', 'response', 'text', 'owner_type']}, 'examples': {'application/json': {'importance': 1, 'duration': 60, 'date': '2016-06-26T21:00:00Z', 'event_id': 1386435, 'owner_type': 'eve_server', 'response': 'Undecided', 'text': 'o7: The EVE Online Show features latest developer news, fast paced action, community overviews and a lot more with CCP Guard and CCP Mimic. Join the thrilling o7 live broadcast at 20:00 EVE time (=UTC) on <a href="http://www.twitch.tv/ccp">EVE TV</a>. Don\'t miss it!', 'owner_name': 'EVE System', 'owner_id': 1, 'title': 'o7 The EVE Online Show'}}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'Full details of a specific event'}}
def get(self, character_id, event_id, datasource="tranquility",**kwargs):
"""
Get all the information for a specific event
---
Alternate route: `/v3/characters/{character_id}/calendar/{event_id}/`
Alternate route: `/dev/characters/{character_id}/calendar/{event_id}/`
---
This route is cached for up to 5 seconds
:type character_id: int
:param character_id: The character id requesting the event
:type event_id: int
:param event_id: The id of the event requested
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict ={
"character_id" : character_id, "event_id" : event_id, "datasource" : datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
put_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'put_characters_character_id_calendar_event_id_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'put_characters_character_id_calendar_event_id_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'put_characters_character_id_calendar_event_id_403_forbidden'}}, 'description': 'Forbidden', 'title': 'put_characters_character_id_calendar_event_id_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-calendar.respond_calendar_events.v1'}}, 'description': 'Forbidden'}, '204': {'description': 'Event updated'}}
def put(self, character_id, event_id, response, datasource="tranquility",**kwargs):
"""
Set your response status to an event
---
Alternate route: `/v3/characters/{character_id}/calendar/{event_id}/`
Alternate route: `/dev/characters/{character_id}/calendar/{event_id}/`
:type character_id: int
:param character_id: The character ID requesting the event
:type event_id: int
:param event_id: The ID of the event requested
:type response: str
:param response: The response value to set, overriding current value.
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict ={
"character_id" : character_id, "event_id" : event_id, "response" : response, "datasource" : datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.put_responses) \
.put(**kwargs_dict)
class CharactersDetailKillmailsRecent(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/killmails/recent/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_killmails_recent_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_killmails_recent_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_killmails_recent_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_killmails_recent_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-killmails.read_killmails.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'items': {'title': 'get_characters_character_id_killmails_recent_200_ok', 'type': 'object', 'properties': {'killmail_id': {'format': 'int32', 'type': 'integer', 'description': 'ID of this killmail', 'title': 'get_characters_character_id_killmails_recent_killmail_id'}, 'killmail_hash': {'type': 'string', 'description': 'A hash of this killmail', 'title': 'get_characters_character_id_killmails_recent_killmail_hash'}}, 'description': '200 ok object', 'required': ['killmail_id', 'killmail_hash']}, 'type': 'array', 'description': '200 ok array', 'title': 'get_characters_character_id_killmails_recent_ok'}, 'examples': {'application/json': [{'killmail_id': 2, 'killmail_hash': '8eef5e8fb6b88fe3407c489df33822b2e3b57a5e'}, {'killmail_id': 1, 'killmail_hash': 'b41ccb498ece33d64019f64c0db392aa3aa701fb'}]}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'A list of killmail IDs and hashes'}}
def get(self, character_id, datasource="tranquility",max_count=50,**kwargs):
"""
Return a list of a character's recent kills and losses
---
Alternate route: `/v1/characters/{character_id}/killmails/recent/`
Alternate route: `/legacy/characters/{character_id}/killmails/recent/`
Alternate route: `/dev/characters/{character_id}/killmails/recent/`
---
This route is cached for up to 120 seconds
:type character_id: int
:param character_id: An EVE character ID
:type datasource: str
:param datasource: The server name you would like data from
:type max_count: int
:param max_count: How many killmails to return at maximum
:param kwargs: max_kill_id, token, user_agent, X-User-Agent
"""
kwargs_dict ={
"character_id" : character_id, "datasource" : datasource, "max_count" : max_count,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
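The calling convention used throughout these methods is worth making explicit: named parameters (`character_id`, `datasource`, `max_count`) are collected into a dict first, and then optional extras passed via `**kwargs` (here `max_kill_id` for paging and `token` for auth) are layered on top. A minimal sketch of that merge, detached from any HTTP machinery:

```python
def build_params(character_id, datasource="tranquility", max_count=50, **kwargs):
    """Mirror the merge pattern used above: explicit parameters first,
    then any extras (token, max_kill_id, ...) merged on top."""
    params = {"character_id": character_id,
              "datasource": datasource,
              "max_count": max_count}
    params.update(kwargs)
    return params

p = build_params(123, token="abc", max_kill_id=99)
# p carries both the defaults and the pass-through extras
```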
class CharactersDetailCorporationhistory(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/corporationhistory/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_corporationhistory_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_corporationhistory_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '200': {'schema': {'items': {'type': 'object', 'properties': {'start_date': {'format': 'date-time', 'type': 'string', 'description': 'start_date string', 'title': 'get_characters_character_id_corporationhistory_start_date'}, 'corporation_id': {'format': 'int32', 'type': 'integer', 'description': 'corporation_id integer', 'title': 'get_characters_character_id_corporationhistory_corporation_id'}, 'is_deleted': {'type': 'boolean', 'description': 'True if the corporation has been deleted', 'title': 'get_characters_character_id_corporationhistory_is_deleted'}, 'record_id': {'format': 'int32', 'type': 'integer', 'description': 'An incrementing ID that can be used to canonically establish order of records in cases where dates may be ambiguous', 'title': 'get_characters_character_id_corporationhistory_record_id'}}, 'description': '200 ok object', 'title': 'get_characters_character_id_corporationhistory_200_ok'}, 'type': 'array', 'description': '200 ok array', 'title': 'get_characters_character_id_corporationhistory_ok'}, 'examples': {'application/json': [{'start_date': '2016-06-26T20:00:00Z', 'corporation_id': 90000001, 'is_deleted': False, 'record_id': 500}, {'start_date': '2016-07-26T20:00:00Z', 'corporation_id': 90000002, 'is_deleted': False, 'record_id': 501}]}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'Corporation history for the given character'}}
def get(self, character_id, datasource="tranquility",**kwargs):
"""
Get a list of all the corporations a character has been a member of
---
Alternate route: `/v1/characters/{character_id}/corporationhistory/`
Alternate route: `/legacy/characters/{character_id}/corporationhistory/`
Alternate route: `/dev/characters/{character_id}/corporationhistory/`
---
This route is cached for up to 3600 seconds
:type character_id: int
:param character_id: An EVE character ID
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: user_agent, X-User-Agent
"""
kwargs_dict ={
"character_id" : character_id, "datasource" : datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
class CharactersDetailPortrait(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/portrait/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_portrait_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_portrait_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '200': {'schema': {'type': 'object', 'properties': {'px64x64': {'type': 'string', 'description': 'px64x64 string', 'title': 'get_characters_character_id_portrait_px64x64'}, 'px256x256': {'type': 'string', 'description': 'px256x256 string', 'title': 'get_characters_character_id_portrait_px256x256'}, 'px512x512': {'type': 'string', 'description': 'px512x512 string', 'title': 'get_characters_character_id_portrait_px512x512'}, 'px128x128': {'type': 'string', 'description': 'px128x128 string', 'title': 'get_characters_character_id_portrait_px128x128'}}, 'description': '200 ok object', 'title': 'get_characters_character_id_portrait_ok'}, 'examples': {'application/json': {'px64x64': 'https://imageserver.eveonline.com/Character/95465499_64.jpg', 'px256x256': 'https://imageserver.eveonline.com/Character/95465499_256.jpg', 'px512x512': 'https://imageserver.eveonline.com/Character/95465499_512.jpg', 'px128x128': 'https://imageserver.eveonline.com/Character/95465499_128.jpg'}}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'Public data for the given character'}, '404': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'error message', 'title': 'get_characters_character_id_portrait_error'}}, 'description': 'No image server for this datasource', 'title': 'get_characters_character_id_portrait_not_found'}, 'examples': {'application/json': {'error': 'No image server for this datasource'}}, 'description': 'No image server for this datasource'}}
def get(self, character_id, datasource="tranquility",**kwargs):
"""
Get portrait urls for a character
---
Alternate route: `/v2/characters/{character_id}/portrait/`
Alternate route: `/dev/characters/{character_id}/portrait/`
---
This route is cached for up to 3600 seconds
:type character_id: int
:param character_id: An EVE character ID
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: user_agent, X-User-Agent
"""
kwargs_dict ={
"character_id" : character_id, "datasource" : datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
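The portrait payload keys encode the image size (`px64x64` through `px512x512`), and any of them may be absent. A small sketch of selecting the largest size a response actually contains (the helper `largest_portrait` is illustrative, not part of this client):

```python
import re

def largest_portrait(portraits):
    """Return the URL under the biggest pxNxN key present in a
    portrait payload dict such as the 200 example above."""
    def size(key):
        m = re.match(r"px(\d+)x\1$", key)  # e.g. "px128x128" -> 128
        return int(m.group(1)) if m else -1
    return portraits[max(portraits, key=size)]

urls = {"px64x64": "https://imageserver.eveonline.com/Character/95465499_64.jpg",
        "px512x512": "https://imageserver.eveonline.com/Character/95465499_512.jpg"}
# picks the 512x512 URL, the largest size in this payload
```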
class CharactersDetailMailLabels(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/mail/labels/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_mail_labels_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_mail_labels_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_mail_labels_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_mail_labels_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-mail.read_mail.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'type': 'object', 'properties': {'total_unread_count': {'format': 'int32', 'minimum': 0, 'type': 'integer', 'description': 'total_unread_count integer', 'title': 'get_characters_character_id_mail_labels_total_unread_count'}, 'labels': {'items': {'type': 'object', 'properties': {'label_id': {'format': 'int32', 'minimum': 0, 'type': 'integer', 'description': 'label_id integer', 'title': 'get_characters_character_id_mail_labels_label_id'}, 'unread_count': {'format': 'int32', 'minimum': 0, 'type': 'integer', 'description': 'unread_count integer', 'title': 'get_characters_character_id_mail_labels_unread_count'}, 'name': {'maxLength': 40, 'type': 'string', 'description': 'name string', 'title': 'get_characters_character_id_mail_labels_name'}, 'color': {'default': '#ffffff', 'enum': ['#ffffff', '#ffff01', '#ff6600', '#fe0000', '#9a0000', '#660066', '#0000fe', '#0099ff', '#01ffff', '#00ff33', '#349800', '#006634', '#666666', '#999999', '#e6e6e6', '#ffffcd', '#99ffff', '#ccff9a'], 'type': 'string', 'description': 'color string', 'title': 'get_characters_character_id_mail_labels_color'}}, 'description': 'label object', 'title': 'get_characters_character_id_mail_labels_label'}, 'type': 'array', 'description': 'labels array', 'title': 'get_characters_character_id_mail_labels_labels'}}, 'description': '200 ok object', 'title': 'get_characters_character_id_mail_labels_ok'}, 'examples': {'application/json': {'total_unread_count': 5, 'labels': [{'label_id': 16, 'unread_count': 4, 'name': 'PINK', 'color_hex': '#660066'}, {'label_id': 17, 'unread_count': 1, 'name': 'WHITE', 'color_hex': '#ffffff'}]}}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'A list of mail labels and unread counts'}}
def get(self, character_id, datasource="tranquility",**kwargs):
"""
Return a list of the user's mail labels, unread counts for each label and a total unread count.
---
Alternate route: `/v3/characters/{character_id}/mail/labels/`
Alternate route: `/dev/characters/{character_id}/mail/labels/`
---
This route is cached for up to 30 seconds
:type character_id: int
:param character_id: An EVE character ID
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict ={
"character_id" : character_id, "datasource" : datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
post_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'post_characters_character_id_mail_labels_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'post_characters_character_id_mail_labels_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'post_characters_character_id_mail_labels_403_forbidden'}}, 'description': 'Forbidden', 'title': 'post_characters_character_id_mail_labels_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-mail.organize_mail.v1'}}, 'description': 'Forbidden'}, '201': {'schema': {'format': 'int64', 'type': 'integer', 'description': 'Label ID', 'title': 'post_characters_character_id_mail_labels_created'}, 'examples': {'application/json': 128}, 'description': 'Label created'}}
def post(self, character_id, datasource="tranquility",**kwargs):
"""
Create a mail label
---
Alternate route: `/v2/characters/{character_id}/mail/labels/`
Alternate route: `/legacy/characters/{character_id}/mail/labels/`
Alternate route: `/dev/characters/{character_id}/mail/labels/`
:type character_id: int
:param character_id: An EVE character ID
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: label, token, user_agent, X-User-Agent
"""
kwargs_dict ={
"character_id" : character_id, "datasource" : datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.post_responses) \
.post(**kwargs_dict)
class CharactersDetailShip(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/ship/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_ship_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_ship_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_ship_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_ship_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-location.read_ship_type.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'title': 'get_characters_character_id_ship_ok', 'type': 'object', 'properties': {'ship_type_id': {'format': 'int32', 'type': 'integer', 'description': 'ship_type_id integer', 'title': 'get_characters_character_id_ship_ship_type_id'}, 'ship_name': {'type': 'string', 'description': 'ship_name string', 'title': 'get_characters_character_id_ship_ship_name'}, 'ship_item_id': {'format': 'int64', 'type': 'integer', 'description': "Item id's are unique to a ship and persist until it is repackaged. This value can be used to track repeated uses of a ship, or detect when a pilot changes into a different instance of the same ship type.", 'title': 'get_characters_character_id_ship_ship_item_id'}}, 'description': '200 ok object', 'required': ['ship_type_id', 'ship_item_id', 'ship_name']}, 'examples': {'application/json': {'ship_type_id': 1233, 'ship_name': 'SPACESHIPS!!!', 'ship_item_id': 1000000016991}}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'Get the current ship type, name and id'}}
def get(self, character_id, datasource="tranquility", **kwargs):
"""
Get the current ship type, name and id
---
Alternate route: `/v1/characters/{character_id}/ship/`
Alternate route: `/legacy/characters/{character_id}/ship/`
Alternate route: `/dev/characters/{character_id}/ship/`
---
This route is cached for up to 5 seconds
:type character_id: int
:param character_id: An EVE character ID
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)
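The `200` schema above lists `ship_type_id`, `ship_item_id` and `ship_name` as required, and bundles an example payload. A quick check that the example actually satisfies the requirement:

```python
# Values copied from the schema's example payload above.
example_payload = {"ship_type_id": 1233, "ship_name": "SPACESHIPS!!!", "ship_item_id": 1000000016991}
required_fields = ["ship_type_id", "ship_item_id", "ship_name"]

missing = [field for field in required_fields if field not in example_payload]
assert not missing, "example payload is missing required fields: {}".format(missing)
```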
class CharactersDetailCalendar(object):
base_url = "https://esi.tech.ccp.is/latest/characters/{character_id}/calendar/"
get_responses = {'500': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Internal server error message', 'title': 'get_characters_character_id_calendar_500_internal_server_error'}}, 'description': 'Internal server error', 'title': 'get_characters_character_id_calendar_internal_server_error'}, 'examples': {'application/json': {'error': "uncaught exception: IOError('out of memory')"}}, 'description': 'Internal server error'}, '403': {'schema': {'type': 'object', 'properties': {'error': {'type': 'string', 'description': 'Forbidden message', 'title': 'get_characters_character_id_calendar_403_forbidden'}}, 'description': 'Forbidden', 'title': 'get_characters_character_id_calendar_forbidden'}, 'examples': {'application/json': {'error': 'Token is not valid for scope(s): esi-calendar.read_calendar_events.v1'}}, 'description': 'Forbidden'}, '200': {'schema': {'items': {'type': 'object', 'properties': {'event_date': {'format': 'date-time', 'type': 'string', 'description': 'event_date string', 'title': 'get_characters_character_id_calendar_event_date'}, 'importance': {'format': 'int32', 'type': 'integer', 'description': 'importance integer', 'title': 'get_characters_character_id_calendar_importance'}, 'event_id': {'format': 'int32', 'type': 'integer', 'description': 'event_id integer', 'title': 'get_characters_character_id_calendar_event_id'}, 'event_response': {'enum': ['declined', 'not_responded', 'accepted', 'tentative'], 'type': 'string', 'description': 'event_response string', 'title': 'get_characters_character_id_calendar_event_response'}, 'title': {'type': 'string', 'description': 'title string', 'title': 'get_characters_character_id_calendar_title'}}, 'description': 'event', 'title': 'get_characters_character_id_calendar_200_ok'}, 'type': 'array', 'description': 'Up to 50 events from now or the event you requested\n', 'title': 'get_characters_character_id_calendar_ok'}, 'examples': {'application/json': [{'event_date': 
'2016-06-26T20:00:00Z', 'importance': 0, 'event_id': 1386435, 'event_response': 'accepted', 'title': 'o7 The EVE Online Show'}]}, 'headers': {'Expires': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}, 'Cache-Control': {'type': 'string', 'description': 'The caching mechanism used'}, 'Last-Modified': {'type': 'string', 'description': 'RFC7231 formatted datetime string'}}, 'description': 'A collection of event summaries'}}
def get(self, character_id, datasource="tranquility", **kwargs):
"""
Get 50 event summaries from the calendar. If no event ID is given,
the resource will return the next 50 chronological event summaries
from now. If an event ID is specified, it will return the next 50
chronological event summaries from after that event.
---
Alternate route: `/v1/characters/{character_id}/calendar/`
Alternate route: `/legacy/characters/{character_id}/calendar/`
Alternate route: `/dev/characters/{character_id}/calendar/`
---
This route is cached for up to 5 seconds
:type character_id: int
:param character_id: The character to retrieve events from
:type datasource: str
:param datasource: The server name you would like data from
:param kwargs: from_event, token, user_agent, X-User-Agent
"""
kwargs_dict = {
"character_id": character_id, "datasource": datasource,
}
kwargs_dict.update(kwargs)
return EsiRequestObject(self.base_url, self.get_responses) \
.get(**kwargs_dict)

# === tests/test_rlearner.py (repo: ilirmaci/upliftml, license: Apache-2.0) ===
import h2o  # type: ignore
from h2o.estimators.glm import H2OGeneralizedLinearEstimator # type: ignore
from pyspark.sql import SparkSession
from sklearn.metrics import r2_score # type: ignore
from upliftml.models.h2o import RLearnerEstimator
def test_train_model(spark: SparkSession, df_h2o_cont: h2o.H2OFrame) -> None:
orig_data_types = df_h2o_cont.types
orig_shape = df_h2o_cont.shape
predictor_colnames = [col for col in df_h2o_cont.columns if col.startswith("feature")]
model = RLearnerEstimator(
target_model_class=H2OGeneralizedLinearEstimator,
target_model_params={},
final_model_class=H2OGeneralizedLinearEstimator,
final_model_params={},
predictor_colnames=predictor_colnames,
propensity_model_class=None,
propensity_model_params=None,
target_colname="outcome",
treatment_colname="treatment",
treatment_value=1,
control_value=0,
categorical_outcome=False,
fold_colname=None,
n_folds=None,
)
model.fit(df_h2o_cont)
df_h2o_train_eval = df_h2o_cont.cbind(model.predict(df_h2o_cont).set_names(["predicted_cate"]))
df_pd_train_eval = df_h2o_train_eval.as_data_frame()
# check that after fitting the model and predicting, df_h2o still has its original shape
assert df_h2o_cont.shape == orig_shape
# check that the evaluation dataset has the expected shape
assert df_h2o_train_eval.shape[0] == orig_shape[0]
assert df_h2o_train_eval.shape[1] == orig_shape[1] + 1
# check that the column types in df_h2o still match the original ones
assert df_h2o_cont.types == orig_data_types
# check that the predictions are better than random
assert r2_score(df_pd_train_eval["actual_cate"], df_pd_train_eval["predicted_cate"]) > 0
def test_train_model_binary_outcome(spark: SparkSession, df_h2o_binary: h2o.H2OFrame) -> None:
orig_data_types = df_h2o_binary.types
orig_shape = df_h2o_binary.shape
predictor_colnames = [col for col in df_h2o_binary.columns if col.startswith("feature")]
model = RLearnerEstimator(
target_model_class=H2OGeneralizedLinearEstimator,
target_model_params={},
final_model_class=H2OGeneralizedLinearEstimator,
final_model_params={},
predictor_colnames=predictor_colnames,
propensity_model_class=None,
propensity_model_params=None,
target_colname="outcome",
treatment_colname="treatment",
treatment_value=1,
control_value=0,
categorical_outcome=True,
fold_colname=None,
n_folds=None,
)
model.fit(df_h2o_binary)
df_h2o_train_eval = df_h2o_binary.cbind(model.predict(df_h2o_binary).set_names(["predicted_cate"]))
df_pd_train_eval = df_h2o_train_eval.as_data_frame()
# check that after fitting the model and predicting, df_h2o still has its original shape
assert df_h2o_binary.shape == orig_shape
# check that the evaluation dataset has the expected shape
assert df_h2o_train_eval.shape[0] == orig_shape[0]
assert df_h2o_train_eval.shape[1] == orig_shape[1] + 1
# check that the column types in df_h2o still match the original ones
assert df_h2o_binary.types == orig_data_types
# check that the predictions are better than random
assert r2_score(df_pd_train_eval["actual_cate"], df_pd_train_eval["predicted_cate"]) > 0
# check that the predicted treatment effect values are between [-1, 1]
assert df_pd_train_eval["predicted_cate"].min() >= -1
assert df_pd_train_eval["predicted_cate"].max() <= 1
def test_train_model_same_train_val_set(spark: SparkSession, df_h2o_binary: h2o.H2OFrame) -> None:
orig_data_types = df_h2o_binary.types
orig_shape = df_h2o_binary.shape
predictor_colnames = [col for col in df_h2o_binary.columns if col.startswith("feature")]
model = RLearnerEstimator(
target_model_class=H2OGeneralizedLinearEstimator,
target_model_params={},
final_model_class=H2OGeneralizedLinearEstimator,
final_model_params={},
predictor_colnames=predictor_colnames,
propensity_model_class=None,
propensity_model_params=None,
target_colname="outcome",
treatment_colname="treatment",
treatment_value=1,
control_value=0,
categorical_outcome=True,
fold_colname=None,
n_folds=None,
)
model.fit(df_h2o_binary, df_h2o_binary)
df_h2o_train_eval = df_h2o_binary.cbind(model.predict(df_h2o_binary).set_names(["predicted_cate"]))
df_pd_train_eval = df_h2o_train_eval.as_data_frame()
# check that after fitting the model and predicting, df_h2o still has its original shape
assert df_h2o_binary.shape == orig_shape
# check that the evaluation dataset has the expected shape
assert df_h2o_train_eval.shape[0] == orig_shape[0]
assert df_h2o_train_eval.shape[1] == orig_shape[1] + 1
# check that the column types in df_h2o still match the original ones
assert df_h2o_binary.types == orig_data_types
# check that the predictions are better than random
assert r2_score(df_pd_train_eval["actual_cate"], df_pd_train_eval["predicted_cate"]) > 0
# check that the predicted treatment effect values are between [-1, 1]
assert df_pd_train_eval["predicted_cate"].min() >= -1
assert df_pd_train_eval["predicted_cate"].max() <= 1
def test_train_model_val_set(spark: SparkSession, df_h2o_binary: h2o.H2OFrame, df_h2o_binary_val: h2o.H2OFrame) -> None:
orig_data_types = df_h2o_binary.types
orig_shape = df_h2o_binary.shape
orig_data_types_val = df_h2o_binary_val.types
orig_shape_val = df_h2o_binary_val.shape
predictor_colnames = [col for col in df_h2o_binary.columns if col.startswith("feature")]
model = RLearnerEstimator(
target_model_class=H2OGeneralizedLinearEstimator,
target_model_params={},
final_model_class=H2OGeneralizedLinearEstimator,
final_model_params={},
predictor_colnames=predictor_colnames,
propensity_model_class=None,
propensity_model_params=None,
target_colname="outcome",
treatment_colname="treatment",
treatment_value=1,
control_value=0,
categorical_outcome=True,
fold_colname=None,
n_folds=None,
)
model.fit(df_h2o_binary, df_h2o_binary_val)
df_h2o_train_eval = df_h2o_binary.cbind(model.predict(df_h2o_binary).set_names(["predicted_cate"]))
df_pd_train_eval = df_h2o_train_eval.as_data_frame()
# check that after fitting the model and predicting, df_h2o still has its original shape
assert df_h2o_binary.shape == orig_shape
assert df_h2o_binary_val.shape == orig_shape_val
# check that the evaluation dataset has the expected shape
assert df_h2o_train_eval.shape[0] == orig_shape[0]
assert df_h2o_train_eval.shape[1] == orig_shape[1] + 1
# check that the column types in df_h2o still match the original ones
assert df_h2o_binary.types == orig_data_types
assert df_h2o_binary_val.types == orig_data_types_val
# check that the predictions are better than random
assert r2_score(df_pd_train_eval["actual_cate"], df_pd_train_eval["predicted_cate"]) > 0
# check that the predicted treatment effect values are between [-1, 1]
assert df_pd_train_eval["predicted_cate"].min() >= -1
assert df_pd_train_eval["predicted_cate"].max() <= 1
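The `RLearnerEstimator` under test implements the R-learner (residual-on-residual regression, commonly attributed to Nie and Wager). For orientation on what these tests expect, here is a numpy-only sketch of the core idea on synthetic data with a known constant treatment effect; it is illustrative and independent of the H2O implementation above.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=(n, 3))
t = rng.integers(0, 2, size=n)  # randomized treatment, so e(x) = 0.5 is known
tau = 2.0                       # true constant treatment effect
y = x @ np.array([1.0, -0.5, 0.25]) + tau * t + rng.normal(size=n)

# Stage 1: nuisance models. The marginal outcome model m(x) = E[y | x]
# is fit by ordinary least squares; the propensity is known by design.
X1 = np.column_stack([np.ones(n), x])
m_hat = X1 @ np.linalg.lstsq(X1, y, rcond=None)[0]
e_hat = 0.5

# Stage 2: regress the outcome residual on the treatment residual.
# For a constant effect this reduces to a weighted mean of the pseudo-outcome.
resid_t = t - e_hat
pseudo_outcome = (y - m_hat) / resid_t
weights = resid_t ** 2
tau_hat = np.sum(weights * pseudo_outcome) / np.sum(weights)

assert abs(tau_hat - tau) < 0.2  # recovers the true effect on this sample
```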
# === 基础教程/A1-Python与基础知识/算法第一步/ExampleCodes/chapter06/6-5.py (repo: microsoft/ai-edu, license: Apache-2.0) ===
print(6-2)
print(7.5/3)
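The final expression in this file exercises Python's operator precedence: parentheses first, then `*` and `/` left to right, then `+` and `-`. Broken into steps:

```python
step1 = 3 - 1.12                 # parentheses first: ≈ 1.88
step2 = step1 * 10 / 4           # * and / associate left to right: ≈ 4.7
step3 = 2 - 3.325                # ≈ -1.325
step4 = step2 * step3            # ≈ -6.2275
result = step4 + 3 * 2 - 5 / 10  # ≈ -6.2275 + 6 - 0.5 = -0.7275

# The step-by-step value matches the one-line expression exactly,
# since the operations are performed in the same order.
assert result == (3 - 1.12) * 10 / 4 * (2 - 3.325) + 3 * 2 - 5 / 10
```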
print((3-1.12)*10/4*(2-3.325)+3*2-5/10)

# === TWLight/users/migrations/0001_initial.py (repo: jajodiaraghav/TWLight, license: MIT) ===
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import models, migrations
from django.conf import settings
class Migration(migrations.Migration):
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='Editor',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('last_updated', models.DateField(help_text='When this information was last edited', auto_now=True)),
('account_created', models.DateField(help_text='When this information was first created', auto_now_add=True)),
('wp_username', models.CharField(help_text='Username', max_length=235)),
('wp_editcount', models.IntegerField(help_text='Wikipedia edit count')),
('wp_registered', models.DateField(help_text='Date registered at Wikipedia')),
('wp_sub', models.IntegerField(help_text='Wikipedia user ID')),
('_wp_internal', models.TextField()),
('home_wiki', models.CharField(help_text='Home wiki, as indicated by user', max_length=4, choices=[(b'aa', b'aa.wikipedia.org/wiki/'), (b'ab', b'ab.wikipedia.org/wiki/'), (b'ace', b'ace.wikipedia.org/wiki/'), (b'af', b'af.wikipedia.org/wiki/'), (b'ak', b'ak.wikipedia.org/wiki/'), (b'als', b'als.wikipedia.org/wiki/'), (b'am', b'am.wikipedia.org/wiki/'), (b'an', b'an.wikipedia.org/wiki/'), (b'ang', b'ang.wikipedia.org/wiki/'), (b'ar', b'ar.wikipedia.org/wiki/'), (b'arc', b'arc.wikipedia.org/wiki/'), (b'arz', b'arz.wikipedia.org/wiki/'), (b'as', b'as.wikipedia.org/wiki/'), (b'ast', b'ast.wikipedia.org/wiki/'), (b'av', b'av.wikipedia.org/wiki/'), (b'ay', b'ay.wikipedia.org/wiki/'), (b'az', b'az.wikipedia.org/wiki/'), (b'azb', b'azb.wikipedia.org/wiki/'), (b'ba', b'ba.wikipedia.org/wiki/'), (b'bar', b'bar.wikipedia.org/wiki/'), (b'bat-smg', b'bat-smg.wikipedia.org/wiki/'), (b'bcl', b'bcl.wikipedia.org/wiki/'), (b'be', b'be.wikipedia.org/wiki/'), (b'be-x-old', b'be-x-old.wikipedia.org/wiki/'), (b'bg', b'bg.wikipedia.org/wiki/'), (b'bh', b'bh.wikipedia.org/wiki/'), (b'bi', b'bi.wikipedia.org/wiki/'), (b'bjn', b'bjn.wikipedia.org/wiki/'), (b'bm', b'bm.wikipedia.org/wiki/'), (b'bn', b'bn.wikipedia.org/wiki/'), (b'bo', b'bo.wikipedia.org/wiki/'), (b'bpy', b'bpy.wikipedia.org/wiki/'), (b'br', b'br.wikipedia.org/wiki/'), (b'bs', b'bs.wikipedia.org/wiki/'), (b'bug', b'bug.wikipedia.org/wiki/'), (b'bxr', b'bxr.wikipedia.org/wiki/'), (b'ca', b'ca.wikipedia.org/wiki/'), (b'cbk-zam', b'cbk-zam.wikipedia.org/wiki/'), (b'cdo', b'cdo.wikipedia.org/wiki/'), (b'ce', b'ce.wikipedia.org/wiki/'), (b'ceb', b'ceb.wikipedia.org/wiki/'), (b'ch', b'ch.wikipedia.org/wiki/'), (b'cho', b'cho.wikipedia.org/wiki/'), (b'chr', b'chr.wikipedia.org/wiki/'), (b'chy', b'chy.wikipedia.org/wiki/'), (b'ckb', b'ckb.wikipedia.org/wiki/'), (b'co', b'co.wikipedia.org/wiki/'), (b'cr', b'cr.wikipedia.org/wiki/'), (b'crh', b'crh.wikipedia.org/wiki/'), (b'cs', b'cs.wikipedia.org/wiki/'), (b'csb', 
b'csb.wikipedia.org/wiki/'), (b'cu', b'cu.wikipedia.org/wiki/'), (b'cv', b'cv.wikipedia.org/wiki/'), (b'cy', b'cy.wikipedia.org/wiki/'), (b'da', b'da.wikipedia.org/wiki/'), (b'de', b'de.wikipedia.org/wiki/'), (b'diq', b'diq.wikipedia.org/wiki/'), (b'dsb', b'dsb.wikipedia.org/wiki/'), (b'dv', b'dv.wikipedia.org/wiki/'), (b'dz', b'dz.wikipedia.org/wiki/'), (b'ee', b'ee.wikipedia.org/wiki/'), (b'el', b'el.wikipedia.org/wiki/'), (b'eml', b'eml.wikipedia.org/wiki/'), (b'en', b'en.wikipedia.org/wiki/'), (b'eo', b'eo.wikipedia.org/wiki/'), (b'es', b'es.wikipedia.org/wiki/'), (b'et', b'et.wikipedia.org/wiki/'), (b'eu', b'eu.wikipedia.org/wiki/'), (b'ext', b'ext.wikipedia.org/wiki/'), (b'fa', b'fa.wikipedia.org/wiki/'), (b'ff', b'ff.wikipedia.org/wiki/'), (b'fi', b'fi.wikipedia.org/wiki/'), (b'fiu-vro', b'fiu-vro.wikipedia.org/wiki/'), (b'fj', b'fj.wikipedia.org/wiki/'), (b'fo', b'fo.wikipedia.org/wiki/'), (b'fr', b'fr.wikipedia.org/wiki/'), (b'frp', b'frp.wikipedia.org/wiki/'), (b'frr', b'frr.wikipedia.org/wiki/'), (b'fur', b'fur.wikipedia.org/wiki/'), (b'fy', b'fy.wikipedia.org/wiki/'), (b'ga', b'ga.wikipedia.org/wiki/'), (b'gag', b'gag.wikipedia.org/wiki/'), (b'gan', b'gan.wikipedia.org/wiki/'), (b'gd', b'gd.wikipedia.org/wiki/'), (b'gl', b'gl.wikipedia.org/wiki/'), (b'glk', b'glk.wikipedia.org/wiki/'), (b'gn', b'gn.wikipedia.org/wiki/'), (b'gom', b'gom.wikipedia.org/wiki/'), (b'got', b'got.wikipedia.org/wiki/'), (b'gu', b'gu.wikipedia.org/wiki/'), (b'gv', b'gv.wikipedia.org/wiki/'), (b'ha', b'ha.wikipedia.org/wiki/'), (b'hak', b'hak.wikipedia.org/wiki/'), (b'haw', b'haw.wikipedia.org/wiki/'), (b'he', b'he.wikipedia.org/wiki/'), (b'hi', b'hi.wikipedia.org/wiki/'), (b'hif', b'hif.wikipedia.org/wiki/'), (b'ho', b'ho.wikipedia.org/wiki/'), (b'hr', b'hr.wikipedia.org/wiki/'), (b'hsb', b'hsb.wikipedia.org/wiki/'), (b'ht', b'ht.wikipedia.org/wiki/'), (b'hu', b'hu.wikipedia.org/wiki/'), (b'hy', b'hy.wikipedia.org/wiki/'), (b'hz', b'hz.wikipedia.org/wiki/'), (b'ia', 
b'ia.wikipedia.org/wiki/'), (b'id', b'id.wikipedia.org/wiki/'), (b'ie', b'ie.wikipedia.org/wiki/'), (b'ig', b'ig.wikipedia.org/wiki/'), (b'ii', b'ii.wikipedia.org/wiki/'), (b'ik', b'ik.wikipedia.org/wiki/'), (b'ilo', b'ilo.wikipedia.org/wiki/'), (b'io', b'io.wikipedia.org/wiki/'), (b'is', b'is.wikipedia.org/wiki/'), (b'it', b'it.wikipedia.org/wiki/'), (b'iu', b'iu.wikipedia.org/wiki/'), (b'ja', b'ja.wikipedia.org/wiki/'), (b'jbo', b'jbo.wikipedia.org/wiki/'), (b'jv', b'jv.wikipedia.org/wiki/'), (b'ka', b'ka.wikipedia.org/wiki/'), (b'kaa', b'kaa.wikipedia.org/wiki/'), (b'kab', b'kab.wikipedia.org/wiki/'), (b'kbd', b'kbd.wikipedia.org/wiki/'), (b'kg', b'kg.wikipedia.org/wiki/'), (b'ki', b'ki.wikipedia.org/wiki/'), (b'kj', b'kj.wikipedia.org/wiki/'), (b'kk', b'kk.wikipedia.org/wiki/'), (b'kl', b'kl.wikipedia.org/wiki/'), (b'km', b'km.wikipedia.org/wiki/'), (b'kn', b'kn.wikipedia.org/wiki/'), (b'ko', b'ko.wikipedia.org/wiki/'), (b'koi', b'koi.wikipedia.org/wiki/'), (b'kr', b'kr.wikipedia.org/wiki/'), (b'krc', b'krc.wikipedia.org/wiki/'), (b'ks', b'ks.wikipedia.org/wiki/'), (b'ksh', b'ksh.wikipedia.org/wiki/'), (b'ku', b'ku.wikipedia.org/wiki/'), (b'kv', b'kv.wikipedia.org/wiki/'), (b'kw', b'kw.wikipedia.org/wiki/'), (b'ky', b'ky.wikipedia.org/wiki/'), (b'la', b'la.wikipedia.org/wiki/'), (b'lad', b'lad.wikipedia.org/wiki/'), (b'lb', b'lb.wikipedia.org/wiki/'), (b'lbe', b'lbe.wikipedia.org/wiki/'), (b'lez', b'lez.wikipedia.org/wiki/'), (b'lg', b'lg.wikipedia.org/wiki/'), (b'li', b'li.wikipedia.org/wiki/'), (b'lij', b'lij.wikipedia.org/wiki/'), (b'lmo', b'lmo.wikipedia.org/wiki/'), (b'ln', b'ln.wikipedia.org/wiki/'), (b'lo', b'lo.wikipedia.org/wiki/'), (b'lrc', b'lrc.wikipedia.org/wiki/'), (b'lt', b'lt.wikipedia.org/wiki/'), (b'ltg', b'ltg.wikipedia.org/wiki/'), (b'lv', b'lv.wikipedia.org/wiki/'), (b'mai', b'mai.wikipedia.org/wiki/'), (b'map-bms', b'map-bms.wikipedia.org/wiki/'), (b'mdf', b'mdf.wikipedia.org/wiki/'), (b'mg', b'mg.wikipedia.org/wiki/'), (b'mh', 
b'mh.wikipedia.org/wiki/'), (b'mhr', b'mhr.wikipedia.org/wiki/'), (b'mi', b'mi.wikipedia.org/wiki/'), (b'min', b'min.wikipedia.org/wiki/'), (b'mk', b'mk.wikipedia.org/wiki/'), (b'ml', b'ml.wikipedia.org/wiki/'), (b'mn', b'mn.wikipedia.org/wiki/'), (b'mo', b'mo.wikipedia.org/wiki/'), (b'mr', b'mr.wikipedia.org/wiki/'), (b'mrj', b'mrj.wikipedia.org/wiki/'), (b'ms', b'ms.wikipedia.org/wiki/'), (b'mt', b'mt.wikipedia.org/wiki/'), (b'mus', b'mus.wikipedia.org/wiki/'), (b'mwl', b'mwl.wikipedia.org/wiki/'), (b'my', b'my.wikipedia.org/wiki/'), (b'myv', b'myv.wikipedia.org/wiki/'), (b'mzn', b'mzn.wikipedia.org/wiki/'), (b'na', b'na.wikipedia.org/wiki/'), (b'nah', b'nah.wikipedia.org/wiki/'), (b'nap', b'nap.wikipedia.org/wiki/'), (b'nds', b'nds.wikipedia.org/wiki/'), (b'nds-nl', b'nds-nl.wikipedia.org/wiki/'), (b'ne', b'ne.wikipedia.org/wiki/'), (b'new', b'new.wikipedia.org/wiki/'), (b'ng', b'ng.wikipedia.org/wiki/'), (b'nl', b'nl.wikipedia.org/wiki/'), (b'nn', b'nn.wikipedia.org/wiki/'), (b'no', b'no.wikipedia.org/wiki/'), (b'nov', b'nov.wikipedia.org/wiki/'), (b'nrm', b'nrm.wikipedia.org/wiki/'), (b'nso', b'nso.wikipedia.org/wiki/'), (b'nv', b'nv.wikipedia.org/wiki/'), (b'ny', b'ny.wikipedia.org/wiki/'), (b'oc', b'oc.wikipedia.org/wiki/'), (b'om', b'om.wikipedia.org/wiki/'), (b'or', b'or.wikipedia.org/wiki/'), (b'os', b'os.wikipedia.org/wiki/'), (b'pa', b'pa.wikipedia.org/wiki/'), (b'pag', b'pag.wikipedia.org/wiki/'), (b'pam', b'pam.wikipedia.org/wiki/'), (b'pap', b'pap.wikipedia.org/wiki/'), (b'pcd', b'pcd.wikipedia.org/wiki/'), (b'pdc', b'pdc.wikipedia.org/wiki/'), (b'pfl', b'pfl.wikipedia.org/wiki/'), (b'pi', b'pi.wikipedia.org/wiki/'), (b'pih', b'pih.wikipedia.org/wiki/'), (b'pl', b'pl.wikipedia.org/wiki/'), (b'pms', b'pms.wikipedia.org/wiki/'), (b'pnb', b'pnb.wikipedia.org/wiki/'), (b'pnt', b'pnt.wikipedia.org/wiki/'), (b'ps', b'ps.wikipedia.org/wiki/'), (b'pt', b'pt.wikipedia.org/wiki/'), (b'qu', b'qu.wikipedia.org/wiki/'), (b'rm', b'rm.wikipedia.org/wiki/'), 
(b'rmy', b'rmy.wikipedia.org/wiki/'), (b'rn', b'rn.wikipedia.org/wiki/'), (b'ro', b'ro.wikipedia.org/wiki/'), (b'roa-rup', b'roa-rup.wikipedia.org/wiki/'), (b'roa-tara', b'roa-tara.wikipedia.org/wiki/'), (b'ru', b'ru.wikipedia.org/wiki/'), (b'rue', b'rue.wikipedia.org/wiki/'), (b'rw', b'rw.wikipedia.org/wiki/'), (b'sa', b'sa.wikipedia.org/wiki/'), (b'sah', b'sah.wikipedia.org/wiki/'), (b'sc', b'sc.wikipedia.org/wiki/'), (b'scn', b'scn.wikipedia.org/wiki/'), (b'sco', b'sco.wikipedia.org/wiki/'), (b'sd', b'sd.wikipedia.org/wiki/'), (b'se', b'se.wikipedia.org/wiki/'), (b'sg', b'sg.wikipedia.org/wiki/'), (b'sh', b'sh.wikipedia.org/wiki/'), (b'si', b'si.wikipedia.org/wiki/'), (b'simple', b'simple.wikipedia.org/wiki/'), (b'sk', b'sk.wikipedia.org/wiki/'), (b'sl', b'sl.wikipedia.org/wiki/'), (b'sm', b'sm.wikipedia.org/wiki/'), (b'sn', b'sn.wikipedia.org/wiki/'), (b'so', b'so.wikipedia.org/wiki/'), (b'sq', b'sq.wikipedia.org/wiki/'), (b'sr', b'sr.wikipedia.org/wiki/'), (b'srn', b'srn.wikipedia.org/wiki/'), (b'ss', b'ss.wikipedia.org/wiki/'), (b'st', b'st.wikipedia.org/wiki/'), (b'stq', b'stq.wikipedia.org/wiki/'), (b'su', b'su.wikipedia.org/wiki/'), (b'sv', b'sv.wikipedia.org/wiki/'), (b'sw', b'sw.wikipedia.org/wiki/'), (b'szl', b'szl.wikipedia.org/wiki/'), (b'ta', b'ta.wikipedia.org/wiki/'), (b'te', b'te.wikipedia.org/wiki/'), (b'tet', b'tet.wikipedia.org/wiki/'), (b'tg', b'tg.wikipedia.org/wiki/'), (b'th', b'th.wikipedia.org/wiki/'), (b'ti', b'ti.wikipedia.org/wiki/'), (b'tk', b'tk.wikipedia.org/wiki/'), (b'tl', b'tl.wikipedia.org/wiki/'), (b'tn', b'tn.wikipedia.org/wiki/'), (b'to', b'to.wikipedia.org/wiki/'), (b'tpi', b'tpi.wikipedia.org/wiki/'), (b'tr', b'tr.wikipedia.org/wiki/'), (b'ts', b'ts.wikipedia.org/wiki/'), (b'tt', b'tt.wikipedia.org/wiki/'), (b'tum', b'tum.wikipedia.org/wiki/'), (b'tw', b'tw.wikipedia.org/wiki/'), (b'ty', b'ty.wikipedia.org/wiki/'), (b'tyv', b'tyv.wikipedia.org/wiki/'), (b'udm', b'udm.wikipedia.org/wiki/'), (b'ug', b'ug.wikipedia.org/wiki/'), 
(b'uk', b'uk.wikipedia.org/wiki/'), (b'ur', b'ur.wikipedia.org/wiki/'), (b'uz', b'uz.wikipedia.org/wiki/'), (b've', b've.wikipedia.org/wiki/'), (b'vec', b'vec.wikipedia.org/wiki/'), (b'vep', b'vep.wikipedia.org/wiki/'), (b'vi', b'vi.wikipedia.org/wiki/'), (b'vls', b'vls.wikipedia.org/wiki/'), (b'vo', b'vo.wikipedia.org/wiki/'), (b'wa', b'wa.wikipedia.org/wiki/'), (b'war', b'war.wikipedia.org/wiki/'), (b'wo', b'wo.wikipedia.org/wiki/'), (b'wuu', b'wuu.wikipedia.org/wiki/'), (b'xal', b'xal.wikipedia.org/wiki/'), (b'xh', b'xh.wikipedia.org/wiki/'), (b'xmf', b'xmf.wikipedia.org/wiki/'), (b'yi', b'yi.wikipedia.org/wiki/'), (b'yo', b'yo.wikipedia.org/wiki/'), (b'za', b'za.wikipedia.org/wiki/'), (b'zea', b'zea.wikipedia.org/wiki/'), (b'zh', b'zh.wikipedia.org/wiki/'), (b'zh-classical', b'zh-classical.wikipedia.org/wiki/'), (b'zh-min-nan', b'zh-min-nan.wikipedia.org/wiki/'), (b'zh-yue', b'zh-yue.wikipedia.org/wiki/'), (b'zu', b'zu.wikipedia.org/wiki/')])),
('contributions', models.TextField(help_text='Wiki contributions, as entered by user')),
('email', models.EmailField(help_text='Email, as entered by user', max_length=75)),
('user', models.OneToOneField(to=settings.AUTH_USER_MODEL)),
],
options={
},
bases=(models.Model,),
),
]
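Every entry in the `home_wiki` choices above follows the same shape: a language code paired with `<code>.wikipedia.org/wiki/`. A list like that is simpler to maintain when generated from the code list alone; a sketch (the code list here is truncated and illustrative):

```python
# Truncated for illustration; the migration enumerates the full set.
WIKI_LANGUAGE_CODES = ["aa", "ab", "en", "fr", "zh-min-nan", "zu"]

# Each choice is (stored value, human-readable wiki host path).
HOME_WIKI_CHOICES = [
    (code, "{}.wikipedia.org/wiki/".format(code)) for code in WIKI_LANGUAGE_CODES
]

assert ("en", "en.wikipedia.org/wiki/") in HOME_WIKI_CHOICES
```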
# === cohesivenet/api/vns3/monitoring_alerting_api.py (repo: cohesive/python-cohesivenet-sdk, license: MIT) ===
# coding: utf-8
"""
VNS3 Controller API
Cohesive networks VNS3 API providing complete control of your network's addresses, routes, rules and edge # noqa: E501
The version of the OpenAPI document: 4.8
Contact: solutions@cohesive.net
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
from cohesivenet.api_builder import VersionRouter
def get_webhooks(api_client, **kwargs): # noqa: E501
"""get_webhooks # noqa: E501
Retrieve all Alert integrations (webhooks) # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.get_webhooks(client, async_req=True)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/webhooks",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
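`local_var_params = locals()` works here only because it is the first statement in the function: at that point the local namespace contains exactly the call parameters (including the `**kwargs` dict), so the snapshot doubles as a record of what the caller passed. A standalone illustration (names are illustrative):

```python
def snapshot_params(api_client, webhook_id, **kwargs):
    # As the first statement, locals() sees only the call parameters.
    params = dict(locals())
    # Locals assigned afterwards do not appear in the snapshot already taken.
    not_captured = "later local"
    return params

captured = snapshot_params("client", 7, async_req=True)
assert captured == {"api_client": "client", "webhook_id": 7, "kwargs": {"async_req": True}}
```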
def get_webhook(api_client, webhook_id, **kwargs): # noqa: E501
"""get_webhook # noqa: E501
Retrieve details for single webhook integration # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.get_webhook(client, webhook_id, async_req=True)
:param async_req bool: execute request asynchronously
:param int webhook_id: ID for webhook integration (required)
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"webhook_id": webhook_id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/webhook/{webhook_id}",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def post_create_webhook(
api_client,
name=None,
url=None,
events=None,
body=None,
validate_cert=None,
custom_properties=None,
headers=None,
parameters=None,
**kwargs
): # noqa: E501
"""post_create_webhook # noqa: E501
Define new Webhook integration # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.post_create_webhook(client, name="New slack webhook", async_req=True)
:param name str: (required)
:param url str:
:param events List[str]:
:param body str: Webhook payload
:param validate_cert bool: verify SSL
:param custom_properties List[{
name str: (required)
value str:
description str:
}]:
:param headers List[{
name str:
value str:
}]:
:param parameters List[{
name str:
value str:
}]:
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without the
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
request_params = [
"name",
"url",
"events",
"body",
"validate_cert",
"custom_properties",
"headers",
"parameters",
]
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = {}
for param in [p for p in request_params if local_var_params.get(p) is not None]:
body_params[param] = local_var_params[param]
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# HTTP header `Content-Type`
header_params["Content-Type"] = api_client.select_header_content_type( # noqa: E501
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/webhook",
"POST",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def put_update_webhook(
api_client,
webhook_id,
name=None,
url=None,
events=None,
body=None,
validate_cert=None,
custom_properties=None,
headers=None,
parameters=None,
**kwargs
): # noqa: E501
"""put_update_webhook # noqa: E501
Edit defined webhook integration # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.put_update_webhook(client, webhook_id, name="Updated webhook", async_req=True)
:param int webhook_id: ID for webhook integration (required)
:param name str:
:param url str:
:param events List[str]:
:param body str: Webhook payload
:param validate_cert bool: verify SSL
:param custom_properties List[{
name str: (required)
value str:
description str:
}]:
:param headers List[{
name str:
value str:
}]:
:param parameters List[{
name str:
value str:
}]:
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without the
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
request_params = [
"name",
"url",
"events",
"body",
"validate_cert",
"custom_properties",
"headers",
"parameters",
]
collection_formats = {}
path_params = {"webhook_id": webhook_id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = {}
for param in [p for p in request_params if local_var_params.get(p) is not None]:
body_params[param] = local_var_params[param]
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# HTTP header `Content-Type`
header_params["Content-Type"] = api_client.select_header_content_type( # noqa: E501
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/webhook/{webhook_id}",
"PUT",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def delete_webhook(api_client, webhook_id, **kwargs): # noqa: E501
"""delete_webhook # noqa: E501
Delete defined webhook integration # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.delete_webhook(client, webhook_id, async_req=True)
:param int webhook_id: ID for webhook integration (required)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without the
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"webhook_id": webhook_id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/webhook/{webhook_id}",
"DELETE",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def delete_alert(api_client, alert_id, **kwargs): # noqa: E501
"""delete_alert # noqa: E501
Delete defined alert # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.delete_alert(client, alert_id, async_req=True)
:param int alert_id: ID for Alert definition (required)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without the
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"alert_id": alert_id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/alert/{alert_id}",
"DELETE",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def get_alert(api_client, alert_id, **kwargs): # noqa: E501
"""get_alert # noqa: E501
Retrieve details for single alert # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.get_alert(client, alert_id, async_req=True)
:param int alert_id: ID for Alert definition (required)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without the
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"alert_id": alert_id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/alert/{alert_id}",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def get_alerts(api_client, **kwargs): # noqa: E501
"""get_alerts # noqa: E501
Retrieve all alerts # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.get_alerts(client, async_req=True)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without the
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/alerts",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def post_create_alert(
api_client,
name=None,
url=None,
enabled=None,
events=None,
custom_properties=None,
webhook_id=None,
webhook_name=None,
template_id=None,
**kwargs
): # noqa: E501
"""post_create_alert # noqa: E501
Define new alert # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.post_create_alert(client, name="New alert", url="https://example.com/alert", webhook_id=1, async_req=True)
:param name str: (required)
:param url str: (required)
:param enabled bool: True by default
:param events List[str]:
:param custom_properties List[{
name str:
value str:
}]:
:param webhook_id int: either webhook_id or template_id must be provided; only webhook_id is supported starting with 4.9
:param webhook_name str: name for the new webhook, if one is created; available starting with 4.9
:param template_id int: either webhook_id or template_id must be provided; only webhook_id is supported starting with 4.9
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without the
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
request_params = [
"name",
"url",
"enabled",
"events",
"custom_properties",
"webhook_id",
"template_id",
"webhook_name",
]
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = {}
for param in [p for p in request_params if local_var_params.get(p) is not None]:
body_params[param] = local_var_params[param]
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# HTTP header `Content-Type`
header_params["Content-Type"] = api_client.select_header_content_type( # noqa: E501
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/alert",
"POST",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def post_test_alert(api_client, alert_id, **kwargs): # noqa: E501
"""post_test_alert # noqa: E501
Send test alert for this defined alert # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.post_test_alert(client, alert_id, async_req=True)
:param int alert_id: ID for Alert definition (required)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without the
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"alert_id": alert_id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/alert/{alert_id}/test",
"POST",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def post_toggle_enable_alert(api_client, alert_id, **kwargs): # noqa: E501
"""post_toggle_enable_alert # noqa: E501
Toggle enabled property on alert # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.post_toggle_enable_alert(client, alert_id, async_req=True)
:param async_req bool: execute request asynchronously
:param int alert_id: ID for Alert definition (required)
:param _return_http_data_only: return response data only, without the
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"alert_id": alert_id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/alert/{alert_id}/toggle_enabled",
"POST",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def put_update_alert(
api_client,
alert_id,
name=None,
url=None,
enabled=None,
events=None,
custom_properties=None,
webhook_id=None,
template_id=None,
**kwargs
): # noqa: E501
"""put_update_alert # noqa: E501
Edit defined alert # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.put_update_alert(client, alert_id, name="Updated alert", async_req=True)
:param async_req bool: execute request asynchronously
:param int alert_id: ID for Alert definition (required)
:param name str:
:param url str:
:param enabled bool: True by default
:param events List[str]:
:param custom_properties List[{
name str:
value str:
}]:
:param webhook_id int: either webhook_id or template_id must be provided; only webhook_id is supported starting with 4.9.
:param template_id int: either webhook_id or template_id must be provided; only webhook_id is supported starting with 4.9.
:param _return_http_data_only: return response data only, without the
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
request_params = [
"name",
"url",
"enabled",
"events",
"custom_properties",
"webhook_id",
"template_id",
]
collection_formats = {}
path_params = {"alert_id": alert_id}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = {}
for param in [p for p in request_params if local_var_params.get(p) is not None]:
body_params[param] = local_var_params[param]
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# HTTP header `Content-Type`
header_params["Content-Type"] = api_client.select_header_content_type( # noqa: E501
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/alert/{alert_id}",
"PUT",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def get_alert_event_types(api_client, **kwargs): # noqa: E501
"""get_alert_event_types # noqa: E501
Retrieve all possible alert events # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.get_alert_event_types(client, async_req=True)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without the
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/alert_events",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def get_packet_monitors(api_client, **kwargs): # noqa: E501
"""get_packet_monitors # noqa: E501
Get all packet monitors defined # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.get_packet_monitors(client, async_req=True)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without the
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/packet_monitors",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def create_packet_monitor(
api_client,
name=None,
type=None,
interface=None,
filter=None,
duration=None,
destination=None,
**kwargs
): # noqa: E501
"""create_packet_monitor # noqa: E501
Define new packet monitor. Three types of packet monitors are supported: conntrack for connection tracking,
netflow for Netflow formatted data, and pcap for local packet capture # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.create_packet_monitor(client, name="conntrack", type="conntrack", interface="eth0", filter="", duration="forever", destination="file", async_req=True)
:param name str: Name of packet monitor. Must be conntrack for type=conntrack as only one conntrack monitor can run at a time (required)
:param type str: conntrack, netflow or pcap (required)
:param interface str: Interface to monitor, e.g. eth0 (required)
:param filter str: Filter string, specific to the type of packet monitor; for instance, "-p 8000" for pcap. Can be an empty string. (required)
:param duration str: Length of time to run the capture. Can be "forever" or any string parsable by the Linux date command. (required)
:param destination str: Must be "file" for pcap or conntrack. Otherwise a host should be specified with the prefix "host", e.g. "host:10.0.3.2:4000". (required)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without the
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
request_params = ["name", "type", "interface", "filter", "duration", "destination"]
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = {}
for param in [p for p in request_params if local_var_params.get(p) is not None]:
body_params[param] = local_var_params[param]
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# HTTP header `Content-Type`
header_params["Content-Type"] = api_client.select_header_content_type( # noqa: E501
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/packet_monitor",
"POST",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def get_packet_monitor(api_client, monitor_name, **kwargs): # noqa: E501
"""get_packet_monitor # noqa: E501
Retrieve details for single Packet Monitor # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.get_packet_monitor(client, monitor_name, async_req=True)
:param monitor_name str: unique name of monitor (required)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without the
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"name": monitor_name}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/packet_monitor/{name}",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def delete_packet_monitor(api_client, monitor_name, **kwargs): # noqa: E501
"""delete_packet_monitor # noqa: E501
Delete Packet Monitor # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.delete_packet_monitor(client, monitor_name, async_req=True)
:param monitor_name str: unique name of monitor (required)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: return response data only, without the
HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"name": monitor_name}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/packet_monitor/{name}",
"DELETE",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def put_start_packet_monitor(api_client, monitor_name, **kwargs): # noqa: E501
"""put_start_packet_monitor # noqa: E501
Start packet monitor # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.put_start_packet_monitor(monitor_name, async_req=True)
:param monitor_name str: unique name of monitor (required)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: response data only, without status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"name": monitor_name}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/packet_monitor/{name}/start",
"PUT",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def put_stop_packet_monitor(api_client, monitor_name, **kwargs): # noqa: E501
"""put_stop_packet_monitor # noqa: E501
Stop packet monitor # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.put_stop_packet_monitor(monitor_name, async_req=True)
:param monitor_name str: unique name of monitor (required)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: response data only, without status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"name": monitor_name}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/packet_monitor/{name}/stop",
"PUT",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
def download_packet_monitor_data(api_client, monitor_name, **kwargs): # noqa: E501
"""download_packet_monitor_data # noqa: E501
Download packet monitor data file # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> response = await api.download_packet_monitor_data(monitor_name, async_req=True)
:param monitor_name str: unique name of monitor (required)
:param async_req bool: execute request asynchronously
:param _return_http_data_only: response data only, without status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: APIResponse or awaitable if async
"""
local_var_params = locals()
collection_formats = {}
path_params = {"name": monitor_name}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params["Accept"] = api_client.select_header_accept(
["application/json"]
) # noqa: E501
# Authentication setting
auth_settings = ["ApiTokenAuth", "basicAuth"] # noqa: E501
return api_client.call_api(
"/packet_monitor/{name}/download",
"GET",
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type="object", # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get("async_req"),
_return_http_data_only=local_var_params.get(
"_return_http_data_only"
), # noqa: E501
_preload_content=local_var_params.get("_preload_content", True),
_request_timeout=local_var_params.get("_request_timeout"),
collection_formats=collection_formats,
)
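Every wrapper in this family repeats the same pattern: build `path_params` from the monitor name, set the `Accept` header, and delegate to `api_client.call_api` with a path template and an HTTP verb. A minimal sketch with a stub client (the stub is an illustration, not the real generated `ApiClient`) makes that data flow visible:

```python
class StubApiClient:
    """Stands in for the generated ApiClient; records what call_api receives."""

    def select_header_accept(self, accepts):
        # The real client negotiates content types; here we take the first.
        return accepts[0]

    def call_api(self, resource_path, method, path_params, *args, **kwargs):
        # A real client substitutes path params, sends the request, and
        # decodes the response; the stub only resolves the URL template.
        return {"url": resource_path.format(**path_params), "method": method}


def delete_packet_monitor_sketch(api_client, monitor_name):
    # Mirrors the wrapper above: path template + verb + path params.
    return api_client.call_api(
        "/packet_monitor/{name}", "DELETE", {"name": monitor_name}
    )


result = delete_packet_monitor_sketch(StubApiClient(), "mon1")
```

The same shape applies to the start, stop, and download wrappers; only the path suffix and verb change.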
class MonitoringAlertingApiRouter(VersionRouter):
function_library = {
"get_webhooks": {"4.8.4-5.1.5": get_webhooks},
"get_webhook": {"4.8.4-5.1.5": get_webhook},
"post_create_webhook": {"4.8.4-5.1.5": post_create_webhook},
"put_update_webhook": {"4.8.4-5.1.5": put_update_webhook},
"delete_webhook": {"4.8.4-5.1.5": delete_webhook},
"delete_alert": {"4.8.4-5.1.5": delete_alert},
"get_alert": {"4.8.4-5.1.5": get_alert},
"get_alerts": {"4.8.4-5.1.5": get_alerts},
"post_create_alert": {"4.8.4-5.1.5": post_create_alert},
"post_test_alert": {"4.8.4-5.1.5": post_test_alert},
"post_toggle_enable_alert": {"4.8.4-5.1.5": post_toggle_enable_alert},
"put_update_alert": {"4.8.4-5.1.5": put_update_alert},
"get_packet_monitors": {"4.8.4-5.1.5": get_packet_monitors},
"create_packet_monitor": {"4.8.4-5.1.5": create_packet_monitor},
"get_packet_monitor": {"4.8.4-5.1.5": get_packet_monitor},
"delete_packet_monitor": {"4.8.4-5.1.5": delete_packet_monitor},
"put_start_packet_monitor": {"4.8.4-5.1.5": put_start_packet_monitor},
"put_stop_packet_monitor": {"4.8.4-5.1.5": put_stop_packet_monitor},
"download_packet_monitor_data": {"4.8.4-5.1.5": download_packet_monitor_data},
"get_alert_event_types": {"4.9.1-5.1.5": get_alert_event_types},
}
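`function_library` keys each public method name to implementations indexed by a supported-version range such as `"4.8.4-5.1.5"`. How `VersionRouter` actually resolves those ranges is not shown here; the sketch below is one plausible reading, parsing each range as dotted-integer bounds and picking the implementation whose range contains the running version:

```python
def _parse(version):
    # "5.1.5" -> (5, 1, 5), so versions compare component-wise as tuples.
    return tuple(int(part) for part in version.split("."))


def resolve(function_library, name, version):
    """Return the implementation whose 'low-high' range covers version."""
    for version_range, func in function_library[name].items():
        low, high = version_range.split("-")
        if _parse(low) <= _parse(version) <= _parse(high):
            return func
    raise KeyError(f"{name} is not supported in version {version}")


library = {"delete_packet_monitor": {"4.8.4-5.1.5": lambda *a, **kw: "deleted"}}
impl = resolve(library, "delete_packet_monitor", "5.0.0")
```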
# File: openbook_notifications/tests/test_views.py (TamaraAbells/okuna-api, MIT)
import json
from django.urls import reverse
from faker import Faker
from rest_framework import status
from openbook_common.tests.models import OpenbookAPITestCase
from openbook_common.tests.helpers import make_user, make_authentication_headers_for_user, make_notification
from openbook_notifications.models import Notification
fake = Faker()
class NotificationsAPITests(OpenbookAPITestCase):
"""
NotificationsAPI
"""
def test_can_retrieve_notifications(self):
"""
should be able to retrieve all notifications and return 200
"""
user = make_user()
amount_of_notifications = 5
notifications_ids = []
for i in range(0, amount_of_notifications):
notification = make_notification(owner=user)
notifications_ids.append(notification.pk)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_notifications = json.loads(response.content)
self.assertEqual(len(response_notifications), len(notifications_ids))
for response_notification in response_notifications:
response_notification_id = response_notification.get('id')
self.assertIn(response_notification_id, notifications_ids)
def test_can_retrieve_notifications_by_type(self):
"""
should be able to retrieve notifications of the specified types and return 200
"""
user = make_user()
amount_of_notifications = len(Notification.NOTIFICATION_TYPES)
notifications_ids = []
valid_ids = []
valid_types = []
for i in range(0, amount_of_notifications):
notification_type = Notification.NOTIFICATION_TYPES[i][0]
notification = make_notification(owner=user, notification_type=notification_type)
notifications_ids.append(notification.pk)
if i < 3:
valid_types.append(notification_type)
valid_ids.append(notification.pk)
url = '{0}?types={1}'.format(self._get_url(), ','.join(valid_types))
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_notifications = json.loads(response.content)
self.assertEqual(len(response_notifications), len(valid_types))
for response_notification in response_notifications:
response_notification_id = response_notification.get('id')
response_notification_type = response_notification.get('notification_type')
self.assertIn(response_notification_id, valid_ids)
self.assertIn(response_notification_type, valid_types)
def test_cant_retrieve_notifications_with_bad_type(self):
"""
should return 400 if an invalid notification type is specified
"""
user = make_user()
url = '{0}?types={1}'.format(self._get_url(), 'AA')
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_can_delete_notifications(self):
"""
should be able to delete all notifications and return 200
"""
user = make_user()
amount_of_notifications = 5
notifications_ids = []
for i in range(0, amount_of_notifications):
notification = make_notification(owner=user)
notifications_ids.append(notification.pk)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.delete(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertFalse(Notification.objects.filter(owner=user).exists())
def _get_url(self):
return reverse('notifications')
class ReadNotificationsAPITests(OpenbookAPITestCase):
"""
ReadNotificationsAPI
"""
def test_should_be_able_to_read_notifications(self):
"""
should be able to read all notifications and return 200
"""
user = make_user()
amount_of_notifications = 5
notifications_ids = []
for i in range(0, amount_of_notifications):
notification = make_notification(owner=user)
notifications_ids.append(notification.pk)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.post(url, {}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(Notification.objects.filter(owner=user, read=True, id__in=notifications_ids).count(),
len(notifications_ids))
def test_should_be_able_to_read_notifications_with_max_id(self):
"""
should be able to read all notifications with a max_id and return 200
"""
user = make_user()
amount_of_notifications = 5
notifications_ids = []
for i in range(0, amount_of_notifications):
notification = make_notification(owner=user)
notifications_ids.append(notification.pk)
max_id = notifications_ids[-3]
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.post(url, {
'max_id': max_id
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(Notification.objects.filter(owner=user, read=True, id__lte=max_id).exists())
self.assertTrue(Notification.objects.filter(owner=user, read=False, id__gt=max_id).exists())
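The two assertions above pin down the `max_id` contract: notifications with `id <= max_id` are marked read, later ones stay unread. The same rule, sketched without Django (the dict-based store is just an illustration):

```python
def read_up_to(notifications, max_id):
    # Mark every notification at or below max_id as read; leave the rest.
    for notification in notifications:
        if notification["id"] <= max_id:
            notification["read"] = True
    return notifications


items = [{"id": i, "read": False} for i in range(1, 6)]
read_up_to(items, max_id=3)
```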
def test_should_be_able_to_read_notifications_by_type(self):
"""
should be able to read notifications of the specified types and return 200
"""
user = make_user()
amount_of_notifications = len(Notification.NOTIFICATION_TYPES)
notifications_ids = []
valid_ids = []
valid_types = []
for i in range(0, amount_of_notifications):
notification_type = Notification.NOTIFICATION_TYPES[i][0]
notification = make_notification(owner=user, notification_type=notification_type)
notifications_ids.append(notification.pk)
if i < 3:
valid_types.append(notification_type)
valid_ids.append(notification.pk)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.post(url, {
'types': ','.join(valid_types)
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
invalid_ids = set(notifications_ids) - set(valid_ids)
self.assertEqual(Notification.objects.filter(owner=user, read=True, id__in=valid_ids).count(),
len(valid_ids))
self.assertEqual(Notification.objects.filter(owner=user, read=False, id__in=invalid_ids).count(),
len(invalid_ids))
def test_should_not_be_able_to_read_notifications_with_bad_type(self):
"""
should return 400 if an invalid notification type is specified
"""
user = make_user()
amount_of_notifications = len(Notification.NOTIFICATION_TYPES)
notifications_ids = []
for i in range(0, amount_of_notifications):
notification = make_notification(owner=user)
notifications_ids.append(notification.pk)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.post(url, {
'types': 'AA'
}, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertEqual(Notification.objects.filter(owner=user, read=True, id__in=notifications_ids).count(), 0)
def _get_url(self):
return reverse('read-notifications')
class NotificationItemAPITests(OpenbookAPITestCase):
"""
NotificationItemAPI
"""
def test_can_delete_own_notification(self):
"""
should be able to delete an own notification and return 200
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
notification = make_notification(owner=user)
notification_id = notification.pk
url = self._get_url(notification_id)
response = self.client.delete(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertFalse(Notification.objects.filter(id=notification_id).exists())
def test_cannot_delete_foreign_notification(self):
"""
should not be able to delete a foreign notification and return 400
"""
user = make_user()
foreign_user = make_user()
headers = make_authentication_headers_for_user(user)
notification = make_notification(owner=foreign_user)
notification_id = notification.pk
url = self._get_url(notification_id)
response = self.client.delete(url, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertTrue(Notification.objects.filter(id=notification_id).exists())
def _get_url(self, notification_id):
return reverse('notification', kwargs={
'notification_id': notification_id
})
class ReadNotificationAPITests(OpenbookAPITestCase):
"""
ReadNotificationAPI
"""
def test_can_read_own_notification(self):
"""
should be able to read an own notification and return 200
"""
user = make_user()
headers = make_authentication_headers_for_user(user)
notification = make_notification(owner=user)
notification_id = notification.pk
url = self._get_url(notification_id)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(Notification.objects.filter(id=notification_id, read=True).exists())
def test_cannot_read_foreign_notification(self):
"""
should not be able to read a foreign notification and return 400
"""
user = make_user()
foreign_user = make_user()
headers = make_authentication_headers_for_user(user)
notification = make_notification(owner=foreign_user)
notification_id = notification.pk
url = self._get_url(notification_id)
response = self.client.post(url, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertTrue(Notification.objects.filter(id=notification_id, read=False).exists())
def _get_url(self, notification_id):
return reverse('read-notification', kwargs={
'notification_id': notification_id
})
class UnreadNotificationsCountAPITests(OpenbookAPITestCase):
"""
UnreadNotificationsCountAPI
"""
def test_should_be_able_to_get_unread_notifications_count(self):
"""
should be able to get all unread count notifications and return 200
"""
user = make_user()
amount_of_notifications = 5
notifications_ids = []
for i in range(0, amount_of_notifications):
notification = make_notification(owner=user)
notifications_ids.append(notification.pk)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {}, **headers)
parsed_response = json.loads(response.content)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(parsed_response['count'], 5)
def test_should_be_able_to_get_unread_notifications_count_with_max_id(self):
"""
should be able to get all unread notifications count with a max_id and return 200
"""
user = make_user()
amount_of_notifications = 5
notifications_ids = []
for i in range(0, amount_of_notifications):
notification = make_notification(owner=user)
notifications_ids.append(notification.pk)
max_id = notifications_ids[3]
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'max_id': max_id
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
parsed_response = json.loads(response.content)
self.assertEqual(parsed_response['count'], 4)
def test_should_be_able_to_get_unread_notifications_count_by_type(self):
"""
should be able to get unread notifications count of the specified types and return 200
"""
user = make_user()
amount_of_notifications = len(Notification.NOTIFICATION_TYPES)
notifications_ids = []
valid_ids = []
valid_types = []
for i in range(0, amount_of_notifications):
notification_type = Notification.NOTIFICATION_TYPES[i][0]
notification = make_notification(owner=user, notification_type=notification_type)
notifications_ids.append(notification.pk)
if i < 3:
valid_types.append(notification_type)
valid_ids.append(notification.pk)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'types': ','.join(valid_types)
}, **headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
parsed_response = json.loads(response.content)
self.assertEqual(parsed_response['count'], len(valid_ids))
def test_should_not_be_able_to_get_unread_notifications_count_with_bad_type(self):
"""
should return 400 if an invalid notification type is specified
"""
user = make_user()
amount_of_notifications = len(Notification.NOTIFICATION_TYPES)
notifications_ids = []
for i in range(0, amount_of_notifications):
notification = make_notification(owner=user)
notifications_ids.append(notification.pk)
url = self._get_url()
headers = make_authentication_headers_for_user(user)
response = self.client.get(url, {
'types': 'AA'
}, **headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def _get_url(self):
return reverse('unread-notifications-count')
# File: pylangacq/tests/test_measures.py (mitjanikolaus/pylangacq, MIT)
from pylangacq.measures import _get_lemma_from_mor
def test__get_lemma_from_mor():
assert _get_lemma_from_mor("foo&bar-baz") == "foo"
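The assertion implies the lemma is the segment before the first `&` or `-` marker in a MOR tier string. A sketch consistent with that expectation (not pylangacq's actual implementation):

```python
import re


def get_lemma_sketch(mor):
    # Keep everything before the first '&' or '-' morphological marker.
    return re.split(r"[&-]", mor, maxsplit=1)[0]


lemma = get_lemma_sketch("foo&bar-baz")
```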
# File: crypy/pair/chart.py (asmodehn/crypy, MIT)
from ..graph import plot
# TODO : pandas series as input
def chart(series):
# print the chart of the last 120 data points
print("\n" + plot(series[-120:], {'height': 20}))
# File: tests/strategies/test_strategies.py (vishalbelsare/FinMind, Apache-2.0)
import os
import pandas as pd
import pytest
from FinMind import strategies
from FinMind.data import DataLoader
user_id = os.environ.get("FINMIND_USER", "")
password = os.environ.get("FINMIND_PASSWORD", "")
@pytest.fixture(scope="module")
def data_loader():
data_loader = DataLoader()
data_loader.login(user_id, password)
return data_loader
def test_get_stock_price(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
data_loader=data_loader,
# strategy=ContinueHolding,
)
assert isinstance(obj.stock_price, pd.DataFrame)
def test_continue_holding(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
strategy=strategies.ContinueHolding,
data_loader=data_loader,
)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 2810
assert int(obj.final_stats.MaxLoss) == -9663
assert int(obj.final_stats.FinalProfit) == 3407
assert obj.final_stats["MeanProfitPer"] == 0.56
assert obj.final_stats["FinalProfitPer"] == 0.68
assert obj.final_stats["MaxLossPer"] == -1.93
assert obj.trade_detail.to_dict("r")[1] == {
"stock_id": "0056",
"date": "2018-01-03",
"EverytimeProfit": -96.83,
"RealizedProfit": 0.0,
"UnrealizedProfit": -96.83,
"board_lot": 1000,
"hold_cost": 25.18583875,
"hold_volume": 1000,
"signal": 1,
"trade_price": 25.15,
"trader_fund": 474814.16125,
"EverytimeTotalProfit": 474717.33125,
"CashEarningsDistribution": 0.0,
"StockEarningsDistribution": 0.0,
}
assert obj.compare_market_detail.to_dict("r")[-1] == {
"CumDailyReturn": -0.61003,
"CumTaiExDailyReturn": -0.0963,
"date": "2018-12-28",
}
assert obj.compare_market_stats["AnnualTaiexReturnPer"] == -9.6
assert obj.compare_market_stats["AnnualReturnPer"] == 0.68
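The percentage stats asserted above line up with profit expressed as a percent of `trader_fund`, rounded to two decimals (and with a one-year window, `AnnualReturnPer` matches `FinalProfitPer`). A quick arithmetic check under that reading:

```python
def as_percent(profit, trader_fund=500000.0):
    # Profit as a percentage of starting capital, rounded like the stats.
    return round(profit / trader_fund * 100, 2)


final_per = as_percent(3407)      # matches FinalProfitPer == 0.68
mean_per = as_percent(2810)       # matches MeanProfitPer == 0.56
max_loss_per = as_percent(-9663)  # matches MaxLossPer == -1.93
```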
def test_continue_holding_add_strategy(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
# strategy=ContinueHolding,
data_loader=data_loader,
)
obj.add_strategy(strategies.ContinueHolding)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 2810
assert int(obj.final_stats.MaxLoss) == -9663
assert int(obj.final_stats.FinalProfit) == 3407
assert obj.final_stats["MeanProfitPer"] == 0.56
assert obj.final_stats["FinalProfitPer"] == 0.68
assert obj.final_stats["MaxLossPer"] == -1.93
assert obj.trade_detail.to_dict("r")[1] == {
"EverytimeProfit": -96.83,
"RealizedProfit": 0.0,
"UnrealizedProfit": -96.83,
"board_lot": 1000.0,
"date": "2018-01-03",
"hold_cost": 25.18583875,
"hold_volume": 1000.0,
"signal": 1,
"stock_id": "0056",
"trade_price": 25.15,
"trader_fund": 474814.16125,
"EverytimeTotalProfit": 474717.33125,
"CashEarningsDistribution": 0.0,
"StockEarningsDistribution": 0.0,
}
def test_bias(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
strategy=strategies.Bias,
data_loader=data_loader,
)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 984
assert int(obj.final_stats.MaxLoss) == -863
assert int(obj.final_stats.FinalProfit) == 2845
assert obj.final_stats["MeanProfitPer"] == 0.20
assert obj.final_stats["FinalProfitPer"] == 0.57
assert obj.final_stats["MaxLossPer"] == -0.17
assert obj.trade_detail.to_dict("r")[1] == {
"EverytimeProfit": 0.0,
"RealizedProfit": 0.0,
"UnrealizedProfit": 0.0,
"board_lot": 1000.0,
"date": "2018-02-05",
"hold_cost": 0.0,
"hold_volume": 0.0,
"signal": 0.0,
"stock_id": "0056",
"trade_price": 26.1,
"trader_fund": 500000.0,
"EverytimeTotalProfit": 500000.0,
"CashEarningsDistribution": 0.0,
"StockEarningsDistribution": 0.0,
}
assert obj.compare_market_detail.to_dict("r")[-1] == {
"CumDailyReturn": -0.39843,
"CumTaiExDailyReturn": -0.0963,
"date": "2018-12-28",
}
assert obj.compare_market_stats["AnnualTaiexReturnPer"] == -9.6
assert obj.compare_market_stats["AnnualReturnPer"] == 0.57
def test_bias_add_strategy(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
# strategy=Bias,
data_loader=data_loader,
)
obj.add_strategy(strategies.Bias)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 984
assert int(obj.final_stats.MaxLoss) == -863
assert int(obj.final_stats.FinalProfit) == 2845
assert obj.final_stats["MeanProfitPer"] == 0.20
assert obj.final_stats["FinalProfitPer"] == 0.57
assert obj.final_stats["MaxLossPer"] == -0.17
assert obj.trade_detail.to_dict("r")[1] == {
"EverytimeProfit": 0.0,
"RealizedProfit": 0.0,
"UnrealizedProfit": 0.0,
"board_lot": 1000.0,
"date": "2018-02-05",
"hold_cost": 0.0,
"hold_volume": 0.0,
"signal": 0.0,
"stock_id": "0056",
"trade_price": 26.1,
"trader_fund": 500000.0,
"EverytimeTotalProfit": 500000.0,
"CashEarningsDistribution": 0.0,
"StockEarningsDistribution": 0.0,
}
def test_naive_kd(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
strategy=strategies.NaiveKd,
data_loader=data_loader,
)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 5418
assert int(obj.final_stats.MaxLoss) == -2094
assert int(obj.final_stats.FinalProfit) == 15033
assert obj.final_stats["MeanProfitPer"] == 1.08
assert obj.final_stats["FinalProfitPer"] == 3.01
assert obj.final_stats["MaxLossPer"] == -0.42
def test_naive_kd_add_strategy(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
# strategy=strategies.NaiveKd,
data_loader=data_loader,
)
obj.add_strategy(strategies.NaiveKd)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 5418
assert int(obj.final_stats.MaxLoss) == -2094
assert int(obj.final_stats.FinalProfit) == 15033
assert obj.final_stats["MeanProfitPer"] == 1.08
assert obj.final_stats["FinalProfitPer"] == 3.01
assert obj.final_stats["MaxLossPer"] == -0.42
def test_kd(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
strategy=strategies.Kd,
data_loader=data_loader,
)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 2356
assert int(obj.final_stats.MaxLoss) == -1425
assert int(obj.final_stats.FinalProfit) == 6196
assert obj.final_stats["MeanProfitPer"] == 0.47
assert obj.final_stats["FinalProfitPer"] == 1.24
assert obj.final_stats["MaxLossPer"] == -0.29
def test_kd_add_strategy(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
# strategy=strategies.Kd,
data_loader=data_loader,
)
obj.add_strategy(strategies.Kd)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 2356
assert int(obj.final_stats.MaxLoss) == -1425
assert int(obj.final_stats.FinalProfit) == 6196
assert obj.final_stats["MeanProfitPer"] == 0.47
assert obj.final_stats["FinalProfitPer"] == 1.24
assert obj.final_stats["MaxLossPer"] == -0.29
def test_kd_crossover(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
strategy=strategies.KdCrossOver,
data_loader=data_loader,
)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 349
assert int(obj.final_stats.MaxLoss) == -1223
assert int(obj.final_stats.FinalProfit) == 933
assert obj.final_stats["MeanProfitPer"] == 0.07
assert obj.final_stats["FinalProfitPer"] == 0.19
assert obj.final_stats["MaxLossPer"] == -0.24
def test_kd_crossover_add_strategy(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
# strategy=strategies.KdCrossOver,
data_loader=data_loader,
)
obj.add_strategy(strategies.KdCrossOver)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 349
assert int(obj.final_stats.MaxLoss) == -1223
assert int(obj.final_stats.FinalProfit) == 933
assert obj.final_stats["MeanProfitPer"] == 0.07
assert obj.final_stats["FinalProfitPer"] == 0.19
assert obj.final_stats["MaxLossPer"] == -0.24
def test_institutional_investors_follower(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
strategy=strategies.InstitutionalInvestorsFollower,
data_loader=data_loader,
)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 6021
assert int(obj.final_stats.MaxLoss) == -15410
assert int(obj.final_stats.FinalProfit) == 10699
assert obj.final_stats["MeanProfitPer"] == 1.2
assert obj.final_stats["FinalProfitPer"] == 2.14
assert obj.final_stats["MaxLossPer"] == -3.08
def test_institutional_investors_follower_add_strategy(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
# strategy=strategies.InstitutionalInvestorsFollower,
data_loader=data_loader,
)
obj.add_strategy(strategies.InstitutionalInvestorsFollower)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 6021
assert int(obj.final_stats.MaxLoss) == -15410
assert int(obj.final_stats.FinalProfit) == 10699
assert obj.final_stats["MeanProfitPer"] == 1.2
assert obj.final_stats["FinalProfitPer"] == 2.14
assert obj.final_stats["MaxLossPer"] == -3.08
def test_short_sale_margin_purchase_ratio(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
strategy=strategies.ShortSaleMarginPurchaseRatio,
data_loader=data_loader,
)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 12946
assert int(obj.final_stats.MaxLoss) == -14706
assert int(obj.final_stats.FinalProfit) == 22576
assert obj.final_stats["MeanProfitPer"] == 2.59
assert obj.final_stats["FinalProfitPer"] == 4.52
assert obj.final_stats["MaxLossPer"] == -2.94
def test_short_sale_margin_purchase_ratio_add_strategy(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
# strategy=strategies.ShortSaleMarginPurchaseRatio,
data_loader=data_loader,
)
obj.add_strategy(strategies.ShortSaleMarginPurchaseRatio)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 12946
assert int(obj.final_stats.MaxLoss) == -14706
assert int(obj.final_stats.FinalProfit) == 22576
assert obj.final_stats["MeanProfitPer"] == 2.59
assert obj.final_stats["FinalProfitPer"] == 4.52
assert obj.final_stats["MaxLossPer"] == -2.94
def test_macd_crossover(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
strategy=strategies.MacdCrossOver,
data_loader=data_loader,
)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 1347
assert int(obj.final_stats.MaxLoss) == -397
assert int(obj.final_stats.FinalProfit) == 3232
assert obj.final_stats["MeanProfitPer"] == 0.27
assert obj.final_stats["FinalProfitPer"] == 0.65
assert obj.final_stats["MaxLossPer"] == -0.08
def test_macd_crossover_add_strategy(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
# strategy=strategies.MacdCrossOver,
data_loader=data_loader,
)
obj.add_strategy(strategies.MacdCrossOver)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 1347
assert int(obj.final_stats.MaxLoss) == -397
assert int(obj.final_stats.FinalProfit) == 3232
assert obj.final_stats["MeanProfitPer"] == 0.27
assert obj.final_stats["FinalProfitPer"] == 0.65
assert obj.final_stats["MaxLossPer"] == -0.08
def test_ma_crossover(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
strategy=strategies.MaCrossOver,
data_loader=data_loader,
)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == -381
assert int(obj.final_stats.MaxLoss) == -1230
assert int(obj.final_stats.FinalProfit) == -1230
assert obj.final_stats["MeanProfitPer"] == -0.08
assert obj.final_stats["FinalProfitPer"] == -0.25
assert obj.final_stats["MaxLossPer"] == -0.25
def test_ma_crossover_add_strategy(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
# strategy=strategies.MaCrossOver,
data_loader=data_loader,
)
obj.add_strategy(strategies.MaCrossOver)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == -381
assert int(obj.final_stats.MaxLoss) == -1230
assert int(obj.final_stats.FinalProfit) == -1230
assert obj.final_stats["MeanProfitPer"] == -0.08
assert obj.final_stats["FinalProfitPer"] == -0.25
assert obj.final_stats["MaxLossPer"] == -0.25
def test_max_min_period_bias(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
strategy=strategies.MaxMinPeriodBias,
data_loader=data_loader,
)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 0
assert int(obj.final_stats.MaxLoss) == 0
assert int(obj.final_stats.FinalProfit) == 0
assert obj.final_stats["MeanProfitPer"] == 0
assert obj.final_stats["FinalProfitPer"] == 0
assert obj.final_stats["MaxLossPer"] == 0
def test_max_min_period_bias_add_strategy(data_loader):
obj = strategies.BackTest(
stock_id="0056",
start_date="2018-01-01",
end_date="2019-01-01",
trader_fund=500000.0,
fee=0.001425,
# strategy=strategies.MaxMinPeriodBias,
data_loader=data_loader,
)
obj.add_strategy(strategies.MaxMinPeriodBias)
obj.simulate()
assert int(obj.final_stats.MeanProfit) == 0
assert int(obj.final_stats.MaxLoss) == 0
assert int(obj.final_stats.FinalProfit) == 0
assert obj.final_stats["MeanProfitPer"] == 0
assert obj.final_stats["FinalProfitPer"] == 0
assert obj.final_stats["MaxLossPer"] == 0
| 31.049451 | 69 | 0.625612 | 2,052 | 16,953 | 4.972222 | 0.08577 | 0.09409 | 0.152896 | 0.099971 | 0.938155 | 0.931197 | 0.923552 | 0.911791 | 0.885328 | 0.885328 | 0 | 0.109435 | 0.245384 | 16,953 | 545 | 70 | 31.106422 | 0.688111 | 0.021235 | 0 | 0.8 | 0 | 0 | 0.139436 | 0.012223 | 0 | 0 | 0 | 0 | 0.297727 | 1 | 0.05 | false | 0.004545 | 0.011364 | 0 | 0.063636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ed4b1fd2fa5aa56214fa7acdf83f023b2861b6bc | 5,068 | py | Python | tests/test_db_setup.py | olendorf/coop_evolve | 147b987088bfd6b8da23c3775e8b871ee19518b3 | [
"Apache-2.0"
] | null | null | null | tests/test_db_setup.py | olendorf/coop_evolve | 147b987088bfd6b8da23c3775e8b871ee19518b3 | [
"Apache-2.0"
] | 5 | 2017-06-16T17:39:50.000Z | 2019-11-13T14:49:44.000Z | tests/test_db_setup.py | olendorf/coop_evolve | 147b987088bfd6b8da23c3775e8b871ee19518b3 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
import psycopg2
import pytest
from app_settings import AppSettings
from coop_evolve.db_setup import DB_Setup
class TestDBSetup:
def test_clean_setup(self):
cfg = AppSettings()
setup = DB_Setup()
conn = psycopg2.connect(
f"dbname='{cfg.database}' "
f"user='{cfg.db_user}' "
f"password={cfg.db_password} "
f"host=localhost"
)
cur = conn.cursor()
cur.execute(f"DROP SCHEMA IF EXISTS {cfg.schema_name} CASCADE")
conn.commit()
setup.setup()
cur.execute(
f"SELECT t.table_name FROM information_schema.tables t WHERE t.table_schema = '{cfg.schema_name}' AND t.table_type = 'BASE TABLE'"
)
result = [ table[0] for table in cur.fetchall() ]
result.sort()
expected = ['runs', 'subpop_data', 'experiments', 'pop_data']
expected.sort()
assert result == expected
class TestDBReset:
def test_table_creation(self):
cfg = AppSettings()
setup = DB_Setup()
setup.reset()
conn = psycopg2.connect(
f"dbname='{cfg.database}' "
f"user='{cfg.db_user}' "
f"password={cfg.db_password} "
f"host=localhost"
)
cur = conn.cursor()
cur.execute(
f"SELECT t.table_name FROM information_schema.tables t WHERE t.table_schema = '{cfg.schema_name}' AND t.table_type = 'BASE TABLE'"
)
result = [ table[0] for table in cur.fetchall() ]
result.sort()
expected = ['runs', 'subpop_data', 'experiments', 'pop_data']
expected.sort()
assert result == expected
def test_subpop_data_table(self):
cfg = AppSettings()
setup = DB_Setup()
setup.reset()
conn = psycopg2.connect(
f"dbname='{cfg.database}' "
f"user='{cfg.db_user}' "
f"password={cfg.db_password} "
f"host=localhost"
)
cur = conn.cursor()
cur.execute(
f"SELECT column_name FROM information_schema.columns WHERE table_schema = '{cfg.schema_name}' AND table_name = 'subpop_data'"
)
result = [item[0] for item in cur.fetchall()]
result.sort()
expected = ['id', 'run_id', 'generation', 'x_coord', 'y_coord', 'mean_fitness', 'behavior', 'census']
expected.sort()
assert result == expected
def test_simulations_table(self):
cfg = AppSettings()
setup = DB_Setup()
setup.reset()
conn = psycopg2.connect(
f"dbname='{cfg.database}' "
f"user='{cfg.db_user}' "
f"password={cfg.db_password} "
f"host=localhost"
)
cur = conn.cursor()
cur.execute(
f"SELECT column_name FROM information_schema.columns WHERE table_schema = '{cfg.schema_name}' AND table_name = 'experiments'"
)
result = [item[0] for item in cur.fetchall()]
result.sort()
expected = ['id', 'behaviors', 'gene_delimiter', 'wild_cards', 'chromosome_length', 'mutation_rate', 'crossover_rate', 'interaction_length']
expected.sort()
assert result == expected
def test_pop_data_table(self):
cfg = AppSettings()
setup = DB_Setup()
setup.reset()
conn = psycopg2.connect(
f"dbname='{cfg.database}' "
f"user='{cfg.db_user}' "
f"password={cfg.db_password} "
f"host=localhost"
)
cur = conn.cursor()
cur.execute(
f"SELECT column_name FROM information_schema.columns WHERE table_schema = '{cfg.schema_name}' AND table_name = 'pop_data'"
)
result = [item[0] for item in cur.fetchall()]
result.sort()
expected = ['id', 'run_id', 'generation', 'mean_fitness', 'behavior', 'census']
expected.sort()
assert result == expected
def test_runs_table(self):
cfg = AppSettings()
setup = DB_Setup()
setup.reset()
conn = psycopg2.connect(
f"dbname='{cfg.database}' "
f"user='{cfg.db_user}' "
f"password={cfg.db_password} "
f"host=localhost"
)
cur = conn.cursor()
cur.execute(
f"SELECT column_name FROM information_schema.columns WHERE table_schema = '{cfg.schema_name}' AND table_name = 'runs'"
)
result = [item[0] for item in cur.fetchall()]
result.sort()
expected = ['id', 'simulation_id', 'generations', 'width', 'length', 'subpop_size', 'relative_fitnesses',
'migration_distance', 'migration_survival', 'initial_sequence']
expected.sort()
assert result == expected
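Each test above rebuilds the same libpq DSN string by hand; a hypothetical helper (not in the module) would remove that duplication — the field names mirror the `AppSettings` attributes the tests use:

```python
def build_dsn(database, user, password, host="localhost"):
    # Same keyword/value DSN the tests pass to psycopg2.connect().
    return (
        f"dbname='{database}' "
        f"user='{user}' "
        f"password={password} "
        f"host={host}"
    )
```

Each test could then call `psycopg2.connect(build_dsn(cfg.database, cfg.db_user, cfg.db_password))`.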
| 31.478261 | 148 | 0.527427 | 525 | 5,068 | 4.929524 | 0.2 | 0.023184 | 0.041731 | 0.053323 | 0.805255 | 0.79289 | 0.781298 | 0.763524 | 0.763524 | 0.763524 | 0 | 0.004249 | 0.349842 | 5,068 | 160 | 149 | 31.675 | 0.781184 | 0.008287 | 0 | 0.707317 | 0 | 0.01626 | 0.32086 | 0.061704 | 0 | 0 | 0 | 0 | 0.04878 | 1 | 0.04878 | false | 0.04878 | 0.03252 | 0 | 0.097561 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ed5359fa0584520bd278f6c02930c168f83b85c6 | 10,409 | py | Python | ironic/tests/unit/drivers/test_ilo.py | armohamm/ironic | 21093ca886ed736a7a25bf5e71e05d41e132fd2f | [
"Apache-2.0"
] | 2 | 2019-06-17T21:37:53.000Z | 2020-07-11T03:58:39.000Z | ironic/tests/unit/drivers/test_ilo.py | armohamm/ironic | 21093ca886ed736a7a25bf5e71e05d41e132fd2f | [
"Apache-2.0"
] | 5 | 2019-08-14T06:46:03.000Z | 2021-12-13T20:01:25.000Z | ironic/tests/unit/drivers/test_ilo.py | armohamm/ironic | 21093ca886ed736a7a25bf5e71e05d41e132fd2f | [
"Apache-2.0"
] | 6 | 2019-06-13T12:49:33.000Z | 2021-04-17T16:33:19.000Z | # Copyright 2017 Hewlett-Packard Enterprise Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""
Test class for iLO Drivers
"""
from ironic.conductor import task_manager
from ironic.drivers import ilo
from ironic.drivers.modules import agent
from ironic.drivers.modules.ilo import raid
from ironic.drivers.modules import inspector
from ironic.drivers.modules import iscsi_deploy
from ironic.drivers.modules import noop
from ironic.tests.unit.db import base as db_base
from ironic.tests.unit.objects import utils as obj_utils
class IloHardwareTestCase(db_base.DbTestCase):
def setUp(self):
super(IloHardwareTestCase, self).setUp()
self.config(enabled_hardware_types=['ilo'],
enabled_boot_interfaces=['ilo-virtual-media', 'ilo-pxe'],
enabled_bios_interfaces=['no-bios', 'ilo'],
enabled_console_interfaces=['ilo'],
enabled_deploy_interfaces=['iscsi', 'direct'],
enabled_inspect_interfaces=['ilo'],
enabled_management_interfaces=['ilo'],
enabled_power_interfaces=['ilo'],
enabled_raid_interfaces=['no-raid', 'agent'],
enabled_rescue_interfaces=['no-rescue', 'agent'],
enabled_vendor_interfaces=['ilo', 'no-vendor'])
def test_default_interfaces(self):
node = obj_utils.create_test_node(self.context,
driver='ilo')
with task_manager.acquire(self.context, node.id) as task:
self.assertIsInstance(task.driver.boot,
ilo.boot.IloVirtualMediaBoot)
self.assertIsInstance(task.driver.bios,
ilo.bios.IloBIOS)
self.assertIsInstance(task.driver.console,
ilo.console.IloConsoleInterface)
self.assertIsInstance(task.driver.deploy,
iscsi_deploy.ISCSIDeploy)
self.assertIsInstance(task.driver.inspect,
ilo.inspect.IloInspect)
self.assertIsInstance(task.driver.management,
ilo.management.IloManagement)
self.assertIsInstance(task.driver.power,
ilo.power.IloPower)
self.assertIsInstance(task.driver.raid,
noop.NoRAID)
self.assertIsInstance(task.driver.vendor,
ilo.vendor.VendorPassthru)
self.assertIsInstance(task.driver.rescue,
noop.NoRescue)
def test_override_with_inspector(self):
self.config(enabled_inspect_interfaces=['inspector', 'ilo'])
node = obj_utils.create_test_node(
self.context, driver='ilo',
deploy_interface='direct',
inspect_interface='inspector',
raid_interface='agent',
vendor_interface='no-vendor')
with task_manager.acquire(self.context, node.id) as task:
self.assertIsInstance(task.driver.boot,
ilo.boot.IloVirtualMediaBoot)
self.assertIsInstance(task.driver.console,
ilo.console.IloConsoleInterface)
self.assertIsInstance(task.driver.deploy,
agent.AgentDeploy)
self.assertIsInstance(task.driver.inspect,
inspector.Inspector)
self.assertIsInstance(task.driver.management,
ilo.management.IloManagement)
self.assertIsInstance(task.driver.power,
ilo.power.IloPower)
self.assertIsInstance(task.driver.raid,
agent.AgentRAID)
self.assertIsInstance(task.driver.rescue,
noop.NoRescue)
self.assertIsInstance(task.driver.vendor,
noop.NoVendor)
def test_override_with_pxe(self):
node = obj_utils.create_test_node(
self.context, driver='ilo',
boot_interface='ilo-pxe',
raid_interface='agent')
with task_manager.acquire(self.context, node.id) as task:
self.assertIsInstance(task.driver.boot,
ilo.boot.IloPXEBoot)
self.assertIsInstance(task.driver.console,
ilo.console.IloConsoleInterface)
self.assertIsInstance(task.driver.deploy,
iscsi_deploy.ISCSIDeploy)
self.assertIsInstance(task.driver.inspect,
ilo.inspect.IloInspect)
self.assertIsInstance(task.driver.management,
ilo.management.IloManagement)
self.assertIsInstance(task.driver.power,
ilo.power.IloPower)
self.assertIsInstance(task.driver.raid,
agent.AgentRAID)
self.assertIsInstance(task.driver.rescue,
noop.NoRescue)
self.assertIsInstance(task.driver.vendor,
ilo.vendor.VendorPassthru)
def test_override_with_agent_rescue(self):
self.config(enabled_inspect_interfaces=['inspector', 'ilo'])
node = obj_utils.create_test_node(
self.context, driver='ilo',
deploy_interface='direct',
rescue_interface='agent',
raid_interface='agent')
with task_manager.acquire(self.context, node.id) as task:
self.assertIsInstance(task.driver.boot,
ilo.boot.IloVirtualMediaBoot)
self.assertIsInstance(task.driver.console,
ilo.console.IloConsoleInterface)
self.assertIsInstance(task.driver.deploy,
agent.AgentDeploy)
self.assertIsInstance(task.driver.inspect,
ilo.inspect.IloInspect)
self.assertIsInstance(task.driver.management,
ilo.management.IloManagement)
self.assertIsInstance(task.driver.power,
ilo.power.IloPower)
self.assertIsInstance(task.driver.raid,
agent.AgentRAID)
self.assertIsInstance(task.driver.rescue,
agent.AgentRescue)
self.assertIsInstance(task.driver.vendor,
ilo.vendor.VendorPassthru)
def test_override_with_no_bios(self):
node = obj_utils.create_test_node(
self.context, driver='ilo',
boot_interface='ilo-pxe',
bios_interface='no-bios',
deploy_interface='direct',
raid_interface='agent')
with task_manager.acquire(self.context, node.id) as task:
self.assertIsInstance(task.driver.boot,
ilo.boot.IloPXEBoot)
self.assertIsInstance(task.driver.bios,
noop.NoBIOS)
self.assertIsInstance(task.driver.console,
ilo.console.IloConsoleInterface)
self.assertIsInstance(task.driver.deploy,
agent.AgentDeploy)
self.assertIsInstance(task.driver.raid,
agent.AgentRAID)
class Ilo5HardwareTestCase(db_base.DbTestCase):
def setUp(self):
super(Ilo5HardwareTestCase, self).setUp()
self.config(enabled_hardware_types=['ilo5'],
enabled_boot_interfaces=['ilo-virtual-media', 'ilo-pxe'],
enabled_console_interfaces=['ilo'],
enabled_deploy_interfaces=['iscsi', 'direct'],
enabled_inspect_interfaces=['ilo'],
enabled_management_interfaces=['ilo'],
enabled_power_interfaces=['ilo'],
enabled_raid_interfaces=['ilo5'],
enabled_rescue_interfaces=['no-rescue', 'agent'],
enabled_vendor_interfaces=['ilo', 'no-vendor'])
def test_default_interfaces(self):
node = obj_utils.create_test_node(self.context, driver='ilo5')
with task_manager.acquire(self.context, node.id) as task:
self.assertIsInstance(task.driver.raid, raid.Ilo5RAID)
def test_override_with_no_raid(self):
self.config(enabled_raid_interfaces=['no-raid', 'ilo5'])
node = obj_utils.create_test_node(self.context, driver='ilo5',
raid_interface='no-raid')
with task_manager.acquire(self.context, node.id) as task:
self.assertIsInstance(task.driver.raid, noop.NoRAID)
self.assertIsInstance(task.driver.boot,
ilo.boot.IloVirtualMediaBoot)
self.assertIsInstance(task.driver.console,
ilo.console.IloConsoleInterface)
self.assertIsInstance(task.driver.deploy,
iscsi_deploy.ISCSIDeploy)
self.assertIsInstance(task.driver.inspect,
ilo.inspect.IloInspect)
self.assertIsInstance(task.driver.management,
ilo.management.IloManagement)
self.assertIsInstance(task.driver.power,
ilo.power.IloPower)
self.assertIsInstance(task.driver.rescue,
noop.NoRescue)
self.assertIsInstance(task.driver.vendor,
ilo.vendor.VendorPassthru)
| 48.868545 | 77 | 0.568162 | 946 | 10,409 | 6.123679 | 0.15222 | 0.179527 | 0.215432 | 0.269291 | 0.803383 | 0.771621 | 0.771621 | 0.73537 | 0.730364 | 0.714656 | 0 | 0.002353 | 0.346623 | 10,409 | 212 | 78 | 49.099057 | 0.849434 | 0.058219 | 0 | 0.767956 | 0 | 0 | 0.033323 | 0 | 0 | 0 | 0 | 0 | 0.287293 | 1 | 0.049724 | false | 0.022099 | 0.049724 | 0 | 0.110497 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
ed565291698b388cb00f8c21e7c9492142405f1d | 37 | py | Python | models/__init__.py | georgemunyoro/pyspace-api | 7e5d9b56996d633c70f99fa4a67740fa5bdaebd8 | [
"MIT"
] | null | null | null | models/__init__.py | georgemunyoro/pyspace-api | 7e5d9b56996d633c70f99fa4a67740fa5bdaebd8 | [
"MIT"
] | null | null | null | models/__init__.py | georgemunyoro/pyspace-api | 7e5d9b56996d633c70f99fa4a67740fa5bdaebd8 | [
"MIT"
] | null | null | null | from . import post
from . import user | 18.5 | 18 | 0.756757 | 6 | 37 | 4.666667 | 0.666667 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.189189 | 37 | 2 | 19 | 18.5 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
ed5924603bfeffd0a737e761d88c206b88c1283f | 3,147 | py | Python | cart_venv/Lib/site-packages/tensorflow_core/_api/v1/sparse/__init__.py | juice1000/Synchronous-vs-Asynchronous-Learning-Tensorflow- | 654be60f7986ac9bb7ce1d080ddee377c3389f93 | [
"MIT"
] | 2 | 2019-08-04T20:28:14.000Z | 2019-10-27T23:26:42.000Z | cart_venv/Lib/site-packages/tensorflow_core/_api/v1/sparse/__init__.py | juice1000/Synchronous-vs-Asynchronous-Learning-Tensorflow- | 654be60f7986ac9bb7ce1d080ddee377c3389f93 | [
"MIT"
] | null | null | null | cart_venv/Lib/site-packages/tensorflow_core/_api/v1/sparse/__init__.py | juice1000/Synchronous-vs-Asynchronous-Learning-Tensorflow- | 654be60f7986ac9bb7ce1d080ddee377c3389f93 | [
"MIT"
] | 1 | 2020-11-04T03:16:29.000Z | 2020-11-04T03:16:29.000Z | # This file is MACHINE GENERATED! Do not edit.
# Generated by: tensorflow/python/tools/api/generator/create_python_api.py script.
"""Sparse Tensor Representation.
See also `tf.SparseTensor`.
"""
from __future__ import print_function as _print_function
import sys as _sys
from tensorflow.python.framework.sparse_tensor import SparseTensor
from tensorflow.python.ops.array_ops import sparse_mask as mask
from tensorflow.python.ops.array_ops import sparse_placeholder as placeholder
from tensorflow.python.ops.data_flow_ops import SparseConditionalAccumulator
from tensorflow.python.ops.math_ops import sparse_segment_mean as segment_mean
from tensorflow.python.ops.math_ops import sparse_segment_sqrt_n as segment_sqrt_n
from tensorflow.python.ops.math_ops import sparse_segment_sum as segment_sum
from tensorflow.python.ops.sparse_ops import _sparse_cross as cross
from tensorflow.python.ops.sparse_ops import _sparse_cross_hashed as cross_hashed
from tensorflow.python.ops.sparse_ops import from_dense
from tensorflow.python.ops.sparse_ops import sparse_add as add
from tensorflow.python.ops.sparse_ops import sparse_concat as concat
from tensorflow.python.ops.sparse_ops import sparse_expand_dims as expand_dims
from tensorflow.python.ops.sparse_ops import sparse_eye as eye
from tensorflow.python.ops.sparse_ops import sparse_fill_empty_rows as fill_empty_rows
from tensorflow.python.ops.sparse_ops import sparse_maximum as maximum
from tensorflow.python.ops.sparse_ops import sparse_merge as merge
from tensorflow.python.ops.sparse_ops import sparse_minimum as minimum
from tensorflow.python.ops.sparse_ops import sparse_reduce_max as reduce_max
from tensorflow.python.ops.sparse_ops import sparse_reduce_max_sparse as reduce_max_sparse
from tensorflow.python.ops.sparse_ops import sparse_reduce_sum as reduce_sum
from tensorflow.python.ops.sparse_ops import sparse_reduce_sum_sparse as reduce_sum_sparse
from tensorflow.python.ops.sparse_ops import sparse_reorder as reorder
from tensorflow.python.ops.sparse_ops import sparse_reset_shape as reset_shape
from tensorflow.python.ops.sparse_ops import sparse_reshape as reshape
from tensorflow.python.ops.sparse_ops import sparse_retain as retain
from tensorflow.python.ops.sparse_ops import sparse_slice as slice
from tensorflow.python.ops.sparse_ops import sparse_softmax as softmax
from tensorflow.python.ops.sparse_ops import sparse_split as split
from tensorflow.python.ops.sparse_ops import sparse_tensor_dense_matmul as matmul
from tensorflow.python.ops.sparse_ops import sparse_tensor_dense_matmul as sparse_dense_matmul
from tensorflow.python.ops.sparse_ops import sparse_tensor_to_dense as to_dense
from tensorflow.python.ops.sparse_ops import sparse_to_indicator as to_indicator
from tensorflow.python.ops.sparse_ops import sparse_transpose as transpose
del _print_function
from tensorflow.python.util import module_wrapper as _module_wrapper
if not isinstance(_sys.modules[__name__], _module_wrapper.TFModuleWrapper):
_sys.modules[__name__] = _module_wrapper.TFModuleWrapper(
_sys.modules[__name__], "sparse", public_apis=None, deprecation=True,
has_lite=False)
| 56.196429 | 94 | 0.864951 | 487 | 3,147 | 5.289528 | 0.184805 | 0.223602 | 0.271739 | 0.294643 | 0.637811 | 0.637811 | 0.637811 | 0.623059 | 0.350543 | 0.15528 | 0 | 0 | 0.086114 | 3,147 | 55 | 95 | 57.218182 | 0.895688 | 0.058786 | 0 | 0 | 1 | 0 | 0.002032 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.880952 | 0 | 0.880952 | 0.047619 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
9c26186447760448c9c73449208c583fd222b5bc | 1,629 | py | Python | qst_nn/models/classifier.py | quantshah/qst-nn | e4dc252fea7c98d0fedc7069502b01e30b47d422 | [
"MIT"
] | 5 | 2020-12-20T04:08:41.000Z | 2022-01-08T13:19:20.000Z | qst_nn/models/classifier.py | quantshah/qst-nn | e4dc252fea7c98d0fedc7069502b01e30b47d422 | [
"MIT"
] | null | null | null | qst_nn/models/classifier.py | quantshah/qst-nn | e4dc252fea7c98d0fedc7069502b01e30b47d422 | [
"MIT"
] | 3 | 2021-02-23T06:59:43.000Z | 2022-02-26T02:46:06.000Z | import tensorflow as tf
def Classifier():
inp = tf.keras.layers.Input(shape=[32, 32, 1], name='input_image')
x = tf.keras.layers.Conv2D(32, 3, strides=1,
use_bias=False,
)(inp)
x = tf.keras.layers.LeakyReLU()(x)
x = tf.keras.layers.Conv2D(32, 3, strides=1,
use_bias=False,
)(x)
x = tf.keras.layers.LeakyReLU()(x)
x = tf.keras.layers.GaussianNoise(0.005)(x)
x = tf.keras.layers.Dropout(0.4)(x)
x = tf.keras.layers.Conv2D(32, 3, strides=2,
use_bias=False)(x)
x = tf.keras.layers.LeakyReLU()(x)
x = tf.keras.layers.Conv2D(64, 3, strides=1,
use_bias=False)(x)
x = tf.keras.layers.LeakyReLU()(x)
x = tf.keras.layers.GaussianNoise(0.005)(x)
x = tf.keras.layers.Dropout(0.4)(x)
x = tf.keras.layers.Conv2D(64, 3, strides=1,
use_bias=False)(x)
x = tf.keras.layers.LeakyReLU()(x)
x = tf.keras.layers.Conv2D(64, 3, strides=2,
use_bias=False)(x)
x = tf.keras.layers.LeakyReLU()(x)
x = tf.keras.layers.Dropout(0.4)(x)
x = tf.keras.layers.Flatten()(x)
x = tf.keras.layers.Dense(512)(x)
x = tf.keras.layers.LeakyReLU()(x)
x = tf.keras.layers.Dropout(0.4)(x)
x = tf.keras.layers.Dense(256)(x)
x = tf.keras.layers.LeakyReLU()(x)
x = tf.keras.layers.Dense(7)(x)
return tf.keras.Model(inputs=inp, outputs=x)
| 30.166667 | 70 | 0.524862 | 232 | 1,629 | 3.650862 | 0.176724 | 0.214876 | 0.383707 | 0.396694 | 0.813459 | 0.813459 | 0.77804 | 0.77804 | 0.77804 | 0.775679 | 0 | 0.052252 | 0.3186 | 1,629 | 53 | 71 | 30.735849 | 0.710811 | 0 | 0 | 0.648649 | 0 | 0 | 0.006753 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.027027 | false | 0 | 0.054054 | 0 | 0.108108 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
9c480f11fc6aa88a6332c051118420f5f2c7da9e | 15,514 | py | Python | tests/components/usb/test_init.py | switschel/core | 0ecca246bdc3028c30bf8ccbf2b4c7f2a8b3f9aa | [
"Apache-2.0"
] | 1 | 2015-12-31T09:26:30.000Z | 2015-12-31T09:26:30.000Z | tests/components/usb/test_init.py | switschel/core | 0ecca246bdc3028c30bf8ccbf2b4c7f2a8b3f9aa | [
"Apache-2.0"
] | 69 | 2020-08-04T09:03:43.000Z | 2022-03-31T06:13:01.000Z | tests/components/usb/test_init.py | switschel/core | 0ecca246bdc3028c30bf8ccbf2b4c7f2a8b3f9aa | [
"Apache-2.0"
] | 1 | 2020-12-13T08:27:33.000Z | 2020-12-13T08:27:33.000Z | """Tests for the USB Discovery integration."""
import os
import sys
from unittest.mock import MagicMock, patch, sentinel
import pytest
from homeassistant.components import usb
from homeassistant.const import EVENT_HOMEASSISTANT_STARTED
from homeassistant.setup import async_setup_component
from . import slae_sh_device
@pytest.fixture(name="operating_system")
def mock_operating_system():
"""Mock running Home Assistant Operating system."""
with patch(
"homeassistant.components.usb.system_info.async_get_system_info",
return_value={
"hassio": True,
"docker": True,
},
):
yield
@pytest.fixture(name="docker")
def mock_docker():
"""Mock running Home Assistant in docker container."""
with patch(
"homeassistant.components.usb.system_info.async_get_system_info",
return_value={
"hassio": False,
"docker": True,
},
):
yield
@pytest.mark.skipif(
not sys.platform.startswith("linux"),
reason="Only works on linux",
)
async def test_discovered_by_observer_before_started(hass, operating_system):
"""Test a device is discovered by the observer before started."""
async def _mock_monitor_observer_callback(callback):
await hass.async_add_executor_job(
callback, MagicMock(action="add", device_path="/dev/new")
)
def _create_mock_monitor_observer(monitor, callback, name):
hass.async_create_task(_mock_monitor_observer_callback(callback))
return MagicMock()
new_usb = [{"domain": "test1", "vid": "3039", "pid": "3039"}]
mock_comports = [
MagicMock(
device=slae_sh_device.device,
vid=12345,
pid=12345,
serial_number=slae_sh_device.serial_number,
manufacturer=slae_sh_device.manufacturer,
description=slae_sh_device.description,
)
]
with patch(
"homeassistant.components.usb.async_get_usb", return_value=new_usb
), patch(
"homeassistant.components.usb.comports", return_value=mock_comports
), patch(
"pyudev.MonitorObserver", new=_create_mock_monitor_observer
):
assert await async_setup_component(hass, "usb", {"usb": {}})
await hass.async_block_till_done()
with patch("homeassistant.components.usb.comports", return_value=[]), patch.object(
hass.config_entries.flow, "async_init"
) as mock_config_flow:
hass.bus.async_fire(EVENT_HOMEASSISTANT_STARTED)
await hass.async_block_till_done()
assert len(mock_config_flow.mock_calls) == 1
assert mock_config_flow.mock_calls[0][1][0] == "test1"
@pytest.mark.skipif(
not sys.platform.startswith("linux"),
reason="Only works on linux",
)
async def test_removal_by_observer_before_started(hass, operating_system):
"""Test a device is removed by the observer before started."""
async def _mock_monitor_observer_callback(callback):
await hass.async_add_executor_job(
callback, MagicMock(action="remove", device_path="/dev/new")
)
def _create_mock_monitor_observer(monitor, callback, name):
hass.async_create_task(_mock_monitor_observer_callback(callback))
return MagicMock()
new_usb = [{"domain": "test1", "vid": "3039", "pid": "3039"}]
mock_comports = [
MagicMock(
device=slae_sh_device.device,
vid=12345,
pid=12345,
serial_number=slae_sh_device.serial_number,
manufacturer=slae_sh_device.manufacturer,
description=slae_sh_device.description,
)
]
with patch(
"homeassistant.components.usb.async_get_usb", return_value=new_usb
), patch(
"homeassistant.components.usb.comports", return_value=mock_comports
), patch(
"pyudev.MonitorObserver", new=_create_mock_monitor_observer
), patch.object(
hass.config_entries.flow, "async_init"
) as mock_config_flow:
assert await async_setup_component(hass, "usb", {"usb": {}})
await hass.async_block_till_done()
with patch("homeassistant.components.usb.comports", return_value=[]):
hass.bus.async_fire(EVENT_HOMEASSISTANT_STARTED)
await hass.async_block_till_done()
assert len(mock_config_flow.mock_calls) == 0
async def test_discovered_by_websocket_scan(hass, hass_ws_client):
"""Test a device is discovered from websocket scan."""
new_usb = [{"domain": "test1", "vid": "3039", "pid": "3039"}]
mock_comports = [
MagicMock(
device=slae_sh_device.device,
vid=12345,
pid=12345,
serial_number=slae_sh_device.serial_number,
manufacturer=slae_sh_device.manufacturer,
description=slae_sh_device.description,
)
]
with patch("pyudev.Context", side_effect=ImportError), patch(
"homeassistant.components.usb.async_get_usb", return_value=new_usb
), patch(
"homeassistant.components.usb.comports", return_value=mock_comports
), patch.object(
hass.config_entries.flow, "async_init"
) as mock_config_flow:
assert await async_setup_component(hass, "usb", {"usb": {}})
await hass.async_block_till_done()
hass.bus.async_fire(EVENT_HOMEASSISTANT_STARTED)
await hass.async_block_till_done()
ws_client = await hass_ws_client(hass)
await ws_client.send_json({"id": 1, "type": "usb/scan"})
response = await ws_client.receive_json()
assert response["success"]
await hass.async_block_till_done()
assert len(mock_config_flow.mock_calls) == 1
assert mock_config_flow.mock_calls[0][1][0] == "test1"
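The `mock_comports` block is rebuilt identically in each test here; a hypothetical factory (not part of this file) could supply it — the defaults below are placeholders, not the real `slae_sh_device` values:

```python
from unittest.mock import MagicMock

def mock_comport(device="/dev/ttyUSB1", vid=12345, pid=12345,
                 serial_number="serial", manufacturer="mfr",
                 description="desc"):
    # One fake serial port, carrying the fields the discovery code reads.
    return MagicMock(device=device, vid=vid, pid=pid,
                     serial_number=serial_number,
                     manufacturer=manufacturer,
                     description=description)
```

Tests would then build `mock_comports = [mock_comport(device=slae_sh_device.device, ...)]` in one line.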
async def test_discovered_by_websocket_scan_match_vid_only(hass, hass_ws_client):
"""Test a device is discovered from websocket scan only matching vid."""
new_usb = [{"domain": "test1", "vid": "3039"}]
mock_comports = [
MagicMock(
device=slae_sh_device.device,
vid=12345,
pid=12345,
serial_number=slae_sh_device.serial_number,
manufacturer=slae_sh_device.manufacturer,
description=slae_sh_device.description,
)
]
with patch("pyudev.Context", side_effect=ImportError), patch(
"homeassistant.components.usb.async_get_usb", return_value=new_usb
), patch(
"homeassistant.components.usb.comports", return_value=mock_comports
), patch.object(
hass.config_entries.flow, "async_init"
) as mock_config_flow:
assert await async_setup_component(hass, "usb", {"usb": {}})
await hass.async_block_till_done()
hass.bus.async_fire(EVENT_HOMEASSISTANT_STARTED)
await hass.async_block_till_done()
ws_client = await hass_ws_client(hass)
await ws_client.send_json({"id": 1, "type": "usb/scan"})
response = await ws_client.receive_json()
assert response["success"]
await hass.async_block_till_done()
assert len(mock_config_flow.mock_calls) == 1
assert mock_config_flow.mock_calls[0][1][0] == "test1"
async def test_discovered_by_websocket_scan_match_vid_wrong_pid(hass, hass_ws_client):
"""Test a device is discovered from websocket scan only matching vid but wrong pid."""
new_usb = [{"domain": "test1", "vid": "3039", "pid": "9999"}]
mock_comports = [
MagicMock(
device=slae_sh_device.device,
vid=12345,
pid=12345,
serial_number=slae_sh_device.serial_number,
manufacturer=slae_sh_device.manufacturer,
description=slae_sh_device.description,
)
]
with patch("pyudev.Context", side_effect=ImportError), patch(
"homeassistant.components.usb.async_get_usb", return_value=new_usb
), patch(
"homeassistant.components.usb.comports", return_value=mock_comports
), patch.object(
hass.config_entries.flow, "async_init"
) as mock_config_flow:
assert await async_setup_component(hass, "usb", {"usb": {}})
await hass.async_block_till_done()
hass.bus.async_fire(EVENT_HOMEASSISTANT_STARTED)
await hass.async_block_till_done()
ws_client = await hass_ws_client(hass)
await ws_client.send_json({"id": 1, "type": "usb/scan"})
response = await ws_client.receive_json()
assert response["success"]
await hass.async_block_till_done()
assert len(mock_config_flow.mock_calls) == 0
async def test_discovered_by_websocket_no_vid_pid(hass, hass_ws_client):
"""Test a device is discovered from websocket scan with no vid or pid."""
new_usb = [{"domain": "test1", "vid": "3039", "pid": "9999"}]
mock_comports = [
MagicMock(
device=slae_sh_device.device,
vid=None,
pid=None,
serial_number=slae_sh_device.serial_number,
manufacturer=slae_sh_device.manufacturer,
description=slae_sh_device.description,
)
]
with patch("pyudev.Context", side_effect=ImportError), patch(
"homeassistant.components.usb.async_get_usb", return_value=new_usb
), patch(
"homeassistant.components.usb.comports", return_value=mock_comports
), patch.object(
hass.config_entries.flow, "async_init"
) as mock_config_flow:
assert await async_setup_component(hass, "usb", {"usb": {}})
await hass.async_block_till_done()
hass.bus.async_fire(EVENT_HOMEASSISTANT_STARTED)
await hass.async_block_till_done()
ws_client = await hass_ws_client(hass)
await ws_client.send_json({"id": 1, "type": "usb/scan"})
response = await ws_client.receive_json()
assert response["success"]
await hass.async_block_till_done()
assert len(mock_config_flow.mock_calls) == 0
@pytest.mark.parametrize("exception_type", [ImportError, OSError])
async def test_non_matching_discovered_by_scanner_after_started(
hass, exception_type, hass_ws_client
):
"""Test a websocket scan that does not match."""
new_usb = [{"domain": "test1", "vid": "4444", "pid": "4444"}]
mock_comports = [
MagicMock(
device=slae_sh_device.device,
vid=12345,
pid=12345,
serial_number=slae_sh_device.serial_number,
manufacturer=slae_sh_device.manufacturer,
description=slae_sh_device.description,
)
]
with patch("pyudev.Context", side_effect=exception_type), patch(
"homeassistant.components.usb.async_get_usb", return_value=new_usb
), patch(
"homeassistant.components.usb.comports", return_value=mock_comports
), patch.object(
hass.config_entries.flow, "async_init"
) as mock_config_flow:
assert await async_setup_component(hass, "usb", {"usb": {}})
await hass.async_block_till_done()
hass.bus.async_fire(EVENT_HOMEASSISTANT_STARTED)
await hass.async_block_till_done()
ws_client = await hass_ws_client(hass)
await ws_client.send_json({"id": 1, "type": "usb/scan"})
response = await ws_client.receive_json()
assert response["success"]
await hass.async_block_till_done()
assert len(mock_config_flow.mock_calls) == 0
@pytest.mark.skipif(
not sys.platform.startswith("linux"),
reason="Only works on linux",
)
async def test_not_discovered_by_observer_before_started_on_docker(hass, docker):
"""Test a device is not discovered since observer is not running on bare docker."""
async def _mock_monitor_observer_callback(callback):
await hass.async_add_executor_job(
callback, MagicMock(action="add", device_path="/dev/new")
)
def _create_mock_monitor_observer(monitor, callback, name):
hass.async_create_task(_mock_monitor_observer_callback(callback))
return MagicMock()
new_usb = [{"domain": "test1", "vid": "3039", "pid": "3039"}]
mock_comports = [
MagicMock(
device=slae_sh_device.device,
vid=12345,
pid=12345,
serial_number=slae_sh_device.serial_number,
manufacturer=slae_sh_device.manufacturer,
description=slae_sh_device.description,
)
]
with patch(
"homeassistant.components.usb.async_get_usb", return_value=new_usb
), patch(
"homeassistant.components.usb.comports", return_value=mock_comports
), patch(
"pyudev.MonitorObserver", new=_create_mock_monitor_observer
):
assert await async_setup_component(hass, "usb", {"usb": {}})
await hass.async_block_till_done()
with patch("homeassistant.components.usb.comports", return_value=[]), patch.object(
hass.config_entries.flow, "async_init"
) as mock_config_flow:
hass.bus.async_fire(EVENT_HOMEASSISTANT_STARTED)
await hass.async_block_till_done()
assert len(mock_config_flow.mock_calls) == 0
def test_get_serial_by_id_no_dir():
"""Test serial by id conversion if there's no /dev/serial/by-id."""
p1 = patch("os.path.isdir", MagicMock(return_value=False))
p2 = patch("os.scandir")
with p1 as is_dir_mock, p2 as scan_mock:
res = usb.get_serial_by_id(sentinel.path)
assert res is sentinel.path
assert is_dir_mock.call_count == 1
assert scan_mock.call_count == 0
def test_get_serial_by_id():
"""Test serial by id conversion."""
p1 = patch("os.path.isdir", MagicMock(return_value=True))
p2 = patch("os.scandir")
def _realpath(path):
if path is sentinel.matched_link:
return sentinel.path
return sentinel.serial_link_path
p3 = patch("os.path.realpath", side_effect=_realpath)
with p1 as is_dir_mock, p2 as scan_mock, p3:
res = usb.get_serial_by_id(sentinel.path)
assert res is sentinel.path
assert is_dir_mock.call_count == 1
assert scan_mock.call_count == 1
entry1 = MagicMock(spec_set=os.DirEntry)
entry1.is_symlink.return_value = True
entry1.path = sentinel.some_path
entry2 = MagicMock(spec_set=os.DirEntry)
entry2.is_symlink.return_value = False
entry2.path = sentinel.other_path
entry3 = MagicMock(spec_set=os.DirEntry)
entry3.is_symlink.return_value = True
entry3.path = sentinel.matched_link
scan_mock.return_value = [entry1, entry2, entry3]
res = usb.get_serial_by_id(sentinel.path)
assert res is sentinel.matched_link
assert is_dir_mock.call_count == 2
assert scan_mock.call_count == 2
def test_human_readable_device_name():
"""Test human readable device name includes the passed data."""
name = usb.human_readable_device_name(
"/dev/null",
"612020FD",
"Silicon Labs",
"HubZ Smart Home Controller - HubZ Z-Wave Com Port",
"10C4",
"8A2A",
)
assert "/dev/null" in name
assert "612020FD" in name
assert "Silicon Labs" in name
assert "HubZ Smart Home Controller - HubZ Z-Wave Com Port"[:26] in name
assert "10C4" in name
assert "8A2A" in name
name = usb.human_readable_device_name(
"/dev/null",
"612020FD",
"Silicon Labs",
None,
"10C4",
"8A2A",
)
assert "/dev/null" in name
assert "612020FD" in name
assert "Silicon Labs" in name
assert "10C4" in name
assert "8A2A" in name
# File: src/book/api/views/__init__.py (repo: DmytroKaminskiy/ltt, license: Apache-2.0)
from .book import * # noqa
from .bookrent import * # noqa
from .category import * # noqa
from .rentdayhistory import * # noqa
# File: dash/routes/manager/verification.py (repo: AllinolCP/dash, license: MIT)
from sanic import Blueprint, response
from dash import env
from urllib.parse import parse_qs
from sqlalchemy import func
from dash.data.penguin import Penguin
from dash.routes.manager.login import login_auth
verification = Blueprint('verification', url_prefix='/verify')
@verification.get('/')
@login_auth()
async def verify_page(_):
return response.redirect('/manager/verify/en')
@verification.get('/<lang>')
@login_auth()
async def verify_language_page(request, lang):
template = env.get_template('manager/verify.html')
data = await Penguin.query.where(func.lower(Penguin.username) == request['session']['username']).gino.first()
if lang == 'de':
unverified_penguins = await Penguin.query.where(
(Penguin.approval_de == False) & (Penguin.rejection_de == False)
).gino.all()
elif lang == 'es':
unverified_penguins = await Penguin.query.where(
(Penguin.approval_es == False) & (Penguin.rejection_es == False)
).gino.all()
elif lang == 'fr':
unverified_penguins = await Penguin.query.where(
(Penguin.approval_fr == False) & (Penguin.rejection_fr == False)
).gino.all()
elif lang == 'pt':
unverified_penguins = await Penguin.query.where(
(Penguin.approval_pt == False) & (Penguin.rejection_pt == False)
).gino.all()
elif lang == 'ru':
unverified_penguins = await Penguin.query.where(
(Penguin.approval_ru == False) & (Penguin.rejection_ru == False)
).gino.all()
else:
lang = 'en'
unverified_penguins = await Penguin.query.where(
(Penguin.approval_en == False) & (Penguin.rejection_en == False)
).gino.all()
unverified_penguins = get_paginated_result(unverified_penguins)
page = template.render(
success_message='',
error_message='',
unverified_penguins=unverified_penguins,
penguin=data,
language=lang
)
return response.html(page)
@verification.post('/search')
@login_auth()
async def search_username(request):
template = env.get_template('manager/verify.html')
query_string = request.body.decode('UTF-8')
post_data = parse_qs(query_string)
username = post_data.get('username', [None])[0]
language = post_data.get('language', [None])[0]
data = await Penguin.query.where(func.lower(Penguin.username) == request['session']['username']).gino.first()
if not language:
return response.text('You must provide a valid language.')
elif not username:
return response.text('You must provide a valid username.')
if language == 'en':
unverified_penguins = await Penguin.query.where(
(Penguin.approval_en == False) & (Penguin.rejection_en == False)
& (Penguin.username.ilike(f"%{username}%"))
).gino.all()
elif language == 'de':
unverified_penguins = await Penguin.query.where(
(Penguin.approval_de == False) & (Penguin.rejection_de == False)
& (Penguin.username.ilike(f"%{username}%"))
).gino.all()
elif language == 'es':
unverified_penguins = await Penguin.query.where(
(Penguin.approval_es == False) & (Penguin.rejection_es == False)
& (Penguin.username.ilike(f"%{username}%"))
).gino.all()
elif language == 'fr':
unverified_penguins = await Penguin.query.where(
(Penguin.approval_fr == False) & (Penguin.rejection_fr == False)
& (Penguin.username.ilike(f"%{username}%"))
).gino.all()
elif language == 'pt':
unverified_penguins = await Penguin.query.where(
(Penguin.approval_pt == False) & (Penguin.rejection_pt == False)
& (Penguin.username.ilike(f"%{username}%"))
).gino.all()
elif language == 'ru':
unverified_penguins = await Penguin.query.where(
(Penguin.approval_ru == False) & (Penguin.rejection_ru == False)
& (Penguin.username.ilike(f"%{username}%"))
).gino.all()
else:
language = 'en'
unverified_penguins = await Penguin.query.where(
(Penguin.approval_en == False) & (Penguin.rejection_en == False)
& (Penguin.username.ilike(f"%{username}%"))
).gino.all()
unverified_penguins = get_paginated_result(unverified_penguins)
page = template.render(
success_message=f"Searched usernames similar to {username}.",
error_message='',
unverified_penguins=unverified_penguins,
penguin=data,
language=language
)
return response.html(page)
@verification.post('/approve/<penguin_id>')
@login_auth()
async def approve_request(request, penguin_id):
template = env.get_template('manager/verify.html')
query_string = request.body.decode('UTF-8')
post_data = parse_qs(query_string)
language = post_data.get('language', [None])[0]
data = await Penguin.query.where(func.lower(Penguin.username) == request['session']['username']).gino.first()
penguin = await Penguin.query.where(Penguin.id == int(penguin_id)).gino.first()
if not language:
return response.text('You must provide a valid language.')
if not penguin:
return response.text('You must provide a valid penguin ID.')
if language == 'en':
await Penguin.update.values(approval_en=True).where(Penguin.id == penguin.id).gino.status()
unverified_penguins = await Penguin.query.where(
(Penguin.approval_en == False) & (Penguin.rejection_en == False)
).gino.all()
elif language == 'de':
await Penguin.update.values(approval_de=True).where(Penguin.id == penguin.id).gino.status()
unverified_penguins = await Penguin.query.where(
(Penguin.approval_de == False) & (Penguin.rejection_de == False)
).gino.all()
elif language == 'es':
await Penguin.update.values(approval_es=True).where(Penguin.id == penguin.id).gino.status()
unverified_penguins = await Penguin.query.where(
(Penguin.approval_es == False) & (Penguin.rejection_es == False)
).gino.all()
elif language == 'fr':
await Penguin.update.values(approval_fr=True).where(Penguin.id == penguin.id).gino.status()
unverified_penguins = await Penguin.query.where(
(Penguin.approval_fr == False) & (Penguin.rejection_fr == False)
).gino.all()
elif language == 'pt':
await Penguin.update.values(approval_pt=True).where(Penguin.id == penguin.id).gino.status()
unverified_penguins = await Penguin.query.where(
(Penguin.approval_pt == False) & (Penguin.rejection_pt == False)
).gino.all()
elif language == 'ru':
await Penguin.update.values(approval_ru=True).where(Penguin.id == penguin.id).gino.status()
unverified_penguins = await Penguin.query.where(
(Penguin.approval_ru == False) & (Penguin.rejection_ru == False)
).gino.all()
else:
language = 'en'
unverified_penguins = await Penguin.query.where(
(Penguin.approval_en == False) & (Penguin.rejection_en == False)
).gino.all()
unverified_penguins = get_paginated_result(unverified_penguins)
page = template.render(
success_message=f"Successfully approved {penguin.username}'s username.",
error_message='',
unverified_penguins=unverified_penguins,
penguin=data,
language=language
)
return response.html(page)
@verification.post('/reject/<penguin_id>')
@login_auth()
async def reject_request(request, penguin_id):
template = env.get_template('manager/verify.html')
query_string = request.body.decode('UTF-8')
post_data = parse_qs(query_string)
language = post_data.get('language', [None])[0]
data = await Penguin.query.where(func.lower(Penguin.username) == request['session']['username']).gino.first()
penguin = await Penguin.query.where(Penguin.id == int(penguin_id)).gino.first()
if not language:
return response.text('You must provide a valid language.')
if not penguin:
return response.text('You must provide a valid penguin ID.')
if language == 'en':
await Penguin.update.values(rejection_en=True).where(Penguin.id == penguin.id).gino.status()
unverified_penguins = await Penguin.query.where(
(Penguin.approval_en == False) & (Penguin.rejection_en == False)
).gino.all()
elif language == 'de':
await Penguin.update.values(rejection_de=True).where(Penguin.id == penguin.id).gino.status()
unverified_penguins = await Penguin.query.where(
(Penguin.approval_de == False) & (Penguin.rejection_de == False)
).gino.all()
elif language == 'es':
await Penguin.update.values(rejection_es=True).where(Penguin.id == penguin.id).gino.status()
unverified_penguins = await Penguin.query.where(
(Penguin.approval_es == False) & (Penguin.rejection_es == False)
).gino.all()
elif language == 'fr':
await Penguin.update.values(rejection_fr=True).where(Penguin.id == penguin.id).gino.status()
unverified_penguins = await Penguin.query.where(
(Penguin.approval_fr == False) & (Penguin.rejection_fr == False)
).gino.all()
elif language == 'pt':
await Penguin.update.values(rejection_pt=True).where(Penguin.id == penguin.id).gino.status()
unverified_penguins = await Penguin.query.where(
(Penguin.approval_pt == False) & (Penguin.rejection_pt == False)
).gino.all()
elif language == 'ru':
await Penguin.update.values(rejection_ru=True).where(Penguin.id == penguin.id).gino.status()
unverified_penguins = await Penguin.query.where(
(Penguin.approval_ru == False) & (Penguin.rejection_ru == False)
).gino.all()
else:
language = 'en'
unverified_penguins = await Penguin.query.where(
(Penguin.approval_en == False) & (Penguin.rejection_en == False)
).gino.all()
unverified_penguins = get_paginated_result(unverified_penguins)
page = template.render(
success_message=f"Successfully rejected {penguin.username}'s username.",
error_message='',
unverified_penguins=unverified_penguins,
penguin=data,
language=language
)
return response.html(page)
def get_paginated_result(results):
paginated_results = {}
current_count = 0
pagination_limit = current_count + 10
page = 1
for result in results:
if current_count == 0:
paginated_results[page] = []
paginated_results[page].append(result)
elif current_count == pagination_limit:
page += 1
pagination_limit = current_count + 10
paginated_results[page] = []
paginated_results[page].append(result)
else:
paginated_results[page].append(result)
current_count += 1
return paginated_results
# File: lua/refactoring/tests/refactor/106/py/global-scope/extract.expected.py (repo: willruggiano/refactoring.nvim, license: MIT)
def foo():
print("o hai")
print("o hai")
print("o hai")
print("o hai")
foo()
# File: src/ggrc/migrations/versions/20170307131717_1a5ec1ed04af_fix_audit_context.py (repo: Killswitchz/ggrc-core, licenses: ECL-2.0, Apache-2.0)
# Copyright (C) 2017 Google Inc.
# Licensed under http://www.apache.org/licenses/LICENSE-2.0 <see LICENSE file>
"""
Fix audit context
Create Date: 2017-03-07 13:17:17.472314
"""
# disable Invalid constant name pylint warning for mandatory Alembic variables.
# pylint: disable=invalid-name
from alembic import op
# revision identifiers, used by Alembic.
revision = '1a5ec1ed04af'
down_revision = '3f615f3b5192'
def upgrade():
"""Upgrade database schema and/or data, creating a new revision."""
# pylint: disable=too-many-statements
# clear bad audit ids, clearing instead of just fixing the bad audit ids will
# make it easier to propagate this change to all of its related objects that
# also have an invalid context id.
# Clear audit_
sql = """
UPDATE audits AS au
LEFT JOIN contexts AS c ON
au.context_id = c.id
SET au.context_id = NULL
WHERE c.related_object_type != "Audit"
"""
op.execute(sql)
# Clear all bad contexts from any audit related object.
# Clear snapshot_
sql = """
UPDATE snapshots AS s
JOIN audits AS au ON
s.parent_id = au.id AND
s.parent_type = "Audit"
SET s.context_id = NULL
WHERE au.context_id IS NULL
"""
op.execute(sql)
# Clear issue_
sql = """
UPDATE issues AS i
JOIN relationships AS r ON r.source_id = i.id
AND r.source_type = 'Issue' AND r.destination_type = 'Audit'
JOIN audits AS au ON r.destination_id = au.id
SET i.context_id = NULL
WHERE au.context_id is NULL;
"""
op.execute(sql)
sql = """
UPDATE issues AS i
JOIN relationships AS r ON r.destination_id = i.id
AND r.destination_type = 'Issue' AND r.source_type = 'Audit'
JOIN audits AS au ON r.source_id = au.id
SET i.context_id = NULL
WHERE au.context_id is NULL
"""
op.execute(sql)
# Clear assessment_
sql = """
UPDATE assessments AS a
JOIN relationships AS r ON r.source_id = a.id
AND r.source_type = 'Assessment' AND r.destination_type = 'Audit'
JOIN audits AS au ON r.destination_id = au.id
SET a.context_id = NULL
WHERE au.context_id is NULL;
"""
op.execute(sql)
sql = """
UPDATE assessments AS a
JOIN relationships AS r ON r.destination_id = a.id
AND r.destination_type = 'Assessment' AND r.source_type = 'Audit'
JOIN audits AS au ON r.source_id = au.id
SET a.context_id = NULL
WHERE au.context_id is NULL
"""
op.execute(sql)
# Clear object_document
sql = """
UPDATE object_documents AS od
JOIN assessments AS a ON od.documentable_id = a.id
SET od.context_id = NULL
WHERE documentable_type = 'Assessment' AND a.context_id IS NULL;
"""
op.execute(sql)
sql = """
UPDATE object_documents AS od
JOIN issues AS i ON od.documentable_id = i.id
SET od.context_id = NULL
WHERE documentable_type = 'Issue' AND i.context_id IS NULL;
"""
op.execute(sql)
# Clear document_
sql = """
UPDATE documents AS d
JOIN object_documents AS od ON od.document_id = d.id
AND od.documentable_type in ("Assessment", "Issue")
SET d.context_id = NULL
WHERE od.context_id IS NULL
"""
op.execute(sql)
sql = """
UPDATE documents AS d
JOIN relationships AS r ON
r.source_id = d.id AND
r.source_type = "Document" AND
r.destination_type = "Assessment"
JOIN assessments AS a ON
r.destination_id = a.id
SET d.context_id = NULL
WHERE a.context_id IS NULL
"""
op.execute(sql)
sql = """
UPDATE documents AS d
JOIN relationships AS r ON
r.destination_id = d.id AND
r.destination_type = "Document" AND
r.source_type = "Assessment"
JOIN assessments AS a ON
r.source_id = a.id
SET d.context_id = NULL
WHERE a.context_id IS NULL
"""
op.execute(sql)
# Clear assessment_template
sql = """
UPDATE assessment_templates AS at
JOIN relationships AS r ON
r.source_id = at.id AND
r.source_type = "AssessmentTemplate" AND
r.destination_type = "Audit"
JOIN audits AS au ON
r.destination_id = au.id
SET at.context_id = NULL
WHERE au.context_id IS NULL
"""
op.execute(sql)
sql = """
UPDATE assessment_templates AS at
JOIN relationships AS r ON
r.destination_id = at.id AND
r.destination_type = "AssessmentTemplate" AND
r.source_type = "Audit"
JOIN audits AS au ON
r.source_id = au.id
SET at.context_id = NULL
WHERE au.context_id IS NULL
"""
op.execute(sql)
#############################################################################
# Now we will apply the audit context fix and propagate the change to all of
# its related objects
#############################################################################
# Set audit_
sql = """
UPDATE audits AS au
JOIN contexts AS c ON
au.id = c.related_object_id AND
c.related_object_type = "Audit"
SET au.context_id = c.id
      WHERE au.context_id IS NULL
"""
op.execute(sql)
# Set snapshot_
sql = """
UPDATE snapshots AS s
JOIN audits AS au ON
s.parent_id = au.id AND
s.parent_type = "Audit"
SET s.context_id = au.context_id
WHERE s.context_id IS NULL
"""
op.execute(sql)
# Set issue_
sql = """
UPDATE issues AS i
JOIN relationships AS r ON r.source_id = i.id
AND r.source_type = 'Issue' AND r.destination_type = 'Audit'
JOIN audits AS au ON r.destination_id = au.id
SET i.context_id = au.context_id
WHERE i.context_id is NULL;
"""
op.execute(sql)
sql = """
UPDATE issues AS i
JOIN relationships AS r ON r.destination_id = i.id
AND r.destination_type = 'Issue' AND r.source_type = 'Audit'
JOIN audits AS au ON r.source_id = au.id
SET i.context_id = au.context_id
WHERE i.context_id is NULL;
"""
op.execute(sql)
# Set assessment_
sql = """
UPDATE assessments AS a
JOIN relationships AS r ON r.source_id = a.id
AND r.source_type = 'Assessment' AND r.destination_type = 'Audit'
JOIN audits AS au ON r.destination_id = au.id
SET a.context_id = au.context_id
WHERE a.context_id is NULL;
"""
op.execute(sql)
sql = """
UPDATE assessments AS a
JOIN relationships AS r ON r.destination_id = a.id
AND r.destination_type = 'Assessment' AND r.source_type = 'Audit'
JOIN audits AS au ON r.source_id = au.id
SET a.context_id = au.context_id
WHERE a.context_id is NULL;
"""
op.execute(sql)
# Set object_document
sql = """
UPDATE object_documents AS od
JOIN assessments AS a ON od.documentable_id = a.id
SET od.context_id = a.context_id
WHERE documentable_type = 'Assessment' AND od.context_id IS NULL;
"""
op.execute(sql)
sql = """
UPDATE object_documents AS od
JOIN issues AS i ON od.documentable_id = i.id
SET od.context_id = i.context_id
WHERE documentable_type = 'Issue' AND od.context_id IS NULL;
"""
op.execute(sql)
# Set document_
sql = """
UPDATE documents AS d
JOIN object_documents AS od ON od.document_id = d.id
AND od.documentable_type in ("Assessment", "Issue")
SET d.context_id = od.context_id
WHERE d.context_id IS NULL
"""
op.execute(sql)
sql = """
UPDATE documents AS d
JOIN relationships AS r ON
r.source_id = d.id AND
r.source_type = "Document" AND
r.destination_type = "Assessment"
JOIN assessments AS a ON
r.destination_id = a.id
SET d.context_id = a.context_id
WHERE d.context_id IS NULL
"""
op.execute(sql)
sql = """
UPDATE documents AS d
JOIN relationships AS r ON
r.destination_id = d.id AND
r.destination_type = "Document" AND
r.source_type = "Assessment"
JOIN assessments AS a ON
r.source_id = a.id
SET d.context_id = a.context_id
WHERE d.context_id IS NULL
"""
op.execute(sql)
# Set assessment_template
sql = """
UPDATE assessment_templates AS at
JOIN relationships AS r ON
r.source_id = at.id AND
r.source_type = "AssessmentTemplate" AND
r.destination_type = "Audit"
JOIN audits AS au ON
r.destination_id = au.id
SET at.context_id = au.context_id
WHERE at.context_id IS NULL
"""
op.execute(sql)
sql = """
UPDATE assessment_templates AS at
JOIN relationships AS r ON
r.destination_id = at.id AND
r.destination_type = "AssessmentTemplate" AND
r.source_type = "Audit"
JOIN audits AS au ON
r.source_id = au.id
SET at.context_id = au.context_id
WHERE at.context_id IS NULL
"""
op.execute(sql)
def downgrade():
"""Downgrade database schema and/or data back to the previous revision."""
pass
# File: app/test/integrationtest/test_sensor.py (repo: michalkoziara/IoT-RESTful-Webservice, license: bzip2-1.0.6)
import json
from datetime import datetime
from sqlalchemy import and_
from app.main.model.deleted_device import DeletedDevice
from app.main.model.sensor import Sensor
from app.main.model.unconfigured_device import UnconfiguredDevice
from app.main.repository.sensor_repository import SensorRepository
from app.main.util.auth_utils import Auth
from app.main.util.constants import Constants
def test_get_sensor_info_should_return_sensor_info_when_valid_request(
client,
insert_device_group,
insert_sensor,
insert_user,
get_user_group_default_values,
insert_user_group,
insert_sensor_type,
insert_sensor_reading,
get_sensor_type_default_values):
content_type = 'application/json'
device_group = insert_device_group()
user = insert_user()
user_group_values = get_user_group_default_values()
user_group_values['users'] = [user]
insert_user_group(user_group_values)
sensor_type_values = get_sensor_type_default_values()
sensor_type_values['reading_type'] = 'Decimal'
sensor_type = insert_sensor_type(sensor_type_values)
sensor = insert_sensor()
sensor_reading = insert_sensor_reading()
response = client.get(
'/api/hubs/' + device_group.product_key + '/sensors/' + sensor.device_key,
content_type=content_type,
headers={
'Authorization': 'Bearer ' + Auth.encode_auth_token(user.id, False)
}
)
assert response is not None
assert response.status_code == 200
assert response.content_type == content_type
response_data = json.loads(response.data.decode())
assert response_data is not None
assert response_data
assert response_data['name'] == sensor.name
assert response_data['isUpdated'] == sensor.is_updated
assert response_data['isActive'] == sensor.is_active
assert response_data['isAssigned'] == sensor.is_assigned
assert response_data['deviceKey'] == sensor.device_key
assert response_data['sensorTypeName'] == sensor_type.name
assert response_data['readingValue'] == sensor_reading.value
def test_get_sensor_info_should_return_sensor_info_when_valid_request_and_user_is_admin(
client,
insert_device_group,
insert_sensor,
insert_admin,
get_user_group_default_values,
insert_user_group,
insert_sensor_type,
insert_sensor_reading,
get_sensor_type_default_values):
content_type = 'application/json'
device_group = insert_device_group()
admin = insert_admin()
user_group_values = get_user_group_default_values()
insert_user_group(user_group_values)
sensor_type_values = get_sensor_type_default_values()
sensor_type_values['reading_type'] = 'Decimal'
sensor_type = insert_sensor_type(sensor_type_values)
sensor = insert_sensor()
sensor_reading = insert_sensor_reading()
response = client.get(
'/api/hubs/' + device_group.product_key + '/sensors/' + sensor.device_key,
content_type=content_type,
headers={
'Authorization': 'Bearer ' + Auth.encode_auth_token(admin.id, True)
}
)
assert response is not None
assert response.status_code == 200
assert response.content_type == content_type
response_data = json.loads(response.data.decode())
assert response_data is not None
assert response_data
assert response_data['name'] == sensor.name
assert response_data['isUpdated'] == sensor.is_updated
assert response_data['isActive'] == sensor.is_active
assert response_data['isAssigned'] == sensor.is_assigned
assert response_data['deviceKey'] == sensor.device_key
assert response_data['sensorTypeName'] == sensor_type.name
assert response_data['readingValue'] == sensor_reading.value
def test_get_sensor_info_should_not_return_sensor_info_when_bad_product_key(
client,
insert_device_group,
insert_sensor,
insert_user,
get_user_group_default_values,
insert_sensor_type):
content_type = 'application/json'
device_group = insert_device_group()
user = insert_user()
user_group_values = get_user_group_default_values()
user_group_values['users'] = [user]
insert_sensor_type()
sensor = insert_sensor()
response = client.get(
'/api/hubs/' + device_group.product_key + "test" + '/sensors/' + sensor.device_key,
content_type=content_type,
headers={
'Authorization': 'Bearer ' + Auth.encode_auth_token(user.id, False)
}
)
assert response is not None
assert response.status_code == 400
response_data = json.loads(response.data.decode())
error_message = Constants.RESPONSE_MESSAGE_PRODUCT_KEY_NOT_FOUND
assert error_message == response_data['errorMessage']
def test_get_sensor_info_should_not_return_sensor_info_when_bad_device_key(
client,
insert_device_group,
insert_sensor,
insert_user,
get_user_group_default_values,
insert_sensor_type):
content_type = 'application/json'
device_group = insert_device_group()
user = insert_user()
user_group_values = get_user_group_default_values()
user_group_values['users'] = [user]
sensor = insert_sensor()
response = client.get(
'/api/hubs/' + device_group.product_key + '/sensors/' + sensor.device_key + '1',
content_type=content_type,
headers={
'Authorization': 'Bearer ' + Auth.encode_auth_token(user.id, False)
}
)
assert response is not None
assert response.status_code == 400
response_data = json.loads(response.data.decode())
error_message = Constants.RESPONSE_MESSAGE_DEVICE_KEY_NOT_FOUND
assert error_message == response_data['errorMessage']
def test_get_sensor_readings_should_return_sensors_readings_when_valid_request(
client,
insert_device_group,
get_sensor_type_default_values,
insert_sensor_type,
insert_sensor,
insert_user,
get_user_group_default_values,
insert_user_group,
get_sensor_reading_default_values,
insert_sensor_readings):
content_type = 'application/json'
device_group = insert_device_group()
user = insert_user()
user_group_values = get_user_group_default_values()
user_group_values['users'] = [user]
user_group = insert_user_group(user_group_values)
sensor_type_values = get_sensor_type_default_values()
sensor_type_values['reading_type'] = 'Decimal'
insert_sensor_type(sensor_type_values)
sensor = insert_sensor()
sensors_first_reading = get_sensor_reading_default_values()
sensors_second_reading = get_sensor_reading_default_values()
sensors_second_reading['id'] += 1
sensors_second_reading['value'] += 0.1
sensors_second_reading['date'] = datetime(2019, 8, 5, 8, 10, 10, 10)
sensor_readings = insert_sensor_readings([sensors_first_reading, sensors_second_reading])
expected_values = [
{
'value': sensor_readings[1].value,
'date': str(sensor_readings[1].date)
},
{
'value': sensor_readings[0].value,
'date': str(sensor_readings[0].date)
}
]
assert sensor.user_group_id == user_group.id
assert user_group.device_group_id == device_group.id
response = client.get(
'/api/hubs/' + device_group.product_key + '/sensors/' + sensor.device_key + '/readings',
content_type=content_type,
headers={
'Authorization': 'Bearer ' + Auth.encode_auth_token(user.id, False)
}
)
assert response is not None
assert response.status_code == 200
assert response.content_type == content_type
response_data = json.loads(response.data.decode())
assert response_data is not None
assert response_data['sensorName'] == sensor.name
assert response_data['values'] == expected_values
def test_get_list_of_sensors_should_return_list_of_sensors_info_when_user_is_admin_of_device_group(
client,
insert_device_group,
get_sensor_default_values,
insert_sensor,
insert_admin,
get_user_group_default_values,
insert_user_group):
content_type = 'application/json'
device_group = insert_device_group()
admin = insert_admin()
device_group.admin_id = admin.id
user_group_values = get_user_group_default_values()
user_group_values['name'] = 'Master'
user_group = insert_user_group(user_group_values)
first_sensor_values = get_sensor_default_values()
second_sensor_values = get_sensor_default_values()
third_sensor_values = get_sensor_default_values()
first_sensor_values['name'] = 'first'
second_sensor_values['name'] = 'second'
third_sensor_values['name'] = 'third'
first_sensor_values['user_group_id'] = user_group.id
second_sensor_values['user_group_id'] = None
third_sensor_values['user_group_id'] = None
second_sensor_values['id'] += 1
third_sensor_values['id'] += 2
second_sensor_values['device_key'] += '1'
third_sensor_values['device_key'] += '2'
first_sensor = insert_sensor(first_sensor_values)
second_sensor = insert_sensor(second_sensor_values)
third_sensor = insert_sensor(third_sensor_values)
expected_output_values = [
{
'name': first_sensor.name,
'deviceKey': first_sensor.device_key,
'isActive': first_sensor.is_active
},
{
'name': second_sensor.name,
'deviceKey': second_sensor.device_key,
'isActive': second_sensor.is_active
},
{
'name': third_sensor.name,
'deviceKey': third_sensor.device_key,
'isActive': third_sensor.is_active
}
]
response = client.get(
'/api/hubs/' + device_group.product_key + '/sensors',
content_type=content_type,
headers={
'Authorization': 'Bearer ' + Auth.encode_auth_token(admin.id, True)
}
)
assert response is not None
assert response.status_code == 200
assert response.content_type == content_type
response_data = json.loads(response.data.decode())
assert response_data is not None
assert response_data == expected_output_values
def test_get_list_of_unassigned_sensors_should_return_list_of_sensors_info_when_valid_request_and_user_is_not_admin(
client,
insert_device_group,
get_sensor_default_values,
insert_sensor,
insert_user,
get_user_group_default_values,
insert_user_group):
content_type = 'application/json'
device_group = insert_device_group()
user = insert_user()
user_group_values = get_user_group_default_values()
user_group_values['users'] = [user]
user_group_values['name'] = 'Master'
user_group = insert_user_group(user_group_values)
assert user in user_group.users
first_sensor_values = get_sensor_default_values()
second_sensor_values = get_sensor_default_values()
third_sensor_values = get_sensor_default_values()
first_sensor_values['name'] = 'first'
second_sensor_values['name'] = 'second'
third_sensor_values['name'] = 'third'
first_sensor_values['user_group_id'] = user_group.id
second_sensor_values['user_group_id'] = None
third_sensor_values['user_group_id'] = None
second_sensor_values['id'] += 1
third_sensor_values['id'] += 2
second_sensor_values['device_key'] += '1'
third_sensor_values['device_key'] += '2'
insert_sensor(first_sensor_values)
second_sensor = insert_sensor(second_sensor_values)
third_sensor = insert_sensor(third_sensor_values)
expected_output_values = [
{
'name': second_sensor.name,
'deviceKey': second_sensor.device_key,
'isActive': second_sensor.is_active
},
{
'name': third_sensor.name,
'deviceKey': third_sensor.device_key,
'isActive': third_sensor.is_active
}
]
response = client.get(
'/api/hubs/' + device_group.product_key + '/sensors/unassigned',
content_type=content_type,
headers={
'Authorization': 'Bearer ' + Auth.encode_auth_token(user.id, False)
}
)
assert response is not None
assert response.status_code == 200
assert response.content_type == content_type
response_data = json.loads(response.data.decode())
assert response_data is not None
assert response_data == expected_output_values
def test_get_list_of_unassigned_sensors_should_return_list_of_sensors_info_when_valid_request_and_user_is_admin(
client,
insert_device_group,
get_sensor_default_values,
insert_sensor,
insert_admin,
get_user_group_default_values,
insert_user_group):
content_type = 'application/json'
device_group = insert_device_group()
admin = insert_admin()
user_group = insert_user_group()
assert device_group.admin_id == admin.id
first_sensor_values = get_sensor_default_values()
second_sensor_values = get_sensor_default_values()
third_sensor_values = get_sensor_default_values()
first_sensor_values['name'] = 'first'
second_sensor_values['name'] = 'second'
third_sensor_values['name'] = 'third'
first_sensor_values['user_group_id'] = user_group.id
second_sensor_values['user_group_id'] = None
third_sensor_values['user_group_id'] = None
second_sensor_values['id'] += 1
third_sensor_values['id'] += 2
second_sensor_values['device_key'] += '1'
third_sensor_values['device_key'] += '2'
insert_sensor(first_sensor_values)
second_sensor = insert_sensor(second_sensor_values)
third_sensor = insert_sensor(third_sensor_values)
expected_output_values = [
{
'name': second_sensor.name,
'deviceKey': second_sensor.device_key,
'isActive': second_sensor.is_active
},
{
'name': third_sensor.name,
'deviceKey': third_sensor.device_key,
'isActive': third_sensor.is_active
}
]
response = client.get(
'/api/hubs/' + device_group.product_key + '/sensors/unassigned',
content_type=content_type,
headers={
'Authorization': 'Bearer ' + Auth.encode_auth_token(admin.id, True)
}
)
assert response is not None
assert response.status_code == 200
assert response.content_type == content_type
response_data = json.loads(response.data.decode())
assert response_data is not None
assert response_data == expected_output_values
def test_get_list_of_unassigned_sensors_should_return_error_message_when_valid_request_and_user_is_not_in_master_user_group(
client,
insert_device_group,
get_sensor_default_values,
insert_sensor,
insert_user,
get_user_group_default_values,
insert_user_group):
content_type = 'application/json'
device_group = insert_device_group()
user = insert_user()
user_group_values = get_user_group_default_values()
user_group_values['name'] = 'Master'
user_group = insert_user_group(user_group_values)
assert user not in user_group.users
response = client.get(
'/api/hubs/' + device_group.product_key + '/sensors/unassigned',
content_type=content_type,
headers={
'Authorization': 'Bearer ' + Auth.encode_auth_token(user.id, False)
}
)
assert response is not None
assert response.status_code == 403
assert response.content_type == content_type
response_data = json.loads(response.data.decode())
assert response_data is not None
assert response_data['errorMessage'] == Constants.RESPONSE_MESSAGE_USER_DOES_NOT_HAVE_PRIVILEGES
def test_add_sensor_to_device_group_should_add_sensor_to_device_group_when_valid_request(
client,
insert_device_group,
insert_admin,
insert_sensor_type,
insert_unconfigured_device
):
content_type = 'application/json'
device_group = insert_device_group()
admin = insert_admin()
sensor_type = insert_sensor_type()
unconfigured_device = insert_unconfigured_device()
assert device_group.sensors == []
assert device_group.admin_id == admin.id
response = client.post(
'/api/hubs/' + device_group.product_key + '/sensors',
data=json.dumps(
{
"deviceKey": unconfigured_device.device_key,
"password": device_group.password,
"sensorName": 'test_sensor_name',
"sensorTypeName": sensor_type.name
}
),
content_type=content_type,
headers={
'Authorization': 'Bearer ' + Auth.encode_auth_token(admin.id, True)
}
)
assert response is not None
assert response.status_code == 201
assert response.content_type == content_type
sensor = Sensor.query.filter(
and_(
Sensor.device_key == unconfigured_device.device_key,
Sensor.device_group_id == device_group.id
)
).first()
deleted_unconfigured_device = UnconfiguredDevice.query.filter(
and_(
UnconfiguredDevice.device_key == unconfigured_device.device_key,
UnconfiguredDevice.device_group_id == device_group.id
)
).first()
assert deleted_unconfigured_device is None
assert sensor
assert sensor.device_group_id == device_group.id
assert sensor.name == 'test_sensor_name'
assert sensor.is_updated is False
assert sensor.is_active is False
assert sensor.is_assigned is False
assert sensor.device_key == unconfigured_device.device_key
assert sensor.sensor_type_id == sensor_type.id
assert sensor.user_group_id is None
assert sensor.sensor_readings == []
def test_add_sensor_to_device_group_should_return_error_message_when_device_key_already_in_sensors_table(
client,
insert_device_group,
insert_admin,
insert_sensor_type,
insert_unconfigured_device,
get_sensor_default_values,
insert_sensor
):
content_type = 'application/json'
device_group = insert_device_group()
admin = insert_admin()
sensor_type = insert_sensor_type()
unconfigured_device = insert_unconfigured_device()
sensor_values = get_sensor_default_values()
sensor_values['device_key'] = unconfigured_device.device_key
sensor = insert_sensor(sensor_values)
assert device_group.sensors == [sensor]
assert device_group.admin_id == admin.id
response = client.post(
'/api/hubs/' + device_group.product_key + '/sensors',
data=json.dumps(
{
"deviceKey": unconfigured_device.device_key,
"password": device_group.password,
"sensorName": 'test_sensor_name',
"sensorTypeName": sensor_type.name
}
),
content_type=content_type,
headers={
'Authorization': 'Bearer ' + Auth.encode_auth_token(admin.id, True)
}
)
assert response is not None
assert response.status_code == 409
assert response.content_type == content_type
response_data = json.loads(response.data.decode())
assert response_data
assert response_data['errorMessage'] == Constants.RESPONSE_MESSAGE_CONFLICTING_DATA
not_deleted_unconfigured_device = UnconfiguredDevice.query.filter(
and_(
UnconfiguredDevice.device_key == unconfigured_device.device_key,
UnconfiguredDevice.device_group_id == device_group.id
)
).first()
assert not_deleted_unconfigured_device is unconfigured_device
assert device_group.sensors == [sensor]
def test_modify_sensor_should_modify_sensor_when_valid_request(
client,
insert_device_group,
insert_user,
get_user_group_default_values,
insert_user_group,
get_sensor_type_default_values,
insert_sensor_type,
get_sensor_default_values,
insert_sensor):
content_type = 'application/json'
device_group = insert_device_group()
user = insert_user()
old_user_group_values = get_user_group_default_values()
new_user_group_values = get_user_group_default_values()
old_user_group_values["users"] = [user]
new_user_group_values["users"] = [user]
old_user_group_values["name"] = "Master"
new_user_group_values["name"] = "new"
new_user_group_values["id"] += 1
old_user_group = insert_user_group(old_user_group_values)
new_user_group = insert_user_group(new_user_group_values)
old_sensor_type = insert_sensor_type()
new_sensor_type_values = get_sensor_type_default_values()
new_sensor_type_values['reading_type'] = 'Decimal'
new_sensor_type_values['name'] = 'New'
new_sensor_type_values['id'] += 1
new_sensor_type_values['range_min'] = 0.0
new_sensor_type_values['range_max'] = 1.0
new_sensor_type = insert_sensor_type(new_sensor_type_values)
sensor_values = get_sensor_default_values()
sensor_values['name'] = "to be changed"
sensor_values['is_updated'] = False
sensor_values['sensor_type_id'] = old_sensor_type.id
sensor_values['user_group_id'] = old_user_group.id
sensor = insert_sensor(sensor_values)
new_name = "Changed"
response = client.put(
'/api/hubs/' + device_group.product_key + '/sensors/' + sensor.device_key,
data=json.dumps(
{
"name": new_name,
"typeName": new_sensor_type.name,
"userGroupName": new_user_group.name,
"isFormulaUsed": True
}
),
content_type=content_type,
headers={
'Authorization': 'Bearer ' + Auth.encode_auth_token(user.id, False)
}
)
assert response is not None
assert response.status_code == 200
assert response.content_type == content_type
response_data = json.loads(response.data.decode())
assert response_data is not None
assert response_data
assert response_data["changedName"] == new_name
assert response_data["changedType"] == new_sensor_type.name
assert response_data["changedUserGroupName"] == new_user_group.name
def test_modify_sensor_should_modify_sensor_when_valid_request_and_user_is_admin(
client,
insert_device_group,
insert_admin,
get_user_group_default_values,
insert_user_group,
get_sensor_type_default_values,
insert_sensor_type,
get_sensor_default_values,
insert_sensor):
content_type = 'application/json'
device_group = insert_device_group()
admin = insert_admin()
old_user_group_values = get_user_group_default_values()
new_user_group_values = get_user_group_default_values()
old_user_group_values["name"] = "Master"
new_user_group_values["name"] = "new"
new_user_group_values["id"] += 1
old_user_group = insert_user_group(old_user_group_values)
new_user_group = insert_user_group(new_user_group_values)
old_sensor_type = insert_sensor_type()
new_sensor_type_values = get_sensor_type_default_values()
new_sensor_type_values['reading_type'] = 'Decimal'
new_sensor_type_values['name'] = 'New'
new_sensor_type_values['id'] += 1
new_sensor_type_values['range_min'] = 0.0
new_sensor_type_values['range_max'] = 1.0
new_sensor_type = insert_sensor_type(new_sensor_type_values)
sensor_values = get_sensor_default_values()
sensor_values['name'] = "to be changed"
sensor_values['is_updated'] = False
sensor_values['sensor_type_id'] = old_sensor_type.id
sensor_values['user_group_id'] = old_user_group.id
sensor = insert_sensor(sensor_values)
new_name = "Changed"
response = client.put(
'/api/hubs/' + device_group.product_key + '/sensors/' + sensor.device_key,
data=json.dumps(
{
"name": new_name,
"typeName": new_sensor_type.name,
"userGroupName": new_user_group.name,
"isFormulaUsed": True
}
),
content_type=content_type,
headers={
'Authorization': 'Bearer ' + Auth.encode_auth_token(admin.id, True)
}
)
assert response is not None
assert response.status_code == 200
assert response.content_type == content_type
response_data = json.loads(response.data.decode())
assert response_data is not None
assert response_data
assert response_data["changedName"] == new_name
assert response_data["changedType"] == new_sensor_type.name
assert response_data["changedUserGroupName"] == new_user_group.name
def test_delete_sensor_should_delete_sensor_when_valid_request(
client,
insert_device_group,
get_sensor_default_values,
insert_admin,
insert_sensor,
insert_sensor_type):
content_type = 'application/json'
device_group = insert_device_group()
admin = insert_admin()
insert_sensor_type()
sensor = insert_sensor()
sensor_device_key = sensor.device_key
response = client.delete(
'/api/hubs/' + device_group.product_key + '/sensors/' + sensor.device_key,
content_type=content_type,
headers={
'Authorization': 'Bearer ' + Auth.encode_auth_token(admin.id, True)
}
)
assert response is not None
assert response.status_code == 200
assert response.content_type == content_type
sensor_in_db = SensorRepository.get_instance().get_sensor_by_device_key_and_device_group_id(
sensor_device_key,
device_group.id)
assert sensor_in_db is None
deleted_devices = DeletedDevice.query.filter(DeletedDevice.device_key == sensor_device_key).all()
assert deleted_devices
assert len(deleted_devices) == 1
def test_delete_sensor_should_not_delete_sensor_when_not_valid_request(
client,
insert_device_group,
get_sensor_default_values,
insert_admin,
insert_sensor,
insert_sensor_type,
get_sensor_reading_default_values,
insert_sensor_reading):
content_type = 'application/json'
device_group = insert_device_group()
admin = insert_admin()
insert_sensor_type()
sensor = insert_sensor()
sensor_device_key = sensor.device_key
response = client.delete(
'/api/hubs/' + device_group.product_key + '/sensors/' + sensor.device_key,
content_type=content_type,
headers={
'Authorization': 'Bearer ' + Auth.encode_auth_token(admin.id, False)
}
)
assert response is not None
assert response.status_code == 403
assert response.content_type == content_type
response_data = json.loads(response.data.decode())
assert response_data is not None
assert response_data['errorMessage'] == Constants.RESPONSE_MESSAGE_USER_DOES_NOT_HAVE_PRIVILEGES
sensor_in_db = SensorRepository.get_instance().get_sensor_by_device_key_and_device_group_id(
sensor_device_key,
device_group.id)
assert sensor_in_db is sensor
| 31.480136 | 128 | 0.694887 | 3,335 | 27,734 | 5.327736 | 0.043778 | 0.059264 | 0.045588 | 0.034669 | 0.929199 | 0.908712 | 0.896049 | 0.874775 | 0.861718 | 0.843483 | 0 | 0.004388 | 0.219298 | 27,734 | 880 | 129 | 31.515909 | 0.816229 | 0 | 0 | 0.778748 | 0 | 0 | 0.08221 | 0 | 0 | 0 | 0 | 0 | 0.170306 | 1 | 0.021834 | false | 0.002911 | 0.0131 | 0 | 0.034935 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
13c7615b40a9030ebb01dcb36307da190f549b1c | 603 | py | Python | v3/Libraries/python-docx/docx-parser.py | TheShellLand/python | a35e9b32bec3a3ff03d6f0f4c2c2cc891180e516 | [
"MIT"
] | null | null | null | v3/Libraries/python-docx/docx-parser.py | TheShellLand/python | a35e9b32bec3a3ff03d6f0f4c2c2cc891180e516 | [
"MIT"
] | 1 | 2021-06-01T22:50:19.000Z | 2021-06-01T22:50:19.000Z | v3/Libraries/python-docx/docx-parser.py | TheShellLand/python | a35e9b32bec3a3ff03d6f0f4c2c2cc891180e516 | [
"MIT"
] | null | null | null |
# Examples from https://automatetheboringstuff.com/chapter13/
"""
# >>> import docx
# ❶ >>> doc = docx.Document('demo.docx')
# ❷ >>> len(doc.paragraphs)
# 7
# ❸ >>> doc.paragraphs[0].text
# 'Document Title'
# ❹ >>> doc.paragraphs[1].text
# 'A plain paragraph with some bold and some italic'
# ❺ >>> len(doc.paragraphs[1].runs)
# 4
# ❻ >>> doc.paragraphs[1].runs[0].text
# 'A plain paragraph with some '
# ❼ >>> doc.paragraphs[1].runs[1].text
# 'bold'
# ❽ >>> doc.paragraphs[1].runs[2].text
# ' and some '
# ➒ >>> doc.paragraphs[1].runs[3].text
# 'italic'
"""
import docx
| 21.535714 | 61 | 0.60199 | 86 | 603 | 4.22093 | 0.453488 | 0.286501 | 0.231405 | 0.247934 | 0.14876 | 0.14876 | 0 | 0 | 0 | 0 | 0 | 0.04918 | 0.190713 | 603 | 27 | 62 | 22.333333 | 0.694672 | 0.955224 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
13eb22807d747eb5564c45c475f9215748beb986 | 5,380 | py | Python | S3_File_Uploader/UI/Styles.py | rluzuriaga/S3_File_Uploader | 300e66d046f84a97bed7ef499352f35982578136 | [
"MIT"
] | null | null | null | S3_File_Uploader/UI/Styles.py | rluzuriaga/S3_File_Uploader | 300e66d046f84a97bed7ef499352f35982578136 | [
"MIT"
] | 18 | 2020-12-08T23:03:21.000Z | 2021-08-30T22:41:31.000Z | S3_File_Uploader/UI/Styles.py | rluzuriaga/S3_File_Uploader | 300e66d046f84a97bed7ef499352f35982578136 | [
"MIT"
] | null | null | null | from tkinter import ttk
from config import IS_MAC, IS_WINDOWS
class Styles:
def __init__(self) -> None:
self.style = ttk.Style()
self.all_ui_styles()
self.main_window_styles()
self.setup_window_styles()
self.update_database_styles()
self.mass_upload_styles()
def all_ui_styles(self) -> None:
if IS_MAC:
self.style.configure('regular.TButton', font=('Helvetica', 15))
self.style.configure('regular.TCheckbutton', font=('Helvetica', 15))
if IS_WINDOWS:
self.style.configure('regular.TButton', font=('Helvetica', 12))
self.style.configure('regular.TCheckbutton', font=('Helvetica', 12))
def main_window_styles(self) -> None:
if IS_MAC:
self.style.configure('main_window_top_label.TLabel', font=('Helvetica', 15))
self.style.configure('main_window_statusbar.TLabel', font=('Helvetica', 11))
if IS_WINDOWS:
self.style.configure('main_window_top_label.TLabel', font=('Helvetica', 15))
self.style.configure('main_window_statusbar.TLabel', font=('Helvetica', 9))
def setup_window_styles(self) -> None:
if IS_MAC:
self.style.configure('setup_window_top_label.TLabel', font=('Helvetica', 15))
self.style.configure('setup_window_output_message_label.TLabel', font=('Helvetica', 15))
# AWS
self.style.configure('aws_heading_label.TLabel', font=('Helvetica', 15, 'bold', 'underline'))
self.style.configure('access_key_id_label.TLabel', font=('Helvetica', 15))
self.style.configure('aws_secret_key_label.TLabel', font=('Helvetica', 15))
self.style.configure('region_name_label.TLabel', font=('Helvetica', 15))
# FFMPEG
self.style.configure('ffmpeg_heading_label.TLabel', font=('Helvetica', 15, 'bold', 'underline'))
self.style.configure('ffmpeg_input_label.TLabel', font=('Helvetica', 15))
self.style.configure('converted_file_suffix_label.TLabel', font=('Helvetica', 15))
self.style.configure('ffmpeg_example_label.TLabel', font=('Helvetica', 13))
if IS_WINDOWS:
self.style.configure('setup_window_top_label.TLabel', font=('Helvetica', 15))
self.style.configure('setup_window_output_message_label.TLabel', font=('Helvetica', 13))
# AWS
self.style.configure('aws_heading_label.TLabel', font=('Helvetica', 14, 'bold', 'underline'))
self.style.configure('access_key_id_label.TLabel', font=('Helvetica', 13))
self.style.configure('aws_secret_key_label.TLabel', font=('Helvetica', 13))
self.style.configure('region_name_label.TLabel', font=('Helvetica', 13))
# FFMPEG
self.style.configure('ffmpeg_heading_label.TLabel', font=('Helvetica', 14, 'bold', 'underline'))
self.style.configure('ffmpeg_input_label.TLabel', font=('Helvetica', 13))
self.style.configure('converted_file_suffix_label.TLabel', font=('Helvetica', 13))
self.style.configure('ffmpeg_example_label.TLabel', font=('Helvetica', 10))
def update_database_styles(self) -> None:
if IS_MAC:
self.style.configure('update_db_top_label.TLabel', font=('Helvetica', 18, 'underline'))
self.style.configure('explanation_text_label.TLabel', font=('Helvetica', 15))
if IS_WINDOWS:
self.style.configure('update_db_top_label.TLabel', font=('Helvetica', 15, 'underline'))
self.style.configure('explanation_text_label.TLabel', font=('Helvetica', 13))
def mass_upload_styles(self) -> None:
if IS_MAC:
self.style.configure('mass_upload_header_label.TLabel', font=('Helvetica', 18, 'underline'))
self.style.configure('update_label.TLabel', font=('Helvetica', 15))
self.style.configure('overall_progressbar_label.TLabel', font=('Helvetica', 13))
self.style.configure('ffmpeg_upload_progressbar_label.TLabel', font=('Helvetica', 13))
self.style.configure('path_to_mass_upload_label.TLabel', font=('Helvetica', 15))
self.style.configure('s3_bucket_location_label.TLabel', font=('Helvetica', 15))
self.style.configure('radio_button_label.TLabel', font=('Helvetica', 15))
# VideoCheckboxes
self.style.configure('hover_text_label.TLabel', font=('Helvetica', 14, 'italic'))
if IS_WINDOWS:
self.style.configure('mass_upload_header_label.TLabel', font=('Helvetica', 15, 'underline'))
self.style.configure('update_label.TLabel', font=('Helvetica', 14))
self.style.configure('overall_progressbar_label.TLabel', font=('Helvetica', 12))
self.style.configure('ffmpeg_upload_progressbar_label.TLabel', font=('Helvetica', 12))
self.style.configure('path_to_mass_upload_label.TLabel', font=('Helvetica', 12))
self.style.configure('s3_bucket_location_label.TLabel', font=('Helvetica', 12))
self.style.configure('radio_button_label.TLabel', font=('Helvetica', 12))
self.style.configure('radio_buttons.TRadiobutton', font=('Helvetica', 12))
# VideoCheckboxes
self.style.configure('hover_text_label.TLabel', font=('Helvetica', 12, 'italic'))
| 52.745098 | 108 | 0.653346 | 626 | 5,380 | 5.376997 | 0.126198 | 0.13369 | 0.262032 | 0.299465 | 0.911171 | 0.905229 | 0.894236 | 0.852347 | 0.845811 | 0.626857 | 0 | 0.022965 | 0.198699 | 5,380 | 101 | 109 | 53.267327 | 0.757829 | 0.009851 | 0 | 0.189189 | 0 | 0 | 0.354699 | 0.232707 | 0 | 0 | 0 | 0 | 0 | 1 | 0.081081 | false | 0 | 0.027027 | 0 | 0.121622 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
b92976b65a241d022e86d6071fcd334cc9e75bac | 1,474 | py | Python | backend.py | morkos11/files-reader-desktop-app | d0b9c5586c12b87a840352163e2c2063bb6d82da | [
"Apache-2.0"
] | null | null | null | backend.py | morkos11/files-reader-desktop-app | d0b9c5586c12b87a840352163e2c2063bb6d82da | [
"Apache-2.0"
] | null | null | null | backend.py | morkos11/files-reader-desktop-app | d0b9c5586c12b87a840352163e2c2063bb6d82da | [
"Apache-2.0"
] | null | null | null | import sqlite3
def special_file_export(name,id,phone,first_name,last_name,email,birthday,gender,locale,link):
name = name+'.db'
conn = sqlite3.connect(name)
cur = conn.cursor()
cur.execute('CREATE TABLE IF NOT EXISTS special_facebookusers (id INT PRIMARY KEY , phone VARCHAR(25), first_name VARCHAR(35), last_name VARCHAR(35), email VARCHAR(125),birthday VARCHAR(35),gender VARCHAR(25),locale VARCHAR(35),link VARCHAR(150))')
conn.commit()
conn.close()
conn = sqlite3.connect(name)
cur = conn.cursor()
cur.execute('INSERT INTO special_facebookusers VALUES (?,?,?,?,?,?,?,?,?)',(id,phone,first_name,last_name,email,birthday,gender,locale,link))
conn.commit()
conn.close()
def file_export(name,id,phone,first_name,last_name,email,birthday,gender,locale,hometown,location,link):
name = name+'.db'
conn = sqlite3.connect(name)
cur = conn.cursor()
cur.execute('CREATE TABLE IF NOT EXISTS facebookusers (id INT PRIMARY KEY , phone VARCHAR(25), first_name VARCHAR(35), last_name VARCHAR(35), email VARCHAR(125),birthday VARCHAR(35),gender VARCHAR(25),locale VARCHAR(35),hometown VARCHAR(35),location VARCHAR(35),link VARCHAR(150))')
conn.commit()
conn.close()
conn = sqlite3.connect(name)
cur = conn.cursor()
cur.execute('INSERT INTO facebookusers VALUES (?,?,?,?,?,?,?,?,?,?,?)',(id,phone,first_name,last_name,email,birthday,gender,locale,hometown,location,link))
conn.commit()
conn.close() | 54.592593 | 286 | 0.710312 | 205 | 1,474 | 5.02439 | 0.209756 | 0.087379 | 0.046602 | 0.062136 | 0.936893 | 0.9 | 0.9 | 0.9 | 0.9 | 0.9 | 0 | 0.035074 | 0.129579 | 1,474 | 27 | 287 | 54.592593 | 0.767732 | 0 | 0 | 0.72 | 0 | 0.08 | 0.421695 | 0.072542 | 0 | 0 | 0 | 0 | 0 | 1 | 0.08 | false | 0 | 0.04 | 0 | 0.12 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b930e1e9ff0e54d04ce9e049eaa85adbe5284aad | 73,760 | py | Python | AGENDA.py | JSNavas/Agenda-Python | a848ea2944396988532799b404368353fd268fd8 | [
"MIT"
] | null | null | null | AGENDA.py | JSNavas/Agenda-Python | a848ea2944396988532799b404368353fd268fd8 | [
"MIT"
] | null | null | null | AGENDA.py | JSNavas/Agenda-Python | a848ea2944396988532799b404368353fd268fd8 | [
"MIT"
] | null | null | null | # encoding: utf-8
import os
import time
class archivo(object):
def __init__(self):
self.lista = []
self.mensajes = []
self.historial = 0
self.llamadas = []
self.historialDellamadas = 0
def limpiar(self):
os.system("clear")
# CONTACT FILE METHODS
def crearArchivo(self):
archivo = open("contactos", "a")
archivo.close()
def crearArchivoNuevo(self):
archivo = open("contactos", "w")
archivo.close()
def escribirArchivo(self):
archivo = open("contactos", "w")
self.lista.sort()
for elemento in self.lista:
archivo.write(elemento + "\n")
archivo.close()
def cargarArchivo(self):
# Load one contact per line; sorting before reading (as the original did)
# had no effect, so the sort now happens after the list is populated.
archivo = open("contactos", "r")
for linea in archivo:
self.lista.append(linea.rstrip("\n"))
archivo.close()
self.lista.sort()
# MESSAGE METHODS
def crearArchivoMensajes(self):
archivo = open("mensajes", "a")
archivo.close()
def escribirArchivoMensajes(self):
archivo = open("mensajes", "w")
for elemento in self.mensajes:
archivo.write(elemento + "\n")
archivo.close()
def cargarArchivoMensajes(self):
# Load one message per line; the counter is set once after the loop
# instead of being recomputed on every iteration.
archivo = open("mensajes", "r")
for linea in archivo:
self.mensajes.append(linea.rstrip("\n"))
archivo.close()
self.historial = len(self.mensajes)
# CALL METHODS
def crearArchivoLlamadas(self):
archivo = open("llamadas.txt", "a")
archivo.close()
def escribirArchivoLlamadas(self):
archivo = open("llamadas.txt", "w")
for elemento in self.llamadas:
archivo.write(elemento + "\n")
archivo.close()
def cargarArchivoLlamadas(self):
# Load one call record per line; the counter is set once after the loop.
archivo = open("llamadas.txt", "r")
for linea in archivo:
self.llamadas.append(linea.rstrip("\n"))
archivo.close()
self.historialDellamadas = len(self.llamadas)
def eliminarMensajes(self):
print " -------------------------------------------------------------------------------"
print " ====================| ELIMINAR HISTORIAL DE MENSAJES |==================="
print " -------------------------------------------------------------------------------"
if self.mensajes:
print " Seguro que desea eliminar el historial de mensajes enviados?"
SN = raw_input("\n (s/n): ")
if SN.lower() == 's':
archivo = open("mensajes", "w")
archivo.close()
self.mensajes = []
self.historial = 0
print "\n 'Se ha eliminado todo el historial'"
else:
print "\n 'Su historial de mensajes esta vacio'"
# MESSAGE HISTORY METHODS
def agregarAlHistorial(self):
self.historial += 1
def historialMensajes(self):
print " -------------------------------------------------------------------------------"
print " =============================| HISTORIAL SMS |============================="
print " -------------------------------------------------------------------------------"
print "\n Mensajes enviados:", self.historial
print " -------------------------------------------------------------------------------"
for elemento in self.mensajes:
arreglo = elemento.split('\t')
print " Para:", arreglo[0]
print " Mensaje:", arreglo[1]
print "\t\t\t\t\t\t\tHora de envio:", arreglo[2]
print "\t\t\t\t\t\t\tFecha de envio:", arreglo[3]
print " -------------------------------------------------------------------------------"
# CALL HISTORY METHODS
def agregarAlHistorialDellamadas(self):
self.historialDellamadas += 1
# Renamed from historialDellamadas: __init__ assigns an int counter to
# self.historialDellamadas, so a method of the same name is shadowed on
# instances and calling it would raise TypeError ("'int' object is not
# callable"). Any external callers must use the new name.
def mostrarHistorialDellamadas(self):
print " -------------------------------------------------------------------------------"
print " =========================| HISTORIAL DE LLAMADAS |========================="
print " -------------------------------------------------------------------------------"
print "\n Llamadas realizadas:", self.historialDellamadas
print " -------------------------------------------------------------------------------"
for elemento in self.llamadas:
arreglo = elemento.split('\t')
print " Para:", arreglo[0]
print "\t\t\t\t\t\t\tHora de llamada:", arreglo[1]
print "\t\t\t\t\t\t\tFin de llamada:", arreglo[2]
print "\t\t\t\t\t\t\tFecha de llamada:", arreglo[3]
print " -------------------------------------------------------------------------------"
# CONTACT METHODS
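# Storage format (inferred from the code below, not documented in the
# original): each contact is one line of tab-separated fields,
#   nombre \t apellido \t telefono \t correo \t
# so a record might look like "Ada\tLovelace\t5550100\tada@example.com\t".
# All search, update, and delete operations split on '\t' and rebuild the
# line with the same trailing tab.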
def crearContacto(self):
print
creado = False
validarNombre = False
while validarNombre == False:
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ============================| NUEVO CONTACTO |============================="
print " -------------------------------------------------------------------------------"
print
nombre = raw_input(" Nombre: ")
if len(nombre) == 0:
print "\n\t\t\t\t error!".upper()
print "\n\t\t 'El campo (Nombre) no puede estar vacio'"
print " Presione ENTER para continuar"
raw_input()
validarNombre = False
else:
encontrado = False
for elemento in self.lista:
mostrar = elemento.split('\t')
if nombre.lower() == mostrar[0].lower():
encontrado = True
break
else:
encontrado = False
if encontrado == True:
print "\n\t\t\t\t error!".upper()
print "\n\t\t 'Ya tiene un contacto con ese nombre'"
print " Presione ENTER para continuar"
raw_input()
validarNombre = False
else:
validarNombre = True
apellido = raw_input(" Apellido: ")
validarNumero = False
while validarNumero == False:
telefono = raw_input(" Numero de telefono: ")
if len(telefono) == 0:
print "\n\t\t\t\t error!".upper()
print "\n\t\t 'El campo (Numero) no puede estar vacio'"
print " Presione ENTER para continuar"
raw_input()
validarNumero = False
elif not telefono.isdigit():
print "\n\t\t\t\t error!".upper()
print "\n\t\t 'En este campo solo puede agregar numeros'"
print " Presione ENTER para continuar"
raw_input()
validarNumero = False
else:
validarNumero = True
correo = raw_input(" Correo electronico: ")
print "\n Desea guardar este contacto?"
SN = raw_input(" (s/n): ")
if SN.lower() == 's':
self.lista.append(nombre + "\t" + apellido + "\t" + telefono + "\t" + correo + "\t")
creado = True
else:
creado = False
print "\n 'El contacto no se ha guardado'"
if creado == True:
print "\n 'Nuevo contacto añadido a su agenda'"
archivo.escribirArchivo()
def mostrarContacto(self):
if not self.lista:
print "\n 'Su lista de contactos esta vacia'"
return
self.lista.sort()
for i, elemento in enumerate(self.lista, 1):
mostrar = elemento.split('\t')
print " " + str(i) + ") " + mostrar[0] + "\t " + mostrar[1] + "\t " + mostrar[2] + "\t " + mostrar[3]
print " -------------------------------------------------------------------------------"
def eliminarCONTACTOS(self):
if len(self.lista) == 0:
print "\n 'Su lista de contactos esta vacia'"
else:
print " Seguro que desea eliminar todos los contactos?"
SN = raw_input("\n (s/n): ")
if SN.lower() == 's':
self.lista = []
print "\n 'Se han eliminado todos los contactos'"
archivo.escribirArchivo()
def buscarContactoNombre(self):
campovacio = True
while campovacio == True:
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ===========================| BUSCAR POR NOMBRE |==========================="
print " -------------------------------------------------------------------------------"
nombre = raw_input("\n Ingrese el nombre que desea buscar: ")
print
if len(nombre) == 0:
print "\t\t\t'El campo de busqueda esta vacio'"
print "\n Desea buscar con otro nombre?"
SN = raw_input(" (s/n): ")
if SN.lower() == 's':
campovacio = True
else:
campovacio = False
else:
menuBus = True
while menuBus == True:
noEncontrado = True
for elemento in self.lista:
mostrar = elemento.split('\t')
if nombre.lower() == mostrar[0].lower():
archivo.limpiar()
print " Contacto encontrado: "
print "\n | Nombre | Apellido | Telefono | Correo |"
print " -------------------------------------------------------------------------------"
print " " + mostrar[0] + "\t " + mostrar[1] + "\t " + mostrar[2] + "\t " + mostrar[3]
print " -------------------------------------------------------------------------------"
noEncontrado = False
if noEncontrado == False:
print
print " Opciones \n"
print " (A) Eliminar - (B) Actualizar - (C) Enviar SMS - (D) llamar "
print
print "\n 'Presione ENTER si no desea realizar ninguna accion' "
print
opcion = raw_input(" Opcion > ")
if opcion.lower() == 'a':
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " =============================| ELIMINAR |=============================="
print " -------------------------------------------------------------------------------"
print " Seguro que desea eliminar este contacto?"
SN = raw_input("\n (s/n): ")
if SN.lower() == 's':
eliminado = False
for elemento in self.lista:
mostrar = elemento.split("\t")
if nombre.lower() == mostrar[0].lower():
self.lista.remove(elemento)
eliminado = True
if eliminado == True:
archivo.escribirArchivo()
print " \n 'El contacto ha sido eliminado'"
menuBus = False
campovacio = False
elif opcion.lower() == 'b':
menuAct = True
while menuAct == True:
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ============================| ACTUALIZAR |============================="
print " -------------------------------------------------------------------------------"
print " | (1) ACTUALIZAR NOMBRE | (2) ACTUALIZAR APELLIDO |"
print " -------------------------------------------------------------------------------"
print " | (3) ACTUALIZAR TELEFONO | (4) ACTUALIZAR CORREO |"
print " -------------------------------------------------------------------------------"
print "\n 'Presione ENTER para volver al menu anterior'\n"
opcion = raw_input(" Opcion > ")
if opcion == "1":
nombreActualizado = False
validarNombre = False
while validarNombre == False:
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " =========================| ACTUALIZAR NOMBRE |========================="
print " -------------------------------------------------------------------------------"
for elemento in self.lista:
mostrar = elemento.split("\t")
# Only the contact being renamed should be prompted; this match check was
# missing, so every contact in the list triggered a rename prompt.
if nombre.lower() == mostrar[0].lower():
mostrar[0] = raw_input("\n Nuevo nombre: ")
if len(mostrar[0]) == 0:
print "\n\t\t\t\t error!".upper()
print "\n\t\t 'El campo (Nombre) no puede estar vacio'"
print " Presione ENTER para continuar"
raw_input()
validarNombre = False
else:
for elemento in self.lista:
arreglo = elemento.split('\t')
if mostrar[0].lower() == arreglo[0].lower():
encontrado = True
break
else:
encontrado = False
if encontrado == True:
print "\n\t\t\t\t error!".upper()
print "\n\t\t 'Ya tiene un contacto con ese nombre'"
print " Presione ENTER para continuar"
raw_input()
validarNombre = False
else:
validarNombre = True
print "\n Desea guardar los cambios?"
SN = raw_input("\n (s/n): ")
if SN.lower() == 's':
self.lista.remove(elemento)
menuAct = False
self.lista.append(mostrar[0] + "\t" + mostrar[1] + "\t" + mostrar[2] + "\t" + mostrar[3] + "\t")
nombre = mostrar[0]
nombreActualizado = True
else:
nombreActualizado = False
if nombreActualizado == True:
archivo.escribirArchivo()
print "\n 'Nombre actualizado'"
raw_input("\n 'Presione ENTER para continuar'\n")
menuAct = True
continue
if nombreActualizado == False:
print "\n 'No se han guardado los cambios'"
raw_input("\n 'Presione ENTER para continuar'\n")
menuAct = True
continue
if opcion == "2":
apellidoActualizado = False
for elemento in self.lista:
mostrar = elemento.split("\t")
if nombre.lower() == mostrar[0].lower():
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ========================| ACTUALIZAR APELLIDO |========================"
print " -------------------------------------------------------------------------------"
mostrar[1] = raw_input("\n Nuevo apellido: ")
menuAct = False
print "\n Desea guardar los cambios?"
SN = raw_input("\n (s/n): ")
if SN.lower() == 's':
self.lista.remove(elemento)
self.lista.append(mostrar[0] + "\t" + mostrar[1] + "\t" + mostrar[2] + "\t" + mostrar[3] + "\t")
apellidoActualizado = True
else:
apellidoActualizado = False
if apellidoActualizado == True:
archivo.escribirArchivo()
print "\n 'Apellido actualizado'"
raw_input("\n 'Presione ENTER para continuar'\n")
menuAct = True
continue
if apellidoActualizado == False:
print "\n 'No se han guardado los cambios'"
raw_input("\n 'Presione ENTER para continuar'\n")
menuAct = True
continue
if opcion == "3":
telefonoActualizado = False
for elemento in self.lista:
mostrar = elemento.split("\t")
if nombre.lower() == mostrar[0].lower():
validarNumero = False
while validarNumero == False:
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " =======================| ACTUALIZAR TELEFONO |========================="
print " -------------------------------------------------------------------------------"
mostrar[2] = raw_input("\n Nuevo telefono: ")
if len(mostrar[2]) == 0:
print "\n\t\t\t\t error!".upper()
print "\n\t\t 'Este campo no debe estar vacio'"
print " Presione ENTER para continuar"
raw_input()
validarNumero = False
elif not mostrar[2].isdigit():
print "\n\t\t\t\t error!".upper()
print "\n\t\t 'En este campo solo puede agregar numeros'"
print " Presione ENTER para continuar"
raw_input()
validarNumero = False
else:
validarNumero = True
print "\n Desea guardar los cambios?"
SN = raw_input("\n (s/n): ")
if SN.lower() == 's':
self.lista.remove(elemento)
self.lista.append(mostrar[0] + "\t" + mostrar[1] + "\t" + mostrar[2] + "\t" + mostrar[3] + "\t")
telefonoActualizado = True
else:
telefonoActualizado = False
if telefonoActualizado == True:
archivo.escribirArchivo()
print "\n 'Telefono actualizado'"
raw_input("\n 'Presione ENTER para continuar'\n")
menuAct = True
continue
if telefonoActualizado == False:
print "\n 'No se han guardado los cambios'"
raw_input("\n 'Presione ENTER para continuar'\n")
menuAct = True
continue
if opcion == "4":
correoActualizado = False
for elemento in self.lista:
mostrar = elemento.split("\t")
if nombre.lower() == mostrar[0].lower():
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ========================| ACTUALIZAR CORREO |=========================="
print " -------------------------------------------------------------------------------"
mostrar[3] = raw_input("\n Nuevo correo: ")
menuAct = False
print "\n Desea guardar los cambios?"
SN = raw_input("\n (s/n): ")
if SN.lower() == 's':
self.lista.remove(elemento)
self.lista.append(mostrar[0] + "\t" + mostrar[1] + "\t" + mostrar[2] + "\t" + mostrar[3] + "\t")
correoActualizado = True
else:
correoActualizado = False
if correoActualizado == True:
archivo.escribirArchivo()
print "\n 'Correo actualizado'"
raw_input("\n 'Presione ENTER para continuar'\n")
menuAct = True
continue
if correoActualizado == False:
print "\n 'No se han guardado los cambios'"
raw_input("\n 'Presione ENTER para continuar'\n")
menuAct = True
continue
else:
menuAct = False
menuBus = True
archivo.escribirArchivo()
elif opcion.lower() == 'c':
archivo.limpiar()
mensajeEnviado = False
while mensajeEnviado == False:
print " -------------------------------------------------------------------------------"
print " ===========================| ENVIAR MENSAJE |==========================="
print " -------------------------------------------------------------------------------"
print
print " Para:", nombre
print " Mensaje: \n"
horaDeEnvio = time.strftime("%I:%M:%S")
fechaDeEnvio = time.strftime("%d/%m/%y")
mensaje = raw_input(" ")
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ===========================| ENVIAR MENSAJE |==========================="
print " -------------------------------------------------------------------------------"
print
print " Enviar mensaje?"
SN = raw_input("\n (s/n): ")
if SN.lower() == 's':
mensajeEnviado = True
else:
print "\n 'Mensaje no enviado'"
raw_input()
break
if mensajeEnviado == True:
self.mensajes.append(nombre + '\t' + mensaje + "\t" + horaDeEnvio + '\t' + fechaDeEnvio)
archivo.agregarAlHistorial()
archivo.escribirArchivoMensajes()
print "\n 'Mensaje enviado'"
raw_input()
else:
menuBus = False
campovacio = False
archivo.escribirArchivo()
if noEncontrado == True:
print " 'El nombre '%s' no se ha encontrado'" % (nombre)
print "\n Desea buscar con otro nombre?"
SN = raw_input(" (s/n): ")
if SN.lower() == 's':
break
else:
campovacio = False
break
def buscarContactoNumero(self):
campovacio = True
while campovacio == True:
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ===========================| BUSCAR POR NUMERO |==========================="
print " -------------------------------------------------------------------------------"
numero = raw_input("\n Ingrese el numero que desea buscar: ")
print
if len(numero) == 0:
print "\t\t\t'El campo de busqueda esta vacio'"
print "\n Desea buscar con otro numero?"
SN = raw_input(" (s/n): ")
if SN.lower() == 's':
campovacio = True
else:
campovacio = False
else:
menuBus = True
while menuBus == True:
noEncontrado = True
for elemento in self.lista:
mostrar = elemento.split('\t')
if numero == mostrar[2]:
archivo.limpiar()
print " Contacto encontrado: "
print "\n | Nombre | Apellido | Telefono | Correo |"
print " -------------------------------------------------------------------------------"
print " " + mostrar[0] + "\t " + mostrar[1] + "\t " + mostrar[2] + "\t " + mostrar[3]
print " -------------------------------------------------------------------------------"
noEncontrado = False
if noEncontrado == False:
print
print " Opciones \n"
print " (A) Eliminar - (B) Actualizar - (C) Enviar SMS - (D) llamar "
print
print "\n 'Presione ENTER si no desea realizar ninguna accion' "
print
opcion = raw_input(" Opcion > ")
if opcion.lower() == 'a':
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " =============================| ELIMINAR |=============================="
print " -------------------------------------------------------------------------------"
print
print " Seguro que desea eliminar este contacto?"
SN = raw_input("\n (s/n): ")
if SN.lower() == 's':
eliminado = False
for elemento in self.lista:
mostrar = elemento.split("\t")
if numero == mostrar[2]:
self.lista.remove(elemento)
eliminado = True
if eliminado == True:
archivo.escribirArchivo()
print "\n 'El contacto ha sido eliminado'"
menuBus = False
campovacio = False
elif opcion.lower() == 'b':
menuAct = True
while menuAct == True:
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ============================| ACTUALIZAR |============================="
print " -------------------------------------------------------------------------------"
print " | (1) ACTUALIZAR NOMBRE | (2) ACTUALIZAR APELLIDO |"
print " -------------------------------------------------------------------------------"
print " | (3) ACTUALIZAR TELEFONO | (4) ACTUALIZAR CORREO |"
print " -------------------------------------------------------------------------------"
print "\n 'Presione ENTER para volver al menu anterior'\n"
opcion = raw_input(" Opcion > ")
if opcion == "1":
nombreActualizado = False
validarNombre = False
while validarNombre == False:
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " =========================| ACTUALIZAR NOMBRE |========================="
print " -------------------------------------------------------------------------------"
for elemento in self.lista:
mostrar = elemento.split("\t")
# Only the matched contact should be prompted; this match check was
# missing, so every contact in the list triggered a rename prompt.
if numero == mostrar[2]:
mostrar[0] = raw_input("\n Nuevo nombre: ")
if len(mostrar[0]) == 0:
print "\n\t\t\t\t error!".upper()
print "\n\t\t 'El campo (Nombre) no puede estar vacio'"
print " Presione ENTER para continuar"
raw_input()
validarNombre = False
else:
for elemento in self.lista:
arreglo = elemento.split('\t')
if mostrar[0].lower() == arreglo[0].lower():
encontrado = True
break
else:
encontrado = False
if encontrado == True:
print "\n\t\t\t\t error!".upper()
print "\n\t\t 'Ya tiene un contacto con ese nombre'"
print " Presione ENTER para continuar"
raw_input()
validarNombre = False
else:
validarNombre = True
print "\n Desea guardar los cambios?"
SN = raw_input("\n (s/n): ")
if SN.lower() == 's':
self.lista.remove(elemento)
menuAct = False
self.lista.append(mostrar[0] + "\t" + mostrar[1] + "\t" + mostrar[2] + "\t" + mostrar[3] + "\t")
nombre = mostrar[0]
nombreActualizado = True
else:
nombreActualizado = False
if nombreActualizado == True:
archivo.escribirArchivo()
print "\n 'Nombre actualizado'"
raw_input("\n 'Presione ENTER para continuar'\n")
menuAct = True
continue
if nombreActualizado == False:
print "\n 'No se han guardado los cambios'"
raw_input("\n 'Presione ENTER para continuar'\n")
menuAct = True
continue
if opcion == "2":
apellidoActualizado = False
for elemento in self.lista:
mostrar = elemento.split("\t")
if numero in mostrar[2]:
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ========================| ACTUALIZAR APELLIDO |========================"
print " -------------------------------------------------------------------------------"
mostrar[1] = raw_input("\n Nuevo apellido: ")
menuAct = False
print "\n Desea guardar los cambios?"
SN = raw_input("\n (s/n): ")
if SN.lower() == 's':
self.lista.remove(elemento)
self.lista.append(mostrar[0] + "\t" + mostrar[1] + "\t" + mostrar[2] + "\t" + mostrar[3] + "\t")
apellidoActualizado = True
else:
apellidoActualizado = False
if apellidoActualizado == True:
archivo.escribirArchivo()
print "\n 'Apellido actualizado'"
raw_input("\n 'Presione ENTER para continuar'\n")
menuAct = True
continue
if apellidoActualizado == False:
print "\n 'No se han guardado los cambios'"
raw_input("\n 'Presione ENTER para continuar'\n")
menuAct = True
continue
if opcion == "3":
telefonoActualizado = False
for elemento in self.lista:
mostrar = elemento.split("\t")
if numero.lower() == mostrar[2].lower():
validarNumero = False
while validarNumero == False:
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " =======================| ACTUALIZAR TELEFONO |========================="
print " -------------------------------------------------------------------------------"
mostrar[2] = raw_input("\n Nuevo telefono: ")
if len(mostrar[2]) == 0:
print "\n\t\t\t\t error!".upper()
print "\n\t\t 'Este campo no debe estar vacio'"
print " Presione ENTER para continuar"
raw_input()
validarNumero = False
elif not mostrar[2].isdigit():
print "\n\t\t\t\t error!".upper()
print "\n\t\t 'En este campo solo puede agregar numeros'"
print " Presione ENTER para continuar"
raw_input()
validarNumero = False
else:
validarNumero = True
print "\n Desea guardar los cambios?"
SN = raw_input("\n (s/n): ")
if SN.lower() == 's':
self.lista.remove(elemento)
self.lista.append(mostrar[0] + "\t" + mostrar[1] + "\t" + mostrar[2] + "\t" + mostrar[3] + "\t")
numero = mostrar[2]
telefonoActualizado = True
else:
telefonoActualizado = False
if telefonoActualizado == True:
archivo.escribirArchivo()
print "\n 'Telefono actualizado'"
raw_input("\n 'Presione ENTER para continuar'\n")
menuAct = True
continue
if telefonoActualizado == False:
print "\n 'No se han guardado los cambios'"
raw_input("\n 'Presione ENTER para continuar'\n")
menuAct = True
continue
if opcion == "4":
correoActualizado = False
for elemento in self.lista:
mostrar = elemento.split("\t")
if numero in mostrar[2]:
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ========================| ACTUALIZAR CORREO |=========================="
print " -------------------------------------------------------------------------------"
mostrar[3] = raw_input("\n Nuevo correo: ")
menuAct = False
print "\n Desea guardar los cambios?"
SN = raw_input("\n (s/n): ")
if SN.lower() == 's':
self.lista.remove(elemento)
self.lista.append(mostrar[0] + "\t" + mostrar[1] + "\t" + mostrar[2] + "\t" + mostrar[3] + "\t")
correoActualizado = True
else:
correoActualizado = False
if correoActualizado == True:
archivo.escribirArchivo()
print "\n 'Correo actualizado'"
raw_input("\n 'Presione ENTER para continuar'\n")
menuAct = True
continue
if correoActualizado == False:
print "\n 'No se han guardado los cambios'"
raw_input("\n 'Presione ENTER para continuar'\n")
menuAct = True
continue
else:
menuAct = False
menuBus = True
archivo.escribirArchivo()
elif opcion.lower() == 'c':
archivo.limpiar()
mensajeEnviado = False
while mensajeEnviado == False:
print " -------------------------------------------------------------------------------"
print " ===========================| ENVIAR MENSAJE |==========================="
print " -------------------------------------------------------------------------------"
print
print " Para:", mostrar[0]
print " Mensaje: \n"
horaDeEnvio = time.strftime("%I:%M:%S")
fechaDeEnvio = time.strftime("%d/%m/%y")
mensaje = raw_input(" ")
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ===========================| ENVIAR MENSAJE |==========================="
print " -------------------------------------------------------------------------------"
print
print " Enviar mensaje?"
SN = raw_input("\n (s/n): ")
if SN.lower() == 's':
mensajeEnviado = True
else:
print "\n 'Mensaje no enviado'"
raw_input()
break
if mensajeEnviado == True:
archivo.agregarAlHistorial()
self.mensajes.append(mostrar[0] + '\t' + mensaje + "\t" + horaDeEnvio + '\t' + fechaDeEnvio)
archivo.escribirArchivoMensajes()
print "\n 'Mensaje enviado'"
raw_input()
else:
menuBus = False
campovacio = False
if noEncontrado == True:
print " 'El numero '%s' no se ha encontrado'" % (numero)
print "\n Desea buscar con otro numero?"
SN = raw_input(" (s/n): ")
if SN.lower() == 's':
break
else:
campovacio = False
break
def enviarSMS(self):
print " -------------------------------------------------------------------------------"
print " ===========================| ENVIAR MENSAJE |==========================="
print " -------------------------------------------------------------------------------"
print " | [A] SMS a un contacto | [B] SMS a un numero |"
print " -------------------------------------------------------------------------------"
opcion = raw_input("\n Opcion > ")
if opcion.lower() == 'b':
archivo.limpiar()
while True:
print " -------------------------------------------------------------------------------"
print " ===========================| ENVIAR MENSAJE |==========================="
print " -------------------------------------------------------------------------------"
numero = raw_input("\n Numero del contacto: ")
if not numero.isdigit():
print "\n 'Ingrese un numero de telefono valido'"
raw_input()
archivo.limpiar()
else:
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ===========================| ENVIAR MENSAJE |==========================="
print " -------------------------------------------------------------------------------"
print "\n Para:", numero
print " Mensaje: \n"
horaDeEnvio = time.strftime("%I:%M:%S")
fechaDeEnvio = time.strftime("%d/%m/%y")
mensaje = raw_input(" ")
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ===========================| ENVIAR MENSAJE |==========================="
print " -------------------------------------------------------------------------------"
print
print " Enviar mensaje?"
SN = raw_input("\n (s/n): ")
if SN.lower() == 's':
archivo.agregarAlHistorial()
self.mensajes.append(str(numero) + '\t' + mensaje + "\t" + horaDeEnvio + '\t' + fechaDeEnvio)
archivo.escribirArchivoMensajes()
print "\n 'Mensaje enviado'"
else:
print "\n 'Mensaje no enviado'"
archivo.limpiar()
break
if opcion.lower() == 'a':
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ===========================| ENVIAR MENSAJE |==========================="
print " -------------------------------------------------------------------------------"
nombre = raw_input("\n Nombre del contacto: ")
noEncontrado = True
for elemento in self.lista:
mostrar = elemento.split('\t')
if nombre.lower() == mostrar[0].lower():
noEncontrado = False
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ===========================| ENVIAR MENSAJE |==========================="
print " -------------------------------------------------------------------------------"
print
print " Para:", mostrar[0]
print " Mensaje: \n"
horaDeEnvio = time.strftime("%I:%M:%S")
fechaDeEnvio = time.strftime("%d/%m/%y")
mensaje = raw_input(" ")
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ===========================| ENVIAR MENSAJE |==========================="
print " -------------------------------------------------------------------------------"
print
print " Enviar mensaje?"
SN = raw_input("\n (s/n): ")
if SN.lower() == 's':
self.mensajes.append(mostrar[0] + '\t' + mensaje + "\t" + horaDeEnvio + '\t' + fechaDeEnvio)
archivo.agregarAlHistorial()
archivo.escribirArchivoMensajes()
print "\n 'Mensaje enviado'"
else:
print "\n 'Mensaje no enviado'"
if (noEncontrado == True):
print "\n 'El contacto '%s' no se ha encontrado en su agenda'" % (nombre)
print "\n Desea agregar este contacto a su agenda?"
SN = raw_input("\n (s/n): ")
if (SN.lower() == 's'):
archivo.crearContacto()
print"\t\t\t'Presione ENTER para continuar'"
raw_input()
archivo.limpiar()
archivo.enviarSMS()
def llamarAUnContacto(self):
print " -------------------------------------------------------------------------------"
print " =========================| LLAMAR A UN CONTACTO |=========================="
print " -------------------------------------------------------------------------------"
print " | [A] Contacto de la agenda | [B] Contacto Desconocido |"
print " -------------------------------------------------------------------------------"
opcion = raw_input("\n Opcion > ")
if opcion.lower() == 'a':
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " =========================| LLAMAR A UN CONTACTO |=========================="
print " -------------------------------------------------------------------------------"
nombre = raw_input("\n Nombre del contacto: ")
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " =========================| LLAMAR A UN CONTACTO |=========================="
print " -------------------------------------------------------------------------------"
print "\n Llamada para: " + nombre
print "\n 'Presione ENTER para iniciar la llamada' "
print
opcion = raw_input(" Opcion > ")
print "\n Realizando llamada..."
volverALlamar = True
while volverALlamar == True:
i = 0
duracion = 0
fechaDeLlamada = time.strftime("%d/%m/%y")
horaDeLlamada = time.strftime("%I:%M:%S")
while i <= 10:
time.sleep(1)
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " =========================| LLAMAR A UN CONTACTO |=========================="
print " -------------------------------------------------------------------------------"
print
print " LLAMANDO..."
print "\n 'Presione ENTER para finalizar la llamada' "
duracion += 1
i += 1
print " Duracion: " + str(duracion) + "s"
time.sleep(1)
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " =========================| LLAMAR A UN CONTACTO |=========================="
print " -------------------------------------------------------------------------------"
print
print
print "\n 'Presione ENTER para finalizar la llamada' "
duracion += 1
i += 1
print " Duracion: " + str(duracion) + "s"
finDeLallamada = time.strftime("%H:%M:%S")
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " =========================| LLAMAR A UN CONTACTO |=========================="
print " -------------------------------------------------------------------------------"
print "\n Duracion de la llamada: " + str(duracion) + "s"
print "\n 'Llamada finalizada'"
self.llamadas.append(nombre + "\t" + horaDeLlamada + '\t'+ finDeLallamada + '\t' + fechaDeLlamada)
archivo.agregarAlHistorialDellamadas()
archivo.escribirArchivoLlamadas()
print "\n Desea volver a llamar?"
SN = raw_input(" (s/n): ")
if SN.lower() == 's':
volverALlamar = True
else:
volverALlamar = False
if opcion.lower() == 'b':
archivo.limpiar()
validarNumero = True
while validarNumero == True:
print " -------------------------------------------------------------------------------"
print " =========================| LLAMAR A UN CONTACTO |=========================="
print " -------------------------------------------------------------------------------"
numero = raw_input("\n Numero del contacto: ")
if not numero.isdigit():
print "\n 'Ingrese un numero de telefono valido'"
raw_input()
validarNumero = True
archivo.limpiar()
else:
validarNumero = False
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " =========================| LLAMAR A UN CONTACTO |=========================="
print " -------------------------------------------------------------------------------"
print "\n Llamada para: " + numero
print "\n 'Presione ENTER para iniciar la llamada' "
print
opcion = raw_input(" Opcion > ")
print "\n Realizando llamada..."
volverALlamar = True
while volverALlamar == True:
i = 0
duracion = 0
fechaDeLlamada = time.strftime("%d/%m/%y")
horaDeLlamada = time.strftime("%H:%M:%S")
while i <= 10:
time.sleep(1)
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " =========================| LLAMAR A UN CONTACTO |=========================="
print " -------------------------------------------------------------------------------"
print
print " LLAMANDO..."
print "\n 'Presione ENTER para finalizar la llamada' "
duracion += 1
i += 1
print " Duracion: " + str(duracion) + "s"
time.sleep(1)
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " =========================| LLAMAR A UN CONTACTO |=========================="
print " -------------------------------------------------------------------------------"
print
print
print "\n 'Presione ENTER para finalizar la llamada' "
duracion += 1
i += 1
print " Duracion: " + str(duracion) + "s"
finDeLallamada = time.strftime("%H:%M:%S")
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " =========================| LLAMAR A UN CONTACTO |=========================="
print " -------------------------------------------------------------------------------"
print "\n Duracion de la llamada: " + str(duracion) + "s"
print "\n 'Llamada finalizada'"
self.llamadas.append(numero + "\t" + horaDeLlamada + '\t'+ finDeLallamada + '\t' + fechaDeLlamada)
archivo.agregarAlHistorialDellamadas()
archivo.escribirArchivoLlamadas()
print "\n Desea volver a llamar?"
SN = raw_input(" (s/n): ")
if SN.lower() == 's':
volverALlamar = True
else:
volverALlamar = False
archivo = archivo()
archivo.crearArchivo()
archivo.cargarArchivo()
archivo.crearArchivoMensajes()
archivo.cargarArchivoMensajes()
archivo.crearArchivoLlamadas()
archivo.cargarArchivoLlamadas()
menu = True
while menu == True:
archivo.escribirArchivo()
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ==============================| AGENDA |==============================="
print " -------------------------------------------------------------------------------"
print " | (1) AGREGAR CONTACTO | (2) MOSTRAR CONTACTOS |"
print " -------------------------------------------------------------------------------"
print " | (3) BUSCAR CONTACTO | (4) ELIMINAR CONTACTOS |"
print " -------------------------------------------------------------------------------"
print " | (5) ENVIAR SMS | (6) LLAMAR UN CONTACTO |"
print " -------------------------------------------------------------------------------"
print " | (7) HISTORIAL DE SMS | (8) HISTORIAL DE LLAMADAS |"
print " -------------------------------------------------------------------------------"
print " | | (0) SALIR | |"
print " -------------------------------------------------------------------------------"
try:
opcion = input("\n Opcion > ")
except:
print
print " 'Opcion invalida'"
print " Presione ENTER para continuar"
print
raw_input()
else:
if opcion == 0:
print "\n 'Ha salido de la agenda' \n"
break
elif opcion == 1:
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ============================| NUEVO CONTACTO |============================="
print " -------------------------------------------------------------------------------"
agregarOtro = True
while agregarOtro == True:
archivo.crearContacto()
print
print " (a) Agregar otro contacto"
print " (v) Volver al menu principal"
print " (s) Salir de la agenda"
elegir = raw_input("\n Opcion > ")
if elegir == 'v':
agregarOtro = False
menu = True
archivo.limpiar()
elif elegir == 's':
print "\n 'Ha salido de la agenda' \n"
agregarOtro = False
menu = False
elif elegir == 'a':
agregarOtro = True
menu = True
archivo.limpiar()
else:
break
elif opcion == 2:
archivo.limpiar()
print "\n | Nombre | Apellido | Telefono | Correo |"
print " -------------------------------------------------------------------------------"
archivo.mostrarContacto()
print
print " (v) Volver al menu principal"
print " (s) Salir de la agenda"
elegir = raw_input("\n Opcion > ")
if elegir == 'v':
menu = True
archivo.limpiar()
if elegir == 's':
print "\n 'Ha salido de la agenda' \n"
menu = False
elif opcion == 3:
elegir = True
while elegir == True:
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ==============================| BUSCAR |==============================="
print " -------------------------------------------------------------------------------"
print " | [A] Buscar por nombre | [B] Buscar por numero |"
print " -------------------------------------------------------------------------------"
print
print " (v) Volver al menu principal"
print " (s) Salir de la agenda"
elegir = raw_input("\n Opcion > ")
if elegir == 'a' or elegir == 'A':
archivo.buscarContactoNombre()
archivo.escribirArchivo()
print
print " (b) Volver al menu buscar"
print " (v) Volver al menu principal"
print " (s) Salir de la agenda"
elegir = raw_input("\n Opcion > ")
if elegir == 'v':
elegir = False
menu = True
archivo.limpiar()
elif elegir == 's':
print "\n 'Ha salido de la agenda' \n"
elegir = False
menu = False
elif elegir == 'b':
elegir = True
archivo.limpiar()
else:
break
if elegir == 'b' or elegir == 'B':
archivo.buscarContactoNumero()
archivo.escribirArchivo()
print
print " (b) Volver al menu buscar"
print " (v) Volver al menu principal"
print " (s) Salir de la agenda"
elegir = raw_input("\n Opcion > ")
if elegir == 'v':
elegir = False
menu = True
archivo.limpiar()
elif elegir == 's':
print "\n 'Ha salido de la agenda' \n"
elegir = False
menu = False
elif elegir == 'b':
elegir = True
archivo.limpiar()
else:
break
if elegir == 'v':
elegir = False
menu = True
archivo.limpiar()
if elegir == 's':
print "\n 'Ha salido de la agenda' \n"
elegir = False
menu = False
elif opcion == 4:
archivo.limpiar()
print " -------------------------------------------------------------------------------"
print " ====================| ELIMINAR TODOS LOS CONTACTOS |==================="
print " -------------------------------------------------------------------------------"
archivo.eliminarCONTACTOS()
print
print " (v) Volver al menu principal"
print " (s) Salir de la agenda"
elegir = raw_input("\n Opcion > ")
if elegir == 'v':
menu = True
archivo.limpiar()
if elegir == 's':
print "\n 'Ha salido de la agenda' \n"
menu = False
elif opcion == 5:
elegir = True
while elegir == True:
archivo.limpiar()
archivo.enviarSMS()
print
print " (e) Enviar otro mensaje"
print " (v) Volver al menu principal"
print " (s) Salir de la agenda"
elegir = raw_input("\n Opcion > ")
if elegir == 'v':
elegir = False
menu = True
archivo.limpiar()
if elegir == 's':
print "\n 'Ha salido de la agenda' \n"
elegir = False
menu = False
if elegir == 'e':
elegir = True
archivo.limpiar()
elif opcion == 6:
elegir = True
while elegir == True:
archivo.limpiar()
archivo.llamarAUnContacto()
print
print " (l) Llamar a otro contacto"
print " (v) Volver al menu principal"
print " (s) Salir de la agenda"
elegir = raw_input("\n Opcion > ")
if elegir == 'v':
elegir = False
menu = True
archivo.limpiar()
if elegir == 's':
print "\n 'Ha salido de la agenda' \n"
elegir = False
menu = False
if elegir == 'l':
elegir = True
archivo.limpiar()
elif opcion == 7:
archivo.limpiar()
elegir = True
archivo.historialMensajes()
while elegir == True:
print
print " (e) Eliminar historial de mensajes"
print " (v) Volver al menu principal"
print " (s) Salir de la agenda"
elegir = raw_input("\n Opcion > ")
if elegir == 'e':
archivo.limpiar()
archivo.eliminarMensajes()
elegir = True
if elegir == 'v':
elegir = False
menu = True
archivo.limpiar()
if elegir == 's':
print "\n 'Ha salido de la agenda' \n"
elegir = False
menu = False
elif opcion == 8:
archivo.limpiar()
elegir = True
while elegir == True:
print
print " (v) Volver al menu principal"
print " (s) Salir de la agenda"
elegir = raw_input("\n Opcion > ")
if elegir == 'v':
elegir = False
menu = True
archivo.limpiar()
if elegir == 's':
print "\n 'Ha salido de la agenda' \n"
elegir = False
menu = False
else:
print
print " 'Opcion invalida'"
print " Presione ENTER para continuar"
print
raw_input()
| 47.100894 | 152 | 0.293357 | 4,365 | 73,760 | 4.933104 | 0.051317 | 0.052013 | 0.02466 | 0.03901 | 0.836391 | 0.79139 | 0.773278 | 0.759996 | 0.74221 | 0.728045 | 0 | 0.004516 | 0.489669 | 73,760 | 1,565 | 153 | 47.13099 | 0.567531 | 0.002088 | 0 | 0.876723 | 0 | 0 | 0.300429 | 0.150099 | 0 | 0 | 0 | 0.000639 | 0 | 0 | null | null | 0 | 0.001622 | null | null | 0.326034 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
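The agenda program above is Python 2: it relies on `print` statements, `raw_input`, and the evaluating built-in `input()` for the numeric menu option, which is why a bare `try/except` guards that call. Under Python 3, where `input()` returns a plain string, the same guard is usually written around `int()` parsing. A minimal sketch, assuming a hypothetical helper name `leer_opcion` that is not part of the program:

```python
def leer_opcion(texto):
    # Parse a menu option typed by the user; return an int, or None when
    # the text is not a valid number (this replaces the broad try/except
    # that Python 2's evaluating input() made necessary).
    try:
        return int(texto.strip())
    except ValueError:
        return None


print(leer_opcion("5"))    # 5
print(leer_opcion("abc"))  # None
```

Returning `None` instead of raising lets the menu loop decide whether to reprint the "Opcion invalida" message, keeping input parsing separate from the menu's control flow.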
b949ca8cb8525e22d5b152eefec62e7cff748469 | 98 | py | Python | test/tests/is.py | kevinxucs/pyston | bdb87c1706ac74a0d15d9bc2bae53798678a5f14 | [
"Apache-2.0"
] | 1 | 2015-11-06T03:39:51.000Z | 2015-11-06T03:39:51.000Z | test/tests/is.py | kevinxucs/pyston | bdb87c1706ac74a0d15d9bc2bae53798678a5f14 | [
"Apache-2.0"
] | null | null | null | test/tests/is.py | kevinxucs/pyston | bdb87c1706ac74a0d15d9bc2bae53798678a5f14 | [
"Apache-2.0"
] | null | null | null | class C(object):
    pass

c = C()

print c is c
print c is C
print range is True
print c is None
| 9.8 | 19 | 0.653061 | 22 | 98 | 2.909091 | 0.409091 | 0.28125 | 0.375 | 0.28125 | 0.375 | 0.375 | 0 | 0 | 0 | 0 | 0 | 0 | 0.285714 | 98 | 9 | 20 | 10.888889 | 0.914286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.142857 | 0 | null | null | 0.571429 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 8 |
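The small pyston test above exercises the `is` operator, which compares object identity rather than value (so `c is C` and `range is True` print `False`). The contrast with `==` can be shown directly; a short sketch, independent of the test file:

```python
a = [1, 2]
b = [1, 2]

# == compares values; `is` compares object identity.
print(a == b)        # True: same contents
print(a is b)        # False: two distinct list objects
print(a is a)        # True: the very same object
print(None is None)  # True: None is a singleton
```

This is why `x is None` is the idiomatic null check, while `is` between freshly built containers is almost always a bug.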
b95d63987bc17d580cf6f32790a4f098cc93207b | 11,570 | py | Python | tests/test_expandvars.py | Anti-Distinctlyminty/expandvars | 811841b60552268e9b3c8cd4e51897cc6511b48f | [
"MIT"
] | 17 | 2019-05-10T08:23:19.000Z | 2022-03-03T10:56:15.000Z | tests/test_expandvars.py | Anti-Distinctlyminty/expandvars | 811841b60552268e9b3c8cd4e51897cc6511b48f | [
"MIT"
] | 50 | 2019-05-27T14:39:35.000Z | 2022-03-08T10:58:38.000Z | tests/test_expandvars.py | Anti-Distinctlyminty/expandvars | 811841b60552268e9b3c8cd4e51897cc6511b48f | [
"MIT"
] | 5 | 2019-05-27T20:18:40.000Z | 2021-06-28T08:22:33.000Z | # -*- coding: utf-8 -*-

import importlib
from os import environ as env, getpid
from unittest.mock import patch

import expandvars

import pytest


@patch.dict(env, {})
def test_expandvars_constant():
    importlib.reload(expandvars)

    assert expandvars.expandvars("FOO") == "FOO"
    assert expandvars.expandvars("$") == "$"
    assert expandvars.expandvars("BAR$") == "BAR$"


@patch.dict(env, {})
def test_expandvars_empty():
    importlib.reload(expandvars)

    assert expandvars.expandvars("") == ""
    assert expandvars.expandvars("$FOO") == ""


@patch.dict(env, {"FOO": "bar"})
def test_expandvars_simple():
    importlib.reload(expandvars)

    assert expandvars.expandvars("$FOO") == "bar"
    assert expandvars.expandvars("${FOO}") == "bar"


@patch.dict(env, {"FOO": "bar"})
def test_expandvars_from_file():
    importlib.reload(expandvars)

    with open("tests/data/foo.txt") as f:
        assert expandvars.expandvars(f) == "bar:bar"


@patch.dict(env, {"FOO": "bar", "BIZ": "buz"})
def test_expandvars_combo():
    importlib.reload(expandvars)

    assert expandvars.expandvars("${FOO}:$BIZ") == "bar:buz"
    assert expandvars.expandvars("$FOO$BIZ") == "barbuz"
    assert expandvars.expandvars("${FOO}$BIZ") == "barbuz"
    assert expandvars.expandvars("$FOO${BIZ}") == "barbuz"
    assert expandvars.expandvars("$FOO-$BIZ") == "bar-buz"
    assert expandvars.expandvars("boo$BIZ") == "boobuz"
    assert expandvars.expandvars("boo${BIZ}") == "boobuz"


@patch.dict(env, {})
def test_expandvars_pid():
    importlib.reload(expandvars)

    assert expandvars.expandvars("$$") == str(getpid())
    assert expandvars.expandvars("PID( $$ )") == "PID( {0} )".format(getpid())


@patch.dict(env, {"ALTERNATE": "Alternate"})
def test_expandvars_get_default():
    importlib.reload(expandvars)

    assert expandvars.expandvars("${FOO-default}") == "default"
    assert expandvars.expandvars("${FOO:-default}") == "default"
    assert expandvars.expandvars("${FOO:-}") == ""
    assert expandvars.expandvars("${FOO:-foo}:${FOO-bar}") == "foo:bar"
    assert expandvars.expandvars("${FOO:-$ALTERNATE}") == "Alternate"


@patch.dict(env, {})
def test_expandvars_update_default():
    importlib.reload(expandvars)

    assert expandvars.expandvars("${FOO:=}") == ""
    assert expandvars.expandvars("${FOO=}") == ""

    del env["FOO"]

    assert expandvars.expandvars("${FOO:=default}") == "default"
    assert expandvars.expandvars("${FOO=default}") == "default"
    assert env.get("FOO") == "default"
    assert expandvars.expandvars("${FOO:=ignoreme}") == "default"
    assert expandvars.expandvars("${FOO=ignoreme}:bar") == "default:bar"


@patch.dict(env, {"FOO": "bar", "BUZ": "bar"})
def test_expandvars_substitute():
    importlib.reload(expandvars)

    assert expandvars.expandvars("${FOO:+foo}") == "foo"
    assert expandvars.expandvars("${FOO+foo}") == "foo"
    assert expandvars.expandvars("${BAR:+foo}") == ""
    assert expandvars.expandvars("${BAR+foo}") == ""
    assert expandvars.expandvars("${BAR:+}") == ""
    assert expandvars.expandvars("${BAR+}") == ""
    assert expandvars.expandvars("${BUZ:+foo}") == "foo"
    assert expandvars.expandvars("${BUZ+foo}:bar") == "foo:bar"


@patch.dict(env, {"FOO": "damnbigfoobar"})
def test_offset():
    importlib.reload(expandvars)

    assert expandvars.expandvars("${FOO:3}") == "nbigfoobar"
    assert expandvars.expandvars("${FOO: 4 }") == "bigfoobar"
    assert expandvars.expandvars("${FOO:30}") == ""
    assert expandvars.expandvars("${FOO:0}") == "damnbigfoobar"
    assert expandvars.expandvars("${FOO:-3}:bar") == "damnbigfoobar:bar"


@patch.dict(env, {"FOO": "damnbigfoobar"})
def test_offset_length():
    importlib.reload(expandvars)

    assert expandvars.expandvars("${FOO:4:3}") == "big"
    assert expandvars.expandvars("${FOO: 7:6 }") == "foobar"
    assert expandvars.expandvars("${FOO:7: 100 }") == "foobar"
    assert expandvars.expandvars("${FOO:0:100}") == "damnbigfoobar"
    assert expandvars.expandvars("${FOO:70:10}") == ""
    assert expandvars.expandvars("${FOO:1:0}") == ""
    assert expandvars.expandvars("${FOO:0:}") == ""
    assert expandvars.expandvars("${FOO::}") == ""
    assert expandvars.expandvars("${FOO::5}") == "damnb"
    assert expandvars.expandvars("${FOO:-3:1}:bar") == "damnbigfoobar:bar"


@patch.dict(env, {"FOO": "X", "X": "foo"})
def test_expandvars_indirection():
    importlib.reload(expandvars)

    assert expandvars.expandvars("${!FOO}:${FOO}") == "foo:X"
    assert expandvars.expandvars("${!FOO-default}") == "foo"
    assert expandvars.expandvars("${!BAR-default}") == "default"
    assert expandvars.expandvars("${!X-default}") == "default"


@patch.dict(env, {"FOO": "foo", "BAR": "bar"})
def test_escape():
    importlib.reload(expandvars)

    assert expandvars.expandvars("\\$FOO\\$BAR") == "$FOO$BAR"
    assert expandvars.expandvars("\\\\$FOO") == "\\foo"
    assert expandvars.expandvars("$FOO\\$BAR") == "foo$BAR"
    assert expandvars.expandvars("\\$FOO$BAR") == "$FOObar"
    assert expandvars.expandvars("$FOO" "\\" "\\" "\\" "$BAR") == ("foo" "\\" "$BAR")
    assert expandvars.expandvars("$FOO\\$") == "foo$"
    assert expandvars.expandvars("$\\FOO") == "$\\FOO"
    assert expandvars.expandvars("$\\$FOO") == "$$FOO"
    assert expandvars.expandvars("\\$FOO") == "$FOO"
    assert (
        expandvars.expandvars("D:\\\\some\\windows\\path")
        == "D:\\\\some\\windows\\path"
    )


@patch.dict(env, {})
def test_corner_cases():
    importlib.reload(expandvars)

    assert expandvars.expandvars("${FOO:-{}}{}{}{}{{}}") == "{}{}{}{}{{}}"
    assert expandvars.expandvars("${FOO-{}}{}{}{}{{}}") == "{}{}{}{}{{}}"


@patch.dict(env, {})
def test_strict_parsing():
    importlib.reload(expandvars)

    with pytest.raises(
        expandvars.ExpandvarsException, match="FOO: parameter null or not set"
    ) as e:
        expandvars.expandvars("${FOO:?}")
    assert isinstance(e.value, expandvars.ParameterNullOrNotSet)

    with pytest.raises(
        expandvars.ExpandvarsException, match="FOO: parameter null or not set"
    ) as e:
        expandvars.expandvars("${FOO?}")
    assert isinstance(e.value, expandvars.ParameterNullOrNotSet)

    with pytest.raises(expandvars.ExpandvarsException, match="FOO: custom error") as e:
        expandvars.expandvars("${FOO:?custom error}")
    assert isinstance(e.value, expandvars.ParameterNullOrNotSet)

    with pytest.raises(expandvars.ExpandvarsException, match="FOO: custom error") as e:
        expandvars.expandvars("${FOO?custom error}")
    assert isinstance(e.value, expandvars.ParameterNullOrNotSet)

    env.update({"FOO": "foo"})

    assert expandvars.expandvars("${FOO:?custom err}") == "foo"
    assert expandvars.expandvars("${FOO?custom err}:bar") == "foo:bar"


@patch.dict(env, {"FOO": "foo"})
def test_missing_escapped_character():
    importlib.reload(expandvars)

    with pytest.raises(expandvars.ExpandvarsException) as e:
        expandvars.expandvars("$FOO\\")
    assert str(e.value) == "$FOO\\: missing escaped character"
    assert isinstance(e.value, expandvars.MissingExcapedChar)


@patch.dict(env, {"FOO": "damnbigfoobar"})
def test_invalid_length_err():
    importlib.reload(expandvars)

    with pytest.raises(
        expandvars.ExpandvarsException, match="FOO: -3: substring expression < 0"
    ) as e:
        expandvars.expandvars("${FOO:1:-3}")
    assert isinstance(e.value, expandvars.NegativeSubStringExpression)


@patch.dict(env, {"FOO": "damnbigfoobar"})
def test_bad_substitution_err():
    importlib.reload(expandvars)

    with pytest.raises(expandvars.ExpandvarsException) as e:
        expandvars.expandvars("${FOO:}") == ""
    assert str(e.value) == "${FOO:}: bad substitution"
    assert isinstance(e.value, expandvars.BadSubstitution)

    with pytest.raises(expandvars.ExpandvarsException) as e:
        expandvars.expandvars("${}") == ""
    assert str(e.value) == "${}: bad substitution"
    assert isinstance(e.value, expandvars.BadSubstitution)


@patch.dict(env, {"FOO": "damnbigfoobar"})
def test_brace_never_closed_err():
    importlib.reload(expandvars)

    with pytest.raises(expandvars.ExpandvarsException) as e:
        expandvars.expandvars("${FOO:")
    assert str(e.value) == "${FOO:: missing '}'"
    assert isinstance(e.value, expandvars.MissingClosingBrace)

    with pytest.raises(expandvars.ExpandvarsException) as e:
        expandvars.expandvars("${FOO}${BAR")
    assert str(e.value) == "${FOO}${BAR: missing '}'"
    assert isinstance(e.value, expandvars.MissingClosingBrace)

    with pytest.raises(expandvars.ExpandvarsException) as e:
        expandvars.expandvars("${FOO?")
    assert str(e.value) == "${FOO?: missing '}'"
    assert isinstance(e.value, expandvars.ExpandvarsException)

    with pytest.raises(expandvars.ExpandvarsException) as e:
        expandvars.expandvars("${FOO:1")
    assert str(e.value) == "${FOO:1: missing '}'"
    assert isinstance(e.value, expandvars.MissingClosingBrace)

    with pytest.raises(expandvars.ExpandvarsException) as e:
        expandvars.expandvars("${FOO:1:2")
    assert str(e.value) == "${FOO:1:2: missing '}'"
    assert isinstance(e.value, expandvars.MissingClosingBrace)

    with pytest.raises(expandvars.ExpandvarsException) as e:
        expandvars.expandvars("${FOO+")
    assert str(e.value) == "${FOO+: missing '}'"
    assert isinstance(e.value, expandvars.MissingClosingBrace)

    with pytest.raises(expandvars.ExpandvarsException) as e:
        expandvars.expandvars("${FOO-")
    assert str(e.value) == "${FOO-: missing '}'"
    assert isinstance(e.value, expandvars.MissingClosingBrace)


@patch.dict(env, {"FOO": "damnbigfoobar"})
def test_invalid_operand_err():
    importlib.reload(expandvars)

    oprnds = "@#$%^&*()_'\"\\"
    for o in oprnds:
        with pytest.raises(expandvars.ExpandvarsException) as e:
            expandvars.expandvars("${{FOO:{0}}}".format(o))
        assert str(e.value) == ("FOO: operand expected (error token is {0})").format(
            repr(o)
        )
        assert isinstance(e.value, expandvars.OperandExpected)

        with pytest.raises(expandvars.ExpandvarsException) as e:
            expandvars.expandvars("${{FOO:0:{0}}}".format(o))
        assert str(e.value) == ("FOO: operand expected (error token is {0})").format(
            repr(o)
        )
        assert isinstance(e.value, expandvars.OperandExpected)

        with pytest.raises(expandvars.ExpandvarsException) as e:
            expandvars.expandvars("${{FOO:{0}:{0}}}".format(o))
        assert str(e.value) == ("FOO: operand expected (error token is {0})").format(
            repr(o)
        )
        assert isinstance(e.value, expandvars.OperandExpected)


@pytest.mark.parametrize("var_symbol", ["%", "&", "£", "="])
def test_expand_var_symbol(var_symbol):
    assert (
        expandvars.expand(
            var_symbol + "{FOO}", environ={"FOO": "test"}, var_symbol=var_symbol
        )
        == "test"
    )
    assert (
        expandvars.expand(var_symbol + "FOO", environ={}, var_symbol=var_symbol) == ""
    )
    assert (
        expandvars.expand(
            var_symbol + "{FOO:-default_value}", environ={}, var_symbol=var_symbol
        )
        == "default_value"
    )
    with pytest.raises(expandvars.ParameterNullOrNotSet):
        expandvars.expand(var_symbol + "{FOO:?}", environ={}, var_symbol=var_symbol)
    assert (
        expandvars.expand(
            var_symbol + "{FOO},$HOME", environ={"FOO": "test"}, var_symbol=var_symbol
        )
        == "test,$HOME"
    )
| 34.849398 | 87 | 0.645117 | 1,227 | 11,570 | 6.026895 | 0.104319 | 0.235294 | 0.242596 | 0.203922 | 0.8595 | 0.773766 | 0.707235 | 0.638945 | 0.546721 | 0.504665 | 0 | 0.004979 | 0.166724 | 11,570 | 331 | 88 | 34.954683 | 0.761954 | 0.001815 | 0 | 0.348178 | 0 | 0 | 0.181519 | 0.006235 | 0 | 0 | 0 | 0 | 0.425101 | 1 | 0.08502 | false | 0 | 0.101215 | 0 | 0.186235 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
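The test suite above exercises expandvars' bash-style expansion forms such as `${FOO:-default}`. To illustrate just the `:-` (default value) semantics those tests assert — substitute the variable when it is set and non-empty, otherwise the fallback text — here is a minimal re-implementation sketch of that single form. It is not the library's actual parser, and `expand_default` is an assumed name:

```python
import re


def expand_default(text, environ):
    # Expand occurrences of ${NAME:-default}: use the variable's value
    # when it is set and non-empty, otherwise the default text.
    pattern = re.compile(r"\$\{(\w+):-([^}]*)\}")

    def repl(match):
        name, default = match.group(1), match.group(2)
        value = environ.get(name, "")
        return value if value else default

    return pattern.sub(repl, text)


print(expand_default("${FOO:-default}", {}))              # default
print(expand_default("${FOO:-default}", {"FOO": "bar"}))  # bar
```

Passing the environment as a parameter (rather than reading `os.environ` directly) mirrors the library's `expand(..., environ=...)` API shape and keeps the sketch trivially testable.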
b98fe9413a0531d7d50cd05201f3c71c4f759334 | 17,533 | py | Python | reports_module/tests/test_report_manager.py | ria-ee/monitor | d5cb9384abf38394b35e760729649136cbbc7548 | [
"MIT"
] | 10 | 2017-12-01T11:59:54.000Z | 2021-11-08T10:30:35.000Z | reports_module/tests/test_report_manager.py | ria-ee/monitor | d5cb9384abf38394b35e760729649136cbbc7548 | [
"MIT"
] | 16 | 2019-11-15T08:45:33.000Z | 2021-06-10T18:06:03.000Z | reports_module/tests/test_report_manager.py | ria-ee/monitor | d5cb9384abf38394b35e760729649136cbbc7548 | [
"MIT"
] | 13 | 2017-11-22T08:46:57.000Z | 2021-12-16T06:51:07.000Z | import unittest
from reports_module.reportslib.report_manager import ReportManager
from reports_module.reportslib.report_row import ReportRow
class TestReportManager(unittest.TestCase):
def test_is_producer_document(self):
report_manager = ReportManager(None, None, None, None, "2017-01-01", "2017-01-01", None, None, None, None, None,
None, None, None, None, None, None, None)
document = dict()
subsystem_code = "subsystem_code"
member_code = "member_code"
member_class = None
x_road_instance = ""
document["serviceSubsystemCode"] = subsystem_code
document["serviceMemberCode"] = member_code
document["serviceMemberClass"] = member_class
document["serviceXRoadInstance"] = x_road_instance
self.assertTrue(
report_manager.is_producer_document(document, subsystem_code, member_code, member_class, x_road_instance))
self.assertFalse(
report_manager.is_producer_document(document, subsystem_code, subsystem_code, member_class,
x_road_instance))
self.assertFalse(
report_manager.is_producer_document(document, subsystem_code, member_code, member_code, x_road_instance))
self.assertFalse(
report_manager.is_producer_document(document, subsystem_code, member_code, member_class, member_class))
self.assertFalse(
report_manager.is_producer_document(document, x_road_instance, member_code, member_class, x_road_instance))
def test_is_client_document(self):
report_manager = ReportManager(None, None, None, None, "2017-01-01", "2017-01-01", None, None, None, None, None,
None, None, None, None, None, None, None)
document = dict()
subsystem_code = "subsystem_code"
member_code = "member_code"
member_class = None
x_road_instance = ""
document["clientSubsystemCode"] = subsystem_code
document["clientMemberCode"] = member_code
document["clientMemberClass"] = member_class
document["clientXRoadInstance"] = x_road_instance
self.assertTrue(
report_manager.is_client_document(document, subsystem_code, member_code, member_class, x_road_instance))
self.assertFalse(
report_manager.is_client_document(document, subsystem_code, subsystem_code, member_class,
x_road_instance))
self.assertFalse(
report_manager.is_client_document(document, subsystem_code, member_code, member_code, x_road_instance))
self.assertFalse(
report_manager.is_client_document(document, subsystem_code, member_code, member_class, member_class))
self.assertFalse(
report_manager.is_client_document(document, x_road_instance, member_code, member_class, x_road_instance))
def test_reduce_to_plain_json(self):
report_manager = ReportManager(None, None, None, None, "2017-01-01", "2017-01-01", None, None, None, None, None,
None, None, None, None, None, None, None)
subsystem_code = "subsystem_code"
member_code = "member_code"
member_class = None
x_road_instance = ""
inner_document = dict()
inner_document["serviceSubsystemCode"] = subsystem_code
inner_document["serviceMemberCode"] = member_code
inner_document["serviceMemberClass"] = member_class
inner_document["serviceXRoadInstance"] = x_road_instance
inner_document["clientSubsystemCode"] = subsystem_code
inner_document["clientMemberCode"] = member_code
inner_document["clientMemberClass"] = member_class
inner_document["clientXRoadInstance"] = x_road_instance
document = dict()
document['client'] = inner_document.copy()
document['producer'] = inner_document.copy()
doc = report_manager.reduce_to_plain_json(document)
self.assertEqual(inner_document, doc)
document = dict()
document['client'] = None
document['producer'] = inner_document.copy()
doc = report_manager.reduce_to_plain_json(document)
for key in list(inner_document.keys()):
self.assertTrue(doc[key] == inner_document[key])
document = dict()
document['client'] = inner_document.copy()
document['producer'] = None
doc = report_manager.reduce_to_plain_json(document)
for key in list(inner_document.keys()):
self.assertTrue(doc[key] == inner_document[key])
document = dict()
document['client'] = None
document['producer'] = None
with self.assertRaises(TypeError):
report_manager.reduce_to_plain_json(document)
def test_get_service_type(self):
report_manager_producer = ReportManager(
"", None, "member_code", "subsystem_code", "2017-01-01", "2017-01-01", None, None, None, None, [], None,
None, None, None, None, None, None)
report_manager_producer_meta = ReportManager(
"", None, "member_code", "subsystem_code", "2017-01-01", "2017-01-01", None, None, None, None, ["meta"],
None, None, None, None, None, None, None)
subsystem_code = "subsystem_code"
member_code = "member_code"
member_class = None
x_road_instance = ""
inner_document = dict()
inner_document["serviceSubsystemCode"] = subsystem_code
inner_document["serviceMemberCode"] = member_code
inner_document["serviceMemberClass"] = member_class
inner_document["serviceXRoadInstance"] = x_road_instance
inner_document["clientSubsystemCode"] = "not_producer"
inner_document["clientMemberCode"] = member_code
inner_document["clientMemberClass"] = member_class
inner_document["clientXRoadInstance"] = x_road_instance
inner_document["serviceCode"] = "meta"
service_type = report_manager_producer.get_service_type(inner_document)
self.assertEqual(service_type, "ps")
service_type = report_manager_producer_meta.get_service_type(inner_document)
self.assertEqual(service_type, "pms")
inner_document["serviceMemberCode"] = "not_client"
inner_document["clientSubsystemCode"] = subsystem_code
service_type = report_manager_producer.get_service_type(inner_document)
self.assertEqual(service_type, "cs")
service_type = report_manager_producer_meta.get_service_type(inner_document)
self.assertEqual(service_type, "cms")
def test_merge_document_fields(self):
report_manager = ReportManager(None, None, None, None, "2017-01-01", "2017-01-01", None, None, None, None, None,
None, None, None, None, None, None, None)
subsystem_code = "subsystem_code"
member_code = "member_code"
member_class = None
x_road_instance = ""
service_version = "1.0"
inner_document = dict()
inner_document["serviceSubsystemCode"] = subsystem_code
inner_document["serviceMemberCode"] = member_code
inner_document["serviceMemberClass"] = member_class
inner_document["serviceXRoadInstance"] = x_road_instance
inner_document["serviceVersion"] = service_version
merged_field = report_manager.merge_document_fields(
inner_document, ["serviceSubsystemCode", "serviceMemberCode", "serviceMemberClass", "serviceXRoadInstance",
"serviceVersion"], "new_field", ".")
self.assertEqual(merged_field, "subsystem_code.member_code.1.0")
def test_get_min_mean_max(self):
report_manager = ReportManager(None, None, None, None, "2017-01-01", "2017-01-01", None, None, None, None, None,
None, None, None, None, None, None, None)
self.assertEqual(report_manager.get_min_mean_max(5, (11, 2), 5), "5 / 6 / 5")
self.assertEqual(report_manager.get_min_mean_max(-5, (-11, 2), -5), "-5 / -6 / -5")
self.assertEqual(report_manager.get_min_mean_max(5.44235, (11.12412, 3.1253), 5.415123), "5 / 4 / 5")
self.assertEqual(report_manager.get_min_mean_max(5.6, (11, 4), 5.6), "6 / 3 / 6")
self.assertEqual(report_manager.get_min_mean_max(None, (11, 2), 5), "None / 6 / 5")
self.assertEqual(report_manager.get_min_mean_max(None, (None, None), None), "None / None / None")
    def test_get_name_and_count(self):
        report_manager = ReportManager(None, None, None, "ss", "2017-01-01", "2017-01-01", None, None, None, None,
                                       None, None, None, None, None, None, None, None)
        key = "ee-dev"
        report_row = ReportRow(None)
        suc_queries = 10
        report_row.succeeded_queries = suc_queries
        produced_service = True
        service_name = "service"
        self.assertEqual(report_manager.get_name_and_count(key, report_row, produced_service, service_name),
                         ("ss: ee-dev", suc_queries))
        produced_service = False
        self.assertEqual(report_manager.get_name_and_count(key, report_row, produced_service, service_name),
                         ("ee-dev: service", suc_queries))

    def test_get_succeeded_top(self):
        report_manager = ReportManager(None, None, None, "ss", "2017-01-01", "2017-01-01", None, None, None, None,
                                       None, None, None, None, None, None, None, None)
        report_row = ReportRow(None)
        report_row.succeeded_queries = 10
        report_row_2 = ReportRow(None)
        report_row_2.succeeded_queries = 15
        report_row_3 = ReportRow(None)
        report_row_3.succeeded_queries = 7
        report_row_4 = ReportRow(None)
        report_row_4.succeeded_queries = 12
        report_row_5 = ReportRow(None)
        report_row_5.succeeded_queries = 13
        report_row_6 = ReportRow(None)
        report_row_6.succeeded_queries = 8
        report_row_7 = ReportRow(None)
        report_row_7.succeeded_queries = 11
        report_row_8 = ReportRow(None)
        report_row_8.succeeded_queries = 14
        report_row_9 = ReportRow(None)
        report_row_9.succeeded_queries = 9
        test_data = {
            "service_1": {
                "ee-dev": report_row,
                "ee-dev_2": report_row_2,
                "ee-dev_3": report_row_3
            },
            "service_2": {
                "ee-dev_4": report_row_7,
                "ee-dev_5": report_row_8,
                "ee-dev_6": report_row_9
            },
            "service_3": {
                "ee-dev_7": report_row_4,
                "ee-dev_8": report_row_5,
                "ee-dev_9": report_row_6
            },
        }
        self.assertEqual(report_manager.get_succeeded_top(test_data, True),
                         [("ss: service_3", 13), ("ss: service_2", 14), ("ss: service_1", 15)])
        self.assertEqual(report_manager.get_succeeded_top(test_data, False),
                         [("service_2: ee-dev_4", 11), ("service_3: ee-dev_7", 12), ("service_3: ee-dev_8", 13),
                          ("service_2: ee-dev_5", 14), ("service_1: ee-dev_2", 15)])

    def test_get_name_and_average(self):
        report_manager = ReportManager(None, None, None, "ss", "2017-01-01", "2017-01-01", None, None, None, None,
                                       None, None, None, None, None, None, None, None)
        key = "ee-dev"
        report_row = ReportRow(None)
        report_row.duration_avg = (5123.123, 15)
        produced_service = True
        service_name = "service"
        self.assertEqual(report_manager.get_name_and_average(key, report_row, produced_service, service_name),
                         ("ss: ee-dev", round(report_row.duration_avg[0] / (report_row.duration_avg[1]))))
        produced_service = False
        self.assertEqual(report_manager.get_name_and_average(key, report_row, produced_service, service_name),
                         ("ee-dev: service", round(report_row.duration_avg[0] / (report_row.duration_avg[1]))))
        report_row.duration_avg = (None, None)
        self.assertEqual(report_manager.get_name_and_average(key, report_row, produced_service, service_name),
                         ("ee-dev: service", None))

    def test_get_duration_top(self):
        report_manager = ReportManager(None, None, None, "ss", "2017-01-01", "2017-01-01", None, None, None, None,
                                       None, None, None, None, None, None, None, None)
        report_row = ReportRow(None)
        report_row.duration_avg = (100, 10)
        report_row_2 = ReportRow(None)
        report_row_2.duration_avg = (1000, 10)
        report_row_3 = ReportRow(None)
        report_row_3.duration_avg = (200, 10)
        report_row_4 = ReportRow(None)
        report_row_4.duration_avg = (109, 10)
        report_row_5 = ReportRow(None)
        report_row_5.duration_avg = (1100, 10)
        report_row_6 = ReportRow(None)
        report_row_6.duration_avg = (209, 10)
        report_row_7 = ReportRow(None)
        report_row_7.duration_avg = (90, 10)
        report_row_8 = ReportRow(None)
        report_row_8.duration_avg = (900, 10)
        report_row_9 = ReportRow(None)
        report_row_9.duration_avg = (190, 10)
        test_data = {
            "service_1": {
                "ee-dev": report_row,
                "ee-dev_2": report_row_2,
                "ee-dev_3": report_row_3
            },
            "service_2": {
                "ee-dev_4": report_row_7,
                "ee-dev_5": report_row_8,
                "ee-dev_6": report_row_9
            },
            "service_3": {
                "ee-dev_7": report_row_4,
                "ee-dev_8": report_row_5,
                "ee-dev_9": report_row_6
            },
        }
        self.assertEqual(report_manager.get_duration_top(test_data, True),
                         [("ss: service_2", 90), ("ss: service_1", 100), ("ss: service_3", 110)])
        self.assertEqual(report_manager.get_duration_top(test_data, False),
                         [("service_1: ee-dev_3", 20), ("service_3: ee-dev_9", 21), ("service_2: ee-dev_5", 90),
                          ("service_1: ee-dev_2", 100), ("service_3: ee-dev_8", 110)])

    def test_get_member_name(self):
        report_manager = ReportManager(None, None, None, None, "2017-01-01", "2017-01-01", None, None, None, None,
                                       None, None, None, None, None, None, None, None)
        member_code = "MemberCodeA"
        subsystem_code = "SubsystemCodeA"
        member_class = "MemberClassA"
        x_road_instance = "CI-REPORTS"
        member_name_dict = [
            {
                "x_road_instance": "CI-REPORTS",
                "subsystem_name": {
                    "et": "Subsystem Name ET",
                    "en": "Subsystem Name EN"
                },
                "member_class": "MemberClassA",
                "email": [],
                "subsystem_code": "SubsystemCodeA",
                "member_code": "MemberCodeA",
                "member_name": "Member Name"
            }
        ]
        member_name = report_manager.get_member_name(member_code, subsystem_code, member_class, x_road_instance,
                                                     member_name_dict)
        self.assertEqual(member_name, "Member Name")
        member_name_dict = None
        member_name = report_manager.get_member_name(member_code, subsystem_code, member_class, x_road_instance,
                                                     member_name_dict)
        self.assertEqual(member_name, "")

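    # The two assertions above suggest a lookup that matches on all four
    # identifiers and falls back to an empty string when no member list is
    # supplied. A hypothetical sketch, not the real ReportManager code:

```python
def get_member_name(member_code, subsystem_code, member_class, x_road_instance, members):
    # `members` may be None; treat that the same as an empty list.
    for entry in members or []:
        if (entry["member_code"] == member_code
                and entry["subsystem_code"] == subsystem_code
                and entry["member_class"] == member_class
                and entry["x_road_instance"] == x_road_instance):
            return entry["member_name"]
    return ""


members = [{
    "x_road_instance": "CI-REPORTS",
    "member_class": "MemberClassA",
    "subsystem_code": "SubsystemCodeA",
    "member_code": "MemberCodeA",
    "member_name": "Member Name",
}]
print(get_member_name("MemberCodeA", "SubsystemCodeA", "MemberClassA", "CI-REPORTS", members))  # Member Name
```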
    def test_get_subsystem_name(self):
        report_manager = ReportManager(None, None, None, None, "2017-01-01", "2017-01-01", None, None, None, None,
                                       None, None, None, None, None, None, None, None)
        member_code = "MemberCodeA"
        subsystem_code = "SubsystemCodeA"
        member_class = "MemberClassA"
        x_road_instance = "CI-REPORTS"
        language = "et"
        member_name_dict = [
            {
                "x_road_instance": "CI-REPORTS",
                "subsystem_name": {
                    "et": "Subsystem Name ET",
                    "en": "Subsystem Name EN"
                },
                "member_class": "MemberClassA",
                "email": [],
                "subsystem_code": "SubsystemCodeA",
                "member_code": "MemberCodeA",
                "member_name": "Member Name"
            }
        ]
        member_name = report_manager.get_subsystem_name(member_code, subsystem_code, member_class, x_road_instance,
                                                        language, member_name_dict)
        self.assertEqual(member_name, "Subsystem Name ET")
        language = "en"
        member_name = report_manager.get_subsystem_name(member_code, subsystem_code, member_class, x_road_instance,
                                                        language, member_name_dict)
        self.assertEqual(member_name, "Subsystem Name EN")
        member_name_dict = None
        member_name = report_manager.get_subsystem_name(member_code, subsystem_code, member_class, x_road_instance,
                                                        language, member_name_dict)
        self.assertEqual(member_name, "")

# Standards/OhioStateStandards.py (repo: qwoodmansee/LessonPlannerGenerator, license: MIT)
from LessonPlanComponents.Grades import Grades
from LessonPlanComponents.Standard import Standard


class OhioStateStandards:
    # Ohio's elementary music standards are identical for grades one through
    # five, so the grade map below shares a single definition instead of
    # repeating it verbatim per grade.
    _elementary_standards = {
        "creating": [
            Standard("1CE", "Sing a varied repertoire with accurate rhythm and pitch and expressive qualities individually and with others.", ""),
            Standard("2CE", "Describe the way sound is produced by various instruments and the human voice.", ""),
            Standard("3CE", "Listen, identify and respond to music of different composers and world cultures.", ""),
            Standard("4CE", "Discuss the lives and times of composers from various historical periods.", ""),
            Standard("5CE", "Identify and respond to basic music forms (e.g., AABA and rondo).", ""),
            Standard("6CE", "Identify elements of music using developmentally appropriate vocabulary.", ""),
            Standard("7CE", "Describe the roles of musicians in various music settings.", ""),
            Standard("8CE", "Describe the use of technology and digital tools in music.", "")
        ],
        "performing": [
            Standard("1PR", "Sing a varied repertoire with accurate rhythm and pitch and expressive qualities individually and with others.", ""),
            Standard("2PR", "Use the head voice to produce a light, clear sound employing breath support and maintaining appropriate posture.", ""),
            Standard("3PR", "Play a variety of classroom instruments with proper technique.", ""),
            Standard("4PR", "Sing, move and respond to music from world cultures and different composers.", ""),
            Standard("5PR", "Improvise and compose short compositions using a variety of classroom instruments and sound sources.", ""),
            Standard("6PR", "Read, write and perform using sixteenth through whole note values including syncopated rhythms in 2/4, 3/4 and 4/4 meter.", ""),
            Standard("7PR", "Read, write and perform in treble clef extended pentatonic melodies G, F and C.", ""),
            Standard("8PR", "Demonstrate appropriate audience etiquette at live performances.", "")
        ],
        "responding": [
            Standard("1RE", "Explain how the elements and subject matter of music connect with disciplines outside the arts.", ""),
            Standard("2RE", "Describe the connection between emotion and music in selected musical works.", ""),
            Standard("3RE", "Explain classification of musical instruments, voices, composers and forms using appropriate music vocabulary.", ""),
            Standard("4RE", "Discuss the roles of musicians heard in various performance settings.", ""),
            Standard("5RE", "Interpret a selected musical work using dance, drama or visual art.", ""),
            Standard("6RE", "Use constructive feedback to improve and refine musical performance and response.", "")
        ]
    }

    standards = {
        Grades.FIRST: _elementary_standards,
        Grades.SECOND: _elementary_standards,
        Grades.THIRD: _elementary_standards,
        Grades.FOURTH: _elementary_standards,
        Grades.FIFTH: _elementary_standards,
    }

    def __init__(self):
        print("initialized State of Ohio Music Education Standards")
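# A sketch of how the nested `standards` mapping is meant to be consumed,
# using minimal stand-ins for the Grades enum and Standard class (the real
# ones live in LessonPlanComponents and may differ):

```python
from collections import namedtuple
from enum import Enum

# Hypothetical stand-ins, only to make the example self-contained.
Grades = Enum("Grades", ["FIRST", "SECOND", "THIRD", "FOURTH", "FIFTH"])
Standard = namedtuple("Standard", ["code", "description", "notes"])

standards = {
    Grades.FIRST: {
        "creating": [
            Standard("1CE", "Sing a varied repertoire ...", ""),
            Standard("2CE", "Describe the way sound is produced ...", ""),
        ],
    },
}

# Pull the standard codes for one grade and one strand.
codes = [s.code for s in standards[Grades.FIRST]["creating"]]
print(codes)  # ['1CE', '2CE']
```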
# coding=utf-8
# sdk/python/pulumi_oci/loadbalancer/listener.py (repo: EladGabay/pulumi-oci, licenses: ECL-2.0, Apache-2.0)
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['ListenerArgs', 'Listener']
@pulumi.input_type
class ListenerArgs:
    def __init__(__self__, *,
                 default_backend_set_name: pulumi.Input[str],
                 load_balancer_id: pulumi.Input[str],
                 port: pulumi.Input[int],
                 protocol: pulumi.Input[str],
                 connection_configuration: Optional[pulumi.Input['ListenerConnectionConfigurationArgs']] = None,
                 hostname_names: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
                 name: Optional[pulumi.Input[str]] = None,
                 path_route_set_name: Optional[pulumi.Input[str]] = None,
                 routing_policy_name: Optional[pulumi.Input[str]] = None,
                 rule_set_names: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
                 ssl_configuration: Optional[pulumi.Input['ListenerSslConfigurationArgs']] = None):
        """
        The set of arguments for constructing a Listener resource.
        :param pulumi.Input[str] default_backend_set_name: (Updatable) The name of the associated backend set.  Example: `example_backend_set`
        :param pulumi.Input[str] load_balancer_id: The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the load balancer on which to add a listener.
        :param pulumi.Input[int] port: (Updatable) The communication port for the listener.  Example: `80`
        :param pulumi.Input[str] protocol: (Updatable) The protocol on which the listener accepts connection requests. To get a list of valid protocols, use the [ListProtocols](https://docs.cloud.oracle.com/iaas/api/#/en/loadbalancer/20170115/LoadBalancerProtocol/ListProtocols) operation.  Example: `HTTP`
        :param pulumi.Input['ListenerConnectionConfigurationArgs'] connection_configuration: (Updatable) Configuration details for the connection between the client and backend servers.
        :param pulumi.Input[Sequence[pulumi.Input[str]]] hostname_names: (Updatable) An array of hostname resource names.
        :param pulumi.Input[str] name: A friendly name for the listener. It must be unique and it cannot be changed. Avoid entering confidential information.  Example: `example_listener`
        :param pulumi.Input[str] path_route_set_name: (Updatable) Deprecated. Please use `routingPolicies` instead.
        :param pulumi.Input[str] routing_policy_name: (Updatable) The name of the routing policy applied to this listener's traffic.  Example: `example_routing_policy`
        :param pulumi.Input[Sequence[pulumi.Input[str]]] rule_set_names: (Updatable) The names of the [rule sets](https://docs.cloud.oracle.com/iaas/api/#/en/loadbalancer/20170115/RuleSet/) to apply to the listener.  Example: ["example_rule_set"]
        :param pulumi.Input['ListenerSslConfigurationArgs'] ssl_configuration: (Updatable) The load balancer's SSL handling configuration details.
        """
        pulumi.set(__self__, "default_backend_set_name", default_backend_set_name)
        pulumi.set(__self__, "load_balancer_id", load_balancer_id)
        pulumi.set(__self__, "port", port)
        pulumi.set(__self__, "protocol", protocol)
        if connection_configuration is not None:
            pulumi.set(__self__, "connection_configuration", connection_configuration)
        if hostname_names is not None:
            pulumi.set(__self__, "hostname_names", hostname_names)
        if name is not None:
            pulumi.set(__self__, "name", name)
        if path_route_set_name is not None:
            pulumi.set(__self__, "path_route_set_name", path_route_set_name)
        if routing_policy_name is not None:
            pulumi.set(__self__, "routing_policy_name", routing_policy_name)
        if rule_set_names is not None:
            pulumi.set(__self__, "rule_set_names", rule_set_names)
        if ssl_configuration is not None:
            pulumi.set(__self__, "ssl_configuration", ssl_configuration)
    @property
    @pulumi.getter(name="defaultBackendSetName")
    def default_backend_set_name(self) -> pulumi.Input[str]:
        """
        (Updatable) The name of the associated backend set.  Example: `example_backend_set`
        """
        return pulumi.get(self, "default_backend_set_name")

    @default_backend_set_name.setter
    def default_backend_set_name(self, value: pulumi.Input[str]):
        pulumi.set(self, "default_backend_set_name", value)

    @property
    @pulumi.getter(name="loadBalancerId")
    def load_balancer_id(self) -> pulumi.Input[str]:
        """
        The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the load balancer on which to add a listener.
        """
        return pulumi.get(self, "load_balancer_id")

    @load_balancer_id.setter
    def load_balancer_id(self, value: pulumi.Input[str]):
        pulumi.set(self, "load_balancer_id", value)

    @property
    @pulumi.getter
    def port(self) -> pulumi.Input[int]:
        """
        (Updatable) The communication port for the listener.  Example: `80`
        """
        return pulumi.get(self, "port")

    @port.setter
    def port(self, value: pulumi.Input[int]):
        pulumi.set(self, "port", value)

    @property
    @pulumi.getter
    def protocol(self) -> pulumi.Input[str]:
        """
        (Updatable) The protocol on which the listener accepts connection requests. To get a list of valid protocols, use the [ListProtocols](https://docs.cloud.oracle.com/iaas/api/#/en/loadbalancer/20170115/LoadBalancerProtocol/ListProtocols) operation.  Example: `HTTP`
        """
        return pulumi.get(self, "protocol")

    @protocol.setter
    def protocol(self, value: pulumi.Input[str]):
        pulumi.set(self, "protocol", value)

    @property
    @pulumi.getter(name="connectionConfiguration")
    def connection_configuration(self) -> Optional[pulumi.Input['ListenerConnectionConfigurationArgs']]:
        """
        (Updatable) Configuration details for the connection between the client and backend servers.
        """
        return pulumi.get(self, "connection_configuration")

    @connection_configuration.setter
    def connection_configuration(self, value: Optional[pulumi.Input['ListenerConnectionConfigurationArgs']]):
        pulumi.set(self, "connection_configuration", value)

    @property
    @pulumi.getter(name="hostnameNames")
    def hostname_names(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
        """
        (Updatable) An array of hostname resource names.
        """
        return pulumi.get(self, "hostname_names")

    @hostname_names.setter
    def hostname_names(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
        pulumi.set(self, "hostname_names", value)

    @property
    @pulumi.getter
    def name(self) -> Optional[pulumi.Input[str]]:
        """
        A friendly name for the listener. It must be unique and it cannot be changed. Avoid entering confidential information.  Example: `example_listener`
        """
        return pulumi.get(self, "name")

    @name.setter
    def name(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "name", value)

    @property
    @pulumi.getter(name="pathRouteSetName")
    def path_route_set_name(self) -> Optional[pulumi.Input[str]]:
        """
        (Updatable) Deprecated. Please use `routingPolicies` instead.
        """
        return pulumi.get(self, "path_route_set_name")

    @path_route_set_name.setter
    def path_route_set_name(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "path_route_set_name", value)

    @property
    @pulumi.getter(name="routingPolicyName")
    def routing_policy_name(self) -> Optional[pulumi.Input[str]]:
        """
        (Updatable) The name of the routing policy applied to this listener's traffic.  Example: `example_routing_policy`
        """
        return pulumi.get(self, "routing_policy_name")

    @routing_policy_name.setter
    def routing_policy_name(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "routing_policy_name", value)

    @property
    @pulumi.getter(name="ruleSetNames")
    def rule_set_names(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
        """
        (Updatable) The names of the [rule sets](https://docs.cloud.oracle.com/iaas/api/#/en/loadbalancer/20170115/RuleSet/) to apply to the listener.  Example: ["example_rule_set"]
        """
        return pulumi.get(self, "rule_set_names")

    @rule_set_names.setter
    def rule_set_names(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
        pulumi.set(self, "rule_set_names", value)

    @property
    @pulumi.getter(name="sslConfiguration")
    def ssl_configuration(self) -> Optional[pulumi.Input['ListenerSslConfigurationArgs']]:
        """
        (Updatable) The load balancer's SSL handling configuration details.
        """
        return pulumi.get(self, "ssl_configuration")

    @ssl_configuration.setter
    def ssl_configuration(self, value: Optional[pulumi.Input['ListenerSslConfigurationArgs']]):
        pulumi.set(self, "ssl_configuration", value)

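# A sketch of how these arguments are typically supplied when declaring the
# resource in a Pulumi program. The load balancer and backend set references
# below are placeholders for resources assumed to be defined elsewhere in the
# stack, so this is a configuration fragment rather than a runnable script:

```python
import pulumi_oci as oci

# `example_load_balancer` and `example_backend_set` are hypothetical
# resources declared earlier in the same Pulumi program.
example_listener = oci.loadbalancer.Listener("example_listener",
    default_backend_set_name=example_backend_set.name,
    load_balancer_id=example_load_balancer.id,
    port=80,
    protocol="HTTP",
    name="example_listener")
```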
@pulumi.input_type
class _ListenerState:
    def __init__(__self__, *,
                 connection_configuration: Optional[pulumi.Input['ListenerConnectionConfigurationArgs']] = None,
                 default_backend_set_name: Optional[pulumi.Input[str]] = None,
                 hostname_names: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
                 load_balancer_id: Optional[pulumi.Input[str]] = None,
                 name: Optional[pulumi.Input[str]] = None,
                 path_route_set_name: Optional[pulumi.Input[str]] = None,
                 port: Optional[pulumi.Input[int]] = None,
                 protocol: Optional[pulumi.Input[str]] = None,
                 routing_policy_name: Optional[pulumi.Input[str]] = None,
                 rule_set_names: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
                 ssl_configuration: Optional[pulumi.Input['ListenerSslConfigurationArgs']] = None,
                 state: Optional[pulumi.Input[str]] = None):
        """
        Input properties used for looking up and filtering Listener resources.
        :param pulumi.Input['ListenerConnectionConfigurationArgs'] connection_configuration: (Updatable) Configuration details for the connection between the client and backend servers.
        :param pulumi.Input[str] default_backend_set_name: (Updatable) The name of the associated backend set.  Example: `example_backend_set`
        :param pulumi.Input[Sequence[pulumi.Input[str]]] hostname_names: (Updatable) An array of hostname resource names.
        :param pulumi.Input[str] load_balancer_id: The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the load balancer on which to add a listener.
        :param pulumi.Input[str] name: A friendly name for the listener. It must be unique and it cannot be changed. Avoid entering confidential information.  Example: `example_listener`
        :param pulumi.Input[str] path_route_set_name: (Updatable) Deprecated. Please use `routingPolicies` instead.
        :param pulumi.Input[int] port: (Updatable) The communication port for the listener.  Example: `80`
        :param pulumi.Input[str] protocol: (Updatable) The protocol on which the listener accepts connection requests. To get a list of valid protocols, use the [ListProtocols](https://docs.cloud.oracle.com/iaas/api/#/en/loadbalancer/20170115/LoadBalancerProtocol/ListProtocols) operation.  Example: `HTTP`
        :param pulumi.Input[str] routing_policy_name: (Updatable) The name of the routing policy applied to this listener's traffic.  Example: `example_routing_policy`
        :param pulumi.Input[Sequence[pulumi.Input[str]]] rule_set_names: (Updatable) The names of the [rule sets](https://docs.cloud.oracle.com/iaas/api/#/en/loadbalancer/20170115/RuleSet/) to apply to the listener.  Example: ["example_rule_set"]
        :param pulumi.Input['ListenerSslConfigurationArgs'] ssl_configuration: (Updatable) The load balancer's SSL handling configuration details.
        """
        if connection_configuration is not None:
            pulumi.set(__self__, "connection_configuration", connection_configuration)
        if default_backend_set_name is not None:
            pulumi.set(__self__, "default_backend_set_name", default_backend_set_name)
        if hostname_names is not None:
            pulumi.set(__self__, "hostname_names", hostname_names)
        if load_balancer_id is not None:
            pulumi.set(__self__, "load_balancer_id", load_balancer_id)
        if name is not None:
            pulumi.set(__self__, "name", name)
        if path_route_set_name is not None:
            pulumi.set(__self__, "path_route_set_name", path_route_set_name)
        if port is not None:
            pulumi.set(__self__, "port", port)
        if protocol is not None:
            pulumi.set(__self__, "protocol", protocol)
        if routing_policy_name is not None:
            pulumi.set(__self__, "routing_policy_name", routing_policy_name)
        if rule_set_names is not None:
            pulumi.set(__self__, "rule_set_names", rule_set_names)
        if ssl_configuration is not None:
            pulumi.set(__self__, "ssl_configuration", ssl_configuration)
        if state is not None:
            pulumi.set(__self__, "state", state)
@property
@pulumi.getter(name="connectionConfiguration")
def connection_configuration(self) -> Optional[pulumi.Input['ListenerConnectionConfigurationArgs']]:
"""
(Updatable) Configuration details for the connection between the client and backend servers.
"""
return pulumi.get(self, "connection_configuration")
@connection_configuration.setter
def connection_configuration(self, value: Optional[pulumi.Input['ListenerConnectionConfigurationArgs']]):
pulumi.set(self, "connection_configuration", value)
@property
@pulumi.getter(name="defaultBackendSetName")
def default_backend_set_name(self) -> Optional[pulumi.Input[str]]:
"""
(Updatable) The name of the associated backend set. Example: `example_backend_set`
"""
return pulumi.get(self, "default_backend_set_name")
@default_backend_set_name.setter
def default_backend_set_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "default_backend_set_name", value)
@property
@pulumi.getter(name="hostnameNames")
def hostname_names(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
(Updatable) An array of hostname resource names.
"""
return pulumi.get(self, "hostname_names")
@hostname_names.setter
def hostname_names(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "hostname_names", value)
@property
@pulumi.getter(name="loadBalancerId")
def load_balancer_id(self) -> Optional[pulumi.Input[str]]:
"""
The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the load balancer on which to add a listener.
"""
return pulumi.get(self, "load_balancer_id")
@load_balancer_id.setter
def load_balancer_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "load_balancer_id", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
A friendly name for the listener. It must be unique and it cannot be changed. Avoid entering confidential information. Example: `example_listener`
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="pathRouteSetName")
def path_route_set_name(self) -> Optional[pulumi.Input[str]]:
"""
(Updatable) Deprecated. Please use `routingPolicies` instead.
"""
return pulumi.get(self, "path_route_set_name")
@path_route_set_name.setter
def path_route_set_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path_route_set_name", value)
@property
@pulumi.getter
def port(self) -> Optional[pulumi.Input[int]]:
"""
(Updatable) The communication port for the listener. Example: `80`
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "port", value)
@property
@pulumi.getter
def protocol(self) -> Optional[pulumi.Input[str]]:
"""
(Updatable) The protocol on which the listener accepts connection requests. To get a list of valid protocols, use the [ListProtocols](https://docs.cloud.oracle.com/iaas/api/#/en/loadbalancer/20170115/LoadBalancerProtocol/ListProtocols) operation. Example: `HTTP`
"""
return pulumi.get(self, "protocol")
@protocol.setter
def protocol(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "protocol", value)
@property
@pulumi.getter(name="routingPolicyName")
def routing_policy_name(self) -> Optional[pulumi.Input[str]]:
"""
(Updatable) The name of the routing policy applied to this listener's traffic. Example: `example_routing_policy`
"""
return pulumi.get(self, "routing_policy_name")
@routing_policy_name.setter
def routing_policy_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "routing_policy_name", value)
@property
@pulumi.getter(name="ruleSetNames")
def rule_set_names(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
(Updatable) The names of the [rule sets](https://docs.cloud.oracle.com/iaas/api/#/en/loadbalancer/20170115/RuleSet/) to apply to the listener. Example: `["example_rule_set"]`
"""
return pulumi.get(self, "rule_set_names")
@rule_set_names.setter
def rule_set_names(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "rule_set_names", value)
@property
@pulumi.getter(name="sslConfiguration")
def ssl_configuration(self) -> Optional[pulumi.Input['ListenerSslConfigurationArgs']]:
"""
(Updatable) The load balancer's SSL handling configuration details.
"""
return pulumi.get(self, "ssl_configuration")
@ssl_configuration.setter
def ssl_configuration(self, value: Optional[pulumi.Input['ListenerSslConfigurationArgs']]):
pulumi.set(self, "ssl_configuration", value)
@property
@pulumi.getter
def state(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "state")
@state.setter
def state(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "state", value)
class Listener(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
connection_configuration: Optional[pulumi.Input[pulumi.InputType['ListenerConnectionConfigurationArgs']]] = None,
default_backend_set_name: Optional[pulumi.Input[str]] = None,
hostname_names: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
load_balancer_id: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
path_route_set_name: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
protocol: Optional[pulumi.Input[str]] = None,
routing_policy_name: Optional[pulumi.Input[str]] = None,
rule_set_names: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
ssl_configuration: Optional[pulumi.Input[pulumi.InputType['ListenerSslConfigurationArgs']]] = None,
__props__=None):
"""
This resource provides the Listener resource in the Oracle Cloud Infrastructure Load Balancer service.
Adds a listener to a load balancer.
## Example Usage
```python
import pulumi
import pulumi_oci as oci
test_listener = oci.loadbalancer.Listener("testListener",
default_backend_set_name=oci_load_balancer_backend_set["test_backend_set"]["name"],
load_balancer_id=oci_load_balancer_load_balancer["test_load_balancer"]["id"],
port=var["listener_port"],
protocol=var["listener_protocol"],
connection_configuration=oci.loadbalancer.ListenerConnectionConfigurationArgs(
idle_timeout_in_seconds=var["listener_connection_configuration_idle_timeout_in_seconds"],
backend_tcp_proxy_protocol_version=var["listener_connection_configuration_backend_tcp_proxy_protocol_version"],
),
hostname_names=[oci_load_balancer_hostname["test_hostname"]["name"]],
path_route_set_name=oci_load_balancer_path_route_set["test_path_route_set"]["name"],
routing_policy_name=oci_load_balancer_load_balancer_routing_policy["test_load_balancer_routing_policy"]["name"],
rule_set_names=[oci_load_balancer_rule_set["test_rule_set"]["name"]],
ssl_configuration=oci.loadbalancer.ListenerSslConfigurationArgs(
certificate_name=oci_load_balancer_certificate["test_certificate"]["name"],
cipher_suite_name=var["listener_ssl_configuration_cipher_suite_name"],
protocols=var["listener_ssl_configuration_protocols"],
server_order_preference=var["listener_ssl_configuration_server_order_preference"],
verify_depth=var["listener_ssl_configuration_verify_depth"],
verify_peer_certificate=var["listener_ssl_configuration_verify_peer_certificate"],
))
```
## Import
Listeners can be imported using the `id`, e.g.
```sh
$ pulumi import oci:loadbalancer/listener:Listener test_listener "loadBalancers/{loadBalancerId}/listeners/{listenerName}"
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[pulumi.InputType['ListenerConnectionConfigurationArgs']] connection_configuration: (Updatable) Configuration details for the connection between the client and backend servers.
:param pulumi.Input[str] default_backend_set_name: (Updatable) The name of the associated backend set. Example: `example_backend_set`
:param pulumi.Input[Sequence[pulumi.Input[str]]] hostname_names: (Updatable) An array of hostname resource names.
:param pulumi.Input[str] load_balancer_id: The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the load balancer on which to add a listener.
:param pulumi.Input[str] name: A friendly name for the listener. It must be unique and it cannot be changed. Avoid entering confidential information. Example: `example_listener`
:param pulumi.Input[str] path_route_set_name: (Updatable) Deprecated. Please use `routingPolicies` instead.
:param pulumi.Input[int] port: (Updatable) The communication port for the listener. Example: `80`
:param pulumi.Input[str] protocol: (Updatable) The protocol on which the listener accepts connection requests. To get a list of valid protocols, use the [ListProtocols](https://docs.cloud.oracle.com/iaas/api/#/en/loadbalancer/20170115/LoadBalancerProtocol/ListProtocols) operation. Example: `HTTP`
:param pulumi.Input[str] routing_policy_name: (Updatable) The name of the routing policy applied to this listener's traffic. Example: `example_routing_policy`
:param pulumi.Input[Sequence[pulumi.Input[str]]] rule_set_names: (Updatable) The names of the [rule sets](https://docs.cloud.oracle.com/iaas/api/#/en/loadbalancer/20170115/RuleSet/) to apply to the listener. Example: `["example_rule_set"]`
:param pulumi.Input[pulumi.InputType['ListenerSslConfigurationArgs']] ssl_configuration: (Updatable) The load balancer's SSL handling configuration details.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: ListenerArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
This resource provides the Listener resource in the Oracle Cloud Infrastructure Load Balancer service.
Adds a listener to a load balancer.
## Example Usage
```python
import pulumi
import pulumi_oci as oci
test_listener = oci.loadbalancer.Listener("testListener",
default_backend_set_name=oci_load_balancer_backend_set["test_backend_set"]["name"],
load_balancer_id=oci_load_balancer_load_balancer["test_load_balancer"]["id"],
port=var["listener_port"],
protocol=var["listener_protocol"],
connection_configuration=oci.loadbalancer.ListenerConnectionConfigurationArgs(
idle_timeout_in_seconds=var["listener_connection_configuration_idle_timeout_in_seconds"],
backend_tcp_proxy_protocol_version=var["listener_connection_configuration_backend_tcp_proxy_protocol_version"],
),
hostname_names=[oci_load_balancer_hostname["test_hostname"]["name"]],
path_route_set_name=oci_load_balancer_path_route_set["test_path_route_set"]["name"],
routing_policy_name=oci_load_balancer_load_balancer_routing_policy["test_load_balancer_routing_policy"]["name"],
rule_set_names=[oci_load_balancer_rule_set["test_rule_set"]["name"]],
ssl_configuration=oci.loadbalancer.ListenerSslConfigurationArgs(
certificate_name=oci_load_balancer_certificate["test_certificate"]["name"],
cipher_suite_name=var["listener_ssl_configuration_cipher_suite_name"],
protocols=var["listener_ssl_configuration_protocols"],
server_order_preference=var["listener_ssl_configuration_server_order_preference"],
verify_depth=var["listener_ssl_configuration_verify_depth"],
verify_peer_certificate=var["listener_ssl_configuration_verify_peer_certificate"],
))
```
## Import
Listeners can be imported using the `id`, e.g.
```sh
$ pulumi import oci:loadbalancer/listener:Listener test_listener "loadBalancers/{loadBalancerId}/listeners/{listenerName}"
```
:param str resource_name: The name of the resource.
:param ListenerArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
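As a rough sketch of this args-object overload (field values below are illustrative placeholders, and `ListenerArgs` is assumed to expose the same properties documented above):
```python
import pulumi_oci as oci

# Build the args object once, then pass it positionally as `args`.
listener_args = oci.loadbalancer.ListenerArgs(
    default_backend_set_name="example_backend_set",
    load_balancer_id=oci_load_balancer_load_balancer["test_load_balancer"]["id"],
    port=80,
    protocol="HTTP")
test_listener = oci.loadbalancer.Listener("testListener", listener_args)
```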
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(ListenerArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
connection_configuration: Optional[pulumi.Input[pulumi.InputType['ListenerConnectionConfigurationArgs']]] = None,
default_backend_set_name: Optional[pulumi.Input[str]] = None,
hostname_names: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
load_balancer_id: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
path_route_set_name: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
protocol: Optional[pulumi.Input[str]] = None,
routing_policy_name: Optional[pulumi.Input[str]] = None,
rule_set_names: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
ssl_configuration: Optional[pulumi.Input[pulumi.InputType['ListenerSslConfigurationArgs']]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = ListenerArgs.__new__(ListenerArgs)
__props__.__dict__["connection_configuration"] = connection_configuration
if default_backend_set_name is None and not opts.urn:
raise TypeError("Missing required property 'default_backend_set_name'")
__props__.__dict__["default_backend_set_name"] = default_backend_set_name
__props__.__dict__["hostname_names"] = hostname_names
if load_balancer_id is None and not opts.urn:
raise TypeError("Missing required property 'load_balancer_id'")
__props__.__dict__["load_balancer_id"] = load_balancer_id
__props__.__dict__["name"] = name
__props__.__dict__["path_route_set_name"] = path_route_set_name
if port is None and not opts.urn:
raise TypeError("Missing required property 'port'")
__props__.__dict__["port"] = port
if protocol is None and not opts.urn:
raise TypeError("Missing required property 'protocol'")
__props__.__dict__["protocol"] = protocol
__props__.__dict__["routing_policy_name"] = routing_policy_name
__props__.__dict__["rule_set_names"] = rule_set_names
__props__.__dict__["ssl_configuration"] = ssl_configuration
__props__.__dict__["state"] = None
super(Listener, __self__).__init__(
'oci:loadbalancer/listener:Listener',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
connection_configuration: Optional[pulumi.Input[pulumi.InputType['ListenerConnectionConfigurationArgs']]] = None,
default_backend_set_name: Optional[pulumi.Input[str]] = None,
hostname_names: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
load_balancer_id: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
path_route_set_name: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
protocol: Optional[pulumi.Input[str]] = None,
routing_policy_name: Optional[pulumi.Input[str]] = None,
rule_set_names: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
ssl_configuration: Optional[pulumi.Input[pulumi.InputType['ListenerSslConfigurationArgs']]] = None,
state: Optional[pulumi.Input[str]] = None) -> 'Listener':
"""
Get an existing Listener resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[pulumi.InputType['ListenerConnectionConfigurationArgs']] connection_configuration: (Updatable) Configuration details for the connection between the client and backend servers.
:param pulumi.Input[str] default_backend_set_name: (Updatable) The name of the associated backend set. Example: `example_backend_set`
:param pulumi.Input[Sequence[pulumi.Input[str]]] hostname_names: (Updatable) An array of hostname resource names.
:param pulumi.Input[str] load_balancer_id: The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the load balancer on which to add a listener.
:param pulumi.Input[str] name: A friendly name for the listener. It must be unique and it cannot be changed. Avoid entering confidential information. Example: `example_listener`
:param pulumi.Input[str] path_route_set_name: (Updatable) Deprecated. Please use `routingPolicies` instead.
:param pulumi.Input[int] port: (Updatable) The communication port for the listener. Example: `80`
:param pulumi.Input[str] protocol: (Updatable) The protocol on which the listener accepts connection requests. To get a list of valid protocols, use the [ListProtocols](https://docs.cloud.oracle.com/iaas/api/#/en/loadbalancer/20170115/LoadBalancerProtocol/ListProtocols) operation. Example: `HTTP`
:param pulumi.Input[str] routing_policy_name: (Updatable) The name of the routing policy applied to this listener's traffic. Example: `example_routing_policy`
:param pulumi.Input[Sequence[pulumi.Input[str]]] rule_set_names: (Updatable) The names of the [rule sets](https://docs.cloud.oracle.com/iaas/api/#/en/loadbalancer/20170115/RuleSet/) to apply to the listener. Example: `["example_rule_set"]`
:param pulumi.Input[pulumi.InputType['ListenerSslConfigurationArgs']] ssl_configuration: (Updatable) The load balancer's SSL handling configuration details.
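A minimal lookup sketch (the composite `id` uses the same placeholder segments as the import example; substitute real values):
```python
import pulumi_oci as oci

# Recover an existing listener's state by its composite id.
existing = oci.loadbalancer.Listener.get(
    "existing_listener",
    id="loadBalancers/{loadBalancerId}/listeners/{listenerName}")
```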
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _ListenerState.__new__(_ListenerState)
__props__.__dict__["connection_configuration"] = connection_configuration
__props__.__dict__["default_backend_set_name"] = default_backend_set_name
__props__.__dict__["hostname_names"] = hostname_names
__props__.__dict__["load_balancer_id"] = load_balancer_id
__props__.__dict__["name"] = name
__props__.__dict__["path_route_set_name"] = path_route_set_name
__props__.__dict__["port"] = port
__props__.__dict__["protocol"] = protocol
__props__.__dict__["routing_policy_name"] = routing_policy_name
__props__.__dict__["rule_set_names"] = rule_set_names
__props__.__dict__["ssl_configuration"] = ssl_configuration
__props__.__dict__["state"] = state
return Listener(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="connectionConfiguration")
def connection_configuration(self) -> pulumi.Output['outputs.ListenerConnectionConfiguration']:
"""
(Updatable) Configuration details for the connection between the client and backend servers.
"""
return pulumi.get(self, "connection_configuration")
@property
@pulumi.getter(name="defaultBackendSetName")
def default_backend_set_name(self) -> pulumi.Output[str]:
"""
(Updatable) The name of the associated backend set. Example: `example_backend_set`
"""
return pulumi.get(self, "default_backend_set_name")
@property
@pulumi.getter(name="hostnameNames")
def hostname_names(self) -> pulumi.Output[Sequence[str]]:
"""
(Updatable) An array of hostname resource names.
"""
return pulumi.get(self, "hostname_names")
@property
@pulumi.getter(name="loadBalancerId")
def load_balancer_id(self) -> pulumi.Output[str]:
"""
The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the load balancer on which to add a listener.
"""
return pulumi.get(self, "load_balancer_id")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
A friendly name for the listener. It must be unique and it cannot be changed. Avoid entering confidential information. Example: `example_listener`
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="pathRouteSetName")
def path_route_set_name(self) -> pulumi.Output[str]:
"""
(Updatable) Deprecated. Please use `routingPolicies` instead.
"""
return pulumi.get(self, "path_route_set_name")
@property
@pulumi.getter
def port(self) -> pulumi.Output[int]:
"""
(Updatable) The communication port for the listener. Example: `80`
"""
return pulumi.get(self, "port")
@property
@pulumi.getter
def protocol(self) -> pulumi.Output[str]:
"""
(Updatable) The protocol on which the listener accepts connection requests. To get a list of valid protocols, use the [ListProtocols](https://docs.cloud.oracle.com/iaas/api/#/en/loadbalancer/20170115/LoadBalancerProtocol/ListProtocols) operation. Example: `HTTP`
"""
return pulumi.get(self, "protocol")
@property
@pulumi.getter(name="routingPolicyName")
def routing_policy_name(self) -> pulumi.Output[str]:
"""
(Updatable) The name of the routing policy applied to this listener's traffic. Example: `example_routing_policy`
"""
return pulumi.get(self, "routing_policy_name")
@property
@pulumi.getter(name="ruleSetNames")
def rule_set_names(self) -> pulumi.Output[Sequence[str]]:
"""
(Updatable) The names of the [rule sets](https://docs.cloud.oracle.com/iaas/api/#/en/loadbalancer/20170115/RuleSet/) to apply to the listener. Example: `["example_rule_set"]`
"""
return pulumi.get(self, "rule_set_names")
@property
@pulumi.getter(name="sslConfiguration")
def ssl_configuration(self) -> pulumi.Output[Optional['outputs.ListenerSslConfiguration']]:
"""
(Updatable) The load balancer's SSL handling configuration details.
"""
return pulumi.get(self, "ssl_configuration")
@property
@pulumi.getter
def state(self) -> pulumi.Output[str]:
return pulumi.get(self, "state")
| 53.760388 | 306 | 0.687492 | 4,506 | 38,815 | 5.660897 | 0.05415 | 0.076329 | 0.060373 | 0.042261 | 0.928885 | 0.917516 | 0.902462 | 0.888545 | 0.88384 | 0.869296 | 0 | 0.004128 | 0.207471 | 38,815 | 721 | 307 | 53.834951 | 0.825076 | 0.411387 | 0 | 0.756892 | 1 | 0 | 0.138544 | 0.060001 | 0 | 0 | 0 | 0 | 0 | 1 | 0.162907 | false | 0.002506 | 0.017544 | 0.005013 | 0.278195 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['GroupArgs', 'Group']
@pulumi.input_type
class GroupArgs:
def __init__(__self__, *,
external_member_entity_ids: Optional[pulumi.Input[bool]] = None,
external_policies: Optional[pulumi.Input[bool]] = None,
member_entity_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
member_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
metadata: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
name: Optional[pulumi.Input[str]] = None,
policies: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
type: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a Group resource.
:param pulumi.Input[bool] external_member_entity_ids: `false` by default. If set to `true`, this resource will ignore any Entity IDs returned from Vault or specified in the resource. You can use `identity.GroupMemberEntityIds` to manage Entity IDs for this group in a decoupled manner.
:param pulumi.Input[bool] external_policies: `false` by default. If set to `true`, this resource will ignore any policies returned from Vault or specified in the resource. You can use `identity.GroupPolicies` to manage policies for this group in a decoupled manner.
:param pulumi.Input[Sequence[pulumi.Input[str]]] member_entity_ids: A list of Entity IDs to be assigned as group members. Not allowed on `external` groups.
:param pulumi.Input[Sequence[pulumi.Input[str]]] member_group_ids: A list of Group IDs to be assigned as group members. Not allowed on `external` groups.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] metadata: A Map of additional metadata to associate with the group.
:param pulumi.Input[str] name: Name of the identity group to create.
:param pulumi.Input[Sequence[pulumi.Input[str]]] policies: A list of policies to apply to the group.
:param pulumi.Input[str] type: Type of the group, internal or external. Defaults to `internal`.
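A minimal usage sketch of these arguments via the `Group` resource exported from this module (names, policies, and metadata below are illustrative placeholders):
```python
import pulumi_vault as vault

# Internal group whose members and policies are managed directly on the resource.
group = vault.identity.Group("dev-team",
    name="dev-team",
    type="internal",
    policies=["dev", "default"],
    metadata={"team": "engineering"})
```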
"""
if external_member_entity_ids is not None:
pulumi.set(__self__, "external_member_entity_ids", external_member_entity_ids)
if external_policies is not None:
pulumi.set(__self__, "external_policies", external_policies)
if member_entity_ids is not None:
pulumi.set(__self__, "member_entity_ids", member_entity_ids)
if member_group_ids is not None:
pulumi.set(__self__, "member_group_ids", member_group_ids)
if metadata is not None:
pulumi.set(__self__, "metadata", metadata)
if name is not None:
pulumi.set(__self__, "name", name)
if policies is not None:
pulumi.set(__self__, "policies", policies)
if type is not None:
pulumi.set(__self__, "type", type)
@property
@pulumi.getter(name="externalMemberEntityIds")
def external_member_entity_ids(self) -> Optional[pulumi.Input[bool]]:
"""
`false` by default. If set to `true`, this resource will ignore any Entity IDs returned from Vault or specified in the resource. You can use `identity.GroupMemberEntityIds` to manage Entity IDs for this group in a decoupled manner.
"""
return pulumi.get(self, "external_member_entity_ids")
@external_member_entity_ids.setter
def external_member_entity_ids(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "external_member_entity_ids", value)
@property
@pulumi.getter(name="externalPolicies")
def external_policies(self) -> Optional[pulumi.Input[bool]]:
"""
`false` by default. If set to `true`, this resource will ignore any policies returned from Vault or specified in the resource. You can use `identity.GroupPolicies` to manage policies for this group in a decoupled manner.
"""
return pulumi.get(self, "external_policies")
@external_policies.setter
def external_policies(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "external_policies", value)
@property
@pulumi.getter(name="memberEntityIds")
def member_entity_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
A list of Entity IDs to be assigned as group members. Not allowed on `external` groups.
"""
return pulumi.get(self, "member_entity_ids")
@member_entity_ids.setter
def member_entity_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "member_entity_ids", value)
@property
@pulumi.getter(name="memberGroupIds")
def member_group_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
A list of Group IDs to be assigned as group members. Not allowed on `external` groups.
"""
return pulumi.get(self, "member_group_ids")
@member_group_ids.setter
def member_group_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "member_group_ids", value)
@property
@pulumi.getter
def metadata(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A Map of additional metadata to associate with the group.
"""
return pulumi.get(self, "metadata")
@metadata.setter
def metadata(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "metadata", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Name of the identity group to create.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def policies(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
A list of policies to apply to the group.
"""
return pulumi.get(self, "policies")
@policies.setter
def policies(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "policies", value)
@property
@pulumi.getter
def type(self) -> Optional[pulumi.Input[str]]:
"""
Type of the group, internal or external. Defaults to `internal`.
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "type", value)
@pulumi.input_type
class _GroupState:
def __init__(__self__, *,
external_member_entity_ids: Optional[pulumi.Input[bool]] = None,
external_policies: Optional[pulumi.Input[bool]] = None,
member_entity_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
member_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
metadata: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
name: Optional[pulumi.Input[str]] = None,
policies: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
type: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering Group resources.
:param pulumi.Input[bool] external_member_entity_ids: `false` by default. If set to `true`, this resource will ignore any Entity IDs returned from Vault or specified in the resource. You can use `identity.GroupMemberEntityIds` to manage Entity IDs for this group in a decoupled manner.
:param pulumi.Input[bool] external_policies: `false` by default. If set to `true`, this resource will ignore any policies returned from Vault or specified in the resource. You can use `identity.GroupPolicies` to manage policies for this group in a decoupled manner.
:param pulumi.Input[Sequence[pulumi.Input[str]]] member_entity_ids: A list of Entity IDs to be assigned as group members. Not allowed on `external` groups.
:param pulumi.Input[Sequence[pulumi.Input[str]]] member_group_ids: A list of Group IDs to be assigned as group members. Not allowed on `external` groups.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] metadata: A Map of additional metadata to associate with the group.
:param pulumi.Input[str] name: Name of the identity group to create.
:param pulumi.Input[Sequence[pulumi.Input[str]]] policies: A list of policies to apply to the group.
:param pulumi.Input[str] type: Type of the group, internal or external. Defaults to `internal`.
"""
if external_member_entity_ids is not None:
pulumi.set(__self__, "external_member_entity_ids", external_member_entity_ids)
if external_policies is not None:
pulumi.set(__self__, "external_policies", external_policies)
if member_entity_ids is not None:
pulumi.set(__self__, "member_entity_ids", member_entity_ids)
if member_group_ids is not None:
pulumi.set(__self__, "member_group_ids", member_group_ids)
if metadata is not None:
pulumi.set(__self__, "metadata", metadata)
if name is not None:
pulumi.set(__self__, "name", name)
if policies is not None:
pulumi.set(__self__, "policies", policies)
if type is not None:
pulumi.set(__self__, "type", type)
@property
@pulumi.getter(name="externalMemberEntityIds")
def external_member_entity_ids(self) -> Optional[pulumi.Input[bool]]:
"""
`false` by default. If set to `true`, this resource will ignore any Entity IDs returned from Vault or specified in the resource. You can use `identity.GroupMemberEntityIds` to manage Entity IDs for this group in a decoupled manner.
"""
return pulumi.get(self, "external_member_entity_ids")
@external_member_entity_ids.setter
def external_member_entity_ids(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "external_member_entity_ids", value)
@property
@pulumi.getter(name="externalPolicies")
def external_policies(self) -> Optional[pulumi.Input[bool]]:
"""
`false` by default. If set to `true`, this resource will ignore any policies returned from Vault or specified in the resource. You can use `identity.GroupPolicies` to manage policies for this group in a decoupled manner.
"""
return pulumi.get(self, "external_policies")
@external_policies.setter
def external_policies(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "external_policies", value)
@property
@pulumi.getter(name="memberEntityIds")
def member_entity_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
A list of Entity IDs to be assigned as group members. Not allowed on `external` groups.
"""
return pulumi.get(self, "member_entity_ids")
@member_entity_ids.setter
def member_entity_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "member_entity_ids", value)
@property
@pulumi.getter(name="memberGroupIds")
def member_group_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
A list of Group IDs to be assigned as group members. Not allowed on `external` groups.
"""
return pulumi.get(self, "member_group_ids")
@member_group_ids.setter
def member_group_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "member_group_ids", value)
@property
@pulumi.getter
def metadata(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A map of additional metadata to associate with the group.
"""
return pulumi.get(self, "metadata")
@metadata.setter
def metadata(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "metadata", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Name of the identity group to create.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def policies(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
A list of policies to apply to the group.
"""
return pulumi.get(self, "policies")
@policies.setter
def policies(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "policies", value)
@property
@pulumi.getter
def type(self) -> Optional[pulumi.Input[str]]:
"""
Type of the group, internal or external. Defaults to `internal`.
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "type", value)
class Group(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
external_member_entity_ids: Optional[pulumi.Input[bool]] = None,
external_policies: Optional[pulumi.Input[bool]] = None,
member_entity_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
member_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
metadata: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
name: Optional[pulumi.Input[str]] = None,
policies: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
type: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Creates an Identity Group for Vault. The [Identity secrets engine](https://www.vaultproject.io/docs/secrets/identity/index.html) is the identity management solution for Vault.
A group can contain multiple entities as its members, and can also have subgroups. Policies set on the group are granted to all members of the group. At request time, when the token's entity ID is evaluated for the policies it has access to, policies inherited through group memberships are granted in addition to the policies on the entity itself.
## Example Usage
### Internal Group
```python
import pulumi
import pulumi_vault as vault
internal = vault.identity.Group("internal",
metadata={
"version": "2",
},
policies=[
"dev",
"test",
],
type="internal")
```
### External Group
```python
import pulumi
import pulumi_vault as vault
group = vault.identity.Group("group",
metadata={
"version": "1",
},
policies=["test"],
type="external")
```
## Import
Identity groups can be imported using the `id`, e.g.
```sh
$ pulumi import vault:identity/group:Group test 'fcbf1efb-2b69-4209-bed8-811e3475dad3'
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[bool] external_member_entity_ids: `false` by default. If set to `true`, this resource will ignore any Entity IDs returned from Vault or specified in the resource. You can use `identity.GroupMemberEntityIds` to manage Entity IDs for this group in a decoupled manner.
:param pulumi.Input[bool] external_policies: `false` by default. If set to `true`, this resource will ignore any policies returned from Vault or specified in the resource. You can use `identity.GroupPolicies` to manage policies for this group in a decoupled manner.
:param pulumi.Input[Sequence[pulumi.Input[str]]] member_entity_ids: A list of Entity IDs to be assigned as group members. Not allowed on `external` groups.
:param pulumi.Input[Sequence[pulumi.Input[str]]] member_group_ids: A list of Group IDs to be assigned as group members. Not allowed on `external` groups.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] metadata: A map of additional metadata to associate with the group.
:param pulumi.Input[str] name: Name of the identity group to create.
:param pulumi.Input[Sequence[pulumi.Input[str]]] policies: A list of policies to apply to the group.
:param pulumi.Input[str] type: Type of the group, internal or external. Defaults to `internal`.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: Optional[GroupArgs] = None,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Creates an Identity Group for Vault. The [Identity secrets engine](https://www.vaultproject.io/docs/secrets/identity/index.html) is the identity management solution for Vault.
A group can contain multiple entities as its members, and can also have subgroups. Policies set on the group are granted to all members of the group. At request time, when the token's entity ID is evaluated for the policies it has access to, policies inherited through group memberships are granted in addition to the policies on the entity itself.
## Example Usage
### Internal Group
```python
import pulumi
import pulumi_vault as vault
internal = vault.identity.Group("internal",
metadata={
"version": "2",
},
policies=[
"dev",
"test",
],
type="internal")
```
### External Group
```python
import pulumi
import pulumi_vault as vault
group = vault.identity.Group("group",
metadata={
"version": "1",
},
policies=["test"],
type="external")
```
## Import
Identity groups can be imported using the `id`, e.g.
```sh
$ pulumi import vault:identity/group:Group test 'fcbf1efb-2b69-4209-bed8-811e3475dad3'
```
:param str resource_name: The name of the resource.
:param GroupArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(GroupArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
external_member_entity_ids: Optional[pulumi.Input[bool]] = None,
external_policies: Optional[pulumi.Input[bool]] = None,
member_entity_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
member_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
metadata: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
name: Optional[pulumi.Input[str]] = None,
policies: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
type: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = GroupArgs.__new__(GroupArgs)
__props__.__dict__["external_member_entity_ids"] = external_member_entity_ids
__props__.__dict__["external_policies"] = external_policies
__props__.__dict__["member_entity_ids"] = member_entity_ids
__props__.__dict__["member_group_ids"] = member_group_ids
__props__.__dict__["metadata"] = metadata
__props__.__dict__["name"] = name
__props__.__dict__["policies"] = policies
__props__.__dict__["type"] = type
super(Group, __self__).__init__(
'vault:identity/group:Group',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
external_member_entity_ids: Optional[pulumi.Input[bool]] = None,
external_policies: Optional[pulumi.Input[bool]] = None,
member_entity_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
member_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
metadata: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
name: Optional[pulumi.Input[str]] = None,
policies: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
type: Optional[pulumi.Input[str]] = None) -> 'Group':
"""
Get an existing Group resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
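For example, a minimal sketch of recovering an existing group into a program (the resource name and ID below are hypothetical; this runs only inside a Pulumi program against a live Vault):
```python
import pulumi
import pulumi_vault as vault
# Look up an existing identity group by its provider-assigned ID
group = vault.identity.Group.get("imported-group", "fcbf1efb-2b69-4209-bed8-811e3475dad3")
```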
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[bool] external_member_entity_ids: `false` by default. If set to `true`, this resource will ignore any Entity IDs returned from Vault or specified in the resource. You can use `identity.GroupMemberEntityIds` to manage Entity IDs for this group in a decoupled manner.
:param pulumi.Input[bool] external_policies: `false` by default. If set to `true`, this resource will ignore any policies returned from Vault or specified in the resource. You can use `identity.GroupPolicies` to manage policies for this group in a decoupled manner.
:param pulumi.Input[Sequence[pulumi.Input[str]]] member_entity_ids: A list of Entity IDs to be assigned as group members. Not allowed on `external` groups.
:param pulumi.Input[Sequence[pulumi.Input[str]]] member_group_ids: A list of Group IDs to be assigned as group members. Not allowed on `external` groups.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] metadata: A map of additional metadata to associate with the group.
:param pulumi.Input[str] name: Name of the identity group to create.
:param pulumi.Input[Sequence[pulumi.Input[str]]] policies: A list of policies to apply to the group.
:param pulumi.Input[str] type: Type of the group, internal or external. Defaults to `internal`.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _GroupState.__new__(_GroupState)
__props__.__dict__["external_member_entity_ids"] = external_member_entity_ids
__props__.__dict__["external_policies"] = external_policies
__props__.__dict__["member_entity_ids"] = member_entity_ids
__props__.__dict__["member_group_ids"] = member_group_ids
__props__.__dict__["metadata"] = metadata
__props__.__dict__["name"] = name
__props__.__dict__["policies"] = policies
__props__.__dict__["type"] = type
return Group(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="externalMemberEntityIds")
def external_member_entity_ids(self) -> pulumi.Output[Optional[bool]]:
"""
`false` by default. If set to `true`, this resource will ignore any Entity IDs returned from Vault or specified in the resource. You can use `identity.GroupMemberEntityIds` to manage Entity IDs for this group in a decoupled manner.
"""
return pulumi.get(self, "external_member_entity_ids")
@property
@pulumi.getter(name="externalPolicies")
def external_policies(self) -> pulumi.Output[Optional[bool]]:
"""
`false` by default. If set to `true`, this resource will ignore any policies returned from Vault or specified in the resource. You can use `identity.GroupPolicies` to manage policies for this group in a decoupled manner.
"""
return pulumi.get(self, "external_policies")
@property
@pulumi.getter(name="memberEntityIds")
def member_entity_ids(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
A list of Entity IDs to be assigned as group members. Not allowed on `external` groups.
"""
return pulumi.get(self, "member_entity_ids")
@property
@pulumi.getter(name="memberGroupIds")
def member_group_ids(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
A list of Group IDs to be assigned as group members. Not allowed on `external` groups.
"""
return pulumi.get(self, "member_group_ids")
@property
@pulumi.getter
def metadata(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
"""
A map of additional metadata to associate with the group.
"""
return pulumi.get(self, "metadata")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
Name of the identity group to create.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def policies(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
A list of policies to apply to the group.
"""
return pulumi.get(self, "policies")
@property
@pulumi.getter
def type(self) -> pulumi.Output[Optional[str]]:
"""
Type of the group, internal or external. Defaults to `internal`.
"""
return pulumi.get(self, "type")
# -*- coding: utf8 -*-
# Copyright (c) 2017-2018 THL A29 Limited, a Tencent company. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import warnings
from tencentcloud.common.abstract_model import AbstractModel
class Action(AbstractModel):
"""Rule engine forwarding action
"""
def __init__(self):
"""
:param Topic: Forward to a topic
Note: this field may return null, indicating that no valid value was obtained.
:type Topic: :class:`tencentcloud.iot.v20180123.models.TopicAction`
:param Service: Forward to a third-party service
Note: this field may return null, indicating that no valid value was obtained.
:type Service: :class:`tencentcloud.iot.v20180123.models.ServiceAction`
:param Ckafka: Forward to a third-party Ckafka instance
Note: this field may return null, indicating that no valid value was obtained.
:type Ckafka: :class:`tencentcloud.iot.v20180123.models.CkafkaAction`
"""
self.Topic = None
self.Service = None
self.Ckafka = None
def _deserialize(self, params):
if params.get("Topic") is not None:
self.Topic = TopicAction()
self.Topic._deserialize(params.get("Topic"))
if params.get("Service") is not None:
self.Service = ServiceAction()
self.Service._deserialize(params.get("Service"))
if params.get("Ckafka") is not None:
self.Ckafka = CkafkaAction()
self.Ckafka._deserialize(params.get("Ckafka"))
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class ActivateRuleRequest(AbstractModel):
"""ActivateRule request structure
"""
def __init__(self):
"""
:param RuleId: Rule ID
:type RuleId: str
"""
self.RuleId = None
def _deserialize(self, params):
self.RuleId = params.get("RuleId")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class ActivateRuleResponse(AbstractModel):
"""ActivateRule response structure
"""
def __init__(self):
"""
:param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
:type RequestId: str
"""
self.RequestId = None
def _deserialize(self, params):
self.RequestId = params.get("RequestId")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AddDeviceRequest(AbstractModel):
"""AddDevice request structure
"""
def __init__(self):
"""
:param ProductId: Product ID
:type ProductId: str
:param DeviceName: Device name, which uniquely identifies a device under the product
:type DeviceName: str
"""
self.ProductId = None
self.DeviceName = None
def _deserialize(self, params):
self.ProductId = params.get("ProductId")
self.DeviceName = params.get("DeviceName")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AddDeviceResponse(AbstractModel):
"""AddDevice response structure
"""
def __init__(self):
"""
:param Device: Device information
:type Device: :class:`tencentcloud.iot.v20180123.models.Device`
:param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
:type RequestId: str
"""
self.Device = None
self.RequestId = None
def _deserialize(self, params):
if params.get("Device") is not None:
self.Device = Device()
self.Device._deserialize(params.get("Device"))
self.RequestId = params.get("RequestId")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AddProductRequest(AbstractModel):
"""AddProduct request structure
"""
def __init__(self):
"""
:param Name: Product name. It must be unique within the same region; Chinese characters, letters, hyphens, and underscores are supported; the length cannot exceed 31 characters, and a Chinese character counts as two characters
:type Name: str
:param Description: Product description
:type Description: str
:param DataTemplate: Data template
:type DataTemplate: list of DataTemplate
:param DataProtocol: Product version (native: basic edition; template: advanced edition; defaults to template)
:type DataProtocol: str
:param AuthType: Device authentication method (1: dynamic token; 2: signature-based direct authentication)
:type AuthType: int
:param CommProtocol: Communication method (other/wifi/cellular/nb-iot)
:type CommProtocol: str
:param DeviceType: Device type of the product (device: directly connected device; sub_device: sub-device; gateway: gateway device)
:type DeviceType: str
"""
self.Name = None
self.Description = None
self.DataTemplate = None
self.DataProtocol = None
self.AuthType = None
self.CommProtocol = None
self.DeviceType = None
def _deserialize(self, params):
self.Name = params.get("Name")
self.Description = params.get("Description")
if params.get("DataTemplate") is not None:
self.DataTemplate = []
for item in params.get("DataTemplate"):
obj = DataTemplate()
obj._deserialize(item)
self.DataTemplate.append(obj)
self.DataProtocol = params.get("DataProtocol")
self.AuthType = params.get("AuthType")
self.CommProtocol = params.get("CommProtocol")
self.DeviceType = params.get("DeviceType")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AddProductResponse(AbstractModel):
"""AddProduct response structure
"""
def __init__(self):
"""
:param Product: Product information
:type Product: :class:`tencentcloud.iot.v20180123.models.Product`
:param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
:type RequestId: str
"""
self.Product = None
self.RequestId = None
def _deserialize(self, params):
if params.get("Product") is not None:
self.Product = Product()
self.Product._deserialize(params.get("Product"))
self.RequestId = params.get("RequestId")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AddRuleRequest(AbstractModel):
"""AddRule request structure
"""
def __init__(self):
"""
:param Name: Name
:type Name: str
:param Description: Description
:type Description: str
:param Query: Query
:type Query: :class:`tencentcloud.iot.v20180123.models.RuleQuery`
:param Actions: List of forwarding actions
:type Actions: list of Action
:param DataType: Data type (0: text; 1: binary)
:type DataType: int
"""
self.Name = None
self.Description = None
self.Query = None
self.Actions = None
self.DataType = None
def _deserialize(self, params):
self.Name = params.get("Name")
self.Description = params.get("Description")
if params.get("Query") is not None:
self.Query = RuleQuery()
self.Query._deserialize(params.get("Query"))
if params.get("Actions") is not None:
self.Actions = []
for item in params.get("Actions"):
obj = Action()
obj._deserialize(item)
self.Actions.append(obj)
self.DataType = params.get("DataType")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AddRuleResponse(AbstractModel):
"""AddRule response structure
"""
def __init__(self):
"""
:param Rule: Rule
:type Rule: :class:`tencentcloud.iot.v20180123.models.Rule`
:param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
:type RequestId: str
"""
self.Rule = None
self.RequestId = None
def _deserialize(self, params):
if params.get("Rule") is not None:
self.Rule = Rule()
self.Rule._deserialize(params.get("Rule"))
self.RequestId = params.get("RequestId")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AddTopicRequest(AbstractModel):
"""AddTopic request structure
"""
def __init__(self):
"""
:param ProductId: Product ID
:type ProductId: str
:param TopicName: Topic name
:type TopicName: str
"""
self.ProductId = None
self.TopicName = None
def _deserialize(self, params):
self.ProductId = params.get("ProductId")
self.TopicName = params.get("TopicName")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AddTopicResponse(AbstractModel):
"""AddTopic response structure
"""
def __init__(self):
"""
:param Topic: Topic information
:type Topic: :class:`tencentcloud.iot.v20180123.models.Topic`
:param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
:type RequestId: str
"""
self.Topic = None
self.RequestId = None
def _deserialize(self, params):
if params.get("Topic") is not None:
self.Topic = Topic()
self.Topic._deserialize(params.get("Topic"))
self.RequestId = params.get("RequestId")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AppAddUserRequest(AbstractModel):
"""AppAddUser request structure
"""
def __init__(self):
"""
:param UserName: Username
:type UserName: str
:param Password: Password
:type Password: str
"""
self.UserName = None
self.Password = None
def _deserialize(self, params):
self.UserName = params.get("UserName")
self.Password = params.get("Password")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AppAddUserResponse(AbstractModel):
"""AppAddUser response structure
"""
def __init__(self):
"""
:param AppUser: Application user
:type AppUser: :class:`tencentcloud.iot.v20180123.models.AppUser`
:param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
:type RequestId: str
"""
self.AppUser = None
self.RequestId = None
def _deserialize(self, params):
if params.get("AppUser") is not None:
self.AppUser = AppUser()
self.AppUser._deserialize(params.get("AppUser"))
self.RequestId = params.get("RequestId")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AppDeleteDeviceRequest(AbstractModel):
"""AppDeleteDevice request structure
"""
def __init__(self):
"""
:param AccessToken: Access token
:type AccessToken: str
:param ProductId: Product ID
:type ProductId: str
:param DeviceName: Device name
:type DeviceName: str
"""
self.AccessToken = None
self.ProductId = None
self.DeviceName = None
def _deserialize(self, params):
self.AccessToken = params.get("AccessToken")
self.ProductId = params.get("ProductId")
self.DeviceName = params.get("DeviceName")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AppDeleteDeviceResponse(AbstractModel):
"""AppDeleteDevice response structure
"""
def __init__(self):
"""
:param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
:type RequestId: str
"""
self.RequestId = None
def _deserialize(self, params):
self.RequestId = params.get("RequestId")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AppDevice(AbstractModel):
"""Bound device
"""
def __init__(self):
"""
:param DeviceId: Device ID
:type DeviceId: str
:param ProductId: ID of the product the device belongs to
:type ProductId: str
:param DeviceName: Device name
:type DeviceName: str
:param AliasName: Alias
:type AliasName: str
:param Region: Region
:type Region: str
:param CreateTime: Creation time
:type CreateTime: str
:param UpdateTime: Update time
:type UpdateTime: str
"""
self.DeviceId = None
self.ProductId = None
self.DeviceName = None
self.AliasName = None
self.Region = None
self.CreateTime = None
self.UpdateTime = None
def _deserialize(self, params):
self.DeviceId = params.get("DeviceId")
self.ProductId = params.get("ProductId")
self.DeviceName = params.get("DeviceName")
self.AliasName = params.get("AliasName")
self.Region = params.get("Region")
self.CreateTime = params.get("CreateTime")
self.UpdateTime = params.get("UpdateTime")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AppDeviceDetail(AbstractModel):
"""Bound device details
"""
def __init__(self):
"""
:param DeviceId: Device ID
:type DeviceId: str
:param ProductId: ID of the product the device belongs to
:type ProductId: str
:param DeviceName: Device name
:type DeviceName: str
:param AliasName: Alias
:type AliasName: str
:param Region: Region
:type Region: str
:param CreateTime: Creation time
:type CreateTime: str
:param UpdateTime: Update time
:type UpdateTime: str
:param DeviceInfo: Device information (JSON)
:type DeviceInfo: str
:param DataTemplate: Data template
:type DataTemplate: list of DataTemplate
"""
self.DeviceId = None
self.ProductId = None
self.DeviceName = None
self.AliasName = None
self.Region = None
self.CreateTime = None
self.UpdateTime = None
self.DeviceInfo = None
self.DataTemplate = None
def _deserialize(self, params):
self.DeviceId = params.get("DeviceId")
self.ProductId = params.get("ProductId")
self.DeviceName = params.get("DeviceName")
self.AliasName = params.get("AliasName")
self.Region = params.get("Region")
self.CreateTime = params.get("CreateTime")
self.UpdateTime = params.get("UpdateTime")
self.DeviceInfo = params.get("DeviceInfo")
if params.get("DataTemplate") is not None:
self.DataTemplate = []
for item in params.get("DataTemplate"):
obj = DataTemplate()
obj._deserialize(item)
self.DataTemplate.append(obj)
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AppGetDeviceDataRequest(AbstractModel):
"""AppGetDeviceData request structure
"""
def __init__(self):
"""
:param AccessToken: Access token
:type AccessToken: str
:param ProductId: Product ID
:type ProductId: str
:param DeviceName: Device name
:type DeviceName: str
"""
self.AccessToken = None
self.ProductId = None
self.DeviceName = None
def _deserialize(self, params):
self.AccessToken = params.get("AccessToken")
self.ProductId = params.get("ProductId")
self.DeviceName = params.get("DeviceName")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AppGetDeviceDataResponse(AbstractModel):
"""AppGetDeviceData response structure
"""
def __init__(self):
"""
:param DeviceData: Device data.
:type DeviceData: str
:param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
:type RequestId: str
"""
self.DeviceData = None
self.RequestId = None
def _deserialize(self, params):
self.DeviceData = params.get("DeviceData")
self.RequestId = params.get("RequestId")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AppGetDeviceRequest(AbstractModel):
"""AppGetDevice request structure
"""
def __init__(self):
"""
:param AccessToken: Access token
:type AccessToken: str
:param ProductId: Product ID
:type ProductId: str
:param DeviceName: Device name
:type DeviceName: str
"""
self.AccessToken = None
self.ProductId = None
self.DeviceName = None
def _deserialize(self, params):
self.AccessToken = params.get("AccessToken")
self.ProductId = params.get("ProductId")
self.DeviceName = params.get("DeviceName")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fields are useless." % ",".join(memeber_set), Warning)
class AppGetDeviceResponse(AbstractModel):
"""AppGetDevice response structure
"""
def __init__(self):
"""
:param AppDevice: Bound device details
:type AppDevice: :class:`tencentcloud.iot.v20180123.models.AppDeviceDetail`
:param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
:type RequestId: str
"""
self.AppDevice = None
self.RequestId = None
def _deserialize(self, params):
if params.get("AppDevice") is not None:
self.AppDevice = AppDeviceDetail()
self.AppDevice._deserialize(params.get("AppDevice"))
self.RequestId = params.get("RequestId")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fileds are useless." % ",".join(memeber_set), Warning)
class AppGetDeviceStatusesRequest(AbstractModel):
"""AppGetDeviceStatuses请求参数结构体
"""
def __init__(self):
"""
:param AccessToken: 访问Token
:type AccessToken: str
:param DeviceIds: 设备Id列表(单次限制1000个设备)
:type DeviceIds: list of str
"""
self.AccessToken = None
self.DeviceIds = None
def _deserialize(self, params):
self.AccessToken = params.get("AccessToken")
self.DeviceIds = params.get("DeviceIds")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fileds are useless." % ",".join(memeber_set), Warning)
class AppGetDeviceStatusesResponse(AbstractModel):
"""AppGetDeviceStatuses返回参数结构体
"""
def __init__(self):
"""
:param DeviceStatuses: 设备状态
:type DeviceStatuses: list of DeviceStatus
:param RequestId: 唯一请求 ID,每次请求都会返回。定位问题时需要提供该次请求的 RequestId。
:type RequestId: str
"""
self.DeviceStatuses = None
self.RequestId = None
def _deserialize(self, params):
if params.get("DeviceStatuses") is not None:
self.DeviceStatuses = []
for item in params.get("DeviceStatuses"):
obj = DeviceStatus()
obj._deserialize(item)
self.DeviceStatuses.append(obj)
self.RequestId = params.get("RequestId")
memeber_set = set(params.keys())
for name, value in vars(self).items():
if name in memeber_set:
memeber_set.remove(name)
if len(memeber_set) > 0:
warnings.warn("%s fileds are useless." % ",".join(memeber_set), Warning)


class AppGetDevicesRequest(AbstractModel):
    """AppGetDevices request structure.
    """

    def __init__(self):
        """
        :param AccessToken: Access token.
        :type AccessToken: str
        """
        self.AccessToken = None

    def _deserialize(self, params):
        self.AccessToken = params.get("AccessToken")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AppGetDevicesResponse(AbstractModel):
    """AppGetDevices response structure.
    """

    def __init__(self):
        """
        :param Devices: List of bound devices.
        :type Devices: list of AppDevice
        :param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
        :type RequestId: str
        """
        self.Devices = None
        self.RequestId = None

    def _deserialize(self, params):
        if params.get("Devices") is not None:
            self.Devices = []
            for item in params.get("Devices"):
                obj = AppDevice()
                obj._deserialize(item)
                self.Devices.append(obj)
        self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AppGetTokenRequest(AbstractModel):
    """AppGetToken request structure.
    """

    def __init__(self):
        """
        :param UserName: Username.
        :type UserName: str
        :param Password: Password.
        :type Password: str
        :param Expire: TTL.
        :type Expire: int
        """
        self.UserName = None
        self.Password = None
        self.Expire = None

    def _deserialize(self, params):
        self.UserName = params.get("UserName")
        self.Password = params.get("Password")
        self.Expire = params.get("Expire")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AppGetTokenResponse(AbstractModel):
    """AppGetToken response structure.
    """

    def __init__(self):
        """
        :param AccessToken: Access token.
        :type AccessToken: str
        :param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
        :type RequestId: str
        """
        self.AccessToken = None
        self.RequestId = None

    def _deserialize(self, params):
        self.AccessToken = params.get("AccessToken")
        self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AppGetUserRequest(AbstractModel):
    """AppGetUser request structure.
    """

    def __init__(self):
        """
        :param AccessToken: Access token.
        :type AccessToken: str
        """
        self.AccessToken = None

    def _deserialize(self, params):
        self.AccessToken = params.get("AccessToken")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AppGetUserResponse(AbstractModel):
    """AppGetUser response structure.
    """

    def __init__(self):
        """
        :param AppUser: User information.
        :type AppUser: :class:`tencentcloud.iot.v20180123.models.AppUser`
        :param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
        :type RequestId: str
        """
        self.AppUser = None
        self.RequestId = None

    def _deserialize(self, params):
        if params.get("AppUser") is not None:
            self.AppUser = AppUser()
            self.AppUser._deserialize(params.get("AppUser"))
        self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AppIssueDeviceControlRequest(AbstractModel):
    """AppIssueDeviceControl request structure.
    """

    def __init__(self):
        """
        :param AccessToken: Access token.
        :type AccessToken: str
        :param ProductId: Product ID.
        :type ProductId: str
        :param DeviceName: Device name.
        :type DeviceName: str
        :param ControlData: Control data (JSON).
        :type ControlData: str
        :param Metadata: Whether to send the metadata field.
        :type Metadata: bool
        """
        self.AccessToken = None
        self.ProductId = None
        self.DeviceName = None
        self.ControlData = None
        self.Metadata = None

    def _deserialize(self, params):
        self.AccessToken = params.get("AccessToken")
        self.ProductId = params.get("ProductId")
        self.DeviceName = params.get("DeviceName")
        self.ControlData = params.get("ControlData")
        self.Metadata = params.get("Metadata")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AppIssueDeviceControlResponse(AbstractModel):
    """AppIssueDeviceControl response structure.
    """

    def __init__(self):
        """
        :param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
        :type RequestId: str
        """
        self.RequestId = None

    def _deserialize(self, params):
        self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AppResetPasswordRequest(AbstractModel):
    """AppResetPassword request structure.
    """

    def __init__(self):
        """
        :param AccessToken: Access token.
        :type AccessToken: str
        :param OldPassword: Old password.
        :type OldPassword: str
        :param NewPassword: New password.
        :type NewPassword: str
        """
        self.AccessToken = None
        self.OldPassword = None
        self.NewPassword = None

    def _deserialize(self, params):
        self.AccessToken = params.get("AccessToken")
        self.OldPassword = params.get("OldPassword")
        self.NewPassword = params.get("NewPassword")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AppResetPasswordResponse(AbstractModel):
    """AppResetPassword response structure.
    """

    def __init__(self):
        """
        :param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
        :type RequestId: str
        """
        self.RequestId = None

    def _deserialize(self, params):
        self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AppSecureAddDeviceRequest(AbstractModel):
    """AppSecureAddDevice request structure.
    """

    def __init__(self):
        """
        :param AccessToken: Access token.
        :type AccessToken: str
        :param DeviceSignature: Device signature.
        :type DeviceSignature: str
        """
        self.AccessToken = None
        self.DeviceSignature = None

    def _deserialize(self, params):
        self.AccessToken = params.get("AccessToken")
        self.DeviceSignature = params.get("DeviceSignature")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AppSecureAddDeviceResponse(AbstractModel):
    """AppSecureAddDevice response structure.
    """

    def __init__(self):
        """
        :param AppDevice: Information of the bound device.
        :type AppDevice: :class:`tencentcloud.iot.v20180123.models.AppDevice`
        :param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
        :type RequestId: str
        """
        self.AppDevice = None
        self.RequestId = None

    def _deserialize(self, params):
        if params.get("AppDevice") is not None:
            self.AppDevice = AppDevice()
            self.AppDevice._deserialize(params.get("AppDevice"))
        self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AppUpdateDeviceRequest(AbstractModel):
    """AppUpdateDevice request structure.
    """

    def __init__(self):
        """
        :param AccessToken: Access token.
        :type AccessToken: str
        :param ProductId: Product ID.
        :type ProductId: str
        :param DeviceName: Device name.
        :type DeviceName: str
        :param AliasName: Device alias.
        :type AliasName: str
        """
        self.AccessToken = None
        self.ProductId = None
        self.DeviceName = None
        self.AliasName = None

    def _deserialize(self, params):
        self.AccessToken = params.get("AccessToken")
        self.ProductId = params.get("ProductId")
        self.DeviceName = params.get("DeviceName")
        self.AliasName = params.get("AliasName")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AppUpdateDeviceResponse(AbstractModel):
    """AppUpdateDevice response structure.
    """

    def __init__(self):
        """
        :param AppDevice: Device information.
        :type AppDevice: :class:`tencentcloud.iot.v20180123.models.AppDevice`
        :param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
        :type RequestId: str
        """
        self.AppDevice = None
        self.RequestId = None

    def _deserialize(self, params):
        if params.get("AppDevice") is not None:
            self.AppDevice = AppDevice()
            self.AppDevice._deserialize(params.get("AppDevice"))
        self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AppUpdateUserRequest(AbstractModel):
    """AppUpdateUser request structure.
    """

    def __init__(self):
        """
        :param AccessToken: Access token.
        :type AccessToken: str
        :param NickName: Nickname.
        :type NickName: str
        """
        self.AccessToken = None
        self.NickName = None

    def _deserialize(self, params):
        self.AccessToken = params.get("AccessToken")
        self.NickName = params.get("NickName")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AppUpdateUserResponse(AbstractModel):
    """AppUpdateUser response structure.
    """

    def __init__(self):
        """
        :param AppUser: Application user.
        :type AppUser: :class:`tencentcloud.iot.v20180123.models.AppUser`
        :param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
        :type RequestId: str
        """
        self.AppUser = None
        self.RequestId = None

    def _deserialize(self, params):
        if params.get("AppUser") is not None:
            self.AppUser = AppUser()
            self.AppUser._deserialize(params.get("AppUser"))
        self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AppUser(AbstractModel):
    """Application user.
    """

    def __init__(self):
        """
        :param ApplicationId: Application ID.
        :type ApplicationId: str
        :param UserName: Username.
        :type UserName: str
        :param NickName: Nickname.
        :type NickName: str
        :param CreateTime: Creation time.
        :type CreateTime: str
        :param UpdateTime: Last updated time.
        :type UpdateTime: str
        """
        self.ApplicationId = None
        self.UserName = None
        self.NickName = None
        self.CreateTime = None
        self.UpdateTime = None

    def _deserialize(self, params):
        self.ApplicationId = params.get("ApplicationId")
        self.UserName = params.get("UserName")
        self.NickName = params.get("NickName")
        self.CreateTime = params.get("CreateTime")
        self.UpdateTime = params.get("UpdateTime")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AssociateSubDeviceToGatewayProductRequest(AbstractModel):
    """AssociateSubDeviceToGatewayProduct request structure.
    """

    def __init__(self):
        """
        :param SubDeviceProductId: Sub-device product ID.
        :type SubDeviceProductId: str
        :param GatewayProductId: Gateway product ID.
        :type GatewayProductId: str
        """
        self.SubDeviceProductId = None
        self.GatewayProductId = None

    def _deserialize(self, params):
        self.SubDeviceProductId = params.get("SubDeviceProductId")
        self.GatewayProductId = params.get("GatewayProductId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class AssociateSubDeviceToGatewayProductResponse(AbstractModel):
    """AssociateSubDeviceToGatewayProduct response structure.
    """

    def __init__(self):
        """
        :param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
        :type RequestId: str
        """
        self.RequestId = None

    def _deserialize(self, params):
        self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class BoolData(AbstractModel):
    """Boolean data.
    """

    def __init__(self):
        """
        :param Name: Name.
        :type Name: str
        :param Desc: Description.
        :type Desc: str
        :param Mode: Read/write mode.
        :type Mode: str
        :param Range: Value list.
        :type Range: list of bool
        """
        self.Name = None
        self.Desc = None
        self.Mode = None
        self.Range = None

    def _deserialize(self, params):
        self.Name = params.get("Name")
        self.Desc = params.get("Desc")
        self.Mode = params.get("Mode")
        self.Range = params.get("Range")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class CkafkaAction(AbstractModel):
    """Forward to Ckafka.
    """

    def __init__(self):
        """
        :param InstanceId: Instance ID.
        :type InstanceId: str
        :param TopicName: Topic name.
        :type TopicName: str
        :param Region: Region.
        :type Region: str
        """
        self.InstanceId = None
        self.TopicName = None
        self.Region = None

    def _deserialize(self, params):
        self.InstanceId = params.get("InstanceId")
        self.TopicName = params.get("TopicName")
        self.Region = params.get("Region")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class DataHistoryEntry(AbstractModel):
    """Data history entry.
    """

    def __init__(self):
        """
        :param Id: Log ID.
        :type Id: str
        :param Timestamp: Timestamp.
        :type Timestamp: int
        :param DeviceName: Device name.
        :type DeviceName: str
        :param Data: Data.
        :type Data: str
        """
        self.Id = None
        self.Timestamp = None
        self.DeviceName = None
        self.Data = None

    def _deserialize(self, params):
        self.Id = params.get("Id")
        self.Timestamp = params.get("Timestamp")
        self.DeviceName = params.get("DeviceName")
        self.Data = params.get("Data")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class DataTemplate(AbstractModel):
    """Data template.
    """

    def __init__(self):
        """
        :param Number: Number type.
        Note: this field may return null, indicating that no valid value was obtained.
        :type Number: :class:`tencentcloud.iot.v20180123.models.NumberData`
        :param String: String type.
        Note: this field may return null, indicating that no valid value was obtained.
        :type String: :class:`tencentcloud.iot.v20180123.models.StringData`
        :param Enum: Enum type.
        Note: this field may return null, indicating that no valid value was obtained.
        :type Enum: :class:`tencentcloud.iot.v20180123.models.EnumData`
        :param Bool: Boolean type.
        Note: this field may return null, indicating that no valid value was obtained.
        :type Bool: :class:`tencentcloud.iot.v20180123.models.BoolData`
        """
        self.Number = None
        self.String = None
        self.Enum = None
        self.Bool = None

    def _deserialize(self, params):
        if params.get("Number") is not None:
            self.Number = NumberData()
            self.Number._deserialize(params.get("Number"))
        if params.get("String") is not None:
            self.String = StringData()
            self.String._deserialize(params.get("String"))
        if params.get("Enum") is not None:
            self.Enum = EnumData()
            self.Enum._deserialize(params.get("Enum"))
        if params.get("Bool") is not None:
            self.Bool = BoolData()
            self.Bool._deserialize(params.get("Bool"))
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class DeactivateRuleRequest(AbstractModel):
    """DeactivateRule request structure.
    """

    def __init__(self):
        """
        :param RuleId: Rule ID.
        :type RuleId: str
        """
        self.RuleId = None

    def _deserialize(self, params):
        self.RuleId = params.get("RuleId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class DeactivateRuleResponse(AbstractModel):
    """DeactivateRule response structure.
    """

    def __init__(self):
        """
        :param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
        :type RequestId: str
        """
        self.RequestId = None

    def _deserialize(self, params):
        self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)
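# Container models like DataTemplate hydrate a typed child model only when the
# corresponding key is present in `params`; absent keys leave the attribute as
# None. A minimal stand-alone sketch of that dispatch (FieldStub/TemplateStub
# are hypothetical stand-ins, not the SDK's NumberData/DataTemplate):

```python
class FieldStub(object):
    """Hypothetical stand-in for a generated leaf model (e.g. NumberData)."""

    def __init__(self):
        self.Name = None

    def _deserialize(self, params):
        self.Name = params.get("Name")


class TemplateStub(object):
    """Hypothetical stand-in for a container model (e.g. DataTemplate)."""

    def __init__(self):
        self.Number = None
        self.Bool = None

    def _deserialize(self, params):
        # Only instantiate the child when its key is present; a null/absent
        # field in the API response stays None on the Python side.
        if params.get("Number") is not None:
            self.Number = FieldStub()
            self.Number._deserialize(params.get("Number"))
        if params.get("Bool") is not None:
            self.Bool = FieldStub()
            self.Bool._deserialize(params.get("Bool"))
```

This mirrors the "此字段可能返回 null" (field may return null) notes above: callers should check the attribute for None before using it.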


class DebugLogEntry(AbstractModel):
    """Device debug log entry.
    """

    def __init__(self):
        """
        :param Id: Log ID.
        :type Id: str
        :param Event: Action (event).
        :type Event: str
        :param LogType: shadow/action/mqtt, corresponding to shadow, rule engine, and connect/disconnect logs respectively.
        :type LogType: str
        :param Timestamp: Timestamp.
        :type Timestamp: int
        :param Result: success/fail.
        :type Result: str
        :param Data: Log details.
        :type Data: str
        :param Topic: Source topic of the data.
        :type Topic: str
        :param DeviceName: Device name.
        :type DeviceName: str
        """
        self.Id = None
        self.Event = None
        self.LogType = None
        self.Timestamp = None
        self.Result = None
        self.Data = None
        self.Topic = None
        self.DeviceName = None

    def _deserialize(self, params):
        self.Id = params.get("Id")
        self.Event = params.get("Event")
        self.LogType = params.get("LogType")
        self.Timestamp = params.get("Timestamp")
        self.Result = params.get("Result")
        self.Data = params.get("Data")
        self.Topic = params.get("Topic")
        self.DeviceName = params.get("DeviceName")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class DeleteDeviceRequest(AbstractModel):
    """DeleteDevice request structure.
    """

    def __init__(self):
        """
        :param ProductId: Product ID.
        :type ProductId: str
        :param DeviceName: Device name.
        :type DeviceName: str
        """
        self.ProductId = None
        self.DeviceName = None

    def _deserialize(self, params):
        self.ProductId = params.get("ProductId")
        self.DeviceName = params.get("DeviceName")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class DeleteDeviceResponse(AbstractModel):
    """DeleteDevice response structure.
    """

    def __init__(self):
        """
        :param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
        :type RequestId: str
        """
        self.RequestId = None

    def _deserialize(self, params):
        self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class DeleteProductRequest(AbstractModel):
    """DeleteProduct request structure.
    """

    def __init__(self):
        """
        :param ProductId: Product ID.
        :type ProductId: str
        """
        self.ProductId = None

    def _deserialize(self, params):
        self.ProductId = params.get("ProductId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class DeleteProductResponse(AbstractModel):
    """DeleteProduct response structure.
    """

    def __init__(self):
        """
        :param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
        :type RequestId: str
        """
        self.RequestId = None

    def _deserialize(self, params):
        self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class DeleteRuleRequest(AbstractModel):
    """DeleteRule request structure.
    """

    def __init__(self):
        """
        :param RuleId: Rule ID.
        :type RuleId: str
        """
        self.RuleId = None

    def _deserialize(self, params):
        self.RuleId = params.get("RuleId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class DeleteRuleResponse(AbstractModel):
    """DeleteRule response structure.
    """

    def __init__(self):
        """
        :param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
        :type RequestId: str
        """
        self.RequestId = None

    def _deserialize(self, params):
        self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class DeleteTopicRequest(AbstractModel):
    """DeleteTopic request structure.
    """

    def __init__(self):
        """
        :param TopicId: Topic ID.
        :type TopicId: str
        :param ProductId: Product ID.
        :type ProductId: str
        """
        self.TopicId = None
        self.ProductId = None

    def _deserialize(self, params):
        self.TopicId = params.get("TopicId")
        self.ProductId = params.get("ProductId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class DeleteTopicResponse(AbstractModel):
    """DeleteTopic response structure.
    """

    def __init__(self):
        """
        :param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
        :type RequestId: str
        """
        self.RequestId = None

    def _deserialize(self, params):
        self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class Device(AbstractModel):
    """Device.
    """

    def __init__(self):
        """
        :param ProductId: Product ID.
        :type ProductId: str
        :param DeviceName: Device name.
        :type DeviceName: str
        :param DeviceSecret: Device secret.
        :type DeviceSecret: str
        :param UpdateTime: Update time.
        :type UpdateTime: str
        :param CreateTime: Creation time.
        :type CreateTime: str
        :param DeviceInfo: Device information (JSON).
        :type DeviceInfo: str
        """
        self.ProductId = None
        self.DeviceName = None
        self.DeviceSecret = None
        self.UpdateTime = None
        self.CreateTime = None
        self.DeviceInfo = None

    def _deserialize(self, params):
        self.ProductId = params.get("ProductId")
        self.DeviceName = params.get("DeviceName")
        self.DeviceSecret = params.get("DeviceSecret")
        self.UpdateTime = params.get("UpdateTime")
        self.CreateTime = params.get("CreateTime")
        self.DeviceInfo = params.get("DeviceInfo")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class DeviceEntry(AbstractModel):
    """Device entry.
    """

    def __init__(self):
        """
        :param ProductId: Product ID.
        :type ProductId: str
        :param DeviceName: Device name.
        :type DeviceName: str
        :param DeviceSecret: Device secret.
        :type DeviceSecret: str
        :param CreateTime: Creation time.
        :type CreateTime: str
        """
        self.ProductId = None
        self.DeviceName = None
        self.DeviceSecret = None
        self.CreateTime = None

    def _deserialize(self, params):
        self.ProductId = params.get("ProductId")
        self.DeviceName = params.get("DeviceName")
        self.DeviceSecret = params.get("DeviceSecret")
        self.CreateTime = params.get("CreateTime")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class DeviceLogEntry(AbstractModel):
    """Device log entry.
    """

    def __init__(self):
        """
        :param Id: Log ID.
        :type Id: str
        :param Msg: Log content.
        :type Msg: str
        :param Code: Status code.
        :type Code: str
        :param Timestamp: Timestamp.
        :type Timestamp: int
        :param DeviceName: Device name.
        :type DeviceName: str
        :param Method: Device action.
        :type Method: str
        """
        self.Id = None
        self.Msg = None
        self.Code = None
        self.Timestamp = None
        self.DeviceName = None
        self.Method = None

    def _deserialize(self, params):
        self.Id = params.get("Id")
        self.Msg = params.get("Msg")
        self.Code = params.get("Code")
        self.Timestamp = params.get("Timestamp")
        self.DeviceName = params.get("DeviceName")
        self.Method = params.get("Method")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class DeviceSignature(AbstractModel):
    """Device signature.
    """

    def __init__(self):
        """
        :param DeviceName: Device name.
        :type DeviceName: str
        :param DeviceSignature: Device signature.
        :type DeviceSignature: str
        """
        self.DeviceName = None
        self.DeviceSignature = None

    def _deserialize(self, params):
        self.DeviceName = params.get("DeviceName")
        self.DeviceSignature = params.get("DeviceSignature")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class DeviceStatData(AbstractModel):
    """Device statistics.
    """

    def __init__(self):
        """
        :param Datetime: Time point.
        :type Datetime: str
        :param DeviceOnline: Number of online devices.
        :type DeviceOnline: int
        :param DeviceActive: Number of activated devices.
        :type DeviceActive: int
        :param DeviceTotal: Total number of devices.
        :type DeviceTotal: int
        """
        self.Datetime = None
        self.DeviceOnline = None
        self.DeviceActive = None
        self.DeviceTotal = None

    def _deserialize(self, params):
        self.Datetime = params.get("Datetime")
        self.DeviceOnline = params.get("DeviceOnline")
        self.DeviceActive = params.get("DeviceActive")
        self.DeviceTotal = params.get("DeviceTotal")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class DeviceStatus(AbstractModel):
    """Device status.
    """

    def __init__(self):
        """
        :param DeviceName: Device name.
        :type DeviceName: str
        :param Status: Device status (inactive, online, offline).
        :type Status: str
        :param FirstOnline: First online time.
        Note: this field may return null, indicating that no valid value was obtained.
        :type FirstOnline: str
        :param LastOnline: Last online time.
        Note: this field may return null, indicating that no valid value was obtained.
        :type LastOnline: str
        :param OnlineTimes: Number of times the device has come online.
        :type OnlineTimes: int
        """
        self.DeviceName = None
        self.Status = None
        self.FirstOnline = None
        self.LastOnline = None
        self.OnlineTimes = None

    def _deserialize(self, params):
        self.DeviceName = params.get("DeviceName")
        self.Status = params.get("Status")
        self.FirstOnline = params.get("FirstOnline")
        self.LastOnline = params.get("LastOnline")
        self.OnlineTimes = params.get("OnlineTimes")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class EnumData(AbstractModel):
    """Enum data.
    """

    def __init__(self):
        """
        :param Name: Name.
        :type Name: str
        :param Desc: Description.
        :type Desc: str
        :param Mode: Read/write mode.
        :type Mode: str
        :param Range: Value list.
        :type Range: list of str
        """
        self.Name = None
        self.Desc = None
        self.Mode = None
        self.Range = None

    def _deserialize(self, params):
        self.Name = params.get("Name")
        self.Desc = params.get("Desc")
        self.Mode = params.get("Mode")
        self.Range = params.get("Range")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetDataHistoryRequest(AbstractModel):
    """GetDataHistory request structure.
    """

    def __init__(self):
        """
        :param ProductId: Product ID.
        :type ProductId: str
        :param DeviceNames: List of device names (up to 100 devices per request).
        :type DeviceNames: list of str
        :param StartTime: Query start time.
        :type StartTime: str
        :param EndTime: Query end time.
        :type EndTime: str
        :param Size: Number of entries to query.
        :type Size: int
        :param Order: Time order (desc/asc).
        :type Order: str
        :param ScrollId: Query cursor.
        :type ScrollId: str
        """
        self.ProductId = None
        self.DeviceNames = None
        self.StartTime = None
        self.EndTime = None
        self.Size = None
        self.Order = None
        self.ScrollId = None

    def _deserialize(self, params):
        self.ProductId = params.get("ProductId")
        self.DeviceNames = params.get("DeviceNames")
        self.StartTime = params.get("StartTime")
        self.EndTime = params.get("EndTime")
        self.Size = params.get("Size")
        self.Order = params.get("Order")
        self.ScrollId = params.get("ScrollId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetDataHistoryResponse(AbstractModel):
    """GetDataHistory response structure.
    """

    def __init__(self):
        """
        :param DataHistory: Data history.
        :type DataHistory: list of DataHistoryEntry
        :param ScrollId: Query cursor.
        :type ScrollId: str
        :param ScrollTimeout: Query cursor timeout.
        :type ScrollTimeout: int
        :param RequestId: The unique request ID, which is returned for each request. RequestId is required for locating a problem.
        :type RequestId: str
        """
        self.DataHistory = None
        self.ScrollId = None
        self.ScrollTimeout = None
        self.RequestId = None

    def _deserialize(self, params):
        if params.get("DataHistory") is not None:
            self.DataHistory = []
            for item in params.get("DataHistory"):
                obj = DataHistoryEntry()
                obj._deserialize(item)
                self.DataHistory.append(obj)
        self.ScrollId = params.get("ScrollId")
        self.ScrollTimeout = params.get("ScrollTimeout")
        self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)
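# The ScrollId request/response pair above implies scroll-cursor pagination:
# feed the cursor from each response into the next request until a page comes
# back empty (the cursor also expires after ScrollTimeout). A minimal sketch
# of that loop; `fetch_page` is a hypothetical stand-in for an API call such
# as client.GetDataHistory, taking a cursor and returning a dict shaped like
# the response model:

```python
def collect_history(fetch_page):
    """Drain a scroll cursor, returning all DataHistory entries in order."""
    scroll_id = None
    entries = []
    while True:
        page = fetch_page(scroll_id)
        if not page.get("DataHistory"):
            break  # an empty page means the scroll is exhausted
        entries.extend(page["DataHistory"])
        scroll_id = page["ScrollId"]  # pass the cursor to the next request
    return entries
```

This is an assumed usage pattern inferred from the field names, not documented SDK behavior; consult the service's API reference for the authoritative pagination contract.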
class GetDebugLogRequest(AbstractModel):
"""GetDebugLog请求参数结构体
"""
def __init__(self):
"""
:param ProductId: 产品Id
:type ProductId: str
:param DeviceNames: 设备名称列表,最大支持100台
:type DeviceNames: list of str
:param StartTime: 查询开始时间
:type StartTime: str
:param EndTime: 查询结束时间
:type EndTime: str
:param Size: 查询数据量
:type Size: int
:param Order: 时间排序(desc/asc)
:type Order: str
:param ScrollId: 查询游标
:type ScrollId: str
:param Type: 日志类型(shadow/action/mqtt)
:type Type: str
"""
self.ProductId = None
self.DeviceNames = None
self.StartTime = None
self.EndTime = None
self.Size = None
self.Order = None
self.ScrollId = None
self.Type = None
def _deserialize(self, params):
self.ProductId = params.get("ProductId")
self.DeviceNames = params.get("DeviceNames")
self.StartTime = params.get("StartTime")
self.EndTime = params.get("EndTime")
self.Size = params.get("Size")
self.Order = params.get("Order")
self.ScrollId = params.get("ScrollId")
self.Type = params.get("Type")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetDebugLogResponse(AbstractModel):
    """GetDebugLog response structure.
    """

    def __init__(self):
        """
        :param DebugLog: Debug log
        :type DebugLog: list of DebugLogEntry
        :param ScrollId: Scroll cursor
        :type ScrollId: str
        :param ScrollTimeout: Scroll cursor timeout
        :type ScrollTimeout: int
        :param RequestId: Unique request ID returned with every request. Provide this RequestId when reporting an issue.
        :type RequestId: str
        """
self.DebugLog = None
self.ScrollId = None
self.ScrollTimeout = None
self.RequestId = None
def _deserialize(self, params):
if params.get("DebugLog") is not None:
self.DebugLog = []
for item in params.get("DebugLog"):
obj = DebugLogEntry()
obj._deserialize(item)
self.DebugLog.append(obj)
self.ScrollId = params.get("ScrollId")
self.ScrollTimeout = params.get("ScrollTimeout")
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetDeviceDataRequest(AbstractModel):
    """GetDeviceData request structure.
    """

    def __init__(self):
        """
        :param ProductId: Product ID
        :type ProductId: str
        :param DeviceName: Device name
        :type DeviceName: str
        """
self.ProductId = None
self.DeviceName = None
def _deserialize(self, params):
self.ProductId = params.get("ProductId")
self.DeviceName = params.get("DeviceName")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetDeviceDataResponse(AbstractModel):
    """GetDeviceData response structure.
    """

    def __init__(self):
        """
        :param DeviceData: Device data
        :type DeviceData: str
        :param RequestId: Unique request ID returned with every request. Provide this RequestId when reporting an issue.
        :type RequestId: str
        """
self.DeviceData = None
self.RequestId = None
def _deserialize(self, params):
self.DeviceData = params.get("DeviceData")
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetDeviceLogRequest(AbstractModel):
    """GetDeviceLog request structure.
    """

    def __init__(self):
        """
        :param ProductId: Product ID
        :type ProductId: str
        :param DeviceNames: List of device names (up to 100 devices)
        :type DeviceNames: list of str
        :param StartTime: Query start time
        :type StartTime: str
        :param EndTime: Query end time
        :type EndTime: str
        :param Size: Number of entries to query
        :type Size: int
        :param Order: Sort order by time (desc/asc)
        :type Order: str
        :param ScrollId: Scroll cursor
        :type ScrollId: str
        :param Type: Log type (comm/status)
        :type Type: str
        """
self.ProductId = None
self.DeviceNames = None
self.StartTime = None
self.EndTime = None
self.Size = None
self.Order = None
self.ScrollId = None
self.Type = None
def _deserialize(self, params):
self.ProductId = params.get("ProductId")
self.DeviceNames = params.get("DeviceNames")
self.StartTime = params.get("StartTime")
self.EndTime = params.get("EndTime")
self.Size = params.get("Size")
self.Order = params.get("Order")
self.ScrollId = params.get("ScrollId")
self.Type = params.get("Type")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetDeviceLogResponse(AbstractModel):
    """GetDeviceLog response structure.
    """

    def __init__(self):
        """
        :param DeviceLog: Device log
        :type DeviceLog: list of DeviceLogEntry
        :param ScrollId: Scroll cursor
        :type ScrollId: str
        :param ScrollTimeout: Scroll cursor timeout
        :type ScrollTimeout: int
        :param RequestId: Unique request ID returned with every request. Provide this RequestId when reporting an issue.
        :type RequestId: str
        """
self.DeviceLog = None
self.ScrollId = None
self.ScrollTimeout = None
self.RequestId = None
def _deserialize(self, params):
if params.get("DeviceLog") is not None:
self.DeviceLog = []
for item in params.get("DeviceLog"):
obj = DeviceLogEntry()
obj._deserialize(item)
self.DeviceLog.append(obj)
self.ScrollId = params.get("ScrollId")
self.ScrollTimeout = params.get("ScrollTimeout")
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetDeviceRequest(AbstractModel):
    """GetDevice request structure.
    """

    def __init__(self):
        """
        :param ProductId: Product ID
        :type ProductId: str
        :param DeviceName: Device name
        :type DeviceName: str
        """
self.ProductId = None
self.DeviceName = None
def _deserialize(self, params):
self.ProductId = params.get("ProductId")
self.DeviceName = params.get("DeviceName")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetDeviceResponse(AbstractModel):
    """GetDevice response structure.
    """

    def __init__(self):
        """
        :param Device: Device information
        :type Device: :class:`tencentcloud.iot.v20180123.models.Device`
        :param RequestId: Unique request ID returned with every request. Provide this RequestId when reporting an issue.
        :type RequestId: str
        """
self.Device = None
self.RequestId = None
def _deserialize(self, params):
if params.get("Device") is not None:
self.Device = Device()
self.Device._deserialize(params.get("Device"))
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetDeviceSignaturesRequest(AbstractModel):
    """GetDeviceSignatures request structure.
    """

    def __init__(self):
        """
        :param ProductId: Product ID
        :type ProductId: str
        :param DeviceNames: List of device names (up to 1,000 devices per request)
        :type DeviceNames: list of str
        :param Expire: Expiration time
        :type Expire: int
        """
self.ProductId = None
self.DeviceNames = None
self.Expire = None
def _deserialize(self, params):
self.ProductId = params.get("ProductId")
self.DeviceNames = params.get("DeviceNames")
self.Expire = params.get("Expire")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetDeviceSignaturesResponse(AbstractModel):
    """GetDeviceSignatures response structure.
    """

    def __init__(self):
        """
        :param DeviceSignatures: List of device binding signatures
        :type DeviceSignatures: list of DeviceSignature
        :param RequestId: Unique request ID returned with every request. Provide this RequestId when reporting an issue.
        :type RequestId: str
        """
self.DeviceSignatures = None
self.RequestId = None
def _deserialize(self, params):
if params.get("DeviceSignatures") is not None:
self.DeviceSignatures = []
for item in params.get("DeviceSignatures"):
obj = DeviceSignature()
obj._deserialize(item)
self.DeviceSignatures.append(obj)
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetDeviceStatisticsRequest(AbstractModel):
    """GetDeviceStatistics request structure.
    """

    def __init__(self):
        """
        :param Products: List of product IDs
        :type Products: list of str
        :param StartDate: Start date
        :type StartDate: str
        :param EndDate: End date
        :type EndDate: str
        """
self.Products = None
self.StartDate = None
self.EndDate = None
def _deserialize(self, params):
self.Products = params.get("Products")
self.StartDate = params.get("StartDate")
self.EndDate = params.get("EndDate")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetDeviceStatisticsResponse(AbstractModel):
    """GetDeviceStatistics response structure.
    """

    def __init__(self):
        """
        :param DeviceStatistics: Statistics data
        :type DeviceStatistics: list of DeviceStatData
        :param RequestId: Unique request ID returned with every request. Provide this RequestId when reporting an issue.
        :type RequestId: str
        """
self.DeviceStatistics = None
self.RequestId = None
def _deserialize(self, params):
if params.get("DeviceStatistics") is not None:
self.DeviceStatistics = []
for item in params.get("DeviceStatistics"):
obj = DeviceStatData()
obj._deserialize(item)
self.DeviceStatistics.append(obj)
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetDeviceStatusesRequest(AbstractModel):
    """GetDeviceStatuses request structure.
    """

    def __init__(self):
        """
        :param ProductId: Product ID
        :type ProductId: str
        :param DeviceNames: List of device names (up to 1,000 devices per request)
        :type DeviceNames: list of str
        """
self.ProductId = None
self.DeviceNames = None
def _deserialize(self, params):
self.ProductId = params.get("ProductId")
self.DeviceNames = params.get("DeviceNames")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetDeviceStatusesResponse(AbstractModel):
    """GetDeviceStatuses response structure.
    """

    def __init__(self):
        """
        :param DeviceStatuses: List of device statuses
        :type DeviceStatuses: list of DeviceStatus
        :param RequestId: Unique request ID returned with every request. Provide this RequestId when reporting an issue.
        :type RequestId: str
        """
self.DeviceStatuses = None
self.RequestId = None
def _deserialize(self, params):
if params.get("DeviceStatuses") is not None:
self.DeviceStatuses = []
for item in params.get("DeviceStatuses"):
obj = DeviceStatus()
obj._deserialize(item)
self.DeviceStatuses.append(obj)
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetDevicesRequest(AbstractModel):
    """GetDevices request structure.
    """

    def __init__(self):
        """
        :param ProductId: Product ID
        :type ProductId: str
        :param Offset: Offset
        :type Offset: int
        :param Length: Number of entries to return
        :type Length: int
        :param Keyword: Keyword search
        :type Keyword: str
        """
self.ProductId = None
self.Offset = None
self.Length = None
self.Keyword = None
def _deserialize(self, params):
self.ProductId = params.get("ProductId")
self.Offset = params.get("Offset")
self.Length = params.get("Length")
self.Keyword = params.get("Keyword")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetDevicesResponse(AbstractModel):
    """GetDevices response structure.
    """

    def __init__(self):
        """
        :param Devices: Device list
        :type Devices: list of DeviceEntry
        :param Total: Total number of devices
        :type Total: int
        :param RequestId: Unique request ID returned with every request. Provide this RequestId when reporting an issue.
        :type RequestId: str
        """
self.Devices = None
self.Total = None
self.RequestId = None
def _deserialize(self, params):
if params.get("Devices") is not None:
self.Devices = []
for item in params.get("Devices"):
obj = DeviceEntry()
obj._deserialize(item)
self.Devices.append(obj)
self.Total = params.get("Total")
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetProductRequest(AbstractModel):
    """GetProduct request structure.
    """

    def __init__(self):
        """
        :param ProductId: Product ID
        :type ProductId: str
        """
self.ProductId = None
def _deserialize(self, params):
self.ProductId = params.get("ProductId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetProductResponse(AbstractModel):
    """GetProduct response structure.
    """

    def __init__(self):
        """
        :param Product: Product information
        :type Product: :class:`tencentcloud.iot.v20180123.models.Product`
        :param RequestId: Unique request ID returned with every request. Provide this RequestId when reporting an issue.
        :type RequestId: str
        """
self.Product = None
self.RequestId = None
def _deserialize(self, params):
if params.get("Product") is not None:
self.Product = Product()
self.Product._deserialize(params.get("Product"))
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetProductsRequest(AbstractModel):
    """GetProducts request structure.
    """

    def __init__(self):
        """
        :param Offset: Offset
        :type Offset: int
        :param Length: Number of entries to return
        :type Length: int
        """
self.Offset = None
self.Length = None
def _deserialize(self, params):
self.Offset = params.get("Offset")
self.Length = params.get("Length")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetProductsResponse(AbstractModel):
    """GetProducts response structure.
    """

    def __init__(self):
        """
        :param Products: Product list
        :type Products: list of ProductEntry
        :param Total: Total number of products
        :type Total: int
        :param RequestId: Unique request ID returned with every request. Provide this RequestId when reporting an issue.
        :type RequestId: str
        """
self.Products = None
self.Total = None
self.RequestId = None
def _deserialize(self, params):
if params.get("Products") is not None:
self.Products = []
for item in params.get("Products"):
obj = ProductEntry()
obj._deserialize(item)
self.Products.append(obj)
self.Total = params.get("Total")
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetRuleRequest(AbstractModel):
    """GetRule request structure.
    """

    def __init__(self):
        """
        :param RuleId: Rule ID
        :type RuleId: str
        """
self.RuleId = None
def _deserialize(self, params):
self.RuleId = params.get("RuleId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetRuleResponse(AbstractModel):
    """GetRule response structure.
    """

    def __init__(self):
        """
        :param Rule: Rule
        :type Rule: :class:`tencentcloud.iot.v20180123.models.Rule`
        :param RequestId: Unique request ID returned with every request. Provide this RequestId when reporting an issue.
        :type RequestId: str
        """
self.Rule = None
self.RequestId = None
def _deserialize(self, params):
if params.get("Rule") is not None:
self.Rule = Rule()
self.Rule._deserialize(params.get("Rule"))
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetRulesRequest(AbstractModel):
    """GetRules request structure.
    """

    def __init__(self):
        """
        :param Offset: Offset
        :type Offset: int
        :param Length: Number of entries to return
        :type Length: int
        """
self.Offset = None
self.Length = None
def _deserialize(self, params):
self.Offset = params.get("Offset")
self.Length = params.get("Length")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetRulesResponse(AbstractModel):
    """GetRules response structure.
    """

    def __init__(self):
        """
        :param Rules: Rule list
        :type Rules: list of Rule
        :param Total: Total number of rules
        :type Total: int
        :param RequestId: Unique request ID returned with every request. Provide this RequestId when reporting an issue.
        :type RequestId: str
        """
self.Rules = None
self.Total = None
self.RequestId = None
def _deserialize(self, params):
if params.get("Rules") is not None:
self.Rules = []
for item in params.get("Rules"):
obj = Rule()
obj._deserialize(item)
self.Rules.append(obj)
self.Total = params.get("Total")
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetTopicRequest(AbstractModel):
    """GetTopic request structure.
    """

    def __init__(self):
        """
        :param TopicId: Topic ID
        :type TopicId: str
        :param ProductId: Product ID
        :type ProductId: str
        """
self.TopicId = None
self.ProductId = None
def _deserialize(self, params):
self.TopicId = params.get("TopicId")
self.ProductId = params.get("ProductId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetTopicResponse(AbstractModel):
    """GetTopic response structure.
    """

    def __init__(self):
        """
        :param Topic: Topic information
        :type Topic: :class:`tencentcloud.iot.v20180123.models.Topic`
        :param RequestId: Unique request ID returned with every request. Provide this RequestId when reporting an issue.
        :type RequestId: str
        """
self.Topic = None
self.RequestId = None
def _deserialize(self, params):
if params.get("Topic") is not None:
self.Topic = Topic()
self.Topic._deserialize(params.get("Topic"))
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetTopicsRequest(AbstractModel):
    """GetTopics request structure.
    """

    def __init__(self):
        """
        :param ProductId: Product ID
        :type ProductId: str
        :param Offset: Offset
        :type Offset: int
        :param Length: Number of entries to return
        :type Length: int
        """
self.ProductId = None
self.Offset = None
self.Length = None
def _deserialize(self, params):
self.ProductId = params.get("ProductId")
self.Offset = params.get("Offset")
self.Length = params.get("Length")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class GetTopicsResponse(AbstractModel):
    """GetTopics response structure.
    """

    def __init__(self):
        """
        :param Topics: Topic list
        :type Topics: list of Topic
        :param Total: Total number of topics
        :type Total: int
        :param RequestId: Unique request ID returned with every request. Provide this RequestId when reporting an issue.
        :type RequestId: str
        """
self.Topics = None
self.Total = None
self.RequestId = None
def _deserialize(self, params):
if params.get("Topics") is not None:
self.Topics = []
for item in params.get("Topics"):
obj = Topic()
obj._deserialize(item)
self.Topics.append(obj)
self.Total = params.get("Total")
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class IssueDeviceControlRequest(AbstractModel):
    """IssueDeviceControl request structure.
    """

    def __init__(self):
        """
        :param ProductId: Product ID
        :type ProductId: str
        :param DeviceName: Device name
        :type DeviceName: str
        :param ControlData: Control data (JSON)
        :type ControlData: str
        :param Metadata: Whether to send the metadata field
        :type Metadata: bool
        """
self.ProductId = None
self.DeviceName = None
self.ControlData = None
self.Metadata = None
def _deserialize(self, params):
self.ProductId = params.get("ProductId")
self.DeviceName = params.get("DeviceName")
self.ControlData = params.get("ControlData")
self.Metadata = params.get("Metadata")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class IssueDeviceControlResponse(AbstractModel):
    """IssueDeviceControl response structure.
    """

    def __init__(self):
        """
        :param RequestId: Unique request ID returned with every request. Provide this RequestId when reporting an issue.
        :type RequestId: str
        """
self.RequestId = None
def _deserialize(self, params):
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class NumberData(AbstractModel):
    """Number-type data.
    """

    def __init__(self):
        """
        :param Name: Name
        :type Name: str
        :param Desc: Description
        :type Desc: str
        :param Mode: Read/write mode
        :type Mode: str
        :param Range: Value range
        :type Range: list of float
        """
self.Name = None
self.Desc = None
self.Mode = None
self.Range = None
def _deserialize(self, params):
self.Name = params.get("Name")
self.Desc = params.get("Desc")
self.Mode = params.get("Mode")
self.Range = params.get("Range")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class Product(AbstractModel):
    """Product.
    """

    def __init__(self):
        """
        :param ProductId: Product ID
        :type ProductId: str
        :param ProductKey: Product key
        :type ProductKey: str
        :param AppId: AppId
        :type AppId: int
        :param Name: Product name
        :type Name: str
        :param Description: Product description
        :type Description: str
        :param Domain: Connection domain name
        :type Domain: str
        :param Standard: Product specification
        :type Standard: int
        :param AuthType: Authentication type (0: direct connection, 1: token)
        :type AuthType: int
        :param Deleted: Deleted flag (0: not deleted)
        :type Deleted: int
        :param Message: Remarks
        :type Message: str
        :param CreateTime: Creation time
        :type CreateTime: str
        :param UpdateTime: Update time
        :type UpdateTime: str
        :param DataTemplate: Data template
        :type DataTemplate: list of DataTemplate
        :param DataProtocol: Data protocol (native/template)
        :type DataProtocol: str
        :param Username: Direct-connection username
        :type Username: str
        :param Password: Direct-connection password
        :type Password: str
        :param CommProtocol: Communication protocol
        :type CommProtocol: str
        :param Qps: QPS
        :type Qps: int
        :param Region: Region
        :type Region: str
        :param DeviceType: Device type of the product
        :type DeviceType: str
        :param AssociatedProducts: List of associated products
        :type AssociatedProducts: list of str
        """
self.ProductId = None
self.ProductKey = None
self.AppId = None
self.Name = None
self.Description = None
self.Domain = None
self.Standard = None
self.AuthType = None
self.Deleted = None
self.Message = None
self.CreateTime = None
self.UpdateTime = None
self.DataTemplate = None
self.DataProtocol = None
self.Username = None
self.Password = None
self.CommProtocol = None
self.Qps = None
self.Region = None
self.DeviceType = None
self.AssociatedProducts = None
def _deserialize(self, params):
self.ProductId = params.get("ProductId")
self.ProductKey = params.get("ProductKey")
self.AppId = params.get("AppId")
self.Name = params.get("Name")
self.Description = params.get("Description")
self.Domain = params.get("Domain")
self.Standard = params.get("Standard")
self.AuthType = params.get("AuthType")
self.Deleted = params.get("Deleted")
self.Message = params.get("Message")
self.CreateTime = params.get("CreateTime")
self.UpdateTime = params.get("UpdateTime")
if params.get("DataTemplate") is not None:
self.DataTemplate = []
for item in params.get("DataTemplate"):
obj = DataTemplate()
obj._deserialize(item)
self.DataTemplate.append(obj)
self.DataProtocol = params.get("DataProtocol")
self.Username = params.get("Username")
self.Password = params.get("Password")
self.CommProtocol = params.get("CommProtocol")
self.Qps = params.get("Qps")
self.Region = params.get("Region")
self.DeviceType = params.get("DeviceType")
self.AssociatedProducts = params.get("AssociatedProducts")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)
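The `Product._deserialize` above shows the nested-list pattern used throughout these models: when a field such as `DataTemplate` holds a list of child models, each dict in the list is deserialized into its own model instance. A minimal, self-contained sketch of that pattern (the `ParentSketch`/`ChildSketch` names are illustrative, not part of the SDK):

```python
class ChildSketch(object):
    def __init__(self):
        self.Name = None

    def _deserialize(self, params):
        self.Name = params.get("Name")


class ParentSketch(object):
    def __init__(self):
        self.Items = None

    def _deserialize(self, params):
        # Same shape as Product._deserialize handling DataTemplate:
        # build one child model per dict in the incoming list.
        if params.get("Items") is not None:
            self.Items = []
            for item in params.get("Items"):
                obj = ChildSketch()
                obj._deserialize(item)
                self.Items.append(obj)


p = ParentSketch()
p._deserialize({"Items": [{"Name": "a"}, {"Name": "b"}]})
print([i.Name for i in p.Items])  # ['a', 'b']
```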


class ProductEntry(AbstractModel):
    """Product entry.
    """

    def __init__(self):
        """
        :param ProductId: Product ID
        :type ProductId: str
        :param ProductKey: Product key
        :type ProductKey: str
        :param AppId: AppId
        :type AppId: int
        :param Name: Product name
        :type Name: str
        :param Description: Product description
        :type Description: str
        :param Domain: Connection domain name
        :type Domain: str
        :param AuthType: Authentication type (0: direct connection, 1: token)
        :type AuthType: int
        :param DataProtocol: Data protocol (native/template)
        :type DataProtocol: str
        :param Deleted: Deleted flag (0: not deleted)
        :type Deleted: int
        :param Message: Remarks
        :type Message: str
        :param CreateTime: Creation time
        :type CreateTime: str
        :param CommProtocol: Communication protocol
        :type CommProtocol: str
        :param Region: Region
        :type Region: str
        :param DeviceType: Device type
        :type DeviceType: str
        """
self.ProductId = None
self.ProductKey = None
self.AppId = None
self.Name = None
self.Description = None
self.Domain = None
self.AuthType = None
self.DataProtocol = None
self.Deleted = None
self.Message = None
self.CreateTime = None
self.CommProtocol = None
self.Region = None
self.DeviceType = None
def _deserialize(self, params):
self.ProductId = params.get("ProductId")
self.ProductKey = params.get("ProductKey")
self.AppId = params.get("AppId")
self.Name = params.get("Name")
self.Description = params.get("Description")
self.Domain = params.get("Domain")
self.AuthType = params.get("AuthType")
self.DataProtocol = params.get("DataProtocol")
self.Deleted = params.get("Deleted")
self.Message = params.get("Message")
self.CreateTime = params.get("CreateTime")
self.CommProtocol = params.get("CommProtocol")
self.Region = params.get("Region")
self.DeviceType = params.get("DeviceType")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class PublishMsgRequest(AbstractModel):
    """PublishMsg request structure.
    """

    def __init__(self):
        """
        :param Topic: Topic
        :type Topic: str
        :param Message: Message content
        :type Message: str
        :param Qos: QoS (currently QoS 0 and 1 are supported)
        :type Qos: int
        """
self.Topic = None
self.Message = None
self.Qos = None
def _deserialize(self, params):
self.Topic = params.get("Topic")
self.Message = params.get("Message")
self.Qos = params.get("Qos")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class PublishMsgResponse(AbstractModel):
    """PublishMsg response structure.
    """

    def __init__(self):
        """
        :param RequestId: Unique request ID returned with every request. Provide this RequestId when reporting an issue.
        :type RequestId: str
        """
self.RequestId = None
def _deserialize(self, params):
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class ResetDeviceRequest(AbstractModel):
    """ResetDevice request structure.
    """

    def __init__(self):
        """
        :param ProductId: Product ID
        :type ProductId: str
        :param DeviceName: Device name
        :type DeviceName: str
        """
self.ProductId = None
self.DeviceName = None
def _deserialize(self, params):
self.ProductId = params.get("ProductId")
self.DeviceName = params.get("DeviceName")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class ResetDeviceResponse(AbstractModel):
    """ResetDevice response structure.
    """

    def __init__(self):
        """
        :param Device: Device information
        :type Device: :class:`tencentcloud.iot.v20180123.models.Device`
        :param RequestId: Unique request ID returned with every request. Provide this RequestId when reporting an issue.
        :type RequestId: str
        """
self.Device = None
self.RequestId = None
def _deserialize(self, params):
if params.get("Device") is not None:
self.Device = Device()
self.Device._deserialize(params.get("Device"))
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class Rule(AbstractModel):
    """Rule.
    """

    def __init__(self):
        """
        :param RuleId: Rule ID
        :type RuleId: str
        :param AppId: AppId
        :type AppId: int
        :param Name: Name
        :type Name: str
        :param Description: Description
        :type Description: str
        :param Query: Query
        :type Query: :class:`tencentcloud.iot.v20180123.models.RuleQuery`
        :param Actions: Forwarding actions
        :type Actions: list of Action
        :param Active: Active flag
        :type Active: int
        :param Deleted: Deleted flag
        :type Deleted: int
        :param CreateTime: Creation time
        :type CreateTime: str
        :param UpdateTime: Update time
        :type UpdateTime: str
        :param MsgOrder: Message ordering
        :type MsgOrder: int
        :param DataType: Data type (0: text, 1: binary)
        :type DataType: int
        """
self.RuleId = None
self.AppId = None
self.Name = None
self.Description = None
self.Query = None
self.Actions = None
self.Active = None
self.Deleted = None
self.CreateTime = None
self.UpdateTime = None
self.MsgOrder = None
self.DataType = None
def _deserialize(self, params):
self.RuleId = params.get("RuleId")
self.AppId = params.get("AppId")
self.Name = params.get("Name")
self.Description = params.get("Description")
if params.get("Query") is not None:
self.Query = RuleQuery()
self.Query._deserialize(params.get("Query"))
if params.get("Actions") is not None:
self.Actions = []
for item in params.get("Actions"):
obj = Action()
obj._deserialize(item)
self.Actions.append(obj)
self.Active = params.get("Active")
self.Deleted = params.get("Deleted")
self.CreateTime = params.get("CreateTime")
self.UpdateTime = params.get("UpdateTime")
self.MsgOrder = params.get("MsgOrder")
self.DataType = params.get("DataType")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class RuleQuery(AbstractModel):
    """Query.
    """

    def __init__(self):
        """
        :param Field: Field
        :type Field: str
        :param Condition: Filter condition
        :type Condition: str
        :param Topic: Topic
Note: this field may return null, indicating that no valid value was obtained.
        :type Topic: str
        :param ProductId: Product ID
Note: this field may return null, indicating that no valid value was obtained.
        :type ProductId: str
        """
self.Field = None
self.Condition = None
self.Topic = None
self.ProductId = None
def _deserialize(self, params):
self.Field = params.get("Field")
self.Condition = params.get("Condition")
self.Topic = params.get("Topic")
self.ProductId = params.get("ProductId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class ServiceAction(AbstractModel):
    """Forward to a third-party HTTP(S) service.
    """

    def __init__(self):
        """
        :param Url: Service URL
        :type Url: str
        """
self.Url = None
def _deserialize(self, params):
self.Url = params.get("Url")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)


class StringData(AbstractModel):
    """String-type data.
    """

    def __init__(self):
        """
        :param Name: Name
        :type Name: str
        :param Desc: Description
        :type Desc: str
        :param Mode: Read/write mode
        :type Mode: str
        :param Range: Length range
        :type Range: list of int non-negative
        """
self.Name = None
self.Desc = None
self.Mode = None
self.Range = None
def _deserialize(self, params):
self.Name = params.get("Name")
self.Desc = params.get("Desc")
self.Mode = params.get("Mode")
self.Range = params.get("Range")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)
class Topic(AbstractModel):
"""Topic
"""
def __init__(self):
"""
        :param TopicId: Topic ID
        :type TopicId: str
        :param TopicName: Topic name
        :type TopicName: str
        :param ProductId: Product ID
        :type ProductId: str
        :param MsgLife: Maximum message lifetime
        :type MsgLife: int
        :param MsgSize: Maximum message size
        :type MsgSize: int
        :param MsgCount: Maximum number of messages
        :type MsgCount: int
        :param Deleted: Whether the topic has been deleted
        :type Deleted: int
        :param Path: Full topic path
        :type Path: str
        :param CreateTime: Creation time
        :type CreateTime: str
        :param UpdateTime: Update time
        :type UpdateTime: str
"""
self.TopicId = None
self.TopicName = None
self.ProductId = None
self.MsgLife = None
self.MsgSize = None
self.MsgCount = None
self.Deleted = None
self.Path = None
self.CreateTime = None
self.UpdateTime = None
def _deserialize(self, params):
self.TopicId = params.get("TopicId")
self.TopicName = params.get("TopicName")
self.ProductId = params.get("ProductId")
self.MsgLife = params.get("MsgLife")
self.MsgSize = params.get("MsgSize")
self.MsgCount = params.get("MsgCount")
self.Deleted = params.get("Deleted")
self.Path = params.get("Path")
self.CreateTime = params.get("CreateTime")
self.UpdateTime = params.get("UpdateTime")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)
class TopicAction(AbstractModel):
    """Action that forwards to a topic
    """
def __init__(self):
"""
        :param Topic: Target topic
:type Topic: str
"""
self.Topic = None
def _deserialize(self, params):
self.Topic = params.get("Topic")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)
class UnassociateSubDeviceFromGatewayProductRequest(AbstractModel):
    """UnassociateSubDeviceFromGatewayProduct request structure
    """
def __init__(self):
"""
        :param SubDeviceProductId: Sub-device product ID
        :type SubDeviceProductId: str
        :param GatewayProductId: Gateway product ID
        :type GatewayProductId: str
"""
self.SubDeviceProductId = None
self.GatewayProductId = None
def _deserialize(self, params):
self.SubDeviceProductId = params.get("SubDeviceProductId")
self.GatewayProductId = params.get("GatewayProductId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)
class UnassociateSubDeviceFromGatewayProductResponse(AbstractModel):
    """UnassociateSubDeviceFromGatewayProduct response structure
    """
def __init__(self):
"""
        :param RequestId: Unique request ID, returned with every request. The RequestId of the request is required to locate a problem.
:type RequestId: str
"""
self.RequestId = None
def _deserialize(self, params):
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)
class UpdateProductRequest(AbstractModel):
    """UpdateProduct request structure
    """
def __init__(self):
"""
        :param ProductId: Product ID
        :type ProductId: str
        :param Name: Product name
        :type Name: str
        :param Description: Product description
        :type Description: str
        :param DataTemplate: Data template
:type DataTemplate: list of DataTemplate
"""
self.ProductId = None
self.Name = None
self.Description = None
self.DataTemplate = None
def _deserialize(self, params):
self.ProductId = params.get("ProductId")
self.Name = params.get("Name")
self.Description = params.get("Description")
if params.get("DataTemplate") is not None:
self.DataTemplate = []
for item in params.get("DataTemplate"):
obj = DataTemplate()
obj._deserialize(item)
self.DataTemplate.append(obj)
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)
class UpdateProductResponse(AbstractModel):
    """UpdateProduct response structure
    """
def __init__(self):
"""
        :param Product: Updated product information
:type Product: :class:`tencentcloud.iot.v20180123.models.Product`
        :param RequestId: Unique request ID, returned with every request. The RequestId of the request is required to locate a problem.
:type RequestId: str
"""
self.Product = None
self.RequestId = None
def _deserialize(self, params):
if params.get("Product") is not None:
self.Product = Product()
self.Product._deserialize(params.get("Product"))
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)
class UpdateRuleRequest(AbstractModel):
    """UpdateRule request structure
    """
def __init__(self):
"""
        :param RuleId: Rule ID
        :type RuleId: str
        :param Name: Name
        :type Name: str
        :param Description: Description
        :type Description: str
        :param Query: Query
        :type Query: :class:`tencentcloud.iot.v20180123.models.RuleQuery`
        :param Actions: List of forwarding actions
        :type Actions: list of Action
        :param DataType: Data type (0: text, 1: binary)
:type DataType: int
"""
self.RuleId = None
self.Name = None
self.Description = None
self.Query = None
self.Actions = None
self.DataType = None
def _deserialize(self, params):
self.RuleId = params.get("RuleId")
self.Name = params.get("Name")
self.Description = params.get("Description")
if params.get("Query") is not None:
self.Query = RuleQuery()
self.Query._deserialize(params.get("Query"))
if params.get("Actions") is not None:
self.Actions = []
for item in params.get("Actions"):
obj = Action()
obj._deserialize(item)
self.Actions.append(obj)
self.DataType = params.get("DataType")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)
class UpdateRuleResponse(AbstractModel):
    """UpdateRule response structure
    """
def __init__(self):
"""
        :param Rule: Rule
:type Rule: :class:`tencentcloud.iot.v20180123.models.Rule`
        :param RequestId: Unique request ID, returned with every request. The RequestId of the request is required to locate a problem.
:type RequestId: str
"""
self.Rule = None
self.RequestId = None
def _deserialize(self, params):
if params.get("Rule") is not None:
self.Rule = Rule()
self.Rule._deserialize(params.get("Rule"))
self.RequestId = params.get("RequestId")
        member_set = set(params.keys())
        for name, value in vars(self).items():
            if name in member_set:
                member_set.remove(name)
        if len(member_set) > 0:
            warnings.warn("%s fields are useless." % ",".join(member_set), Warning)
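Every `_deserialize` above repeats the same unused-field warning logic. Pulled out as a standalone helper it behaves like the sketch below (`warn_unused_fields` and `Example` are illustrative names, not part of the SDK):

```python
import warnings

def warn_unused_fields(params, obj):
    # Collect keys of params that have no matching attribute on obj.
    member_set = set(params.keys())
    for name in vars(obj):
        member_set.discard(name)
    if member_set:
        warnings.warn("%s fields are useless." % ",".join(sorted(member_set)), Warning)

class Example:
    def __init__(self):
        self.Url = None

obj = Example()
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    warn_unused_fields({"Url": "u", "Extra": 1}, obj)

print(len(caught))  # -> 1: only "Extra" is unknown to Example
```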
# Source: build/SCA/script/x_modules/init.py (oliverpatrick/python-screen_click_ai, MIT license)
import time
import interface
class init():
    console = None
    TIMER = 5

    def __init__(self):
        self.console = interface.interface

    def startWait(self):
        print("{}{}".format(self.console.TERMINAL_INTERFACE[0], self.console.MODULE[0]))
        # Count down once per second from TIMER to 1.
        for remaining in range(self.TIMER, 0, -1):
            time.sleep(1)
            print("{}{}{}".format(self.console.TERMINAL_INTERFACE[0], self.console.MODULE[1], remaining))
        time.sleep(1)
        print("{}{}".format(self.console.TERMINAL_INTERFACE[0], self.console.MODULE[2]))
# Source: binarysearch.io/26_longest_common_subsequence.py (mishrakeshav/Competitive-Programming, MIT license)

# dp solution
def solve(self, a, b):
n = len(a)
m = len(b)
dp = dict()
def helper(a,b,i,j,n,m):
if (i,j) in dp:
return dp[(i,j)]
if i == n or j == m:
return 0
if(a[i] == b[j]):
dp[(i,j)] = helper(a,b,i+1,j+1,n,m) + 1
return dp[(i,j)]
m1 = helper(a,b,i+1,j,n,m)
m2 = helper(a,b,i,j+1,n,m)
dp[(i,j)] = max(m1,m2)
return dp[(i,j)]
return helper(a,b,0,0,n,m)
# recursive solution (same recurrence, no memoization, exponential time)
class Solution:
    def solve(self, a, b):
        n = len(a)
        m = len(b)

        def helper(a, b, i, j, n, m):
            if i == n or j == m:
                return 0
            if a[i] == b[j]:
                return helper(a, b, i + 1, j + 1, n, m) + 1
            m1 = helper(a, b, i + 1, j, n, m)
            m2 = helper(a, b, i, j + 1, n, m)
            return max(m1, m2)

        return helper(a, b, 0, 0, n, m)
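A quick self-contained sanity check of the recurrence both solutions implement (the `lcs` wrapper is ours, restating the memoized version):

```python
def lcs(a, b):
    # Memoized longest-common-subsequence length, same recurrence as Solution.solve.
    dp = {}
    def helper(i, j):
        if i == len(a) or j == len(b):
            return 0
        if (i, j) not in dp:
            if a[i] == b[j]:
                dp[(i, j)] = helper(i + 1, j + 1) + 1
            else:
                dp[(i, j)] = max(helper(i + 1, j), helper(i, j + 1))
        return dp[(i, j)]
    return helper(0, 0)

print(lcs("abcde", "ace"))  # -> 3 ("ace")
print(lcs("abc", "def"))    # -> 0
```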
# Source: src/hist/_internal/axis.py (LovelyBuggies/Nino-Hist, MIT license)
import re
import boost_histogram.axis as bha
class Regular(bha.Regular):
def __init__(
self,
bins,
start,
stop,
*,
name=None,
title=None,
underflow=True,
overflow=True,
growth=False,
circular=False,
transform=None
):
        if re.match(r"^[0-9a-zA-Z][0-9a-zA-Z_]*", name).group() == name:
            metadata = dict(name=name)
        else:
            raise Exception("Name should be a Python Identifier.")
metadata["title"] = title
super().__init__(
bins,
start,
stop,
metadata=metadata,
underflow=underflow,
overflow=overflow,
growth=growth,
circular=circular,
transform=transform,
)
def __repr__(self):
return "{self.__class__.__name__}({args}{kwargs})".format(
self=self, args=self._repr_args(), kwargs=self._repr_kwargs()
)
@property
def name(self):
'''
Get or set the name for the Regular axis
'''
return self.metadata["name"]
@name.setter
def name(self, value):
self.metadata["name"] = value
@property
def title(self):
'''
Get or set the title for the Regular axis
'''
return self.metadata["title"]
@title.setter
def title(self, value):
self.metadata["title"] = value
class Bool(bha.Regular):
def __init__(
self,
*,
name=None,
title=None,
underflow=False,
overflow=False,
growth=False,
circular=False,
transform=None
):
        if re.match(r"^[0-9a-zA-Z][0-9a-zA-Z_]*", name).group() == name:
            metadata = dict(name=name)
        else:
            raise Exception("Name should be a Python Identifier.")
metadata["title"] = title
super().__init__(
2,
0,
2,
metadata=metadata,
underflow=underflow,
overflow=overflow,
growth=growth,
circular=circular,
transform=transform,
)
def __repr__(self):
return "{self.__class__.__name__}({args}{kwargs})".format(
self=self, args=self._repr_args(), kwargs=self._repr_kwargs()
)
@property
def name(self):
'''
Get or set the name for the Bool axis
'''
return self.metadata["name"]
@name.setter
def name(self, value):
self.metadata["name"] = value
@property
def title(self):
'''
Get or set the title for the Bool axis
'''
return self.metadata["title"]
@title.setter
def title(self, value):
self.metadata["title"] = value
class Variable(bha.Variable):
def __init__(
self,
edges,
*,
name=None,
title=None,
underflow=True,
overflow=True,
growth=False
):
        if re.match(r"^[0-9a-zA-Z][0-9a-zA-Z_]*", name).group() == name:
            metadata = dict(name=name)
        else:
            raise Exception("Name should be a Python Identifier.")
metadata["title"] = title
super().__init__(
edges,
metadata=metadata,
underflow=underflow,
overflow=overflow,
growth=growth
)
def __repr__(self):
return "{self.__class__.__name__}({args}{kwargs})".format(
self=self, args=self._repr_args(), kwargs=self._repr_kwargs()
)
@property
def name(self):
'''
Get or set the name for the Variable axis
'''
return self.metadata["name"]
@name.setter
def name(self, value):
self.metadata["name"] = value
@property
def title(self):
'''
Get or set the title for the Variable axis
'''
return self.metadata["title"]
@title.setter
def title(self, value):
self.metadata["title"] = value
class Integer(bha.Integer):
def __init__(
self,
start,
stop,
*,
name=None,
title=None,
underflow=True,
overflow=True,
growth=False
):
        if re.match(r"^[0-9a-zA-Z][0-9a-zA-Z_]*", name).group() == name:
            metadata = dict(name=name)
        else:
            raise Exception("Name should be a Python Identifier.")
metadata["title"] = title
super().__init__(
start,
stop,
metadata=metadata,
underflow=underflow,
overflow=overflow,
growth=growth
)
def __repr__(self):
return "{self.__class__.__name__}({args}{kwargs})".format(
self=self, args=self._repr_args(), kwargs=self._repr_kwargs()
)
@property
def name(self):
'''
Get or set the name for the Integer axis
'''
return self.metadata["name"]
@name.setter
def name(self, value):
self.metadata["name"] = value
@property
def title(self):
'''
Get or set the title for the Integer axis
'''
return self.metadata["title"]
@title.setter
def title(self, value):
self.metadata["title"] = value
class IntCategory(bha.IntCategory):
def __init__(
self,
categories=None,
*,
name=None,
title=None,
growth=False
):
        if re.match(r"^[0-9a-zA-Z][0-9a-zA-Z_]*", name).group() == name:
            metadata = dict(name=name)
        else:
            raise Exception("Name should be a Python Identifier.")
metadata["title"] = title
super().__init__(
categories,
metadata=metadata,
growth=growth
)
def __repr__(self):
return "{self.__class__.__name__}({args}{kwargs})".format(
self=self, args=self._repr_args(), kwargs=self._repr_kwargs()
)
@property
def name(self):
'''
Get or set the name for the IntCategory axis
'''
return self.metadata["name"]
@name.setter
def name(self, value):
self.metadata["name"] = value
@property
def title(self):
'''
Get or set the title for the IntCategory axis
'''
return self.metadata["title"]
@title.setter
def title(self, value):
self.metadata["title"] = value
class StrCategory(bha.StrCategory):
def __init__(
self,
categories=None,
*,
name=None,
title=None,
growth=False
):
        if re.match(r"^[0-9a-zA-Z][0-9a-zA-Z_]*", name).group() == name:
            metadata = dict(name=name)
        else:
            raise Exception("Name should be a Python Identifier.")
metadata["title"] = title
super().__init__(
categories,
metadata=metadata,
growth=growth
)
def __repr__(self):
return "{self.__class__.__name__}({args}{kwargs})".format(
self=self, args=self._repr_args(), kwargs=self._repr_kwargs()
)
@property
def name(self):
'''
Get or set the name for the StrCategory axis
'''
return self.metadata["name"]
@name.setter
def name(self, value):
self.metadata["name"] = value
@property
def title(self):
'''
Get or set the title for the StrCategory axis
'''
return self.metadata["title"]
@title.setter
def title(self, value):
self.metadata["title"] = value
| 23.634969 | 73 | 0.519403 | 821 | 7,705 | 4.690621 | 0.079172 | 0.074786 | 0.01558 | 0.018696 | 0.948065 | 0.930408 | 0.930408 | 0.905479 | 0.905479 | 0.905479 | 0 | 0.005461 | 0.358339 | 7,705 | 325 | 74 | 23.707692 | 0.773463 | 0.066061 | 0 | 0.902954 | 0 | 0 | 0.10753 | 0.057234 | 0 | 0 | 0 | 0 | 0 | 1 | 0.151899 | false | 0 | 0.008439 | 0.025316 | 0.261603 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# Source: tests/Controller/DatabaseManagerTest.py (TheConstructRIT/Machine-Swipe-System, MIT license)
"""
Zachary Cook
Unit tests for the DatabaseManager.
"""
import unittest
import tempfile
import sqlite3
from Controller import DatabaseManager
from Model import Session,User
"""
Test the DatabaseManager class.
"""
class TestDatabaseManagerClass(unittest.TestCase):
"""
Sets up the unit test.
"""
def setUp(self):
# Create a temporary file for the database.
databaseFile = tempfile.TemporaryFile()
self.databaseFile = databaseFile.name
databaseFile.close()
del databaseFile
# Create the database.
self.initialDatabase = sqlite3.connect(self.databaseFile)
"""
Tests the constructor with an uninitialized database.
"""
def test_uninitializedDatabase(self):
# Close the initial database.
self.initialDatabase.close()
# Create the database and assert the tables exist by running queries without errors.
CuT = DatabaseManager.DatabaseManager(self.databaseFile)
CuT.database.execute("SELECT * FROM Users;")
CuT.database.execute("SELECT * FROM Sessions;")
"""
Tests the constructor with an initialized database.
"""
def test_initializedDatabase(self):
# Initialize and close the initial database.
self.initialDatabase.execute("CREATE TABLE Users (Id char(9),AccessType STRING);")
self.initialDatabase.execute("CREATE TABLE Sessions (Id char(9),StartTime BIGINT,EndTime BIGINT);")
self.initialDatabase.commit()
self.initialDatabase.close()
# Create the database and assert the tables exist by running queries without errors.
CuT = DatabaseManager.DatabaseManager(self.databaseFile)
CuT.database.execute("SELECT * FROM Users;")
CuT.database.execute("SELECT * FROM Sessions;")
"""
Tests the constructor with an initialized database with sessions.
"""
def test_initializedDatabaseWithSessions(self):
# Initialize and close the initial database.
self.initialDatabase.execute("CREATE TABLE Users (Id char(9),AccessType STRING);")
self.initialDatabase.execute("CREATE TABLE Sessions (Id char(9),StartTime BIGINT,EndTime BIGINT);")
self.initialDatabase.execute("INSERT INTO Sessions VALUES (\"000000001\",100,105);")
self.initialDatabase.execute("INSERT INTO Sessions VALUES (\"000000002\",106,109);")
self.initialDatabase.execute("INSERT INTO Sessions VALUES (\"000000002\",109,0);")
self.initialDatabase.commit()
self.initialDatabase.close()
# Create the database and assert sessions are correct.
CuT = DatabaseManager.DatabaseManager(self.databaseFile)
sessions = CuT.database.execute("SELECT * FROM Sessions;").fetchall()
self.assertEqual(sessions[0],("000000001",100,105),"Session is incorrect.")
self.assertEqual(sessions[1],("000000002",106,109),"Session is incorrect.")
self.assertEqual(sessions[2],("000000002",109,-1),"Session is incorrect.")
"""
Tests the getUserAccessType method.
"""
def test_getUserAccessType(self):
# Initialize and close the initial database.
self.initialDatabase.execute("CREATE TABLE Users (Id char(9),AccessType STRING);")
self.initialDatabase.execute("CREATE TABLE Sessions (Id char(9),StartTime BIGINT,EndTime BIGINT);")
self.initialDatabase.execute("INSERT INTO Users VALUES (\"000000001\",\"AUTHORIZED\");")
self.initialDatabase.execute("INSERT INTO Users VALUES (\"000000002\",\"ADMIN\");")
self.initialDatabase.commit()
self.initialDatabase.close()
# Create the database and assert the user types are correct.
CuT = DatabaseManager.DatabaseManager(self.databaseFile)
self.assertEqual(CuT.getUserAccessType("000000001"),"AUTHORIZED","Type is incorrect.")
self.assertEqual(CuT.getUserAccessType("000000002"),"ADMIN","Type is incorrect.")
self.assertEqual(CuT.getUserAccessType("000000003"),"UNAUTHORIZED","Type is incorrect.")
"""
Tests the setUserAccessType method.
"""
def test_setUserAccessType(self):
# Initialize and close the initial database.
self.initialDatabase.execute("CREATE TABLE Users (Id char(9),AccessType STRING);")
self.initialDatabase.execute("CREATE TABLE Sessions (Id char(9),StartTime BIGINT,EndTime BIGINT);")
self.initialDatabase.commit()
self.initialDatabase.close()
# Set a new user type and assert the entries are correct.
CuT = DatabaseManager.DatabaseManager(self.databaseFile)
CuT.setUserAccessType("000000001","AUTHORIZED")
users = CuT.database.execute("SELECT * FROM Users;").fetchall()
self.assertEqual(users[0],("000000001","AUTHORIZED"),"User is incorrect.")
# Set a new user type and assert the entries are correct.
CuT.setUserAccessType("000000002","ADMIN")
users = CuT.database.execute("SELECT * FROM Users;").fetchall()
self.assertEqual(users[0],("000000001","AUTHORIZED"),"User is incorrect.")
self.assertEqual(users[1],("000000002","ADMIN"),"User is incorrect.")
# Set a new user type and assert the entries are correct.
CuT.setUserAccessType("000000001","ADMIN")
users = CuT.database.execute("SELECT * FROM Users;").fetchall()
self.assertEqual(users[0],("000000001","ADMIN"),"User is incorrect.")
self.assertEqual(users[1],("000000002","ADMIN"),"User is incorrect.")
# Set a new user type and assert the entries are correct.
CuT.setUserAccessType("000000001","UNAUTHORIZED")
users = CuT.database.execute("SELECT * FROM Users;").fetchall()
self.assertEqual(users[0],("000000002","ADMIN"),"User is incorrect.")
"""
Tests the sessionStarted method.
"""
def test_sessionStarted(self):
# Initialize and close the initial database.
self.initialDatabase.execute("CREATE TABLE Users (Id char(9),AccessType STRING);")
self.initialDatabase.execute("CREATE TABLE Sessions (Id char(9),StartTime BIGINT,EndTime BIGINT);")
self.initialDatabase.commit()
self.initialDatabase.close()
# Create the database, start sessions, and assert the sessions are correct.
CuT = DatabaseManager.DatabaseManager(self.databaseFile)
CuT.sessionStarted(Session.Session(User.User("000000001",10),5))
CuT.sessionStarted(Session.Session(User.User("000000002",10),8))
CuT.sessionStarted(Session.Session(User.User("000000001",10),15))
sessions = CuT.database.execute("SELECT * FROM Sessions;").fetchall()
self.assertEqual(sessions[0],("000000001",5,0),"Session is incorrect.")
self.assertEqual(sessions[1],("000000002",8,0),"Session is incorrect.")
self.assertEqual(sessions[2],("000000001",15,0),"Session is incorrect.")
"""
Tests the sessionStarted method.
"""
def test_sessionEnded(self):
# Mock the Time module.
currentTime = 0
class MockTimeModule:
def getCurrentTimestamp(self):
return currentTime
DatabaseManager.Time = MockTimeModule()
# Initialize and close the initial database.
self.initialDatabase.execute("CREATE TABLE Users (Id char(9),AccessType STRING);")
self.initialDatabase.execute("CREATE TABLE Sessions (Id char(9),StartTime BIGINT,EndTime BIGINT);")
self.initialDatabase.commit()
self.initialDatabase.close()
# Create the database, start and end sessions, and assert the sessions are correct.
CuT = DatabaseManager.DatabaseManager(self.databaseFile)
CuT.sessionStarted(Session.Session(User.User("000000001",10),5))
currentTime = 7
CuT.sessionEnded(Session.Session(User.User("000000001",10),5))
CuT.sessionStarted(Session.Session(User.User("000000002",10),8))
currentTime = 9
CuT.sessionEnded(Session.Session(User.User("000000002",10),8))
CuT.sessionStarted(Session.Session(User.User("000000001",10),15))
sessions = CuT.database.execute("SELECT * FROM Sessions;").fetchall()
self.assertEqual(sessions[0],("000000001",5,7),"Session is incorrect.")
self.assertEqual(sessions[1],("000000002",8,9),"Session is incorrect.")
self.assertEqual(sessions[2],("000000001",15,0),"Session is incorrect.")
"""
Test the static methods.
"""
class TestStaticMethods(unittest.TestCase):
"""
Sets up the unit test.
"""
def setUp(self):
# Create a temporary file for the database.
databaseFile = tempfile.TemporaryFile()
self.databaseFile = databaseFile.name
databaseFile.close()
del databaseFile
# Create the database.
self.initialDatabase = sqlite3.connect(self.databaseFile)
"""
Tests the getUser method.
"""
def test_getUser(self):
# Initialize and close the initial database.
self.initialDatabase.execute("CREATE TABLE Users (Id char(9),AccessType STRING);")
self.initialDatabase.execute("CREATE TABLE Sessions (Id char(9),StartTime BIGINT,EndTime BIGINT);")
self.initialDatabase.execute("INSERT INTO Users VALUES (\"000000001\",\"AUTHORIZED\");")
self.initialDatabase.execute("INSERT INTO Users VALUES (\"000000002\",\"ADMIN\");")
self.initialDatabase.commit()
self.initialDatabase.close()
# Set the static database and assert the users are correct.
DatabaseManager.staticDatabaseManager = DatabaseManager.DatabaseManager(self.databaseFile)
self.assertEqual(DatabaseManager.getUser("000000001").getId(),"000000001","Id is incorrect.")
self.assertEqual(DatabaseManager.getUser("000000001").getAccessType(),"AUTHORIZED","Access type is incorrect.")
self.assertNotEqual(DatabaseManager.getUser("000000001").getSessionTime(),0,"Session time is zero.")
self.assertEqual(DatabaseManager.getUser("000000002").getId(),"000000002","Id is incorrect.")
self.assertEqual(DatabaseManager.getUser("000000002").getAccessType(),"ADMIN","Access type is incorrect.")
self.assertNotEqual(DatabaseManager.getUser("000000002").getSessionTime(),0,"Session time is zero.")
self.assertEqual(DatabaseManager.getUser("000000003").getId(),"000000003","Id is incorrect.")
self.assertEqual(DatabaseManager.getUser("000000003").getAccessType(),"UNAUTHORIZED","Access type is incorrect.")
self.assertEqual(DatabaseManager.getUser("000000003").getSessionTime(),0,"Session time is non-zero.")
"""
Tests the setUserAccessType method.
"""
def test_setUserAccessType(self):
# Initialize and close the initial database.
self.initialDatabase.execute("CREATE TABLE Users (Id char(9),AccessType STRING);")
self.initialDatabase.execute("CREATE TABLE Sessions (Id char(9),StartTime BIGINT,EndTime BIGINT);")
self.initialDatabase.commit()
self.initialDatabase.close()
# Set a new user type and assert the entries are correct.
DatabaseManager.staticDatabaseManager = DatabaseManager.DatabaseManager(self.databaseFile)
DatabaseManager.setUserAccessType("000000001", "AUTHORIZED")
users = DatabaseManager.staticDatabaseManager.database.execute("SELECT * FROM Users;").fetchall()
self.assertEqual(users[0], ("000000001", "AUTHORIZED"), "User is incorrect.")
# Set a new user type and assert the entries are correct.
DatabaseManager.setUserAccessType("000000002", "ADMIN")
users = DatabaseManager.staticDatabaseManager.database.execute("SELECT * FROM Users;").fetchall()
self.assertEqual(users[0], ("000000001", "AUTHORIZED"), "User is incorrect.")
self.assertEqual(users[1], ("000000002", "ADMIN"), "User is incorrect.")
# Set a new user type and assert the entries are correct.
DatabaseManager.setUserAccessType("000000001", "ADMIN")
users = DatabaseManager.staticDatabaseManager.database.execute("SELECT * FROM Users;").fetchall()
self.assertEqual(users[0], ("000000001", "ADMIN"), "User is incorrect.")
self.assertEqual(users[1], ("000000002", "ADMIN"), "User is incorrect.")
# Set a new user type and assert the entries are correct.
DatabaseManager.setUserAccessType("000000001", "UNAUTHORIZED")
users = DatabaseManager.staticDatabaseManager.database.execute("SELECT * FROM Users;").fetchall()
self.assertEqual(users[0], ("000000002", "ADMIN"), "User is incorrect.")
"""
Tests the sessionStarted method.
"""
def test_sessionStarted(self):
# Initialize and close the initial database.
self.initialDatabase.execute("CREATE TABLE Users (Id char(9),AccessType STRING);")
self.initialDatabase.execute("CREATE TABLE Sessions (Id char(9),StartTime BIGINT,EndTime BIGINT);")
self.initialDatabase.commit()
self.initialDatabase.close()
# Set the static database, start sessions, and assert the sessions are correct.
DatabaseManager.staticDatabaseManager = DatabaseManager.DatabaseManager(self.databaseFile)
DatabaseManager.sessionStarted(Session.Session(User.User("000000001",10),5))
DatabaseManager.sessionStarted(Session.Session(User.User("000000002",10),8))
DatabaseManager.sessionStarted(Session.Session(User.User("000000001",10),15))
sessions = DatabaseManager.staticDatabaseManager.database.execute("SELECT * FROM Sessions;").fetchall()
self.assertEqual(sessions[0], ("000000001",5,0), "Session is incorrect.")
self.assertEqual(sessions[1], ("000000002",8,0), "Session is incorrect.")
self.assertEqual(sessions[2], ("000000001",15,0), "Session is incorrect.")
"""
Tests the sessionEnded method.
"""
def test_sessionEnded(self):
# Mock the Time module.
currentTime = 0
class MockTimeModule:
def getCurrentTimestamp(self):
return currentTime
DatabaseManager.Time = MockTimeModule()
# Initialize and close the initial database.
self.initialDatabase.execute("CREATE TABLE Users (Id char(9),AccessType STRING);")
self.initialDatabase.execute("CREATE TABLE Sessions (Id char(9),StartTime BIGINT,EndTime BIGINT);")
self.initialDatabase.commit()
self.initialDatabase.close()
# Set the static database, start and end sessions, and assert the sessions are correct.
DatabaseManager.staticDatabaseManager = DatabaseManager.DatabaseManager(self.databaseFile)
DatabaseManager.sessionStarted(Session.Session(User.User("000000001",10),5))
currentTime = 7
DatabaseManager.sessionEnded(Session.Session(User.User("000000001",10),5))
DatabaseManager.sessionStarted(Session.Session(User.User("000000002",10),8))
currentTime = 9
DatabaseManager.sessionEnded(Session.Session(User.User("000000002",10),8))
DatabaseManager.sessionStarted(Session.Session(User.User("000000001",10),15))
sessions = DatabaseManager.staticDatabaseManager.database.execute("SELECT * FROM Sessions;").fetchall()
self.assertEqual(sessions[0],("000000001",5,7),"Session is incorrect.")
self.assertEqual(sessions[1],("000000002",8,9),"Session is incorrect.")
self.assertEqual(sessions[2],("000000001",15,0),"Session is incorrect.")
"""
Runs the unit tests.
"""
def main():
unittest.main()
# Run the tests.
if __name__ == '__main__':
main() | 49.158228 | 121 | 0.68894 | 1,638 | 15,534 | 6.521978 | 0.081197 | 0.088926 | 0.065712 | 0.059908 | 0.913133 | 0.899279 | 0.890948 | 0.850604 | 0.79809 | 0.790415 | 0 | 0.065522 | 0.186494 | 15,534 | 316 | 122 | 49.158228 | 0.779853 | 0.118643 | 0 | 0.754011 | 0 | 0 | 0.259133 | 0 | 0 | 0 | 0 | 0 | 0.208556 | 1 | 0.085562 | false | 0 | 0.026738 | 0.010695 | 0.144385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e03766089ca0ef96a66a5def7cd14718558586a9 | 2,952 | py | Python | lib/stolgo/trend.py | gmtpritam/stolgo | 8ced9b4c3ea2b0a89c929c2d2765ebc8593d00b2 | [
"MIT"
] | 125 | 2020-05-24T19:37:25.000Z | 2022-03-10T10:15:20.000Z | lib/stolgo/trend.py | gmtpritam/stolgo | 8ced9b4c3ea2b0a89c929c2d2765ebc8593d00b2 | [
"MIT"
] | 84 | 2020-05-05T13:04:38.000Z | 2021-12-21T01:51:42.000Z | lib/stolgo/trend.py | gmtpritam/stolgo | 8ced9b4c3ea2b0a89c929c2d2765ebc8593d00b2 | [
"MIT"
] | 38 | 2020-05-05T17:05:26.000Z | 2022-02-25T15:30:16.000Z | import pandas as pd
import sys

from stolgo.exception import BadDataError


class Trend:
    def __init__(self, periods=13, percentage=2):
        self.periods = periods
        self.percentage = percentage

    def is_giant_uptrend(self, dfs, periods=None, percentage=None):
        """Check whether the last ``periods`` candles form a giant uptrend.

        Every candle in the window must be green (Close above Open) and each
        close must be at least the previous candle's close.

        :param dfs: input candles
        :type dfs: pandas dataframe
        :param periods: number of candles, defaults to None
        :type periods: integer, optional
        :param percentage: range in percentage (currently unused), defaults to None
        :type percentage: float, optional
        :raises BadDataError: not enough candles for the requested window
        :return: True if the recent candles form a giant uptrend
        :rtype: bool
        """
        if not periods:
            periods = self.periods
        if not percentage:
            percentage = self.percentage
        if dfs.shape[0] < periods:
            raise BadDataError("Not enough data for the requested periods")
        recent_dfs = dfs[-1 * periods:]
        prev_candle = None
        for candle in recent_dfs.iterrows():
            # Check that the candle is green.
            if candle[1]["Close"] < candle[1]["Open"]:
                return False
            if prev_candle is None:
                prev_candle = candle
                continue
            if prev_candle[1]["Close"] > candle[1]["Close"]:
                return False
            # Track the previous candle so each close is compared with its
            # immediate predecessor rather than only the first candle.
            prev_candle = candle
        return True

    def is_giant_downtrend(self, dfs, periods=None, percentage=None):
        """Check whether the last ``periods`` candles form a giant downtrend.

        Every candle in the window must be red (Close below Open) and each
        close must be at most the previous candle's close.

        :param dfs: input candles
        :type dfs: pandas dataframe
        :param periods: number of candles, defaults to None
        :type periods: integer, optional
        :param percentage: range in percentage (currently unused), defaults to None
        :type percentage: float, optional
        :raises BadDataError: not enough candles for the requested window
        :return: True if the recent candles form a giant downtrend
        :rtype: bool
        """
        if not periods:
            periods = self.periods
        if not percentage:
            percentage = self.percentage
        if dfs.shape[0] < periods:
            raise BadDataError("Not enough data for the requested periods")
        recent_dfs = dfs[-1 * periods:]
        prev_candle = None
        for candle in recent_dfs.iterrows():
            # Check that the candle is red.
            if candle[1]["Close"] > candle[1]["Open"]:
                return False
            if prev_candle is None:
                prev_candle = candle
                continue
            if prev_candle[1]["Close"] < candle[1]["Close"]:
                return False
            prev_candle = candle
        return True
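Since `stolgo` itself may not be installed, here is a self-contained sketch of the uptrend check using plain pandas (the function name and `ValueError` stand in for the library's `Trend.is_giant_uptrend` and `BadDataError`):

```python
import pandas as pd

def is_giant_uptrend(dfs: pd.DataFrame, periods: int = 3) -> bool:
    # Every candle in the window must be green (Close > Open) and
    # each close must be at least the previous candle's close.
    if dfs.shape[0] < periods:
        raise ValueError("Not enough data for the requested periods")
    recent = dfs[-periods:]
    prev_close = None
    for _, candle in recent.iterrows():
        if candle["Close"] < candle["Open"]:
            return False  # red candle breaks the uptrend
        if prev_close is not None and prev_close > candle["Close"]:
            return False  # close fell below the previous close
        prev_close = candle["Close"]
    return True

up = pd.DataFrame({"Open": [10, 11, 12], "Close": [11, 12, 13]})
down = pd.DataFrame({"Open": [13, 12, 11], "Close": [12, 11, 10]})
print(is_giant_uptrend(up, 3))    # True
print(is_giant_uptrend(down, 3))  # False
```

The downtrend check is the mirror image: require `Close < Open` and non-increasing closes.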
# File: src/cosmosdb-preview/azext_cosmosdb_preview/vendored_sdks/azure_mgmt_cosmosdb/aio/operations/_cassandra_resources_operations.py
# (repo: ravithanneeru/azure-cli-extensions, MIT license)

# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
from typing import Any, AsyncIterable, Callable, Dict, Generic, Optional, TypeVar, Union
import warnings
from azure.core.async_paging import AsyncItemPaged, AsyncList
from azure.core.exceptions import ClientAuthenticationError, HttpResponseError, ResourceExistsError, ResourceNotFoundError, map_error
from azure.core.pipeline import PipelineResponse
from azure.core.pipeline.transport import AsyncHttpResponse, HttpRequest
from azure.core.polling import AsyncLROPoller, AsyncNoPolling, AsyncPollingMethod
from azure.mgmt.core.exceptions import ARMErrorFormat
from azure.mgmt.core.polling.async_arm_polling import AsyncARMPolling
from ... import models as _models
T = TypeVar('T')
ClsType = Optional[Callable[[PipelineResponse[HttpRequest, AsyncHttpResponse], T, Dict[str, Any]], Any]]
class CassandraResourcesOperations:
"""CassandraResourcesOperations async operations.
You should not instantiate this class directly. Instead, you should create a Client instance that
instantiates it for you and attaches it as an attribute.
:ivar models: Alias to model classes used in this operation group.
:type models: ~azure.mgmt.cosmosdb.models
:param client: Client for service requests.
:param config: Configuration of service client.
:param serializer: An object model serializer.
:param deserializer: An object model deserializer.
"""
models = _models
def __init__(self, client, config, serializer, deserializer) -> None:
self._client = client
self._serialize = serializer
self._deserialize = deserializer
self._config = config
    def list_cassandra_keyspaces(
        self,
        resource_group_name: str,
        account_name: str,
        **kwargs: Any
    ) -> AsyncIterable["_models.CassandraKeyspaceListResult"]:
        """Lists the Cassandra keyspaces under an existing Azure Cosmos DB database account.

        :param resource_group_name: The name of the resource group. The name is case insensitive.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: An iterator like instance of either CassandraKeyspaceListResult or the result of cls(response)
        :rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.mgmt.cosmosdb.models.CassandraKeyspaceListResult]
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["_models.CassandraKeyspaceListResult"]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2021-10-15-preview"
        accept = "application/json"

        def prepare_request(next_link=None):
            # Construct headers
            header_parameters = {}  # type: Dict[str, Any]
            header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

            if not next_link:
                # Construct URL
                url = self.list_cassandra_keyspaces.metadata['url']  # type: ignore
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
                    'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
                    'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
                }
                url = self._client.format_url(url, **path_format_arguments)
                # Construct parameters
                query_parameters = {}  # type: Dict[str, Any]
                query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

                request = self._client.get(url, query_parameters, header_parameters)
            else:
                url = next_link
                query_parameters = {}  # type: Dict[str, Any]
                request = self._client.get(url, query_parameters, header_parameters)
            return request

        async def extract_data(pipeline_response):
            deserialized = self._deserialize('CassandraKeyspaceListResult', pipeline_response)
            list_of_elem = deserialized.value
            if cls:
                list_of_elem = cls(list_of_elem)
            return None, AsyncList(list_of_elem)

        async def get_next(next_link=None):
            request = prepare_request(next_link)

            pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
            response = pipeline_response.http_response

            if response.status_code not in [200]:
                map_error(status_code=response.status_code, response=response, error_map=error_map)
                raise HttpResponseError(response=response, error_format=ARMErrorFormat)

            return pipeline_response

        return AsyncItemPaged(
            get_next, extract_data
        )
    list_cassandra_keyspaces.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces'}  # type: ignore
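The paging machinery above (`prepare_request`/`extract_data`/`get_next` driven by `AsyncItemPaged`) boils down to a continuation-link loop. A minimal sketch of that loop with a fake two-page service (`PAGES`, `fetch_page`, and `iter_all` are illustrative stand-ins, not azure-core APIs):

```python
import asyncio

# Fake paged service: next_link -> (items, next next_link)
PAGES = {None: (["ks1", "ks2"], "page2"), "page2": (["ks3"], None)}

async def fetch_page(next_link):
    return PAGES[next_link]

async def iter_all():
    # Follow next_link until the service stops returning one.
    items, next_link = [], None
    while True:
        page, next_link = await fetch_page(next_link)
        items.extend(page)
        if next_link is None:
            return items

print(asyncio.run(iter_all()))  # ['ks1', 'ks2', 'ks3']
```

`AsyncItemPaged` wraps exactly this pattern, additionally deserializing each page (`extract_data`) and mapping error status codes before yielding items.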
    async def get_cassandra_keyspace(
        self,
        resource_group_name: str,
        account_name: str,
        keyspace_name: str,
        **kwargs: Any
    ) -> "_models.CassandraKeyspaceGetResults":
        """Gets the Cassandra keyspaces under an existing Azure Cosmos DB database account with the
        provided name.

        :param resource_group_name: The name of the resource group. The name is case insensitive.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param keyspace_name: Cosmos DB keyspace name.
        :type keyspace_name: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: CassandraKeyspaceGetResults, or the result of cls(response)
        :rtype: ~azure.mgmt.cosmosdb.models.CassandraKeyspaceGetResults
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["_models.CassandraKeyspaceGetResults"]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2021-10-15-preview"
        accept = "application/json"

        # Construct URL
        url = self.get_cassandra_keyspace.metadata['url']  # type: ignore
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}  # type: Dict[str, Any]
        query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

        # Construct headers
        header_parameters = {}  # type: Dict[str, Any]
        header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

        request = self._client.get(url, query_parameters, header_parameters)
        pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [200]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            raise HttpResponseError(response=response, error_format=ARMErrorFormat)

        deserialized = self._deserialize('CassandraKeyspaceGetResults', pipeline_response)

        if cls:
            return cls(pipeline_response, deserialized, {})

        return deserialized
    get_cassandra_keyspace.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}'}  # type: ignore
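Each operation builds an `error_map` and hands unexpected status codes to `map_error`, which raises the mapped exception type. A sketch of that dispatch pattern (the exception classes and `map_error` here are simplified stand-ins for the azure-core originals):

```python
class ClientAuthenticationError(Exception):
    pass

class ResourceNotFoundError(Exception):
    pass

# Status-code -> exception-class table, as built at the top of each operation.
error_map = {401: ClientAuthenticationError, 404: ResourceNotFoundError}

def map_error(status_code, error_map):
    # Raise the mapped exception if one exists; otherwise do nothing and
    # let the caller raise a generic HTTP error.
    error_cls = error_map.get(status_code)
    if error_cls is not None:
        raise error_cls(f"HTTP {status_code}")

try:
    map_error(404, error_map)
except ResourceNotFoundError as err:
    print(err)  # HTTP 404
```

Unmapped codes fall through, which is why the generated code follows `map_error` with an unconditional `raise HttpResponseError(...)`.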
    async def _create_update_cassandra_keyspace_initial(
        self,
        resource_group_name: str,
        account_name: str,
        keyspace_name: str,
        create_update_cassandra_keyspace_parameters: "_models.CassandraKeyspaceCreateUpdateParameters",
        **kwargs: Any
    ) -> Optional["_models.CassandraKeyspaceGetResults"]:
        cls = kwargs.pop('cls', None)  # type: ClsType[Optional["_models.CassandraKeyspaceGetResults"]]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2021-10-15-preview"
        content_type = kwargs.pop("content_type", "application/json")
        accept = "application/json"

        # Construct URL
        url = self._create_update_cassandra_keyspace_initial.metadata['url']  # type: ignore
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}  # type: Dict[str, Any]
        query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

        # Construct headers
        header_parameters = {}  # type: Dict[str, Any]
        header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
        header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

        body_content_kwargs = {}  # type: Dict[str, Any]
        body_content = self._serialize.body(create_update_cassandra_keyspace_parameters, 'CassandraKeyspaceCreateUpdateParameters')
        body_content_kwargs['content'] = body_content
        request = self._client.put(url, query_parameters, header_parameters, **body_content_kwargs)
        pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [200, 202]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            raise HttpResponseError(response=response, error_format=ARMErrorFormat)

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('CassandraKeyspaceGetResults', pipeline_response)

        if cls:
            return cls(pipeline_response, deserialized, {})

        return deserialized
    _create_update_cassandra_keyspace_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}'}  # type: ignore
    async def begin_create_update_cassandra_keyspace(
        self,
        resource_group_name: str,
        account_name: str,
        keyspace_name: str,
        create_update_cassandra_keyspace_parameters: "_models.CassandraKeyspaceCreateUpdateParameters",
        **kwargs: Any
    ) -> AsyncLROPoller["_models.CassandraKeyspaceGetResults"]:
        """Create or update an Azure Cosmos DB Cassandra keyspace.

        :param resource_group_name: The name of the resource group. The name is case insensitive.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param keyspace_name: Cosmos DB keyspace name.
        :type keyspace_name: str
        :param create_update_cassandra_keyspace_parameters: The parameters to provide for the current
         Cassandra keyspace.
        :type create_update_cassandra_keyspace_parameters: ~azure.mgmt.cosmosdb.models.CassandraKeyspaceCreateUpdateParameters
        :keyword callable cls: A custom type or function that will be passed the direct response
        :keyword str continuation_token: A continuation token to restart a poller from a saved state.
        :keyword polling: By default, your polling method will be AsyncARMPolling.
         Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
        :paramtype polling: bool or ~azure.core.polling.AsyncPollingMethod
        :keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
        :return: An instance of AsyncLROPoller that returns either CassandraKeyspaceGetResults or the result of cls(response)
        :rtype: ~azure.core.polling.AsyncLROPoller[~azure.mgmt.cosmosdb.models.CassandraKeyspaceGetResults]
        :raises ~azure.core.exceptions.HttpResponseError:
        """
        polling = kwargs.pop('polling', True)  # type: Union[bool, AsyncPollingMethod]
        cls = kwargs.pop('cls', None)  # type: ClsType["_models.CassandraKeyspaceGetResults"]
        lro_delay = kwargs.pop(
            'polling_interval',
            self._config.polling_interval
        )
        cont_token = kwargs.pop('continuation_token', None)  # type: Optional[str]
        if cont_token is None:
            raw_result = await self._create_update_cassandra_keyspace_initial(
                resource_group_name=resource_group_name,
                account_name=account_name,
                keyspace_name=keyspace_name,
                create_update_cassandra_keyspace_parameters=create_update_cassandra_keyspace_parameters,
                cls=lambda x, y, z: x,
                **kwargs
            )

        kwargs.pop('error_map', None)
        kwargs.pop('content_type', None)

        def get_long_running_output(pipeline_response):
            deserialized = self._deserialize('CassandraKeyspaceGetResults', pipeline_response)
            if cls:
                return cls(pipeline_response, deserialized, {})
            return deserialized

        path_format_arguments = {
            'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
        }

        if polling is True: polling_method = AsyncARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
        elif polling is False: polling_method = AsyncNoPolling()
        else: polling_method = polling
        if cont_token:
            return AsyncLROPoller.from_continuation_token(
                polling_method=polling_method,
                continuation_token=cont_token,
                client=self._client,
                deserialization_callback=get_long_running_output
            )
        else:
            return AsyncLROPoller(self._client, raw_result, get_long_running_output, polling_method)
    begin_create_update_cassandra_keyspace.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}'}  # type: ignore
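The `begin_*` methods wrap a long-running-operation (LRO) pattern: issue the initial request, then poll the operation's status, sleeping `lro_delay` between polls, until a terminal state. A minimal sketch of that loop with a fake service (`fake_service` and `poll_until_done` are illustrative, not `AsyncARMPolling` itself):

```python
import asyncio

async def fake_service(state):
    # Pretend each poll advances the server-side operation; the third
    # poll reaches a terminal state.
    state["polls"] += 1
    return "Succeeded" if state["polls"] >= 3 else "InProgress"

async def poll_until_done(state, lro_delay=0.01):
    # Poll until a terminal state, waiting lro_delay between polls.
    while True:
        status = await fake_service(state)
        if status in ("Succeeded", "Failed", "Canceled"):
            return status
        await asyncio.sleep(lro_delay)

state = {"polls": 0}
print(asyncio.run(poll_until_done(state)))  # Succeeded
```

`AsyncLROPoller` adds on top of this: continuation tokens to resume a poller from a saved state, Retry-After header handling, and a deserialization callback (`get_long_running_output`) applied to the final response.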
    async def _delete_cassandra_keyspace_initial(
        self,
        resource_group_name: str,
        account_name: str,
        keyspace_name: str,
        **kwargs: Any
    ) -> None:
        cls = kwargs.pop('cls', None)  # type: ClsType[None]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2021-10-15-preview"

        # Construct URL
        url = self._delete_cassandra_keyspace_initial.metadata['url']  # type: ignore
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}  # type: Dict[str, Any]
        query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

        # Construct headers
        header_parameters = {}  # type: Dict[str, Any]

        request = self._client.delete(url, query_parameters, header_parameters)
        pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [202, 204]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            raise HttpResponseError(response=response, error_format=ARMErrorFormat)

        if cls:
            return cls(pipeline_response, None, {})
    _delete_cassandra_keyspace_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}'}  # type: ignore
    async def begin_delete_cassandra_keyspace(
        self,
        resource_group_name: str,
        account_name: str,
        keyspace_name: str,
        **kwargs: Any
    ) -> AsyncLROPoller[None]:
        """Deletes an existing Azure Cosmos DB Cassandra keyspace.

        :param resource_group_name: The name of the resource group. The name is case insensitive.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param keyspace_name: Cosmos DB keyspace name.
        :type keyspace_name: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :keyword str continuation_token: A continuation token to restart a poller from a saved state.
        :keyword polling: By default, your polling method will be AsyncARMPolling.
         Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
        :paramtype polling: bool or ~azure.core.polling.AsyncPollingMethod
        :keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
        :return: An instance of AsyncLROPoller that returns either None or the result of cls(response)
        :rtype: ~azure.core.polling.AsyncLROPoller[None]
        :raises ~azure.core.exceptions.HttpResponseError:
        """
        polling = kwargs.pop('polling', True)  # type: Union[bool, AsyncPollingMethod]
        cls = kwargs.pop('cls', None)  # type: ClsType[None]
        lro_delay = kwargs.pop(
            'polling_interval',
            self._config.polling_interval
        )
        cont_token = kwargs.pop('continuation_token', None)  # type: Optional[str]
        if cont_token is None:
            raw_result = await self._delete_cassandra_keyspace_initial(
                resource_group_name=resource_group_name,
                account_name=account_name,
                keyspace_name=keyspace_name,
                cls=lambda x, y, z: x,
                **kwargs
            )

        kwargs.pop('error_map', None)
        kwargs.pop('content_type', None)

        def get_long_running_output(pipeline_response):
            if cls:
                return cls(pipeline_response, None, {})

        path_format_arguments = {
            'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
        }

        if polling is True: polling_method = AsyncARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
        elif polling is False: polling_method = AsyncNoPolling()
        else: polling_method = polling
        if cont_token:
            return AsyncLROPoller.from_continuation_token(
                polling_method=polling_method,
                continuation_token=cont_token,
                client=self._client,
                deserialization_callback=get_long_running_output
            )
        else:
            return AsyncLROPoller(self._client, raw_result, get_long_running_output, polling_method)
    begin_delete_cassandra_keyspace.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}'}  # type: ignore
    async def get_cassandra_keyspace_throughput(
        self,
        resource_group_name: str,
        account_name: str,
        keyspace_name: str,
        **kwargs: Any
    ) -> "_models.ThroughputSettingsGetResults":
        """Gets the RUs per second of the Cassandra Keyspace under an existing Azure Cosmos DB database
        account with the provided name.

        :param resource_group_name: The name of the resource group. The name is case insensitive.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param keyspace_name: Cosmos DB keyspace name.
        :type keyspace_name: str
        :keyword callable cls: A custom type or function that will be passed the direct response
        :return: ThroughputSettingsGetResults, or the result of cls(response)
        :rtype: ~azure.mgmt.cosmosdb.models.ThroughputSettingsGetResults
        :raises: ~azure.core.exceptions.HttpResponseError
        """
        cls = kwargs.pop('cls', None)  # type: ClsType["_models.ThroughputSettingsGetResults"]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2021-10-15-preview"
        accept = "application/json"

        # Construct URL
        url = self.get_cassandra_keyspace_throughput.metadata['url']  # type: ignore
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}  # type: Dict[str, Any]
        query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

        # Construct headers
        header_parameters = {}  # type: Dict[str, Any]
        header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

        request = self._client.get(url, query_parameters, header_parameters)
        pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [200]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            raise HttpResponseError(response=response, error_format=ARMErrorFormat)

        deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)

        if cls:
            return cls(pipeline_response, deserialized, {})

        return deserialized
    get_cassandra_keyspace_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/throughputSettings/default'}  # type: ignore
    async def _update_cassandra_keyspace_throughput_initial(
        self,
        resource_group_name: str,
        account_name: str,
        keyspace_name: str,
        update_throughput_parameters: "_models.ThroughputSettingsUpdateParameters",
        **kwargs: Any
    ) -> Optional["_models.ThroughputSettingsGetResults"]:
        cls = kwargs.pop('cls', None)  # type: ClsType[Optional["_models.ThroughputSettingsGetResults"]]
        error_map = {
            401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
        }
        error_map.update(kwargs.pop('error_map', {}))
        api_version = "2021-10-15-preview"
        content_type = kwargs.pop("content_type", "application/json")
        accept = "application/json"

        # Construct URL
        url = self._update_cassandra_keyspace_throughput_initial.metadata['url']  # type: ignore
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}  # type: Dict[str, Any]
        query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')

        # Construct headers
        header_parameters = {}  # type: Dict[str, Any]
        header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
        header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')

        body_content_kwargs = {}  # type: Dict[str, Any]
        body_content = self._serialize.body(update_throughput_parameters, 'ThroughputSettingsUpdateParameters')
        body_content_kwargs['content'] = body_content
        request = self._client.put(url, query_parameters, header_parameters, **body_content_kwargs)
        pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
        response = pipeline_response.http_response

        if response.status_code not in [200, 202]:
            map_error(status_code=response.status_code, response=response, error_map=error_map)
            raise HttpResponseError(response=response, error_format=ARMErrorFormat)

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)

        if cls:
            return cls(pipeline_response, deserialized, {})

        return deserialized
    _update_cassandra_keyspace_throughput_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/throughputSettings/default'}  # type: ignore
async def begin_update_cassandra_keyspace_throughput(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
update_throughput_parameters: "_models.ThroughputSettingsUpdateParameters",
**kwargs: Any
) -> AsyncLROPoller["_models.ThroughputSettingsGetResults"]:
"""Update RUs per second of an Azure Cosmos DB Cassandra Keyspace.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:param update_throughput_parameters: The RUs per second of the parameters to provide for the
current Cassandra Keyspace.
:type update_throughput_parameters: ~azure.mgmt.cosmosdb.models.ThroughputSettingsUpdateParameters
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be AsyncARMPolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
:paramtype polling: bool or ~azure.core.polling.AsyncPollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of AsyncLROPoller that returns either ThroughputSettingsGetResults or the result of cls(response)
:rtype: ~azure.core.polling.AsyncLROPoller[~azure.mgmt.cosmosdb.models.ThroughputSettingsGetResults]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, AsyncPollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType["_models.ThroughputSettingsGetResults"]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = await self._update_cassandra_keyspace_throughput_initial(
resource_group_name=resource_group_name,
account_name=account_name,
keyspace_name=keyspace_name,
update_throughput_parameters=update_throughput_parameters,
                cls=lambda x, y, z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
}
        if polling is True:
            polling_method = AsyncARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
        elif polling is False:
            polling_method = AsyncNoPolling()
        else:
            polling_method = polling
if cont_token:
return AsyncLROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return AsyncLROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_update_cassandra_keyspace_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/throughputSettings/default'} # type: ignore
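The ``polling`` keyword accepted by every ``begin_*`` method above follows a three-way convention: ``True`` selects the default ``AsyncARMPolling``, ``False`` selects ``AsyncNoPolling``, and any other value is used directly as a caller-initialized polling object. A stdlib-only sketch of that dispatch (the factory callables below are stand-ins, not the real azure-core classes):

```python
def select_polling_method(polling, default_factory, no_polling_factory):
    """Mirror the polling dispatch used by the begin_* operations:
    True -> default ARM polling, False -> no polling, anything else
    is treated as an already-initialized polling object."""
    if polling is True:
        return default_factory()
    if polling is False:
        return no_polling_factory()
    return polling  # caller supplied their own polling strategy

# Stand-in factories for illustration:
arm = select_polling_method(True, lambda: "AsyncARMPolling", lambda: "AsyncNoPolling")
off = select_polling_method(False, lambda: "AsyncARMPolling", lambda: "AsyncNoPolling")
custom = select_polling_method("MyPoller", lambda: "AsyncARMPolling", lambda: "AsyncNoPolling")
```

The same branch appears verbatim in each long-running operation below, so a custom poller passed once behaves identically across all of them.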
async def _migrate_cassandra_keyspace_to_autoscale_initial(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
**kwargs: Any
) -> Optional["_models.ThroughputSettingsGetResults"]:
cls = kwargs.pop('cls', None) # type: ClsType[Optional["_models.ThroughputSettingsGetResults"]]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
accept = "application/json"
# Construct URL
url = self._migrate_cassandra_keyspace_to_autoscale_initial.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.post(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
_migrate_cassandra_keyspace_to_autoscale_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/throughputSettings/default/migrateToAutoscale'} # type: ignore
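The ``_*_initial`` helpers all handle the first response of the long-running operation the same way: 200 carries a deserialized ``ThroughputSettingsGetResults`` body, 202 means the request was accepted with no body yet, and any other status raises after consulting the error map. A stdlib-only sketch of that branch (names and the generic exception are illustrative, not the azure-core types):

```python
def handle_initial_response(status_code, body, deserialize, allowed=(200, 202)):
    """200 -> deserialized body; 202 -> accepted, result comes via polling;
    anything else -> error."""
    if status_code not in allowed:
        raise RuntimeError(f"unexpected status {status_code}")
    if status_code == 200:
        return deserialize(body)
    return None  # 202: accepted, the poller will fetch the final result
```

This is why the initial helpers are typed ``Optional[...]``: a 202 legitimately produces ``None`` until polling completes.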
async def begin_migrate_cassandra_keyspace_to_autoscale(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
**kwargs: Any
) -> AsyncLROPoller["_models.ThroughputSettingsGetResults"]:
"""Migrate an Azure Cosmos DB Cassandra Keyspace from manual throughput to autoscale.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be AsyncARMPolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
:paramtype polling: bool or ~azure.core.polling.AsyncPollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of AsyncLROPoller that returns either ThroughputSettingsGetResults or the result of cls(response)
:rtype: ~azure.core.polling.AsyncLROPoller[~azure.mgmt.cosmosdb.models.ThroughputSettingsGetResults]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, AsyncPollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType["_models.ThroughputSettingsGetResults"]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = await self._migrate_cassandra_keyspace_to_autoscale_initial(
resource_group_name=resource_group_name,
account_name=account_name,
keyspace_name=keyspace_name,
                cls=lambda x, y, z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
}
        if polling is True:
            polling_method = AsyncARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
        elif polling is False:
            polling_method = AsyncNoPolling()
        else:
            polling_method = polling
if cont_token:
return AsyncLROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return AsyncLROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_migrate_cassandra_keyspace_to_autoscale.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/throughputSettings/default/migrateToAutoscale'} # type: ignore
async def _migrate_cassandra_keyspace_to_manual_throughput_initial(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
**kwargs: Any
) -> Optional["_models.ThroughputSettingsGetResults"]:
cls = kwargs.pop('cls', None) # type: ClsType[Optional["_models.ThroughputSettingsGetResults"]]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
accept = "application/json"
# Construct URL
url = self._migrate_cassandra_keyspace_to_manual_throughput_initial.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.post(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
_migrate_cassandra_keyspace_to_manual_throughput_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/throughputSettings/default/migrateToManualThroughput'} # type: ignore
async def begin_migrate_cassandra_keyspace_to_manual_throughput(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
**kwargs: Any
) -> AsyncLROPoller["_models.ThroughputSettingsGetResults"]:
"""Migrate an Azure Cosmos DB Cassandra Keyspace from autoscale to manual throughput.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be AsyncARMPolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
:paramtype polling: bool or ~azure.core.polling.AsyncPollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of AsyncLROPoller that returns either ThroughputSettingsGetResults or the result of cls(response)
:rtype: ~azure.core.polling.AsyncLROPoller[~azure.mgmt.cosmosdb.models.ThroughputSettingsGetResults]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, AsyncPollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType["_models.ThroughputSettingsGetResults"]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = await self._migrate_cassandra_keyspace_to_manual_throughput_initial(
resource_group_name=resource_group_name,
account_name=account_name,
keyspace_name=keyspace_name,
                cls=lambda x, y, z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
}
        if polling is True:
            polling_method = AsyncARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
        elif polling is False:
            polling_method = AsyncNoPolling()
        else:
            polling_method = polling
if cont_token:
return AsyncLROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return AsyncLROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_migrate_cassandra_keyspace_to_manual_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/throughputSettings/default/migrateToManualThroughput'} # type: ignore
def list_cassandra_tables(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
**kwargs: Any
) -> AsyncIterable["_models.CassandraTableListResult"]:
"""Lists the Cassandra table under an existing Azure Cosmos DB database account.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: An iterator like instance of either CassandraTableListResult or the result of cls(response)
:rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.mgmt.cosmosdb.models.CassandraTableListResult]
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.CassandraTableListResult"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
accept = "application/json"
def prepare_request(next_link=None):
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
if not next_link:
# Construct URL
url = self.list_cassandra_tables.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
request = self._client.get(url, query_parameters, header_parameters)
else:
url = next_link
query_parameters = {} # type: Dict[str, Any]
request = self._client.get(url, query_parameters, header_parameters)
return request
async def extract_data(pipeline_response):
deserialized = self._deserialize('CassandraTableListResult', pipeline_response)
list_of_elem = deserialized.value
if cls:
list_of_elem = cls(list_of_elem)
return None, AsyncList(list_of_elem)
async def get_next(next_link=None):
request = prepare_request(next_link)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
return pipeline_response
return AsyncItemPaged(
get_next, extract_data
)
list_cassandra_tables.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/tables'} # type: ignore
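``list_cassandra_tables`` returns an ``AsyncItemPaged`` built from two callbacks: ``get_next`` fetches a page (following ``next_link`` when one is given) and ``extract_data`` yields the page's items plus the continuation. Note that this particular operation's ``extract_data`` always returns ``None`` as the continuation, so the result is effectively a single page. A stdlib-only sketch of the general paging loop, over fake page data (the page shape here is illustrative):

```python
import asyncio

async def iterate_pages(fetch_page):
    """Follow next_link until exhausted, yielding every item on the way."""
    next_link = None
    while True:
        page = await fetch_page(next_link)   # plays the role of get_next
        for item in page["value"]:           # plays the role of extract_data
            yield item
        next_link = page.get("nextLink")
        if not next_link:
            break

async def demo():
    # Two fake pages keyed by their link; the first points at the second.
    pages = {
        None: {"value": ["table1", "table2"], "nextLink": "page2"},
        "page2": {"value": ["table3"]},
    }

    async def fetch(link):
        return pages[link]

    return [name async for name in iterate_pages(fetch)]

tables = asyncio.run(demo())
```

With the real client you would simply write ``async for table in client.cassandra_resources.list_cassandra_tables(...)`` and let ``AsyncItemPaged`` drive this loop.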
async def get_cassandra_table(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
table_name: str,
**kwargs: Any
) -> "_models.CassandraTableGetResults":
"""Gets the Cassandra table under an existing Azure Cosmos DB database account.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:param table_name: Cosmos DB table name.
:type table_name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: CassandraTableGetResults, or the result of cls(response)
:rtype: ~azure.mgmt.cosmosdb.models.CassandraTableGetResults
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.CassandraTableGetResults"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
accept = "application/json"
# Construct URL
url = self.get_cassandra_table.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'tableName': self._serialize.url("table_name", table_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.get(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
deserialized = self._deserialize('CassandraTableGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
get_cassandra_table.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/tables/{tableName}'} # type: ignore
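Every operation serializes its path parameters with the constraints visible in ``path_format_arguments``: ``accountName`` must be 3 to 50 characters and match ``^[a-z0-9]+(-[a-z0-9]+)*``, and ``resourceGroupName`` must be 1 to 90 characters. A stdlib-only sketch of that client-side check (anchored here with a trailing ``$`` for a full match; the function name is illustrative):

```python
import re

# Lowercase alphanumeric runs separated by single hyphens, e.g. "my-cosmos-1".
ACCOUNT_NAME_PATTERN = re.compile(r"^[a-z0-9]+(-[a-z0-9]+)*$")

def validate_account_name(name):
    """Apply the accountName constraints used when serializing the URL:
    length 3-50 and the lowercase-alphanumeric-with-hyphens pattern."""
    if not 3 <= len(name) <= 50:
        raise ValueError("accountName must be 3-50 characters")
    if not ACCOUNT_NAME_PATTERN.match(name):
        raise ValueError("accountName must match ^[a-z0-9]+(-[a-z0-9]+)*")
    return name
```

Validating before the call surfaces bad names locally instead of as a serialization or HTTP error from the service.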
async def _create_update_cassandra_table_initial(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
table_name: str,
create_update_cassandra_table_parameters: "_models.CassandraTableCreateUpdateParameters",
**kwargs: Any
) -> Optional["_models.CassandraTableGetResults"]:
cls = kwargs.pop('cls', None) # type: ClsType[Optional["_models.CassandraTableGetResults"]]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
content_type = kwargs.pop("content_type", "application/json")
accept = "application/json"
# Construct URL
url = self._create_update_cassandra_table_initial.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'tableName': self._serialize.url("table_name", table_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
body_content_kwargs = {} # type: Dict[str, Any]
body_content = self._serialize.body(create_update_cassandra_table_parameters, 'CassandraTableCreateUpdateParameters')
body_content_kwargs['content'] = body_content
request = self._client.put(url, query_parameters, header_parameters, **body_content_kwargs)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('CassandraTableGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
_create_update_cassandra_table_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/tables/{tableName}'} # type: ignore
async def begin_create_update_cassandra_table(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
table_name: str,
create_update_cassandra_table_parameters: "_models.CassandraTableCreateUpdateParameters",
**kwargs: Any
) -> AsyncLROPoller["_models.CassandraTableGetResults"]:
"""Create or update an Azure Cosmos DB Cassandra Table.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:param table_name: Cosmos DB table name.
:type table_name: str
:param create_update_cassandra_table_parameters: The parameters to provide for the current
Cassandra Table.
:type create_update_cassandra_table_parameters: ~azure.mgmt.cosmosdb.models.CassandraTableCreateUpdateParameters
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be AsyncARMPolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
:paramtype polling: bool or ~azure.core.polling.AsyncPollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of AsyncLROPoller that returns either CassandraTableGetResults or the result of cls(response)
:rtype: ~azure.core.polling.AsyncLROPoller[~azure.mgmt.cosmosdb.models.CassandraTableGetResults]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, AsyncPollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType["_models.CassandraTableGetResults"]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = await self._create_update_cassandra_table_initial(
resource_group_name=resource_group_name,
account_name=account_name,
keyspace_name=keyspace_name,
table_name=table_name,
create_update_cassandra_table_parameters=create_update_cassandra_table_parameters,
                cls=lambda x, y, z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
deserialized = self._deserialize('CassandraTableGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'tableName': self._serialize.url("table_name", table_name, 'str'),
}
        if polling is True:
            polling_method = AsyncARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
        elif polling is False:
            polling_method = AsyncNoPolling()
        else:
            polling_method = polling
if cont_token:
return AsyncLROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return AsyncLROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_create_update_cassandra_table.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/tables/{tableName}'} # type: ignore
async def _delete_cassandra_table_initial(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
table_name: str,
**kwargs: Any
) -> None:
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
# Construct URL
url = self._delete_cassandra_table_initial.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'tableName': self._serialize.url("table_name", table_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
request = self._client.delete(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [202, 204]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
if cls:
return cls(pipeline_response, None, {})
_delete_cassandra_table_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/tables/{tableName}'} # type: ignore
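Each operation also builds its ``error_map`` the same way: a default mapping of 401, 404, and 409 to specific exception types, overlaid with any caller-supplied ``error_map`` popped from ``**kwargs``, so callers can reinterpret individual status codes per call. A stdlib-only sketch of the merge (the exception names are stand-in strings for the azure-core types):

```python
DEFAULT_ERROR_MAP = {
    401: "ClientAuthenticationError",
    404: "ResourceNotFoundError",
    409: "ResourceExistsError",
}

def build_error_map(**kwargs):
    """Merge caller overrides over the default status -> error mapping,
    popping 'error_map' from kwargs exactly as the operations do."""
    error_map = dict(DEFAULT_ERROR_MAP)
    error_map.update(kwargs.pop("error_map", {}))
    return error_map
```

For example, passing ``error_map={404: ResourceNotFoundError}`` only for the status you care about leaves the other defaults intact.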
async def begin_delete_cassandra_table(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
table_name: str,
**kwargs: Any
) -> AsyncLROPoller[None]:
"""Deletes an existing Azure Cosmos DB Cassandra table.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:param table_name: Cosmos DB table name.
:type table_name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be AsyncARMPolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
:paramtype polling: bool or ~azure.core.polling.AsyncPollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of AsyncLROPoller that returns either None or the result of cls(response)
:rtype: ~azure.core.polling.AsyncLROPoller[None]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, AsyncPollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType[None]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = await self._delete_cassandra_table_initial(
resource_group_name=resource_group_name,
account_name=account_name,
keyspace_name=keyspace_name,
table_name=table_name,
                cls=lambda x, y, z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
if cls:
return cls(pipeline_response, None, {})
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'tableName': self._serialize.url("table_name", table_name, 'str'),
}
        if polling is True:
            polling_method = AsyncARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
        elif polling is False:
            polling_method = AsyncNoPolling()
        else:
            polling_method = polling
if cont_token:
return AsyncLROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return AsyncLROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_delete_cassandra_table.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/tables/{tableName}'} # type: ignore
async def get_cassandra_table_throughput(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
table_name: str,
**kwargs: Any
) -> "_models.ThroughputSettingsGetResults":
"""Gets the RUs per second of the Cassandra table under an existing Azure Cosmos DB database
account with the provided name.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:param table_name: Cosmos DB table name.
:type table_name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: ThroughputSettingsGetResults, or the result of cls(response)
:rtype: ~azure.mgmt.cosmosdb.models.ThroughputSettingsGetResults
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.ThroughputSettingsGetResults"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
accept = "application/json"
# Construct URL
url = self.get_cassandra_table_throughput.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'tableName': self._serialize.url("table_name", table_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.get(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
get_cassandra_table_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/tables/{tableName}/throughputSettings/default'} # type: ignore
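The error handling in `get_cassandra_table_throughput` (and the other operations here) follows one pattern: an `error_map` maps known status codes to specific exception types via `map_error`, and anything unexpected falls through to a generic `HttpResponseError`. A rough sketch of that dispatch, with hypothetical exception names standing in for the azure.core types:

```python
class AuthError(Exception):
    """Stand-in for ClientAuthenticationError (401)."""


class NotFound(Exception):
    """Stand-in for ResourceNotFoundError (404)."""


class GenericHttpError(Exception):
    """Stand-in for HttpResponseError, the catch-all."""


ERROR_MAP = {401: AuthError, 404: NotFound}


def raise_for_status(status_code: int) -> str:
    # Mirrors: if status not in [200] -> map_error(...) then HttpResponseError
    if status_code == 200:
        return "ok"
    exc = ERROR_MAP.get(status_code, GenericHttpError)  # map_error lookup
    raise exc(f"status {status_code}")
```

Callers can extend the mapping per request; in the real code that is what `error_map.update(kwargs.pop('error_map', {}))` does.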
async def _update_cassandra_table_throughput_initial(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
table_name: str,
update_throughput_parameters: "_models.ThroughputSettingsUpdateParameters",
**kwargs: Any
) -> Optional["_models.ThroughputSettingsGetResults"]:
cls = kwargs.pop('cls', None) # type: ClsType[Optional["_models.ThroughputSettingsGetResults"]]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
content_type = kwargs.pop("content_type", "application/json")
accept = "application/json"
# Construct URL
url = self._update_cassandra_table_throughput_initial.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'tableName': self._serialize.url("table_name", table_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
body_content_kwargs = {} # type: Dict[str, Any]
body_content = self._serialize.body(update_throughput_parameters, 'ThroughputSettingsUpdateParameters')
body_content_kwargs['content'] = body_content
request = self._client.put(url, query_parameters, header_parameters, **body_content_kwargs)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
_update_cassandra_table_throughput_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/tables/{tableName}/throughputSettings/default'} # type: ignore
async def begin_update_cassandra_table_throughput(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
table_name: str,
update_throughput_parameters: "_models.ThroughputSettingsUpdateParameters",
**kwargs: Any
) -> AsyncLROPoller["_models.ThroughputSettingsGetResults"]:
"""Update RUs per second of an Azure Cosmos DB Cassandra table.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:param table_name: Cosmos DB table name.
:type table_name: str
:param update_throughput_parameters: The parameters to provide for the RUs per second of the
current Cassandra table.
:type update_throughput_parameters: ~azure.mgmt.cosmosdb.models.ThroughputSettingsUpdateParameters
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be AsyncARMPolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a custom polling strategy.
:paramtype polling: bool or ~azure.core.polling.AsyncPollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of AsyncLROPoller that returns either ThroughputSettingsGetResults or the result of cls(response)
:rtype: ~azure.core.polling.AsyncLROPoller[~azure.mgmt.cosmosdb.models.ThroughputSettingsGetResults]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, AsyncPollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType["_models.ThroughputSettingsGetResults"]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = await self._update_cassandra_table_throughput_initial(
resource_group_name=resource_group_name,
account_name=account_name,
keyspace_name=keyspace_name,
table_name=table_name,
update_throughput_parameters=update_throughput_parameters,
cls=lambda x,y,z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'tableName': self._serialize.url("table_name", table_name, 'str'),
}
if polling is True: polling_method = AsyncARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
elif polling is False: polling_method = AsyncNoPolling()
else: polling_method = polling
if cont_token:
return AsyncLROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return AsyncLROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_update_cassandra_table_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/tables/{tableName}/throughputSettings/default'} # type: ignore
async def _migrate_cassandra_table_to_autoscale_initial(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
table_name: str,
**kwargs: Any
) -> Optional["_models.ThroughputSettingsGetResults"]:
cls = kwargs.pop('cls', None) # type: ClsType[Optional["_models.ThroughputSettingsGetResults"]]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
accept = "application/json"
# Construct URL
url = self._migrate_cassandra_table_to_autoscale_initial.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'tableName': self._serialize.url("table_name", table_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.post(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
_migrate_cassandra_table_to_autoscale_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/tables/{tableName}/throughputSettings/default/migrateToAutoscale'} # type: ignore
async def begin_migrate_cassandra_table_to_autoscale(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
table_name: str,
**kwargs: Any
) -> AsyncLROPoller["_models.ThroughputSettingsGetResults"]:
"""Migrate an Azure Cosmos DB Cassandra table from manual throughput to autoscale.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:param table_name: Cosmos DB table name.
:type table_name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be AsyncARMPolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a custom polling strategy.
:paramtype polling: bool or ~azure.core.polling.AsyncPollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of AsyncLROPoller that returns either ThroughputSettingsGetResults or the result of cls(response)
:rtype: ~azure.core.polling.AsyncLROPoller[~azure.mgmt.cosmosdb.models.ThroughputSettingsGetResults]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, AsyncPollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType["_models.ThroughputSettingsGetResults"]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = await self._migrate_cassandra_table_to_autoscale_initial(
resource_group_name=resource_group_name,
account_name=account_name,
keyspace_name=keyspace_name,
table_name=table_name,
cls=lambda x,y,z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'tableName': self._serialize.url("table_name", table_name, 'str'),
}
if polling is True: polling_method = AsyncARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
elif polling is False: polling_method = AsyncNoPolling()
else: polling_method = polling
if cont_token:
return AsyncLROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return AsyncLROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_migrate_cassandra_table_to_autoscale.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/tables/{tableName}/throughputSettings/default/migrateToAutoscale'} # type: ignore
async def _migrate_cassandra_table_to_manual_throughput_initial(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
table_name: str,
**kwargs: Any
) -> Optional["_models.ThroughputSettingsGetResults"]:
cls = kwargs.pop('cls', None) # type: ClsType[Optional["_models.ThroughputSettingsGetResults"]]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
accept = "application/json"
# Construct URL
url = self._migrate_cassandra_table_to_manual_throughput_initial.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'tableName': self._serialize.url("table_name", table_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.post(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
_migrate_cassandra_table_to_manual_throughput_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/tables/{tableName}/throughputSettings/default/migrateToManualThroughput'} # type: ignore
async def begin_migrate_cassandra_table_to_manual_throughput(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
table_name: str,
**kwargs: Any
) -> AsyncLROPoller["_models.ThroughputSettingsGetResults"]:
"""Migrate an Azure Cosmos DB Cassandra table from autoscale to manual throughput.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:param table_name: Cosmos DB table name.
:type table_name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be AsyncARMPolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a custom polling strategy.
:paramtype polling: bool or ~azure.core.polling.AsyncPollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of AsyncLROPoller that returns either ThroughputSettingsGetResults or the result of cls(response)
:rtype: ~azure.core.polling.AsyncLROPoller[~azure.mgmt.cosmosdb.models.ThroughputSettingsGetResults]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, AsyncPollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType["_models.ThroughputSettingsGetResults"]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = await self._migrate_cassandra_table_to_manual_throughput_initial(
resource_group_name=resource_group_name,
account_name=account_name,
keyspace_name=keyspace_name,
table_name=table_name,
cls=lambda x,y,z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'tableName': self._serialize.url("table_name", table_name, 'str'),
}
if polling is True: polling_method = AsyncARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
elif polling is False: polling_method = AsyncNoPolling()
else: polling_method = polling
if cont_token:
return AsyncLROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return AsyncLROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_migrate_cassandra_table_to_manual_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/tables/{tableName}/throughputSettings/default/migrateToManualThroughput'} # type: ignore
def list_cassandra_views(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
**kwargs: Any
) -> AsyncIterable["_models.CassandraViewListResult"]:
"""Lists the Cassandra materialized views under an existing Azure Cosmos DB database account.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: An iterator-like instance of either CassandraViewListResult or the result of cls(response)
:rtype: ~azure.core.async_paging.AsyncItemPaged[~azure.mgmt.cosmosdb.models.CassandraViewListResult]
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.CassandraViewListResult"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
accept = "application/json"
def prepare_request(next_link=None):
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
if not next_link:
# Construct URL
url = self.list_cassandra_views.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
request = self._client.get(url, query_parameters, header_parameters)
else:
url = next_link
query_parameters = {} # type: Dict[str, Any]
request = self._client.get(url, query_parameters, header_parameters)
return request
async def extract_data(pipeline_response):
deserialized = self._deserialize('CassandraViewListResult', pipeline_response)
list_of_elem = deserialized.value
if cls:
list_of_elem = cls(list_of_elem)
return None, AsyncList(list_of_elem)
async def get_next(next_link=None):
request = prepare_request(next_link)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
return pipeline_response
return AsyncItemPaged(
get_next, extract_data
)
list_cassandra_views.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/views'} # type: ignore
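`list_cassandra_views` builds its paged iterable from three pieces: `prepare_request` (first URL vs. `next_link`), `get_next` (run the request), and `extract_data` (pull out `value` and the link to the next page). A self-contained async sketch of that loop, using plain dicts as stand-in pages rather than real `CassandraViewListResult` objects:

```python
import asyncio

# Two fake pages keyed by next_link; None is the first request.
PAGES = {
    None: {"value": ["view1", "view2"], "next": "page2"},
    "page2": {"value": ["view3"], "next": None},
}


async def fetch_page(next_link):
    # Stands in for prepare_request + pipeline.run: returns one page.
    return PAGES[next_link]


async def list_all():
    items, next_link = [], None
    while True:
        page = await fetch_page(next_link)  # get_next
        items.extend(page["value"])         # extract_data
        next_link = page["next"]
        if next_link is None:
            return items
```

The real `AsyncItemPaged` drives the same loop lazily, fetching each page only as iteration reaches it.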
async def get_cassandra_view(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
view_name: str,
**kwargs: Any
) -> "_models.CassandraViewGetResults":
"""Gets the Cassandra view under an existing Azure Cosmos DB database account.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:param view_name: Cosmos DB view name.
:type view_name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: CassandraViewGetResults, or the result of cls(response)
:rtype: ~azure.mgmt.cosmosdb.models.CassandraViewGetResults
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.CassandraViewGetResults"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
accept = "application/json"
# Construct URL
url = self.get_cassandra_view.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'viewName': self._serialize.url("view_name", view_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.get(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
deserialized = self._deserialize('CassandraViewGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
get_cassandra_view.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/views/{viewName}'} # type: ignore
async def _create_update_cassandra_view_initial(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
view_name: str,
create_update_cassandra_view_parameters: "_models.CassandraViewCreateUpdateParameters",
**kwargs: Any
) -> Optional["_models.CassandraViewGetResults"]:
cls = kwargs.pop('cls', None) # type: ClsType[Optional["_models.CassandraViewGetResults"]]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
content_type = kwargs.pop("content_type", "application/json")
accept = "application/json"
# Construct URL
url = self._create_update_cassandra_view_initial.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'viewName': self._serialize.url("view_name", view_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
body_content_kwargs = {} # type: Dict[str, Any]
body_content = self._serialize.body(create_update_cassandra_view_parameters, 'CassandraViewCreateUpdateParameters')
body_content_kwargs['content'] = body_content
request = self._client.put(url, query_parameters, header_parameters, **body_content_kwargs)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('CassandraViewGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
_create_update_cassandra_view_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/views/{viewName}'} # type: ignore
async def begin_create_update_cassandra_view(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
view_name: str,
create_update_cassandra_view_parameters: "_models.CassandraViewCreateUpdateParameters",
**kwargs: Any
) -> AsyncLROPoller["_models.CassandraViewGetResults"]:
"""Create or update an Azure Cosmos DB Cassandra View.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:param view_name: Cosmos DB view name.
:type view_name: str
:param create_update_cassandra_view_parameters: The parameters to provide for the current
Cassandra view.
:type create_update_cassandra_view_parameters: ~azure.mgmt.cosmosdb.models.CassandraViewCreateUpdateParameters
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be AsyncARMPolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
:paramtype polling: bool or ~azure.core.polling.AsyncPollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of AsyncLROPoller that returns either CassandraViewGetResults or the result of cls(response)
:rtype: ~azure.core.polling.AsyncLROPoller[~azure.mgmt.cosmosdb.models.CassandraViewGetResults]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, AsyncPollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType["_models.CassandraViewGetResults"]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = await self._create_update_cassandra_view_initial(
resource_group_name=resource_group_name,
account_name=account_name,
keyspace_name=keyspace_name,
view_name=view_name,
create_update_cassandra_view_parameters=create_update_cassandra_view_parameters,
cls=lambda x, y, z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
deserialized = self._deserialize('CassandraViewGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'viewName': self._serialize.url("view_name", view_name, 'str'),
}
if polling is True: polling_method = AsyncARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
elif polling is False: polling_method = AsyncNoPolling()
else: polling_method = polling
if cont_token:
return AsyncLROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return AsyncLROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_create_update_cassandra_view.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/views/{viewName}'} # type: ignore
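# Usage sketch (illustrative only): how a caller might drive this long-running
# operation. The client and operations-group names and all argument values
# below are hypothetical and not defined in this module.
#
#     poller = await client.cassandra_resources.begin_create_update_cassandra_view(
#         resource_group_name="my-rg",
#         account_name="my-account",
#         keyspace_name="my-keyspace",
#         view_name="my-view",
#         create_update_cassandra_view_parameters=params,
#     )
#     view = await poller.result()  # waits (asynchronously) until the LRO completes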
async def _delete_cassandra_view_initial(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
view_name: str,
**kwargs: Any
) -> None:
cls = kwargs.pop('cls', None) # type: ClsType[None]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
# Construct URL
url = self._delete_cassandra_view_initial.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'viewName': self._serialize.url("view_name", view_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
request = self._client.delete(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 202, 204]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
if cls:
return cls(pipeline_response, None, {})
_delete_cassandra_view_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/views/{viewName}'} # type: ignore
async def begin_delete_cassandra_view(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
view_name: str,
**kwargs: Any
) -> AsyncLROPoller[None]:
"""Deletes an existing Azure Cosmos DB Cassandra view.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:param view_name: Cosmos DB view name.
:type view_name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be AsyncARMPolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
:paramtype polling: bool or ~azure.core.polling.AsyncPollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of AsyncLROPoller that returns either None or the result of cls(response)
:rtype: ~azure.core.polling.AsyncLROPoller[None]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, AsyncPollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType[None]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = await self._delete_cassandra_view_initial(
resource_group_name=resource_group_name,
account_name=account_name,
keyspace_name=keyspace_name,
view_name=view_name,
cls=lambda x, y, z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
if cls:
return cls(pipeline_response, None, {})
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'viewName': self._serialize.url("view_name", view_name, 'str'),
}
if polling is True: polling_method = AsyncARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
elif polling is False: polling_method = AsyncNoPolling()
else: polling_method = polling
if cont_token:
return AsyncLROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return AsyncLROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_delete_cassandra_view.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/views/{viewName}'} # type: ignore
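# Usage sketch (illustrative; `client` is a hypothetical authenticated
# management client, not defined in this module):
#
#     poller = await client.cassandra_resources.begin_delete_cassandra_view(
#         "my-rg", "my-account", "my-keyspace", "my-view")
#     await poller.result()  # resolves to None once the deletion finishes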
async def get_cassandra_view_throughput(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
view_name: str,
**kwargs: Any
) -> "_models.ThroughputSettingsGetResults":
"""Gets the RUs per second of the Cassandra view under an existing Azure Cosmos DB database
account with the provided name.

:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:param view_name: Cosmos DB view name.
:type view_name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: ThroughputSettingsGetResults, or the result of cls(response)
:rtype: ~azure.mgmt.cosmosdb.models.ThroughputSettingsGetResults
:raises ~azure.core.exceptions.HttpResponseError:
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.ThroughputSettingsGetResults"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
accept = "application/json"
# Construct URL
url = self.get_cassandra_view_throughput.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'viewName': self._serialize.url("view_name", view_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.get(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
get_cassandra_view_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/views/{viewName}/throughputSettings/default'} # type: ignore
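# Usage sketch (illustrative; client name and argument values are hypothetical).
# The attribute access assumes the documented ThroughputSettingsGetResults model:
#
#     throughput = await client.cassandra_resources.get_cassandra_view_throughput(
#         "my-rg", "my-account", "my-keyspace", "my-view")
#     print(throughput.resource.throughput)  # provisioned RU/s, if set manually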
async def _update_cassandra_view_throughput_initial(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
view_name: str,
update_throughput_parameters: "_models.ThroughputSettingsUpdateParameters",
**kwargs: Any
) -> Optional["_models.ThroughputSettingsGetResults"]:
cls = kwargs.pop('cls', None) # type: ClsType[Optional["_models.ThroughputSettingsGetResults"]]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
content_type = kwargs.pop("content_type", "application/json")
accept = "application/json"
# Construct URL
url = self._update_cassandra_view_throughput_initial.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'viewName': self._serialize.url("view_name", view_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
body_content_kwargs = {} # type: Dict[str, Any]
body_content = self._serialize.body(update_throughput_parameters, 'ThroughputSettingsUpdateParameters')
body_content_kwargs['content'] = body_content
request = self._client.put(url, query_parameters, header_parameters, **body_content_kwargs)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
_update_cassandra_view_throughput_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/views/{viewName}/throughputSettings/default'} # type: ignore
async def begin_update_cassandra_view_throughput(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
view_name: str,
update_throughput_parameters: "_models.ThroughputSettingsUpdateParameters",
**kwargs: Any
) -> AsyncLROPoller["_models.ThroughputSettingsGetResults"]:
"""Update RUs per second of an Azure Cosmos DB Cassandra view.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:param view_name: Cosmos DB view name.
:type view_name: str
:param update_throughput_parameters: The parameters to provide for the RUs per second of the
current Cassandra view.
:type update_throughput_parameters: ~azure.mgmt.cosmosdb.models.ThroughputSettingsUpdateParameters
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be AsyncARMPolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
:paramtype polling: bool or ~azure.core.polling.AsyncPollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of AsyncLROPoller that returns either ThroughputSettingsGetResults or the result of cls(response)
:rtype: ~azure.core.polling.AsyncLROPoller[~azure.mgmt.cosmosdb.models.ThroughputSettingsGetResults]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, AsyncPollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType["_models.ThroughputSettingsGetResults"]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = await self._update_cassandra_view_throughput_initial(
resource_group_name=resource_group_name,
account_name=account_name,
keyspace_name=keyspace_name,
view_name=view_name,
update_throughput_parameters=update_throughput_parameters,
cls=lambda x, y, z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'viewName': self._serialize.url("view_name", view_name, 'str'),
}
if polling is True: polling_method = AsyncARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
elif polling is False: polling_method = AsyncNoPolling()
else: polling_method = polling
if cont_token:
return AsyncLROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return AsyncLROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_update_cassandra_view_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/views/{viewName}/throughputSettings/default'} # type: ignore
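# Usage sketch (illustrative; model construction assumes the public
# azure.mgmt.cosmosdb models, and the client name and values are hypothetical):
#
#     from azure.mgmt.cosmosdb.models import (
#         ThroughputSettingsResource, ThroughputSettingsUpdateParameters)
#     params = ThroughputSettingsUpdateParameters(
#         resource=ThroughputSettingsResource(throughput=400))
#     poller = await client.cassandra_resources.begin_update_cassandra_view_throughput(
#         "my-rg", "my-account", "my-keyspace", "my-view", params)
#     settings = await poller.result()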
async def _migrate_cassandra_view_to_autoscale_initial(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
view_name: str,
**kwargs: Any
) -> Optional["_models.ThroughputSettingsGetResults"]:
cls = kwargs.pop('cls', None) # type: ClsType[Optional["_models.ThroughputSettingsGetResults"]]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
accept = "application/json"
# Construct URL
url = self._migrate_cassandra_view_to_autoscale_initial.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'viewName': self._serialize.url("view_name", view_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.post(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
_migrate_cassandra_view_to_autoscale_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/views/{viewName}/throughputSettings/default/migrateToAutoscale'} # type: ignore
async def begin_migrate_cassandra_view_to_autoscale(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
view_name: str,
**kwargs: Any
) -> AsyncLROPoller["_models.ThroughputSettingsGetResults"]:
"""Migrate an Azure Cosmos DB Cassandra view from manual throughput to autoscale.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:param view_name: Cosmos DB view name.
:type view_name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be AsyncARMPolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
:paramtype polling: bool or ~azure.core.polling.AsyncPollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of AsyncLROPoller that returns either ThroughputSettingsGetResults or the result of cls(response)
:rtype: ~azure.core.polling.AsyncLROPoller[~azure.mgmt.cosmosdb.models.ThroughputSettingsGetResults]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, AsyncPollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType["_models.ThroughputSettingsGetResults"]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = await self._migrate_cassandra_view_to_autoscale_initial(
resource_group_name=resource_group_name,
account_name=account_name,
keyspace_name=keyspace_name,
view_name=view_name,
cls=lambda x, y, z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'viewName': self._serialize.url("view_name", view_name, 'str'),
}
if polling is True: polling_method = AsyncARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
elif polling is False: polling_method = AsyncNoPolling()
else: polling_method = polling
if cont_token:
return AsyncLROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return AsyncLROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_migrate_cassandra_view_to_autoscale.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/views/{viewName}/throughputSettings/default/migrateToAutoscale'} # type: ignore
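# Usage sketch (illustrative; names and values are hypothetical): migrating a
# view's provisioned throughput to autoscale takes no body, only the path arguments.
#
#     poller = await client.cassandra_resources.begin_migrate_cassandra_view_to_autoscale(
#         "my-rg", "my-account", "my-keyspace", "my-view")
#     settings = await poller.result()  # throughput settings after migration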
async def _migrate_cassandra_view_to_manual_throughput_initial(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
view_name: str,
**kwargs: Any
) -> Optional["_models.ThroughputSettingsGetResults"]:
cls = kwargs.pop('cls', None) # type: ClsType[Optional["_models.ThroughputSettingsGetResults"]]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-10-15-preview"
accept = "application/json"
# Construct URL
url = self._migrate_cassandra_view_to_manual_throughput_initial.metadata['url'] # type: ignore
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'viewName': self._serialize.url("view_name", view_name, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.post(url, query_parameters, header_parameters)
pipeline_response = await self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200, 202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
_migrate_cassandra_view_to_manual_throughput_initial.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/views/{viewName}/throughputSettings/default/migrateToManualThroughput'} # type: ignore
async def begin_migrate_cassandra_view_to_manual_throughput(
self,
resource_group_name: str,
account_name: str,
keyspace_name: str,
view_name: str,
**kwargs: Any
) -> AsyncLROPoller["_models.ThroughputSettingsGetResults"]:
"""Migrate an Azure Cosmos DB Cassandra view from autoscale to manual throughput.
:param resource_group_name: The name of the resource group. The name is case insensitive.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param keyspace_name: Cosmos DB keyspace name.
:type keyspace_name: str
:param view_name: Cosmos DB view name.
:type view_name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be AsyncARMPolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
:paramtype polling: bool or ~azure.core.polling.AsyncPollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of AsyncLROPoller that returns either ThroughputSettingsGetResults or the result of cls(response)
:rtype: ~azure.core.polling.AsyncLROPoller[~azure.mgmt.cosmosdb.models.ThroughputSettingsGetResults]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, AsyncPollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType["_models.ThroughputSettingsGetResults"]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = await self._migrate_cassandra_view_to_manual_throughput_initial(
resource_group_name=resource_group_name,
account_name=account_name,
keyspace_name=keyspace_name,
view_name=view_name,
cls=lambda x, y, z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
deserialized = self._deserialize('ThroughputSettingsGetResults', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
path_format_arguments = {
'subscriptionId': self._serialize.url("self._config.subscription_id", self._config.subscription_id, 'str', min_length=1),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
'viewName': self._serialize.url("view_name", view_name, 'str'),
}
if polling is True: polling_method = AsyncARMPolling(lro_delay, path_format_arguments=path_format_arguments, **kwargs)
elif polling is False: polling_method = AsyncNoPolling()
else: polling_method = polling
if cont_token:
return AsyncLROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return AsyncLROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_migrate_cassandra_view_to_manual_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/cassandraKeyspaces/{keyspaceName}/views/{viewName}/throughputSettings/default/migrateToManualThroughput'} # type: ignore
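# Usage sketch (illustrative; names and values are hypothetical): the inverse
# migration, back to manually provisioned throughput.
#
#     poller = await client.cassandra_resources.begin_migrate_cassandra_view_to_manual_throughput(
#         "my-rg", "my-account", "my-keyspace", "my-view")
#     settings = await poller.result()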
import testtools
from oslo_log import log
import netaddr
import base64
from six import moves
import tempest.api.compute.servers.test_attach_interfaces as test_attach_interfaces
import tempest.api.compute.servers.test_availability_zone as test_availability_zone
import tempest.api.compute.servers.test_create_server as test_create_server
import tempest.api.compute.servers.test_delete_server as test_delete_server
import tempest.api.compute.servers.test_instance_actions as test_instance_actions
import tempest.api.compute.servers.test_instance_actions_negative as test_instance_actions_negative
import tempest.api.compute.servers.test_list_server_filters as test_list_server_filters
import tempest.api.compute.servers.test_list_servers_negative as test_list_servers_negative
import tempest.api.compute.servers.test_multiple_create as test_multiple_create
import tempest.api.compute.servers.test_multiple_create_negative as test_multiple_create_negative
import tempest.api.compute.servers.test_server_actions as test_server_actions
import tempest.api.compute.servers.test_server_addresses as test_server_addresses
import tempest.api.compute.servers.test_server_addresses_negative as test_server_addresses_negative
import tempest.api.compute.servers.test_server_metadata as test_server_metadata
import tempest.api.compute.servers.test_server_metadata_negative as test_server_metadata_negative
import tempest.api.compute.servers.test_server_password as test_server_password
import tempest.api.compute.servers.test_server_personality as test_server_personality
import tempest.api.compute.servers.test_servers as test_servers
import tempest.api.compute.servers.test_servers_negative as test_servers_negative
import tempest.api.compute.servers.test_virtual_interfaces_negative as test_virtual_interfaces_negative
from tempest.common.utils import data_utils
from tempest.common.utils.linux import remote_client
from tempest.common import waiters
from tempest.common import fixed_network
from tempest.lib import exceptions as lib_exc
from tempest import test
from tempest import config
CONF = config.CONF
LOG = log.getLogger(__name__)
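These tests steer scheduling through three availability-zone options read from `CONF.compute` (`vcloud_availability_zone`, `aws_availability_zone`, `default_availability_zone`). A minimal sketch of the kind of `tempest.conf` fragment this module assumes follows; the option names are taken from the `CONF.compute.*` lookups below, but the zone values are illustrative placeholders, not real deployment values:

```python
import configparser

# Hypothetical tempest.conf fragment. The option names mirror this module's
# CONF.compute.* usage; the zone names are made up for illustration only.
TEMPEST_CONF = """\
[compute]
vcloud_availability_zone = az-vcloud
aws_availability_zone = az-aws
default_availability_zone = az-default
"""

parser = configparser.ConfigParser()
parser.read_string(TEMPEST_CONF)

# Each Hybrid* test class picks exactly one of these zones when it calls
# create_test_server(availability_zone=...).
vcloud_zone = parser.get('compute', 'vcloud_availability_zone')
aws_zone = parser.get('compute', 'aws_availability_zone')
```
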
class HybridAttachInterfacesVCloudTestJSON(test_attach_interfaces.AttachInterfacesTestJSON):
"""Test attach interfaces"""
def _create_server_get_interfaces(self):
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.vcloud_availability_zone)
ifs = (self.client.list_interfaces(server['id'])
['interfaceAttachments'])
body = self.wait_for_interface_status(
server['id'], ifs[0]['port_id'], 'ACTIVE')
ifs[0]['port_state'] = body['port_state']
return server, ifs
class HybridAttachInterfacesAWSTestJSON(test_attach_interfaces.AttachInterfacesTestJSON):
"""Test attach interfaces"""
def _create_server_get_interfaces(self):
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.aws_availability_zone)
ifs = (self.client.list_interfaces(server['id'])
['interfaceAttachments'])
body = self.wait_for_interface_status(
server['id'], ifs[0]['port_id'], 'ACTIVE')
ifs[0]['port_state'] = body['port_state']
return server, ifs
class HybridAZV2TestJSON(test_availability_zone.AZV2TestJSON):
"""Test AZ"""
class HybridCreateVCloudServersTestJSON(test_create_server.ServersTestJSON):
"""Test create servers"""
@classmethod
def resource_setup(cls):
cls.set_validation_resources()
super(test_create_server.ServersTestJSON, cls).resource_setup()
cls.meta = {'hello': 'world'}
cls.accessIPv4 = '1.1.1.1'
cls.accessIPv6 = '0000:0000:0000:0000:0000:babe:220.12.22.2'
cls.name = data_utils.rand_name('server')
cls.password = data_utils.rand_password()
disk_config = cls.disk_config
cls.server_initial = cls.create_test_server(
validatable=True,
wait_until='ACTIVE',
name=cls.name,
metadata=cls.meta,
accessIPv4=cls.accessIPv4,
accessIPv6=cls.accessIPv6,
disk_config=disk_config,
adminPass=cls.password,
availability_zone=CONF.compute.vcloud_availability_zone)
cls.server = (cls.client.show_server(cls.server_initial['id'])
['server'])
@test.attr(type='smoke')
@test.idempotent_id('5de47127-9977-400a-936f-abcfbec1218f')
def test_verify_server_details(self):
# Verify the specified server attributes are set correctly
self.assertEqual(self.accessIPv4, self.server['accessIPv4'])
# NOTE(maurosr): See http://tools.ietf.org/html/rfc5952 (section 4)
# Here we compare directly with the canonicalized format.
self.assertEqual(self.server['accessIPv6'],
str(netaddr.IPAddress(self.accessIPv6)))
self.assertEqual(self.name, self.server['name'])
self.assertEqual(self.image_ref, self.server['image']['id'])
self.assertEqual(self.flavor_ref, self.server['flavor']['id'])
self.assertEqual(self.meta, self.server['metadata'])
@testtools.skip('Do not support host operation')
@test.idempotent_id('ed20d3fb-9d1f-4329-b160-543fbd5d9811')
def test_create_server_with_scheduler_hint_group(self):
# Create a server with the scheduler hint "group".
name = data_utils.rand_name('server_group')
policies = ['affinity']
body = self.server_groups_client.create_server_group(
name=name, policies=policies)['server_group']
group_id = body['id']
self.addCleanup(self.server_groups_client.delete_server_group,
group_id)
hints = {'group': group_id}
server = self.create_test_server(scheduler_hints=hints,
wait_until='ACTIVE')
# Check a server is in the group
server_group = (self.server_groups_client.show_server_group(group_id)
['server_group'])
self.assertIn(server['id'], server_group['members'])
@test.idempotent_id('0578d144-ed74-43f8-8e57-ab10dbf9b3c2')
@testtools.skipUnless(CONF.service_available.neutron,
'Neutron service must be available.')
def test_verify_multiple_nics_order(self):
# Verify that the networks order given at the server creation is
# preserved within the server.
net1 = self._create_net_subnet_ret_net_from_cidr('19.80.0.0/24')
net2 = self._create_net_subnet_ret_net_from_cidr('19.86.0.0/24')
networks = [{'uuid': net1['network']['id']},
{'uuid': net2['network']['id']}]
server_multi_nics = self.create_test_server(
networks=networks, wait_until='ACTIVE', availability_zone=CONF.compute.vcloud_availability_zone)
# Cleanup server; this is needed in the test case because with the LIFO
# nature of the cleanups, if we don't delete the server first, the port
# will still be part of the subnet and we'll get a 409 from Neutron
# when trying to delete the subnet. The tear down in the base class
# will try to delete the server and get a 404 but it's ignored so
# we're OK.
def cleanup_server():
self.client.delete_server(server_multi_nics['id'])
waiters.wait_for_server_termination(self.client,
server_multi_nics['id'])
self.addCleanup(cleanup_server)
addresses = (self.client.list_addresses(server_multi_nics['id'])
['addresses'])
# We can't predict the ip addresses assigned to the server on networks.
# Sometimes the assigned addresses are ['19.80.0.2', '19.86.0.2'], at
# other times ['19.80.0.3', '19.86.0.3']. So we check if the first
# address is in first network, similarly second address is in second
# network.
addr = [addresses[net1['network']['name']][0]['addr'],
addresses[net2['network']['name']][0]['addr']]
networks = [netaddr.IPNetwork('19.80.0.0/24'),
netaddr.IPNetwork('19.86.0.0/24')]
for address, network in zip(addr, networks):
self.assertIn(address, network)
@test.idempotent_id('1678d144-ed74-43f8-8e57-ab10dbf9b3c2')
@testtools.skipUnless(CONF.service_available.neutron,
'Neutron service must be available.')
# The below skipUnless should be removed once Kilo-eol happens.
@testtools.skipUnless(CONF.compute_feature_enabled.
allow_duplicate_networks,
'Duplicate networks must be allowed')
def test_verify_duplicate_network_nics(self):
# Verify that server creation does not fail when more than one nic
# is created on the same network.
net1 = self._create_net_subnet_ret_net_from_cidr('19.80.0.0/24')
net2 = self._create_net_subnet_ret_net_from_cidr('19.86.0.0/24')
networks = [{'uuid': net1['network']['id']},
{'uuid': net2['network']['id']},
{'uuid': net1['network']['id']}]
server_multi_nics = self.create_test_server(
networks=networks, wait_until='ACTIVE', availability_zone=CONF.compute.vcloud_availability_zone)
def cleanup_server():
self.client.delete_server(server_multi_nics['id'])
waiters.wait_for_server_termination(self.client,
server_multi_nics['id'])
self.addCleanup(cleanup_server)
addresses = (self.client.list_addresses(server_multi_nics['id'])
['addresses'])
addr = [addresses[net1['network']['name']][0]['addr'],
addresses[net2['network']['name']][0]['addr'],
addresses[net1['network']['name']][1]['addr']]
networks = [netaddr.IPNetwork('19.80.0.0/24'),
netaddr.IPNetwork('19.86.0.0/24'),
netaddr.IPNetwork('19.80.0.0/24')]
for address, network in zip(addr, networks):
self.assertIn(address, network)
@testtools.skip('HybridCloud Bug:Do not support host operation')
@test.idempotent_id('ac1ad47f-984b-4441-9274-c9079b7a0666')
@testtools.skipUnless(CONF.validation.run_validation,
'Instance validation tests are disabled.')
def test_host_name_is_same_as_server_name(self):
# Verify the instance host name is the same as the server name
linux_client = remote_client.RemoteClient(
self.get_server_ip(self.server),
self.ssh_user,
self.password,
self.validation_resources['keypair']['private_key'])
self.assertTrue(linux_client.hostname_equals_servername(self.name))
class HybridCreateAwsServersTestJSON(test_create_server.ServersTestJSON):
"""Test create servers"""
@classmethod
def resource_setup(cls):
cls.set_validation_resources()
super(test_create_server.ServersTestJSON, cls).resource_setup()
cls.meta = {'hello': 'world'}
cls.accessIPv4 = '1.1.1.1'
cls.accessIPv6 = '0000:0000:0000:0000:0000:babe:220.12.22.2'
cls.name = data_utils.rand_name('server')
cls.password = data_utils.rand_password()
disk_config = cls.disk_config
cls.server_initial = cls.create_test_server(
validatable=True,
wait_until='ACTIVE',
name=cls.name,
metadata=cls.meta,
accessIPv4=cls.accessIPv4,
accessIPv6=cls.accessIPv6,
disk_config=disk_config,
adminPass=cls.password,
availability_zone=CONF.compute.aws_availability_zone)
cls.server = (cls.client.show_server(cls.server_initial['id'])
['server'])
@test.attr(type='smoke')
@test.idempotent_id('5de47127-9977-400a-936f-abcfbec1218f')
def test_verify_server_details(self):
# Verify the specified server attributes are set correctly
self.assertEqual(self.accessIPv4, self.server['accessIPv4'])
# NOTE(maurosr): See http://tools.ietf.org/html/rfc5952 (section 4)
# Here we compare directly with the canonicalized format.
self.assertEqual(self.server['accessIPv6'],
str(netaddr.IPAddress(self.accessIPv6)))
self.assertEqual(self.name, self.server['name'])
self.assertEqual(self.image_ref, self.server['image']['id'])
self.assertEqual(self.flavor_ref, self.server['flavor']['id'])
self.assertEqual(self.meta, self.server['metadata'])
@testtools.skip('Do not support host operation')
@test.idempotent_id('ed20d3fb-9d1f-4329-b160-543fbd5d9811')
def test_create_server_with_scheduler_hint_group(self):
# Create a server with the scheduler hint "group".
name = data_utils.rand_name('server_group')
policies = ['affinity']
body = self.server_groups_client.create_server_group(
name=name, policies=policies)['server_group']
group_id = body['id']
self.addCleanup(self.server_groups_client.delete_server_group,
group_id)
hints = {'group': group_id}
server = self.create_test_server(scheduler_hints=hints,
wait_until='ACTIVE')
# Check a server is in the group
server_group = (self.server_groups_client.show_server_group(group_id)
['server_group'])
self.assertIn(server['id'], server_group['members'])
@test.idempotent_id('0578d144-ed74-43f8-8e57-ab10dbf9b3c2')
@testtools.skipUnless(CONF.service_available.neutron,
'Neutron service must be available.')
def test_verify_multiple_nics_order(self):
# Verify that the networks order given at the server creation is
# preserved within the server.
net1 = self._create_net_subnet_ret_net_from_cidr('19.80.0.0/24')
net2 = self._create_net_subnet_ret_net_from_cidr('19.86.0.0/24')
networks = [{'uuid': net1['network']['id']},
{'uuid': net2['network']['id']}]
server_multi_nics = self.create_test_server(
networks=networks, wait_until='ACTIVE', availability_zone=CONF.compute.aws_availability_zone)
# Cleanup server; this is needed in the test case because with the LIFO
# nature of the cleanups, if we don't delete the server first, the port
# will still be part of the subnet and we'll get a 409 from Neutron
# when trying to delete the subnet. The tear down in the base class
# will try to delete the server and get a 404 but it's ignored so
# we're OK.
def cleanup_server():
self.client.delete_server(server_multi_nics['id'])
waiters.wait_for_server_termination(self.client,
server_multi_nics['id'])
self.addCleanup(cleanup_server)
addresses = (self.client.list_addresses(server_multi_nics['id'])
['addresses'])
# We can't predict the ip addresses assigned to the server on networks.
# Sometimes the assigned addresses are ['19.80.0.2', '19.86.0.2'], at
# other times ['19.80.0.3', '19.86.0.3']. So we check if the first
# address is in first network, similarly second address is in second
# network.
addr = [addresses[net1['network']['name']][0]['addr'],
addresses[net2['network']['name']][0]['addr']]
networks = [netaddr.IPNetwork('19.80.0.0/24'),
netaddr.IPNetwork('19.86.0.0/24')]
for address, network in zip(addr, networks):
self.assertIn(address, network)
@test.idempotent_id('1678d144-ed74-43f8-8e57-ab10dbf9b3c2')
@testtools.skipUnless(CONF.service_available.neutron,
'Neutron service must be available.')
# The below skipUnless should be removed once Kilo-eol happens.
@testtools.skipUnless(CONF.compute_feature_enabled.
allow_duplicate_networks,
'Duplicate networks must be allowed')
def test_verify_duplicate_network_nics(self):
# Verify that server creation does not fail when more than one nic
# is created on the same network.
net1 = self._create_net_subnet_ret_net_from_cidr('19.80.0.0/24')
net2 = self._create_net_subnet_ret_net_from_cidr('19.86.0.0/24')
networks = [{'uuid': net1['network']['id']},
{'uuid': net2['network']['id']},
{'uuid': net1['network']['id']}]
server_multi_nics = self.create_test_server(
networks=networks, wait_until='ACTIVE', availability_zone=CONF.compute.aws_availability_zone)
def cleanup_server():
self.client.delete_server(server_multi_nics['id'])
waiters.wait_for_server_termination(self.client,
server_multi_nics['id'])
self.addCleanup(cleanup_server)
addresses = (self.client.list_addresses(server_multi_nics['id'])
['addresses'])
addr = [addresses[net1['network']['name']][0]['addr'],
addresses[net2['network']['name']][0]['addr'],
addresses[net1['network']['name']][1]['addr']]
networks = [netaddr.IPNetwork('19.80.0.0/24'),
netaddr.IPNetwork('19.86.0.0/24'),
netaddr.IPNetwork('19.80.0.0/24')]
for address, network in zip(addr, networks):
self.assertIn(address, network)
@testtools.skip('HybridCloud Bug:Do not support host operation')
@test.idempotent_id('ac1ad47f-984b-4441-9274-c9079b7a0666')
@testtools.skipUnless(CONF.validation.run_validation,
'Instance validation tests are disabled.')
def test_host_name_is_same_as_server_name(self):
# Verify the instance host name is the same as the server name
linux_client = remote_client.RemoteClient(
self.get_server_ip(self.server),
self.ssh_user,
self.password,
self.validation_resources['keypair']['private_key'])
self.assertTrue(linux_client.hostname_equals_servername(self.name))
class HybridDeleteVCloudServersTestJSON(test_delete_server.DeleteServersTestJSON):
"""Test delete server"""
@test.idempotent_id('9e6e0c87-3352-42f7-9faf-5d6210dbd159')
def test_delete_server_while_in_building_state(self):
# Delete a server while it's VM state is Building
server = self.create_test_server(wait_until='BUILD', availability_zone=CONF.compute.vcloud_availability_zone)
self.client.delete_server(server['id'])
waiters.wait_for_server_termination(self.client, server['id'])
@test.idempotent_id('925fdfb4-5b13-47ea-ac8a-c36ae6fddb05')
def test_delete_active_server(self):
# Delete a server while it's VM state is Active
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.vcloud_availability_zone)
self.client.delete_server(server['id'])
waiters.wait_for_server_termination(self.client, server['id'])
@test.idempotent_id('546d368c-bb6c-4645-979a-83ed16f3a6be')
def test_delete_server_while_in_shutoff_state(self):
# Delete a server while it's VM state is Shutoff
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.vcloud_availability_zone)
self.client.stop_server(server['id'])
waiters.wait_for_server_status(self.client, server['id'], 'SHUTOFF')
self.client.delete_server(server['id'])
waiters.wait_for_server_termination(self.client, server['id'])
@test.idempotent_id('943bd6e8-4d7a-4904-be83-7a6cc2d4213b')
@testtools.skipUnless(CONF.compute_feature_enabled.pause,
'Pause is not available.')
def test_delete_server_while_in_pause_state(self):
# Delete a server while it's VM state is Pause
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.vcloud_availability_zone)
self.client.pause_server(server['id'])
waiters.wait_for_server_status(self.client, server['id'], 'PAUSED')
self.client.delete_server(server['id'])
waiters.wait_for_server_termination(self.client, server['id'])
@test.idempotent_id('1f82ebd3-8253-4f4e-b93f-de9b7df56d8b')
@testtools.skipUnless(CONF.compute_feature_enabled.suspend,
'Suspend is not available.')
def test_delete_server_while_in_suspended_state(self):
# Delete a server while it's VM state is Suspended
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.vcloud_availability_zone)
self.client.suspend_server(server['id'])
waiters.wait_for_server_status(self.client, server['id'], 'SUSPENDED')
self.client.delete_server(server['id'])
waiters.wait_for_server_termination(self.client, server['id'])
@test.idempotent_id('bb0cb402-09dd-4947-b6e5-5e7e1cfa61ad')
@testtools.skipUnless(CONF.compute_feature_enabled.shelve,
'Shelve is not available.')
def test_delete_server_while_in_shelved_state(self):
# Delete a server while it's VM state is Shelved
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.vcloud_availability_zone)
self.client.shelve_server(server['id'])
offload_time = CONF.compute.shelved_offload_time
if offload_time >= 0:
waiters.wait_for_server_status(self.client, server['id'],
'SHELVED_OFFLOADED',
extra_timeout=offload_time)
else:
waiters.wait_for_server_status(self.client, server['id'],
'SHELVED')
self.client.delete_server(server['id'])
waiters.wait_for_server_termination(self.client, server['id'])
@test.idempotent_id('ab0c38b4-cdd8-49d3-9b92-0cb898723c01')
@testtools.skipIf(not CONF.compute_feature_enabled.resize,
'Resize not available.')
def test_delete_server_while_in_verify_resize_state(self):
# Delete a server while it's VM state is VERIFY_RESIZE
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.vcloud_availability_zone)
self.client.resize_server(server['id'], self.flavor_ref_alt)
waiters.wait_for_server_status(self.client, server['id'],
'VERIFY_RESIZE')
self.client.delete_server(server['id'])
waiters.wait_for_server_termination(self.client, server['id'])
@testtools.skip('Volume test support this operation')
@test.idempotent_id('d0f3f0d6-d9b6-4a32-8da4-23015dcab23c')
@test.services('volume')
def test_delete_server_while_in_attached_volume(self):
# Delete a server while a volume is attached to it
volumes_client = self.volumes_extensions_client
device = '/dev/%s' % CONF.compute.volume_device_name
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.vcloud_availability_zone)
volume = (volumes_client.create_volume(size=CONF.volume.volume_size,
availability_zone=CONF.compute.vcloud_availability_zone)
['volume'])
LOG.info('create volume success')
self.addCleanup(volumes_client.delete_volume, volume['id'])
LOG.info('wait for volume to become available')
waiters.wait_for_volume_status(volumes_client,
volume['id'], 'available')
LOG.info('volume is available now')
LOG.info('start to attach volume')
self.client.attach_volume(server['id'],
volumeId=volume['id'],
device=device)
LOG.info('attach volume success')
waiters.wait_for_volume_status(volumes_client,
volume['id'], 'in-use')
LOG.info('volume is in use now')
LOG.info('start to delete server')
self.client.delete_server(server['id'])
LOG.info('wait for server termination')
waiters.wait_for_server_termination(self.client, server['id'])
LOG.info('server is deleted')
LOG.info('wait for volume to become available again')
waiters.wait_for_volume_status(volumes_client,
volume['id'], 'available')
LOG.info('volume is available.')
class HybridDeleteAwsServersTestJSON(test_delete_server.DeleteServersTestJSON):
"""Test delete server"""
@test.idempotent_id('9e6e0c87-3352-42f7-9faf-5d6210dbd159')
def test_delete_server_while_in_building_state(self):
# Delete a server while it's VM state is Building
server = self.create_test_server(wait_until='BUILD', availability_zone=CONF.compute.aws_availability_zone)
self.client.delete_server(server['id'])
waiters.wait_for_server_termination(self.client, server['id'])
@test.idempotent_id('925fdfb4-5b13-47ea-ac8a-c36ae6fddb05')
def test_delete_active_server(self):
# Delete a server while it's VM state is Active
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.aws_availability_zone)
self.client.delete_server(server['id'])
waiters.wait_for_server_termination(self.client, server['id'])
@test.idempotent_id('546d368c-bb6c-4645-979a-83ed16f3a6be')
def test_delete_server_while_in_shutoff_state(self):
# Delete a server while it's VM state is Shutoff
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.aws_availability_zone)
self.client.stop_server(server['id'])
waiters.wait_for_server_status(self.client, server['id'], 'SHUTOFF')
self.client.delete_server(server['id'])
waiters.wait_for_server_termination(self.client, server['id'])
@testtools.skip('Do not support host operation')
@test.idempotent_id('943bd6e8-4d7a-4904-be83-7a6cc2d4213b')
@testtools.skipUnless(CONF.compute_feature_enabled.pause,
'Pause is not available.')
def test_delete_server_while_in_pause_state(self):
# Delete a server while it's VM state is Pause
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.aws_availability_zone)
self.client.pause_server(server['id'])
waiters.wait_for_server_status(self.client, server['id'], 'PAUSED')
self.client.delete_server(server['id'])
waiters.wait_for_server_termination(self.client, server['id'])
@testtools.skip('Do not support host operation')
@test.idempotent_id('1f82ebd3-8253-4f4e-b93f-de9b7df56d8b')
@testtools.skipUnless(CONF.compute_feature_enabled.suspend,
'Suspend is not available.')
def test_delete_server_while_in_suspended_state(self):
# Delete a server while it's VM state is Suspended
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.aws_availability_zone)
self.client.suspend_server(server['id'])
waiters.wait_for_server_status(self.client, server['id'], 'SUSPENDED')
self.client.delete_server(server['id'])
waiters.wait_for_server_termination(self.client, server['id'])
@testtools.skip('Do not support host operation')
@test.idempotent_id('bb0cb402-09dd-4947-b6e5-5e7e1cfa61ad')
@testtools.skipUnless(CONF.compute_feature_enabled.shelve,
'Shelve is not available.')
def test_delete_server_while_in_shelved_state(self):
# Delete a server while it's VM state is Shelved
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.aws_availability_zone)
self.client.shelve_server(server['id'])
offload_time = CONF.compute.shelved_offload_time
if offload_time >= 0:
waiters.wait_for_server_status(self.client, server['id'],
'SHELVED_OFFLOADED',
extra_timeout=offload_time)
else:
waiters.wait_for_server_status(self.client, server['id'],
'SHELVED')
self.client.delete_server(server['id'])
waiters.wait_for_server_termination(self.client, server['id'])
@testtools.skip('Do not support host operation')
@test.idempotent_id('ab0c38b4-cdd8-49d3-9b92-0cb898723c01')
@testtools.skipIf(not CONF.compute_feature_enabled.resize,
'Resize not available.')
def test_delete_server_while_in_verify_resize_state(self):
# Delete a server while it's VM state is VERIFY_RESIZE
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.aws_availability_zone)
self.client.resize_server(server['id'], self.flavor_ref_alt)
waiters.wait_for_server_status(self.client, server['id'],
'VERIFY_RESIZE')
self.client.delete_server(server['id'])
waiters.wait_for_server_termination(self.client, server['id'])
@test.idempotent_id('d0f3f0d6-d9b6-4a32-8da4-23015dcab23c')
@test.services('volume')
def test_delete_server_while_in_attached_volume(self):
# Delete a server while a volume is attached to it
volumes_client = self.volumes_extensions_client
device = '/dev/%s' % CONF.compute.volume_device_name
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.aws_availability_zone)
volume = (volumes_client.create_volume(size=CONF.volume.volume_size,
availability_zone=CONF.compute.aws_availability_zone)
['volume'])
LOG.info('volume created')
self.addCleanup(volumes_client.delete_volume, volume['id'])
waiters.wait_for_volume_status(volumes_client,
volume['id'], 'available')
LOG.info('volume is available; attaching it to the server')
self.client.attach_volume(server['id'],
volumeId=volume['id'],
device=device)
waiters.wait_for_volume_status(volumes_client,
volume['id'], 'in-use')
LOG.info('volume is in use; deleting the server')
self.client.delete_server(server['id'])
try:
    wait_for_server_termination(self.client, server['id'])
except Exception:
    LOG.exception('wait_for_server_termination failed')
LOG.info('waiting for the volume to become available again')
waiters.wait_for_volume_status(volumes_client,
volume['id'], 'available')
LOG.info('volume is available again')
import time
from tempest import exceptions
import traceback
def wait_for_server_termination(client, server_id, ignore_error=False):
"""Waits for server to reach termination."""
start_time = int(time.time())
while True:
    try:
        body = client.show_server(server_id)['server']
    except lib_exc.NotFound:
        return
    except Exception:
        LOG.exception('show_server(%s) failed; will retry', server_id)
    else:
        server_status = body['status']
        if server_status == 'ERROR' and not ignore_error:
            raise exceptions.BuildErrorException(server_id=server_id)
    if int(time.time()) - start_time >= client.build_timeout:
        raise exceptions.TimeoutException
    time.sleep(client.build_interval)
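The polling shape used by `wait_for_server_termination` above — poll the server, treat "not found" as success, fail fast on `ERROR`, and time out otherwise — can be exercised in isolation with a fake client. `FakeServersClient` and its canned status sequence below are invented purely for illustration; only the loop structure mirrors the real helper:

```python
import time

class FakeServersClient(object):
    """Minimal stand-in for the tempest servers client (illustrative only)."""
    build_timeout = 5      # seconds before the wait gives up
    build_interval = 0.01  # seconds between polls

    def __init__(self, statuses):
        self._statuses = iter(statuses)

    def show_server(self, server_id):
        status = next(self._statuses)
        if status == 'DELETED':
            # Plays the role of lib_exc.NotFound in the real client.
            raise LookupError(server_id)
        return {'server': {'status': status}}

def wait_for_termination(client, server_id, ignore_error=False):
    """Same polling shape as wait_for_server_termination above."""
    start = time.time()
    while True:
        try:
            body = client.show_server(server_id)['server']
        except LookupError:
            return  # server is gone: success
        if body['status'] == 'ERROR' and not ignore_error:
            raise RuntimeError('server %s went to ERROR' % server_id)
        if time.time() - start >= client.build_timeout:
            raise RuntimeError('timed out waiting for %s' % server_id)
        time.sleep(client.build_interval)

# e.g. wait_for_termination(FakeServersClient(['ACTIVE', 'DELETED']), 'srv-1')
```

The `else`-less flow works because a successful poll always reaches the timeout check before sleeping, so a server stuck in `ACTIVE` still times out deterministically.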
class HybridDeleteVCloudServersAdminTestJSON(test_delete_server.DeleteServersAdminTestJSON):
"""Test delete admin servers"""
@test.idempotent_id('99774678-e072-49d1-9d2a-49a59bc56063')
def test_delete_server_while_in_error_state(self):
# Delete a server while it's VM state is error
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.vcloud_availability_zone)
self.admin_client.reset_state(server['id'], state='error')
# Verify server's state
server = self.non_admin_client.show_server(server['id'])['server']
self.assertEqual(server['status'], 'ERROR')
self.non_admin_client.delete_server(server['id'])
waiters.wait_for_server_termination(self.servers_client,
server['id'],
ignore_error=True)
@test.idempotent_id('73177903-6737-4f27-a60c-379e8ae8cf48')
def test_admin_delete_servers_of_others(self):
# Administrator can delete servers of others
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.vcloud_availability_zone)
self.admin_client.delete_server(server['id'])
waiters.wait_for_server_termination(self.servers_client, server['id'])
class HybridDeleteAwsServersAdminTestJSON(test_delete_server.DeleteServersAdminTestJSON):
"""Test delete admin servers"""
@test.idempotent_id('99774678-e072-49d1-9d2a-49a59bc56063')
def test_delete_server_while_in_error_state(self):
# Delete a server while it's VM state is error
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.aws_availability_zone)
self.admin_client.reset_state(server['id'], state='error')
# Verify server's state
server = self.non_admin_client.show_server(server['id'])['server']
self.assertEqual(server['status'], 'ERROR')
self.non_admin_client.delete_server(server['id'])
waiters.wait_for_server_termination(self.servers_client,
server['id'],
ignore_error=True)
@test.idempotent_id('73177903-6737-4f27-a60c-379e8ae8cf48')
def test_admin_delete_servers_of_others(self):
# Administrator can delete servers of others
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.aws_availability_zone)
self.admin_client.delete_server(server['id'])
waiters.wait_for_server_termination(self.servers_client, server['id'])
class HybridVCloudInstanceActionsTestJSON(test_instance_actions.InstanceActionsTestJSON):
"""Test instance action"""
@classmethod
def resource_setup(cls):
super(test_instance_actions.InstanceActionsTestJSON, cls).resource_setup()
server = cls.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.vcloud_availability_zone)
cls.request_id = server.response['x-compute-request-id']
cls.server_id = server['id']
class HybridAwsInstanceActionsTestJSON(test_instance_actions.InstanceActionsTestJSON):
"""Test instance action"""
@classmethod
def resource_setup(cls):
super(test_instance_actions.InstanceActionsTestJSON, cls).resource_setup()
server = cls.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.aws_availability_zone)
cls.request_id = server.response['x-compute-request-id']
cls.server_id = server['id']
class HybridVCloudInstanceActionsNegativeTestJSON(test_instance_actions_negative.InstanceActionsNegativeTestJSON):
"""Test instance negative action"""
@classmethod
def resource_setup(cls):
super(test_instance_actions_negative.InstanceActionsNegativeTestJSON, cls).resource_setup()
server = cls.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.vcloud_availability_zone)
cls.server_id = server['id']
class HybridAwsInstanceActionsNegativeTestJSON(test_instance_actions_negative.InstanceActionsNegativeTestJSON):
"""Test instance negative action"""
@classmethod
def resource_setup(cls):
super(test_instance_actions_negative.InstanceActionsNegativeTestJSON, cls).resource_setup()
server = cls.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.aws_availability_zone)
cls.server_id = server['id']
class HybridListServerFiltersTestJSON(test_list_server_filters.ListServerFiltersTestJSON):
"""Test list server filters"""
@classmethod
def resource_setup(cls):
super(test_list_server_filters.ListServerFiltersTestJSON, cls).resource_setup()
# Check to see if the alternate image ref actually exists...
images_client = cls.compute_images_client
images = images_client.list_images()['images']
if cls.image_ref != cls.image_ref_alt and \
any([image for image in images
if image['id'] == cls.image_ref_alt]):
cls.multiple_images = True
else:
cls.image_ref_alt = cls.image_ref
# Do some sanity checks here. If one of the images does
# not exist, fail early since the tests won't work...
try:
cls.compute_images_client.show_image(cls.image_ref)
except lib_exc.NotFound:
raise RuntimeError("Image %s (image_ref) was not found!" %
cls.image_ref)
try:
cls.compute_images_client.show_image(cls.image_ref_alt)
except lib_exc.NotFound:
raise RuntimeError("Image %s (image_ref_alt) was not found!" %
cls.image_ref_alt)
network = cls.get_tenant_network()
if network:
cls.fixed_network_name = network.get('name')
else:
cls.fixed_network_name = None
network_kwargs = fixed_network.set_networks_kwarg(network)
cls.s1_name = data_utils.rand_name(cls.__name__ + '-instance')
cls.s1 = cls.create_test_server(name=cls.s1_name,
wait_until='ACTIVE',
availability_zone=CONF.compute.default_availability_zone,
**network_kwargs)
cls.s2_name = data_utils.rand_name(cls.__name__ + '-instance')
cls.s2 = cls.create_test_server(name=cls.s2_name,
image_id=cls.image_ref_alt,
wait_until='ACTIVE',
availability_zone=CONF.compute.default_availability_zone)
cls.s3_name = data_utils.rand_name(cls.__name__ + '-instance')
cls.s3 = cls.create_test_server(name=cls.s3_name,
flavor=cls.flavor_ref_alt,
wait_until='ACTIVE',
availability_zone=CONF.compute.default_availability_zone)
class HybridListServersNegativeTestJSON(test_list_servers_negative.ListServersNegativeTestJSON):
"""Test list servers negative"""
@classmethod
def resource_setup(cls):
super(test_list_servers_negative.ListServersNegativeTestJSON, cls).resource_setup()
# The following servers are created for use
# by the test methods in this class. These
# servers are cleaned up automatically in the
# tearDownClass method of the super-class.
cls.existing_fixtures = []
cls.deleted_fixtures = []
for x in moves.xrange(2):
srv = cls.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.default_availability_zone)
cls.existing_fixtures.append(srv)
srv = cls.create_test_server(availability_zone=CONF.compute.default_availability_zone)
cls.client.delete_server(srv['id'])
# We ignore errors on termination because the server may
# be put into ERROR status on a quick spawn, then delete,
# as the compute node expects the instance local status
# to be spawning, not deleted. See LP Bug#1061167
waiters.wait_for_server_termination(cls.client, srv['id'],
ignore_error=True)
cls.deleted_fixtures.append(srv)
class HybridMultipleCreateVCloudTestJSON(test_multiple_create.MultipleCreateTestJSON):
"""Test multiple create servers"""
def _create_multiple_servers(self, name=None, wait_until=None, **kwargs):
# NOTE: This is the right way to create multiple servers: it ensures the
# created servers end up in the servers list so they are cleaned up
# afterwards.
kwargs['name'] = name if name else self._generate_name()
if wait_until:
kwargs['wait_until'] = wait_until
body = self.create_test_server(availability_zone=CONF.compute.vcloud_availability_zone, **kwargs)
return body
class HybridMultipleCreateAwsTestJSON(test_multiple_create.MultipleCreateTestJSON):
"""Test multiple create servers"""
def _create_multiple_servers(self, name=None, wait_until=None, **kwargs):
# NOTE: This is the right way to create multiple servers: it ensures the
# created servers end up in the servers list so they are cleaned up
# afterwards.
kwargs['name'] = name if name else self._generate_name()
if wait_until:
kwargs['wait_until'] = wait_until
body = self.create_test_server(availability_zone=CONF.compute.aws_availability_zone, **kwargs)
return body
class HybridMultipleCreateVCloudNegativeTestJSON(test_multiple_create_negative.MultipleCreateNegativeTestJSON):
"""Test multiple create negative"""
def _create_multiple_servers(self, name=None, wait_until=None, **kwargs):
# This is the right way to create multiple servers: it ensures the created
# servers end up in the servers list so they are cleaned up afterwards.
kwargs['name'] = kwargs.get('name', self._generate_name())
body = self.create_test_server(availability_zone=CONF.compute.vcloud_availability_zone, **kwargs)
return body
class HybridMultipleCreateAwsNegativeTestJSON(test_multiple_create_negative.MultipleCreateNegativeTestJSON):
"""Test multiple create negative"""
def _create_multiple_servers(self, name=None, wait_until=None, **kwargs):
# This is the right way to create multiple servers: it ensures the created
# servers end up in the servers list so they are cleaned up afterwards.
kwargs['name'] = kwargs.get('name', self._generate_name())
body = self.create_test_server(availability_zone=CONF.compute.aws_availability_zone, **kwargs)
return body
class HybridVCloudServerActionsTestJSON(test_server_actions.ServerActionsTestJSON):
"""Test server actions"""
def setUp(self):
# NOTE(afazekas): Normally we use the same server with all test cases,
# but if it has an issue, we build a new one
super(test_server_actions.ServerActionsTestJSON, self).setUp()
# Check if the server is in a clean state after test
try:
waiters.wait_for_server_status(self.client,
self.server_id, 'ACTIVE')
except lib_exc.NotFound:
# The server was deleted by previous test, create a new one
server = self.create_test_server(
validatable=True,
wait_until='ACTIVE',
availability_zone=CONF.compute.vcloud_availability_zone)
self.__class__.server_id = server['id']
except Exception:
# Rebuild server if something happened to it during a test
self.__class__.server_id = self.rebuild_server(
self.server_id, validatable=True)
@classmethod
def rebuild_server(cls, server_id, validatable=False, **kwargs):
# Destroy the existing server and create a new one
if server_id:
try:
cls.servers_client.delete_server(server_id)
waiters.wait_for_server_termination(cls.servers_client,
server_id)
except Exception:
LOG.exception('Failed to delete server %s' % server_id)
cls.password = data_utils.rand_password()
server = cls.create_test_server(
validatable,
wait_until='ACTIVE',
adminPass=cls.password,
availability_zone=CONF.compute.vcloud_availability_zone,
**kwargs)
return server['id']
@testtools.skip('Host operations are not supported')
@test.idempotent_id('80a8094c-211e-440a-ab88-9e59d556c7ee')
def test_lock_unlock_server(self):
# Lock the server, try to stop it (an exception is raised), then unlock it and retry
self.client.lock_server(self.server_id)
self.addCleanup(self.client.unlock_server, self.server_id)
server = self.client.show_server(self.server_id)['server']
self.assertEqual(server['status'], 'ACTIVE')
# Locked server is not allowed to be stopped by non-admin user
self.assertRaises(lib_exc.Conflict,
self.client.stop_server, self.server_id)
self.client.unlock_server(self.server_id)
self.client.stop_server(self.server_id)
waiters.wait_for_server_status(self.client, self.server_id, 'SHUTOFF')
self.client.start_server(self.server_id)
waiters.wait_for_server_status(self.client, self.server_id, 'ACTIVE')
@testtools.skip("HybridCloud Bug: after rebuilding, the vxlan tunnel can't be set up")
@test.idempotent_id('aaa6cdf3-55a7-461a-add9-1c8596b9a07c')
def test_rebuild_server(self):
# The server should be rebuilt using the provided image and data
meta = {'rebuild': 'server'}
new_name = data_utils.rand_name('server')
password = 'rebuildPassw0rd'
rebuilt_server = self.client.rebuild_server(
self.server_id,
self.image_ref_alt,
name=new_name,
metadata=meta,
adminPass=password)['server']
# If the server was rebuilt on a different image, restore it to the
# original image once the test ends
if self.image_ref_alt != self.image_ref:
self.addCleanup(self._rebuild_server_and_check, self.image_ref)
# Verify the properties in the initial response are correct
self.assertEqual(self.server_id, rebuilt_server['id'])
rebuilt_image_id = rebuilt_server['image']['id']
self.assertTrue(self.image_ref_alt.endswith(rebuilt_image_id))
self.assertEqual(self.flavor_ref, rebuilt_server['flavor']['id'])
# Verify the server properties after the rebuild completes
waiters.wait_for_server_status(self.client,
rebuilt_server['id'], 'ACTIVE')
server = self.client.show_server(rebuilt_server['id'])['server']
rebuilt_image_id = server['image']['id']
self.assertTrue(self.image_ref_alt.endswith(rebuilt_image_id))
self.assertEqual(new_name, server['name'])
if CONF.validation.run_validation:
# Authentication is attempted in the following order of priority:
# 1.The key passed in, if one was passed in.
# 2.Any key we can find through an SSH agent (if allowed).
# 3.Any "id_rsa", "id_dsa" or "id_ecdsa" key discoverable in
# ~/.ssh/ (if allowed).
# 4.Plain username/password auth, if a password was given.
linux_client = remote_client.RemoteClient(
self.get_server_ip(rebuilt_server),
self.ssh_user,
password,
self.validation_resources['keypair']['private_key'])
linux_client.validate_authentication()
@testtools.skip("HybridCloud Bug: rebuild is not supported")
@test.idempotent_id('30449a88-5aff-4f9b-9866-6ee9b17f906d')
def test_rebuild_server_in_stop_state(self):
# The server in stop state should be rebuilt using the provided
# image and remain in SHUTOFF state
server = self.client.show_server(self.server_id)['server']
old_image = server['image']['id']
new_image = (self.image_ref_alt
if old_image == self.image_ref else self.image_ref)
self.client.stop_server(self.server_id)
waiters.wait_for_server_status(self.client, self.server_id, 'SHUTOFF')
rebuilt_server = (self.client.rebuild_server(self.server_id, new_image)
['server'])
# If the server was rebuilt on a different image, restore it to the
# original image once the test ends
if self.image_ref_alt != self.image_ref:
self.addCleanup(self._rebuild_server_and_check, old_image)
# Verify the properties in the initial response are correct
self.assertEqual(self.server_id, rebuilt_server['id'])
rebuilt_image_id = rebuilt_server['image']['id']
self.assertEqual(new_image, rebuilt_image_id)
self.assertEqual(self.flavor_ref, rebuilt_server['flavor']['id'])
# Verify the server properties after the rebuild completes
waiters.wait_for_server_status(self.client,
rebuilt_server['id'], 'SHUTOFF')
server = self.client.show_server(rebuilt_server['id'])['server']
rebuilt_image_id = server['image']['id']
self.assertEqual(new_image, rebuilt_image_id)
self.client.start_server(self.server_id)
# Wait until ACTIVE again; otherwise teardown reports a no-container exception
waiters.wait_for_server_status(self.client, self.server_id, 'ACTIVE')
@testtools.skip("HybridCloud Bug:after reboot, boot_time can't change")
@test.attr(type='smoke')
@test.idempotent_id('2cb1baf6-ac8d-4429-bf0d-ba8a0ba53e32')
def test_reboot_server_hard(self):
# The server should be power cycled
self._test_reboot_server('HARD')
@testtools.skip("HybridCloud: the cascading project does not support this now")
@test.idempotent_id('b963d4f1-94b3-4c40-9e97-7b583f46e470')
@testtools.skipUnless(CONF.compute_feature_enabled.snapshot,
'Snapshotting not available, backup not possible.')
@test.services('image')
def test_create_backup(self):
pass
class HybridAwsServerActionsTestJSON(test_server_actions.ServerActionsTestJSON):
"""Test server actions"""
def setUp(self):
# NOTE(afazekas): Normally we use the same server with all test cases,
# but if it has an issue, we build a new one
super(test_server_actions.ServerActionsTestJSON, self).setUp()
# Check if the server is in a clean state after test
try:
waiters.wait_for_server_status(self.client,
self.server_id, 'ACTIVE')
except lib_exc.NotFound:
# The server was deleted by previous test, create a new one
server = self.create_test_server(
validatable=True,
wait_until='ACTIVE',
availability_zone=CONF.compute.aws_availability_zone)
self.__class__.server_id = server['id']
except Exception:
# Rebuild server if something happened to it during a test
self.__class__.server_id = self.rebuild_server(
self.server_id, validatable=True)
@classmethod
def rebuild_server(cls, server_id, validatable=False, **kwargs):
# Destroy the existing server and create a new one
if server_id:
try:
cls.servers_client.delete_server(server_id)
waiters.wait_for_server_termination(cls.servers_client,
server_id)
except Exception:
LOG.exception('Failed to delete server %s' % server_id)
cls.password = data_utils.rand_password()
server = cls.create_test_server(
validatable,
wait_until='ACTIVE',
adminPass=cls.password,
availability_zone=CONF.compute.aws_availability_zone,
**kwargs)
return server['id']
@testtools.skip('Host operations are not supported')
@test.idempotent_id('80a8094c-211e-440a-ab88-9e59d556c7ee')
def test_lock_unlock_server(self):
# Lock the server, try to stop it (an exception is raised), then unlock it and retry
self.client.lock_server(self.server_id)
self.addCleanup(self.client.unlock_server, self.server_id)
server = self.client.show_server(self.server_id)['server']
self.assertEqual(server['status'], 'ACTIVE')
# Locked server is not allowed to be stopped by non-admin user
self.assertRaises(lib_exc.Conflict,
self.client.stop_server, self.server_id)
self.client.unlock_server(self.server_id)
self.client.stop_server(self.server_id)
waiters.wait_for_server_status(self.client, self.server_id, 'SHUTOFF')
self.client.start_server(self.server_id)
waiters.wait_for_server_status(self.client, self.server_id, 'ACTIVE')
@testtools.skip("HybridCloud Bug: after rebuilding, the vxlan tunnel can't be set up")
@test.idempotent_id('aaa6cdf3-55a7-461a-add9-1c8596b9a07c')
def test_rebuild_server(self):
# The server should be rebuilt using the provided image and data
meta = {'rebuild': 'server'}
new_name = data_utils.rand_name('server')
password = 'rebuildPassw0rd'
rebuilt_server = self.client.rebuild_server(
self.server_id,
self.image_ref_alt,
name=new_name,
metadata=meta,
adminPass=password)['server']
# If the server was rebuilt on a different image, restore it to the
# original image once the test ends
if self.image_ref_alt != self.image_ref:
self.addCleanup(self._rebuild_server_and_check, self.image_ref)
# Verify the properties in the initial response are correct
self.assertEqual(self.server_id, rebuilt_server['id'])
rebuilt_image_id = rebuilt_server['image']['id']
self.assertTrue(self.image_ref_alt.endswith(rebuilt_image_id))
self.assertEqual(self.flavor_ref, rebuilt_server['flavor']['id'])
# Verify the server properties after the rebuild completes
waiters.wait_for_server_status(self.client,
rebuilt_server['id'], 'ACTIVE')
server = self.client.show_server(rebuilt_server['id'])['server']
rebuilt_image_id = server['image']['id']
self.assertTrue(self.image_ref_alt.endswith(rebuilt_image_id))
self.assertEqual(new_name, server['name'])
if CONF.validation.run_validation:
# Authentication is attempted in the following order of priority:
# 1.The key passed in, if one was passed in.
# 2.Any key we can find through an SSH agent (if allowed).
# 3.Any "id_rsa", "id_dsa" or "id_ecdsa" key discoverable in
# ~/.ssh/ (if allowed).
# 4.Plain username/password auth, if a password was given.
linux_client = remote_client.RemoteClient(
self.get_server_ip(rebuilt_server),
self.ssh_user,
password,
self.validation_resources['keypair']['private_key'])
linux_client.validate_authentication()
@testtools.skip("HybridCloud Bug: rebuild is not supported")
@test.idempotent_id('30449a88-5aff-4f9b-9866-6ee9b17f906d')
def test_rebuild_server_in_stop_state(self):
# The server in stop state should be rebuilt using the provided
# image and remain in SHUTOFF state
server = self.client.show_server(self.server_id)['server']
old_image = server['image']['id']
new_image = (self.image_ref_alt
if old_image == self.image_ref else self.image_ref)
self.client.stop_server(self.server_id)
waiters.wait_for_server_status(self.client, self.server_id, 'SHUTOFF')
rebuilt_server = (self.client.rebuild_server(self.server_id, new_image)
['server'])
# If the server was rebuilt on a different image, restore it to the
# original image once the test ends
if self.image_ref_alt != self.image_ref:
self.addCleanup(self._rebuild_server_and_check, old_image)
# Verify the properties in the initial response are correct
self.assertEqual(self.server_id, rebuilt_server['id'])
rebuilt_image_id = rebuilt_server['image']['id']
self.assertEqual(new_image, rebuilt_image_id)
self.assertEqual(self.flavor_ref, rebuilt_server['flavor']['id'])
# Verify the server properties after the rebuild completes
waiters.wait_for_server_status(self.client,
rebuilt_server['id'], 'SHUTOFF')
server = self.client.show_server(rebuilt_server['id'])['server']
rebuilt_image_id = server['image']['id']
self.assertEqual(new_image, rebuilt_image_id)
self.client.start_server(self.server_id)
# Wait until ACTIVE again; otherwise teardown reports a no-container exception
waiters.wait_for_server_status(self.client, self.server_id, 'ACTIVE')
@testtools.skip("HybridCloud Bug:after reboot, boot_time can't change")
@test.attr(type='smoke')
@test.idempotent_id('2cb1baf6-ac8d-4429-bf0d-ba8a0ba53e32')
def test_reboot_server_hard(self):
# The server should be power cycled
self._test_reboot_server('HARD')
@testtools.skip("HybridCloud: the cascading project does not support this now")
@test.idempotent_id('b963d4f1-94b3-4c40-9e97-7b583f46e470')
@testtools.skipUnless(CONF.compute_feature_enabled.snapshot,
'Snapshotting not available, backup not possible.')
@test.services('image')
def test_create_backup(self):
pass
class HybridServerAddressesTestJSON(test_server_addresses.ServerAddressesTestJSON):
"""Test server address"""
@classmethod
def resource_setup(cls):
super(test_server_addresses.ServerAddressesTestJSON, cls).resource_setup()
cls.server = cls.create_test_server(wait_until='ACTIVE',
availability_zone=CONF.compute.default_availability_zone)
class HybridServerAddressesNegativeTestJSON(test_server_addresses_negative.ServerAddressesNegativeTestJSON):
"""Test server address negative"""
@classmethod
def resource_setup(cls):
super(test_server_addresses_negative.ServerAddressesNegativeTestJSON, cls).resource_setup()
cls.server = cls.create_test_server(wait_until='ACTIVE',
availability_zone=CONF.compute.default_availability_zone)
class HybridServerMetadataTestJSON(test_server_metadata.ServerMetadataTestJSON):
"""Test server metadata"""
@classmethod
def resource_setup(cls):
super(test_server_metadata.ServerMetadataTestJSON, cls).resource_setup()
server = cls.create_test_server(metadata={}, wait_until='ACTIVE',
availability_zone=CONF.compute.default_availability_zone)
cls.server_id = server['id']
class HybridServerMetadataNegativeTestJSON(test_server_metadata_negative.ServerMetadataNegativeTestJSON):
"""Test server metadata negative"""
@classmethod
def resource_setup(cls):
super(test_server_metadata_negative.ServerMetadataNegativeTestJSON, cls).resource_setup()
cls.tenant_id = cls.client.tenant_id
server = cls.create_test_server(metadata={}, wait_until='ACTIVE',
availability_zone=CONF.compute.default_availability_zone)
cls.server_id = server['id']
class HybridVCloudServerPasswordTestJSON(test_server_password.ServerPasswordTestJSON):
"""Test server password"""
@classmethod
def resource_setup(cls):
super(test_server_password.ServerPasswordTestJSON, cls).resource_setup()
cls.server = cls.create_test_server(wait_until="ACTIVE",
availability_zone=CONF.compute.vcloud_availability_zone)
class HybridAwsServerPasswordTestJSON(test_server_password.ServerPasswordTestJSON):
"""Test server password"""
@classmethod
def resource_setup(cls):
super(test_server_password.ServerPasswordTestJSON, cls).resource_setup()
cls.server = cls.create_test_server(wait_until="ACTIVE",
availability_zone=CONF.compute.aws_availability_zone)
class HybridVCloudServerPersonalityTestJSON(test_server_personality.ServerPersonalityTestJSON):
"""Test server personality"""
@testtools.skip("HybridCloud Bug: this test case succeeds when executed individually but fails when run with the whole class")
@test.idempotent_id('3cfe87fd-115b-4a02-b942-7dc36a337fdf')
def test_create_server_with_personality(self):
file_contents = 'This is a test file.'
file_path = '/test.txt'
personality = [{'path': file_path,
'contents': base64.b64encode(file_contents)}]
password = data_utils.rand_password()
created_server = self.create_test_server(personality=personality,
adminPass=password,
wait_until='ACTIVE',
validatable=True,
availability_zone=CONF.compute.vcloud_availability_zone)
server = self.client.show_server(created_server['id'])['server']
if CONF.validation.run_validation:
linux_client = remote_client.RemoteClient(
self.get_server_ip(server),
self.ssh_user, password,
self.validation_resources['keypair']['private_key'])
self.assertEqual(file_contents,
linux_client.exec_command(
'cat %s' % file_path))
@test.idempotent_id('128966d8-71fc-443c-8cab-08e24114ecc9')
def test_rebuild_server_with_personality(self):
server = self.create_test_server(wait_until='ACTIVE', validatable=True,
availability_zone=CONF.compute.vcloud_availability_zone)
server_id = server['id']
file_contents = 'Test server rebuild.'
personality = [{'path': 'rebuild.txt',
'contents': base64.b64encode(file_contents)}]
rebuilt_server = self.client.rebuild_server(server_id,
self.image_ref_alt,
personality=personality)
waiters.wait_for_server_status(self.client, server_id, 'ACTIVE')
self.assertEqual(self.image_ref_alt,
rebuilt_server['server']['image']['id'])
@test.idempotent_id('176cd8c9-b9e8-48ee-a480-180beab292bf')
def test_personality_files_exceed_limit(self):
# Server creation should fail if greater than the maximum allowed
# number of files are injected into the server.
file_contents = 'This is a test file.'
personality = []
limits = self.user_client.show_limits()['limits']
max_file_limit = limits['absolute']['maxPersonality']
if max_file_limit == -1:
raise self.skipException("No limit for personality files")
for i in range(0, int(max_file_limit) + 1):
path = 'etc/test' + str(i) + '.txt'
personality.append({'path': path,
'contents': base64.b64encode(file_contents)})
# A 403 Forbidden or 413 Overlimit (old behaviour) exception
# will be raised when out of quota
self.assertRaises((lib_exc.Forbidden, lib_exc.OverLimit),
self.create_test_server, personality=personality,
availability_zone=CONF.compute.vcloud_availability_zone)
@test.idempotent_id('52f12ee8-5180-40cc-b417-31572ea3d555')
def test_can_create_server_with_max_number_personality_files(self):
# Server should be created successfully if maximum allowed number of
# files is injected into the server during creation.
file_contents = 'This is a test file.'
limits = self.user_client.show_limits()['limits']
max_file_limit = limits['absolute']['maxPersonality']
if max_file_limit == -1:
raise self.skipException("No limit for personality files")
person = []
for i in range(0, int(max_file_limit)):
path = '/etc/test' + str(i) + '.txt'
person.append({
'path': path,
'contents': base64.b64encode(file_contents),
})
password = data_utils.rand_password()
created_server = self.create_test_server(personality=person,
adminPass=password,
wait_until='ACTIVE',
validatable=True,
availability_zone=CONF.compute.vcloud_availability_zone)
server = self.client.show_server(created_server['id'])['server']
if CONF.validation.run_validation:
linux_client = remote_client.RemoteClient(
self.get_server_ip(server),
self.ssh_user, password,
self.validation_resources['keypair']['private_key'])
for i in person:
self.assertEqual(base64.b64decode(i['contents']),
linux_client.exec_command(
'cat %s' % i['path']))
class HybridAwsServerPersonalityTestJSON(test_server_personality.ServerPersonalityTestJSON):
"""Test server personality"""
@testtools.skip("HybridCloud Bug: this test case succeeds when executed individually but fails when run with the whole class")
@test.idempotent_id('3cfe87fd-115b-4a02-b942-7dc36a337fdf')
def test_create_server_with_personality(self):
file_contents = 'This is a test file.'
file_path = '/test.txt'
personality = [{'path': file_path,
'contents': base64.b64encode(file_contents)}]
password = data_utils.rand_password()
created_server = self.create_test_server(personality=personality,
adminPass=password,
wait_until='ACTIVE',
validatable=True,
availability_zone=CONF.compute.aws_availability_zone)
server = self.client.show_server(created_server['id'])['server']
if CONF.validation.run_validation:
linux_client = remote_client.RemoteClient(
self.get_server_ip(server),
self.ssh_user, password,
self.validation_resources['keypair']['private_key'])
self.assertEqual(file_contents,
linux_client.exec_command(
'cat %s' % file_path))
@testtools.skip("HybridCloud Bug:does not support rebuild")
@test.idempotent_id('128966d8-71fc-443c-8cab-08e24114ecc9')
def test_rebuild_server_with_personality(self):
server = self.create_test_server(wait_until='ACTIVE', validatable=True,
availability_zone=CONF.compute.aws_availability_zone)
server_id = server['id']
file_contents = 'Test server rebuild.'
personality = [{'path': 'rebuild.txt',
'contents': base64.b64encode(file_contents)}]
rebuilt_server = self.client.rebuild_server(server_id,
self.image_ref_alt,
personality=personality)
waiters.wait_for_server_status(self.client, server_id, 'ACTIVE')
self.assertEqual(self.image_ref_alt,
rebuilt_server['server']['image']['id'])
@test.idempotent_id('176cd8c9-b9e8-48ee-a480-180beab292bf')
def test_personality_files_exceed_limit(self):
# Server creation should fail if greater than the maximum allowed
# number of files are injected into the server.
file_contents = 'This is a test file.'
personality = []
limits = self.user_client.show_limits()['limits']
max_file_limit = limits['absolute']['maxPersonality']
if max_file_limit == -1:
raise self.skipException("No limit for personality files")
for i in range(0, int(max_file_limit) + 1):
path = 'etc/test' + str(i) + '.txt'
personality.append({'path': path,
'contents': base64.b64encode(file_contents)})
# A 403 Forbidden or 413 Overlimit (old behaviour) exception
# will be raised when out of quota
self.assertRaises((lib_exc.Forbidden, lib_exc.OverLimit),
self.create_test_server, personality=personality,
availability_zone=CONF.compute.aws_availability_zone)
@test.idempotent_id('52f12ee8-5180-40cc-b417-31572ea3d555')
def test_can_create_server_with_max_number_personality_files(self):
# Server should be created successfully if maximum allowed number of
# files is injected into the server during creation.
file_contents = 'This is a test file.'
limits = self.user_client.show_limits()['limits']
max_file_limit = limits['absolute']['maxPersonality']
if max_file_limit == -1:
raise self.skipException("No limit for personality files")
person = []
for i in range(0, int(max_file_limit)):
path = '/etc/test' + str(i) + '.txt'
person.append({
'path': path,
'contents': base64.b64encode(file_contents),
})
password = data_utils.rand_password()
created_server = self.create_test_server(personality=person,
adminPass=password,
wait_until='ACTIVE',
validatable=True,
availability_zone=CONF.compute.aws_availability_zone)
server = self.client.show_server(created_server['id'])['server']
if CONF.validation.run_validation:
linux_client = remote_client.RemoteClient(
self.get_server_ip(server),
self.ssh_user, password,
self.validation_resources['keypair']['private_key'])
for i in person:
self.assertEqual(base64.b64decode(i['contents']),
linux_client.exec_command(
'cat %s' % i['path']))
class HybridVCloudServersTestJSON(test_servers.ServersTestJSON):
"""Test servers"""
@test.idempotent_id('b92d5ec7-b1dd-44a2-87e4-45e888c46ef0')
@testtools.skipUnless(CONF.compute_feature_enabled.
enable_instance_password,
'Instance password not available.')
def test_create_server_with_admin_password(self):
# If an admin password is provided on server creation, the server's
# root password should be set to that password.
server = self.create_test_server(adminPass='testpassword', availability_zone=CONF.compute.vcloud_availability_zone)
# Verify the password is set correctly in the response
self.assertEqual('testpassword', server['adminPass'])
@test.idempotent_id('8fea6be7-065e-47cf-89b8-496e6f96c699')
def test_create_with_existing_server_name(self):
# Creating a server with a name that already exists is allowed
# TODO(sdague): clear out try, we do cleanup one layer up
server_name = data_utils.rand_name('server')
server = self.create_test_server(name=server_name,
wait_until='ACTIVE',
availability_zone=CONF.compute.vcloud_availability_zone)
id1 = server['id']
server = self.create_test_server(name=server_name,
wait_until='ACTIVE',
availability_zone=CONF.compute.vcloud_availability_zone)
id2 = server['id']
self.assertNotEqual(id1, id2, "Did not create a new server")
server = self.client.show_server(id1)['server']
name1 = server['name']
server = self.client.show_server(id2)['server']
name2 = server['name']
self.assertEqual(name1, name2)
@test.idempotent_id('f9e15296-d7f9-4e62-b53f-a04e89160833')
def test_create_specify_keypair(self):
# Specify a keypair while creating a server
key_name = data_utils.rand_name('key')
self.keypairs_client.create_keypair(name=key_name)
self.addCleanup(self.keypairs_client.delete_keypair, key_name)
self.keypairs_client.list_keypairs()
server = self.create_test_server(key_name=key_name, availability_zone=CONF.compute.vcloud_availability_zone)
waiters.wait_for_server_status(self.client, server['id'], 'ACTIVE')
server = self.client.show_server(server['id'])['server']
self.assertEqual(key_name, server['key_name'])
def _update_server_name(self, server_id, status, prefix_name='server'):
# The server name should be changed to the provided value
new_name = data_utils.rand_name(prefix_name)
# Update the server with a new name
self.client.update_server(server_id,
name=new_name)
waiters.wait_for_server_status(self.client, server_id, status)
# Verify the name of the server has changed
server = self.client.show_server(server_id)['server']
self.assertEqual(new_name, server['name'])
return server
@test.idempotent_id('5e6ccff8-349d-4852-a8b3-055df7988dd2')
def test_update_server_name(self):
# The server name should be changed to the provided value
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.vcloud_availability_zone)
# Update instance name with non-ASCII characters
prefix_name = u'\u00CD\u00F1st\u00E1\u00F1c\u00E9'
self._update_server_name(server['id'], 'ACTIVE', prefix_name)
@test.idempotent_id('6ac19cb1-27a3-40ec-b350-810bdc04c08e')
def test_update_server_name_in_stop_state(self):
# The server name should be changed to the provided value
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.vcloud_availability_zone)
self.client.stop_server(server['id'])
waiters.wait_for_server_status(self.client, server['id'], 'SHUTOFF')
# Update instance name with non-ASCII characters
prefix_name = u'\u00CD\u00F1st\u00E1\u00F1c\u00E9'
updated_server = self._update_server_name(server['id'],
'SHUTOFF',
prefix_name)
self.assertNotIn('progress', updated_server)
@test.idempotent_id('89b90870-bc13-4b73-96af-f9d4f2b70077')
def test_update_access_server_address(self):
# The server's access addresses should reflect the provided values
server = self.create_test_server(wait_until='ACTIVE', availability_zone=CONF.compute.vcloud_availability_zone)
# Update the IPv4 and IPv6 access addresses
self.client.update_server(server['id'],
accessIPv4='1.1.1.1',
accessIPv6='::babe:202:202')
waiters.wait_for_server_status(self.client, server['id'], 'ACTIVE')
# Verify the access addresses have been updated
server = self.client.show_server(server['id'])['server']
self.assertEqual('1.1.1.1', server['accessIPv4'])
self.assertEqual('::babe:202:202', server['accessIPv6'])
@test.idempotent_id('38fb1d02-c3c5-41de-91d3-9bc2025a75eb')
def test_create_server_with_ipv6_addr_only(self):
# Create a server without an IPv4 address(only IPv6 address).
        server = self.create_test_server(
            accessIPv6='2001:2001::3',
            availability_zone=CONF.compute.vcloud_availability_zone)
        waiters.wait_for_server_status(self.client, server['id'], 'ACTIVE')
        server = self.client.show_server(server['id'])['server']
        self.assertEqual('2001:2001::3', server['accessIPv6'])


class HybridAwsServersTestJSON(test_servers.ServersTestJSON):
    """Test servers"""

    @test.idempotent_id('b92d5ec7-b1dd-44a2-87e4-45e888c46ef0')
    @testtools.skipUnless(CONF.compute_feature_enabled.
                          enable_instance_password,
                          'Instance password not available.')
    def test_create_server_with_admin_password(self):
        # If an admin password is provided on server creation, the server's
        # root password should be set to that password.
        server = self.create_test_server(
            adminPass='testpassword',
            availability_zone=CONF.compute.aws_availability_zone)
        # Verify the password is set correctly in the response
        self.assertEqual('testpassword', server['adminPass'])

    @test.idempotent_id('8fea6be7-065e-47cf-89b8-496e6f96c699')
    def test_create_with_existing_server_name(self):
        # Creating a server with a name that already exists is allowed
        # TODO(sdague): clear out try, we do cleanup one layer up
        server_name = data_utils.rand_name('server')
        server = self.create_test_server(
            name=server_name,
            wait_until='ACTIVE',
            availability_zone=CONF.compute.aws_availability_zone)
        id1 = server['id']
        server = self.create_test_server(
            name=server_name,
            wait_until='ACTIVE',
            availability_zone=CONF.compute.aws_availability_zone)
        id2 = server['id']
        self.assertNotEqual(id1, id2, "Did not create a new server")
        server = self.client.show_server(id1)['server']
        name1 = server['name']
        server = self.client.show_server(id2)['server']
        name2 = server['name']
        self.assertEqual(name1, name2)

    @test.idempotent_id('f9e15296-d7f9-4e62-b53f-a04e89160833')
    def test_create_specify_keypair(self):
        # Specify a keypair while creating a server
        key_name = data_utils.rand_name('key')
        self.keypairs_client.create_keypair(name=key_name)
        self.addCleanup(self.keypairs_client.delete_keypair, key_name)
        self.keypairs_client.list_keypairs()
        server = self.create_test_server(
            key_name=key_name,
            availability_zone=CONF.compute.aws_availability_zone)
        waiters.wait_for_server_status(self.client, server['id'], 'ACTIVE')
        server = self.client.show_server(server['id'])['server']
        self.assertEqual(key_name, server['key_name'])

    def _update_server_name(self, server_id, status, prefix_name='server'):
        # The server name should be changed to the provided value
        new_name = data_utils.rand_name(prefix_name)
        # Update the server with a new name
        self.client.update_server(server_id, name=new_name)
        waiters.wait_for_server_status(self.client, server_id, status)
        # Verify the name of the server has changed
        server = self.client.show_server(server_id)['server']
        self.assertEqual(new_name, server['name'])
        return server

    @test.idempotent_id('5e6ccff8-349d-4852-a8b3-055df7988dd2')
    def test_update_server_name(self):
        # The server name should be changed to the provided value
        server = self.create_test_server(
            wait_until='ACTIVE',
            availability_zone=CONF.compute.aws_availability_zone)
        # Update instance name with non-ASCII characters
        prefix_name = u'\u00CD\u00F1st\u00E1\u00F1c\u00E9'
        self._update_server_name(server['id'], 'ACTIVE', prefix_name)

    @test.idempotent_id('6ac19cb1-27a3-40ec-b350-810bdc04c08e')
    def test_update_server_name_in_stop_state(self):
        # The server name should be changed to the provided value
        server = self.create_test_server(
            wait_until='ACTIVE',
            availability_zone=CONF.compute.aws_availability_zone)
        self.client.stop_server(server['id'])
        waiters.wait_for_server_status(self.client, server['id'], 'SHUTOFF')
        # Update instance name with non-ASCII characters
        prefix_name = u'\u00CD\u00F1st\u00E1\u00F1c\u00E9'
        updated_server = self._update_server_name(server['id'],
                                                  'SHUTOFF',
                                                  prefix_name)
        self.assertNotIn('progress', updated_server)

    @test.idempotent_id('89b90870-bc13-4b73-96af-f9d4f2b70077')
    def test_update_access_server_address(self):
        # The server's access addresses should reflect the provided values
        server = self.create_test_server(
            wait_until='ACTIVE',
            availability_zone=CONF.compute.aws_availability_zone)
        # Update the IPv4 and IPv6 access addresses
        self.client.update_server(server['id'],
                                  accessIPv4='1.1.1.1',
                                  accessIPv6='::babe:202:202')
        waiters.wait_for_server_status(self.client, server['id'], 'ACTIVE')
        # Verify the access addresses have been updated
        server = self.client.show_server(server['id'])['server']
        self.assertEqual('1.1.1.1', server['accessIPv4'])
        self.assertEqual('::babe:202:202', server['accessIPv6'])

    @test.idempotent_id('38fb1d02-c3c5-41de-91d3-9bc2025a75eb')
    def test_create_server_with_ipv6_addr_only(self):
        # Create a server without an IPv4 address (only an IPv6 address).
        server = self.create_test_server(
            accessIPv6='2001:2001::3',
            availability_zone=CONF.compute.aws_availability_zone)
        waiters.wait_for_server_status(self.client, server['id'], 'ACTIVE')
        server = self.client.show_server(server['id'])['server']
        self.assertEqual('2001:2001::3', server['accessIPv6'])


class HybridVCloudServersNegativeTestJSON(
        test_servers_negative.ServersNegativeTestJSON):
    """Test servers negative"""

    @classmethod
    def resource_setup(cls):
        super(test_servers_negative.ServersNegativeTestJSON,
              cls).resource_setup()
        server = cls.create_test_server(
            wait_until='ACTIVE',
            availability_zone=CONF.compute.vcloud_availability_zone)
        cls.server_id = server['id']

    @classmethod
    def rebuild_server(cls, server_id, validatable=False, **kwargs):
        # Destroys an existing server and creates a new one
        if server_id:
            try:
                cls.servers_client.delete_server(server_id)
                waiters.wait_for_server_termination(cls.servers_client,
                                                    server_id)
            except Exception:
                LOG.exception('Failed to delete server %s' % server_id)
        cls.password = data_utils.rand_password()
        server = cls.create_test_server(
            validatable,
            wait_until='ACTIVE',
            adminPass=cls.password,
            availability_zone=CONF.compute.vcloud_availability_zone,
            **kwargs)
        return server['id']

    @test.attr(type=['negative'])
    @test.idempotent_id('98fa0458-1485-440f-873b-fe7f0d714930')
    def test_rebuild_deleted_server(self):
        # Rebuild a deleted server
        server = self.create_test_server(
            wait_until='ACTIVE',
            availability_zone=CONF.compute.vcloud_availability_zone)
        self.client.delete_server(server['id'])
        waiters.wait_for_server_termination(self.client, server['id'])
        self.assertRaises(lib_exc.NotFound,
                          self.client.rebuild_server,
                          server['id'], self.image_ref_alt)

    @test.attr(type=['negative'])
    @test.idempotent_id('581a397d-5eab-486f-9cf9-1014bbd4c984')
    def test_reboot_deleted_server(self):
        # Reboot a deleted server
        server = self.create_test_server(
            wait_until='ACTIVE',
            availability_zone=CONF.compute.vcloud_availability_zone)
        self.client.delete_server(server['id'])
        waiters.wait_for_server_termination(self.client, server['id'])
        self.assertRaises(lib_exc.NotFound, self.client.reboot_server,
                          server['id'], type='SOFT')


class HybridAwsServersNegativeTestJSON(
        test_servers_negative.ServersNegativeTestJSON):
    """Test servers negative"""

    @classmethod
    def resource_setup(cls):
        super(test_servers_negative.ServersNegativeTestJSON,
              cls).resource_setup()
        server = cls.create_test_server(
            wait_until='ACTIVE',
            availability_zone=CONF.compute.aws_availability_zone)
        cls.server_id = server['id']

    @classmethod
    def rebuild_server(cls, server_id, validatable=False, **kwargs):
        # Destroys an existing server and creates a new one
        if server_id:
            try:
                cls.servers_client.delete_server(server_id)
                waiters.wait_for_server_termination(cls.servers_client,
                                                    server_id)
            except Exception:
                LOG.exception('Failed to delete server %s' % server_id)
        cls.password = data_utils.rand_password()
        server = cls.create_test_server(
            validatable,
            wait_until='ACTIVE',
            adminPass=cls.password,
            availability_zone=CONF.compute.aws_availability_zone,
            **kwargs)
        return server['id']

    @test.attr(type=['negative'])
    @test.idempotent_id('98fa0458-1485-440f-873b-fe7f0d714930')
    def test_rebuild_deleted_server(self):
        # Rebuild a deleted server
        server = self.create_test_server(
            wait_until='ACTIVE',
            availability_zone=CONF.compute.aws_availability_zone)
        self.client.delete_server(server['id'])
        waiters.wait_for_server_termination(self.client, server['id'])
        self.assertRaises(lib_exc.NotFound,
                          self.client.rebuild_server,
                          server['id'], self.image_ref_alt)

    @test.attr(type=['negative'])
    @test.idempotent_id('581a397d-5eab-486f-9cf9-1014bbd4c984')
    def test_reboot_deleted_server(self):
        # Reboot a deleted server
        server = self.create_test_server(
            wait_until='ACTIVE',
            availability_zone=CONF.compute.aws_availability_zone)
        self.client.delete_server(server['id'])
        waiters.wait_for_server_termination(self.client, server['id'])
        self.assertRaises(lib_exc.NotFound, self.client.reboot_server,
                          server['id'], type='SOFT')


class HybridVirtualInterfacesNegativeTestJSON(
        test_virtual_interfaces_negative.VirtualInterfacesNegativeTestJSON):
    """Test virtual interfaces negative"""
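The negative tests above share one pattern: delete a resource, wait for termination, then assert that further operations raise NotFound. A minimal, framework-free sketch of that pattern; `FakeServersClient` and this `NotFound` class are illustrative stand-ins, not tempest APIs:

```python
class NotFound(Exception):
    """Stand-in for tempest's lib_exc.NotFound."""


class FakeServersClient:
    """Illustrative client: operations on a deleted server raise NotFound."""

    def __init__(self):
        self._servers = {}
        self._next_id = 0

    def create_server(self):
        self._next_id += 1
        server_id = 'server-%d' % self._next_id
        self._servers[server_id] = {'status': 'ACTIVE'}
        return server_id

    def delete_server(self, server_id):
        self._servers.pop(server_id, None)

    def reboot_server(self, server_id, type='SOFT'):
        if server_id not in self._servers:
            raise NotFound(server_id)


client = FakeServersClient()
server_id = client.create_server()
client.delete_server(server_id)
try:
    client.reboot_server(server_id, type='SOFT')
except NotFound:
    pass  # the expected outcome of the negative test
else:
    raise AssertionError('expected NotFound for a deleted server')
```

In the real tests, `assertRaises(lib_exc.NotFound, ...)` plays the role of the try/except block here.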
| 50.400229 | 123 | 0.654827 | 10,271 | 88,150 | 5.382825 | 0.073897 | 0.03386 | 0.024599 | 0.041511 | 0.924919 | 0.915676 | 0.908586 | 0.90316 | 0.885814 | 0.877675 | 0 | 0.031227 | 0.248361 | 88,150 | 1,748 | 124 | 50.429062 | 0.803206 | 0.113035 | 0 | 0.883127 | 0 | 0 | 0.112251 | 0.0362 | 0 | 0 | 0 | 0.000572 | 0.057276 | 0 | null | null | 0.037926 | 0.027864 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
0ef5ac4b1c9487fe0d2bad5c33927c8a4797e57c | 3,204 | py | Python | batch_norm_layer.py | e-hulten/maf | 2c0604ac8573ab14a6bc83dd51827d47a4266a96 | [
"MIT"
] | 12 | 2020-02-29T11:42:27.000Z | 2021-12-08T04:09:21.000Z | batch_norm_layer.py | e-hulten/maf | 2c0604ac8573ab14a6bc83dd51827d47a4266a96 | [
"MIT"
] | 1 | 2021-01-22T07:02:22.000Z | 2021-01-22T07:02:22.000Z | batch_norm_layer.py | e-hulten/maf | 2c0604ac8573ab14a6bc83dd51827d47a4266a96 | [
"MIT"
] | null | null | null | import torch
import torch.nn as nn
class BatchNormLayer(nn.Module):
def __init__(self, dim, eps=1e-5):
super().__init__()
self.eps = eps
self.gamma = nn.Parameter(torch.zeros(1, dim))
self.beta = nn.Parameter(torch.zeros(1, dim))
self.batch_mean = None
self.batch_var = None
def forward(self, x):
if self.training:
m = x.mean(dim=0)
v = x.var(dim=0) + self.eps # torch.mean((x - m) ** 2, axis=0) + self.eps
self.batch_mean = None
else:
if self.batch_mean is None:
self.set_batch_stats_func(x)
m = self.batch_mean.clone()
v = self.batch_var.clone()
x_hat = (x - m) / torch.sqrt(v)
x_hat = x_hat * torch.exp(self.gamma) + self.beta
log_det = torch.sum(self.gamma - 0.5 * torch.log(v))
return x_hat, log_det
def backward(self, x):
if self.training:
m = x.mean(dim=0)
v = x.var(dim=0) + self.eps
self.batch_mean = None
else:
if self.batch_mean is None:
self.set_batch_stats_func(x)
m = self.batch_mean
v = self.batch_var
x_hat = (x - self.beta) * torch.exp(-self.gamma) * torch.sqrt(v) + m
log_det = torch.sum(-self.gamma + 0.5 * torch.log(v))
return x_hat, log_det
def set_batch_stats_func(self, x):
print("setting batch stats for validation")
self.batch_mean = x.mean(dim=0)
self.batch_var = x.var(dim=0) + self.eps
class BatchNorm_running(nn.Module):
def __init__(self, dim, eps=1e-5):
super().__init__()
self.eps = eps
self.momentum = 0.01
self.gamma = nn.Parameter(torch.zeros(1, dim), requires_grad=True)
self.beta = nn.Parameter(torch.zeros(1, dim), requires_grad=True)
self.running_mean = torch.zeros(1, dim)
self.running_var = torch.ones(1, dim)
def forward(self, x):
if self.training:
m = x.mean(dim=0)
v = x.var(dim=0) + self.eps # torch.mean((x - m) ** 2, axis=0) + self.eps
self.running_mean *= 1 - self.momentum
self.running_mean += self.momentum * m
self.running_var *= 1 - self.momentum
self.running_var += self.momentum * v
else:
m = self.running_mean
v = self.running_var
x_hat = (x - m) / torch.sqrt(v)
x_hat = x_hat * torch.exp(self.gamma) + self.beta
log_det = torch.sum(self.gamma) - 0.5 * torch.sum(torch.log(v))
return x_hat, log_det
def backward(self, x):
if self.training:
m = x.mean(dim=0)
v = x.var(dim=0) + self.eps
self.running_mean *= 1 - self.momentum
self.running_mean += self.momentum * m
self.running_var *= 1 - self.momentum
self.running_var += self.momentum * v
else:
m = self.running_mean
v = self.running_var
x_hat = (x - self.beta) * torch.exp(-self.gamma) * torch.sqrt(v) + m
log_det = torch.sum(-self.gamma + 0.5 * torch.log(v))
return x_hat, log_det
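The batch-norm layers above are invertible affine maps: `forward` normalizes with stats (m, v) and applies exp(gamma)·x + beta, `backward` undoes it, and their log-determinants cancel. A torch-free sketch of that eval-mode math in plain Python (the scalar values for m, v, gamma, beta are illustrative, not taken from the module):

```python
import math


def bn_forward(x, m, v, gamma, beta):
    # x_hat = (x - m) / sqrt(v) * exp(gamma) + beta, as in BatchNormLayer.forward
    x_hat = (x - m) / math.sqrt(v) * math.exp(gamma) + beta
    log_det = gamma - 0.5 * math.log(v)
    return x_hat, log_det


def bn_backward(x_hat, m, v, gamma, beta):
    # Exact inverse, as in BatchNormLayer.backward
    x = (x_hat - beta) * math.exp(-gamma) * math.sqrt(v) + m
    log_det = -gamma + 0.5 * math.log(v)
    return x, log_det


x = 3.0
m, v, gamma, beta = 1.0, 4.0, 0.5, -0.2
y, ld_f = bn_forward(x, m, v, gamma, beta)
x_rec, ld_b = bn_backward(y, m, v, gamma, beta)
assert abs(x_rec - x) < 1e-12   # the transform is exactly invertible
assert abs(ld_f + ld_b) < 1e-12  # forward/backward log-determinants cancel
```

This is why the flow's change-of-variables bookkeeping stays consistent: whatever `forward` adds to the log-likelihood, `backward` subtracts.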
| 34.451613 | 86 | 0.547753 | 471 | 3,204 | 3.569002 | 0.125265 | 0.091612 | 0.061868 | 0.041642 | 0.839976 | 0.829268 | 0.820345 | 0.817966 | 0.772754 | 0.772754 | 0 | 0.017989 | 0.323346 | 3,204 | 92 | 87 | 34.826087 | 0.75738 | 0.027154 | 0 | 0.721519 | 0 | 0 | 0.010918 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.088608 | false | 0 | 0.025316 | 0 | 0.189873 | 0.012658 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0ef8c9956da75319a8786f4c78f5499acd798ff6 | 11,001 | py | Python | tests/integration/modules/test_vault.py | Noah-Huppert/salt | 998c382f5f2c3b4cbf7d96aa6913ada6993909b3 | [
"Apache-2.0"
] | 19 | 2016-01-29T14:37:52.000Z | 2022-03-30T18:08:01.000Z | tests/integration/modules/test_vault.py | Noah-Huppert/salt | 998c382f5f2c3b4cbf7d96aa6913ada6993909b3 | [
"Apache-2.0"
] | 223 | 2016-03-02T16:39:41.000Z | 2022-03-03T12:26:35.000Z | tests/integration/modules/test_vault.py | Noah-Huppert/salt | 998c382f5f2c3b4cbf7d96aa6913ada6993909b3 | [
"Apache-2.0"
] | 64 | 2016-02-04T19:45:26.000Z | 2021-12-15T02:02:31.000Z | # -*- coding: utf-8 -*-
"""
Integration tests for the vault execution module
"""
from __future__ import absolute_import, print_function, unicode_literals
import logging
import time
import salt.utils.path
from tests.support.case import ModuleCase
from tests.support.helpers import destructiveTest, slowTest
from tests.support.runtests import RUNTIME_VARS
from tests.support.sminion import create_sminion
from tests.support.unit import SkipTest, skipIf
log = logging.getLogger(__name__)
VAULT_BINARY_PATH = salt.utils.path.which("vault")
@destructiveTest
@skipIf(not salt.utils.path.which("dockerd"), "Docker not installed")
@skipIf(not VAULT_BINARY_PATH, "Vault not installed")
class VaultTestCase(ModuleCase):
"""
Test vault module
"""
@classmethod
def setUpClass(cls):
cls.sminion = sminion = create_sminion()
config = '{"backend": {"file": {"path": "/vault/file"}}, "default_lease_ttl": "168h", "max_lease_ttl": "720h", "disable_mlock": true}'
sminion.states.docker_image.present(name="vault", tag="0.9.6")
login_attempts = 1
container_created = False
while True:
if container_created:
sminion.states.docker_container.stopped(name="vault")
sminion.states.docker_container.absent(name="vault")
ret = sminion.states.docker_container.running(
name="vault",
image="vault:0.9.6",
port_bindings="8200:8200",
environment={
"VAULT_DEV_ROOT_TOKEN_ID": "testsecret",
"VAULT_LOCAL_CONFIG": config,
},
)
log.debug("docker_container.running return: %s", ret)
container_created = ret["result"]
time.sleep(5)
ret = sminion.functions.cmd.run_all(
cmd="{} login token=testsecret".format(VAULT_BINARY_PATH),
env={"VAULT_ADDR": "http://127.0.0.1:8200"},
hide_output=False,
)
if ret["retcode"] == 0:
break
log.debug("Vault login failed. Return: %s", ret)
login_attempts += 1
if login_attempts >= 3:
raise SkipTest("unable to login to vault")
ret = sminion.functions.cmd.retcode(
cmd="{} policy write testpolicy {}/vault.hcl".format(
VAULT_BINARY_PATH, RUNTIME_VARS.FILES
),
env={"VAULT_ADDR": "http://127.0.0.1:8200"},
)
if ret != 0:
raise SkipTest("unable to assign policy to vault")
@classmethod
def tearDownClass(cls):
cls.sminion.states.docker_container.stopped(name="vault")
cls.sminion.states.docker_container.absent(name="vault")
cls.sminion.states.docker_image.absent(name="vault", force=True)
cls.sminion = None
@slowTest
def test_write_read_secret(self):
write_return = self.run_function(
"vault.write_secret", path="secret/my/secret", user="foo", password="bar"
)
self.assertEqual(write_return, True)
assert self.run_function("vault.read_secret", arg=["secret/my/secret"]) == {
"password": "bar",
"user": "foo",
}
assert (
self.run_function("vault.read_secret", arg=["secret/my/secret", "user"])
== "foo"
)
@slowTest
def test_write_raw_read_secret(self):
assert (
self.run_function(
"vault.write_raw",
path="secret/my/secret2",
raw={"user2": "foo2", "password2": "bar2"},
)
is True
)
assert self.run_function("vault.read_secret", arg=["secret/my/secret2"]) == {
"password2": "bar2",
"user2": "foo2",
}
@slowTest
def test_delete_secret(self):
assert (
self.run_function(
"vault.write_secret",
path="secret/my/secret",
user="foo",
password="bar",
)
is True
)
assert (
self.run_function("vault.delete_secret", arg=["secret/my/secret"]) is True
)
@slowTest
def test_list_secrets(self):
assert (
self.run_function(
"vault.write_secret",
path="secret/my/secret",
user="foo",
password="bar",
)
is True
)
assert self.run_function("vault.list_secrets", arg=["secret/my/"]) == {
"keys": ["secret"]
}
@destructiveTest
@skipIf(not salt.utils.path.which("dockerd"), "Docker not installed")
@skipIf(not salt.utils.path.which("vault"), "Vault not installed")
class VaultTestCaseCurrent(ModuleCase):
"""
Test vault module against current vault
"""
@classmethod
def setUpClass(cls):
cls.sminion = sminion = create_sminion()
config = '{"backend": {"file": {"path": "/vault/file"}}, "default_lease_ttl": "168h", "max_lease_ttl": "720h", "disable_mlock": true}'
sminion.states.docker_image.present(name="vault", tag="1.3.1")
login_attempts = 1
container_created = False
while True:
if container_created:
sminion.states.docker_container.stopped(name="vault")
sminion.states.docker_container.absent(name="vault")
ret = sminion.states.docker_container.running(
name="vault",
image="vault:1.3.1",
port_bindings="8200:8200",
environment={
"VAULT_DEV_ROOT_TOKEN_ID": "testsecret",
"VAULT_LOCAL_CONFIG": config,
},
)
log.debug("docker_container.running return: %s", ret)
container_created = ret["result"]
time.sleep(5)
ret = sminion.functions.cmd.run_all(
cmd="{} login token=testsecret".format(VAULT_BINARY_PATH),
env={"VAULT_ADDR": "http://127.0.0.1:8200"},
hide_output=False,
)
if ret["retcode"] == 0:
break
log.debug("Vault login failed. Return: %s", ret)
login_attempts += 1
if login_attempts >= 3:
raise SkipTest("unable to login to vault")
ret = sminion.functions.cmd.retcode(
cmd="{} policy write testpolicy {}/vault.hcl".format(
VAULT_BINARY_PATH, RUNTIME_VARS.FILES
),
env={"VAULT_ADDR": "http://127.0.0.1:8200"},
)
if ret != 0:
raise SkipTest("unable to assign policy to vault")
@classmethod
def tearDownClass(cls):
cls.sminion.states.docker_container.stopped(name="vault")
cls.sminion.states.docker_container.absent(name="vault")
cls.sminion.states.docker_image.absent(name="vault", force=True)
cls.sminion = None
@slowTest
def test_write_read_secret_kv2(self):
write_return = self.run_function(
"vault.write_secret", path="secret/my/secret", user="foo", password="bar"
)
# write_secret output:
# {'created_time': '2020-01-12T23:09:34.571294241Z', 'destroyed': False,
# 'version': 1, 'deletion_time': ''}
expected_write = {"destroyed": False, "deletion_time": ""}
self.assertDictContainsSubset(expected_write, write_return)
read_return = self.run_function(
"vault.read_secret", arg=["secret/my/secret"], metadata=True
)
# read_secret output:
# {'data': {'password': 'bar', 'user': 'foo'},
# 'metadata': {'created_time': '2020-01-12T23:07:18.829326918Z', 'destroyed': False,
# 'version': 1, 'deletion_time': ''}}
expected_read = {"data": {"password": "bar", "user": "foo"}}
self.assertDictContainsSubset(expected_read, read_return)
expected_read = {"password": "bar", "user": "foo"}
read_return = self.run_function("vault.read_secret", arg=["secret/my/secret"])
self.assertDictContainsSubset(expected_read, read_return)
read_return = self.run_function(
"vault.read_secret", arg=["secret/my/secret", "user"]
)
self.assertEqual(read_return, "foo")
@slowTest
def test_list_secrets_kv2(self):
write_return = self.run_function(
"vault.write_secret", path="secret/my/secret", user="foo", password="bar"
)
expected_write = {"destroyed": False, "deletion_time": ""}
self.assertDictContainsSubset(expected_write, write_return)
list_return = self.run_function("vault.list_secrets", arg=["secret/my/"])
self.assertIn("secret", list_return["keys"])
@slowTest
def test_write_raw_read_secret_kv2(self):
write_return = self.run_function(
"vault.write_raw",
path="secret/my/secret2",
raw={"user2": "foo2", "password2": "bar2"},
)
expected_write = {"destroyed": False, "deletion_time": ""}
self.assertDictContainsSubset(expected_write, write_return)
read_return = self.run_function(
"vault.read_secret", arg=["secret/my/secret2"], metadata=True
)
expected_read = {"data": {"password2": "bar2", "user2": "foo2"}}
self.assertDictContainsSubset(expected_read, read_return)
read_return = self.run_function("vault.read_secret", arg=["secret/my/secret2"])
expected_read = {"password2": "bar2", "user2": "foo2"}
self.assertDictContainsSubset(expected_read, read_return)
@slowTest
def test_delete_secret_kv2(self):
write_return = self.run_function(
"vault.write_secret",
path="secret/my/secret3",
user3="foo3",
password3="bar3",
)
expected_write = {"destroyed": False, "deletion_time": ""}
self.assertDictContainsSubset(expected_write, write_return)
delete_return = self.run_function(
"vault.delete_secret", arg=["secret/my/secret3"]
)
self.assertEqual(delete_return, True)
@slowTest
def test_destroy_secret_kv2(self):
write_return = self.run_function(
"vault.write_secret",
path="secret/my/secret4",
user3="foo4",
password4="bar4",
)
expected_write = {"destroyed": False, "deletion_time": ""}
self.assertDictContainsSubset(expected_write, write_return)
destroy_return = self.run_function(
"vault.destroy_secret", arg=["secret/my/secret4", "1"]
)
self.assertEqual(destroy_return, True)
# self.assertIsNone(self.run_function('vault.read_secret', arg=['secret/my/secret4']))
# list_return = self.run_function('vault.list_secrets', arg=['secret/my/'])
# self.assertNotIn('secret4', list_return['keys'])
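The `setUpClass` methods above both use a retry-with-reset loop: try to log in, tear down and recreate the vault container between attempts, and skip the suite after three failures. A generic sketch of that pattern; `retry_with_reset` and the callback names are illustrative stand-ins, not part of the salt test suite:

```python
def retry_with_reset(attempt, reset, max_attempts=3):
    """Run `attempt` until it returns True, calling `reset` between tries.

    Returns the number of attempts it took; raises after `max_attempts`
    failures (setUpClass raises SkipTest at the same point).
    """
    for n in range(1, max_attempts + 1):
        if n > 1:
            reset()  # e.g. stop/remove and re-run the vault container
        if attempt():
            return n
    raise RuntimeError("unable to log in after %d attempts" % max_attempts)


# Toy usage: a login that only succeeds on the third try.
calls = {"n": 0}


def flaky_login():
    calls["n"] += 1
    return calls["n"] >= 3


assert retry_with_reset(flaky_login, lambda: None) == 3
```

Note the container is only recreated *between* attempts, never before the first one, which mirrors the `if container_created:` guard in the real code.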
| 36.916107 | 142 | 0.582674 | 1,174 | 11,001 | 5.268313 | 0.155026 | 0.027162 | 0.058205 | 0.077607 | 0.824414 | 0.77882 | 0.767825 | 0.74713 | 0.743573 | 0.722716 | 0 | 0.023598 | 0.28352 | 11,001 | 297 | 143 | 37.040404 | 0.761101 | 0.058904 | 0 | 0.614754 | 0 | 0.008197 | 0.212324 | 0.009122 | 0 | 0 | 0 | 0 | 0.090164 | 1 | 0.053279 | false | 0.061475 | 0.036885 | 0 | 0.098361 | 0.004098 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
163a454d0bdfe4315e6df7049f52f7e2195ba402 | 74 | py | Python | xldigest/tests/test_cell.py | hammerheadlemon/xldigest | f5130121de5e1b152aa42bdd29ffd57cff6b5733 | [
"MIT"
] | null | null | null | xldigest/tests/test_cell.py | hammerheadlemon/xldigest | f5130121de5e1b152aa42bdd29ffd57cff6b5733 | [
"MIT"
] | null | null | null | xldigest/tests/test_cell.py | hammerheadlemon/xldigest | f5130121de5e1b152aa42bdd29ffd57cff6b5733 | [
"MIT"
] | null | null | null | from xldigest.process.cell import Cell

def test_cell_object():
    pass
| 12.333333 | 38 | 0.756757 | 11 | 74 | 4.909091 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.175676 | 74 | 5 | 39 | 14.8 | 0.885246 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
166a4a814b75cc5a2c7628439550a7393007aa0d | 113 | py | Python | molsysmt/element/group/nucleotide/is_nucleotide.py | uibcdf/MolModMTs | 4f6b6f671a9fa3e73008d1e9c48686d5f20a6573 | [
"MIT"
] | null | null | null | molsysmt/element/group/nucleotide/is_nucleotide.py | uibcdf/MolModMTs | 4f6b6f671a9fa3e73008d1e9c48686d5f20a6573 | [
"MIT"
] | null | null | null | molsysmt/element/group/nucleotide/is_nucleotide.py | uibcdf/MolModMTs | 4f6b6f671a9fa3e73008d1e9c48686d5f20a6573 | [
"MIT"
] | null | null | null | from .nucleotide_names import nucleotide_names

def is_nucleotide(name):
    return (name in nucleotide_names)
| 16.142857 | 46 | 0.79646 | 15 | 113 | 5.733333 | 0.6 | 0.523256 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.150442 | 113 | 6 | 47 | 18.833333 | 0.895833 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
167b2a4efb97d81ae1056fea7d8f7e3c18b84ba0 | 145 | py | Python | module/__init__.py | Makiras/express-subscribe-bot | 5565f311d16ac1274e5a6b47784a24bca1c587a0 | [
"Apache-2.0"
] | 1 | 2020-03-26T06:49:57.000Z | 2020-03-26T06:49:57.000Z | module/__init__.py | Makiras/express-subscribe-bot | 5565f311d16ac1274e5a6b47784a24bca1c587a0 | [
"Apache-2.0"
] | null | null | null | module/__init__.py | Makiras/express-subscribe-bot | 5565f311d16ac1274e5a6b47784a24bca1c587a0 | [
"Apache-2.0"
] | null | null | null | from module.db import *
from module.corntab import *
from module.kuaidi100 import *
import module.sendmsg as sendmsg
from module.weather import * | 29 | 32 | 0.806897 | 21 | 145 | 5.571429 | 0.428571 | 0.34188 | 0.273504 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02381 | 0.131034 | 145 | 5 | 33 | 29 | 0.904762 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
167b601c5ebad2ea0ebdafb2768ff419a480eced | 48,656 | py | Python | tests/combine_elts/conftest.py | analyzere/analyzere-python-viz | bbd401392888c7165a1f36ff12e130728c944c2b | [
"MIT"
] | 1 | 2017-02-15T13:57:16.000Z | 2017-02-15T13:57:16.000Z | tests/combine_elts/conftest.py | analyzere/analyzere-python-viz | bbd401392888c7165a1f36ff12e130728c944c2b | [
"MIT"
] | 5 | 2017-01-25T22:19:20.000Z | 2017-02-09T18:03:36.000Z | tests/combine_elts/conftest.py | analyzere/analyzere-python-extras | bbd401392888c7165a1f36ff12e130728c944c2b | [
"MIT"
] | null | null | null | import pytest
from analyzere.base_resources import convert_to_analyzere_object


@pytest.fixture(scope='session')
def portfolio():
    portfolio_dict = {
        '_type': 'StaticPortfolio',
        'created': '2016-03-09T18:38:04.801820Z',
        'description': 'Charlie Group 2019',
        'id': '05f80984-5cc5-4954-a88c-e2317d1dbe37',
        'layers': [
            {
                '_type': 'CatXL',
                'attachment': {
                    'currency': 'USD',
                    'rate': None,
                    'rate_currency': None,
                    'value': 500000.0,
                    'value_date': None
                },
                'created': '2016-03-09T18:38:04.801820Z',
                'description': 'Model A - Charlie Group Layer 1',
                'expiry_date': '2017-01-01T00:00:00.000000Z',
                'fees': None,
                'franchise': {
                    'currency': 'USD',
                    'rate': None,
                    'rate_currency': None,
                    'value': 0.0,
                    'value_date': None
                },
                'id': '07a98f2a-87c0-49d8-a98d-deb626707da2',
                'inception_date': '2016-01-01T00:00:00.000000Z',
                'limit': {
                    'currency': 'USD',
                    'rate': None,
                    'rate_currency': None,
                    'value': 500000.0,
                    'value_date': None
                },
                'loss_sets': [
                    {
                        '_type': 'ELTLossSet',
                        'created': '2016-03-09T18:38:04.801820Z',
                        'currency': 'USD',
                        'data': {
                            'ref_id': 'd99999e4-d66e-4c0a-a6ba-554a8598fea5'
                        },
                        'description': 'FL_HU_Program_1',
                        'event_catalogs': [
                            {'ref_id': 'aa8c316c-128c-485b-9144-5eed3f051c36'}
                        ],
                        'id': '6a9bf278-4edc-4d28-b4bb-59f2283dbcb0',
                        'loss_type': 'LossGross',
                        'meta_data': {},
                        'modified': '2016-03-09T18:38:04.801820Z',
                        'profile': {
                            'attributes': {
                                'ModelID': ['1'],
                                'Peril': ['HU'],
                                'Region': ['US'],
                                'SubRegion': ['Florida']
                            },
                            'avg_annual_loss': 446831.486739948,
                            'currency': 'USD',
                            'max_loss': 8995286.82877091,
                            'min_loss': 10553.9672063884,
                            'non_zero_losses': 1000,
                            'num_losses': 1000
                        },
                        'status': 'processing_succeeded',
                        'status_message': None
                    },
                    {
                        '_type': 'ELTLossSet',
                        'created': '2016-03-09T18:38:04.801820Z',
                        'currency': 'USD',
                        'data': {
                            'ref_id': 'd99999e4-d66e-4c0a-a6ba-554a8598fea5'
                        },
                        'description': 'FL_HU_Program_1',
                        'event_catalogs': [
                            {'ref_id': 'aa8c316c-128c-485b-9144-5eed3f051c36'}
                        ],
                        'id': 'a3bbebae-d25d-493e-8c02-fa4305e7dbd0',
                        'loss_type': 'LossGross',
                        'meta_data': {},
                        'modified': '2016-03-09T18:38:04.801820Z',
                        'profile': {
                            'attributes': {
                                'ModelID': ['1'],
                                'Peril': ['HU'],
                                'Region': ['US'],
                                'SubRegion': ['Florida']
                            },
                            'avg_annual_loss': 446831.486739948,
                            'currency': 'USD',
                            'max_loss': 8995286.82877091,
                            'min_loss': 10553.9672063884,
                            'non_zero_losses': 1000,
                            'num_losses': 1000
                        },
                        'status': 'processing_succeeded',
                        'status_message': None
                    }
                ],
                'meta_data': {
                    'business_unit': 'North America',
                    'cat_modeler': 'Elizabeth Hawks',
                    'client_id': 'Model A - Charlie Group Layer 1',
                    'peril': 'EQ',
                    'program_id': 3,
                    'program_name': 'Charlie Group',
                    'region': 'California',
                    'underwriter': 'Travis Costa'
                },
                'modified': '2016-03-09T18:38:04.801820Z',
                'nth': 1,
                'participation': 0.092,
                'policy': None,
                'premium': {
                    'currency': 'USD',
                    'rate': None,
                    'rate_currency': None,
                    'value': 39939.9225,
                    'value_date': None
                },
                'reinstatements': []
            },
            {
                '_type': 'CatXL',
                'attachment': {
                    'currency': 'USD',
                    'rate': None,
                    'rate_currency': None,
                    'value': 500000.0,
                    'value_date': None
                },
                'created': '2016-03-09T18:38:04.801820Z',
                'description': 'Model A - Charlie Group Layer 1',
                'expiry_date': '2017-01-01T00:00:00.000000Z',
                'fees': None,
                'franchise': {
                    'currency': 'USD',
                    'rate': None,
                    'rate_currency': None,
                    'value': 0.0,
                    'value_date': None
                },
                'id': '286226d2-e0c8-4ecf-b6cf-ba381789fbd2',
                'inception_date': '2016-01-01T00:00:00.000000Z',
                'limit': {
                    'currency': 'USD',
                    'rate': None,
                    'rate_currency': None,
                    'value': 500000.0,
                    'value_date': None
                },
                'loss_sets': [
                    {
                        '_type': 'ELTLossSet',
                        'created': '2016-03-09T18:38:04.801820Z',
                        'currency': 'USD',
                        'data': {
                            'ref_id': 'd99999e4-d66e-4c0a-a6ba-554a8598fea5'
                        },
                        'description': 'FL_HU_Program_1',
                        'event_catalogs': [
                            {'ref_id': 'aa8c316c-128c-485b-9144-5eed3f051c36'}
                        ],
                        'id': '5c468e85-3716-4abb-87b2-bcd39f6d6afd',
                        'loss_type': 'LossGross',
                        'meta_data': {},
                        'modified': '2016-03-09T18:38:04.801820Z',
                        'profile': {
                            'attributes': {
                                'ModelID': ['1'],
                                'Peril': ['HU'],
                                'Region': ['US'],
                                'SubRegion': ['Florida']
                            },
                            'avg_annual_loss': 446831.486739948,
                            'currency': 'USD',
                            'max_loss': 8995286.82877091,
                            'min_loss': 10553.9672063884,
                            'non_zero_losses': 1000,
                            'num_losses': 1000
                        },
                        'status': 'processing_succeeded',
                        'status_message': None
                    },
                    {
                        '_type': 'ELTLossSet',
                        'created': '2016-03-09T18:38:04.801820Z',
                        'currency': 'USD',
                        'data': {
                            'ref_id': 'd99999e4-d66e-4c0a-a6ba-554a8598fea5'
                        },
                        'description': 'FL_HU_Program_1',
                        'event_catalogs': [
                            {'ref_id': 'aa8c316c-128c-485b-9144-5eed3f051c36'}
                        ],
                        'id': 'c1c0d452-1dd1-44a6-8489-4de06c0a1b58',
                        'loss_type': 'LossGross',
                        'meta_data': {},
                        'modified': '2016-03-09T18:38:04.801820Z',
                        'profile': {
                            'attributes': {
                                'ModelID': ['1'],
                                'Peril': ['HU'],
                                'Region': ['US'],
                                'SubRegion': ['Florida']
                            },
                            'avg_annual_loss': 446831.486739948,
                            'currency': 'USD',
                            'max_loss': 8995286.82877091,
                            'min_loss': 10553.9672063884,
                            'non_zero_losses': 1000,
                            'num_losses': 1000
                        },
                        'status': 'processing_succeeded',
                        'status_message': None
                    }
                ],
                'meta_data': {
                    'business_unit': 'North America',
                    'cat_modeler': 'Elizabeth Hawks',
                    'client_id': 'Model A - Charlie Group Layer 1',
                    'peril': 'EQ',
                    'program_id': 3,
                    'program_name': 'Charlie Group',
                    'region': 'California',
                    'underwriter': 'Travis Costa'
                },
                'modified': '2016-03-09T18:38:04.801820Z',
                'nth': 1,
                'participation': 0.092,
                'policy': None,
                'premium': {
                    'currency': 'USD',
                    'rate': None,
                    'rate_currency': None,
                    'value': 39939.9225,
                    'value_date': None
                },
                'reinstatements': []
            }
        ],
        'meta_data': {},
        'modified': '2016-03-09T18:38:04.801820Z',
        'name': 'Charlie Group 2019'
    }
    analyzere_portfolio = convert_to_analyzere_object(portfolio_dict)
    return [
        '05f80984-5cc5-4954-a88c-e2317d1dbe37',
        [
            '6a9bf278-4edc-4d28-b4bb-59f2283dbcb0',
            'a3bbebae-d25d-493e-8c02-fa4305e7dbd0',
            '5c468e85-3716-4abb-87b2-bcd39f6d6afd',
            'c1c0d452-1dd1-44a6-8489-4de06c0a1b58'
        ],
        analyzere_portfolio
    ]
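The second element of the fixture's return value lists the four loss-set ids buried two levels deep in the portfolio dict (layers → loss_sets). A small sketch of how those ids relate to the nested structure; the trimmed `portfolio_dict` and `ls-*` ids here are illustrative stand-ins for the fixture's full payload:

```python
# Walk layers -> loss_sets and collect each loss set's 'id', mirroring the
# ids the fixture returns alongside the portfolio id.
portfolio_dict = {
    'id': '05f80984-5cc5-4954-a88c-e2317d1dbe37',
    'layers': [
        {'loss_sets': [{'id': 'ls-1'}, {'id': 'ls-2'}]},
        {'loss_sets': [{'id': 'ls-3'}, {'id': 'ls-4'}]},
    ],
}

loss_set_ids = [
    loss_set['id']
    for layer in portfolio_dict['layers']
    for loss_set in layer['loss_sets']
]
assert loss_set_ids == ['ls-1', 'ls-2', 'ls-3', 'ls-4']
```

Deriving the id list this way (rather than hand-maintaining it) would keep the fixture's return value in sync with the dict if the payload changes.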
@pytest.fixture(scope='session')
def portfolio_view_with_layer_views():
    portfolio_view_dict = {
        'analysis_profile': {'ref_id': '3bb4950c-777f-44dd-a5c2-73a2dfa04d26'},
        'id': '3c9c86e3-56ab-447b-9008-d6d95adb0d11',
        'portfolio': None,
        'layer_views': [
            {
                'analysis_profile': {
                    'ref_id': '3bb4950c-777f-44dd-a5c2-73a2dfa04d26'
                },
                'id': '333bf15d-225a-a30c-b95d-22fff4722b5d',
                'layer': {
                    '_type': 'CatXL',
                    'attachment': {
                        'currency': 'USD',
                        'rate': None,
                        'rate_currency': None,
                        'value': 500000.0,
                        'value_date': None
                    },
                    'created': '2016-03-09T18:38:04.801820Z',
                    'description': 'Model A - Charlie Group Layer 1',
                    'expiry_date': '2017-01-01T00:00:00.000000Z',
                    'fees': None,
                    'franchise': {
                        'currency': 'USD',
                        'rate': None,
                        'rate_currency': None,
                        'value': 0.0,
                        'value_date': None
                    },
                    'id': '07a98f2a-87c0-49d8-a98d-deb626707da2',
                    'inception_date': '2016-01-01T00:00:00.000000Z',
                    'limit': {
                        'currency': 'USD',
                        'rate': None,
                        'rate_currency': None,
                        'value': 500000.0,
                        'value_date': None
                    },
                    'loss_sets': [
                        {
                            '_type': 'ELTLossSet',
                            'created': '2016-03-09T18:38:04.801820Z',
                            'currency': 'USD',
                            'data': {
                                'ref_id': 'd99999e4-d66e-4c0a-a6ba-554a8598fea5'
                            },
                            'description': 'FL_HU_Program_1',
                            'event_catalogs': [
                                {'ref_id': 'aa8c316c-128c-485b-9144-5eed3f051c36'}
                            ],
                            'id': '6a9bf278-4edc-4d28-b4bb-59f2283dbcb0',
                            'loss_type': 'LossGross',
                            'meta_data': {},
                            'modified': '2016-03-09T18:38:04.801820Z',
                            'profile': {
                                'attributes': {
                                    'ModelID': ['1'],
                                    'Peril': ['HU'],
                                    'Region': ['US'],
                                    'SubRegion': ['Florida']
                                },
                                'avg_annual_loss': 446831.486739948,
                                'currency': 'USD',
                                'max_loss': 8995286.82877091,
                                'min_loss': 10553.9672063884,
                                'non_zero_losses': 1000,
                                'num_losses': 1000
                            },
                            'status': 'processing_succeeded',
                            'status_message': None
                        }
                    ],
                    'meta_data': {
                        'business_unit': 'North America',
                        'cat_modeler': 'Elizabeth Hawks',
                        'client_id': 'Model A - Charlie Group Layer 1',
                        'peril': 'EQ',
                        'program_id': 3,
                        'program_name': 'Charlie Group',
                        'region': 'California',
                        'underwriter': 'Travis Costa'
                    },
                    'modified': '2016-03-09T18:38:04.801820Z',
                    'nth': 1,
                    'participation': 0.092,
                    'policy': None,
                    'premium': {
                        'currency': 'USD',
                        'rate': None,
                        'rate_currency': None,
                        'value': 39939.9225,
                        'value_date': None
                    },
                    'reinstatements': []
                },
                'target_currency': 'USD',
                'ylt_id': '99838575-6a3c-a7a2-d1e5-40a8e1b67b47'
            },
            {
                'analysis_profile': {
                    'ref_id': '3bb4950c-777f-44dd-a5c2-73a2dfa04d26'
                },
                'id': 'fd4fe20c-fd04-4b85-bc65-67568be37d43',
                'layer': {
                    '_type': 'CatXL',
                    'attachment': {
                        'currency': 'USD',
                        'rate': None,
                        'rate_currency': None,
                        'value': 500000.0,
                        'value_date': None
                    },
                    'created': '2016-03-09T18:38:04.801820Z',
                    'description': 'Model A - Charlie Group Layer 1',
                    'expiry_date': '2017-01-01T00:00:00.000000Z',
                    'fees': None,
                    'franchise': {
                        'currency': 'USD',
                        'rate': None,
                        'rate_currency': None,
                        'value': 0.0,
                        'value_date': None
                    },
                    'id': '07a98f2a-87c0-49d8-a98d-deb626707da2',
                    'inception_date': '2016-01-01T00:00:00.000000Z',
                    'limit': {
                        'currency': 'USD',
                        'rate': None,
                        'rate_currency': None,
                        'value': 500000.0,
                        'value_date': None
                    },
                    'loss_sets': [
                        {
                            '_type': 'ELTLossSet',
                            'created': '2016-03-09T18:38:04.801820Z',
                            'currency': 'USD',
                            'data': {
                                'ref_id': 'd99999e4-d66e-4c0a-a6ba-554a8598fea5'
                            },
                            'description': 'FL_HU_Program_1',
                            'event_catalogs': [
                                {'ref_id': 'aa8c316c-128c-485b-9144-5eed3f051c36'}
                            ],
                            'id': 'acbdb659-b4b2-4784-873c-79952c071e97',
                            'loss_type': 'LossGross',
                            'meta_data': {},
                            'modified': '2016-03-09T18:38:04.801820Z',
                            'profile': {
                                'attributes': {
                                    'ModelID': ['1'],
                                    'Peril': ['HU'],
                                    'Region': ['US'],
                                    'SubRegion': ['Florida']
                                },
                                'avg_annual_loss': 446831.486739948,
                                'currency': 'USD',
                                'max_loss': 8995286.82877091,
                                'min_loss': 10553.9672063884,
                                'non_zero_losses': 1000,
                                'num_losses': 1000
                            },
                            'status': 'processing_succeeded',
                            'status_message': None
                        }
                    ],
                    'meta_data': {
                        'business_unit': 'North America',
                        'cat_modeler': 'Elizabeth Hawks',
                        'client_id': 'Model A - Charlie Group Layer 1',
                        'peril': 'EQ',
                        'program_id': 3,
                        'program_name': 'Charlie Group',
'region': 'California',
'underwriter': 'Travis Costa'
},
'modified': '2016-03-09T18:38:04.801820Z',
'nth': 1,
'participation': 0.092,
'policy': None,
'premium': {
'currency': 'USD',
'rate': None,
'rate_currency': None,
'value': 39939.9225,
'value_date': None
},
'reinstatements': []
},
'target_currency': 'USD',
'ylt_id': '99838575-6a3c-a7a2-d1e5-40a8e1b67b47'
}
],
'target_currency': 'USD',
'ylt_id': '99838575-6a3c-a7a2-d1e5-40a8e1b67b47'
}
analyzere_portfolio_view = convert_to_analyzere_object(portfolio_view_dict)
return [
'3c9c86e3-56ab-447b-9008-d6d95adb0d11',
[
'6a9bf278-4edc-4d28-b4bb-59f2283dbcb0',
'acbdb659-b4b2-4784-873c-79952c071e97'
],
analyzere_portfolio_view
]
@pytest.fixture(scope='session')
def portfolio_view_with_portfolio():
portfolio_view_dict = {
'analysis_profile': {'ref_id': '3bb4950c-777f-44dd-a5c2-73a2dfa04d26'},
'id': '7397662d-2be0-47c4-8956-b7f6e3c56c6c',
'portfolio': {
'_type': 'StaticPortfolio',
'created': '2016-03-09T18:38:04.801820Z',
'description': 'Charlie Group 2019',
'id': '05f80984-5cc5-4954-a88c-e2317d1dbe37',
'layers': [
{
'_type': 'CatXL',
'attachment': {
'currency': 'USD',
'rate': None,
'rate_currency': None,
'value': 500000.0,
'value_date': None
},
'created': '2016-03-09T18:38:04.801820Z',
'description': 'Model A - Charlie Group Layer 1',
'expiry_date': '2017-01-01T00:00:00.000000Z',
'fees': None,
'franchise': {
'currency': 'USD',
'rate': None,
'rate_currency': None,
'value': 0.0,
'value_date': None
},
'id': '07a98f2a-87c0-49d8-a98d-deb626707da2',
'inception_date': '2016-01-01T00:00:00.000000Z',
'limit': {
'currency': 'USD',
'rate': None,
'rate_currency': None,
'value': 500000.0,
'value_date': None
},
'loss_sets': [
{
'_type': 'ELTLossSet',
'created': '2016-03-09T18:38:04.801820Z',
'currency': 'USD',
'data': {
'ref_id':
'd99999e4-d66e-4c0a-a6ba-554a8598fea5'
},
'description': 'FL_HU_Program_1',
'event_catalogs': [
{'ref_id':
'aa8c316c-128c-485b-9144-5eed3f051c36'}
],
'id': '6a9bf278-4edc-4d28-b4bb-59f2283dbcb0',
'loss_type': 'LossGross',
'meta_data': {},
'modified': '2016-03-09T18:38:04.801820Z',
'profile': {
'attributes': {
'ModelID': ['1'],
'Peril': ['HU'],
'Region': ['US'],
'SubRegion': ['Florida']
},
'avg_annual_loss': 446831.486739948,
'currency': 'USD',
'max_loss': 8995286.82877091,
'min_loss': 10553.9672063884,
'non_zero_losses': 1000,
'num_losses': 1000
},
'status': 'processing_succeeded',
'status_message': None
},
{
'_type': 'ELTLossSet',
'created': '2016-03-09T18:38:04.801820Z',
'currency': 'USD',
'data': {
'ref_id':
'd99999e4-d66e-4c0a-a6ba-554a8598fea5'
},
'description': 'FL_HU_Program_1',
'event_catalogs': [
{'ref_id':
'aa8c316c-128c-485b-9144-5eed3f051c36'}
],
'id': 'a3bbebae-d25d-493e-8c02-fa4305e7dbd0',
'loss_type': 'LossGross',
'meta_data': {},
'modified': '2016-03-09T18:38:04.801820Z',
'profile': {
'attributes': {
'ModelID': ['1'],
'Peril': ['HU'],
'Region': ['US'],
'SubRegion': ['Florida']
},
'avg_annual_loss': 446831.486739948,
'currency': 'USD',
'max_loss': 8995286.82877091,
'min_loss': 10553.9672063884,
'non_zero_losses': 1000,
'num_losses': 1000
},
'status': 'processing_succeeded',
'status_message': None
}
],
'meta_data': {
'business_unit': 'North America',
'cat_modeler': 'Elizabeth Hawks',
'client_id': 'Model A - Charlie Group Layer 1',
'peril': 'EQ',
'program_id': 3,
'program_name': 'Charlie Group',
'region': 'California',
'underwriter': 'Travis Costa'
},
'modified': '2016-03-09T18:38:04.801820Z',
'nth': 1,
'participation': 0.092,
'policy': None,
'premium': {
'currency': 'USD',
'rate': None,
'rate_currency': None,
'value': 39939.9225,
'value_date': None
},
'reinstatements': []
},
{
'_type': 'CatXL',
'attachment': {
'currency': 'USD',
'rate': None,
'rate_currency': None,
'value': 500000.0,
'value_date': None
},
'created': '2016-03-09T18:38:04.801820Z',
'description': 'Model A - Charlie Group Layer 1',
'expiry_date': '2017-01-01T00:00:00.000000Z',
'fees': None,
'franchise': {
'currency': 'USD',
'rate': None,
'rate_currency': None,
'value': 0.0,
'value_date': None
},
'id': '286226d2-e0c8-4ecf-b6cf-ba381789fbd2',
'inception_date': '2016-01-01T00:00:00.000000Z',
'limit': {
'currency': 'USD',
'rate': None,
'rate_currency': None,
'value': 500000.0,
'value_date': None
},
'loss_sets': [
{
'_type': 'ELTLossSet',
'created': '2016-03-09T18:38:04.801820Z',
'currency': 'USD',
'data': {
'ref_id':
'd99999e4-d66e-4c0a-a6ba-554a8598fea5'
},
'description': 'FL_HU_Program_1',
'event_catalogs': [
{'ref_id':
'aa8c316c-128c-485b-9144-5eed3f051c36'}
],
'id': '5c468e85-3716-4abb-87b2-bcd39f6d6afd',
'loss_type': 'LossGross',
'meta_data': {},
'modified': '2016-03-09T18:38:04.801820Z',
'profile': {
'attributes': {
'ModelID': ['1'],
'Peril': ['HU'],
'Region': ['US'],
'SubRegion': ['Florida']
},
'avg_annual_loss': 446831.486739948,
'currency': 'USD',
'max_loss': 8995286.82877091,
'min_loss': 10553.9672063884,
'non_zero_losses': 1000,
'num_losses': 1000
},
'status': 'processing_succeeded',
'status_message': None
},
{
'_type': 'ELTLossSet',
'created': '2016-03-09T18:38:04.801820Z',
'currency': 'USD',
'data': {
'ref_id':
'd99999e4-d66e-4c0a-a6ba-554a8598fea5'
},
'description': 'FL_HU_Program_1',
'event_catalogs': [
{'ref_id':
'aa8c316c-128c-485b-9144-5eed3f051c36'}
],
'id': 'c1c0d452-1dd1-44a6-8489-4de06c0a1b58',
'loss_type': 'LossGross',
'meta_data': {},
'modified': '2016-03-09T18:38:04.801820Z',
'profile': {
'attributes': {
'ModelID': ['1'],
'Peril': ['HU'],
'Region': ['US'],
'SubRegion': ['Florida']
},
'avg_annual_loss': 446831.486739948,
'currency': 'USD',
'max_loss': 8995286.82877091,
'min_loss': 10553.9672063884,
'non_zero_losses': 1000,
'num_losses': 1000
},
'status': 'processing_succeeded',
'status_message': None
}
],
'meta_data': {
'business_unit': 'North America',
'cat_modeler': 'Elizabeth Hawks',
'client_id': 'Model A - Charlie Group Layer 1',
'peril': 'EQ',
'program_id': 3,
'program_name': 'Charlie Group',
'region': 'California',
'underwriter': 'Travis Costa'
},
'modified': '2016-03-09T18:38:04.801820Z',
'nth': 1,
'participation': 0.092,
'policy': None,
'premium': {
'currency': 'USD',
'rate': None,
'rate_currency': None,
'value': 39939.9225,
'value_date': None
},
'reinstatements': []
}
],
'meta_data': {},
'modified': '2016-03-09T18:38:04.801820Z',
'name': 'Charlie Group 2019'
},
'target_currency': 'USD',
'ylt_id': '99838575-6a3c-a7a2-d1e5-40a8e1b67b47'
}
analyzere_portfolio_view = convert_to_analyzere_object(portfolio_view_dict)
return [
'7397662d-2be0-47c4-8956-b7f6e3c56c6c',
[
'6a9bf278-4edc-4d28-b4bb-59f2283dbcb0',
'a3bbebae-d25d-493e-8c02-fa4305e7dbd0',
'5c468e85-3716-4abb-87b2-bcd39f6d6afd',
'c1c0d452-1dd1-44a6-8489-4de06c0a1b58'
],
analyzere_portfolio_view
]
@pytest.fixture(scope='session')
def layer():
layer_dict = {
'_type': 'CatXL',
'attachment': {
'currency': 'USD',
'rate': None,
'rate_currency': None,
'value': 500000.0,
'value_date': None
},
'created': '2016-03-09T18:38:04.801820Z',
'description': 'Model A - Charlie Group Layer 1',
'expiry_date': '2017-01-01T00:00:00.000000Z',
'fees': None,
'franchise': {
'currency': 'USD',
'rate': None,
'rate_currency': None,
'value': 0.0,
'value_date': None
},
'id': '07a98f2a-87c0-49d8-a98d-deb626707da2',
'inception_date': '2016-01-01T00:00:00.000000Z',
'limit': {
'currency': 'USD',
'rate': None,
'rate_currency': None,
'value': 500000.0,
'value_date': None
},
'loss_sets': [
{
'_type': 'ELTLossSet',
'created': '2016-03-09T18:38:04.801820Z',
'currency': 'USD',
'data': {'ref_id': 'd99999e4-d66e-4c0a-a6ba-554a8598fea5'},
'description': 'FL_HU_Program_1',
'event_catalogs': [
{'ref_id': 'aa8c316c-128c-485b-9144-5eed3f051c36'}
],
'id': '6a9bf278-4edc-4d28-b4bb-59f2283dbcb0',
'loss_type': 'LossGross',
'meta_data': {},
'modified': '2016-03-09T18:38:04.801820Z',
'profile': {
'attributes': {
'ModelID': ['1'],
'Peril': ['HU'],
'Region': ['US'],
'SubRegion': ['Florida']
},
'avg_annual_loss': 446831.486739948,
'currency': 'USD',
'max_loss': 8995286.82877091,
'min_loss': 10553.9672063884,
'non_zero_losses': 1000,
'num_losses': 1000
},
'status': 'processing_succeeded',
'status_message': None
},
{
'_type': 'ELTLossSet',
'created': '2016-03-09T18:38:04.801820Z',
'currency': 'USD',
'data': {'ref_id': 'd99999e4-d66e-4c0a-a6ba-554a8598fea5'},
'description': 'FL_HU_Program_1',
'event_catalogs': [
{'ref_id': 'aa8c316c-128c-485b-9144-5eed3f051c36'}
],
'id': 'a3bbebae-d25d-493e-8c02-fa4305e7dbd0',
'loss_type': 'LossGross',
'meta_data': {},
'modified': '2016-03-09T18:38:04.801820Z',
'profile': {
'attributes': {
'ModelID': ['1'],
'Peril': ['HU'],
'Region': ['US'],
'SubRegion': ['Florida']
},
'avg_annual_loss': 446831.486739948,
'currency': 'USD',
'max_loss': 8995286.82877091,
'min_loss': 10553.9672063884,
'non_zero_losses': 1000,
'num_losses': 1000
},
'status': 'processing_succeeded',
'status_message': None
}
],
'meta_data': {
'business_unit': 'North America',
'cat_modeler': 'Elizabeth Hawks',
'client_id': 'Model A - Charlie Group Layer 1',
'peril': 'EQ',
'program_id': 3,
'program_name': 'Charlie Group',
'region': 'California',
'underwriter': 'Travis Costa'
},
'modified': '2016-03-09T18:38:04.801820Z',
'nth': 1,
'participation': 0.092,
'policy': None,
'premium': {
'currency': 'USD',
'rate': None,
'rate_currency': None,
'value': 39939.9225,
'value_date': None
},
'reinstatements': []
}
analyzere_layer = convert_to_analyzere_object(layer_dict)
return [
'07a98f2a-87c0-49d8-a98d-deb626707da2',
[
'6a9bf278-4edc-4d28-b4bb-59f2283dbcb0',
'a3bbebae-d25d-493e-8c02-fa4305e7dbd0'
],
analyzere_layer
]
@pytest.fixture(scope='session')
def layer_view():
layer_view_dict = {
'analysis_profile': {'ref_id': '3bb4950c-777f-44dd-a5c2-73a2dfa04d26'},
'id': '333bf15d-225a-a30c-b95d-22fff4722b5d',
'layer': {
'_type': 'CatXL',
'attachment': {
'currency': 'USD',
'rate': None,
'rate_currency': None,
'value': 500000.0,
'value_date': None
},
'created': '2016-03-09T18:38:04.801820Z',
'description': 'Model A - Charlie Group Layer 1',
'expiry_date': '2017-01-01T00:00:00.000000Z',
'fees': None,
'franchise': {
'currency': 'USD',
'rate': None,
'rate_currency': None,
'value': 0.0,
'value_date': None
},
'id': '07a98f2a-87c0-49d8-a98d-deb626707da2',
'inception_date': '2016-01-01T00:00:00.000000Z',
'limit': {
'currency': 'USD',
'rate': None,
'rate_currency': None,
'value': 500000.0,
'value_date': None
},
'loss_sets': [
{
'_type': 'ELTLossSet',
'created': '2016-03-09T18:38:04.801820Z',
'currency': 'USD',
'data': {'ref_id': 'd99999e4-d66e-4c0a-a6ba-554a8598fea5'},
'description': 'FL_HU_Program_1',
'event_catalogs': [
{'ref_id': 'aa8c316c-128c-485b-9144-5eed3f051c36'}
],
'id': '6a9bf278-4edc-4d28-b4bb-59f2283dbcb0',
'loss_type': 'LossGross',
'meta_data': {},
'modified': '2016-03-09T18:38:04.801820Z',
'profile': {
'attributes': {
'ModelID': ['1'],
'Peril': ['HU'],
'Region': ['US'],
'SubRegion': ['Florida']
},
'avg_annual_loss': 446831.486739948,
'currency': 'USD',
'max_loss': 8995286.82877091,
'min_loss': 10553.9672063884,
'non_zero_losses': 1000,
'num_losses': 1000
},
'status': 'processing_succeeded',
'status_message': None
}
],
'meta_data': {
'business_unit': 'North America',
'cat_modeler': 'Elizabeth Hawks',
'client_id': 'Model A - Charlie Group Layer 1',
'peril': 'EQ',
'program_id': 3,
'program_name': 'Charlie Group',
'region': 'California',
'underwriter': 'Travis Costa'
},
'modified': '2016-03-09T18:38:04.801820Z',
'nth': 1,
'participation': 0.092,
'policy': None,
'premium': {
'currency': 'USD',
'rate': None,
'rate_currency': None,
'value': 39939.9225,
'value_date': None
},
'reinstatements': []
},
'target_currency': 'USD',
'ylt_id': '99838575-6a3c-a7a2-d1e5-40a8e1b67b47'
}
analyzere_layer_view = convert_to_analyzere_object(layer_view_dict)
return [
'333bf15d-225a-a30c-b95d-22fff4722b5d',
['6a9bf278-4edc-4d28-b4bb-59f2283dbcb0'],
analyzere_layer_view
]
@pytest.fixture(scope='session')
def loss_set():
loss_set_dict = {
'_type': 'ELTLossSet',
'created': '2016-03-09T18:38:04.801820Z',
'currency': 'USD',
'data': {'ref_id': 'd99999e4-d66e-4c0a-a6ba-554a8598fea5'},
'description': 'FL_HU_Program_1',
'event_catalogs': [
{'ref_id': 'aa8c316c-128c-485b-9144-5eed3f051c36'}
],
'id': '6a9bf278-4edc-4d28-b4bb-59f2283dbcb0',
'loss_type': 'LossGross',
'meta_data': {},
'modified': '2016-03-09T18:38:04.801820Z',
'profile': {
'attributes': {
'ModelID': ['1'],
'Peril': ['HU'],
'Region': ['US'],
'SubRegion': ['Florida']
},
'avg_annual_loss': 446831.486739948,
'currency': 'USD',
'max_loss': 8995286.82877091,
'min_loss': 10553.9672063884,
'non_zero_losses': 1000,
'num_losses': 1000
},
'status': 'processing_succeeded',
'status_message': None
}
analyzere_loss_set = convert_to_analyzere_object(loss_set_dict)
return [
'6a9bf278-4edc-4d28-b4bb-59f2283dbcb0',
analyzere_loss_set
]
@pytest.fixture(scope='session')
def portfolio_with_non_elt_loss_sets(portfolio):
    non_elt_portfolio = [portfolio[0], portfolio[1], portfolio[2]]
    for layer in non_elt_portfolio[2].layers:
        for loss_set in layer.loss_sets:
            loss_set.type = 'YELTLossSet'
    return non_elt_portfolio
@pytest.fixture(scope='session')
def pv_with_p_with_non_elt_loss_sets(
        portfolio_view_with_portfolio):
    non_elt_portfolio_view = [
        portfolio_view_with_portfolio[0],
        portfolio_view_with_portfolio[1],
        portfolio_view_with_portfolio[2]
    ]
    for layer in non_elt_portfolio_view[2].portfolio.layers:
        for loss_set in layer.loss_sets:
            loss_set.type = 'YELTLossSet'
    return non_elt_portfolio_view
@pytest.fixture(scope='session')
def pv_with_lv_with_non_elt_loss_sets(
        portfolio_view_with_layer_views):
    non_elt_portfolio_view = [
        portfolio_view_with_layer_views[0],
        portfolio_view_with_layer_views[1],
        portfolio_view_with_layer_views[2]
    ]
    for layer_view in non_elt_portfolio_view[2].layer_views:
        for loss_set in layer_view.layer.loss_sets:
            loss_set.type = 'YELTLossSet'
    return non_elt_portfolio_view
@pytest.fixture(scope='session')
def layer_with_non_elt_loss_sets(layer):
    non_elt_layer = [layer[0], layer[1], layer[2]]
    for loss_set in non_elt_layer[2].loss_sets:
        loss_set.type = 'YELTLossSet'
    return non_elt_layer
@pytest.fixture(scope='session')
def layer_view_with_non_elt_loss_sets(layer_view):
    non_elt_layer_view = [layer_view[0], layer_view[1], layer_view[2]]
    for loss_set in non_elt_layer_view[2].layer.loss_sets:
        loss_set.type = 'YELTLossSet'
    return non_elt_layer_view
@pytest.fixture(scope='session')
def non_elt_loss_set(loss_set):
    non_elt_loss_set = [loss_set[0], loss_set[1]]
    non_elt_loss_set[1].type = 'YELTLossSet'
    return non_elt_loss_set
@pytest.fixture(scope='session')
def elt_response_1():
    id = 'c054b33f-45df-4007-94f1-13d24935524d'
    elt_str_list = [
        'EventId,Loss',
        '1000,10.5',
        '1001,400.0',
        '1003,200.0'
    ]
    return [id, elt_str_list]
@pytest.fixture(scope='session')
def elt_response_2():
    id = '11ace104-d814-4238-99ba-0a1a2faa4f2d'
    elt_str_list = [
        'EventId,Loss',
        '2100,20.25',
        '1000,600.0'
    ]
    return [id, elt_str_list]
@pytest.fixture(scope='session')
def elt_response_3():
    id = '756f989c-1250-4149-97c1-9b57e3aa36b8'
    elt_str_list = [
        'EventId,Loss',
        '3000,10.5',
        '1003,1200.0',
        '3002,3000.1',
        '2100,3150.0'
    ]
    return [id, elt_str_list]
@pytest.fixture(scope='session')
def elt_response_additional_columns_1():
    id = '97fcab1e-1afd-40ca-b439-5b0bab3cd576'
    elt_str_list = [
        'EventId,Loss,STDDEVI,STDDEVC,EXPVALUE',
        '02,100.5,4,5.7,99',
        '01,40.0,0.0,0,0.00',
        '3000,23300.0,4,5,234'
    ]
    return [id, elt_str_list]
@pytest.fixture(scope='session')
def elt_response_additional_columns_2():
    id = 'c48bcda4-8ba5-4366-9c95-1904beb0d19e'
    elt_str_list = [
        'EventId,Loss,STDDEVI,STDDEVC,EXPVALUE',
        '03,20.250,4,5.6,0.01',
        '1000,600.00,0.01,0.04,0'
    ]
    return [id, elt_str_list]
@pytest.fixture(scope='session')
def elt_response_additional_columns_3():
    id = '5489f16a-dbcf-42e6-a5c6-3b0ea05b293a'
    elt_str_list = [
        'EventId,Loss,STDDEVI,STDDEVC,EXPVALUE',
        '3000,105.5,0,0,0',
        '1420,120.0,1,1,1',
        '11420,30400.1,0.5,0.4,0.3',
        '02,3150.0,0.9,8.0,22'
    ]
    return [id, elt_str_list]
@pytest.fixture(scope='session')
def elt_response_EventId():
    id = '0c7b46ac-3ae0-432e-931f-ae2c3f03481b'
    elt_str_list = [
        'EventId,Loss',
        '99000,1000000',
        '104000,2000000'
    ]
    return [id, elt_str_list]
@pytest.fixture(scope='session')
def elt_response_EventID():
    id = '90c03895-ea84-4df6-81b4-b6582976004b'
    elt_str_list = [
        'EventID,Loss',
        '6454,4500.00',
        '6201,560.8'
    ]
    return [id, elt_str_list]
@pytest.fixture(scope='session')
def elt_response_dict(elt_response_EventId, elt_response_EventID):
    return {
        'elt_response_EventId': elt_response_EventId,
        'elt_response_EventID': elt_response_EventID
    }
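The ELT response fixtures above carry their data as raw CSV lines, and the event-id header varies in case across fixtures ('EventId' vs 'EventID'). A small parser can normalize that; the helper below is a sketch of my own (the name `parse_elt_rows` is not part of this module) showing how such lists might be turned into typed rows.

```python
import csv
import io


def parse_elt_rows(elt_str_list):
    # Join the fixture's CSV lines and read them with a DictReader,
    # lower-casing header names so that 'EventId' and 'EventID' (and any
    # extra columns such as STDDEVI/STDDEVC/EXPVALUE) are handled uniformly.
    reader = csv.DictReader(io.StringIO('\n'.join(elt_str_list)))
    rows = []
    for row in reader:
        normalized = {k.lower(): v for k, v in row.items()}
        rows.append((int(normalized['eventid']), float(normalized['loss'])))
    return rows


print(parse_elt_rows(['EventId,Loss', '1000,10.5', '1001,400.0']))
# [(1000, 10.5), (1001, 400.0)]
```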
| 39.719184 | 79 | 0.398697 | 3,767 | 48,656 | 4.931245 | 0.081763 | 0.038491 | 0.028424 | 0.033592 | 0.952304 | 0.910099 | 0.891527 | 0.873654 | 0.853898 | 0.842 | 0 | 0.17497 | 0.479406 | 48,656 | 1,224 | 80 | 39.751634 | 0.558389 | 0 | 0 | 0.769432 | 0 | 0 | 0.300292 | 0.108332 | 0 | 0 | 0 | 0 | 0 | 1 | 0.018341 | false | 0 | 0.001747 | 0 | 0.038428 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1698f44b25b842bada34acb62596d4413a3b370c | 4,591 | py | Python | edbdeploy/spec/vmware.py | vincentp7212/postgres-deployment | ea0ed0e06a4eb99cc28600398eddcf2320778113 | [
"BSD-3-Clause"
] | 58 | 2020-02-24T21:02:50.000Z | 2022-03-28T14:51:56.000Z | edbdeploy/spec/vmware.py | vincentp7212/postgres-deployment | ea0ed0e06a4eb99cc28600398eddcf2320778113 | [
"BSD-3-Clause"
] | 108 | 2020-09-18T12:53:44.000Z | 2022-02-02T09:02:31.000Z | edbdeploy/spec/vmware.py | vincentp7212/postgres-deployment | ea0ed0e06a4eb99cc28600398eddcf2320778113 | [
"BSD-3-Clause"
] | 47 | 2020-03-04T15:51:01.000Z | 2022-02-27T13:48:05.000Z | from . import SpecValidator
VMWareSpec = {
'EDB-RA-1': {
'ssh_user': SpecValidator(type='string', default=None),
'pg_data': SpecValidator(type='string', default=None),
'pg_wal': SpecValidator(type='string', default=None),
'postgres_server_1': {
'name': SpecValidator(type='string', default='primary1'),
'public_ip': SpecValidator(type='ipv4', default=None),
'private_ip': SpecValidator(type='ipv4', default=None),
},
'pem_server_1': {
'name': SpecValidator(type='string', default='pem1'),
'public_ip': SpecValidator(type='ipv4', default=None),
'private_ip': SpecValidator(type='ipv4', default=None),
},
'backup_server_1': {
'name': SpecValidator(type='string', default='barman1'),
'public_ip': SpecValidator(type='ipv4', default=None),
'private_ip': SpecValidator(type='ipv4', default=None),
}
},
'EDB-RA-2': {
'ssh_user': SpecValidator(type='string', default=None),
'pg_data': SpecValidator(type='string', default=None),
'pg_wal': SpecValidator(type='string', default=None),
'postgres_server_1': {
'name': SpecValidator(type='string', default='primary1'),
'public_ip': SpecValidator(type='ipv4', default=None),
'private_ip': SpecValidator(type='ipv4', default=None),
},
'postgres_server_2': {
'name': SpecValidator(type='string', default='primary2'),
'public_ip': SpecValidator(type='ipv4', default=None),
'private_ip': SpecValidator(type='ipv4', default=None),
},
'postgres_server_3': {
'name': SpecValidator(type='string', default='primary3'),
'public_ip': SpecValidator(type='ipv4', default=None),
'private_ip': SpecValidator(type='ipv4', default=None),
},
'pem_server_1': {
'name': SpecValidator(type='string', default='pem1'),
'public_ip': SpecValidator(type='ipv4', default=None),
'private_ip': SpecValidator(type='ipv4', default=None),
},
'backup_server_1': {
'name': SpecValidator(type='string', default='barman1'),
'public_ip': SpecValidator(type='ipv4', default=None),
'private_ip': SpecValidator(type='ipv4', default=None),
}
},
'EDB-RA-3': {
'ssh_user': SpecValidator(type='string', default=None),
'pg_data': SpecValidator(type='string', default=None),
'pg_wal': SpecValidator(type='string', default=None),
'postgres_server_1': {
'name': SpecValidator(type='string', default='primary1'),
'public_ip': SpecValidator(type='ipv4', default=None),
'private_ip': SpecValidator(type='ipv4', default=None),
},
'postgres_server_2': {
'name': SpecValidator(type='string', default='primary2'),
'public_ip': SpecValidator(type='ipv4', default=None),
'private_ip': SpecValidator(type='ipv4', default=None),
},
'postgres_server_3': {
'name': SpecValidator(type='string', default='primary3'),
'public_ip': SpecValidator(type='ipv4', default=None),
'private_ip': SpecValidator(type='ipv4', default=None),
},
'pooler_server_1': {
'name': SpecValidator(type='string', default='pooler1'),
'public_ip': SpecValidator(type='ipv4', default=None),
'private_ip': SpecValidator(type='ipv4', default=None),
},
'pooler_server_2': {
'name': SpecValidator(type='string', default='pooler2'),
'public_ip': SpecValidator(type='ipv4', default=None),
'private_ip': SpecValidator(type='ipv4', default=None),
},
'pooler_server_3': {
'name': SpecValidator(type='string', default='pooler3'),
'public_ip': SpecValidator(type='ipv4', default=None),
'private_ip': SpecValidator(type='ipv4', default=None),
},
'pem_server_1': {
'name': SpecValidator(type='string', default='pem1'),
'public_ip': SpecValidator(type='ipv4', default=None),
'private_ip': SpecValidator(type='ipv4', default=None),
},
'backup_server_1': {
'name': SpecValidator(type='string', default='barman1'),
'public_ip': SpecValidator(type='ipv4', default=None),
'private_ip': SpecValidator(type='ipv4', default=None),
}
}
}
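The `VMWareSpec` dict above is pure data; the validation itself happens elsewhere in edbdeploy via `SpecValidator`. As a rough illustration only, the sketch below uses a stand-in `SpecValidator` (assumed here to simply record its keyword arguments, unlike the real class) and a hypothetical walker that lists the default server names in one reference-architecture entry.

```python
# Stand-in for edbdeploy's SpecValidator: assumed to just record its
# keyword arguments.  The real class presumably also enforces the type.
class SpecValidator:
    def __init__(self, type=None, default=None):
        self.type = type
        self.default = default


def default_server_names(spec):
    # Collect the default 'name' of every server sub-dict in a single
    # architecture entry shaped like VMWareSpec['EDB-RA-1'].
    return [
        sub['name'].default
        for sub in spec.values()
        if isinstance(sub, dict) and 'name' in sub
    ]


# Abbreviated copy of the EDB-RA-1 entry for demonstration purposes.
edb_ra_1 = {
    'ssh_user': SpecValidator(type='string', default=None),
    'postgres_server_1': {
        'name': SpecValidator(type='string', default='primary1'),
        'public_ip': SpecValidator(type='ipv4', default=None),
    },
    'pem_server_1': {
        'name': SpecValidator(type='string', default='pem1'),
    },
    'backup_server_1': {
        'name': SpecValidator(type='string', default='barman1'),
    },
}

print(default_server_names(edb_ra_1))  # ['primary1', 'pem1', 'barman1']
```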
| 45.91 | 69 | 0.576563 | 444 | 4,591 | 5.797297 | 0.081081 | 0.376457 | 0.236208 | 0.285936 | 0.975913 | 0.975913 | 0.975913 | 0.93512 | 0.93512 | 0.93512 | 0 | 0.019556 | 0.253757 | 4,591 | 99 | 70 | 46.373737 | 0.731757 | 0 | 0 | 0.683673 | 0 | 0 | 0.236985 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.010204 | 0 | 0.010204 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
bc484eba1b422270e37ac1ce2c38816c12afedd4 | 98,486 | py | Python | leetcode/medium/greedy/132_gas_station.py | phantomnat/python-learning | addc7ba5fc4fb8920cdd2891d4b2e79efd1a524a | [
"MIT"
] | null | null | null | leetcode/medium/greedy/132_gas_station.py | phantomnat/python-learning | addc7ba5fc4fb8920cdd2891d4b2e79efd1a524a | [
"MIT"
] | null | null | null | leetcode/medium/greedy/132_gas_station.py | phantomnat/python-learning | addc7ba5fc4fb8920cdd2891d4b2e79efd1a524a | [
"MIT"
] | null | null | null | class Solution:
def canCompleteCircuit(self, gas, cost) -> int:
n = len(gas)
l, r = 0, n-1
total = gas[r]-cost[r]
while l < r:
if total >= 0:
total += gas[l]-cost[l]
l += 1
else:
r -= 1
total += gas[r]-cost[r]
return r if total >= 0 else -1
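As a cross-check (my own addition, not part of the original file), a brute-force O(n²) simulation tries every start station and agrees with the greedy answer on the sample inputs exercised below; the function name is hypothetical.

```python
def can_complete_circuit_bruteforce(gas, cost):
    # O(n^2) reference: simulate a full lap from every start station and
    # return the first start whose tank never goes negative, else -1.
    n = len(gas)
    for start in range(n):
        tank = 0
        for step in range(n):
            i = (start + step) % n
            tank += gas[i] - cost[i]
            if tank < 0:
                break
        else:
            return start
    return -1


print(can_complete_circuit_bruteforce([5, 1, 2, 3, 4], [4, 4, 1, 5, 1]))  # 4
print(can_complete_circuit_bruteforce([2, 3, 4], [3, 4, 3]))              # -1
print(can_complete_circuit_bruteforce([2], [2]))                          # 0
```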
if __name__ == '__main__':
s = Solution()
# print(s.canCompleteCircuit(
# [1,2,3,4,5],
# [3,4,5,1,2]))
print(s.canCompleteCircuit(
[5,1,2,3,4],[4,4,1,5,1]))
print(s.canCompleteCircuit( [2,3,4], [3,4,3]))
print(s.canCompleteCircuit([2],[2]))
print(s.canCompleteCircuit([3897,3898,3899,3900,3901,3902,3903,3904,3905,3906,3907,3908,3909,3910,3911,3912,3913,3914,3915,3916,3917,3918,3919,3920,3921,3922,3923,3924,3925,3926,3927,3928,3929,3930,3931,3932,3933,3934,3935,3936,3937,3938,3939,3940,3941,3942,3943,3944,3945,3946,3947,3948,3949,3950,3951,3952,3953,3954,3955,3956,3957,3958,3959,3960,3961,3962,3963,3964,3965,3966,3967,3968,3969,3970,3971,3972,3973,3974,3975,3976,3977,3978,3979,3980,3981,3982,3983,3984,3985,3986,3987,3988,3989,3990,3991,3992,3993,3994,3995,3996,3997,3998,3999,4000,4001,4002,4003,4004,4005,4006,4007,4008,4009,4010,4011,4012,4013,4014,4015,4016,4017,4018,4019,4020,4021,4022,4023,4024,4025,4026,4027,4028,4029,4030,4031,4032,4033,4034,4035,4036,4037,4038,4039,4040,4041,4042,4043,4044,4045,4046,4047,4048,4049,4050,4051,4052,4053,4054,4055,4056,4057,4058,4059,4060,4061,4062,4063,4064,4065,4066,4067,4068,4069,4070,4071,4072,4073,4074,4075,4076,4077,4078,4079,4080,4081,4082,4083,4084,4085,4086,4087,4088,4089,4090,4091,4092,4093,4094,4095,4096,4097,4098,4099,4100,4101,4102,4103,4104,4105,4106,4107,4108,4109,4110,4111,4112,4113,4114,4115,4116,4117,4118,4119,4120,4121,4122,4123,4124,4125,4126,4127,4128,4129,4130,4131,4132,4133,4134,4135,4136,4137,4138,4139,4140,4141,4142,4143,4144,4145,4146,4147,4148,4149,4150,4151,4152,4153,4154,4155,4156,4157,4158,4159,4160,4161,4162,4163,4164,4165,4166,4167,4168,4169,4170,4171,4172,4173,4174,4175,4176,4177,4178,4179,4180,4181,4182,4183,4184,4185,4186,4187,4188,4189,4190,4191,4192,4193,4194,4195,4196,4197,4198,4199,4200,4201,4202,4203,4204,4205,4206,4207,4208,4209,4210,4211,4212,4213,4214,4215,4216,4217,4218,4219,4220,4221,4222,4223,4224,4225,4226,4227,4228,4229,4230,4231,4232,4233,4234,4235,4236,4237,4238,4239,4240,4241,4242,4243,4244,4245,4246,4247,4248,4249,4250,4251,4252,4253,4254,4255,4256,4257,4258,4259,4260,4261,4262,4263,4264,4265,4266,4267,4268,4269,4270,4271,4272,4273,4274,4275,4276,4277,4278,4279,4280,4281,4282,4283,4284,4285,4286,4287,4288,4289,4290,42
91,4292,4293,4294,4295,4296,4297,4298,4299,4300,4301,4302,4303,4304,4305,4306,4307,4308,4309,4310,4311,4312,4313,4314,4315,4316,4317,4318,4319,4320,4321,4322,4323,4324,4325,4326,4327,4328,4329,4330,4331,4332,4333,4334,4335,4336,4337,4338,4339,4340,4341,4342,4343,4344,4345,4346,4347,4348,4349,4350,4351,4352,4353,4354,4355,4356,4357,4358,4359,4360,4361,4362,4363,4364,4365,4366,4367,4368,4369,4370,4371,4372,4373,4374,4375,4376,4377,4378,4379,4380,4381,4382,4383,4384,4385,4386,4387,4388,4389,4390,4391,4392,4393,4394,4395,4396,4397,4398,4399,4400,4401,4402,4403,4404,4405,4406,4407,4408,4409,4410,4411,4412,4413,4414,4415,4416,4417,4418,4419,4420,4421,4422,4423,4424,4425,4426,4427,4428,4429,4430,4431,4432,4433,4434,4435,4436,4437,4438,4439,4440,4441,4442,4443,4444,4445,4446,4447,4448,4449,4450,4451,4452,4453,4454,4455,4456,4457,4458,4459,4460,4461,4462,4463,4464,4465,4466,4467,4468,4469,4470,4471,4472,4473,4474,4475,4476,4477,4478,4479,4480,4481,4482,4483,4484,4485,4486,4487,4488,4489,4490,4491,4492,4493,4494,4495,4496,4497,4498,4499,4500,4501,4502,4503,4504,4505,4506,4507,4508,4509,4510,4511,4512,4513,4514,4515,4516,4517,4518,4519,4520,4521,4522,4523,4524,4525,4526,4527,4528,4529,4530,4531,4532,4533,4534,4535,4536,4537,4538,4539,4540,4541,4542,4543,4544,4545,4546,4547,4548,4549,4550,4551,4552,4553,4554,4555,4556,4557,4558,4559,4560,4561,4562,4563,4564,4565,4566,4567,4568,4569,4570,4571,4572,4573,4574,4575,4576,4577,4578,4579,4580,4581,4582,4583,4584,4585,4586,4587,4588,4589,4590,4591,4592,4593,4594,4595,4596,4597,4598,4599,4600,4601,4602,4603,4604,4605,4606,4607,4608,4609,4610,4611,4612,4613,4614,4615,4616,4617,4618,4619,4620,4621,4622,4623,4624,4625,4626,4627,4628,4629,4630,4631,4632,4633,4634,4635,4636,4637,4638,4639,4640,4641,4642,4643,4644,4645,4646,4647,4648,4649,4650,4651,4652,4653,4654,4655,4656,4657,4658,4659,4660,4661,4662,4663,4664,4665,4666,4667,4668,4669,4670,4671,4672,4673,4674,4675,4676,4677,4678,4679,4680,4681,4682,4683,4684,4685,4686,4687,4688,4689,4690,46
91,4692,4693,4694,4695,4696,4697,4698,4699,4700,4701,4702,4703,4704,4705,4706,4707,4708,4709,4710,4711,4712,4713,4714,4715,4716,4717,4718,4719,4720,4721,4722,4723,4724,4725,4726,4727,4728,4729,4730,4731,4732,4733,4734,4735,4736,4737,4738,4739,4740,4741,4742,4743,4744,4745,4746,4747,4748,4749,4750,4751,4752,4753,4754,4755,4756,4757,4758,4759,4760,4761,4762,4763,4764,4765,4766,4767,4768,4769,4770,4771,4772,4773,4774,4775,4776,4777,4778,4779,4780,4781,4782,4783,4784,4785,4786,4787,4788,4789,4790,4791,4792,4793,4794,4795,4796,4797,4798,4799,4800,4801,4802,4803,4804,4805,4806,4807,4808,4809,4810,4811,4812,4813,4814,4815,4816,4817,4818,4819,4820,4821,4822,4823,4824,4825,4826,4827,4828,4829,4830,4831,4832,4833,4834,4835,4836,4837,4838,4839,4840,4841,4842,4843,4844,4845,4846,4847,4848,4849,4850,4851,4852,4853,4854,4855,4856,4857,4858,4859,4860,4861,4862,4863,4864,4865,4866,4867,4868,4869,4870,4871,4872,4873,4874,4875,4876,4877,4878,4879,4880,4881,4882,4883,4884,4885,4886,4887,4888,4889,4890,4891,4892,4893,4894,4895,4896,4897,4898,4899,4900,4901,4902,4903,4904,4905,4906,4907,4908,4909,4910,4911,4912,4913,4914,4915,4916,4917,4918,4919,4920,4921,4922,4923,4924,4925,4926,4927,4928,4929,4930,4931,4932,4933,4934,4935,4936,4937,4938,4939,4940,4941,4942,4943,4944,4945,4946,4947,4948,4949,4950,4951,4952,4953,4954,4955,4956,4957,4958,4959,4960,4961,4962,4963,4964,4965,4966,4967,4968,4969,4970,4971,4972,4973,4974,4975,4976,4977,4978,4979,4980,4981,4982,4983,4984,4985,4986,4987,4988,4989,4990,4991,4992,4993,4994,4995,4996,4997,4998,4999,5000,5001,5002,5003,5004,5005,5006,5007,5008,5009,5010,5011,5012,5013,5014,5015,5016,5017,5018,5019,5020,5021,5022,5023,5024,5025,5026,5027,5028,5029,5030,5031,5032,5033,5034,5035,5036,5037,5038,5039,5040,5041,5042,5043,5044,5045,5046,5047,5048,5049,5050,5051,5052,5053,5054,5055,5056,5057,5058,5059,5060,5061,5062,5063,5064,5065,5066,5067,5068,5069,5070,5071,5072,5073,5074,5075,5076,5077,5078,5079,5080,5081,5082,5083,5084,5085,5086,5087,5088,5089,5090,50
91,5092,5093,5094,5095,5096,5097,5098,5099,5100,5101,5102,5103,5104,5105,5106,5107,5108,5109,5110,5111,5112,5113,5114,5115,5116,5117,5118,5119,5120,5121,5122,5123,5124,5125,5126,5127,5128,5129,5130,5131,5132,5133,5134,5135,5136,5137,5138,5139,5140,5141,5142,5143,5144,5145,5146,5147,5148,5149,5150,5151,5152,5153,5154,5155,5156,5157,5158,5159,5160,5161,5162,5163,5164,5165,5166,5167,5168,5169,5170,5171,5172,5173,5174,5175,5176,5177,5178,5179,5180,5181,5182,5183,5184,5185,5186,5187,5188,5189,5190,5191,5192,5193,5194,5195,5196,5197,5198,5199,5200,5201,5202,5203,5204,5205,5206,5207,5208,5209,5210,5211,5212,5213,5214,5215,5216,5217,5218,5219,5220,5221,5222,5223,5224,5225,5226,5227,5228,5229,5230,5231,5232,5233,5234,5235,5236,5237,5238,5239,5240,5241,5242,5243,5244,5245,5246,5247,5248,5249,5250,5251,5252,5253,5254,5255,5256,5257,5258,5259,5260,5261,5262,5263,5264,5265,5266,5267,5268,5269,5270,5271,5272,5273,5274,5275,5276,5277,5278,5279,5280,5281,5282,5283,5284,5285,5286,5287,5288,5289,5290,5291,5292,5293,5294,5295,5296,5297,5298,5299,5300,5301,5302,5303,5304,5305,5306,5307,5308,5309,5310,5311,5312,5313,5314,5315,5316,5317,5318,5319,5320,5321,5322,5323,5324,5325,5326,5327,5328,5329,5330,5331,5332,5333,5334,5335,5336,5337,5338,5339,5340,5341,5342,5343,5344,5345,5346,5347,5348,5349,5350,5351,5352,5353,5354,5355,5356,5357,5358,5359,5360,5361,5362,5363,5364,5365,5366,5367,5368,5369,5370,5371,5372,5373,5374,5375,5376,5377,5378,5379,5380,5381,5382,5383,5384,5385,5386,5387,5388,5389,5390,5391,5392,5393,5394,5395,5396,5397,5398,5399,5400,5401,5402,5403,5404,5405,5406,5407,5408,5409,5410,5411,5412,5413,5414,5415,5416,5417,5418,5419,5420,5421,5422,5423,5424,5425,5426,5427,5428,5429,5430,5431,5432,5433,5434,5435,5436,5437,5438,5439,5440,5441,5442,5443,5444,5445,5446,5447,5448,5449,5450,5451,5452,5453,5454,5455,5456,5457,5458,5459,5460,5461,5462,5463,5464,5465,5466,5467,5468,5469,5470,5471,5472,5473,5474,5475,5476,5477,5478,5479,5480,5481,5482,5483,5484,5485,5486,5487,5488,5489,5490,54
91,5492,5493,5494,5495,5496,5497,5498,5499,5500,5501,5502,5503,5504,5505,5506,5507,5508,5509,5510,5511,5512,5513,5514,5515,5516,5517,5518,5519,5520,5521,5522,5523,5524,5525,5526,5527,5528,5529,5530,5531,5532,5533,5534,5535,5536,5537,5538,5539,5540,5541,5542,5543,5544,5545,5546,5547,5548,5549,5550,5551,5552,5553,5554,5555,5556,5557,5558,5559,5560,5561,5562,5563,5564,5565,5566,5567,5568,5569,5570,5571,5572,5573,5574,5575,5576,5577,5578,5579,5580,5581,5582,5583,5584,5585,5586,5587,5588,5589,5590,5591,5592,5593,5594,5595,5596,5597,5598,5599,5600,5601,5602,5603,5604,5605,5606,5607,5608,5609,5610,5611,5612,5613,5614,5615,5616,5617,5618,5619,5620,5621,5622,5623,5624,5625,5626,5627,5628,5629,5630,5631,5632,5633,5634,5635,5636,5637,5638,5639,5640,5641,5642,5643,5644,5645,5646,5647,5648,5649,5650,5651,5652,5653,5654,5655,5656,5657,5658,5659,5660,5661,5662,5663,5664,5665,5666,5667,5668,5669,5670,5671,5672,5673,5674,5675,5676,5677,5678,5679,5680,5681,5682,5683,5684,5685,5686,5687,5688,5689,5690,5691,5692,5693,5694,5695,5696,5697,5698,5699,5700,5701,5702,5703,5704,5705,5706,5707,5708,5709,5710,5711,5712,5713,5714,5715,5716,5717,5718,5719,5720,5721,5722,5723,5724,5725,5726,5727,5728,5729,5730,5731,5732,5733,5734,5735,5736,5737,5738,5739,5740,5741,5742,5743,5744,5745,5746,5747,5748,5749,5750,5751,5752,5753,5754,5755,5756,5757,5758,5759,5760,5761,5762,5763,5764,5765,5766,5767,5768,5769,5770,5771,5772,5773,5774,5775,5776,5777,5778,5779,5780,5781,5782,5783,5784,5785,5786,5787,5788,5789,5790,5791,5792,5793,5794,5795,5796,5797,5798,5799,5800,5801,5802,5803,5804,5805,5806,5807,5808,5809,5810,5811,5812,5813,5814,5815,5816,5817,5818,5819,5820,5821,5822,5823,5824,5825,5826,5827,5828,5829,5830,5831,5832,5833,5834,5835,5836,5837,5838,5839,5840,5841,5842,5843,5844,5845,5846,5847,5848,5849,5850,5851,5852,5853,5854,5855,5856,5857,5858,5859,5860,5861,5862,5863,5864,5865,5866,5867,5868,5869,5870,5871,5872,5873,5874,5875,5876,5877,5878,5879,5880,5881,5882,5883,5884,5885,5886,5887,5888,5889,5890,58
91,5892,5893,5894,5895,5896,5897,5898,5899,5900,5901,5902,5903,5904,5905,5906,5907,5908,5909,5910,5911,5912,5913,5914,5915,5916,5917,5918,5919,5920,5921,5922,5923,5924,5925,5926,5927,5928,5929,5930,5931,5932,5933,5934,5935,5936,5937,5938,5939,5940,5941,5942,5943,5944,5945,5946,5947,5948,5949,5950,5951,5952,5953,5954,5955,5956,5957,5958,5959,5960,5961,5962,5963,5964,5965,5966,5967,5968,5969,5970,5971,5972,5973,5974,5975,5976,5977,5978,5979,5980,5981,5982,5983,5984,5985,5986,5987,5988,5989,5990,5991,5992,5993,5994,5995,5996,5997,5998,5999,6000,6001,6002,6003,6004,6005,6006,6007,6008,6009,6010,6011,6012,6013,6014,6015,6016,6017,6018,6019,6020,6021,6022,6023,6024,6025,6026,6027,6028,6029,6030,6031,6032,6033,6034,6035,6036,6037,6038,6039,6040,6041,6042,6043,6044,6045,6046,6047,6048,6049,6050,6051,6052,6053,6054,6055,6056,6057,6058,6059,6060,6061,6062,6063,6064,6065,6066,6067,6068,6069,6070,6071,6072,6073,6074,6075,6076,6077,6078,6079,6080,6081,6082,6083,6084,6085,6086,6087,6088,6089,6090,6091,6092,6093,6094,6095,6096,6097,6098,6099,6100,6101,6102,6103,6104,6105,6106,6107,6108,6109,6110,6111,6112,6113,6114,6115,6116,6117,6118,6119,6120,6121,6122,6123,6124,6125,6126,6127,6128,6129,6130,6131,6132,6133,6134,6135,6136,6137,6138,6139,6140,6141,6142,6143,6144,6145,6146,6147,6148,6149,6150,6151,6152,6153,6154,6155,6156,6157,6158,6159,6160,6161,6162,6163,6164,6165,6166,6167,6168,6169,6170,6171,6172,6173,6174,6175,6176,6177,6178,6179,6180,6181,6182,6183,6184,6185,6186,6187,6188,6189,6190,6191,6192,6193,6194,6195,6196,6197,6198,6199,6200,6201,6202,6203,6204,6205,6206,6207,6208,6209,6210,6211,6212,6213,6214,6215,6216,6217,6218,6219,6220,6221,6222,6223,6224,6225,6226,6227,6228,6229,6230,6231,6232,6233,6234,6235,6236,6237,6238,6239,6240,6241,6242,6243,6244,6245,6246,6247,6248,6249,6250,6251,6252,6253,6254,6255,6256,6257,6258,6259,6260,6261,6262,6263,6264,6265,6266,6267,6268,6269,6270,6271,6272,6273,6274,6275,6276,6277,6278,6279,6280,6281,6282,6283,6284,6285,6286,6287,6288,6289,6290,62
91,6292,6293,6294,6295,6296,6297,6298,6299,6300,6301,6302,6303,6304,6305,6306,6307,6308,6309,6310,6311,6312,6313,6314,6315,6316,6317,6318,6319,6320,6321,6322,6323,6324,6325,6326,6327,6328,6329,6330,6331,6332,6333,6334,6335,6336,6337,6338,6339,6340,6341,6342,6343,6344,6345,6346,6347,6348,6349,6350,6351,6352,6353,6354,6355,6356,6357,6358,6359,6360,6361,6362,6363,6364,6365,6366,6367,6368,6369,6370,6371,6372,6373,6374,6375,6376,6377,6378,6379,6380,6381,6382,6383,6384,6385,6386,6387,6388,6389,6390,6391,6392,6393,6394,6395,6396,6397,6398,6399,6400,6401,6402,6403,6404,6405,6406,6407,6408,6409,6410,6411,6412,6413,6414,6415,6416,6417,6418,6419,6420,6421,6422,6423,6424,6425,6426,6427,6428,6429,6430,6431,6432,6433,6434,6435,6436,6437,6438,6439,6440,6441,6442,6443,6444,6445,6446,6447,6448,6449,6450,6451,6452,6453,6454,6455,6456,6457,6458,6459,6460,6461,6462,6463,6464,6465,6466,6467,6468,6469,6470,6471,6472,6473,6474,6475,6476,6477,6478,6479,6480,6481,6482,6483,6484,6485,6486,6487,6488,6489,6490,6491,6492,6493,6494,6495,6496,6497,6498,6499,6500,6501,6502,6503,6504,6505,6506,6507,6508,6509,6510,6511,6512,6513,6514,6515,6516,6517,6518,6519,6520,6521,6522,6523,6524,6525,6526,6527,6528,6529,6530,6531,6532,6533,6534,6535,6536,6537,6538,6539,6540,6541,6542,6543,6544,6545,6546,6547,6548,6549,6550,6551,6552,6553,6554,6555,6556,6557,6558,6559,6560,6561,6562,6563,6564,6565,6566,6567,6568,6569,6570,6571,6572,6573,6574,6575,6576,6577,6578,6579,6580,6581,6582,6583,6584,6585,6586,6587,6588,6589,6590,6591,6592,6593,6594,6595,6596,6597,6598,6599,6600,6601,6602,6603,6604,6605,6606,6607,6608,6609,6610,6611,6612,6613,6614,6615,6616,6617,6618,6619,6620,6621,6622,6623,6624,6625,6626,6627,6628,6629,6630,6631,6632,6633,6634,6635,6636,6637,6638,6639,6640,6641,6642,6643,6644,6645,6646,6647,6648,6649,6650,6651,6652,6653,6654,6655,6656,6657,6658,6659,6660,6661,6662,6663,6664,6665,6666,6667,6668,6669,6670,6671,6672,6673,6674,6675,6676,6677,6678,6679,6680,6681,6682,6683,6684,6685,6686,6687,6688,6689,6690,66
91,6692,6693,6694,6695,6696,6697,6698,6699,6700,6701,6702,6703,6704,6705,6706,6707,6708,6709,6710,6711,6712,6713,6714,6715,6716,6717,6718,6719,6720,6721,6722,6723,6724,6725,6726,6727,6728,6729,6730,6731,6732,6733,6734,6735,6736,6737,6738,6739,6740,6741,6742,6743,6744,6745,6746,6747,6748,6749,6750,6751,6752,6753,6754,6755,6756,6757,6758,6759,6760,6761,6762,6763,6764,6765,6766,6767,6768,6769,6770,6771,6772,6773,6774,6775,6776,6777,6778,6779,6780,6781,6782,6783,6784,6785,6786,6787,6788,6789,6790,6791,6792,6793,6794,6795,6796,6797,6798,6799,6800,6801,6802,6803,6804,6805,6806,6807,6808,6809,6810,6811,6812,6813,6814,6815,6816,6817,6818,6819,6820,6821,6822,6823,6824,6825,6826,6827,6828,6829,6830,6831,6832,6833,6834,6835,6836,6837,6838,6839,6840,6841,6842,6843,6844,6845,6846,6847,6848,6849,6850,6851,6852,6853,6854,6855,6856,6857,6858,6859,6860,6861,6862,6863,6864,6865,6866,6867,6868,6869,6870,6871,6872,6873,6874,6875,6876,6877,6878,6879,6880,6881,6882,6883,6884,6885,6886,6887,6888,6889,6890,6891,6892,6893,6894,6895,6896,6897,6898,6899,6900,6901,6902,6903,6904,6905,6906,6907,6908,6909,6910,6911,6912,6913,6914,6915,6916,6917,6918,6919,6920,6921,6922,6923,6924,6925,6926,6927,6928,6929,6930,6931,6932,6933,6934,6935,6936,6937,6938,6939,6940,6941,6942,6943,6944,6945,6946,6947,6948,6949,6950,6951,6952,6953,6954,6955,6956,6957,6958,6959,6960,6961,6962,6963,6964,6965,6966,6967,6968,6969,6970,6971,6972,6973,6974,6975,6976,6977,6978,6979,6980,6981,6982,6983,6984,6985,6986,6987,6988,6989,6990,6991,6992,6993,6994,6995,6996,6997,6998,6999,7000,7001,7002,7003,7004,7005,7006,7007,7008,7009,7010,7011,7012,7013,7014,7015,7016,7017,7018,7019,7020,7021,7022,7023,7024,7025,7026,7027,7028,7029,7030,7031,7032,7033,7034,7035,7036,7037,7038,7039,7040,7041,7042,7043,7044,7045,7046,7047,7048,7049,7050,7051,7052,7053,7054,7055,7056,7057,7058,7059,7060,7061,7062,7063,7064,7065,7066,7067,7068,7069,7070,7071,7072,7073,7074,7075,7076,7077,7078,7079,7080,7081,7082,7083,7084,7085,7086,7087,7088,7089,7090,70
91,7092,7093,7094,7095,7096,7097,7098,7099,7100,7101,7102,7103,7104,7105,7106,7107,7108,7109,7110,7111,7112,7113,7114,7115,7116,7117,7118,7119,7120,7121,7122,7123,7124,7125,7126,7127,7128,7129,7130,7131,7132,7133,7134,7135,7136,7137,7138,7139,7140,7141,7142,7143,7144,7145,7146,7147,7148,7149,7150,7151,7152,7153,7154,7155,7156,7157,7158,7159,7160,7161,7162,7163,7164,7165,7166,7167,7168,7169,7170,7171,7172,7173,7174,7175,7176,7177,7178,7179,7180,7181,7182,7183,7184,7185,7186,7187,7188,7189,7190,7191,7192,7193,7194,7195,7196,7197,7198,7199,7200,7201,7202,7203,7204,7205,7206,7207,7208,7209,7210,7211,7212,7213,7214,7215,7216,7217,7218,7219,7220,7221,7222,7223,7224,7225,7226,7227,7228,7229,7230,7231,7232,7233,7234,7235,7236,7237,7238,7239,7240,7241,7242,7243,7244,7245,7246,7247,7248,7249,7250,7251,7252,7253,7254,7255,7256,7257,7258,7259,7260,7261,7262,7263,7264,7265,7266,7267,7268,7269,7270,7271,7272,7273,7274,7275,7276,7277,7278,7279,7280,7281,7282,7283,7284,7285,7286,7287,7288,7289,7290,7291,7292,7293,7294,7295,7296,7297,7298,7299,7300,7301,7302,7303,7304,7305,7306,7307,7308,7309,7310,7311,7312,7313,7314,7315,7316,7317,7318,7319,7320,7321,7322,7323,7324,7325,7326,7327,7328,7329,7330,7331,7332,7333,7334,7335,7336,7337,7338,7339,7340,7341,7342,7343,7344,7345,7346,7347,7348,7349,7350,7351,7352,7353,7354,7355,7356,7357,7358,7359,7360,7361,7362,7363,7364,7365,7366,7367,7368,7369,7370,7371,7372,7373,7374,7375,7376,7377,7378,7379,7380,7381,7382,7383,7384,7385,7386,7387,7388,7389,7390,7391,7392,7393,7394,7395,7396,7397,7398,7399,7400,7401,7402,7403,7404,7405,7406,7407,7408,7409,7410,7411,7412,7413,7414,7415,7416,7417,7418,7419,7420,7421,7422,7423,7424,7425,7426,7427,7428,7429,7430,7431,7432,7433,7434,7435,7436,7437,7438,7439,7440,7441,7442,7443,7444,7445,7446,7447,7448,7449,7450,7451,7452,7453,7454,7455,7456,7457,7458,7459,7460,7461,7462,7463,7464,7465,7466,7467,7468,7469,7470,7471,7472,7473,7474,7475,7476,7477,7478,7479,7480,7481,7482,7483,7484,7485,7486,7487,7488,7489,7490,74
91,7492,7493,7494,7495,7496,7497,7498,7499,7500,7501,7502,7503,7504,7505,7506,7507,7508,7509,7510,7511,7512,7513,7514,7515,7516,7517,7518,7519,7520,7521,7522,7523,7524,7525,7526,7527,7528,7529,7530,7531,7532,7533,7534,7535,7536,7537,7538,7539,7540,7541,7542,7543,7544,7545,7546,7547,7548,7549,7550,7551,7552,7553,7554,7555,7556,7557,7558,7559,7560,7561,7562,7563,7564,7565,7566,7567,7568,7569,7570,7571,7572,7573,7574,7575,7576,7577,7578,7579,7580,7581,7582,7583,7584,7585,7586,7587,7588,7589,7590,7591,7592,7593,7594,7595,7596,7597,7598,7599,7600,7601,7602,7603,7604,7605,7606,7607,7608,7609,7610,7611,7612,7613,7614,7615,7616,7617,7618,7619,7620,7621,7622,7623,7624,7625,7626,7627,7628,7629,7630,7631,7632,7633,7634,7635,7636,7637,7638,7639,7640,7641,7642,7643,7644,7645,7646,7647,7648,7649,7650,7651,7652,7653,7654,7655,7656,7657,7658,7659,7660,7661,7662,7663,7664,7665,7666,7667,7668,7669,7670,7671,7672,7673,7674,7675,7676,7677,7678,7679,7680,7681,7682,7683,7684,7685,7686,7687,7688,7689,7690,7691,7692,7693,7694,7695,7696,7697,7698,7699,7700,7701,7702,7703,7704,7705,7706,7707,7708,7709,7710,7711,7712,7713,7714,7715,7716,7717,7718,7719,7720,7721,7722,7723,7724,7725,7726,7727,7728,7729,7730,7731,7732,7733,7734,7735,7736,7737,7738,7739,7740,7741,7742,7743,7744,7745,7746,7747,7748,7749,7750,7751,7752,7753,7754,7755,7756,7757,7758,7759,7760,7761,7762,7763,7764,7765,7766,7767,7768,7769,7770,7771,7772,7773,7774,7775,7776,7777,7778,7779,7780,7781,7782,7783,7784,7785,7786,7787,7788,7789,7790,7791,7792,7793,7794,7795,7796,7797,7798,7799,7800,7801,7802,7803,7804,7805,7806,7807,7808,7809,7810,7811,7812,7813,7814,7815,7816,7817,7818,7819,7820,7821,7822,7823,7824,7825,7826,7827,7828,7829,7830,7831,7832,7833,7834,7835,7836,7837,7838,7839,7840,7841,7842,7843,7844,7845,7846,7847,7848,7849,7850,7851,7852,7853,7854,7855,7856,7857,7858,7859,7860,7861,7862,7863,7864,7865,7866,7867,7868,7869,7870,7871,7872,7873,7874,7875,7876,7877,7878,7879,7880,7881,7882,7883,7884,7885,7886,7887,7888,7889,7890,78
91,7892,7893,7894,7895,7896,7897,7898,7899,7900,7901,7902,7903,7904,7905,7906,7907,7908,7909,7910,7911,7912,7913,7914,7915,7916,7917,7918,7919,7920,7921,7922,7923,7924,7925,7926,7927,7928,7929,7930,7931,7932,7933,7934,7935,7936,7937,7938,7939,7940,7941,7942,7943,7944,7945,7946,7947,7948,7949,7950,7951,7952,7953,7954,7955,7956,7957,7958,7959,7960,7961,7962,7963,7964,7965,7966,7967,7968,7969,7970,7971,7972,7973,7974,7975,7976,7977,7978,7979,7980,7981,7982,7983,7984,7985,7986,7987,7988,7989,7990,7991,7992,7993,7994,7995,7996,7997,7998,7999,8000,8001,8002,8003,8004,8005,8006,8007,8008,8009,8010,8011,8012,8013,8014,8015,8016,8017,8018,8019,8020,8021,8022,8023,8024,8025,8026,8027,8028,8029,8030,8031,8032,8033,8034,8035,8036,8037,8038,8039,8040,8041,8042,8043,8044,8045,8046,8047,8048,8049,8050,8051,8052,8053,8054,8055,8056,8057,8058,8059,8060,8061,8062,8063,8064,8065,8066,8067,8068,8069,8070,8071,8072,8073,8074,8075,8076,8077,8078,8079,8080,8081,8082,8083,8084,8085,8086,8087,8088,8089,8090,8091,8092,8093,8094,8095,8096,8097,8098,8099,8100,8101,8102,8103,8104,8105,8106,8107,8108,8109,8110,8111,8112,8113,8114,8115,8116,8117,8118,8119,8120,8121,8122,8123,8124,8125,8126,8127,8128,8129,8130,8131,8132,8133,8134,8135,8136,8137,8138,8139,8140,8141,8142,8143,8144,8145,8146,8147,8148,8149,8150,8151,8152,8153,8154,8155,8156,8157,8158,8159,8160,8161,8162,8163,8164,8165,8166,8167,8168,8169,8170,8171,8172,8173,8174,8175,8176,8177,8178,8179,8180,8181,8182,8183,8184,8185,8186,8187,8188,8189,8190,8191,8192,8193,8194,8195,8196,8197,8198,8199,8200,8201,8202,8203,8204,8205,8206,8207,8208,8209,8210,8211,8212,8213,8214,8215,8216,8217,8218,8219,8220,8221,8222,8223,8224,8225,8226,8227,8228,8229,8230,8231,8232,8233,8234,8235,8236,8237,8238,8239,8240,8241,8242,8243,8244,8245,8246,8247,8248,8249,8250,8251,8252,8253,8254,8255,8256,8257,8258,8259,8260,8261,8262,8263,8264,8265,8266,8267,8268,8269,8270,8271,8272,8273,8274,8275,8276,8277,8278,8279,8280,8281,8282,8283,8284,8285,8286,8287,8288,8289,8290,82
91,8292,8293,8294,8295,8296,8297,8298,8299,8300,8301,8302,8303,8304,8305,8306,8307,8308,8309,8310,8311,8312,8313,8314,8315,8316,8317,8318,8319,8320,8321,8322,8323,8324,8325,8326,8327,8328,8329,8330,8331,8332,8333,8334,8335,8336,8337,8338,8339,8340,8341,8342,8343,8344,8345,8346,8347,8348,8349,8350,8351,8352,8353,8354,8355,8356,8357,8358,8359,8360,8361,8362,8363,8364,8365,8366,8367,8368,8369,8370,8371,8372,8373,8374,8375,8376,8377,8378,8379,8380,8381,8382,8383,8384,8385,8386,8387,8388,8389,8390,8391,8392,8393,8394,8395,8396,8397,8398,8399,8400,8401,8402,8403,8404,8405,8406,8407,8408,8409,8410,8411,8412,8413,8414,8415,8416,8417,8418,8419,8420,8421,8422,8423,8424,8425,8426,8427,8428,8429,8430,8431,8432,8433,8434,8435,8436,8437,8438,8439,8440,8441,8442,8443,8444,8445,8446,8447,8448,8449,8450,8451,8452,8453,8454,8455,8456,8457,8458,8459,8460,8461,8462,8463,8464,8465,8466,8467,8468,8469,8470,8471,8472,8473,8474,8475,8476,8477,8478,8479,8480,8481,8482,8483,8484,8485,8486,8487,8488,8489,8490,8491,8492,8493,8494,8495,8496,8497,8498,8499,8500,8501,8502,8503,8504,8505,8506,8507,8508,8509,8510,8511,8512,8513,8514,8515,8516,8517,8518,8519,8520,8521,8522,8523,8524,8525,8526,8527,8528,8529,8530,8531,8532,8533,8534,8535,8536,8537,8538,8539,8540,8541,8542,8543,8544,8545,8546,8547,8548,8549,8550,8551,8552,8553,8554,8555,8556,8557,8558,8559,8560,8561,8562,8563,8564,8565,8566,8567,8568,8569,8570,8571,8572,8573,8574,8575,8576,8577,8578,8579,8580,8581,8582,8583,8584,8585,8586,8587,8588,8589,8590,8591,8592,8593,8594,8595,8596,8597,8598,8599,8600,8601,8602,8603,8604,8605,8606,8607,8608,8609,8610,8611,8612,8613,8614,8615,8616,8617,8618,8619,8620,8621,8622,8623,8624,8625,8626,8627,8628,8629,8630,8631,8632,8633,8634,8635,8636,8637,8638,8639,8640,8641,8642,8643,8644,8645,8646,8647,8648,8649,8650,8651,8652,8653,8654,8655,8656,8657,8658,8659,8660,8661,8662,8663,8664,8665,8666,8667,8668,8669,8670,8671,8672,8673,8674,8675,8676,8677,8678,8679,8680,8681,8682,8683,8684,8685,8686,8687,8688,8689,8690,86
91,8692,8693,8694,8695,8696,8697,8698,8699,8700,8701,8702,8703,8704,8705,8706,8707,8708,8709,8710,8711,8712,8713,8714,8715,8716,8717,8718,8719,8720,8721,8722,8723,8724,8725,8726,8727,8728,8729,8730,8731,8732,8733,8734,8735,8736,8737,8738,8739,8740,8741,8742,8743,8744,8745,8746,8747,8748,8749,8750,8751,8752,8753,8754,8755,8756,8757,8758,8759,8760,8761,8762,8763,8764,8765,8766,8767,8768,8769,8770,8771,8772,8773,8774,8775,8776,8777,8778,8779,8780,8781,8782,8783,8784,8785,8786,8787,8788,8789,8790,8791,8792,8793,8794,8795,8796,8797,8798,8799,8800,8801,8802,8803,8804,8805,8806,8807,8808,8809,8810,8811,8812,8813,8814,8815,8816,8817,8818,8819,8820,8821,8822,8823,8824,8825,8826,8827,8828,8829,8830,8831,8832,8833,8834,8835,8836,8837,8838,8839,8840,8841,8842,8843,8844,8845,8846,8847,8848,8849,8850,8851,8852,8853,8854,8855,8856,8857,8858,8859,8860,8861,8862,8863,8864,8865,8866,8867,8868,8869,8870,8871,8872,8873,8874,8875,8876,8877,8878,8879,8880,8881,8882,8883,8884,8885,8886,8887,8888,8889,8890,8891,8892,8893,8894,8895,8896,8897,8898,8899,8900,8901,8902,8903,8904,8905,8906,8907,8908,8909,8910,8911,8912,8913,8914,8915,8916,8917,8918,8919,8920,8921,8922,8923,8924,8925,8926,8927,8928,8929,8930,8931,8932,8933,8934,8935,8936,8937,8938,8939,8940,8941,8942,8943,8944,8945,8946,8947,8948,8949,8950,8951,8952,8953,8954,8955,8956,8957,8958,8959,8960,8961,8962,8963,8964,8965,8966,8967,8968,8969,8970,8971,8972,8973,8974,8975,8976,8977,8978,8979,8980,8981,8982,8983,8984,8985,8986,8987,8988,8989,8990,8991,8992,8993,8994,8995,8996,8997,8998,8999,9000,9001,9002,9003,9004,9005,9006,9007,9008,9009,9010,9011,9012,9013,9014,9015,9016,9017,9018,9019,9020,9021,9022,9023,9024,9025,9026,9027,9028,9029,9030,9031,9032,9033,9034,9035,9036,9037,9038,9039,9040,9041,9042,9043,9044,9045,9046,9047,9048,9049,9050,9051,9052,9053,9054,9055,9056,9057,9058,9059,9060,9061,9062,9063,9064,9065,9066,9067,9068,9069,9070,9071,9072,9073,9074,9075,9076,9077,9078,9079,9080,9081,9082,9083,9084,9085,9086,9087,9088,9089,9090,90
91,9092,9093,9094,9095,9096,9097,9098,9099,9100,9101,9102,9103,9104,9105,9106,9107,9108,9109,9110,9111,9112,9113,9114,9115,9116,9117,9118,9119,9120,9121,9122,9123,9124,9125,9126,9127,9128,9129,9130,9131,9132,9133,9134,9135,9136,9137,9138,9139,9140,9141,9142,9143,9144,9145,9146,9147,9148,9149,9150,9151,9152,9153,9154,9155,9156,9157,9158,9159,9160,9161,9162,9163,9164,9165,9166,9167,9168,9169,9170,9171,9172,9173,9174,9175,9176,9177,9178,9179,9180,9181,9182,9183,9184,9185,9186,9187,9188,9189,9190,9191,9192,9193,9194,9195,9196,9197,9198,9199,9200,9201,9202,9203,9204,9205,9206,9207,9208,9209,9210,9211,9212,9213,9214,9215,9216,9217,9218,9219,9220,9221,9222,9223,9224,9225,9226,9227,9228,9229,9230,9231,9232,9233,9234,9235,9236,9237,9238,9239,9240,9241,9242,9243,9244,9245,9246,9247,9248,9249,9250,9251,9252,9253,9254,9255,9256,9257,9258,9259,9260,9261,9262,9263,9264,9265,9266,9267,9268,9269,9270,9271,9272,9273,9274,9275,9276,9277,9278,9279,9280,9281,9282,9283,9284,9285,9286,9287,9288,9289,9290,9291,9292,9293,9294,9295,9296,9297,9298,9299,9300,9301,9302,9303,9304,9305,9306,9307,9308,9309,9310,9311,9312,9313,9314,9315,9316,9317,9318,9319,9320,9321,9322,9323,9324,9325,9326,9327,9328,9329,9330,9331,9332,9333,9334,9335,9336,9337,9338,9339,9340,9341,9342,9343,9344,9345,9346,9347,9348,9349,9350,9351,9352,9353,9354,9355,9356,9357,9358,9359,9360,9361,9362,9363,9364,9365,9366,9367,9368,9369,9370,9371,9372,9373,9374,9375,9376,9377,9378,9379,9380,9381,9382,9383,9384,9385,9386,9387,9388,9389,9390,9391,9392,9393,9394,9395,9396,9397,9398,9399,9400,9401,9402,9403,9404,9405,9406,9407,9408,9409,9410,9411,9412,9413,9414,9415,9416,9417,9418,9419,9420,9421,9422,9423,9424,9425,9426,9427,9428,9429,9430,9431,9432,9433,9434,9435,9436,9437,9438,9439,9440,9441,9442,9443,9444,9445,9446,9447,9448,9449,9450,9451,9452,9453,9454,9455,9456,9457,9458,9459,9460,9461,9462,9463,9464,9465,9466,9467,9468,9469,9470,9471,9472,9473,9474,9475,9476,9477,9478,9479,9480,9481,9482,9483,9484,9485,9486,9487,9488,9489,9490,94
91,9492,9493,9494,9495,9496,9497,9498,9499,9500,9501,9502,9503,9504,9505,9506,9507,9508,9509,9510,9511,9512,9513,9514,9515,9516,9517,9518,9519,9520,9521,9522,9523,9524,9525,9526,9527,9528,9529,9530,9531,9532,9533,9534,9535,9536,9537,9538,9539,9540,9541,9542,9543,9544,9545,9546,9547,9548,9549,9550,9551,9552,9553,9554,9555,9556,9557,9558,9559,9560,9561,9562,9563,9564,9565,9566,9567,9568,9569,9570,9571,9572,9573,9574,9575,9576,9577,9578,9579,9580,9581,9582,9583,9584,9585,9586,9587,9588,9589,9590,9591,9592,9593,9594,9595,9596,9597,9598,9599,9600,9601,9602,9603,9604,9605,9606,9607,9608,9609,9610,9611,9612,9613,9614,9615,9616,9617,9618,9619,9620,9621,9622,9623,9624,9625,9626,9627,9628,9629,9630,9631,9632,9633,9634,9635,9636,9637,9638,9639,9640,9641,9642,9643,9644,9645,9646,9647,9648,9649,9650,9651,9652,9653,9654,9655,9656,9657,9658,9659,9660,9661,9662,9663,9664,9665,9666,9667,9668,9669,9670,9671,9672,9673,9674,9675,9676,9677,9678,9679,9680,9681,9682,9683,9684,9685,9686,9687,9688,9689,9690,9691,9692,9693,9694,9695,9696,9697,9698,9699,9700,9701,9702,9703,9704,9705,9706,9707,9708,9709,9710,9711,9712,9713,9714,9715,9716,9717,9718,9719,9720,9721,9722,9723,9724,9725,9726,9727,9728,9729,9730,9731,9732,9733,9734,9735,9736,9737,9738,9739,9740,9741,9742,9743,9744,9745,9746,9747,9748,9749,9750,9751,9752,9753,9754,9755,9756,9757,9758,9759,9760,9761,9762,9763,9764,9765,9766,9767,9768,9769,9770,9771,9772,9773,9774,9775,9776,9777,9778,9779,9780,9781,9782,9783,9784,9785,9786,9787,9788,9789,9790,9791,9792,9793,9794,9795,9796,9797,9798,9799,9800,9801,9802,9803,9804,9805,9806,9807,9808,9809,9810,9811,9812,9813,9814,9815,9816,9817,9818,9819,9820,9821,9822,9823,9824,9825,9826,9827,9828,9829,9830,9831,9832,9833,9834,9835,9836,9837,9838,9839,9840,9841,9842,9843,9844,9845,9846,9847,9848,9849,9850,9851,9852,9853,9854,9855,9856,9857,9858,9859,9860,9861,9862,9863,9864,9865,9866,9867,9868,9869,9870,9871,9872,9873,9874,9875,9876,9877,9878,9879,9880,9881,9882,9883,9884,9885,9886,9887,9888,9889,9890,98
91,9892,9893,9894,9895,9896,9897,9898,9899,9900,9901,9902,9903,9904,9905,9906,9907,9908,9909,9910,9911,9912,9913,9914,9915,9916,9917,9918,9919,9920,9921,9922,9923,9924,9925,9926,9927,9928,9929,9930,9931,9932,9933,9934,9935,9936,9937,9938,9939,9940,9941,9942,9943,9944,9945,9946,9947,9948,9949,9950,9951,9952,9953,9954,9955,9956,9957,9958,9959,9960,9961,9962,9963,9964,9965,9966,9967,9968,9969,9970,9971,9972,9973,9974,9975,9976,9977,9978,9979,9980,9981,9982,9983,9984,9985,9986,9987,9988,9989,9990,9991,9992,9993,9994,9995,9996,9997,9998,9999,10000,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99,100,101,102,103,104,105,106,107,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,160,161,162,163,164,165,166,167,168,169,170,171,172,173,174,175,176,177,178,179,180,181,182,183,184,185,186,187,188,189,190,191,192,193,194,195,196,197,198,199,200,201,202,203,204,205,206,207,208,209,210,211,212,213,214,215,216,217,218,219,220,221,222,223,224,225,226,227,228,229,230,231,232,233,234,235,236,237,238,239,240,241,242,243,244,245,246,247,248,249,250,251,252,253,254,255,256,257,258,259,260,261,262,263,264,265,266,267,268,269,270,271,272,273,274,275,276,277,278,279,280,281,282,283,284,285,286,287,288,289,290,291,292,293,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,336,337,338,339,340,341,342,343,344,345,346,347,348,349,350,351,352,353,354,355,356,357,358,359,360,361,362,363,364,365,366,367,368,369,370,371,372,373,374,375,376,377,378,379,380,381,382,383,384,385,386,387,388,389,390
,391,392,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,409,410,411,412,413,414,415,416,417,418,419,420,421,422,423,424,425,426,427,428,429,430,431,432,433,434,435,436,437,438,439,440,441,442,443,444,445,446,447,448,449,450,451,452,453,454,455,456,457,458,459,460,461,462,463,464,465,466,467,468,469,470,471,472,473,474,475,476,477,478,479,480,481,482,483,484,485,486,487,488,489,490,491,492,493,494,495,496,497,498,499,500,501,502,503,504,505,506,507,508,509,510,511,512,513,514,515,516,517,518,519,520,521,522,523,524,525,526,527,528,529,530,531,532,533,534,535,536,537,538,539,540,541,542,543,544,545,546,547,548,549,550,551,552,553,554,555,556,557,558,559,560,561,562,563,564,565,566,567,568,569,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,595,596,597,598,599,600,601,602,603,604,605,606,607,608,609,610,611,612,613,614,615,616,617,618,619,620,621,622,623,624,625,626,627,628,629,630,631,632,633,634,635,636,637,638,639,640,641,642,643,644,645,646,647,648,649,650,651,652,653,654,655,656,657,658,659,660,661,662,663,664,665,666,667,668,669,670,671,672,673,674,675,676,677,678,679,680,681,682,683,684,685,686,687,688,689,690,691,692,693,694,695,696,697,698,699,700,701,702,703,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,731,732,733,734,735,736,737,738,739,740,741,742,743,744,745,746,747,748,749,750,751,752,753,754,755,756,757,758,759,760,761,762,763,764,765,766,767,768,769,770,771,772,773,774,775,776,777,778,779,780,781,782,783,784,785,786,787,788,789,790,791,792,793,794,795,796,797,798,799,800,801,802,803,804,805,806,807,808,809,810,811,812,813,814,815,816,817,818,819,820,821,822,823,824,825,826,827,828,829,830,831,832,833,834,835,836,837,838,839,840,841,842,843,844,845,846,847,848,849,850,851,852,853,854,855,856,857,858,859,860,861,862,863,864,865,866,867,868,869,870,871,872,873,874,875,876,877,878,879,880,881,882,883,884,885,886,887,888,889,890
,891,892,893,894,895,896,897,898,899,900,901,902,903,904,905,906,907,908,909,910,911,912,913,914,915,916,917,918,919,920,921,922,923,924,925,926,927,928,929,930,931,932,933,934,935,936,937,938,939,940,941,942,943,944,945,946,947,948,949,950,951,952,953,954,955,956,957,958,959,960,961,962,963,964,965,966,967,968,969,970,971,972,973,974,975,976,977,978,979,980,981,982,983,984,985,986,987,988,989,990,991,992,993,994,995,996,997,998,999,1000,1001,1002,1003,1004,1005,1006,1007,1008,1009,1010,1011,1012,1013,1014,1015,1016,1017,1018,1019,1020,1021,1022,1023,1024,1025,1026,1027,1028,1029,1030,1031,1032,1033,1034,1035,1036,1037,1038,1039,1040,1041,1042,1043,1044,1045,1046,1047,1048,1049,1050,1051,1052,1053,1054,1055,1056,1057,1058,1059,1060,1061,1062,1063,1064,1065,1066,1067,1068,1069,1070,1071,1072,1073,1074,1075,1076,1077,1078,1079,1080,1081,1082,1083,1084,1085,1086,1087,1088,1089,1090,1091,1092,1093,1094,1095,1096,1097,1098,1099,1100,1101,1102,1103,1104,1105,1106,1107,1108,1109,1110,1111,1112,1113,1114,1115,1116,1117,1118,1119,1120,1121,1122,1123,1124,1125,1126,1127,1128,1129,1130,1131,1132,1133,1134,1135,1136,1137,1138,1139,1140,1141,1142,1143,1144,1145,1146,1147,1148,1149,1150,1151,1152,1153,1154,1155,1156,1157,1158,1159,1160,1161,1162,1163,1164,1165,1166,1167,1168,1169,1170,1171,1172,1173,1174,1175,1176,1177,1178,1179,1180,1181,1182,1183,1184,1185,1186,1187,1188,1189,1190,1191,1192,1193,1194,1195,1196,1197,1198,1199,1200,1201,1202,1203,1204,1205,1206,1207,1208,1209,1210,1211,1212,1213,1214,1215,1216,1217,1218,1219,1220,1221,1222,1223,1224,1225,1226,1227,1228,1229,1230,1231,1232,1233,1234,1235,1236,1237,1238,1239,1240,1241,1242,1243,1244,1245,1246,1247,1248,1249,1250,1251,1252,1253,1254,1255,1256,1257,1258,1259,1260,1261,1262,1263,1264,1265,1266,1267,1268,1269,1270,1271,1272,1273,1274,1275,1276,1277,1278,1279,1280,1281,1282,1283,1284,1285,1286,1287,1288,1289,1290,1291,1292,1293,1294,1295,1296,1297,1298,1299,1300,1301,1302,1303,1304,1305,1306,1307,1308,1309,1310,1311,131
2,1313,1314,1315,1316,1317,1318,1319,1320,1321,1322,1323,1324,1325,1326,1327,1328,1329,1330,1331,1332,1333,1334,1335,1336,1337,1338,1339,1340,1341,1342,1343,1344,1345,1346,1347,1348,1349,1350,1351,1352,1353,1354,1355,1356,1357,1358,1359,1360,1361,1362,1363,1364,1365,1366,1367,1368,1369,1370,1371,1372,1373,1374,1375,1376,1377,1378,1379,1380,1381,1382,1383,1384,1385,1386,1387,1388,1389,1390,1391,1392,1393,1394,1395,1396,1397,1398,1399,1400,1401,1402,1403,1404,1405,1406,1407,1408,1409,1410,1411,1412,1413,1414,1415,1416,1417,1418,1419,1420,1421,1422,1423,1424,1425,1426,1427,1428,1429,1430,1431,1432,1433,1434,1435,1436,1437,1438,1439,1440,1441,1442,1443,1444,1445,1446,1447,1448,1449,1450,1451,1452,1453,1454,1455,1456,1457,1458,1459,1460,1461,1462,1463,1464,1465,1466,1467,1468,1469,1470,1471,1472,1473,1474,1475,1476,1477,1478,1479,1480,1481,1482,1483,1484,1485,1486,1487,1488,1489,1490,1491,1492,1493,1494,1495,1496,1497,1498,1499,1500,1501,1502,1503,1504,1505,1506,1507,1508,1509,1510,1511,1512,1513,1514,1515,1516,1517,1518,1519,1520,1521,1522,1523,1524,1525,1526,1527,1528,1529,1530,1531,1532,1533,1534,1535,1536,1537,1538,1539,1540,1541,1542,1543,1544,1545,1546,1547,1548,1549,1550,1551,1552,1553,1554,1555,1556,1557,1558,1559,1560,1561,1562,1563,1564,1565,1566,1567,1568,1569,1570,1571,1572,1573,1574,1575,1576,1577,1578,1579,1580,1581,1582,1583,1584,1585,1586,1587,1588,1589,1590,1591,1592,1593,1594,1595,1596,1597,1598,1599,1600,1601,1602,1603,1604,1605,1606,1607,1608,1609,1610,1611,1612,1613,1614,1615,1616,1617,1618,1619,1620,1621,1622,1623,1624,1625,1626,1627,1628,1629,1630,1631,1632,1633,1634,1635,1636,1637,1638,1639,1640,1641,1642,1643,1644,1645,1646,1647,1648,1649,1650,1651,1652,1653,1654,1655,1656,1657,1658,1659,1660,1661,1662,1663,1664,1665,1666,1667,1668,1669,1670,1671,1672,1673,1674,1675,1676,1677,1678,1679,1680,1681,1682,1683,1684,1685,1686,1687,1688,1689,1690,1691,1692,1693,1694,1695,1696,1697,1698,1699,1700,1701,1702,1703,1704,1705,1706,1707,1708,1709,1710,1711,171
2,1713,1714,1715,1716,1717,1718,1719,1720,1721,1722,1723,1724,1725,1726,1727,1728,1729,1730,1731,1732,1733,1734,1735,1736,1737,1738,1739,1740,1741,1742,1743,1744,1745,1746,1747,1748,1749,1750,1751,1752,1753,1754,1755,1756,1757,1758,1759,1760,1761,1762,1763,1764,1765,1766,1767,1768,1769,1770,1771,1772,1773,1774,1775,1776,1777,1778,1779,1780,1781,1782,1783,1784,1785,1786,1787,1788,1789,1790,1791,1792,1793,1794,1795,1796,1797,1798,1799,1800,1801,1802,1803,1804,1805,1806,1807,1808,1809,1810,1811,1812,1813,1814,1815,1816,1817,1818,1819,1820,1821,1822,1823,1824,1825,1826,1827,1828,1829,1830,1831,1832,1833,1834,1835,1836,1837,1838,1839,1840,1841,1842,1843,1844,1845,1846,1847,1848,1849,1850,1851,1852,1853,1854,1855,1856,1857,1858,1859,1860,1861,1862,1863,1864,1865,1866,1867,1868,1869,1870,1871,1872,1873,1874,1875,1876,1877,1878,1879,1880,1881,1882,1883,1884,1885,1886,1887,1888,1889,1890,1891,1892,1893,1894,1895,1896,1897,1898,1899,1900,1901,1902,1903,1904,1905,1906,1907,1908,1909,1910,1911,1912,1913,1914,1915,1916,1917,1918,1919,1920,1921,1922,1923,1924,1925,1926,1927,1928,1929,1930,1931,1932,1933,1934,1935,1936,1937,1938,1939,1940,1941,1942,1943,1944,1945,1946,1947,1948,1949,1950,1951,1952,1953,1954,1955,1956,1957,1958,1959,1960,1961,1962,1963,1964,1965,1966,1967,1968,1969,1970,1971,1972,1973,1974,1975,1976,1977,1978,1979,1980,1981,1982,1983,1984,1985,1986,1987,1988,1989,1990,1991,1992,1993,1994,1995,1996,1997,1998,1999,2000,2001,2002,2003,2004,2005,2006,2007,2008,2009,2010,2011,2012,2013,2014,2015,2016,2017,2018,2019,2020,2021,2022,2023,2024,2025,2026,2027,2028,2029,2030,2031,2032,2033,2034,2035,2036,2037,2038,2039,2040,2041,2042,2043,2044,2045,2046,2047,2048,2049,2050,2051,2052,2053,2054,2055,2056,2057,2058,2059,2060,2061,2062,2063,2064,2065,2066,2067,2068,2069,2070,2071,2072,2073,2074,2075,2076,2077,2078,2079,2080,2081,2082,2083,2084,2085,2086,2087,2088,2089,2090,2091,2092,2093,2094,2095,2096,2097,2098,2099,2100,2101,2102,2103,2104,2105,2106,2107,2108,2109,2110,2111,211
2,2113,2114,2115,2116,2117,2118,2119,2120,2121,2122,2123,2124,2125,2126,2127,2128,2129,2130,2131,2132,2133,2134,2135,2136,2137,2138,2139,2140,2141,2142,2143,2144,2145,2146,2147,2148,2149,2150,2151,2152,2153,2154,2155,2156,2157,2158,2159,2160,2161,2162,2163,2164,2165,2166,2167,2168,2169,2170,2171,2172,2173,2174,2175,2176,2177,2178,2179,2180,2181,2182,2183,2184,2185,2186,2187,2188,2189,2190,2191,2192,2193,2194,2195,2196,2197,2198,2199,2200,2201,2202,2203,2204,2205,2206,2207,2208,2209,2210,2211,2212,2213,2214,2215,2216,2217,2218,2219,2220,2221,2222,2223,2224,2225,2226,2227,2228,2229,2230,2231,2232,2233,2234,2235,2236,2237,2238,2239,2240,2241,2242,2243,2244,2245,2246,2247,2248,2249,2250,2251,2252,2253,2254,2255,2256,2257,2258,2259,2260,2261,2262,2263,2264,2265,2266,2267,2268,2269,2270,2271,2272,2273,2274,2275,2276,2277,2278,2279,2280,2281,2282,2283,2284,2285,2286,2287,2288,2289,2290,2291,2292,2293,2294,2295,2296,2297,2298,2299,2300,2301,2302,2303,2304,2305,2306,2307,2308,2309,2310,2311,2312,2313,2314,2315,2316,2317,2318,2319,2320,2321,2322,2323,2324,2325,2326,2327,2328,2329,2330,2331,2332,2333,2334,2335,2336,2337,2338,2339,2340,2341,2342,2343,2344,2345,2346,2347,2348,2349,2350,2351,2352,2353,2354,2355,2356,2357,2358,2359,2360,2361,2362,2363,2364,2365,2366,2367,2368,2369,2370,2371,2372,2373,2374,2375,2376,2377,2378,2379,2380,2381,2382,2383,2384,2385,2386,2387,2388,2389,2390,2391,2392,2393,2394,2395,2396,2397,2398,2399,2400,2401,2402,2403,2404,2405,2406,2407,2408,2409,2410,2411,2412,2413,2414,2415,2416,2417,2418,2419,2420,2421,2422,2423,2424,2425,2426,2427,2428,2429,2430,2431,2432,2433,2434,2435,2436,2437,2438,2439,2440,2441,2442,2443,2444,2445,2446,2447,2448,2449,2450,2451,2452,2453,2454,2455,2456,2457,2458,2459,2460,2461,2462,2463,2464,2465,2466,2467,2468,2469,2470,2471,2472,2473,2474,2475,2476,2477,2478,2479,2480,2481,2482,2483,2484,2485,2486,2487,2488,2489,2490,2491,2492,2493,2494,2495,2496,2497,2498,2499,2500,2501,2502,2503,2504,2505,2506,2507,2508,2509,2510,2511,251
2,2513,2514,2515,2516,2517,2518,2519,2520,2521,2522,2523,2524,2525,2526,2527,2528,2529,2530,2531,2532,2533,2534,2535,2536,2537,2538,2539,2540,2541,2542,2543,2544,2545,2546,2547,2548,2549,2550,2551,2552,2553,2554,2555,2556,2557,2558,2559,2560,2561,2562,2563,2564,2565,2566,2567,2568,2569,2570,2571,2572,2573,2574,2575,2576,2577,2578,2579,2580,2581,2582,2583,2584,2585,2586,2587,2588,2589,2590,2591,2592,2593,2594,2595,2596,2597,2598,2599,2600,2601,2602,2603,2604,2605,2606,2607,2608,2609,2610,2611,2612,2613,2614,2615,2616,2617,2618,2619,2620,2621,2622,2623,2624,2625,2626,2627,2628,2629,2630,2631,2632,2633,2634,2635,2636,2637,2638,2639,2640,2641,2642,2643,2644,2645,2646,2647,2648,2649,2650,2651,2652,2653,2654,2655,2656,2657,2658,2659,2660,2661,2662,2663,2664,2665,2666,2667,2668,2669,2670,2671,2672,2673,2674,2675,2676,2677,2678,2679,2680,2681,2682,2683,2684,2685,2686,2687,2688,2689,2690,2691,2692,2693,2694,2695,2696,2697,2698,2699,2700,2701,2702,2703,2704,2705,2706,2707,2708,2709,2710,2711,2712,2713,2714,2715,2716,2717,2718,2719,2720,2721,2722,2723,2724,2725,2726,2727,2728,2729,2730,2731,2732,2733,2734,2735,2736,2737,2738,2739,2740,2741,2742,2743,2744,2745,2746,2747,2748,2749,2750,2751,2752,2753,2754,2755,2756,2757,2758,2759,2760,2761,2762,2763,2764,2765,2766,2767,2768,2769,2770,2771,2772,2773,2774,2775,2776,2777,2778,2779,2780,2781,2782,2783,2784,2785,2786,2787,2788,2789,2790,2791,2792,2793,2794,2795,2796,2797,2798,2799,2800,2801,2802,2803,2804,2805,2806,2807,2808,2809,2810,2811,2812,2813,2814,2815,2816,2817,2818,2819,2820,2821,2822,2823,2824,2825,2826,2827,2828,2829,2830,2831,2832,2833,2834,2835,2836,2837,2838,2839,2840,2841,2842,2843,2844,2845,2846,2847,2848,2849,2850,2851,2852,2853,2854,2855,2856,2857,2858,2859,2860,2861,2862,2863,2864,2865,2866,2867,2868,2869,2870,2871,2872,2873,2874,2875,2876,2877,2878,2879,2880,2881,2882,2883,2884,2885,2886,2887,2888,2889,2890,2891,2892,2893,2894,2895,2896,2897,2898,2899,2900,2901,2902,2903,2904,2905,2906,2907,2908,2909,2910,2911,291
2,2913,2914,2915,2916,2917,2918,2919,2920,2921,2922,2923,2924,2925,2926,2927,2928,2929,2930,2931,2932,2933,2934,2935,2936,2937,2938,2939,2940,2941,2942,2943,2944,2945,2946,2947,2948,2949,2950,2951,2952,2953,2954,2955,2956,2957,2958,2959,2960,2961,2962,2963,2964,2965,2966,2967,2968,2969,2970,2971,2972,2973,2974,2975,2976,2977,2978,2979,2980,2981,2982,2983,2984,2985,2986,2987,2988,2989,2990,2991,2992,2993,2994,2995,2996,2997,2998,2999,3000,3001,3002,3003,3004,3005,3006,3007,3008,3009,3010,3011,3012,3013,3014,3015,3016,3017,3018,3019,3020,3021,3022,3023,3024,3025,3026,3027,3028,3029,3030,3031,3032,3033,3034,3035,3036,3037,3038,3039,3040,3041,3042,3043,3044,3045,3046,3047,3048,3049,3050,3051,3052,3053,3054,3055,3056,3057,3058,3059,3060,3061,3062,3063,3064,3065,3066,3067,3068,3069,3070,3071,3072,3073,3074,3075,3076,3077,3078,3079,3080,3081,3082,3083,3084,3085,3086,3087,3088,3089,3090,3091,3092,3093,3094,3095,3096,3097,3098,3099,3100,3101,3102,3103,3104,3105,3106,3107,3108,3109,3110,3111,3112,3113,3114,3115,3116,3117,3118,3119,3120,3121,3122,3123,3124,3125,3126,3127,3128,3129,3130,3131,3132,3133,3134,3135,3136,3137,3138,3139,3140,3141,3142,3143,3144,3145,3146,3147,3148,3149,3150,3151,3152,3153,3154,3155,3156,3157,3158,3159,3160,3161,3162,3163,3164,3165,3166,3167,3168,3169,3170,3171,3172,3173,3174,3175,3176,3177,3178,3179,3180,3181,3182,3183,3184,3185,3186,3187,3188,3189,3190,3191,3192,3193,3194,3195,3196,3197,3198,3199,3200,3201,3202,3203,3204,3205,3206,3207,3208,3209,3210,3211,3212,3213,3214,3215,3216,3217,3218,3219,3220,3221,3222,3223,3224,3225,3226,3227,3228,3229,3230,3231,3232,3233,3234,3235,3236,3237,3238,3239,3240,3241,3242,3243,3244,3245,3246,3247,3248,3249,3250,3251,3252,3253,3254,3255,3256,3257,3258,3259,3260,3261,3262,3263,3264,3265,3266,3267,3268,3269,3270,3271,3272,3273,3274,3275,3276,3277,3278,3279,3280,3281,3282,3283,3284,3285,3286,3287,3288,3289,3290,3291,3292,3293,3294,3295,3296,3297,3298,3299,3300,3301,3302,3303,3304,3305,3306,3307,3308,3309,3310,3311,331
2,3313,3314,3315,3316,3317,3318,3319,3320,3321,3322,3323,3324,3325,3326,3327,3328,3329,3330,3331,3332,3333,3334,3335,3336,3337,3338,3339,3340,3341,3342,3343,3344,3345,3346,3347,3348,3349,3350,3351,3352,3353,3354,3355,3356,3357,3358,3359,3360,3361,3362,3363,3364,3365,3366,3367,3368,3369,3370,3371,3372,3373,3374,3375,3376,3377,3378,3379,3380,3381,3382,3383,3384,3385,3386,3387,3388,3389,3390,3391,3392,3393,3394,3395,3396,3397,3398,3399,3400,3401,3402,3403,3404,3405,3406,3407,3408,3409,3410,3411,3412,3413,3414,3415,3416,3417,3418,3419,3420,3421,3422,3423,3424,3425,3426,3427,3428,3429,3430,3431,3432,3433,3434,3435,3436,3437,3438,3439,3440,3441,3442,3443,3444,3445,3446,3447,3448,3449,3450,3451,3452,3453,3454,3455,3456,3457,3458,3459,3460,3461,3462,3463,3464,3465,3466,3467,3468,3469,3470,3471,3472,3473,3474,3475,3476,3477,3478,3479,3480,3481,3482,3483,3484,3485,3486,3487,3488,3489,3490,3491,3492,3493,3494,3495,3496,3497,3498,3499,3500,3501,3502,3503,3504,3505,3506,3507,3508,3509,3510,3511,3512,3513,3514,3515,3516,3517,3518,3519,3520,3521,3522,3523,3524,3525,3526,3527,3528,3529,3530,3531,3532,3533,3534,3535,3536,3537,3538,3539,3540,3541,3542,3543,3544,3545,3546,3547,3548,3549,3550,3551,3552,3553,3554,3555,3556,3557,3558,3559,3560,3561,3562,3563,3564,3565,3566,3567,3568,3569,3570,3571,3572,3573,3574,3575,3576,3577,3578,3579,3580,3581,3582,3583,3584,3585,3586,3587,3588,3589,3590,3591,3592,3593,3594,3595,3596,3597,3598,3599,3600,3601,3602,3603,3604,3605,3606,3607,3608,3609,3610,3611,3612,3613,3614,3615,3616,3617,3618,3619,3620,3621,3622,3623,3624,3625,3626,3627,3628,3629,3630,3631,3632,3633,3634,3635,3636,3637,3638,3639,3640,3641,3642,3643,3644,3645,3646,3647,3648,3649,3650,3651,3652,3653,3654,3655,3656,3657,3658,3659,3660,3661,3662,3663,3664,3665,3666,3667,3668,3669,3670,3671,3672,3673,3674,3675,3676,3677,3678,3679,3680,3681,3682,3683,3684,3685,3686,3687,3688,3689,3690,3691,3692,3693,3694,3695,3696,3697,3698,3699,3700,3701,3702,3703,3704,3705,3706,3707,3708,3709,3710,3711,371
2,3713,3714,3715,3716,3717,3718,3719,3720,3721,3722,3723,3724,3725,3726,3727,3728,3729,3730,3731,3732,3733,3734,3735,3736,3737,3738,3739,3740,3741,3742,3743,3744,3745,3746,3747,3748,3749,3750,3751,3752,3753,3754,3755,3756,3757,3758,3759,3760,3761,3762,3763,3764,3765,3766,3767,3768,3769,3770,3771,3772,3773,3774,3775,3776,3777,3778,3779,3780,3781,3782,3783,3784,3785,3786,3787,3788,3789,3790,3791,3792,3793,3794,3795,3796,3797,3798,3799,3800,3801,3802,3803,3804,3805,3806,3807,3808,3809,3810,3811,3812,3813,3814,3815,3816,3817,3818,3819,3820,3821,3822,3823,3824,3825,3826,3827,3828,3829,3830,3831,3832,3833,3834,3835,3836,3837,3838,3839,3840,3841,3842,3843,3844,3845,3846,3847,3848,3849,3850,3851,3852,3853,3854,3855,3856,3857,3858,3859,3860,3861,3862,3863,3864,3865,3866,3867,3868,3869,3870,3871,3872,3873,3874,3875,3876,3877,3878,3879,3880,3881,3882,3883,3884,3885,3886,3887,3888,3889,3890,3891,3892,3893,3894,3895,3896],
[3311,3312,3313,3314,3315,3316,3317,3318,3319,3320,3321,3322,3323,3324,3325,3326,3327,3328,3329,3330,3331,3332,3333,3334,3335,3336,3337,3338,3339,3340,3341,3342,3343,3344,3345,3346,3347,3348,3349,3350,3351,3352,3353,3354,3355,3356,3357,3358,3359,3360,3361,3362,3363,3364,3365,3366,3367,3368,3369,3370,3371,3372,3373,3374,3375,3376,3377,3378,3379,3380,3381,3382,3383,3384,3385,3386,3387,3388,3389,3390,3391,3392,3393,3394,3395,3396,3397,3398,3399,3400,3401,3402,3403,3404,3405,3406,3407,3408,3409,3410,3411,3412,3413,3414,3415,3416,3417,3418,3419,3420,3421,3422,3423,3424,3425,3426,3427,3428,3429,3430,3431,3432,3433,3434,3435,3436,3437,3438,3439,3440,3441,3442,3443,3444,3445,3446,3447,3448,3449,3450,3451,3452,3453,3454,3455,3456,3457,3458,3459,3460,3461,3462,3463,3464,3465,3466,3467,3468,3469,3470,3471,3472,3473,3474,3475,3476,3477,3478,3479,3480,3481,3482,3483,3484,3485,3486,3487,3488,3489,3490,3491,3492,3493,3494,3495,3496,3497,3498,3499,3500,3501,3502,3503,3504,3505,3506,3507,3508,3509,3510,3511,3512,3513,3514,3515,3516,3517,3518,3519,3520,3521,3522,3523,3524,3525,3526,3527,3528,3529,3530,3531,3532,3533,3534,3535,3536,3537,3538,3539,3540,3541,3542,3543,3544,3545,3546,3547,3548,3549,3550,3551,3552,3553,3554,3555,3556,3557,3558,3559,3560,3561,3562,3563,3564,3565,3566,3567,3568,3569,3570,3571,3572,3573,3574,3575,3576,3577,3578,3579,3580,3581,3582,3583,3584,3585,3586,3587,3588,3589,3590,3591,3592,3593,3594,3595,3596,3597,3598,3599,3600,3601,3602,3603,3604,3605,3606,3607,3608,3609,3610,3611,3612,3613,3614,3615,3616,3617,3618,3619,3620,3621,3622,3623,3624,3625,3626,3627,3628,3629,3630,3631,3632,3633,3634,3635,3636,3637,3638,3639,3640,3641,3642,3643,3644,3645,3646,3647,3648,3649,3650,3651,3652,3653,3654,3655,3656,3657,3658,3659,3660,3661,3662,3663,3664,3665,3666,3667,3668,3669,3670,3671,3672,3673,3674,3675,3676,3677,3678,3679,3680,3681,3682,3683,3684,3685,3686,3687,3688,3689,3690,3691,3692,3693,3694,3695,3696,3697,3698,3699,3700,3701,3702,3703,3704,3705,3706,3707,3708,3709,3710
,3711,3712,3713,3714,3715,3716,3717,3718,3719,3720,3721,3722,3723,3724,3725,3726,3727,3728,3729,3730,3731,3732,3733,3734,3735,3736,3737,3738,3739,3740,3741,3742,3743,3744,3745,3746,3747,3748,3749,3750,3751,3752,3753,3754,3755,3756,3757,3758,3759,3760,3761,3762,3763,3764,3765,3766,3767,3768,3769,3770,3771,3772,3773,3774,3775,3776,3777,3778,3779,3780,3781,3782,3783,3784,3785,3786,3787,3788,3789,3790,3791,3792,3793,3794,3795,3796,3797,3798,3799,3800,3801,3802,3803,3804,3805,3806,3807,3808,3809,3810,3811,3812,3813,3814,3815,3816,3817,3818,3819,3820,3821,3822,3823,3824,3825,3826,3827,3828,3829,3830,3831,3832,3833,3834,3835,3836,3837,3838,3839,3840,3841,3842,3843,3844,3845,3846,3847,3848,3849,3850,3851,3852,3853,3854,3855,3856,3857,3858,3859,3860,3861,3862,3863,3864,3865,3866,3867,3868,3869,3870,3871,3872,3873,3874,3875,3876,3877,3878,3879,3880,3881,3882,3883,3884,3885,3886,3887,3888,3889,3890,3891,3892,3893,3894,3895,3896,3897,3898,3899,3900,3901,3902,3903,3904,3905,3906,3907,3908,3909,3910,3911,3912,3913,3914,3915,3916,3917,3918,3919,3920,3921,3922,3923,3924,3925,3926,3927,3928,3929,3930,3931,3932,3933,3934,3935,3936,3937,3938,3939,3940,3941,3942,3943,3944,3945,3946,3947,3948,3949,3950,3951,3952,3953,3954,3955,3956,3957,3958,3959,3960,3961,3962,3963,3964,3965,3966,3967,3968,3969,3970,3971,3972,3973,3974,3975,3976,3977,3978,3979,3980,3981,3982,3983,3984,3985,3986,3987,3988,3989,3990,3991,3992,3993,3994,3995,3996,3997,3998,3999,4000,4001,4002,4003,4004,4005,4006,4007,4008,4009,4010,4011,4012,4013,4014,4015,4016,4017,4018,4019,4020,4021,4022,4023,4024,4025,4026,4027,4028,4029,4030,4031,4032,4033,4034,4035,4036,4037,4038,4039,4040,4041,4042,4043,4044,4045,4046,4047,4048,4049,4050,4051,4052,4053,4054,4055,4056,4057,4058,4059,4060,4061,4062,4063,4064,4065,4066,4067,4068,4069,4070,4071,4072,4073,4074,4075,4076,4077,4078,4079,4080,4081,4082,4083,4084,4085,4086,4087,4088,4089,4090,4091,4092,4093,4094,4095,4096,4097,4098,4099,4100,4101,4102,4103,4104,4105,4106,4107,4108,4109,4110
,4111,4112,4113,4114,4115,4116,4117,4118,4119,4120,4121,4122,4123,4124,4125,4126,4127,4128,4129,4130,4131,4132,4133,4134,4135,4136,4137,4138,4139,4140,4141,4142,4143,4144,4145,4146,4147,4148,4149,4150,4151,4152,4153,4154,4155,4156,4157,4158,4159,4160,4161,4162,4163,4164,4165,4166,4167,4168,4169,4170,4171,4172,4173,4174,4175,4176,4177,4178,4179,4180,4181,4182,4183,4184,4185,4186,4187,4188,4189,4190,4191,4192,4193,4194,4195,4196,4197,4198,4199,4200,4201,4202,4203,4204,4205,4206,4207,4208,4209,4210,4211,4212,4213,4214,4215,4216,4217,4218,4219,4220,4221,4222,4223,4224,4225,4226,4227,4228,4229,4230,4231,4232,4233,4234,4235,4236,4237,4238,4239,4240,4241,4242,4243,4244,4245,4246,4247,4248,4249,4250,4251,4252,4253,4254,4255,4256,4257,4258,4259,4260,4261,4262,4263,4264,4265,4266,4267,4268,4269,4270,4271,4272,4273,4274,4275,4276,4277,4278,4279,4280,4281,4282,4283,4284,4285,4286,4287,4288,4289,4290,4291,4292,4293,4294,4295,4296,4297,4298,4299,4300,4301,4302,4303,4304,4305,4306,4307,4308,4309,4310,4311,4312,4313,4314,4315,4316,4317,4318,4319,4320,4321,4322,4323,4324,4325,4326,4327,4328,4329,4330,4331,4332,4333,4334,4335,4336,4337,4338,4339,4340,4341,4342,4343,4344,4345,4346,4347,4348,4349,4350,4351,4352,4353,4354,4355,4356,4357,4358,4359,4360,4361,4362,4363,4364,4365,4366,4367,4368,4369,4370,4371,4372,4373,4374,4375,4376,4377,4378,4379,4380,4381,4382,4383,4384,4385,4386,4387,4388,4389,4390,4391,4392,4393,4394,4395,4396,4397,4398,4399,4400,4401,4402,4403,4404,4405,4406,4407,4408,4409,4410,4411,4412,4413,4414,4415,4416,4417,4418,4419,4420,4421,4422,4423,4424,4425,4426,4427,4428,4429,4430,4431,4432,4433,4434,4435,4436,4437,4438,4439,4440,4441,4442,4443,4444,4445,4446,4447,4448,4449,4450,4451,4452,4453,4454,4455,4456,4457,4458,4459,4460,4461,4462,4463,4464,4465,4466,4467,4468,4469,4470,4471,4472,4473,4474,4475,4476,4477,4478,4479,4480,4481,4482,4483,4484,4485,4486,4487,4488,4489,4490,4491,4492,4493,4494,4495,4496,4497,4498,4499,4500,4501,4502,4503,4504,4505,4506,4507,4508,4509,4510
,4511,4512,4513,4514,4515,4516,4517,4518,4519,4520,4521,4522,4523,4524,4525,4526,4527,4528,4529,4530,4531,4532,4533,4534,4535,4536,4537,4538,4539,4540,4541,4542,4543,4544,4545,4546,4547,4548,4549,4550,4551,4552,4553,4554,4555,4556,4557,4558,4559,4560,4561,4562,4563,4564,4565,4566,4567,4568,4569,4570,4571,4572,4573,4574,4575,4576,4577,4578,4579,4580,4581,4582,4583,4584,4585,4586,4587,4588,4589,4590,4591,4592,4593,4594,4595,4596,4597,4598,4599,4600,4601,4602,4603,4604,4605,4606,4607,4608,4609,4610,4611,4612,4613,4614,4615,4616,4617,4618,4619,4620,4621,4622,4623,4624,4625,4626,4627,4628,4629,4630,4631,4632,4633,4634,4635,4636,4637,4638,4639,4640,4641,4642,4643,4644,4645,4646,4647,4648,4649,4650,4651,4652,4653,4654,4655,4656,4657,4658,4659,4660,4661,4662,4663,4664,4665,4666,4667,4668,4669,4670,4671,4672,4673,4674,4675,4676,4677,4678,4679,4680,4681,4682,4683,4684,4685,4686,4687,4688,4689,4690,4691,4692,4693,4694,4695,4696,4697,4698,4699,4700,4701,4702,4703,4704,4705,4706,4707,4708,4709,4710,4711,4712,4713,4714,4715,4716,4717,4718,4719,4720,4721,4722,4723,4724,4725,4726,4727,4728,4729,4730,4731,4732,4733,4734,4735,4736,4737,4738,4739,4740,4741,4742,4743,4744,4745,4746,4747,4748,4749,4750,4751,4752,4753,4754,4755,4756,4757,4758,4759,4760,4761,4762,4763,4764,4765,4766,4767,4768,4769,4770,4771,4772,4773,4774,4775,4776,4777,4778,4779,4780,4781,4782,4783,4784,4785,4786,4787,4788,4789,4790,4791,4792,4793,4794,4795,4796,4797,4798,4799,4800,4801,4802,4803,4804,4805,4806,4807,4808,4809,4810,4811,4812,4813,4814,4815,4816,4817,4818,4819,4820,4821,4822,4823,4824,4825,4826,4827,4828,4829,4830,4831,4832,4833,4834,4835,4836,4837,4838,4839,4840,4841,4842,4843,4844,4845,4846,4847,4848,4849,4850,4851,4852,4853,4854,4855,4856,4857,4858,4859,4860,4861,4862,4863,4864,4865,4866,4867,4868,4869,4870,4871,4872,4873,4874,4875,4876,4877,4878,4879,4880,4881,4882,4883,4884,4885,4886,4887,4888,4889,4890,4891,4892,4893,4894,4895,4896,4897,4898,4899,4900,4901,4902,4903,4904,4905,4906,4907,4908,4909,4910
,4911,4912,4913,4914,4915,4916,4917,4918,4919,4920,4921,4922,4923,4924,4925,4926,4927,4928,4929,4930,4931,4932,4933,4934,4935,4936,4937,4938,4939,4940,4941,4942,4943,4944,4945,4946,4947,4948,4949,4950,4951,4952,4953,4954,4955,4956,4957,4958,4959,4960,4961,4962,4963,4964,4965,4966,4967,4968,4969,4970,4971,4972,4973,4974,4975,4976,4977,4978,4979,4980,4981,4982,4983,4984,4985,4986,4987,4988,4989,4990,4991,4992,4993,4994,4995,4996,4997,4998,4999,5000,5001,5002,5003,5004,5005,5006,5007,5008,5009,5010,5011,5012,5013,5014,5015,5016,5017,5018,5019,5020,5021,5022,5023,5024,5025,5026,5027,5028,5029,5030,5031,5032,5033,5034,5035,5036,5037,5038,5039,5040,5041,5042,5043,5044,5045,5046,5047,5048,5049,5050,5051,5052,5053,5054,5055,5056,5057,5058,5059,5060,5061,5062,5063,5064,5065,5066,5067,5068,5069,5070,5071,5072,5073,5074,5075,5076,5077,5078,5079,5080,5081,5082,5083,5084,5085,5086,5087,5088,5089,5090,5091,5092,5093,5094,5095,5096,5097,5098,5099,5100,5101,5102,5103,5104,5105,5106,5107,5108,5109,5110,5111,5112,5113,5114,5115,5116,5117,5118,5119,5120,5121,5122,5123,5124,5125,5126,5127,5128,5129,5130,5131,5132,5133,5134,5135,5136,5137,5138,5139,5140,5141,5142,5143,5144,5145,5146,5147,5148,5149,5150,5151,5152,5153,5154,5155,5156,5157,5158,5159,5160,5161,5162,5163,5164,5165,5166,5167,5168,5169,5170,5171,5172,5173,5174,5175,5176,5177,5178,5179,5180,5181,5182,5183,5184,5185,5186,5187,5188,5189,5190,5191,5192,5193,5194,5195,5196,5197,5198,5199,5200,5201,5202,5203,5204,5205,5206,5207,5208,5209,5210,5211,5212,5213,5214,5215,5216,5217,5218,5219,5220,5221,5222,5223,5224,5225,5226,5227,5228,5229,5230,5231,5232,5233,5234,5235,5236,5237,5238,5239,5240,5241,5242,5243,5244,5245,5246,5247,5248,5249,5250,5251,5252,5253,5254,5255,5256,5257,5258,5259,5260,5261,5262,5263,5264,5265,5266,5267,5268,5269,5270,5271,5272,5273,5274,5275,5276,5277,5278,5279,5280,5281,5282,5283,5284,5285,5286,5287,5288,5289,5290,5291,5292,5293,5294,5295,5296,5297,5298,5299,5300,5301,5302,5303,5304,5305,5306,5307,5308,5309,5310
,5311,5312,5313,5314,5315,5316,5317,5318,5319,5320,5321,5322,5323,5324,5325,5326,5327,5328,5329,5330,5331,5332,5333,5334,5335,5336,5337,5338,5339,5340,5341,5342,5343,5344,5345,5346,5347,5348,5349,5350,5351,5352,5353,5354,5355,5356,5357,5358,5359,5360,5361,5362,5363,5364,5365,5366,5367,5368,5369,5370,5371,5372,5373,5374,5375,5376,5377,5378,5379,5380,5381,5382,5383,5384,5385,5386,5387,5388,5389,5390,5391,5392,5393,5394,5395,5396,5397,5398,5399,5400,5401,5402,5403,5404,5405,5406,5407,5408,5409,5410,5411,5412,5413,5414,5415,5416,5417,5418,5419,5420,5421,5422,5423,5424,5425,5426,5427,5428,5429,5430,5431,5432,5433,5434,5435,5436,5437,5438,5439,5440,5441,5442,5443,5444,5445,5446,5447,5448,5449,5450,5451,5452,5453,5454,5455,5456,5457,5458,5459,5460,5461,5462,5463,5464,5465,5466,5467,5468,5469,5470,5471,5472,5473,5474,5475,5476,5477,5478,5479,5480,5481,5482,5483,5484,5485,5486,5487,5488,5489,5490,5491,5492,5493,5494,5495,5496,5497,5498,5499,5500,5501,5502,5503,5504,5505,5506,5507,5508,5509,5510,5511,5512,5513,5514,5515,5516,5517,5518,5519,5520,5521,5522,5523,5524,5525,5526,5527,5528,5529,5530,5531,5532,5533,5534,5535,5536,5537,5538,5539,5540,5541,5542,5543,5544,5545,5546,5547,5548,5549,5550,5551,5552,5553,5554,5555,5556,5557,5558,5559,5560,5561,5562,5563,5564,5565,5566,5567,5568,5569,5570,5571,5572,5573,5574,5575,5576,5577,5578,5579,5580,5581,5582,5583,5584,5585,5586,5587,5588,5589,5590,5591,5592,5593,5594,5595,5596,5597,5598,5599,5600,5601,5602,5603,5604,5605,5606,5607,5608,5609,5610,5611,5612,5613,5614,5615,5616,5617,5618,5619,5620,5621,5622,5623,5624,5625,5626,5627,5628,5629,5630,5631,5632,5633,5634,5635,5636,5637,5638,5639,5640,5641,5642,5643,5644,5645,5646,5647,5648,5649,5650,5651,5652,5653,5654,5655,5656,5657,5658,5659,5660,5661,5662,5663,5664,5665,5666,5667,5668,5669,5670,5671,5672,5673,5674,5675,5676,5677,5678,5679,5680,5681,5682,5683,5684,5685,5686,5687,5688,5689,5690,5691,5692,5693,5694,5695,5696,5697,5698,5699,5700,5701,5702,5703,5704,5705,5706,5707,5708,5709,5710
,5711,5712,5713,5714,5715,5716,5717,5718,5719,5720,5721,5722,5723,5724,5725,5726,5727,5728,5729,5730,5731,5732,5733,5734,5735,5736,5737,5738,5739,5740,5741,5742,5743,5744,5745,5746,5747,5748,5749,5750,5751,5752,5753,5754,5755,5756,5757,5758,5759,5760,5761,5762,5763,5764,5765,5766,5767,5768,5769,5770,5771,5772,5773,5774,5775,5776,5777,5778,5779,5780,5781,5782,5783,5784,5785,5786,5787,5788,5789,5790,5791,5792,5793,5794,5795,5796,5797,5798,5799,5800,5801,5802,5803,5804,5805,5806,5807,5808,5809,5810,5811,5812,5813,5814,5815,5816,5817,5818,5819,5820,5821,5822,5823,5824,5825,5826,5827,5828,5829,5830,5831,5832,5833,5834,5835,5836,5837,5838,5839,5840,5841,5842,5843,5844,5845,5846,5847,5848,5849,5850,5851,5852,5853,5854,5855,5856,5857,5858,5859,5860,5861,5862,5863,5864,5865,5866,5867,5868,5869,5870,5871,5872,5873,5874,5875,5876,5877,5878,5879,5880,5881,5882,5883,5884,5885,5886,5887,5888,5889,5890,5891,5892,5893,5894,5895,5896,5897,5898,5899,5900,5901,5902,5903,5904,5905,5906,5907,5908,5909,5910,5911,5912,5913,5914,5915,5916,5917,5918,5919,5920,5921,5922,5923,5924,5925,5926,5927,5928,5929,5930,5931,5932,5933,5934,5935,5936,5937,5938,5939,5940,5941,5942,5943,5944,5945,5946,5947,5948,5949,5950,5951,5952,5953,5954,5955,5956,5957,5958,5959,5960,5961,5962,5963,5964,5965,5966,5967,5968,5969,5970,5971,5972,5973,5974,5975,5976,5977,5978,5979,5980,5981,5982,5983,5984,5985,5986,5987,5988,5989,5990,5991,5992,5993,5994,5995,5996,5997,5998,5999,6000,6001,6002,6003,6004,6005,6006,6007,6008,6009,6010,6011,6012,6013,6014,6015,6016,6017,6018,6019,6020,6021,6022,6023,6024,6025,6026,6027,6028,6029,6030,6031,6032,6033,6034,6035,6036,6037,6038,6039,6040,6041,6042,6043,6044,6045,6046,6047,6048,6049,6050,6051,6052,6053,6054,6055,6056,6057,6058,6059,6060,6061,6062,6063,6064,6065,6066,6067,6068,6069,6070,6071,6072,6073,6074,6075,6076,6077,6078,6079,6080,6081,6082,6083,6084,6085,6086,6087,6088,6089,6090,6091,6092,6093,6094,6095,6096,6097,6098,6099,6100,6101,6102,6103,6104,6105,6106,6107,6108,6109,6110
,6111,6112,6113,6114,6115,6116,6117,6118,6119,6120,6121,6122,6123,6124,6125,6126,6127,6128,6129,6130,6131,6132,6133,6134,6135,6136,6137,6138,6139,6140,6141,6142,6143,6144,6145,6146,6147,6148,6149,6150,6151,6152,6153,6154,6155,6156,6157,6158,6159,6160,6161,6162,6163,6164,6165,6166,6167,6168,6169,6170,6171,6172,6173,6174,6175,6176,6177,6178,6179,6180,6181,6182,6183,6184,6185,6186,6187,6188,6189,6190,6191,6192,6193,6194,6195,6196,6197,6198,6199,6200,6201,6202,6203,6204,6205,6206,6207,6208,6209,6210,6211,6212,6213,6214,6215,6216,6217,6218,6219,6220,6221,6222,6223,6224,6225,6226,6227,6228,6229,6230,6231,6232,6233,6234,6235,6236,6237,6238,6239,6240,6241,6242,6243,6244,6245,6246,6247,6248,6249,6250,6251,6252,6253,6254,6255,6256,6257,6258,6259,6260,6261,6262,6263,6264,6265,6266,6267,6268,6269,6270,6271,6272,6273,6274,6275,6276,6277,6278,6279,6280,6281,6282,6283,6284,6285,6286,6287,6288,6289,6290,6291,6292,6293,6294,6295,6296,6297,6298,6299,6300,6301,6302,6303,6304,6305,6306,6307,6308,6309,6310,6311,6312,6313,6314,6315,6316,6317,6318,6319,6320,6321,6322,6323,6324,6325,6326,6327,6328,6329,6330,6331,6332,6333,6334,6335,6336,6337,6338,6339,6340,6341,6342,6343,6344,6345,6346,6347,6348,6349,6350,6351,6352,6353,6354,6355,6356,6357,6358,6359,6360,6361,6362,6363,6364,6365,6366,6367,6368,6369,6370,6371,6372,6373,6374,6375,6376,6377,6378,6379,6380,6381,6382,6383,6384,6385,6386,6387,6388,6389,6390,6391,6392,6393,6394,6395,6396,6397,6398,6399,6400,6401,6402,6403,6404,6405,6406,6407,6408,6409,6410,6411,6412,6413,6414,6415,6416,6417,6418,6419,6420,6421,6422,6423,6424,6425,6426,6427,6428,6429,6430,6431,6432,6433,6434,6435,6436,6437,6438,6439,6440,6441,6442,6443,6444,6445,6446,6447,6448,6449,6450,6451,6452,6453,6454,6455,6456,6457,6458,6459,6460,6461,6462,6463,6464,6465,6466,6467,6468,6469,6470,6471,6472,6473,6474,6475,6476,6477,6478,6479,6480,6481,6482,6483,6484,6485,6486,6487,6488,6489,6490,6491,6492,6493,6494,6495,6496,6497,6498,6499,6500,6501,6502,6503,6504,6505,6506,6507,6508,6509,6510
,6511,6512,6513,6514,6515,6516,6517,6518,6519,6520,6521,6522,6523,6524,6525,6526,6527,6528,6529,6530,6531,6532,6533,6534,6535,6536,6537,6538,6539,6540,6541,6542,6543,6544,6545,6546,6547,6548,6549,6550,6551,6552,6553,6554,6555,6556,6557,6558,6559,6560,6561,6562,6563,6564,6565,6566,6567,6568,6569,6570,6571,6572,6573,6574,6575,6576,6577,6578,6579,6580,6581,6582,6583,6584,6585,6586,6587,6588,6589,6590,6591,6592,6593,6594,6595,6596,6597,6598,6599,6600,6601,6602,6603,6604,6605,6606,6607,6608,6609,6610,6611,6612,6613,6614,6615,6616,6617,6618,6619,6620,6621,6622,6623,6624,6625,6626,6627,6628,6629,6630,6631,6632,6633,6634,6635,6636,6637,6638,6639,6640,6641,6642,6643,6644,6645,6646,6647,6648,6649,6650,6651,6652,6653,6654,6655,6656,6657,6658,6659,6660,6661,6662,6663,6664,6665,6666,6667,6668,6669,6670,6671,6672,6673,6674,6675,6676,6677,6678,6679,6680,6681,6682,6683,6684,6685,6686,6687,6688,6689,6690,6691,6692,6693,6694,6695,6696,6697,6698,6699,6700,6701,6702,6703,6704,6705,6706,6707,6708,6709,6710,6711,6712,6713,6714,6715,6716,6717,6718,6719,6720,6721,6722,6723,6724,6725,6726,6727,6728,6729,6730,6731,6732,6733,6734,6735,6736,6737,6738,6739,6740,6741,6742,6743,6744,6745,6746,6747,6748,6749,6750,6751,6752,6753,6754,6755,6756,6757,6758,6759,6760,6761,6762,6763,6764,6765,6766,6767,6768,6769,6770,6771,6772,6773,6774,6775,6776,6777,6778,6779,6780,6781,6782,6783,6784,6785,6786,6787,6788,6789,6790,6791,6792,6793,6794,6795,6796,6797,6798,6799,6800,6801,6802,6803,6804,6805,6806,6807,6808,6809,6810,6811,6812,6813,6814,6815,6816,6817,6818,6819,6820,6821,6822,6823,6824,6825,6826,6827,6828,6829,6830,6831,6832,6833,6834,6835,6836,6837,6838,6839,6840,6841,6842,6843,6844,6845,6846,6847,6848,6849,6850,6851,6852,6853,6854,6855,6856,6857,6858,6859,6860,6861,6862,6863,6864,6865,6866,6867,6868,6869,6870,6871,6872,6873,6874,6875,6876,6877,6878,6879,6880,6881,6882,6883,6884,6885,6886,6887,6888,6889,6890,6891,6892,6893,6894,6895,6896,6897,6898,6899,6900,6901,6902,6903,6904,6905,6906,6907,6908,6909,6910
,6911,6912,6913,6914,6915,6916,6917,6918,6919,6920,6921,6922,6923,6924,6925,6926,6927,6928,6929,6930,6931,6932,6933,6934,6935,6936,6937,6938,6939,6940,6941,6942,6943,6944,6945,6946,6947,6948,6949,6950,6951,6952,6953,6954,6955,6956,6957,6958,6959,6960,6961,6962,6963,6964,6965,6966,6967,6968,6969,6970,6971,6972,6973,6974,6975,6976,6977,6978,6979,6980,6981,6982,6983,6984,6985,6986,6987,6988,6989,6990,6991,6992,6993,6994,6995,6996,6997,6998,6999,7000,7001,7002,7003,7004,7005,7006,7007,7008,7009,7010,7011,7012,7013,7014,7015,7016,7017,7018,7019,7020,7021,7022,7023,7024,7025,7026,7027,7028,7029,7030,7031,7032,7033,7034,7035,7036,7037,7038,7039,7040,7041,7042,7043,7044,7045,7046,7047,7048,7049,7050,7051,7052,7053,7054,7055,7056,7057,7058,7059,7060,7061,7062,7063,7064,7065,7066,7067,7068,7069,7070,7071,7072,7073,7074,7075,7076,7077,7078,7079,7080,7081,7082,7083,7084,7085,7086,7087,7088,7089,7090,7091,7092,7093,7094,7095,7096,7097,7098,7099,7100,7101,7102,7103,7104,7105,7106,7107,7108,7109,7110,7111,7112,7113,7114,7115,7116,7117,7118,7119,7120,7121,7122,7123,7124,7125,7126,7127,7128,7129,7130,7131,7132,7133,7134,7135,7136,7137,7138,7139,7140,7141,7142,7143,7144,7145,7146,7147,7148,7149,7150,7151,7152,7153,7154,7155,7156,7157,7158,7159,7160,7161,7162,7163,7164,7165,7166,7167,7168,7169,7170,7171,7172,7173,7174,7175,7176,7177,7178,7179,7180,7181,7182,7183,7184,7185,7186,7187,7188,7189,7190,7191,7192,7193,7194,7195,7196,7197,7198,7199,7200,7201,7202,7203,7204,7205,7206,7207,7208,7209,7210,7211,7212,7213,7214,7215,7216,7217,7218,7219,7220,7221,7222,7223,7224,7225,7226,7227,7228,7229,7230,7231,7232,7233,7234,7235,7236,7237,7238,7239,7240,7241,7242,7243,7244,7245,7246,7247,7248,7249,7250,7251,7252,7253,7254,7255,7256,7257,7258,7259,7260,7261,7262,7263,7264,7265,7266,7267,7268,7269,7270,7271,7272,7273,7274,7275,7276,7277,7278,7279,7280,7281,7282,7283,7284,7285,7286,7287,7288,7289,7290,7291,7292,7293,7294,7295,7296,7297,7298,7299,7300,7301,7302,7303,7304,7305,7306,7307,7308,7309,7310
,7311,7312,7313,7314,7315,7316,7317,7318,7319,7320,7321,7322,7323,7324,7325,7326,7327,7328,7329,7330,7331,7332,7333,7334,7335,7336,7337,7338,7339,7340,7341,7342,7343,7344,7345,7346,7347,7348,7349,7350,7351,7352,7353,7354,7355,7356,7357,7358,7359,7360,7361,7362,7363,7364,7365,7366,7367,7368,7369,7370,7371,7372,7373,7374,7375,7376,7377,7378,7379,7380,7381,7382,7383,7384,7385,7386,7387,7388,7389,7390,7391,7392,7393,7394,7395,7396,7397,7398,7399,7400,7401,7402,7403,7404,7405,7406,7407,7408,7409,7410,7411,7412,7413,7414,7415,7416,7417,7418,7419,7420,7421,7422,7423,7424,7425,7426,7427,7428,7429,7430,7431,7432,7433,7434,7435,7436,7437,7438,7439,7440,7441,7442,7443,7444,7445,7446,7447,7448,7449,7450,7451,7452,7453,7454,7455,7456,7457,7458,7459,7460,7461,7462,7463,7464,7465,7466,7467,7468,7469,7470,7471,7472,7473,7474,7475,7476,7477,7478,7479,7480,7481,7482,7483,7484,7485,7486,7487,7488,7489,7490,7491,7492,7493,7494,7495,7496,7497,7498,7499,7500,7501,7502,7503,7504,7505,7506,7507,7508,7509,7510,7511,7512,7513,7514,7515,7516,7517,7518,7519,7520,7521,7522,7523,7524,7525,7526,7527,7528,7529,7530,7531,7532,7533,7534,7535,7536,7537,7538,7539,7540,7541,7542,7543,7544,7545,7546,7547,7548,7549,7550,7551,7552,7553,7554,7555,7556,7557,7558,7559,7560,7561,7562,7563,7564,7565,7566,7567,7568,7569,7570,7571,7572,7573,7574,7575,7576,7577,7578,7579,7580,7581,7582,7583,7584,7585,7586,7587,7588,7589,7590,7591,7592,7593,7594,7595,7596,7597,7598,7599,7600,7601,7602,7603,7604,7605,7606,7607,7608,7609,7610,7611,7612,7613,7614,7615,7616,7617,7618,7619,7620,7621,7622,7623,7624,7625,7626,7627,7628,7629,7630,7631,7632,7633,7634,7635,7636,7637,7638,7639,7640,7641,7642,7643,7644,7645,7646,7647,7648,7649,7650,7651,7652,7653,7654,7655,7656,7657,7658,7659,7660,7661,7662,7663,7664,7665,7666,7667,7668,7669,7670,7671,7672,7673,7674,7675,7676,7677,7678,7679,7680,7681,7682,7683,7684,7685,7686,7687,7688,7689,7690,7691,7692,7693,7694,7695,7696,7697,7698,7699,7700,7701,7702,7703,7704,7705,7706,7707,7708,7709,7710
,7711,7712,7713,7714,7715,7716,7717,7718,7719,7720,7721,7722,7723,7724,7725,7726,7727,7728,7729,7730,7731,7732,7733,7734,7735,7736,7737,7738,7739,7740,7741,7742,7743,7744,7745,7746,7747,7748,7749,7750,7751,7752,7753,7754,7755,7756,7757,7758,7759,7760,7761,7762,7763,7764,7765,7766,7767,7768,7769,7770,7771,7772,7773,7774,7775,7776,7777,7778,7779,7780,7781,7782,7783,7784,7785,7786,7787,7788,7789,7790,7791,7792,7793,7794,7795,7796,7797,7798,7799,7800,7801,7802,7803,7804,7805,7806,7807,7808,7809,7810,7811,7812,7813,7814,7815,7816,7817,7818,7819,7820,7821,7822,7823,7824,7825,7826,7827,7828,7829,7830,7831,7832,7833,7834,7835,7836,7837,7838,7839,7840,7841,7842,7843,7844,7845,7846,7847,7848,7849,7850,7851,7852,7853,7854,7855,7856,7857,7858,7859,7860,7861,7862,7863,7864,7865,7866,7867,7868,7869,7870,7871,7872,7873,7874,7875,7876,7877,7878,7879,7880,7881,7882,7883,7884,7885,7886,7887,7888,7889,7890,7891,7892,7893,7894,7895,7896,7897,7898,7899,7900,7901,7902,7903,7904,7905,7906,7907,7908,7909,7910,7911,7912,7913,7914,7915,7916,7917,7918,7919,7920,7921,7922,7923,7924,7925,7926,7927,7928,7929,7930,7931,7932,7933,7934,7935,7936,7937,7938,7939,7940,7941,7942,7943,7944,7945,7946,7947,7948,7949,7950,7951,7952,7953,7954,7955,7956,7957,7958,7959,7960,7961,7962,7963,7964,7965,7966,7967,7968,7969,7970,7971,7972,7973,7974,7975,7976,7977,7978,7979,7980,7981,7982,7983,7984,7985,7986,7987,7988,7989,7990,7991,7992,7993,7994,7995,7996,7997,7998,7999,8000,8001,8002,8003,8004,8005,8006,8007,8008,8009,8010,8011,8012,8013,8014,8015,8016,8017,8018,8019,8020,8021,8022,8023,8024,8025,8026,8027,8028,8029,8030,8031,8032,8033,8034,8035,8036,8037,8038,8039,8040,8041,8042,8043,8044,8045,8046,8047,8048,8049,8050,8051,8052,8053,8054,8055,8056,8057,8058,8059,8060,8061,8062,8063,8064,8065,8066,8067,8068,8069,8070,8071,8072,8073,8074,8075,8076,8077,8078,8079,8080,8081,8082,8083,8084,8085,8086,8087,8088,8089,8090,8091,8092,8093,8094,8095,8096,8097,8098,8099,8100,8101,8102,8103,8104,8105,8106,8107,8108,8109,8110
,8111,8112,8113,8114,8115,8116,8117,8118,8119,8120,8121,8122,8123,8124,8125,8126,8127,8128,8129,8130,8131,8132,8133,8134,8135,8136,8137,8138,8139,8140,8141,8142,8143,8144,8145,8146,8147,8148,8149,8150,8151,8152,8153,8154,8155,8156,8157,8158,8159,8160,8161,8162,8163,8164,8165,8166,8167,8168,8169,8170,8171,8172,8173,8174,8175,8176,8177,8178,8179,8180,8181,8182,8183,8184,8185,8186,8187,8188,8189,8190,8191,8192,8193,8194,8195,8196,8197,8198,8199,8200,8201,8202,8203,8204,8205,8206,8207,8208,8209,8210,8211,8212,8213,8214,8215,8216,8217,8218,8219,8220,8221,8222,8223,8224,8225,8226,8227,8228,8229,8230,8231,8232,8233,8234,8235,8236,8237,8238,8239,8240,8241,8242,8243,8244,8245,8246,8247,8248,8249,8250,8251,8252,8253,8254,8255,8256,8257,8258,8259,8260,8261,8262,8263,8264,8265,8266,8267,8268,8269,8270,8271,8272,8273,8274,8275,8276,8277,8278,8279,8280,8281,8282,8283,8284,8285,8286,8287,8288,8289,8290,8291,8292,8293,8294,8295,8296,8297,8298,8299,8300,8301,8302,8303,8304,8305,8306,8307,8308,8309,8310,8311,8312,8313,8314,8315,8316,8317,8318,8319,8320,8321,8322,8323,8324,8325,8326,8327,8328,8329,8330,8331,8332,8333,8334,8335,8336,8337,8338,8339,8340,8341,8342,8343,8344,8345,8346,8347,8348,8349,8350,8351,8352,8353,8354,8355,8356,8357,8358,8359,8360,8361,8362,8363,8364,8365,8366,8367,8368,8369,8370,8371,8372,8373,8374,8375,8376,8377,8378,8379,8380,8381,8382,8383,8384,8385,8386,8387,8388,8389,8390,8391,8392,8393,8394,8395,8396,8397,8398,8399,8400,8401,8402,8403,8404,8405,8406,8407,8408,8409,8410,8411,8412,8413,8414,8415,8416,8417,8418,8419,8420,8421,8422,8423,8424,8425,8426,8427,8428,8429,8430,8431,8432,8433,8434,8435,8436,8437,8438,8439,8440,8441,8442,8443,8444,8445,8446,8447,8448,8449,8450,8451,8452,8453,8454,8455,8456,8457,8458,8459,8460,8461,8462,8463,8464,8465,8466,8467,8468,8469,8470,8471,8472,8473,8474,8475,8476,8477,8478,8479,8480,8481,8482,8483,8484,8485,8486,8487,8488,8489,8490,8491,8492,8493,8494,8495,8496,8497,8498,8499,8500,8501,8502,8503,8504,8505,8506,8507,8508,8509,8510
,8511,8512,8513,8514,8515,8516,8517,8518,8519,8520,8521,8522,8523,8524,8525,8526,8527,8528,8529,8530,8531,8532,8533,8534,8535,8536,8537,8538,8539,8540,8541,8542,8543,8544,8545,8546,8547,8548,8549,8550,8551,8552,8553,8554,8555,8556,8557,8558,8559,8560,8561,8562,8563,8564,8565,8566,8567,8568,8569,8570,8571,8572,8573,8574,8575,8576,8577,8578,8579,8580,8581,8582,8583,8584,8585,8586,8587,8588,8589,8590,8591,8592,8593,8594,8595,8596,8597,8598,8599,8600,8601,8602,8603,8604,8605,8606,8607,8608,8609,8610,8611,8612,8613,8614,8615,8616,8617,8618,8619,8620,8621,8622,8623,8624,8625,8626,8627,8628,8629,8630,8631,8632,8633,8634,8635,8636,8637,8638,8639,8640,8641,8642,8643,8644,8645,8646,8647,8648,8649,8650,8651,8652,8653,8654,8655,8656,8657,8658,8659,8660,8661,8662,8663,8664,8665,8666,8667,8668,8669,8670,8671,8672,8673,8674,8675,8676,8677,8678,8679,8680,8681,8682,8683,8684,8685,8686,8687,8688,8689,8690,8691,8692,8693,8694,8695,8696,8697,8698,8699,8700,8701,8702,8703,8704,8705,8706,8707,8708,8709,8710,8711,8712,8713,8714,8715,8716,8717,8718,8719,8720,8721,8722,8723,8724,8725,8726,8727,8728,8729,8730,8731,8732,8733,8734,8735,8736,8737,8738,8739,8740,8741,8742,8743,8744,8745,8746,8747,8748,8749,8750,8751,8752,8753,8754,8755,8756,8757,8758,8759,8760,8761,8762,8763,8764,8765,8766,8767,8768,8769,8770,8771,8772,8773,8774,8775,8776,8777,8778,8779,8780,8781,8782,8783,8784,8785,8786,8787,8788,8789,8790,8791,8792,8793,8794,8795,8796,8797,8798,8799,8800,8801,8802,8803,8804,8805,8806,8807,8808,8809,8810,8811,8812,8813,8814,8815,8816,8817,8818,8819,8820,8821,8822,8823,8824,8825,8826,8827,8828,8829,8830,8831,8832,8833,8834,8835,8836,8837,8838,8839,8840,8841,8842,8843,8844,8845,8846,8847,8848,8849,8850,8851,8852,8853,8854,8855,8856,8857,8858,8859,8860,8861,8862,8863,8864,8865,8866,8867,8868,8869,8870,8871,8872,8873,8874,8875,8876,8877,8878,8879,8880,8881,8882,8883,8884,8885,8886,8887,8888,8889,8890,8891,8892,8893,8894,8895,8896,8897,8898,8899,8900,8901,8902,8903,8904,8905,8906,8907,8908,8909,8910
,8911,8912,8913,8914,8915,8916,8917,8918,8919,8920,8921,8922,8923,8924,8925,8926,8927,8928,8929,8930,8931,8932,8933,8934,8935,8936,8937,8938,8939,8940,8941,8942,8943,8944,8945,8946,8947,8948,8949,8950,8951,8952,8953,8954,8955,8956,8957,8958,8959,8960,8961,8962,8963,8964,8965,8966,8967,8968,8969,8970,8971,8972,8973,8974,8975,8976,8977,8978,8979,8980,8981,8982,8983,8984,8985,8986,8987,8988,8989,8990,8991,8992,8993,8994,8995,8996,8997,8998,8999,9000,9001,9002,9003,9004,9005,9006,9007,9008,9009,9010,9011,9012,9013,9014,9015,9016,9017,9018,9019,9020,9021,9022,9023,9024,9025,9026,9027,9028,9029,9030,9031,9032,9033,9034,9035,9036,9037,9038,9039,9040,9041,9042,9043,9044,9045,9046,9047,9048,9049,9050,9051,9052,9053,9054,9055,9056,9057,9058,9059,9060,9061,9062,9063,9064,9065,9066,9067,9068,9069,9070,9071,9072,9073,9074,9075,9076,9077,9078,9079,9080,9081,9082,9083,9084,9085,9086,9087,9088,9089,9090,9091,9092,9093,9094,9095,9096,9097,9098,9099,9100,9101,9102,9103,9104,9105,9106,9107,9108,9109,9110,9111,9112,9113,9114,9115,9116,9117,9118,9119,9120,9121,9122,9123,9124,9125,9126,9127,9128,9129,9130,9131,9132,9133,9134,9135,9136,9137,9138,9139,9140,9141,9142,9143,9144,9145,9146,9147,9148,9149,9150,9151,9152,9153,9154,9155,9156,9157,9158,9159,9160,9161,9162,9163,9164,9165,9166,9167,9168,9169,9170,9171,9172,9173,9174,9175,9176,9177,9178,9179,9180,9181,9182,9183,9184,9185,9186,9187,9188,9189,9190,9191,9192,9193,9194,9195,9196,9197,9198,9199,9200,9201,9202,9203,9204,9205,9206,9207,9208,9209,9210,9211,9212,9213,9214,9215,9216,9217,9218,9219,9220,9221,9222,9223,9224,9225,9226,9227,9228,9229,9230,9231,9232,9233,9234,9235,9236,9237,9238,9239,9240,9241,9242,9243,9244,9245,9246,9247,9248,9249,9250,9251,9252,9253,9254,9255,9256,9257,9258,9259,9260,9261,9262,9263,9264,9265,9266,9267,9268,9269,9270,9271,9272,9273,9274,9275,9276,9277,9278,9279,9280,9281,9282,9283,9284,9285,9286,9287,9288,9289,9290,9291,9292,9293,9294,9295,9296,9297,9298,9299,9300,9301,9302,9303,9304,9305,9306,9307,9308,9309,9310
,9311,9312,9313,9314,9315,9316,9317,9318,9319,9320,9321,9322,9323,9324,9325,9326,9327,9328,9329,9330,9331,9332,9333,9334,9335,9336,9337,9338,9339,9340,9341,9342,9343,9344,9345,9346,9347,9348,9349,9350,9351,9352,9353,9354,9355,9356,9357,9358,9359,9360,9361,9362,9363,9364,9365,9366,9367,9368,9369,9370,9371,9372,9373,9374,9375,9376,9377,9378,9379,9380,9381,9382,9383,9384,9385,9386,9387,9388,9389,9390,9391,9392,9393,9394,9395,9396,9397,9398,9399,9400,9401,9402,9403,9404,9405,9406,9407,9408,9409,9410,9411,9412,9413,9414,9415,9416,9417,9418,9419,9420,9421,9422,9423,9424,9425,9426,9427,9428,9429,9430,9431,9432,9433,9434,9435,9436,9437,9438,9439,9440,9441,9442,9443,9444,9445,9446,9447,9448,9449,9450,9451,9452,9453,9454,9455,9456,9457,9458,9459,9460,9461,9462,9463,9464,9465,9466,9467,9468,9469,9470,9471,9472,9473,9474,9475,9476,9477,9478,9479,9480,9481,9482,9483,9484,9485,9486,9487,9488,9489,9490,9491,9492,9493,9494,9495,9496,9497,9498,9499,9500,9501,9502,9503,9504,9505,9506,9507,9508,9509,9510,9511,9512,9513,9514,9515,9516,9517,9518,9519,9520,9521,9522,9523,9524,9525,9526,9527,9528,9529,9530,9531,9532,9533,9534,9535,9536,9537,9538,9539,9540,9541,9542,9543,9544,9545,9546,9547,9548,9549,9550,9551,9552,9553,9554,9555,9556,9557,9558,9559,9560,9561,9562,9563,9564,9565,9566,9567,9568,9569,9570,9571,9572,9573,9574,9575,9576,9577,9578,9579,9580,9581,9582,9583,9584,9585,9586,9587,9588,9589,9590,9591,9592,9593,9594,9595,9596,9597,9598,9599,9600,9601,9602,9603,9604,9605,9606,9607,9608,9609,9610,9611,9612,9613,9614,9615,9616,9617,9618,9619,9620,9621,9622,9623,9624,9625,9626,9627,9628,9629,9630,9631,9632,9633,9634,9635,9636,9637,9638,9639,9640,9641,9642,9643,9644,9645,9646,9647,9648,9649,9650,9651,9652,9653,9654,9655,9656,9657,9658,9659,9660,9661,9662,9663,9664,9665,9666,9667,9668,9669,9670,9671,9672,9673,9674,9675,9676,9677,9678,9679,9680,9681,9682,9683,9684,9685,9686,9687,9688,9689,9690,9691,9692,9693,9694,9695,9696,9697,9698,9699,9700,9701,9702,9703,9704,9705,9706,9707,9708,9709,9710
,9711,9712,9713,9714,9715,9716,9717,9718,9719,9720,9721,9722,9723,9724,9725,9726,9727,9728,9729,9730,9731,9732,9733,9734,9735,9736,9737,9738,9739,9740,9741,9742,9743,9744,9745,9746,9747,9748,9749,9750,9751,9752,9753,9754,9755,9756,9757,9758,9759,9760,9761,9762,9763,9764,9765,9766,9767,9768,9769,9770,9771,9772,9773,9774,9775,9776,9777,9778,9779,9780,9781,9782,9783,9784,9785,9786,9787,9788,9789,9790,9791,9792,9793,9794,9795,9796,9797,9798,9799,9800,9801,9802,9803,9804,9805,9806,9807,9808,9809,9810,9811,9812,9813,9814,9815,9816,9817,9818,9819,9820,9821,9822,9823,9824,9825,9826,9827,9828,9829,9830,9831,9832,9833,9834,9835,9836,9837,9838,9839,9840,9841,9842,9843,9844,9845,9846,9847,9848,9849,9850,9851,9852,9853,9854,9855,9856,9857,9858,9859,9860,9861,9862,9863,9864,9865,9866,9867,9868,9869,9870,9871,9872,9873,9874,9875,9876,9877,9878,9879,9880,9881,9882,9883,9884,9885,9886,9887,9888,9889,9890,9891,9892,9893,9894,9895,9896,9897,9898,9899,9900,9901,9902,9903,9904,9905,9906,9907,9908,9909,9910,9911,9912,9913,9914,9915,9916,9917,9918,9919,9920,9921,9922,9923,9924,9925,9926,9927,9928,9929,9930,9931,9932,9933,9934,9935,9936,9937,9938,9939,9940,9941,9942,9943,9944,9945,9946,9947,9948,9949,9950,9951,9952,9953,9954,9955,9956,9957,9958,9959,9960,9961,9962,9963,9964,9965,9966,9967,9968,9969,9970,9971,9972,9973,9974,9975,9976,9977,9978,9979,9980,9981,9982,9983,9984,9985,9986,9987,9988,9989,9990,9991,9992,9993,9994,9995,9996,9997,9998,9999,10000,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99,100,101,102,103,104,105,106,107,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,160,161,162,163,164,
165,166,167,168,169,170,171,172,173,174,175,176,177,178,179,180,181,182,183,184,185,186,187,188,189,190,191,192,193,194,195,196,197,198,199,200,201,202,203,204,205,206,207,208,209,210,211,212,213,214,215,216,217,218,219,220,221,222,223,224,225,226,227,228,229,230,231,232,233,234,235,236,237,238,239,240,241,242,243,244,245,246,247,248,249,250,251,252,253,254,255,256,257,258,259,260,261,262,263,264,265,266,267,268,269,270,271,272,273,274,275,276,277,278,279,280,281,282,283,284,285,286,287,288,289,290,291,292,293,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,336,337,338,339,340,341,342,343,344,345,346,347,348,349,350,351,352,353,354,355,356,357,358,359,360,361,362,363,364,365,366,367,368,369,370,371,372,373,374,375,376,377,378,379,380,381,382,383,384,385,386,387,388,389,390,391,392,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,409,410,411,412,413,414,415,416,417,418,419,420,421,422,423,424,425,426,427,428,429,430,431,432,433,434,435,436,437,438,439,440,441,442,443,444,445,446,447,448,449,450,451,452,453,454,455,456,457,458,459,460,461,462,463,464,465,466,467,468,469,470,471,472,473,474,475,476,477,478,479,480,481,482,483,484,485,486,487,488,489,490,491,492,493,494,495,496,497,498,499,500,501,502,503,504,505,506,507,508,509,510,511,512,513,514,515,516,517,518,519,520,521,522,523,524,525,526,527,528,529,530,531,532,533,534,535,536,537,538,539,540,541,542,543,544,545,546,547,548,549,550,551,552,553,554,555,556,557,558,559,560,561,562,563,564,565,566,567,568,569,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,595,596,597,598,599,600,601,602,603,604,605,606,607,608,609,610,611,612,613,614,615,616,617,618,619,620,621,622,623,624,625,626,627,628,629,630,631,632,633,634,635,636,637,638,639,640,641,642,643,644,645,646,647,648,649,650,651,652,653,654,655,656,657,658,659,660,661,662,663,664,
665,666,667,668,669,670,671,672,673,674,675,676,677,678,679,680,681,682,683,684,685,686,687,688,689,690,691,692,693,694,695,696,697,698,699,700,701,702,703,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,731,732,733,734,735,736,737,738,739,740,741,742,743,744,745,746,747,748,749,750,751,752,753,754,755,756,757,758,759,760,761,762,763,764,765,766,767,768,769,770,771,772,773,774,775,776,777,778,779,780,781,782,783,784,785,786,787,788,789,790,791,792,793,794,795,796,797,798,799,800,801,802,803,804,805,806,807,808,809,810,811,812,813,814,815,816,817,818,819,820,821,822,823,824,825,826,827,828,829,830,831,832,833,834,835,836,837,838,839,840,841,842,843,844,845,846,847,848,849,850,851,852,853,854,855,856,857,858,859,860,861,862,863,864,865,866,867,868,869,870,871,872,873,874,875,876,877,878,879,880,881,882,883,884,885,886,887,888,889,890,891,892,893,894,895,896,897,898,899,900,901,902,903,904,905,906,907,908,909,910,911,912,913,914,915,916,917,918,919,920,921,922,923,924,925,926,927,928,929,930,931,932,933,934,935,936,937,938,939,940,941,942,943,944,945,946,947,948,949,950,951,952,953,954,955,956,957,958,959,960,961,962,963,964,965,966,967,968,969,970,971,972,973,974,975,976,977,978,979,980,981,982,983,984,985,986,987,988,989,990,991,992,993,994,995,996,997,998,999,1000,1001,1002,1003,1004,1005,1006,1007,1008,1009,1010,1011,1012,1013,1014,1015,1016,1017,1018,1019,1020,1021,1022,1023,1024,1025,1026,1027,1028,1029,1030,1031,1032,1033,1034,1035,1036,1037,1038,1039,1040,1041,1042,1043,1044,1045,1046,1047,1048,1049,1050,1051,1052,1053,1054,1055,1056,1057,1058,1059,1060,1061,1062,1063,1064,1065,1066,1067,1068,1069,1070,1071,1072,1073,1074,1075,1076,1077,1078,1079,1080,1081,1082,1083,1084,1085,1086,1087,1088,1089,1090,1091,1092,1093,1094,1095,1096,1097,1098,1099,1100,1101,1102,1103,1104,1105,1106,1107,1108,1109,1110,1111,1112,1113,1114,1115,1116,1117,1118,1119,1120,1121,1122,1123,1124,1125,1126,1127,1128,1129,1130,1131,
1132,1133,1134,1135,1136,1137,1138,1139,1140,1141,1142,1143,1144,1145,1146,1147,1148,1149,1150,1151,1152,1153,1154,1155,1156,1157,1158,1159,1160,1161,1162,1163,1164,1165,1166,1167,1168,1169,1170,1171,1172,1173,1174,1175,1176,1177,1178,1179,1180,1181,1182,1183,1184,1185,1186,1187,1188,1189,1190,1191,1192,1193,1194,1195,1196,1197,1198,1199,1200,1201,1202,1203,1204,1205,1206,1207,1208,1209,1210,1211,1212,1213,1214,1215,1216,1217,1218,1219,1220,1221,1222,1223,1224,1225,1226,1227,1228,1229,1230,1231,1232,1233,1234,1235,1236,1237,1238,1239,1240,1241,1242,1243,1244,1245,1246,1247,1248,1249,1250,1251,1252,1253,1254,1255,1256,1257,1258,1259,1260,1261,1262,1263,1264,1265,1266,1267,1268,1269,1270,1271,1272,1273,1274,1275,1276,1277,1278,1279,1280,1281,1282,1283,1284,1285,1286,1287,1288,1289,1290,1291,1292,1293,1294,1295,1296,1297,1298,1299,1300,1301,1302,1303,1304,1305,1306,1307,1308,1309,1310,1311,1312,1313,1314,1315,1316,1317,1318,1319,1320,1321,1322,1323,1324,1325,1326,1327,1328,1329,1330,1331,1332,1333,1334,1335,1336,1337,1338,1339,1340,1341,1342,1343,1344,1345,1346,1347,1348,1349,1350,1351,1352,1353,1354,1355,1356,1357,1358,1359,1360,1361,1362,1363,1364,1365,1366,1367,1368,1369,1370,1371,1372,1373,1374,1375,1376,1377,1378,1379,1380,1381,1382,1383,1384,1385,1386,1387,1388,1389,1390,1391,1392,1393,1394,1395,1396,1397,1398,1399,1400,1401,1402,1403,1404,1405,1406,1407,1408,1409,1410,1411,1412,1413,1414,1415,1416,1417,1418,1419,1420,1421,1422,1423,1424,1425,1426,1427,1428,1429,1430,1431,1432,1433,1434,1435,1436,1437,1438,1439,1440,1441,1442,1443,1444,1445,1446,1447,1448,1449,1450,1451,1452,1453,1454,1455,1456,1457,1458,1459,1460,1461,1462,1463,1464,1465,1466,1467,1468,1469,1470,1471,1472,1473,1474,1475,1476,1477,1478,1479,1480,1481,1482,1483,1484,1485,1486,1487,1488,1489,1490,1491,1492,1493,1494,1495,1496,1497,1498,1499,1500,1501,1502,1503,1504,1505,1506,1507,1508,1509,1510,1511,1512,1513,1514,1515,1516,1517,1518,1519,1520,1521,1522,1523,1524,1525,1526,1527,1528,1529,1530,1531,
1532,1533,1534,1535,1536,1537,1538,1539,1540,1541,1542,1543,1544,1545,1546,1547,1548,1549,1550,1551,1552,1553,1554,1555,1556,1557,1558,1559,1560,1561,1562,1563,1564,1565,1566,1567,1568,1569,1570,1571,1572,1573,1574,1575,1576,1577,1578,1579,1580,1581,1582,1583,1584,1585,1586,1587,1588,1589,1590,1591,1592,1593,1594,1595,1596,1597,1598,1599,1600,1601,1602,1603,1604,1605,1606,1607,1608,1609,1610,1611,1612,1613,1614,1615,1616,1617,1618,1619,1620,1621,1622,1623,1624,1625,1626,1627,1628,1629,1630,1631,1632,1633,1634,1635,1636,1637,1638,1639,1640,1641,1642,1643,1644,1645,1646,1647,1648,1649,1650,1651,1652,1653,1654,1655,1656,1657,1658,1659,1660,1661,1662,1663,1664,1665,1666,1667,1668,1669,1670,1671,1672,1673,1674,1675,1676,1677,1678,1679,1680,1681,1682,1683,1684,1685,1686,1687,1688,1689,1690,1691,1692,1693,1694,1695,1696,1697,1698,1699,1700,1701,1702,1703,1704,1705,1706,1707,1708,1709,1710,1711,1712,1713,1714,1715,1716,1717,1718,1719,1720,1721,1722,1723,1724,1725,1726,1727,1728,1729,1730,1731,1732,1733,1734,1735,1736,1737,1738,1739,1740,1741,1742,1743,1744,1745,1746,1747,1748,1749,1750,1751,1752,1753,1754,1755,1756,1757,1758,1759,1760,1761,1762,1763,1764,1765,1766,1767,1768,1769,1770,1771,1772,1773,1774,1775,1776,1777,1778,1779,1780,1781,1782,1783,1784,1785,1786,1787,1788,1789,1790,1791,1792,1793,1794,1795,1796,1797,1798,1799,1800,1801,1802,1803,1804,1805,1806,1807,1808,1809,1810,1811,1812,1813,1814,1815,1816,1817,1818,1819,1820,1821,1822,1823,1824,1825,1826,1827,1828,1829,1830,1831,1832,1833,1834,1835,1836,1837,1838,1839,1840,1841,1842,1843,1844,1845,1846,1847,1848,1849,1850,1851,1852,1853,1854,1855,1856,1857,1858,1859,1860,1861,1862,1863,1864,1865,1866,1867,1868,1869,1870,1871,1872,1873,1874,1875,1876,1877,1878,1879,1880,1881,1882,1883,1884,1885,1886,1887,1888,1889,1890,1891,1892,1893,1894,1895,1896,1897,1898,1899,1900,1901,1902,1903,1904,1905,1906,1907,1908,1909,1910,1911,1912,1913,1914,1915,1916,1917,1918,1919,1920,1921,1922,1923,1924,1925,1926,1927,1928,1929,1930,1931,
1932,1933,1934,1935,1936,1937,1938,1939,1940,1941,1942,1943,1944,1945,1946,1947,1948,1949,1950,1951,1952,1953,1954,1955,1956,1957,1958,1959,1960,1961,1962,1963,1964,1965,1966,1967,1968,1969,1970,1971,1972,1973,1974,1975,1976,1977,1978,1979,1980,1981,1982,1983,1984,1985,1986,1987,1988,1989,1990,1991,1992,1993,1994,1995,1996,1997,1998,1999,2000,2001,2002,2003,2004,2005,2006,2007,2008,2009,2010,2011,2012,2013,2014,2015,2016,2017,2018,2019,2020,2021,2022,2023,2024,2025,2026,2027,2028,2029,2030,2031,2032,2033,2034,2035,2036,2037,2038,2039,2040,2041,2042,2043,2044,2045,2046,2047,2048,2049,2050,2051,2052,2053,2054,2055,2056,2057,2058,2059,2060,2061,2062,2063,2064,2065,2066,2067,2068,2069,2070,2071,2072,2073,2074,2075,2076,2077,2078,2079,2080,2081,2082,2083,2084,2085,2086,2087,2088,2089,2090,2091,2092,2093,2094,2095,2096,2097,2098,2099,2100,2101,2102,2103,2104,2105,2106,2107,2108,2109,2110,2111,2112,2113,2114,2115,2116,2117,2118,2119,2120,2121,2122,2123,2124,2125,2126,2127,2128,2129,2130,2131,2132,2133,2134,2135,2136,2137,2138,2139,2140,2141,2142,2143,2144,2145,2146,2147,2148,2149,2150,2151,2152,2153,2154,2155,2156,2157,2158,2159,2160,2161,2162,2163,2164,2165,2166,2167,2168,2169,2170,2171,2172,2173,2174,2175,2176,2177,2178,2179,2180,2181,2182,2183,2184,2185,2186,2187,2188,2189,2190,2191,2192,2193,2194,2195,2196,2197,2198,2199,2200,2201,2202,2203,2204,2205,2206,2207,2208,2209,2210,2211,2212,2213,2214,2215,2216,2217,2218,2219,2220,2221,2222,2223,2224,2225,2226,2227,2228,2229,2230,2231,2232,2233,2234,2235,2236,2237,2238,2239,2240,2241,2242,2243,2244,2245,2246,2247,2248,2249,2250,2251,2252,2253,2254,2255,2256,2257,2258,2259,2260,2261,2262,2263,2264,2265,2266,2267,2268,2269,2270,2271,2272,2273,2274,2275,2276,2277,2278,2279,2280,2281,2282,2283,2284,2285,2286,2287,2288,2289,2290,2291,2292,2293,2294,2295,2296,2297,2298,2299,2300,2301,2302,2303,2304,2305,2306,2307,2308,2309,2310,2311,2312,2313,2314,2315,2316,2317,2318,2319,2320,2321,2322,2323,2324,2325,2326,2327,2328,2329,2330,2331,
2332,2333,2334,2335,2336,2337,2338,2339,2340,2341,2342,2343,2344,2345,2346,2347,2348,2349,2350,2351,2352,2353,2354,2355,2356,2357,2358,2359,2360,2361,2362,2363,2364,2365,2366,2367,2368,2369,2370,2371,2372,2373,2374,2375,2376,2377,2378,2379,2380,2381,2382,2383,2384,2385,2386,2387,2388,2389,2390,2391,2392,2393,2394,2395,2396,2397,2398,2399,2400,2401,2402,2403,2404,2405,2406,2407,2408,2409,2410,2411,2412,2413,2414,2415,2416,2417,2418,2419,2420,2421,2422,2423,2424,2425,2426,2427,2428,2429,2430,2431,2432,2433,2434,2435,2436,2437,2438,2439,2440,2441,2442,2443,2444,2445,2446,2447,2448,2449,2450,2451,2452,2453,2454,2455,2456,2457,2458,2459,2460,2461,2462,2463,2464,2465,2466,2467,2468,2469,2470,2471,2472,2473,2474,2475,2476,2477,2478,2479,2480,2481,2482,2483,2484,2485,2486,2487,2488,2489,2490,2491,2492,2493,2494,2495,2496,2497,2498,2499,2500,2501,2502,2503,2504,2505,2506,2507,2508,2509,2510,2511,2512,2513,2514,2515,2516,2517,2518,2519,2520,2521,2522,2523,2524,2525,2526,2527,2528,2529,2530,2531,2532,2533,2534,2535,2536,2537,2538,2539,2540,2541,2542,2543,2544,2545,2546,2547,2548,2549,2550,2551,2552,2553,2554,2555,2556,2557,2558,2559,2560,2561,2562,2563,2564,2565,2566,2567,2568,2569,2570,2571,2572,2573,2574,2575,2576,2577,2578,2579,2580,2581,2582,2583,2584,2585,2586,2587,2588,2589,2590,2591,2592,2593,2594,2595,2596,2597,2598,2599,2600,2601,2602,2603,2604,2605,2606,2607,2608,2609,2610,2611,2612,2613,2614,2615,2616,2617,2618,2619,2620,2621,2622,2623,2624,2625,2626,2627,2628,2629,2630,2631,2632,2633,2634,2635,2636,2637,2638,2639,2640,2641,2642,2643,2644,2645,2646,2647,2648,2649,2650,2651,2652,2653,2654,2655,2656,2657,2658,2659,2660,2661,2662,2663,2664,2665,2666,2667,2668,2669,2670,2671,2672,2673,2674,2675,2676,2677,2678,2679,2680,2681,2682,2683,2684,2685,2686,2687,2688,2689,2690,2691,2692,2693,2694,2695,2696,2697,2698,2699,2700,2701,2702,2703,2704,2705,2706,2707,2708,2709,2710,2711,2712,2713,2714,2715,2716,2717,2718,2719,2720,2721,2722,2723,2724,2725,2726,2727,2728,2729,2730,2731,
2732,2733,2734,2735,2736,2737,2738,2739,2740,2741,2742,2743,2744,2745,2746,2747,2748,2749,2750,2751,2752,2753,2754,2755,2756,2757,2758,2759,2760,2761,2762,2763,2764,2765,2766,2767,2768,2769,2770,2771,2772,2773,2774,2775,2776,2777,2778,2779,2780,2781,2782,2783,2784,2785,2786,2787,2788,2789,2790,2791,2792,2793,2794,2795,2796,2797,2798,2799,2800,2801,2802,2803,2804,2805,2806,2807,2808,2809,2810,2811,2812,2813,2814,2815,2816,2817,2818,2819,2820,2821,2822,2823,2824,2825,2826,2827,2828,2829,2830,2831,2832,2833,2834,2835,2836,2837,2838,2839,2840,2841,2842,2843,2844,2845,2846,2847,2848,2849,2850,2851,2852,2853,2854,2855,2856,2857,2858,2859,2860,2861,2862,2863,2864,2865,2866,2867,2868,2869,2870,2871,2872,2873,2874,2875,2876,2877,2878,2879,2880,2881,2882,2883,2884,2885,2886,2887,2888,2889,2890,2891,2892,2893,2894,2895,2896,2897,2898,2899,2900,2901,2902,2903,2904,2905,2906,2907,2908,2909,2910,2911,2912,2913,2914,2915,2916,2917,2918,2919,2920,2921,2922,2923,2924,2925,2926,2927,2928,2929,2930,2931,2932,2933,2934,2935,2936,2937,2938,2939,2940,2941,2942,2943,2944,2945,2946,2947,2948,2949,2950,2951,2952,2953,2954,2955,2956,2957,2958,2959,2960,2961,2962,2963,2964,2965,2966,2967,2968,2969,2970,2971,2972,2973,2974,2975,2976,2977,2978,2979,2980,2981,2982,2983,2984,2985,2986,2987,2988,2989,2990,2991,2992,2993,2994,2995,2996,2997,2998,2999,3000,3001,3002,3003,3004,3005,3006,3007,3008,3009,3010,3011,3012,3013,3014,3015,3016,3017,3018,3019,3020,3021,3022,3023,3024,3025,3026,3027,3028,3029,3030,3031,3032,3033,3034,3035,3036,3037,3038,3039,3040,3041,3042,3043,3044,3045,3046,3047,3048,3049,3050,3051,3052,3053,3054,3055,3056,3057,3058,3059,3060,3061,3062,3063,3064,3065,3066,3067,3068,3069,3070,3071,3072,3073,3074,3075,3076,3077,3078,3079,3080,3081,3082,3083,3084,3085,3086,3087,3088,3089,3090,3091,3092,3093,3094,3095,3096,3097,3098,3099,3100,3101,3102,3103,3104,3105,3106,3107,3108,3109,3110,3111,3112,3113,3114,3115,3116,3117,3118,3119,3120,3121,3122,3123,3124,3125,3126,3127,3128,3129,3130,3131,
3132,3133,3134,3135,3136,3137,3138,3139,3140,3141,3142,3143,3144,3145,3146,3147,3148,3149,3150,3151,3152,3153,3154,3155,3156,3157,3158,3159,3160,3161,3162,3163,3164,3165,3166,3167,3168,3169,3170,3171,3172,3173,3174,3175,3176,3177,3178,3179,3180,3181,3182,3183,3184,3185,3186,3187,3188,3189,3190,3191,3192,3193,3194,3195,3196,3197,3198,3199,3200,3201,3202,3203,3204,3205,3206,3207,3208,3209,3210,3211,3212,3213,3214,3215,3216,3217,3218,3219,3220,3221,3222,3223,3224,3225,3226,3227,3228,3229,3230,3231,3232,3233,3234,3235,3236,3237,3238,3239,3240,3241,3242,3243,3244,3245,3246,3247,3248,3249,3250,3251,3252,3253,3254,3255,3256,3257,3258,3259,3260,3261,3262,3263,3264,3265,3266,3267,3268,3269,3270,3271,3272,3273,3274,3275,3276,3277,3278,3279,3280,3281,3282,3283,3284,3285,3286,3287,3288,3289,3290,3291,3292,3293,3294,3295,3296,3297,3298,3299,3300,3301,3302,3303,3304,3305,3306,3307,3308,3309,3310])) | 3,787.923077 | 48,927 | 0.793087 | 20,097 | 98,486 | 3.886152 | 0.498681 | 0.000179 | 0.001536 | 0.000205 | 0.996453 | 0.996389 | 0.996005 | 0.996005 | 0.996005 | 0.996005 | 0 | 0.792213 | 0.002549 | 98,486 | 26 | 48,928 | 3,787.923077 | 0.00282 | 0.000558 | 0 | 0 | 0 | 0 | 0.000081 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047619 | false | 0 | 0 | 0 | 0.142857 | 0.190476 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
bc48594f37c1b363dec2e401c69c81155fae9686 | 165 | py | Python | statistical_clear_sky/__init__.py | elsirdavid/StatisticalClearSky | bc3aa9de56a9347c10e2afe23af486d32d476273 | [
"BSD-2-Clause"
] | 16 | 2019-05-09T14:17:22.000Z | 2022-02-23T18:41:13.000Z | statistical_clear_sky/__init__.py | elsirdavid/StatisticalClearSky | bc3aa9de56a9347c10e2afe23af486d32d476273 | [
"BSD-2-Clause"
] | 7 | 2019-07-09T18:32:29.000Z | 2021-07-01T22:28:32.000Z | statistical_clear_sky/__init__.py | elsirdavid/StatisticalClearSky | bc3aa9de56a9347c10e2afe23af486d32d476273 | [
"BSD-2-Clause"
] | 4 | 2019-12-20T19:15:09.000Z | 2021-04-29T17:40:40.000Z | from statistical_clear_sky.algorithm.iterative_fitting import IterativeFitting
from statistical_clear_sky.algorithm.iterative_fitting import IterativeFitting as SCSF | 82.5 | 86 | 0.921212 | 20 | 165 | 7.3 | 0.55 | 0.205479 | 0.273973 | 0.315068 | 0.958904 | 0.958904 | 0.958904 | 0.958904 | 0.958904 | 0 | 0 | 0 | 0.054545 | 165 | 2 | 86 | 82.5 | 0.935897 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 13 |
bc97d6789d127aca548cac6d4e81f99de434a140 | 120 | py | Python | omni_reports/client/__init__.py | paretogroup/omni-reports | 563febd7044cdf988d704019ae5fd114cfd824d3 | [
"MIT"
] | 24 | 2020-09-09T20:57:36.000Z | 2022-03-13T17:32:41.000Z | omni_reports/client/__init__.py | paretogroup/omni-reports | 563febd7044cdf988d704019ae5fd114cfd824d3 | [
"MIT"
] | 13 | 2020-09-01T19:34:24.000Z | 2021-03-31T19:58:53.000Z | omni_reports/client/__init__.py | paretogroup/omni-reports | 563febd7044cdf988d704019ae5fd114cfd824d3 | [
"MIT"
] | null | null | null | from omni_reports.client.client import ReportClient
from omni_reports.client.resolvers import ReportTypeResolverBuilder
| 40 | 67 | 0.9 | 14 | 120 | 7.571429 | 0.571429 | 0.150943 | 0.283019 | 0.396226 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 120 | 2 | 68 | 60 | 0.946429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
bcae3adeb4199a3d5f1290bb611e76a9fb418cf1 | 154 | py | Python | backend/api/app/tests/unit/games/__init__.py | jhouser/houseoffun | a5a9dab377864d4da15f7ba64b505d2db3af34ef | [
"MIT"
] | null | null | null | backend/api/app/tests/unit/games/__init__.py | jhouser/houseoffun | a5a9dab377864d4da15f7ba64b505d2db3af34ef | [
"MIT"
] | 3 | 2018-03-31T09:52:03.000Z | 2018-08-16T18:12:51.000Z | backend/api/app/tests/unit/games/__init__.py | jhouser/houseoffun | a5a9dab377864d4da15f7ba64b505d2db3af34ef | [
"MIT"
] | 1 | 2018-03-21T16:05:36.000Z | 2018-03-21T16:05:36.000Z | from api.app.tests.unit.games.TestGames import *
from api.app.tests.unit.games.TestSignups import *
from api.app.tests.unit.games.TestCharacters import *
| 38.5 | 53 | 0.805195 | 24 | 154 | 5.166667 | 0.416667 | 0.169355 | 0.241935 | 0.362903 | 0.677419 | 0.677419 | 0.483871 | 0 | 0 | 0 | 0 | 0 | 0.077922 | 154 | 3 | 54 | 51.333333 | 0.873239 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
bcb12c70b5816e931161b4e66f323a935ce347a6 | 5,634 | py | Python | tests/components/google_assistant/__init__.py | patrickeasters/home-assistant | 1d579587c1d6605f144fb98f23935d7a40993660 | [
"Apache-2.0"
] | 1 | 2018-12-01T02:09:25.000Z | 2018-12-01T02:09:25.000Z | tests/components/google_assistant/__init__.py | patrickeasters/home-assistant | 1d579587c1d6605f144fb98f23935d7a40993660 | [
"Apache-2.0"
] | null | null | null | tests/components/google_assistant/__init__.py | patrickeasters/home-assistant | 1d579587c1d6605f144fb98f23935d7a40993660 | [
"Apache-2.0"
] | 2 | 2018-11-09T13:10:17.000Z | 2019-01-08T11:44:08.000Z | """Tests for the Google Assistant integration."""
DEMO_DEVICES = [{
'id':
'light.kitchen_lights',
'name': {
'name': 'Kitchen Lights'
},
'traits': [
'action.devices.traits.OnOff', 'action.devices.traits.Brightness',
'action.devices.traits.ColorSpectrum',
'action.devices.traits.ColorTemperature'
],
'type':
'action.devices.types.LIGHT',
'willReportState':
False
}, {
'id':
'switch.ac',
'name': {
'name': 'AC'
},
'traits': [
'action.devices.traits.OnOff'
],
'type': 'action.devices.types.SWITCH',
'willReportState':
False
}, {
'id':
'switch.decorative_lights',
'name': {
'name': 'Decorative Lights'
},
'traits': [
'action.devices.traits.OnOff'
],
'type': 'action.devices.types.LIGHT', # This is used for custom type
'willReportState':
False
}, {
'id':
'light.ceiling_lights',
'name': {
'name': 'Roof Lights',
'nicknames': ['top lights', 'ceiling lights']
},
'traits': [
'action.devices.traits.OnOff', 'action.devices.traits.Brightness',
'action.devices.traits.ColorSpectrum',
'action.devices.traits.ColorTemperature'
],
'type':
'action.devices.types.LIGHT',
'willReportState':
False
}, {
'id':
'light.bed_light',
'name': {
'name': 'Bed Light'
},
'traits': [
'action.devices.traits.OnOff', 'action.devices.traits.Brightness',
'action.devices.traits.ColorSpectrum',
'action.devices.traits.ColorTemperature'
],
'type':
'action.devices.types.LIGHT',
'willReportState':
False
}, {
'id': 'group.all_lights',
'name': {
'name': 'all lights'
},
'traits': ['action.devices.traits.OnOff'],
'type': 'action.devices.types.SWITCH',
'willReportState': False
}, {
'id': 'group.all_switches',
'name': {
'name': 'all switches'
},
'traits': ['action.devices.traits.OnOff'],
'type': 'action.devices.types.SWITCH',
'willReportState': False
}, {
'id':
'cover.living_room_window',
'name': {
'name': 'Living Room Window'
},
'traits':
['action.devices.traits.OnOff', 'action.devices.traits.Brightness'],
'type':
'action.devices.types.LIGHT',
'willReportState':
False
}, {
'id':
'cover.hall_window',
'name': {
'name': 'Hall Window'
},
'traits':
['action.devices.traits.OnOff', 'action.devices.traits.Brightness'],
'type':
'action.devices.types.LIGHT',
'willReportState':
False
}, {
'id': 'cover.garage_door',
'name': {
'name': 'Garage Door'
},
'traits': ['action.devices.traits.OnOff'],
'type': 'action.devices.types.LIGHT',
'willReportState': False
}, {
'id': 'cover.kitchen_window',
'name': {
'name': 'Kitchen Window'
},
'traits': ['action.devices.traits.OnOff'],
'type': 'action.devices.types.LIGHT',
'willReportState': False
}, {
'id': 'group.all_covers',
'name': {
'name': 'all covers'
},
'traits': ['action.devices.traits.OnOff'],
'type': 'action.devices.types.SWITCH',
'willReportState': False
}, {
'id':
'media_player.bedroom',
'name': {
'name': 'Bedroom'
},
'traits':
['action.devices.traits.OnOff', 'action.devices.traits.Brightness'],
'type':
'action.devices.types.LIGHT',
'willReportState':
False
}, {
'id':
'media_player.living_room',
'name': {
'name': 'Living Room'
},
'traits':
['action.devices.traits.OnOff', 'action.devices.traits.Brightness'],
'type':
'action.devices.types.LIGHT',
'willReportState':
False
}, {
'id': 'media_player.lounge_room',
'name': {
'name': 'Lounge room'
},
'traits': ['action.devices.traits.OnOff'],
'type': 'action.devices.types.LIGHT',
'willReportState': False
}, {
'id':
'media_player.walkman',
'name': {
'name': 'Walkman'
},
'traits':
['action.devices.traits.OnOff', 'action.devices.traits.Brightness'],
'type':
'action.devices.types.LIGHT',
'willReportState':
False
}, {
'id': 'fan.living_room_fan',
'name': {
'name': 'Living Room Fan'
},
'traits': ['action.devices.traits.OnOff'],
'type': 'action.devices.types.SWITCH',
'willReportState': False
}, {
'id': 'fan.ceiling_fan',
'name': {
'name': 'Ceiling Fan'
},
'traits': ['action.devices.traits.OnOff'],
'type': 'action.devices.types.SWITCH',
'willReportState': False
}, {
'id': 'group.all_fans',
'name': {
'name': 'all fans'
},
'traits': ['action.devices.traits.OnOff'],
'type': 'action.devices.types.SWITCH',
'willReportState': False
}, {
'id': 'climate.hvac',
'name': {
'name': 'Hvac'
},
'traits': ['action.devices.traits.TemperatureSetting'],
'type': 'action.devices.types.THERMOSTAT',
'willReportState': False,
'attributes': {
'availableThermostatModes': 'heat,cool,off',
'thermostatTemperatureUnit': 'C',
},
}, {
'id': 'climate.heatpump',
'name': {
'name': 'HeatPump'
},
'traits': ['action.devices.traits.TemperatureSetting'],
'type': 'action.devices.types.THERMOSTAT',
'willReportState': False
}, {
'id': 'climate.ecobee',
'name': {
'name': 'Ecobee'
},
'traits': ['action.devices.traits.TemperatureSetting'],
'type': 'action.devices.types.THERMOSTAT',
'willReportState': False
}]
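The `DEMO_DEVICES` list above is a Google Assistant SYNC-payload fixture. A small, hypothetical helper (not part of the Home Assistant test suite; `devices_with_trait` and the trimmed `demo` list are illustration only) shows how such entries can be filtered by trait when writing assertions against the payload:

```python
# Hypothetical helper for filtering a SYNC payload by trait; the entries below
# are a trimmed copy of the DEMO_DEVICES shape, not the full fixture.
def devices_with_trait(devices, trait):
    return [d["id"] for d in devices if trait in d["traits"]]

demo = [
    {"id": "light.kitchen_lights",
     "traits": ["action.devices.traits.OnOff", "action.devices.traits.Brightness"]},
    {"id": "switch.ac", "traits": ["action.devices.traits.OnOff"]},
]

print(devices_with_trait(demo, "action.devices.traits.Brightness"))
# → ['light.kitchen_lights']
```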
| 24.076923 | 74 | 0.561413 | 518 | 5,634 | 6.061776 | 0.127413 | 0.240127 | 0.217834 | 0.175159 | 0.748726 | 0.745223 | 0.740764 | 0.740764 | 0.740764 | 0.739172 | 0 | 0 | 0.244409 | 5,634 | 233 | 75 | 24.180258 | 0.737609 | 0.012957 | 0 | 0.679654 | 0 | 0 | 0.575878 | 0.332493 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
bcbd8795aed17d8229a3b80d16f839e87d2751f0 | 209 | py | Python | Novelty_utils.py | mayank18garg/novelty-detection-dataset-generator | d0025cf6fda057f1ff5a404f214d92585cbafa29 | [
"MIT"
] | null | null | null | Novelty_utils.py | mayank18garg/novelty-detection-dataset-generator | d0025cf6fda057f1ff5a404f214d92585cbafa29 | [
"MIT"
] | null | null | null | Novelty_utils.py | mayank18garg/novelty-detection-dataset-generator | d0025cf6fda057f1ff5a404f214d92585cbafa29 | [
"MIT"
] | 1 | 2022-02-18T07:42:04.000Z | 2022-02-18T07:42:04.000Z | import pandas as pd
novelty_mapping = {"buy_discount_property":"buy_property", "buy_property_right":"buy_property", "buy_property_rightplayers":"buy_property", "improvePropertyRed_novelty":"improve_property"} | 69.666667 | 188 | 0.827751 | 25 | 209 | 6.44 | 0.52 | 0.341615 | 0.354037 | 0.273292 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.047847 | 209 | 3 | 188 | 69.666667 | 0.809045 | 0 | 0 | 0 | 0 | 0 | 0.67619 | 0.342857 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
bcf639d97556f406b6af8c895225dbe1744749f6 | 12,172 | py | Python | PINNTraining/easy_pde_test/equations.py | ls2716/PIV_PINN_data_extraction | 198754c8adeed92eea52e9904a39e993bc475ada | [
"MIT"
] | 2 | 2021-11-19T07:01:08.000Z | 2022-01-09T15:30:18.000Z | PINNTraining/easy_pde_test/equations.py | ls2716/PIV_PINN_data_extraction | 198754c8adeed92eea52e9904a39e993bc475ada | [
"MIT"
] | null | null | null | PINNTraining/easy_pde_test/equations.py | ls2716/PIV_PINN_data_extraction | 198754c8adeed92eea52e9904a39e993bc475ada | [
"MIT"
] | null | null | null | import deepxde as dde
def easy_eq(slope):
def pde(X, V):
u = V[:, 0:1]
# v = V[:, 1:2]
# p = V[:,2:3]
du_x = dde.grad.jacobian(V, X, i=0, j=0)
# dv_y = dde.grad.jacobian(V, X, i=1, j=1)
# du_y = dde.grad.jacobian(V, X, i=0, j=1)
# dv_x = dde.grad.jacobian(V, X, i=1, j=0)
# dp_x = dde.grad.jacobian(V, X, i=2, j=0)
# dp_y = dde.grad.jacobian(V, X, i=2, j=1)
# du_xx = dde.grad.hessian(u, X, i=0, j=0)
# dv_xx = dde.grad.hessian(v, X, i=0, j=0)
# du_yy = dde.grad.hessian(u, X, i=1, j=1)
# dv_yy = dde.grad.hessian(v, X, i=1, j=1)
return [
            du_x - slope
]
return pde
def NS2D(Rey):
def pde(X, V):
u = V[:, 0:1]
v = V[:, 1:2]
# p = V[:,2:3]
du_x = dde.grad.jacobian(V, X, i=0, j=0)
dv_y = dde.grad.jacobian(V, X, i=1, j=1)
du_y = dde.grad.jacobian(V, X, i=0, j=1)
dv_x = dde.grad.jacobian(V, X, i=1, j=0)
dp_x = dde.grad.jacobian(V, X, i=2, j=0)
dp_y = dde.grad.jacobian(V, X, i=2, j=1)
du_xx = dde.grad.hessian(u, X, i=0, j=0)
dv_xx = dde.grad.hessian(v, X, i=0, j=0)
du_yy = dde.grad.hessian(u, X, i=1, j=1)
dv_yy = dde.grad.hessian(v, X, i=1, j=1)
return [
du_x + dv_y,
u * du_x + v * du_y + dp_x - 1.0 / Rey * (du_xx + du_yy),
u * dv_x + v * dv_y + dp_y - 1.0 / Rey * (dv_xx + dv_yy),
]
return pde
def RANS2D(Rey):
def pde(X, V):
u = V[:, 0:1]
v = V[:, 1:2]
# p = V[:,2:3]
du_x = dde.grad.jacobian(V, X, i=0, j=0)
dv_y = dde.grad.jacobian(V, X, i=1, j=1)
du_y = dde.grad.jacobian(V, X, i=0, j=1)
dv_x = dde.grad.jacobian(V, X, i=1, j=0)
dp_x = dde.grad.jacobian(V, X, i=2, j=0)
dp_y = dde.grad.jacobian(V, X, i=2, j=1)
du_xx = dde.grad.hessian(u, X, i=0, j=0)
dv_xx = dde.grad.hessian(v, X, i=0, j=0)
du_yy = dde.grad.hessian(u, X, i=1, j=1)
dv_yy = dde.grad.hessian(v, X, i=1, j=1)
# Reynolds stresses
duu_x = dde.grad.jacobian(V, X, i=3, j=0)
duv_x = dde.grad.jacobian(V, X, i=5, j=0)
dvv_y = dde.grad.jacobian(V, X, i=4, j=1)
duv_y = dde.grad.jacobian(V, X, i=5, j=1)
return [
du_x + dv_y,
u * du_x + v * du_y + dp_x - 1.0 / Rey * (du_xx + du_yy) - duu_x - duv_x,
u * dv_x + v * dv_y + dp_y - 1.0 / Rey * (dv_xx + dv_yy) - dvv_y - duv_y,
]
return pde
def RANSf2D(Rey):
def pde(X, V):
u = V[:, 0:1]
v = V[:, 1:2]
p = V[:, 2:3]
fsx = V[:, 3:4]
fsy = V[:, 4:5]
psi = V[:, 5:6]
ppsi = p+psi
du_x = dde.grad.jacobian(V, X, i=0, j=0)
dv_y = dde.grad.jacobian(V, X, i=1, j=1)
du_y = dde.grad.jacobian(V, X, i=0, j=1)
dv_x = dde.grad.jacobian(V, X, i=1, j=0)
dppsi_x = dde.grad.jacobian(ppsi[:,None], X, i=0, j=0)
dppsi_y = dde.grad.jacobian(ppsi[:,None], X, i=0, j=1)
du_xx = dde.grad.hessian(u, X, i=0, j=0)
dv_xx = dde.grad.hessian(v, X, i=0, j=0)
du_yy = dde.grad.hessian(u, X, i=1, j=1)
dv_yy = dde.grad.hessian(v, X, i=1, j=1)
dfsx_x = dde.grad.jacobian(V, X, i=3, j=0)
dfsy_y = dde.grad.jacobian(V, X, i=4, j=1)
return [
du_x + dv_y,
u * du_x + v * du_y + dppsi_x - 1.0 / Rey * (du_xx + du_yy) + fsx,
u * dv_x + v * dv_y + dppsi_y - 1.0 / Rey * (dv_xx + dv_yy) + fsy,
dfsx_x + dfsy_y
]
return pde
def RANSf02D(Rey):
def pde(X, V):
u = V[:, 0:1]
v = V[:, 1:2]
p = V[:, 2:3]
fsx = V[:, 3:4]
fsy = V[:, 4:5]
# p = V[:,2:3]
du_x = dde.grad.jacobian(V, X, i=0, j=0)
dv_y = dde.grad.jacobian(V, X, i=1, j=1)
du_y = dde.grad.jacobian(V, X, i=0, j=1)
dv_x = dde.grad.jacobian(V, X, i=1, j=0)
dp_x = dde.grad.jacobian(V, X, i=2, j=0)
dp_y = dde.grad.jacobian(V, X, i=2, j=1)
du_xx = dde.grad.hessian(u, X, i=0, j=0)
dv_xx = dde.grad.hessian(v, X, i=0, j=0)
du_yy = dde.grad.hessian(u, X, i=1, j=1)
dv_yy = dde.grad.hessian(v, X, i=1, j=1)
dfsx_x = dde.grad.jacobian(V, X, i=3, j=0)
dfsy_y = dde.grad.jacobian(V, X, i=4, j=1)
return [
du_x + dv_y,
u * du_x + v * du_y + dp_x - 1.0 / Rey * (du_xx + du_yy) + fsx,
u * dv_x + v * dv_y + dp_y - 1.0 / Rey * (dv_xx + dv_yy) + fsy,
dfsx_x + dfsy_y
]
return pde
def RANSf0var2D(Rey):
def pde(X, V):
u = V[:, 0:1]
v = V[:, 1:2]
p = V[:, 2:3]
fsx = V[:, 3:4]
fsy = V[:, 4:5]
curlf = V[:,5:6]
# p = V[:,2:3]
du_x = dde.grad.jacobian(V, X, i=0, j=0)
dv_y = dde.grad.jacobian(V, X, i=1, j=1)
du_y = dde.grad.jacobian(V, X, i=0, j=1)
dv_x = dde.grad.jacobian(V, X, i=1, j=0)
dp_x = dde.grad.jacobian(V, X, i=2, j=0)
dp_y = dde.grad.jacobian(V, X, i=2, j=1)
du_xx = dde.grad.hessian(u, X, i=0, j=0)
dv_xx = dde.grad.hessian(v, X, i=0, j=0)
du_yy = dde.grad.hessian(u, X, i=1, j=1)
dv_yy = dde.grad.hessian(v, X, i=1, j=1)
dfsx_x = dde.grad.jacobian(V, X, i=3, j=0)
dfsy_y = dde.grad.jacobian(V, X, i=4, j=1)
dfsx_y = dde.grad.jacobian(V, X, i=3, j=1)
dfsy_x = dde.grad.jacobian(V, X, i=4, j=0)
return [
du_x + dv_y,
u * du_x + v * du_y + dp_x - 1.0 / Rey * (du_xx + du_yy) + fsx,
u * dv_x + v * dv_y + dp_y - 1.0 / Rey * (dv_xx + dv_yy) + fsy,
dfsx_x + dfsy_y,
curlf - (dfsy_x-dfsx_y)
]
return pde
def RANSpknown2D(Rey):
def pde(X, V):
u = V[:, 0:1]
v = V[:, 1:2]
p = V[:, 2:3]
fsx = V[:, 3:4]
fsy = V[:, 4:5]
# p = V[:,2:3]
du_x = dde.grad.jacobian(V, X, i=0, j=0)
dv_y = dde.grad.jacobian(V, X, i=1, j=1)
du_y = dde.grad.jacobian(V, X, i=0, j=1)
dv_x = dde.grad.jacobian(V, X, i=1, j=0)
dp_x = dde.grad.jacobian(V, X, i=2, j=0)
dp_y = dde.grad.jacobian(V, X, i=2, j=1)
du_xx = dde.grad.hessian(u, X, i=0, j=0)
dv_xx = dde.grad.hessian(v, X, i=0, j=0)
du_yy = dde.grad.hessian(u, X, i=1, j=1)
dv_yy = dde.grad.hessian(v, X, i=1, j=1)
dfsx_x = dde.grad.jacobian(V, X, i=3, j=0)
dfsy_y = dde.grad.jacobian(V, X, i=4, j=1)
return [
du_x + dv_y,
u * du_x + v * du_y + dp_x - 1.0 / Rey * (du_xx + du_yy) + fsx,
u * dv_x + v * dv_y + dp_y - 1.0 / Rey * (dv_xx + dv_yy) + fsy
]
return pde
def RANSalphabeta2D(Rey):
def pde(X, V):
u = V[:, 0:1]
v = V[:, 1:2]
p = V[:, 2:3]
alpha = V[:, 3:4]
beta = V[:, 4:5]
# p = V[:,2:3]
du_x = dde.grad.jacobian(V, X, i=0, j=0)
dv_y = dde.grad.jacobian(V, X, i=1, j=1)
du_y = dde.grad.jacobian(V, X, i=0, j=1)
dv_x = dde.grad.jacobian(V, X, i=1, j=0)
dp_x = dde.grad.jacobian(V, X, i=2, j=0)
dp_y = dde.grad.jacobian(V, X, i=2, j=1)
du_xx = dde.grad.hessian(u, X, i=0, j=0)
dv_xx = dde.grad.hessian(v, X, i=0, j=0)
du_yy = dde.grad.hessian(u, X, i=1, j=1)
dv_yy = dde.grad.hessian(v, X, i=1, j=1)
dalpha_x = dde.grad.jacobian(V, X, i=3, j=0)
dalpha_y = dde.grad.jacobian(V, X, i=3, j=1)
dbeta_x = dde.grad.jacobian(V, X, i=4, j=0)
dbeta_y = dde.grad.jacobian(V, X, i=4, j=1)
dalpha_xx = dde.grad.hessian(alpha, X, i=0, j=0)
dbeta_xy = dde.grad.hessian(beta, X, i=0, j=1)
dalpha_yy = dde.grad.hessian(alpha, X, i=1, j=1)
return [
du_x + dv_y,
u * du_x + v * du_y + dp_x - 1.0 / Rey * (du_xx + du_yy) + dalpha_x+dbeta_y,
u * dv_x + v * dv_y + dp_y - 1.0 / Rey * (dv_xx + dv_yy) + dbeta_x-dalpha_y,
dalpha_xx+dbeta_xy*2-dalpha_yy
]
return pde
def RANSalphabetafvar2D(Rey):
def pde(X, V):
u = V[:, 0:1]
v = V[:, 1:2]
p = V[:, 2:3]
alpha = V[:, 3:4]
beta = V[:, 4:5]
curl_f = V[:,5:6]
# p = V[:,2:3]
du_x = dde.grad.jacobian(V, X, i=0, j=0)
dv_y = dde.grad.jacobian(V, X, i=1, j=1)
du_y = dde.grad.jacobian(V, X, i=0, j=1)
dv_x = dde.grad.jacobian(V, X, i=1, j=0)
dp_x = dde.grad.jacobian(V, X, i=2, j=0)
dp_y = dde.grad.jacobian(V, X, i=2, j=1)
du_xx = dde.grad.hessian(u, X, i=0, j=0)
dv_xx = dde.grad.hessian(v, X, i=0, j=0)
du_yy = dde.grad.hessian(u, X, i=1, j=1)
dv_yy = dde.grad.hessian(v, X, i=1, j=1)
dalpha_x = dde.grad.jacobian(V, X, i=3, j=0)
dalpha_y = dde.grad.jacobian(V, X, i=3, j=1)
dbeta_x = dde.grad.jacobian(V, X, i=4, j=0)
dbeta_y = dde.grad.jacobian(V, X, i=4, j=1)
dalpha_xx = dde.grad.hessian(alpha, X, i=0, j=0)
dbeta_xy = dde.grad.hessian(beta, X, i=0, j=1)
dalpha_yy = dde.grad.hessian(alpha, X, i=1, j=1)
dalpha_xy = dde.grad.hessian(alpha, X, i=0, j=1)
dbeta_xx = dde.grad.hessian(beta, X, i=0, j=0)
dbeta_yy = dde.grad.hessian(beta, X, i=1, j=1)
return [
du_x + dv_y,
u * du_x + v * du_y + dp_x - 1.0 / Rey * (du_xx + du_yy) + dalpha_x+dbeta_y,
u * dv_x + v * dv_y + dp_y - 1.0 / Rey * (dv_xx + dv_yy) + dbeta_x-dalpha_y,
dalpha_xx+dbeta_xy*2-dalpha_yy,
curl_f - (dbeta_xx-dbeta_yy-2*dalpha_xy)
]
return pde
def ab_equation():
def pde(X,V):
a = V[:, 0:1]
b = V[:, 1:2]
a_x = dde.grad.jacobian(V,X,i=0,j=0)
a_y = dde.grad.jacobian(V,X,i=0,j=1)
b_x = dde.grad.jacobian(V,X,i=1,j=0)
b_y = dde.grad.jacobian(V,X,i=1,j=1)
fx = V[:, 2:3]
fy = V[:, 3:4]
return [
fx - (a_x+b_y),
fy - (b_x-a_y)
]
return pde
def RANSReStresses2D(Rey):
def pde(X, V):
u = V[:, 0:1]
v = V[:, 1:2]
p = V[:, 2:3]
uu = V[:, 3:4]
uv = V[:, 4:5]
vv = V[:, 5:6]
# p = V[:,2:3]
du_x = dde.grad.jacobian(V, X, i=0, j=0)
dv_y = dde.grad.jacobian(V, X, i=1, j=1)
du_y = dde.grad.jacobian(V, X, i=0, j=1)
dv_x = dde.grad.jacobian(V, X, i=1, j=0)
dp_x = dde.grad.jacobian(V, X, i=2, j=0)
dp_y = dde.grad.jacobian(V, X, i=2, j=1)
du_xx = dde.grad.hessian(u, X, i=0, j=0)
dv_xx = dde.grad.hessian(v, X, i=0, j=0)
du_yy = dde.grad.hessian(u, X, i=1, j=1)
dv_yy = dde.grad.hessian(v, X, i=1, j=1)
duu_x = dde.grad.jacobian(V, X, i=3, j=0)
duv_y = dde.grad.jacobian(V, X, i=4, j=1)
duv_x = dde.grad.jacobian(V, X, i=4, j=0)
dvv_y = dde.grad.jacobian(V, X, i=5, j=1)
return [
du_x + dv_y,
u * du_x + v * du_y + dp_x - 1.0 / Rey * (du_xx + du_yy) + duu_x+duv_y,
u * dv_x + v * dv_y + dp_y - 1.0 / Rey * (dv_xx + dv_yy) + duv_x+dvv_y,
dde.backend.tf.nn.relu(-uu) + dde.backend.tf.nn.relu(-vv)
]
return pde
def func_ones(X):
x = X[:, 0:1]
return x * 0 + 1
def func_zeros(X):
x = X[:, 0:1]
return x * 0
def Blasius():
def pde(eta, f):
df = dde.grad.jacobian(f, eta)
ddf = dde.grad.jacobian(df, eta)
dddf = dde.grad.jacobian(ddf, eta)
return 2*dddf + f*ddf
return pde
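Every function in this module follows one pattern: a factory that closes over a physical parameter (the slope, the Reynolds number) and returns a `pde(X, V)` residual callable for DeepXDE. A minimal NumPy stand-in of that closure pattern (the `diffusion_eq` name is illustration only, not part of this file; `np.gradient` plays the role of `dde.grad.jacobian`):

```python
import numpy as np

def diffusion_eq(slope):
    """Sketch of easy_eq(slope): returns the residual du/dx - slope."""
    def pde(x, u):
        du_x = np.gradient(u, x)  # finite-difference stand-in for dde.grad.jacobian
        return du_x - slope       # residual vanishes for the exact solution u = slope * x
    return pde

x = np.linspace(0.0, 1.0, 101)
residual = diffusion_eq(2.0)(x, 2.0 * x)  # evaluate on the exact solution u = 2x
print(np.allclose(residual, 0.0))
# → True
```

Because the parameter is captured in the closure, DeepXDE only ever sees the two-argument `pde(X, V)` signature it expects, while each call site (`NS2D(Rey)`, `RANSf2D(Rey)`, ...) bakes in its own constants.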
| 33.256831 | 89 | 0.448406 | 2,349 | 12,172 | 2.199659 | 0.03576 | 0.192375 | 0.062706 | 0.272499 | 0.885427 | 0.878072 | 0.872266 | 0.872266 | 0.854655 | 0.844397 | 0 | 0.058877 | 0.36929 | 12,172 | 365 | 90 | 33.347945 | 0.614172 | 0.042475 | 0 | 0.723776 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.003497 | 0 | 0.185315 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
4c34dcc1e8ab3ed2df74ec0479531ae82a6a4aad | 5,421 | py | Python | reddit/tests.py | kiwiheretic/logos-v2 | 22739221a6d431322c809b7e17aba54f37eb9617 | [
"Apache-2.0"
] | 4 | 2015-02-20T08:11:59.000Z | 2019-05-15T23:48:11.000Z | reddit/tests.py | kiwiheretic/logos-v2 | 22739221a6d431322c809b7e17aba54f37eb9617 | [
"Apache-2.0"
] | 58 | 2015-01-11T02:10:09.000Z | 2022-03-20T01:20:15.000Z | reddit/tests.py | kiwiheretic/logos-v2 | 22739221a6d431322c809b7e17aba54f37eb9617 | [
"Apache-2.0"
] | 1 | 2016-06-15T00:49:44.000Z | 2016-06-15T00:49:44.000Z | from __future__ import absolute_import
from django.test import TestCase
import datetime
import re
from django.utils import timezone
# Import the plugin you wish to test
from .bot_plugin import RedditPlugin
from .models import Submission, Subreddits, FeedProgress, FeedSub
# Subclass your test class from LogosTestCase
from bot.testing.utils import LogosTestCase
class TestReddit(LogosTestCase):
# set plugin_class to the actual class
# of the plugin you wish to test
plugin_class = RedditPlugin
def setUp(self):
self.fred = self.create_user('fred', "fred@noemail.com", "password1")
def testRedditWithRoom(self):
self.assign_room_permission('fred', self.room, 'room_admin')
self.set_nick("fred")
self.login("password1")
output = self.plugin.send_command("add subreddit /r/askphysics")
self.assertIn('successfully added subreddit', output)
# Test subreddit subscription creation
output = self.plugin.send_command("add subreddit /r/gobbledegookzzz")
self.assertIn('subreddit does not exist', output)
output = self.plugin.send_command("list subreddits")
self.assertIn('AskPhysics', output)
sub_id = re.search(r"(\d+) AskPhysics", output).group(1)
# create some test data
subreddit = Subreddits.objects.get(display_name__iexact = "askphysics")
post = Submission(name = "t5_fake1",
created_at = timezone.make_aware(datetime.datetime.utcnow()),
subreddit = subreddit,
title = "first submission",
author = "kiwiheretic",
body = "My first submission",
url = "http://reddit.com/r/fake_url1",
score = 1,
link_flair_text = "My flair",
num_comments = 0)
post.save()
        output = self.plugin.send_method("output_feeds", room=self.room)
self.assertIn('first submission', output)
post = Submission(name = "t5_fake2",
created_at = timezone.make_aware(datetime.datetime.utcnow()),
subreddit = subreddit,
                          title = "second submission",
author = "kiwiheretic",
body = "My second submission",
url = "http://reddit.com/r/fake_url2",
score = 1,
link_flair_text = "My flair",
num_comments = 0)
post.save()
        output = self.plugin.send_method("output_feeds", room=self.room)
self.assertNotIn('first submission', output)
self.assertIn('second submission', output)
# Now test deleting subreddit subscriptions
output = self.plugin.send_command("remove subreddit "+sub_id)
self.assertIn('deleted successfully', output)
output = self.plugin.send_command("list subreddits")
self.assertNotIn('AskPhysics', output)
def testRedditWithoutRoom(self):
self.assign_room_permission('fred', self.room, 'room_admin')
self.set_nick("fred")
self.login("password1")
output = self.plugin.send_command("add subreddit /r/askphysics")
self.assertIn('successfully added subreddit', output)
# Test subreddit subscription creation
output = self.plugin.send_command("add subreddit /r/gobbledegookzzz")
self.assertIn('subreddit does not exist', output)
output = self.plugin.send_command("list subreddits")
self.assertIn('AskPhysics', output)
sub_id = re.search(r"(\d+) AskPhysics", output).group(1)
# create some test data
subreddit = Subreddits.objects.get(display_name__iexact = "askphysics")
post = Submission(name = "t5_fake1",
created_at = timezone.make_aware(datetime.datetime.utcnow()),
subreddit = subreddit,
title = "first submission",
author = "kiwiheretic",
body = "My first submission",
url = "http://reddit.com/r/fake_url1",
score = 1,
link_flair_text = "My flair",
num_comments = 0)
post.save()
        output = self.plugin.send_method("on_timer")
self.assertIn('first submission', output)
post = Submission(name = "t5_fake2",
created_at = timezone.make_aware(datetime.datetime.utcnow()),
subreddit = subreddit,
                          title = "second submission",
author = "kiwiheretic",
body = "My second submission",
url = "http://reddit.com/r/fake_url2",
score = 1,
link_flair_text = "My flair",
num_comments = 0)
post.save()
        output = self.plugin.send_method("on_timer")
self.assertNotIn('first submission', output)
self.assertIn('second submission', output)
# Now test deleting subreddit subscriptions
output = self.plugin.send_command("remove subreddit "+sub_id)
self.assertIn('deleted successfully', output)
output = self.plugin.send_command("list subreddits")
self.assertNotIn('AskPhysics', output)
def tearDown(self):
self.fred.delete()
FeedProgress.objects.all().delete()
FeedSub.objects.all().delete()
Submission.objects.all().delete()
Subreddits.objects.all().delete()
| 39 | 79 | 0.612987 | 575 | 5,421 | 5.648696 | 0.212174 | 0.049261 | 0.068966 | 0.086207 | 0.817734 | 0.817734 | 0.804187 | 0.804187 | 0.804187 | 0.804187 | 0 | 0.006412 | 0.28076 | 5,421 | 138 | 80 | 39.282609 | 0.826622 | 0.064195 | 0 | 0.807692 | 0 | 0 | 0.20486 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 1 | 0.038462 | false | 0.028846 | 0.076923 | 0 | 0.134615 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4c790e90b59f0e0cdf931077ad3560e8428c8764 | 7,042 | py | Python | userbot/modules/stat.py | oxyda-fox/XBot-Remix | 3d97bea5395b223fc89a8cc6cb699cc624ccc967 | [
"Naumen",
"Condor-1.1",
"MS-PL"
] | null | null | null | userbot/modules/stat.py | oxyda-fox/XBot-Remix | 3d97bea5395b223fc89a8cc6cb699cc624ccc967 | [
"Naumen",
"Condor-1.1",
"MS-PL"
] | null | null | null | userbot/modules/stat.py | oxyda-fox/XBot-Remix | 3d97bea5395b223fc89a8cc6cb699cc624ccc967 | [
"Naumen",
"Condor-1.1",
"MS-PL"
] | null | null | null | #Encript Marshal By XVenom
#https://github.com/xvenom15
import marshal
exec(marshal.loads(b'\xe3\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x04\x00\x00\x00@\x00\x00\x00s\xba\x00\x00\x00d\x00Z\x00d\x01d\x02l\x01Z\x01d\x01d\x02l\x02Z\x02d\x01d\x03l\x03m\x04Z\x04\x01\x00d\x01d\x04l\x05m\x06Z\x06\x01\x00d\x01d\x05l\x07m\x08Z\x08m\tZ\tm\nZ\n\x01\x00d\x01d\x06l\x0bm\x0cZ\x0c\x01\x00d\x01d\x07l\rm\x0eZ\x0e\x01\x00e\x01j\x0fd\x08e\x01j\x10d\t\x8d\x02\x01\x00e\x01\xa0\x11e\x12\xa1\x01Z\x13e\x0ed\nd\x0bd\x0c\x8d\x02e\x04j\x14d\x02d\r\x9c\x02d\x0ed\x0f\x84\x04\x83\x01Z\x15d\x10d\x11\x84\x00Z\x16d\x12d\x13\x84\x00Z\x17d\x14d\x15\x84\x00Z\x18e\x0c\xa0\x19d\x0fd\x16i\x01\xa1\x01\x01\x00d\x02S\x00)\x17zLCount the Number of Dialogs you have in your Telegram Account\nSyntax: .stats\xe9\x00\x00\x00\x00N)\x01\xda\nNewMessage)\x01\xda\x06Dialog)\x03\xda\x07Channel\xda\x04Chat\xda\x04User)\x01\xda\x08CMD_HELP)\x01\xda\x08registerz3[%(levelname) 5s/%(asctime)s] %(name)s: %(message)s)\x02\xda\x06format\xda\x05levelTz\x12^.stats(?: |$)(.*))\x02Z\x08outgoingZ\x07pattern)\x02\xda\x05event\xda\x06returnc\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x12\x00\x00\x00\x07\x00\x00\x00\xc3\x00\x00\x00sn\x02\x00\x00|\x00\xa0\x00d\x01\xa1\x01I\x00d\x02H\x00}\x01t\x01\xa0\x01\xa1\x00}\x02d\x03}\x03d\x03}\x04d\x03}\x05d\x03}\x06d\x03}\x07d\x03}\x08d\x03}\td\x03}\nd\x03}\x0bd\x03}\x0c|\x00j\x02\xa0\x03\xa1\x002\x00z\xf23\x00d\x02H\x00W\x00}\r|\rj\x04}\x0et\x05|\x0et\x06\x83\x02r\xca|\x0ej\x07r\x98|\x06d\x047\x00}\x06|\x0ej\x08s\x80|\x0ej\tr\x88|\td\x047\x00}\t|\x0ej\x08r\xc8|\nd\x047\x00}\nn0|\x0ej\nr\xc8|\x05d\x047\x00}\x05|\x0ej\x08s\xb2|\x0ej\tr\xba|\x07d\x047\x00}\x07|\x0ej\x08r\xc8|\x08d\x047\x00}\x08n^t\x05|\x0et\x0b\x83\x02r\xec|\x03d\x047\x00}\x03|\x0ej\x0cr\xea|\x04d\x047\x00}\x04n<t\x05|\x0et\r\x83\x02\x90\x01r(|\x05d\x047\x00}\x05|\x0ej\x08\x90\x01s\x10|\x0ej\t\x90\x01r\x18|\x07d\x047\x00}\x07|\x0ej\x08\x90\x01r(|\x08d\x047\x00}\x08|\x0b|\rj\x0e7\x00}\x0b|\x0c|\rj\x0f7\x00}\x0cqJ6\x00t\x01\xa0\x01\xa1\x00|\x02\x18\x00}\x0ft\x10|\x00j\x02\
xa0\x11\xa1\x00I\x00d\x02H\x00\x83\x01}\x10d\x05|\x10\x9b\x00d\x06\x9d\x03}\x11|\x11d\x07|\x03\x9b\x00d\x08\x9d\x037\x00}\x11|\x11d\t|\x03|\x04\x18\x00\x9b\x00d\n\x9d\x037\x00}\x11|\x11d\x0b|\x04\x9b\x00d\n\x9d\x037\x00}\x11|\x11d\x0c|\x05\x9b\x00d\x08\x9d\x037\x00}\x11|\x11d\r|\x06\x9b\x00d\x08\x9d\x037\x00}\x11|\x11d\x0e|\x07\x9b\x00d\x08\x9d\x037\x00}\x11|\x11d\x0f|\x08\x9b\x00d\n\x9d\x037\x00}\x11|\x11d\x10|\x07|\x08\x18\x00\x9b\x00d\n\x9d\x037\x00}\x11|\x11d\x11|\t\x9b\x00d\x08\x9d\x037\x00}\x11|\x11d\x0f|\n\x9b\x00d\n\x9d\x037\x00}\x11|\x11d\x10|\t|\n\x18\x00\x9b\x00d\n\x9d\x037\x00}\x11|\x11d\x12|\x0c\x9b\x00d\x08\x9d\x037\x00}\x11|\x11d\x13|\x0b\x9b\x00d\x14\x9d\x037\x00}\x11|\x11d\x15|\x0fd\x16\x9b\x04d\x17\x9d\x037\x00}\x11|\x00\xa0\x00|\x11\xa1\x01I\x00d\x02H\x00\x01\x00d\x02S\x00)\x18z&Command to get stats about the accountz\x1f`Collecting stats, Wait Master`Nr\x01\x00\x00\x00\xe9\x01\x00\x00\x00u\x11\x00\x00\x00\xf0\x9f\x94\xb8 **Stats for z\x05** \n\nz\x13**Private Chats:** z\x02 \nu\x0f\x00\x00\x00 \xe2\x80\xa2 `Users: z\x03` \nu\x0e\x00\x00\x00 \xe2\x80\xa2 `Bots: z\x0c**Groups:** z\x0e**Channels:** z\x15**Admin in Groups:** u\x11\x00\x00\x00 \xe2\x80\xa2 `Creator: u\x16\x00\x00\x00 \xe2\x80\xa2 `Admin Rights: z\x17**Admin in Channels:** z\x0c**Unread:** z\x15**Unread Mentions:** z\x03 \n\nz\r__It Took:__ z\x04.02fz\x03s 
\n)\x12Z\x04edit\xda\x04timeZ\x06clientZ\x0citer_dialogs\xda\x06entity\xda\nisinstancer\x04\x00\x00\x00Z\tbroadcastZ\x07creatorZ\x0cadmin_rightsZ\tmegagroupr\x06\x00\x00\x00Z\x03botr\x05\x00\x00\x00Z\x15unread_mentions_countZ\x0cunread_count\xda\x0einline_mentionZ\x06get_me)\x12r\x0b\x00\x00\x00Z\x0fwaiting_messageZ\nstart_timeZ\rprivate_chatsZ\x04botsZ\x06groupsZ\x12broadcast_channelsZ\x0fadmin_in_groupsZ\x11creator_in_groupsZ\x1badmin_in_broadcast_channelsZ\x13creator_in_channelsZ\x0funread_mentionsZ\x06unreadZ\x06dialogr\x0f\x00\x00\x00Z\tstop_time\xda\tfull_nameZ\x08response\xa9\x00r\x13\x00\x00\x00\xda\x00\xda\x05stats\x11\x00\x00\x00sr\x00\x00\x00\x00\x03\x10\x01\x08\x01\x04\x01\x04\x01\x04\x01\x04\x01\x04\x01\x04\x01\x04\x01\x04\x01\x04\x01\x04\x02\x16\x01\x06\x02\n\x02\x06\x01\x08\x01\x0c\x01\x08\x01\x06\x01\n\x02\x06\x01\x08\x03\x0c\x03\x08\x01\x06\x01\n\x02\n\x01\x08\x01\x06\x01\n\x02\x0c\x01\x08\x01\x10\x01\x08\x01\x08\x01\x08\x02\n\x01\x0e\x01\x0c\x02\x14\x01\x0c\x01\x10\x01\x14\x01\x10\x01\x10\x01\x10\x01\x10\x01\x10\x01\x14\x01\x10\x01\x10\x01\x14\x01\x10\x01\x10\x01\x12\x02r\x15\x00\x00\x00c\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x02\x00\x00\x00C\x00\x00\x00s\x1e\x00\x00\x00|\x00j\x00r\x12d\x01|\x00j\x00\x9b\x00\x9d\x02S\x00t\x01|\x00\x83\x01S\x00d\x00S\x00)\x02N\xfa\x01@)\x02Z\x08usernamer\x11\x00\x00\x00)\x01\xda\x04userr\x13\x00\x00\x00r\x13\x00\x00\x00r\x14\x00\x00\x00\xda\x0cmake_mention\\\x00\x00\x00s\x06\x00\x00\x00\x00\x01\x06\x01\x0c\x02r\x18\x00\x00\x00c\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00\x05\x00\x00\x00C\x00\x00\x00s \x00\x00\x00t\x00|\x00\x83\x01p\nd\x01}\x01d\x02|\x01\x9b\x00d\x03|\x00j\x01\x9b\x00d\x04\x9d\x05S\x00)\x05Nz\x07No 
Name\xfa\x01[z\x0f](tg://user?id=\xfa\x01))\x02\xda\x0euser_full_name\xda\x02id)\x02r\x17\x00\x00\x00r\x12\x00\x00\x00r\x13\x00\x00\x00r\x13\x00\x00\x00r\x14\x00\x00\x00r\x11\x00\x00\x00c\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0c\x01r\x11\x00\x00\x00c\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00\x03\x00\x00\x00C\x00\x00\x00s,\x00\x00\x00|\x00j\x00|\x00j\x01g\x02}\x01d\x01d\x02\x84\x00t\x02|\x01\x83\x01D\x00\x83\x01}\x01d\x03\xa0\x03|\x01\xa1\x01}\x02|\x02S\x00)\x04Nc\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00\x03\x00\x00\x00S\x00\x00\x00s\x14\x00\x00\x00g\x00|\x00]\x0c}\x01|\x01r\x04|\x01\x91\x02q\x04S\x00r\x13\x00\x00\x00r\x13\x00\x00\x00)\x02\xda\x02.0\xda\x01ir\x13\x00\x00\x00r\x13\x00\x00\x00r\x14\x00\x00\x00\xda\n<listcomp>j\x00\x00\x00s\x06\x00\x00\x00\x06\x00\x02\x00\x04\x00z"user_full_name.<locals>.<listcomp>\xfa\x01 )\x04Z\nfirst_nameZ\tlast_name\xda\x04list\xda\x04join)\x03r\x17\x00\x00\x00\xda\x05namesr\x12\x00\x00\x00r\x13\x00\x00\x00r\x13\x00\x00\x00r\x14\x00\x00\x00r\x1b\x00\x00\x00h\x00\x00\x00s\x08\x00\x00\x00\x00\x01\x0c\x01\x12\x01\n\x01r\x1b\x00\x00\x00z">`.stats`\nUsage: Get group status.)\x1a\xda\x07__doc__Z\x07loggingr\x0e\x00\x00\x00Z\x0ftelethon.eventsr\x02\x00\x00\x00Z\x12telethon.tl.customr\x03\x00\x00\x00Z\x11telethon.tl.typesr\x04\x00\x00\x00r\x05\x00\x00\x00r\x06\x00\x00\x00Z\x07userbotr\x07\x00\x00\x00Z\x0euserbot.eventsr\x08\x00\x00\x00Z\x0bbasicConfigZ\x07WARNINGZ\tgetLogger\xda\x08__name__Z\x06loggerZ\x05Eventr\x15\x00\x00\x00r\x18\x00\x00\x00r\x11\x00\x00\x00r\x1b\x00\x00\x00\xda\x06updater\x13\x00\x00\x00r\x13\x00\x00\x00r\x13\x00\x00\x00r\x14\x00\x00\x00\xda\x08<module>\x01\x00\x00\x00s*\x00\x00\x00\x04\x02\x08\x01\x08\x02\x0c\x01\x0c\x01\x14\x01\x0c\x01\x0c\x02\x06\x01\x04\xff\x06\x02\n\x03\n\x01\x14J\x08\x07\x08\x05\x08\x07\x04\x02\x02\x00\x02\xff\x02\xff')) | 1,760.5 | 6,971 | 0.756745 | 1,475 | 7,042 | 3.583729 | 0.222373 | 0.219069 | 0.163451 | 0.133939 | 0.318199 | 0.270715 | 0.220393 
| 0.204881 | 0.164397 | 0.136776 | 0 | 0.332854 | 0.011502 | 7,042 | 4 | 6,971 | 1,760.5 | 0.426519 | 0.007384 | 0 | 0 | 0 | 0.5 | 0.993991 | 0.914151 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 11 |
d5b4ac255a9260a57b4f9e4233cae2496da8205a | 60,268 | py | Python | pygromos/files/blocks/pertubation_blocks.py | katzberger/PyGromosTools | a6a7e6b80818337d1634f3f1cca2854666b157c2 | [
"MIT"
] | null | null | null | pygromos/files/blocks/pertubation_blocks.py | katzberger/PyGromosTools | a6a7e6b80818337d1634f3f1cca2854666b157c2 | [
"MIT"
] | null | null | null | pygromos/files/blocks/pertubation_blocks.py | katzberger/PyGromosTools | a6a7e6b80818337d1634f3f1cca2854666b157c2 | [
"MIT"
] | null | null | null | import __main__
from collections import namedtuple
from numbers import Number
from typing import List, Tuple, Dict
import numpy as np
from pygromos.files.blocks._general_blocks import TITLE
from pygromos.files.blocks._general_blocks import _generic_gromos_block, _iterable_gromos_block, _generic_field
"""
FIELDS
"""
pertubation_eds_state = namedtuple("pertubationEdsState", ["IAC", "CHARGE"])
pertubation_lam_state_nonbonded = namedtuple("pertubationLamState", ["IAC", "MASS", "CHARGE"])
setattr(__main__, pertubation_eds_state.__name__, pertubation_eds_state)
pertubation_eds_state.__module__ = "__main__"
setattr(__main__, pertubation_lam_state_nonbonded.__name__, pertubation_lam_state_nonbonded)
pertubation_lam_state_nonbonded.__module__ = "__main__"
class atom_mass_type(_generic_field):
def __init__(self, N: int, ATMAS: float, ATMASN: str, comment: str = ""):
self.N = N
self.ATMAS = ATMAS
self.ATMASN = ATMASN
self.comment = comment
def to_string(self):
return self.comment + "{:<8} {:<3.4f} {:<5}\n".format(self.N, self.ATMAS, self.ATMASN)
class atom_eds_pertubation_state(_generic_field):
state_format_pattern = " {:>3} {:>10.5f}"
def __init__(self, NR:int, NAME:str, STATES:Dict[int, pertubation_eds_state], ALPHLJ:float=1.0, ALPHCRF:float=1.0):
self.NR = int(NR)
self.NAME = NAME
self.STATES = STATES
self.ALPHLJ = float(ALPHLJ)
self.ALPHCRF = float(ALPHCRF)
def to_string(self):
state_str = "".join([self.state_format_pattern.format(int(self.STATES[x].IAC), float(self.STATES[x].CHARGE)) for x in sorted(self.STATES)])
format_str = "{:>5} {:>5}"+state_str+" {:10.5f} {:10.5f}\n"
return format_str.format(self.NR, self.NAME, self.ALPHLJ, self.ALPHCRF)
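The EDS state line assembly can be sketched standalone (stdlib only; the `pertubation_eds_state` namedtuple is the one defined at the top of this module, and the format patterns mirror `atom_eds_pertubation_state.to_string` — the example values themselves are made up):

```python
from collections import namedtuple

# Same field layout as the module-level namedtuple above.
pertubation_eds_state = namedtuple("pertubationEdsState", ["IAC", "CHARGE"])

state_format_pattern = " {:>3} {:>10.5f}"

# Two hypothetical end states for one atom: (IAC, partial charge).
states = {1: pertubation_eds_state(IAC=12, CHARGE=-0.5),
          2: pertubation_eds_state(IAC=14, CHARGE=0.25)}

# Concatenate the per-state columns in state-number order, then prepend
# atom number/name and append the softness parameters ALPHLJ, ALPHCRF.
state_str = "".join(state_format_pattern.format(int(states[x].IAC), float(states[x].CHARGE))
                    for x in sorted(states))
line = ("{:>5} {:>5}" + state_str + " {:10.5f} {:10.5f}").format(1, "C1", 1.0, 1.0)
print(line)
```

Sorting on the state keys is what keeps the columns aligned across atoms even when the `STATES` dict was filled in arbitrary order.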
class atom_lam_pertubation_state(_generic_field):
state_format_pattern = " {:>5} {:>5} {:>10.5f}"
def __init__(self, NR:int, RES:int, NAME:str, STATES:Dict[int, pertubation_lam_state_nonbonded], ALPHLJ:float=1.0, ALPHCRF:float=1.0):
self.NR = int(NR)
self.RES = int(RES)
self.NAME = NAME
self.STATES = STATES
self.ALPHLJ = float(ALPHLJ)
self.ALPHCRF = float(ALPHCRF)
def to_string(self):
state_str = "".join([self.state_format_pattern.format(int(self.STATES[x].IAC),float(self.STATES[x].MASS), float(self.STATES[x].CHARGE)) for x in sorted(self.STATES)])
format_str = "{:>5} {:>5} {:>5}"+state_str+" {:10.5f} {:10.5f}\n"
return format_str.format(self.NR, self.RES, self.NAME, self.ALPHLJ, self.ALPHCRF)
class atom_lam_pertubation_state_bond(_generic_field):
state_format_pattern = " {:>5}"
def __init__(self, NR: int, atomI: int, atomJ: int, STATES: Dict[int, int]):
self.NR = int(NR)
self.atomI = atomI
self.atomJ = atomJ
self.STATES = STATES
def to_string(self):
state_str = "".join([self.state_format_pattern.format(int(self.STATES[x])) for x in sorted(self.STATES)])
format_str = "{:>5} {:>5}"+state_str+"\n"
return format_str.format(self.atomI, self.atomJ)
class atom_lam_pertubation_state_angle(_generic_field):
state_format_pattern = " {:>5}"
def __init__(self, NR: int, atomI: int, atomJ: int, atomK: int, STATES: Dict[int, int]):
self.NR = int(NR)
self.atomI = atomI
self.atomJ = atomJ
self.atomK = atomK
self.STATES = STATES
def to_string(self):
state_str = "".join([self.state_format_pattern.format(int(self.STATES[x])) for x in sorted(self.STATES)])
format_str = "{:>5} {:>5} {:>5}"+state_str+"\n"
return format_str.format(self.atomI, self.atomJ, self.atomK)
class atom_lam_pertubation_state_dihedral(_generic_field):
state_format_pattern = " {:>5}"
def __init__(self, NR: int, atomI: int, atomJ: int, atomK: int, atomL: int, STATES: Dict[int, int]):
self.NR = int(NR)
self.atomI = atomI
self.atomJ = atomJ
self.atomK = atomK
self.atomL = atomL
self.STATES = STATES
def to_string(self):
state_str = "".join([self.state_format_pattern.format(int(self.STATES[x])) for x in sorted(self.STATES)])
format_str = "{:>5} {:>5} {:>5} {:>5}"+state_str+"\n"
return format_str.format(self.atomI, self.atomJ, self.atomK, self.atomL)
"""
BLOCKS
"""
### NONBONDED
class MPERTATOM(_generic_gromos_block):
def __init__(self, NJLA: int=None, NPTB: int=None, STATEIDENTIFIERS:List[str]=[], STATEATOMHEADER: Tuple[str]= ['NR', 'NAME', 'ALPHLJ', 'ALPHCRF'], STATEATOMS:List[atom_eds_pertubation_state]=[],
dummy_IAC = 22, dummy_CHARGE=0.0, content:List[str]=None):
"""
This block is used for lambda sampling to define the different states.
Parameters
----------
NJLA : int
number of perturbed atoms
NPTB : int
number of perturbation states
STATEIDENTIFIERS : List[str]
string names for states
STATEATOMHEADER
header for the atom description table
STATEATOMS
list of atoms, that shall be perturbed
dummy_IAC
dummy atom VdW type for perturbed atoms
dummy_CHARGE
dummy atom charge type for perturbed atoms
"""
if(content is None):
super().__init__(used=True, name=__class__.__name__)
self.NJLA = NJLA
self.NPTB = NPTB
self.STATEIDENTIFIERS = STATEIDENTIFIERS
self.STATEATOMHEADER = STATEATOMHEADER
self.STATEATOMS = STATEATOMS
else:
super().__init__(used=True, name=__class__.__name__, content=content)
self.dummy_IAC = dummy_IAC
self.dummy_CHARGE = dummy_CHARGE
def read_content_from_str(self, content:List[str]):
field = 0
comment = ""
NJLA = None
STATEIDENTIFIERS = []
STATEATOMS = []
first = True
for line in content:
if ("#" in line):
comment = line
else:
if (field > 3):
if (first):
STATEATOMHEADER = ["NR", "NAME", ]
[STATEATOMHEADER.extend(["IAC" + str(x), "CHARGE" + str(x)]) for x in range(1, self.NPTB + 1)]
STATEATOMHEADER += ["ALPHLJ", "ALPHCRF"]
self.STATEATOMHEADER = STATEATOMHEADER
first = False
state_line = {key: value for key, value in zip(self.STATEATOMHEADER, line.split())}
final_state_line = {key: state_line[key] for key in state_line if
(not "IAC" in key and not "CHARGE" in key)}
states = {x: pertubation_eds_state(IAC=int(state_line["IAC" + str(x)]),
CHARGE=float(state_line["CHARGE" + str(x)])) for x in
range(1, 1 + self.NPTB)}
final_state_line.update({"STATES": states})
STATEATOMS.append(atom_eds_pertubation_state(**final_state_line))
elif (field == 0):
NJLA, NPTB = tuple(map(int, line.split()))
self.NJLA = NJLA
self.NPTB = NPTB
elif (field == 1):
STATEIDENTIFIERS = line.split()
self.STATEIDENTIFIERS = STATEIDENTIFIERS
field += 1
self.STATEATOMS = STATEATOMS
@property
def nStates(self)->int:
return self.NPTB
@property
def nTotalStateAtoms(self)->int:
return self.NJLA
@property
def states(self)->dict:
return {self.STATEIDENTIFIERS[state-1]: {atom.NR: atom.STATES[state] for atom in sorted(self.STATEATOMS, key=lambda x: x.NR)} for state in range(1, self.NPTB+1)}
"""
ADD FUNCTIONS
"""
def add_state_atoms(self, state_atoms: List[atom_eds_pertubation_state]):
"""
This function can add states and atoms, but also overwrite state values of existing atoms.
If a new state is defined only for a subset of atoms, all other atoms are set to the default dummy.
If a new atom misses a state definition, this state will be set to dummy.
Parameters
----------
state_atoms: List[atom_eds_pertubation_state]
"""
# some preparations:
dummy_state = pertubation_eds_state(IAC=self.dummy_IAC, CHARGE=self.dummy_CHARGE)
insert_id = self.STATEATOMHEADER.index("ALPHLJ")
#find all new states
unique_stateIDs = np.unique(np.concatenate([list(natom.STATES.keys()) for natom in state_atoms]))
## Todo: not urgent; state number adaptation ( present states 1,2,3,4 new state 8 - id should be 5 not 8)
unique_states = list(map(str, [ "state"+str(x) if isinstance(x, Number) else x for x in unique_stateIDs]))
#insert new state IDs
off = 0
for unique_state in unique_stateIDs:
self.STATEATOMHEADER.insert(insert_id+off, "IAC"+str(unique_state))
self.STATEATOMHEADER.insert(insert_id+off+1, "CHARGE"+str(unique_state))
off+=2
#add new state names
self.STATEIDENTIFIERS.extend(unique_states)
#increase the number of new states
self.NPTB += len(unique_states)
#1. Update already present atoms:
atomIDs = [atom.NR for atom in state_atoms]
for atom in self.STATEATOMS:
if(atom.NR in atomIDs):
new_atom = state_atoms[atomIDs.index(atom.NR)]
atom.NAME = new_atom.NAME
atom.STATES.update({key: val for key, val in new_atom.STATES.items()})
#add missing dummies
atom.STATES.update({key: dummy_state for key in unique_stateIDs if not key in atom.STATES})
#remove present atom
del atomIDs[atomIDs.index(atom.NR)]
else:
#add missing dummies
atom.STATES.update({key: dummy_state for key in unique_stateIDs if not key in atom.STATES})
#2. add new atoms
new_atoms = [atom for atom in state_atoms if (atom.NR in atomIDs)]
for atom in new_atoms:
atom.STATES.update({key:dummy_state for key in range(1, self.NPTB+1) if (key not in atom.STATES)})
self.STATEATOMS.append(atom)
self.NJLA +=1
"""
DELETING FUNCTIONS
"""
def delete_state(self, stateIDs:(int, List[int])=None, stateNames:(str, List[str])=None):
"""
This function deletes an state column.
Parameters
----------
stateIDs: int
number of the state
Returns
-------
"""
if(stateIDs is not None and stateNames is not None):
raise Exception("Please give either stateNames or stateIDs, not both")
elif(stateIDs is not None):
if(isinstance(stateIDs, int)):
stateIDs = [stateIDs]
for state in stateIDs:
for atom in self.STATEATOMS:
if(state in atom.STATES):
del atom.STATES[state]
del self.STATEIDENTIFIERS[state - 1]
self.STATEATOMHEADER = [x for x in self.STATEATOMHEADER if
(not x == "IAC" + str(state) and not "CHARGE" + str(state) == x)]
self.NPTB-=len(set(stateIDs))
elif(not stateNames is None):
if(isinstance(stateNames, str)):
stateNames = [stateNames]
for stateN in stateNames:
stateID = self.STATEIDENTIFIERS.index(stateN)+1
for atom in self.STATEATOMS:
if(stateID in atom.STATES):
del atom.STATES[stateID]
del self.STATEIDENTIFIERS[stateID-1]
self.STATEATOMHEADER = [x for x in self.STATEATOMHEADER if( not x == "IAC"+str(stateID) and not "CHARGE"+str(stateID) == x)]
self.NPTB -= len(set(stateNames))
def delete_atom(self, atomNR:(int, List[int])):
"""
This function removes atom lines from the ptp file.
Parameters
----------
atomNR: int or List[int]
atom number(s) to be removed.
"""
if(isinstance(atomNR, int)):
atomNR = [atomNR]
self.STATEATOMS = [atom for atom in self.STATEATOMS if atom.NR not in atomNR]
self.NJLA = len(self.STATEATOMS)
"""
STR FUNCTIONS
"""
def _state_STATEATOMHEADER_str(self):
state_format_pattern = "{:>3} {:>5}"+"".join([" {:>3}{:>10}"for x in range(self.NPTB)])+" {:10} {:10}"
if(len(self.STATEATOMHEADER) != self.NPTB*2+4):
tmp_list = " ".join(["IAC"+str(x)+" "+"CHARGE"+str(x) for x in range(1, self.NPTB+1)])
self.STATEATOMHEADER = self.STATEATOMHEADER[:2]+tmp_list.split(" ")+self.STATEATOMHEADER[-2:]
return state_format_pattern.format(*self.STATEATOMHEADER)
def block_to_string(self) -> str:
result = self.name + self.line_seperator
result += "# NJLA " + self.field_seperator + "NPTB" + self.line_seperator
result += self.field_seperator + str(self.NJLA) + self.field_seperator + str(self.NPTB) + self.line_seperator
result += "# state_identifiers" + self.line_seperator
result += self.field_seperator + self.field_seperator.join(map(str, self.STATEIDENTIFIERS)) + self.line_seperator
result += "# " + self._state_STATEATOMHEADER_str() + self.line_seperator
result += "".join(map(str, sorted(self.STATEATOMS, key=lambda x: x.NR)))
result += "END"+self.line_seperator
return result
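`read_content_from_str` above recovers each atom line by zipping the tokens against the generated header, then regrouping the per-state columns into a `STATES` dict. The core move, sketched standalone for a two-state header with an invented data line:

```python
# Header as generated for NPTB = 2 states.
NPTB = 2
header = ["NR", "NAME"]
for x in range(1, NPTB + 1):
    header += ["IAC" + str(x), "CHARGE" + str(x)]
header += ["ALPHLJ", "ALPHCRF"]

line = "    3    CA  16   -0.25000  22    0.00000    1.00000    1.00000"
state_line = dict(zip(header, line.split()))

# Split the flat dict into the non-state fields and a per-state table.
fields = {k: v for k, v in state_line.items() if "IAC" not in k and "CHARGE" not in k}
states = {
    x: (int(state_line["IAC" + str(x)]), float(state_line["CHARGE" + str(x)]))
    for x in range(1, NPTB + 1)
}
print(fields, states)
```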
class PERTATOMPARAM(_generic_gromos_block):
def __init__(self, STATEATOMS:List[atom_lam_pertubation_state]=None,
STATEATOMHEADER: Tuple[str]= None,
NJLA: int=None, STATEIDENTIFIERS=None,
dummy_IAC = 22, dummy_CHARGE=0.0, content:List[str]=None):
self.NPTB = 2
self.dummy_IAC = dummy_IAC
self.dummy_CHARGE = dummy_CHARGE
if(content is None):
if(STATEATOMHEADER is None):
self.STATEATOMHEADER = ["NR", "RES", "NAME",]
for s in range(1, self.NPTB+1):
self.STATEATOMHEADER += ["IAC"+str(s), "MASS"+str(s), "CHARGE"+str(s)]
self.STATEATOMHEADER += ["ALPHLJ", "ALPHCRF"]
else:
self.STATEATOMHEADER = STATEATOMHEADER
if(STATEATOMS is None):
self.STATEATOMS = []
else:
self.STATEATOMS = []
self.NJLA = 0
self.add_state_atoms(STATEATOMS)
super().__init__(used=True, name=__class__.__name__)
else:
super().__init__(used=True, name=__class__.__name__, content=content)
# You can check yourself :)
if(NJLA is not None and STATEATOMS is not None and len(STATEATOMS) != NJLA):
raise ValueError("NJLA must be equal to the length of STATEATOMS! NJLA="+str(NJLA)+"\t stateatoms="+str(len(STATEATOMS))+"\n\n"+str(self))
def read_content_from_str(self, content:List[str]):
field = 0
NJLA = None
STATEIDENTIFIERS = None
STATEATOMHEADER = None
STATEATOMS = []
first = True
stdid = False
for line in content:
if ("#" in line):
comment = line
if("state_identifiers" in line):
stdid=True
elif(stdid):
STATEIDENTIFIERS = line.replace("#", "").split()
stdid=False
continue
else:
if (field > 0):
if(first):
STATEATOMHEADER = ["NR", "RES", "NAME",]
[STATEATOMHEADER.extend(["IAC" + str(x), "MASS" + str(x), "CHARGE" + str(x)]) for x in range(1, 3)]
STATEATOMHEADER += ["ALPHLJ", "ALPHCRF"]
first = False
state_line = {key: value for key, value in zip(STATEATOMHEADER, line.split())}
final_state_line = {key: state_line[key] for key in state_line if
(not "IAC" in key and not "CHARGE" in key and not "MASS" in key)}
states = {x: pertubation_lam_state_nonbonded(IAC=int(round(float(state_line["IAC" + str(x)]))),
MASS=float(state_line["MASS" + str(x)]),
CHARGE=float(state_line["CHARGE" + str(x)])) for x in range(1, 3)}
final_state_line.update({"STATES":states})
STATEATOMS.append(atom_lam_pertubation_state(**final_state_line))
elif (field == 0):
NJLA = int(line.strip())
field += 1
self.NJLA = NJLA
self.STATEIDENTIFIERS = STATEIDENTIFIERS
self.STATEATOMHEADER = STATEATOMHEADER
self.STATEATOMS = STATEATOMS
@property
def nStates(self)->int:
return self.NPTB
@property
def nTotalStateAtoms(self)->int:
return self.NJLA
@property
def states(self)->dict:
return {self.STATEIDENTIFIERS[state-1]: {atom.NR: atom.STATES[state] for atom in sorted(self.STATEATOMS, key=lambda x: x.NR)} for state in range(1, self.NPTB+1)}
"""
ADD FUNCTIONS
"""
def add_state_atoms(self, state_atoms: List[atom_lam_pertubation_state]):
"""
This function can add states and atoms, but also overwrite state values of existing atoms.
If a new state is defined only for a subset of atoms, all other atoms are set to the default dummy.
If a new atom misses a state definition, this state will be set to dummy.
Parameters
----------
state_atoms: List[atom_lam_pertubation_state]
"""
# some preparations:
pre_dummy_state = lambda atomMass: pertubation_lam_state_nonbonded(IAC=self.dummy_IAC, MASS=atomMass, CHARGE=self.dummy_CHARGE)
insert_id = self.STATEATOMHEADER.index("ALPHLJ")
#find all new states
keys = np.array([list(natom.STATES.keys()) for natom in state_atoms], ndmin=1)
unique_stateIDs = np.unique(np.concatenate(keys))
## Todo: not urgent; state number adaptation ( present states 1,2,3,4 new state 8 - id should be 5 not 8)
unique_states = list(map(str, [ "state"+str(x) if isinstance(x, Number) else x for x in unique_stateIDs]))
#insert new state IDs
off = 0
for unique_state in unique_stateIDs:
self.STATEATOMHEADER.insert(insert_id+off, "IAC"+str(unique_state))
self.STATEATOMHEADER.insert(insert_id+off+1, "mass"+str(unique_state))
self.STATEATOMHEADER.insert(insert_id+off+2, "CHARGE"+str(unique_state))
off+=3
#add new state names
if(hasattr(self, "STATEIDENTIFIERS")):
self.STATEIDENTIFIERS.extend(unique_states)
self.NPTB += len(unique_states)
else:
self.STATEIDENTIFIERS = unique_states
self.NPTB = len(unique_states)
#increase the number of new states
#1. Update already present atoms:
atomIDs = [atom.NR for atom in state_atoms]
for atom in self.STATEATOMS:
possible_masses = [val.MASS for key, val in atom.STATES.items() if(val.MASS >0)]
dummy_state = pre_dummy_state(atomMass=possible_masses[0])
if(atom.NR in atomIDs):
new_atom = state_atoms[atomIDs.index(atom.NR)]
atom.NAME = new_atom.NAME
atom.STATES.update({key: val for key, val in new_atom.STATES.items()})
possible_masses = [val.MASS for key, val in new_atom.STATES.items() if(val.MASS >0)]
#add missing dummies
atom.STATES.update({key: dummy_state for key in unique_stateIDs if not key in atom.STATES})
#remove present atom
del atomIDs[atomIDs.index(atom.NR)]
else:
#add missing dummies
atom.STATES.update({key: dummy_state for key in unique_stateIDs if not key in atom.STATES})
#2. add new atoms
new_atoms = [atom for atom in state_atoms if (atom.NR in atomIDs)]
for atom in new_atoms:
possible_masses = [val.MASS for key, val in atom.STATES.items() if(val.MASS >0)]
dummy_state = pre_dummy_state(atomMass=possible_masses[0])
atom.STATES.update({key:dummy_state for key in range(1, self.NPTB+1) if (key not in atom.STATES)})
self.STATEATOMS.append(atom)
self.NJLA +=1
"""
DELETING FUNCTIONS
"""
def delete_state(self, stateIDs:(int, List[int])=None, stateNames:(str, List[str])=None):
"""
This function deletes an state column.
Parameters
----------
stateIDs: int
number of the state
Returns
-------
"""
if(stateIDs is not None and stateNames is not None):
raise Exception("Please give either stateNames or stateIDs, not both")
elif(stateIDs is not None):
if(isinstance(stateIDs, int)):
stateIDs = [stateIDs]
for state in stateIDs:
for atom in self.STATEATOMS:
if(state in atom.STATES):
del atom.STATES[state]
del self.STATEIDENTIFIERS[state - 1]
self.STATEATOMHEADER = [x for x in self.STATEATOMHEADER if
(not x == "IAC" + str(state) and not "CHARGE" + str(state) == x)]
self.NPTB-=len(set(stateIDs))
elif(not stateNames is None):
if(isinstance(stateNames, str)):
stateNames = [stateNames]
for stateN in stateNames:
stateID = self.STATEIDENTIFIERS.index(stateN)+1
for atom in self.STATEATOMS:
if(stateID in atom.STATES):
del atom.STATES[stateID]
del self.STATEIDENTIFIERS[stateID-1]
self.STATEATOMHEADER = [x for x in self.STATEATOMHEADER if( not x == "IAC"+str(stateID) and not "CHARGE"+str(stateID) == x)]
self.NPTB -= len(set(stateNames))
def delete_atom(self, atomNR:(int, List[int])):
"""
This function removes atom lines from the ptp file.
Parameters
----------
atomNR: int or List[int]
atom number(s) to be removed.
"""
if(isinstance(atomNR, int)):
atomNR = [atomNR]
self.STATEATOMS = [atom for atom in self.STATEATOMS if atom.NR not in atomNR]
self.NJLA = len(self.STATEATOMS)
"""
STR FUNCTIONS
"""
def _state_STATEATOMHEADER_str(self):
state_format_pattern = "{:>5} {:>5} {:>5}"+"".join([" {:>5}{:>5}{:>10}"for x in range(self.NPTB)])+" {:10} {:10}"
return state_format_pattern.format(*self.STATEATOMHEADER)
def block_to_string(self) -> str:
result = self.name + self.line_seperator
result += "# NJLA " + self.field_seperator + "NPTB = " + self.field_seperator + str(self.NPTB) + self.field_seperator+ self.line_seperator
result += self.field_seperator + str(self.NJLA)+self.line_seperator
result += "# state_identifiers" + self.line_seperator
result += "# "+self.field_seperator + self.field_seperator.join(map(str, self.STATEIDENTIFIERS)) + self.line_seperator
result += "# " + self._state_STATEATOMHEADER_str() + self.line_seperator
result += "".join(map(str, sorted(self.STATEATOMS, key=lambda x: x.NR)))
result += "END"+self.line_seperator
return result
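`add_state_atoms` above pads every atom so that all atoms carry an entry for every state, substituting a dummy for states the caller did not define. The padding step in isolation (atom numbers and dummy values invented for the sketch):

```python
# Each atom: NR -> {state_id: (IAC, CHARGE)}; state 2 is missing for atom 1.
atoms = {
    1: {1: (16, -0.25)},
    2: {1: (16, -0.25), 2: (22, 0.0)},
}
dummy_state = (22, 0.0)  # dummy IAC / dummy charge, like dummy_IAC / dummy_CHARGE
all_states = {s for states in atoms.values() for s in states}

# Fill in the dummy for every (atom, state) pair that is not yet defined.
for states in atoms.values():
    states.update({s: dummy_state for s in all_states if s not in states})
print(atoms)
```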
### BONDED
class PERTBONDSTRETCH(_generic_gromos_block):
def __init__(self, STATEATOMS:List[atom_lam_pertubation_state_bond]=None,
STATEATOMHEADER: Tuple[str]= None,
NPB: int=None,
dummy_BOND = 22, content:List[str]=None):
self.NPTB = 2
self.dummy_BOND = dummy_BOND
if(content is None):
if(STATEATOMHEADER is None):
self.STATEATOMHEADER = ["atomI", "atomJ", "type1", "type2"]
else:
self.STATEATOMHEADER = STATEATOMHEADER
if(STATEATOMS is None):
self.STATEATOMS = []
else:
self.STATEATOMS = []
self.NPB = 0
#self.add_state_atoms(STATEATOMS)
super().__init__(used=True, name=__class__.__name__)
else:
super().__init__(used=True, name=__class__.__name__, content=content)
# You can check yourself :)
if(NPB is not None and STATEATOMS is not None and len(STATEATOMS) != NPB):
raise ValueError("NPB must be equal to the length of STATEATOMS! NPB="+str(NPB)+"\t stateatoms="+str(len(STATEATOMS))+"\n\n"+str(self))
@property
def nStates(self)->int:
return self.NPTB
@property
def nTotalStateAtoms(self)->int:
return self.NPB
@property
def states(self)->dict:
return {self.STATEIDENTIFIERS[state-1]: {atom.NR: atom.STATES[state] for atom in sorted(self.STATEATOMS, key=lambda x: x.NR)} for state in range(1, self.NPTB+1)}
def read_content_from_str(self, content:List[str]):
field = 0
NPB = None
STATEIDENTIFIERS = None
STATEATOMHEADER = None
STATEATOMS = []
first = True
stdid = False
for line in content:
if ("#" in line):
comment = line
if("state_identifiers" in line):
stdid=True
elif(stdid):
STATEIDENTIFIERS = line.replace("#", "").split()
stdid=False
continue
else:
if (field > 0):
if(first):
STATEATOMHEADER = ["atomI", "atomJ", "type1", "type2"]
first = False
state_line = {key: value for key, value in zip(STATEATOMHEADER, line.split())}
state_line.update({"NR":len(STATEATOMS)+1})
final_state_line = {key: state_line[key] for key in state_line if (not "type" in key)}
states = {1: state_line["type1"],
2: state_line["type2"]}
final_state_line.update({"STATES":states})
STATEATOMS.append(atom_lam_pertubation_state_bond(**final_state_line))
elif (field == 0):
NPB = int(line.strip())
field += 1
self.NPB = NPB
self.STATEIDENTIFIERS = STATEIDENTIFIERS
self.STATEATOMHEADER = STATEATOMHEADER
self.STATEATOMS = STATEATOMS
"""
STR FUNCTIONS
"""
def _state_STATEATOMHEADER_str(self):
state_format_pattern = "{:>5} {:>5}"+"".join([" {:>5} "for x in range(self.NPTB)])+""
return state_format_pattern.format(*self.STATEATOMHEADER)
def block_to_string(self) -> str:
result = self.name + self.line_seperator
result += "# NPB " + self.field_seperator + "NPTB = " + self.field_seperator + str(self.NPTB) + self.field_seperator+ self.line_seperator
result += self.field_seperator + str(self.NPB)+self.line_seperator
result += "# " + self._state_STATEATOMHEADER_str() + self.line_seperator
result += "".join(map(str, sorted(self.STATEATOMS, key=lambda x: x.NR)))
result += "END"+self.line_seperator
return result
class PERTBONDSTRETCHH(_generic_gromos_block):
def __init__(self, STATEATOMS:List[atom_lam_pertubation_state_bond]=None,
STATEATOMHEADER: Tuple[str]= None,
NPB: int=None,
dummy_BOND = 22, content:List[str]=None):
self.NPTB = 2
self.dummy_BOND = dummy_BOND
if(content is None):
if(STATEATOMHEADER is None):
self.STATEATOMHEADER = ["atomI", "atomJ", "type1", "type2"]
else:
self.STATEATOMHEADER = STATEATOMHEADER
if(STATEATOMS is None):
self.STATEATOMS = []
else:
self.STATEATOMS = []
self.NPB = 0
#self.add_state_atoms(STATEATOMS)
super().__init__(used=True, name=__class__.__name__)
else:
super().__init__(used=True, name=__class__.__name__, content=content)
# You can check yourself :)
if(NPB is not None and STATEATOMS is not None and len(STATEATOMS) != NPB):
raise ValueError("NPB must be equal to the length of STATEATOMS! NPB="+str(NPB)+"\t stateatoms="+str(len(STATEATOMS))+"\n\n"+str(self))
@property
def nStates(self)->int:
return self.NPTB
@property
def nTotalStateAtoms(self)->int:
return self.NPB
@property
def states(self)->dict:
return {self.STATEIDENTIFIERS[state-1]: {atom.NR: atom.STATES[state] for atom in sorted(self.STATEATOMS, key=lambda x: x.NR)} for state in range(1, self.NPTB+1)}
def read_content_from_str(self, content:List[str]):
field = 0
NPB = None
STATEIDENTIFIERS = None
STATEATOMHEADER = None
STATEATOMS = []
first = True
stdid = False
for line in content:
if ("#" in line):
comment = line
if("state_identifiers" in line):
stdid=True
elif(stdid):
STATEIDENTIFIERS = line.replace("#", "").split()
stdid=False
continue
else:
if (field > 0):
if(first):
STATEATOMHEADER = ["atomI", "atomJ", "type1", "type2"]
first = False
state_line = {key: value for key, value in zip(STATEATOMHEADER, line.split())}
state_line.update({"NR":len(STATEATOMS)+1})
final_state_line = {key: state_line[key] for key in state_line if (not "type" in key)}
states = {1: state_line["type1"],
2: state_line["type2"]}
final_state_line.update({"STATES":states})
STATEATOMS.append(atom_lam_pertubation_state_bond(**final_state_line))
elif (field == 0):
NPB = int(line.strip())
field += 1
self.NPB = NPB
self.STATEIDENTIFIERS = STATEIDENTIFIERS
self.STATEATOMHEADER = STATEATOMHEADER
self.STATEATOMS = STATEATOMS
"""
STR FUNCTIONS
"""
def _state_STATEATOMHEADER_str(self):
state_format_pattern = "{:>5} {:>5}"+"".join([" {:>5} "for x in range(self.NPTB)])+""
return state_format_pattern.format(*self.STATEATOMHEADER)
def block_to_string(self) -> str:
result = self.name + self.line_seperator
result += "# NPB " + self.field_seperator + "NPTB = " + self.field_seperator + str(self.NPTB) + self.field_seperator+ self.line_seperator
result += self.field_seperator + str(self.NPB)+self.line_seperator
result += "# " + self._state_STATEATOMHEADER_str() + self.line_seperator
result += "".join(map(str, sorted(self.STATEATOMS, key=lambda x: x.NR)))
result += "END"+self.line_seperator
return result
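The bonded readers above (bond, angle, dihedral, and their H variants) all parse one data line into atom indices plus the two state type codes, assigning a running `NR`. For bond lines the regrouping looks like this (tokens invented for the sketch):

```python
header = ["atomI", "atomJ", "type1", "type2"]
content = ["    1     2    12    27", "    2     3     5     5"]

stateatoms = []
for line in content:
    state_line = dict(zip(header, line.split()))
    state_line["NR"] = len(stateatoms) + 1   # running index, as in the reader above
    # Keep the atom indices, move the type columns into a STATES dict.
    entry = {k: v for k, v in state_line.items() if "type" not in k}
    entry["STATES"] = {1: state_line["type1"], 2: state_line["type2"]}
    stateatoms.append(entry)
print(stateatoms)
```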
### ANGLE
class PERTBONDANGLE(_generic_gromos_block):
def __init__(self, STATEATOMS:List[atom_lam_pertubation_state_angle]=None,
STATEATOMHEADER: Tuple[str]= None,
NPA: int=None,
dummy_ANGLE = 22, content:List[str]=None):
self.NPTB = 2
self.dummy_ANGLE = dummy_ANGLE
if(content is None):
if(STATEATOMHEADER is None):
self.STATEATOMHEADER = ["atomI", "atomJ", "atomK", "type1", "type2"]
else:
self.STATEATOMHEADER = STATEATOMHEADER
if(STATEATOMS is None):
self.STATEATOMS = []
else:
self.STATEATOMS = []
self.NPA = 0
#self.add_state_atoms(STATEATOMS)
super().__init__(used=True, name=__class__.__name__)
else:
super().__init__(used=True, name=__class__.__name__, content=content)
# You can check yourself :)
if(NPA is not None and STATEATOMS is not None and len(STATEATOMS) != NPA):
raise ValueError("NPA must be equal to the length of STATEATOMS! NPA="+str(NPA)+"\t stateatoms="+str(len(STATEATOMS))+"\n\n"+str(self))
@property
def nStates(self)->int:
return self.NPTB
@property
def nTotalStateAtoms(self)->int:
return self.NPA
@property
def states(self)->dict:
return {self.STATEIDENTIFIERS[state-1]: {atom.NR: atom.STATES[state] for atom in sorted(self.STATEATOMS, key=lambda x: x.NR)} for state in range(1, self.NPTB+1)}
def read_content_from_str(self, content:List[str]):
field = 0
NPA = None
STATEIDENTIFIERS = None
STATEATOMHEADER = None
STATEATOMS = []
first = True
stdid = False
for line in content:
if ("#" in line):
comment = line
if("state_identifiers" in line):
stdid=True
elif(stdid):
STATEIDENTIFIERS = line.replace("#", "").split()
stdid=False
continue
else:
if (field > 0):
if(first):
STATEATOMHEADER = ["atomI", "atomJ", "atomK", "type1", "type2"]
first = False
state_line = {key: value for key, value in zip(STATEATOMHEADER, line.split())}
state_line.update({"NR":len(STATEATOMS)+1})
final_state_line = {key: state_line[key] for key in state_line if (not "type" in key)}
states = {1: state_line["type1"],
2: state_line["type2"]}
final_state_line.update({"STATES":states})
STATEATOMS.append(atom_lam_pertubation_state_angle(**final_state_line))
elif (field == 0):
NPA = int(line.strip())
field += 1
self.NPA = NPA
self.STATEIDENTIFIERS = STATEIDENTIFIERS
self.STATEATOMHEADER = STATEATOMHEADER
self.STATEATOMS = STATEATOMS
"""
STR FUNCTIONS
"""
def _state_STATEATOMHEADER_str(self):
state_format_pattern = "{:>5} {:>5} {:>5}"+"".join([" {:>5} "for x in range(self.NPTB)])+""
return state_format_pattern.format(*self.STATEATOMHEADER)
def block_to_string(self) -> str:
result = self.name + self.line_seperator
result += "# NPA " + self.field_seperator + "NPTB = " + self.field_seperator + str(self.NPTB) + self.field_seperator+ self.line_seperator
result += self.field_seperator + str(self.NPA)+self.line_seperator
result += "# " + self._state_STATEATOMHEADER_str() + self.line_seperator
result += "".join(map(str, sorted(self.STATEATOMS, key=lambda x: x.NR)))
result += "END"+self.line_seperator
return result
class PERTBONDANGLEH(_generic_gromos_block):
def __init__(self, STATEATOMS:List[atom_lam_pertubation_state_angle]=None,
STATEATOMHEADER: Tuple[str]= None,
NPA: int=None,
dummy_ANGLE = 22, content:List[str]=None):
self.NPTB = 2
self.dummy_ANGLE = dummy_ANGLE
if(content is None):
if(STATEATOMHEADER is None):
self.STATEATOMHEADER = ["atomI", "atomJ", "atomK", "type1", "type2"]
else:
self.STATEATOMHEADER = STATEATOMHEADER
if(STATEATOMS is None):
self.STATEATOMS = []
else:
self.STATEATOMS = []
self.NPA = 0
#self.add_state_atoms(STATEATOMS)
super().__init__(used=True, name=__class__.__name__)
else:
super().__init__(used=True, name=__class__.__name__, content=content)
# You can check yourself :)
if(NPA is not None and STATEATOMS is not None and len(STATEATOMS) != NPA):
raise ValueError("NPA must be equal to the length of STATEATOMS! NPA="+str(NPA)+"\t stateatoms="+str(len(STATEATOMS))+"\n\n"+str(self))
@property
def nStates(self)->int:
return self.NPTB
@property
def nTotalStateAtoms(self)->int:
return self.NPA
@property
def states(self)->dict:
return {self.STATEIDENTIFIERS[state-1]: {atom.NR: atom.STATES[state] for atom in sorted(self.STATEATOMS, key=lambda x: x.NR)} for state in range(1, self.NPTB+1)}
def read_content_from_str(self, content:List[str]):
field = 0
NPA = None
STATEIDENTIFIERS = None
STATEATOMHEADER = None
STATEATOMS = []
first = True
stdid = False
for line in content:
if ("#" in line):
comment = line
if("state_identifiers" in line):
stdid=True
elif(stdid):
STATEIDENTIFIERS = line.replace("#", "").split()
stdid=False
continue
else:
if (field > 0):
if(first):
STATEATOMHEADER = ["atomI", "atomJ", "atomK", "type1", "type2"]
first = False
state_line = {key: value for key, value in zip(STATEATOMHEADER, line.split())}
state_line.update({"NR":len(STATEATOMS)+1})
final_state_line = {key: state_line[key] for key in state_line if (not "type" in key)}
states = {1: state_line["type1"],
2: state_line["type2"]}
final_state_line.update({"STATES":states})
STATEATOMS.append(atom_lam_pertubation_state_angle(**final_state_line))
elif (field == 0):
NPA = int(line.strip())
field += 1
self.NPA = NPA
self.STATEIDENTIFIERS = STATEIDENTIFIERS
self.STATEATOMHEADER = STATEATOMHEADER
self.STATEATOMS = STATEATOMS
"""
STR FUNCTIONS
"""
def _state_STATEATOMHEADER_str(self):
state_format_pattern = "{:>5} {:>5} {:>5}"+"".join([" {:>5} "for x in range(self.NPTB)])+""
return state_format_pattern.format(*self.STATEATOMHEADER)
def block_to_string(self) -> str:
result = self.name + self.line_seperator
result += "# NPA " + self.field_seperator + "NPTB = " + self.field_seperator + str(self.NPTB) + self.field_seperator+ self.line_seperator
result += self.field_seperator + str(self.NPA)+self.line_seperator
result += "# " + self._state_STATEATOMHEADER_str() + self.line_seperator
result += "".join(map(str, sorted(self.STATEATOMS, key=lambda x: x.NR)))
result += "END"+self.line_seperator
return result
### DIHEDRAL
class PERTPROPERDIH(_generic_gromos_block):
def __init__(self, STATEATOMS:List[atom_lam_pertubation_state_dihedral]=None,
STATEATOMHEADER: Tuple[str]= None,
NPD: int=None,
dummy_DIH = 22, content:List[str]=None):
self.NPTB = 2
self.dummy_DIH = dummy_DIH
if(content is None):
if(STATEATOMHEADER is None):
self.STATEATOMHEADER = ["atomI", "atomJ", "atomK", "atomL", "type1", "type2"]
else:
self.STATEATOMHEADER = STATEATOMHEADER
if(STATEATOMS is None):
self.STATEATOMS = []
else:
self.STATEATOMS = []
self.NPD = 0
#self.add_state_atoms(STATEATOMS)
super().__init__(used=True, name=__class__.__name__)
else:
super().__init__(used=True, name=__class__.__name__, content=content)
# You can check yourself :)
if(NPD is not None and STATEATOMS is not None and len(STATEATOMS) != NPD):
raise ValueError("NPD must be equal to the length of STATEATOMS! NPD="+str(NPD)+"\t stateatoms="+str(len(STATEATOMS))+"\n\n"+str(self))
@property
def nStates(self)->int:
return self.NPTB
@property
def nTotalStateAtoms(self)->int:
return self.NPD
@property
def states(self)->dict:
return {self.STATEIDENTIFIERS[state-1]: {atom.NR: atom.STATES[state] for atom in sorted(self.STATEATOMS, key=lambda x: x.NR)} for state in range(1, self.NPTB+1)}
def read_content_from_str(self, content:List[str]):
field = 0
NPD = None
STATEIDENTIFIERS = None
STATEATOMHEADER = None
STATEATOMS = []
first = True
stdid = False
for line in content:
if ("#" in line):
comment = line
if("state_identifiers" in line):
stdid=True
elif(stdid):
STATEIDENTIFIERS = line.replace("#", "").split()
stdid=False
continue
else:
if (field > 0):
if(first):
STATEATOMHEADER = ["atomI", "atomJ", "atomK", "atomL", "type1", "type2"]
first = False
state_line = {key: value for key, value in zip(STATEATOMHEADER, line.split())}
state_line.update({"NR":len(STATEATOMS)+1})
final_state_line = {key: state_line[key] for key in state_line if (not "type" in key)}
states = {1: state_line["type1"],
2: state_line["type2"]}
final_state_line.update({"STATES":states})
STATEATOMS.append(atom_lam_pertubation_state_dihedral(**final_state_line))
elif (field == 0):
NPD = int(line.strip())
field += 1
self.NPD = NPD
self.STATEIDENTIFIERS = STATEIDENTIFIERS
self.STATEATOMHEADER = STATEATOMHEADER
self.STATEATOMS = STATEATOMS
"""
STR FUNCTIONS
"""
def _state_STATEATOMHEADER_str(self):
state_format_pattern = "{:>5} {:>5} {:>5} {:>5}"+"".join([" {:>5} "for x in range(self.NPTB)])+""
return state_format_pattern.format(*self.STATEATOMHEADER)
def block_to_string(self) -> str:
result = self.name + self.line_seperator
result += "# NPD " + self.field_seperator + "NPTB = " + self.field_seperator + str(self.NPTB) + self.field_seperator+ self.line_seperator
result += self.field_seperator + str(self.NPD)+self.line_seperator
result += "# " + self._state_STATEATOMHEADER_str() + self.line_seperator
result += "".join(map(str, sorted(self.STATEATOMS, key=lambda x: x.NR)))
result += "END"+self.line_seperator
return result
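The per-line parsing inside `read_content_from_str` can be sketched on its own. This is a minimal standalone rendering of the zip/dict steps above; the header and data line are illustrative values, and field values stay strings exactly as in the original:

```python
# Minimal sketch of the line -> state-atom record transform used in
# read_content_from_str above; field values remain strings.
STATEATOMHEADER = ["atomI", "atomJ", "atomK", "atomL", "type1", "type2"]
data_line = "1 2 3 4 17 22"

state_line = dict(zip(STATEATOMHEADER, data_line.split()))
state_line["NR"] = 1  # len(STATEATOMS) + 1 for the first parsed atom

# Drop the per-state type columns and fold them into a STATES dict.
record = {key: value for key, value in state_line.items() if "type" not in key}
record["STATES"] = {1: state_line["type1"], 2: state_line["type2"]}

assert record == {"atomI": "1", "atomJ": "2", "atomK": "3", "atomL": "4",
                  "NR": 1, "STATES": {1: "17", 2: "22"}}
```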
class PERTPROPERDIHH(_generic_gromos_block):
def __init__(self, STATEATOMS:List[atom_lam_pertubation_state_dihedral]=None,
STATEATOMHEADER: Tuple[str]= None,
NPD: int=None,
dummy_DIH = 22, content:List[str]=None):
self.NPTB = 2
self.dummy_DIH = dummy_DIH
if(content is None):
if(STATEATOMHEADER is None):
self.STATEATOMHEADER = ["atomI", "atomJ", "atomK", "atomL", "type1", "type2"]
else:
self.STATEATOMHEADER = STATEATOMHEADER
            if(STATEATOMS is None):
                self.STATEATOMS = []
                self.NPD = 0
            else:
                # keep the atoms passed in; this branch previously discarded them
                self.STATEATOMS = list(STATEATOMS)
                self.NPD = len(self.STATEATOMS)
super().__init__(used=True, name=__class__.__name__)
else:
super().__init__(used=True, name=__class__.__name__, content=content)
# You can check yourself :)
if(not NPD is None and not len(STATEATOMS)==NPD):
            raise ValueError("NPD must be equal to the length of STATEATOMS! NPD="+str(NPD)+"\t stateatoms"+str(len(STATEATOMS))+"\n\n"+str(self))
@property
def nStates(self)->int:
return self.NPTB
@property
def nTotalStateAtoms(self)->int:
return self.NPD
@property
def states(self)->dict:
return {self.STATEIDENTIFIERS[state-1]: {atom.NR: atom.STATES[state] for atom in sorted(self.STATEATOMS, key=lambda x: x.NR)} for state in range(1, self.NPTB+1)}
def read_content_from_str(self, content:List[str]):
field = 0
NPD = None
STATEIDENTIFIERS = None
STATEATOMHEADER = None
STATEATOMS = []
first = True
stdid = False
for line in content:
if ("#" in line):
comment = line
if("state_identifiers" in line):
stdid=True
elif(stdid):
STATEIDENTIFIERS = line.replace("#", "").split()
stdid=False
continue
else:
if (field > 0):
if(first):
STATEATOMHEADER = ["atomI", "atomJ", "atomK", "atomL", "type1", "type2"]
first = False
state_line = {key: value for key, value in zip(STATEATOMHEADER, line.split())}
state_line.update({"NR":len(STATEATOMS)+1})
final_state_line = {key: state_line[key] for key in state_line if (not "type" in key)}
states = {1: state_line["type1"],
2: state_line["type2"]}
final_state_line.update({"STATES":states})
STATEATOMS.append(atom_lam_pertubation_state_dihedral(**final_state_line))
elif (field == 0):
NPD = int(line.strip())
field += 1
self.NPD = NPD
self.STATEIDENTIFIERS = STATEIDENTIFIERS
self.STATEATOMHEADER = STATEATOMHEADER
self.STATEATOMS = STATEATOMS
"""
STR FUNCTIONS
"""
def _state_STATEATOMHEADER_str(self):
state_format_pattern = "{:>5} {:>5} {:>5} {:>5}"+"".join([" {:>5} "for x in range(self.NPTB)])+""
return state_format_pattern.format(*self.STATEATOMHEADER)
def block_to_string(self) -> str:
result = self.name + self.line_seperator
result += "# NPD " + self.field_seperator + "NPTB = " + self.field_seperator + str(self.NPTB) + self.field_seperator+ self.line_seperator
result += self.field_seperator + str(self.NPD)+self.line_seperator
result += "# " + self._state_STATEATOMHEADER_str() + self.line_seperator
result += "".join(map(str, sorted(self.STATEATOMS, key=lambda x: x.NR)))
result += "END"+self.line_seperator
return result
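The fixed-width header line produced by `_state_STATEATOMHEADER_str` can be reproduced standalone. A sketch assuming the default `NPTB == 2` and the six-column header used above:

```python
# Standalone sketch of the header format pattern: four atom columns plus
# one padded column per perturbation state (NPTB == 2 here).
NPTB = 2
header = ["atomI", "atomJ", "atomK", "atomL", "type1", "type2"]
pattern = "{:>5} {:>5} {:>5} {:>5}" + "".join([" {:>5} " for _ in range(NPTB)])
header_line = pattern.format(*header)

assert header_line.split() == header
```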
class PERTPROPERDIH(_generic_gromos_block):
def __init__(self, STATEATOMS: List[atom_lam_pertubation_state_dihedral] = None,
STATEATOMHEADER: Tuple[str] = None,
NPD: int = None,
dummy_DIH=22, content: List[str] = None):
self.NPTB = 2
self.dummy_DIH = dummy_DIH
if (content is None):
if (STATEATOMHEADER is None):
self.STATEATOMHEADER = ["atomI", "atomJ", "atomK", "atomL", "type1", "type2"]
else:
self.STATEATOMHEADER = STATEATOMHEADER
            if (STATEATOMS is None):
                self.STATEATOMS = []
                self.NPD = 0
            else:
                # keep the atoms passed in; this branch previously discarded them
                self.STATEATOMS = list(STATEATOMS)
                self.NPD = len(self.STATEATOMS)
super().__init__(used=True, name=__class__.__name__)
else:
super().__init__(used=True, name=__class__.__name__, content=content)
# You can check yourself :)
if (not NPD is None and not len(STATEATOMS) == NPD):
            raise ValueError(
                "NPD must be equal to the length of STATEATOMS! NPD=" + str(NPD) + "\t stateatoms" + str(
                    len(STATEATOMS)) + "\n\n" + str(self))
@property
def nStates(self) -> int:
return self.NPTB
@property
def nTotalStateAtoms(self) -> int:
return self.NPD
@property
def states(self) -> dict:
return {self.STATEIDENTIFIERS[state - 1]: {atom.NR: atom.STATES[state] for atom in
sorted(self.STATEATOMS, key=lambda x: x.NR)} for state in
range(1, self.NPTB + 1)}
def read_content_from_str(self, content: List[str]):
field = 0
NPD = None
STATEIDENTIFIERS = None
STATEATOMHEADER = None
STATEATOMS = []
first = True
stdid = False
for line in content:
if ("#" in line):
comment = line
if ("state_identifiers" in line):
stdid = True
elif (stdid):
STATEIDENTIFIERS = line.replace("#", "").split()
stdid = False
continue
else:
if (field > 0):
if (first):
STATEATOMHEADER = ["atomI", "atomJ", "atomK", "atomL", "type1", "type2"]
first = False
state_line = {key: value for key, value in zip(STATEATOMHEADER, line.split())}
state_line.update({"NR": len(STATEATOMS) + 1})
final_state_line = {key: state_line[key] for key in state_line if (not "type" in key)}
states = {1: state_line["type1"],
2: state_line["type2"]}
final_state_line.update({"STATES": states})
STATEATOMS.append(atom_lam_pertubation_state_dihedral(**final_state_line))
elif (field == 0):
NPD = int(line.strip())
field += 1
self.NPD = NPD
self.STATEIDENTIFIERS = STATEIDENTIFIERS
self.STATEATOMHEADER = STATEATOMHEADER
self.STATEATOMS = STATEATOMS
"""
STR FUNCTIONS
"""
def _state_STATEATOMHEADER_str(self):
state_format_pattern = "{:>5} {:>5} {:>5} {:>5}" + "".join([" {:>5} " for x in range(self.NPTB)]) + ""
return state_format_pattern.format(*self.STATEATOMHEADER)
def block_to_string(self) -> str:
result = self.name + self.line_seperator
result += "# NPD " + self.field_seperator + "NPTB = " + self.field_seperator + str(
self.NPTB) + self.field_seperator + self.line_seperator
result += self.field_seperator + str(self.NPD) + self.line_seperator
result += "# " + self._state_STATEATOMHEADER_str() + self.line_seperator
result += "".join(map(str, sorted(self.STATEATOMS, key=lambda x: x.NR)))
result += "END" + self.line_seperator
return result
### IMPROPER
class PERTIMROPERDIH(_generic_gromos_block):
def __init__(self, STATEATOMS:List[atom_lam_pertubation_state_dihedral]=None,
STATEATOMHEADER: Tuple[str]= None,
NPD: int=None,
dummy_IMP = 22, content:List[str]=None):
self.NPTB = 2
self.dummy_DIH = dummy_IMP
if(content is None):
if(STATEATOMHEADER is None):
self.STATEATOMHEADER = ["atomI", "atomJ", "atomK", "atomL", "type1", "type2"]
else:
self.STATEATOMHEADER = STATEATOMHEADER
            if(STATEATOMS is None):
                self.STATEATOMS = []
                self.NPD = 0
            else:
                # keep the atoms passed in; this branch previously discarded them
                self.STATEATOMS = list(STATEATOMS)
                self.NPD = len(self.STATEATOMS)
super().__init__(used=True, name=__class__.__name__)
else:
super().__init__(used=True, name=__class__.__name__, content=content)
# You can check yourself :)
if(not NPD is None and not len(STATEATOMS)==NPD):
raise ValueError("NPD must be equal to the length of STATEATOMS! NPD="+str(NPD)+"\t stateatoms"+str(len(STATEATOMS))+"\n\n"+str(self))
@property
def nStates(self)->int:
return self.NPTB
@property
def nTotalStateAtoms(self)->int:
return self.NPD
@property
def states(self)->dict:
return {self.STATEIDENTIFIERS[state-1]: {atom.NR: atom.STATES[state] for atom in sorted(self.STATEATOMS, key=lambda x: x.NR)} for state in range(1, self.NPTB+1)}
def read_content_from_str(self, content:List[str]):
field = 0
NPD = None
STATEIDENTIFIERS = None
STATEATOMHEADER = None
STATEATOMS = []
first = True
stdid = False
for line in content:
if ("#" in line):
comment = line
if("state_identifiers" in line):
stdid=True
elif(stdid):
STATEIDENTIFIERS = line.replace("#", "").split()
stdid=False
continue
else:
if (field > 0):
if(first):
STATEATOMHEADER = ["atomI", "atomJ", "atomK", "atomL", "type1", "type2"]
first = False
state_line = {key: value for key, value in zip(STATEATOMHEADER, line.split())}
state_line.update({"NR":len(STATEATOMS)+1})
final_state_line = {key: state_line[key] for key in state_line if (not "type" in key)}
states = {1: state_line["type1"],
2: state_line["type2"]}
final_state_line.update({"STATES":states})
STATEATOMS.append(atom_lam_pertubation_state_dihedral(**final_state_line))
elif (field == 0):
NPD = int(line.strip())
field += 1
self.NPD = NPD
self.STATEIDENTIFIERS = STATEIDENTIFIERS
self.STATEATOMHEADER = STATEATOMHEADER
self.STATEATOMS = STATEATOMS
"""
STR FUNCTIONS
"""
def _state_STATEATOMHEADER_str(self):
state_format_pattern = "{:>5} {:>5} {:>5} {:>5}"+"".join([" {:>5} "for x in range(self.NPTB)])+""
return state_format_pattern.format(*self.STATEATOMHEADER)
def block_to_string(self) -> str:
result = self.name + self.line_seperator
result += "# NPD " + self.field_seperator + "NPTB = " + self.field_seperator + str(self.NPTB) + self.field_seperator+ self.line_seperator
result += self.field_seperator + str(self.NPD)+self.line_seperator
result += "# " + self._state_STATEATOMHEADER_str() + self.line_seperator
result += "".join(map(str, sorted(self.STATEATOMS, key=lambda x: x.NR)))
result += "END"+self.line_seperator
return result
class PERTIMROPERDIHH(_generic_gromos_block):
def __init__(self, STATEATOMS:List[atom_lam_pertubation_state_dihedral]=None,
STATEATOMHEADER: Tuple[str]= None,
NPD: int=None,
dummy_IMP = 22, content:List[str]=None):
self.NPTB = 2
self.dummy_DIH = dummy_IMP
if(content is None):
if(STATEATOMHEADER is None):
self.STATEATOMHEADER = ["atomI", "atomJ", "atomK", "atomL", "type1", "type2"]
else:
self.STATEATOMHEADER = STATEATOMHEADER
            if(STATEATOMS is None):
                self.STATEATOMS = []
                self.NPD = 0
            else:
                # keep the atoms passed in; this branch previously discarded them
                self.STATEATOMS = list(STATEATOMS)
                self.NPD = len(self.STATEATOMS)
super().__init__(used=True, name=__class__.__name__)
else:
super().__init__(used=True, name=__class__.__name__, content=content)
# You can check yourself :)
if(not NPD is None and not len(STATEATOMS)==NPD):
raise ValueError("NPD must be equal to the length of STATEATOMS! NPD="+str(NPD)+"\t stateatoms"+str(len(STATEATOMS))+"\n\n"+str(self))
@property
def nStates(self)->int:
return self.NPTB
@property
def nTotalStateAtoms(self)->int:
return self.NPD
@property
def states(self)->dict:
return {self.STATEIDENTIFIERS[state-1]: {atom.NR: atom.STATES[state] for atom in sorted(self.STATEATOMS, key=lambda x: x.NR)} for state in range(1, self.NPTB+1)}
def read_content_from_str(self, content:List[str]):
field = 0
NPD = None
STATEIDENTIFIERS = None
STATEATOMHEADER = None
STATEATOMS = []
first = True
stdid = False
for line in content:
if ("#" in line):
comment = line
if("state_identifiers" in line):
stdid=True
elif(stdid):
STATEIDENTIFIERS = line.replace("#", "").split()
stdid=False
continue
else:
if (field > 0):
if(first):
STATEATOMHEADER = ["atomI", "atomJ", "atomK", "atomL", "type1", "type2"]
first = False
state_line = {key: value for key, value in zip(STATEATOMHEADER, line.split())}
state_line.update({"NR":len(STATEATOMS)+1})
final_state_line = {key: state_line[key] for key in state_line if (not "type" in key)}
states = {1: state_line["type1"],
2: state_line["type2"]}
final_state_line.update({"STATES":states})
STATEATOMS.append(atom_lam_pertubation_state_dihedral(**final_state_line))
elif (field == 0):
NPD = int(line.strip())
field += 1
self.NPD = NPD
self.STATEIDENTIFIERS = STATEIDENTIFIERS
self.STATEATOMHEADER = STATEATOMHEADER
self.STATEATOMS = STATEATOMS
"""
STR FUNCTIONS
"""
def _state_STATEATOMHEADER_str(self):
state_format_pattern = "{:>5} {:>5} {:>5} {:>5}"+"".join([" {:>5} "for x in range(self.NPTB)])+""
return state_format_pattern.format(*self.STATEATOMHEADER)
def block_to_string(self) -> str:
result = self.name + self.line_seperator
result += "# NPD " + self.field_seperator + "NPTB = " + self.field_seperator + str(self.NPTB) + self.field_seperator+ self.line_seperator
result += self.field_seperator + str(self.NPD)+self.line_seperator
result += "# " + self._state_STATEATOMHEADER_str() + self.line_seperator
result += "".join(map(str, sorted(self.STATEATOMS, key=lambda x: x.NR)))
result += "END"+self.line_seperator
return result
# --- tests/test_compute_intrinsic.py (pasqoc/heterocl, Apache-2.0) ---
import heterocl as hcl
import numpy as np
def test_int():
hcl.init(hcl.Int())
a = hcl.placeholder((10,))
b = hcl.compute(a.shape, lambda x: hcl.power(2, a[x]))
s = hcl.create_schedule([a, b])
f = hcl.build(s)
np_a = np.random.randint(1, 10, (10,))
np_b = np.zeros(10, dtype="int")
hcl_a = hcl.asarray(np_a)
hcl_b = hcl.asarray(np_b)
f(hcl_a, hcl_b)
np_golden = np.power(2, np_a)
assert np.allclose(np_golden, hcl_b.asnumpy())
def test_large_int():
hcl.init(hcl.Int())
a = hcl.placeholder((10,))
b = hcl.placeholder((10,))
c = hcl.compute(a.shape, lambda x: hcl.power(a[x], b[x]))
s = hcl.create_schedule([a, b, c])
f = hcl.build(s)
np_a = np.random.randint(1, 10, (10,))
np_b = np.random.randint(1, 10, (10,))
np_c = np.zeros(10, dtype="int")
hcl_a = hcl.asarray(np_a)
hcl_b = hcl.asarray(np_b)
hcl_c = hcl.asarray(np_c)
f(hcl_a, hcl_b, hcl_c)
np_golden = np.power(np_a, np_b)
assert np.allclose(np_golden, hcl_c.asnumpy())
def test_float():
hcl.init(hcl.Float())
a = hcl.placeholder((10,))
b = hcl.compute(a.shape, lambda x: hcl.power(2.0, a[x]))
s = hcl.create_schedule([a, b])
f = hcl.build(s)
np_a = np.random.rand(10)
np_b = np.zeros(10)
hcl_a = hcl.asarray(np_a)
hcl_b = hcl.asarray(np_b)
f(hcl_a, hcl_b)
np_golden = np.power(2, np_a)
assert np.allclose(np_golden, hcl_b.asnumpy())
def test_var():
hcl.init(hcl.Float())
a = hcl.placeholder((10,))
b = hcl.placeholder((10,))
c = hcl.compute(a.shape, lambda x: hcl.power(a[x], b[x]))
s = hcl.create_schedule([a, b, c])
f = hcl.build(s)
np_a = np.random.rand(10)
np_b = np.random.rand(10)
np_c = np.zeros(10)
hcl_a = hcl.asarray(np_a)
hcl_b = hcl.asarray(np_b)
hcl_c = hcl.asarray(np_c)
f(hcl_a, hcl_b, hcl_c)
np_golden = np.power(np_a, np_b)
assert np.allclose(np_golden, hcl_c.asnumpy())
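Every test above shares the same shape: build NumPy inputs, run the compiled HeteroCL function, and compare against a golden reference computed with `np.power`. The golden side can be checked without HeteroCL at all; a pure-Python sketch of the same element-wise power reference (toy input values, chosen here for illustration):

```python
# Pure-Python rendering of the golden-reference step used in the tests
# above: compute expected element-wise powers, then compare.
a = [1, 2, 3, 4]
b = [3, 2, 1, 0]
golden = [x ** y for x, y in zip(a, b)]

assert golden == [1, 4, 3, 1]
```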
# --- terrascript/vra/d.py (mjuenema/python-terrascript, BSD-2-Clause) ---
# terrascript/vra/d.py
# Automatically generated by tools/makecode.py ()
import warnings
warnings.warn(
"using the 'legacy layout' is deprecated", DeprecationWarning, stacklevel=2
)
import terrascript
class vra_block_device(terrascript.Data):
pass
class vra_block_device_snapshots(terrascript.Data):
pass
class vra_blueprint(terrascript.Data):
pass
class vra_blueprint_version(terrascript.Data):
pass
class vra_catalog_item(terrascript.Data):
pass
class vra_catalog_source_blueprint(terrascript.Data):
pass
class vra_catalog_source_entitlement(terrascript.Data):
pass
class vra_cloud_account_aws(terrascript.Data):
pass
class vra_cloud_account_azure(terrascript.Data):
pass
class vra_cloud_account_gcp(terrascript.Data):
pass
class vra_cloud_account_nsxt(terrascript.Data):
pass
class vra_cloud_account_nsxv(terrascript.Data):
pass
class vra_cloud_account_vmc(terrascript.Data):
pass
class vra_cloud_account_vsphere(terrascript.Data):
pass
class vra_data_collector(terrascript.Data):
pass
class vra_deployment(terrascript.Data):
pass
class vra_fabric_datastore_vsphere(terrascript.Data):
pass
class vra_fabric_network(terrascript.Data):
pass
class vra_fabric_storage_account_azure(terrascript.Data):
pass
class vra_fabric_storage_policy_vsphere(terrascript.Data):
pass
class vra_image(terrascript.Data):
pass
class vra_image_profile(terrascript.Data):
pass
class vra_machine(terrascript.Data):
pass
class vra_network(terrascript.Data):
pass
class vra_network_domain(terrascript.Data):
pass
class vra_network_profile(terrascript.Data):
pass
class vra_project(terrascript.Data):
pass
class vra_region(terrascript.Data):
pass
class vra_region_enumeration(terrascript.Data):
pass
class vra_region_enumeration_aws(terrascript.Data):
pass
class vra_region_enumeration_azure(terrascript.Data):
pass
class vra_region_enumeration_gcp(terrascript.Data):
pass
class vra_region_enumeration_vmc(terrascript.Data):
pass
class vra_region_enumeration_vsphere(terrascript.Data):
pass
class vra_security_group(terrascript.Data):
pass
class vra_storage_profile(terrascript.Data):
pass
class vra_storage_profile_aws(terrascript.Data):
pass
class vra_storage_profile_azure(terrascript.Data):
pass
class vra_storage_profile_vsphere(terrascript.Data):
pass
class vra_zone(terrascript.Data):
pass
# --- tests/test_kravatte_wbc.py (inmcm/kravatte, MIT) ---
import hashlib
import numpy as np
from kravatte import KravatteWBC
# Official Test Vectors
class TestOfficialTestVectors_WBC:
"""
Test Vectors Generated From KeccakTools https://github.com/gvanas/KeccakTools
"""
def test_kravatte_WBC_k_8_msg_32_tweak_8(self, test_workers):
my_tweak = bytes([0xAB, 0xBA, 0xDA, 0xBA, 0xDE, 0xAD, 0xBE, 0xEF])
my_key = bytes([0xC5, 0x3A, 0xFF, 0xFF, 0xFF, 0xFF, 0xE1, 0x78])
my_message = bytes([0x12, 0x34, 0x56, 0x78, 0x9A, 0xBC, 0xDE, 0xF0,
0x0F, 0xED, 0xCB, 0xA9, 0x87, 0x65, 0x43, 0x21,
0x0F, 0xED, 0xCB, 0xA9, 0x87, 0x65, 0x43, 0x21,
0x12, 0x34, 0x56, 0x78, 0x9A, 0xBC, 0xDE, 0xF0])
real_ciphertext = b'\x1e\xe0$^\x90\xbe\xa3$\x081w\xe5\xe9\x85\x8b\xf0\x8c\xcf%\xed\xaf\x1a\xee\xed$\xdb\xa3[\xe9\\[\xd5'
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
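The long byte-list literals in these vectors can be cross-checked (or written more compactly) with `bytes.fromhex`; for example, the tweak from the test above:

```python
# The tweak bytes([0xAB, 0xBA, ...]) above, rebuilt via bytes.fromhex --
# a compact equivalent for long test-vector literals.
tweak = bytes([0xAB, 0xBA, 0xDA, 0xBA, 0xDE, 0xAD, 0xBE, 0xEF])
assert tweak == bytes.fromhex("abbadabadeadbeef")
assert tweak.hex() == "abbadabadeadbeef"
```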
def test_kravatte_WBC_k_16_msg_64_tweak_16(self, test_workers):
# keyLen 16, WLen 16, dataLen 64 (in bytes)
my_key = bytes([0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62])
my_tweak = bytes([0xd7, 0xc8, 0xb9, 0xaa, 0x9b, 0x8c, 0x7d, 0x6e, 0x5f, 0x50, 0x41, 0x32, 0x23, 0x14, 0x05, 0xf6])
my_message = bytes([0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06])
real_ciphertext = bytes([0x48, 0xb3, 0x8e, 0x8e, 0xcb, 0xa4, 0x79, 0x17, 0x1c, 0xf8, 0xe1, 0x54, 0x44, 0x14, 0xc0, 0xea, 0xa7, 0x70, 0x9c, 0xa5, 0x22, 0x82, 0xe3, 0xab, 0x6c, 0xe8, 0x59, 0x78, 0xc4, 0x64, 0x8d, 0xe6, 0xd8, 0xe5, 0xdb, 0xea, 0xfb, 0xe3, 0xa6, 0x24, 0x61, 0xa8, 0xb6, 0xe0, 0xac, 0x64, 0x5e, 0x27, 0xd0, 0xd1, 0x77, 0x24, 0xe8, 0xf0, 0x82, 0x82, 0xdc, 0x21, 0xdb, 0x8b, 0x08, 0x69, 0x88, 0xff])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_20_msg_64_tweak_16(self, test_workers):
my_key = bytes([0x11, 0xf2, 0xd3, 0xb4, 0x95, 0x76, 0x57, 0x38, 0x18, 0xf9, 0xda, 0xbb, 0x9c, 0x7d, 0x5e, 0x3f, 0x1f, 0x00, 0xe1, 0xc2])
my_tweak = bytes([0xf3, 0xf3, 0xf3, 0xf3, 0xf3, 0xf3, 0xf3, 0xf3, 0xf3, 0xf3, 0xf3, 0xf3, 0xf3, 0xf3, 0xf3, 0xf3])
my_message = bytes([0x61, 0x42, 0x23, 0x04, 0xe5, 0xc6, 0xa7, 0x88, 0x68, 0x49, 0x2a, 0x0b, 0xec, 0xcd, 0xae, 0x8f, 0x6f, 0x50, 0x31, 0x12, 0xf3, 0xd4, 0xb5, 0x96, 0x76, 0x57, 0x38, 0x19, 0xfa, 0xdb, 0xbc, 0x9d, 0x7d, 0x5e, 0x3f, 0x20, 0x01, 0xe2, 0xc3, 0xa4, 0x84, 0x65, 0x46, 0x27, 0x08, 0xe9, 0xca, 0xab, 0x8b, 0x6c, 0x4d, 0x2e, 0x0f, 0xf0, 0xd1, 0xb2, 0x92, 0x73, 0x54, 0x35, 0x16, 0xf7, 0xd8, 0xb9])
real_ciphertext = bytes([0xa4, 0x72, 0xa5, 0x58, 0x7b, 0xf3, 0x14, 0x79, 0xb5, 0x98, 0x2d, 0x63, 0x10, 0x17, 0x91, 0x1a, 0x31, 0x52, 0x92, 0x7a, 0x34, 0x3f, 0xed, 0xcd, 0x09, 0x99, 0xf8, 0xf5, 0x57, 0xc5, 0x2c, 0x02, 0x64, 0x31, 0x44, 0x3e, 0xf3, 0x2d, 0x10, 0xce, 0x1e, 0x86, 0x24, 0x72, 0x2a, 0x50, 0x95, 0x4d, 0xde, 0xaf, 0x1a, 0x52, 0xe0, 0xcf, 0xf1, 0x4e, 0x8f, 0x3b, 0x34, 0x67, 0x53, 0x76, 0xa7, 0xe6])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_24_msg_64_tweak_16(self, test_workers):
my_key = bytes([0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a])
my_tweak = bytes([0x0f, 0x00, 0xf1, 0xe2, 0xd3, 0xc4, 0xb5, 0xa6, 0x97, 0x88, 0x79, 0x6a, 0x5b, 0x4c, 0x3d, 0x2e])
my_message = bytes([0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e])
real_ciphertext = bytes([0xb5, 0xbf, 0x75, 0x20, 0xba, 0xe4, 0xb8, 0xae, 0x7d, 0x49, 0xa3, 0xdc, 0xbd, 0x26, 0x1d, 0x63, 0x57, 0x2d, 0x5f, 0x0c, 0x33, 0x2a, 0x9d, 0x12, 0x3b, 0x5b, 0x26, 0x73, 0x2d, 0x0a, 0xda, 0x4b, 0x96, 0xda, 0x96, 0x6c, 0x1f, 0x84, 0x5d, 0x14, 0xc5, 0x60, 0xf1, 0x9f, 0x64, 0x14, 0x19, 0xd9, 0xa6, 0x5c, 0xa6, 0xc3, 0x35, 0xd9, 0x7b, 0xe2, 0x94, 0x8b, 0x1c, 0x41, 0xd1, 0x09, 0x60, 0xac])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_32_msg_64_tweak_16(self, test_workers):
my_key = bytes([0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2])
my_tweak = bytes([0x47, 0x38, 0x29, 0x1a, 0x0b, 0xfc, 0xed, 0xde, 0xcf, 0xc0, 0xb1, 0xa2, 0x93, 0x84, 0x75, 0x66])
my_message = bytes([0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76])
real_ciphertext = bytes([0xfa, 0x85, 0x67, 0x08, 0x4f, 0xf0, 0x66, 0x14, 0x07, 0x2c, 0x97, 0x35, 0xc1, 0xfd, 0xbf, 0x53, 0x2b, 0x8f, 0x93, 0x43, 0xc0, 0x02, 0x7e, 0xd1, 0x37, 0x7d, 0xe6, 0xa7, 0x3c, 0xec, 0x41, 0x91, 0xd8, 0x7e, 0x39, 0x08, 0x29, 0xc0, 0x79, 0xd1, 0xc3, 0x74, 0x1a, 0x7d, 0xae, 0x01, 0xb8, 0xd2, 0xd1, 0xb8, 0x0e, 0xbc, 0x04, 0xec, 0xa6, 0xb9, 0xac, 0x2d, 0xc8, 0x56, 0x28, 0xb6, 0x02, 0x17])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_48_msg_64_tweak_16(self, test_workers):
my_key = bytes([0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42])
my_tweak = bytes([0xb7, 0xa8, 0x99, 0x8a, 0x7b, 0x6c, 0x5d, 0x4e, 0x3f, 0x30, 0x21, 0x12, 0x03, 0xf4, 0xe5, 0xd6])
my_message = bytes([0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6])
real_ciphertext = bytes([0xe4, 0x25, 0x12, 0xab, 0x4a, 0xcb, 0xe2, 0xa7, 0x1f, 0xc9, 0x2c, 0xdd, 0x03, 0x61, 0xaa, 0x5b, 0x34, 0xe2, 0x77, 0x51, 0xa0, 0xa4, 0xce, 0xac, 0x91, 0x44, 0x31, 0xa7, 0xc2, 0x2c, 0x15, 0x14, 0xd3, 0x0c, 0xfa, 0xd9, 0x4c, 0xa7, 0xcb, 0xff, 0xb4, 0x24, 0xda, 0xdd, 0x88, 0x40, 0xab, 0x61, 0xe6, 0xce, 0xba, 0x76, 0x24, 0x1e, 0x90, 0x0c, 0xf8, 0xbe, 0x19, 0xe6, 0xad, 0xab, 0xee, 0xed])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_64_msg_64_tweak_16(self, test_workers):
my_key = bytes([0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2])
my_tweak = bytes([0x27, 0x18, 0x09, 0xfa, 0xeb, 0xdc, 0xcd, 0xbe, 0xaf, 0xa0, 0x91, 0x82, 0x73, 0x64, 0x55, 0x46])
my_message = bytes([0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56])
real_ciphertext = bytes([0x52, 0xba, 0xd3, 0x81, 0xfa, 0xad, 0x95, 0xf8, 0x43, 0x0d, 0x8f, 0x80, 0x2d, 0x03, 0xa2, 0x1c, 0x21, 0x15, 0x51, 0xce, 0xe7, 0x96, 0x01, 0xca, 0x2c, 0xa1, 0x2f, 0xbb, 0x71, 0xd6, 0xab, 0xfe, 0xb1, 0x27, 0xaf, 0x36, 0xa4, 0xa8, 0xd4, 0x40, 0xcf, 0xfb, 0x44, 0x89, 0x87, 0x3e, 0xcf, 0x04, 0x01, 0xe1, 0x64, 0xfd, 0xc3, 0xbb, 0x57, 0xd8, 0xc8, 0x7a, 0xe0, 0x35, 0x56, 0x40, 0x0b, 0x55])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_96_msg_64_tweak_16(self, test_workers):
my_key = bytes([0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92])
my_tweak = bytes([0x07, 0xf8, 0xe9, 0xda, 0xcb, 0xbc, 0xad, 0x9e, 0x8f, 0x80, 0x71, 0x62, 0x53, 0x44, 0x35, 0x26])
my_message = bytes([0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36])
real_ciphertext = bytes([0x2f, 0x45, 0xfd, 0x97, 0xd5, 0xe5, 0xd4, 0xf7, 0x87, 0x7b, 0x99, 0x28, 0x00, 0x36, 0x9f, 0x6a, 0xd1, 0x82, 0x7d, 0x95, 0x13, 0x84, 0x1b, 0x93, 0x23, 0xad, 0xea, 0xec, 0x22, 0xf5, 0xb6, 0x0f, 0xc9, 0x1f, 0x06, 0x2a, 0xd7, 0x5f, 0x9e, 0x42, 0xa3, 0xd7, 0x72, 0x40, 0xee, 0x55, 0xee, 0x68, 0xb3, 0xac, 0x0f, 0xa6, 0x1b, 0xb5, 0xee, 0x5a, 0x3c, 0x68, 0x82, 0x14, 0x77, 0x24, 0x1c, 0x51])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_128_msg_64_tweak_16(self, test_workers):
my_key = bytes([0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72])
my_tweak = bytes([0xe7, 0xd8, 0xc9, 0xba, 0xab, 0x9c, 0x8d, 0x7e, 0x6f, 0x60, 0x51, 0x42, 0x33, 0x24, 0x15, 0x06])
my_message = bytes([0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16])
real_ciphertext = bytes([0xca, 0x44, 0xbf, 0x23, 0x8b, 0x8d, 0x25, 0x24, 0x52, 0xa4, 0x42, 0x68, 0xac, 0x0c, 0xf5, 0x21, 0x66, 0x41, 0x32, 0x6f, 0x0b, 0x32, 0x28, 0x17, 0x35, 0xf4, 0x1b, 0x12, 0xe3, 0xdf, 0xc1, 0x49, 0xee, 0x47, 0x4f, 0xd8, 0xf0, 0xe5, 0x69, 0xb8, 0xee, 0x45, 0xd6, 0x0c, 0xca, 0xbb, 0x5c, 0x39, 0xf4, 0x28, 0xb0, 0x0c, 0xf8, 0x01, 0xa3, 0x86, 0x60, 0xae, 0xac, 0x1c, 0x4c, 0xd3, 0x61, 0xb5])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_160_msg_64_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 160-byte key, 64-byte message, 16-byte tweak."""
my_key = bytes([0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51])
my_tweak = bytes([0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf5, 0xe6])
my_message = bytes([0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6])
real_ciphertext = bytes([0xb3, 0x7d, 0x55, 0x95, 0xb9, 0x75, 0xe4, 0x7e, 0x86, 0x57, 0x82, 0xa4, 0x16, 0x81, 0xf5, 0x8b, 0x1d, 0x81, 0x16, 0x33, 0xb0, 0x62, 0xe0, 0xd4, 0xc9, 0xe4, 0xc8, 0x22, 0x52, 0xae, 0x3c, 0x87, 0x6b, 0x22, 0x80, 0x8a, 0x79, 0x82, 0xbf, 0x44, 0x4d, 0x47, 0x9a, 0xf2, 0x1f, 0xbd, 0x85, 0x0b, 0x6e, 0x9b, 0x80, 0x37, 0x51, 0x7a, 0xb1, 0x54, 0xa5, 0x67, 0xac, 0xd1, 0xaa, 0xa6, 0xf2, 0x7c])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_192_msg_64_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 192-byte key, 64-byte message, 16-byte tweak."""
my_key = bytes([0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31])
my_tweak = bytes([0xa7, 0x98, 0x89, 0x7a, 0x6b, 0x5c, 0x4d, 0x3e, 0x2f, 0x20, 0x11, 0x02, 0xf3, 0xe4, 0xd5, 0xc6])
my_message = bytes([0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6])
real_ciphertext = bytes([0x1a, 0xe0, 0x02, 0x72, 0x49, 0x69, 0xca, 0x0c, 0x11, 0x70, 0xd5, 0x48, 0x4f, 0x39, 0xa9, 0xbf, 0x2f, 0xac, 0xe4, 0x04, 0x30, 0x4d, 0xad, 0x08, 0x79, 0xe3, 0xe2, 0x5e, 0x10, 0xe3, 0xf2, 0x70, 0x95, 0xf9, 0x04, 0xdf, 0x55, 0x13, 0x0d, 0x47, 0xd0, 0xed, 0x02, 0x12, 0x33, 0x77, 0x26, 0x8e, 0x50, 0xf3, 0xe3, 0xa7, 0xe4, 0xf5, 0x47, 0xc2, 0xc7, 0x32, 0xfe, 0x61, 0x1d, 0x2a, 0x30, 0xc7])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_4_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 16-byte key, 4-byte message, 16-byte tweak."""
my_key = bytes([0xcd, 0xae, 0x8f, 0x70, 0x51, 0x32, 0x13, 0xf4, 0xd4, 0xb5, 0x96, 0x77, 0x58, 0x39, 0x1a, 0xfb])
my_tweak = bytes([0x33, 0x33, 0x33, 0x33, 0x33, 0x33, 0x33, 0x33, 0x33, 0x33, 0x33, 0x33, 0x33, 0x33, 0x33, 0x33])
my_message = bytes([0xe5, 0xc6, 0xa7, 0x88])
real_ciphertext = bytes([0x9c, 0x0f, 0xe9, 0x19])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_5_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 16-byte key, 5-byte message, 16-byte tweak."""
my_key = bytes([0xc4, 0x85, 0x46, 0x07, 0xc7, 0x88, 0x49, 0x0a, 0xca, 0x8b, 0x4c, 0x0d, 0xcd, 0x8e, 0x4f, 0x10])
my_tweak = bytes([0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b])
my_message = bytes([0x7d, 0x3e, 0xff, 0xc0, 0x80])
real_ciphertext = bytes([0x38, 0x7a, 0x42, 0x98, 0x9f])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_8_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 16-byte key, 8-byte message, 16-byte tweak."""
my_key = bytes([0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a])
my_tweak = bytes([0xcf, 0xc0, 0xb1, 0xa2, 0x93, 0x84, 0x75, 0x66, 0x57, 0x48, 0x39, 0x2a, 0x1b, 0x0c, 0xfd, 0xee])
my_message = bytes([0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe])
real_ciphertext = bytes([0x30, 0xcc, 0xf7, 0x16, 0x08, 0x97, 0x34, 0xc0])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_12_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 16-byte key, 12-byte message, 16-byte tweak."""
my_key = bytes([0x85, 0x66, 0x47, 0x28, 0x09, 0xea, 0xcb, 0xac, 0x8c, 0x6d, 0x4e, 0x2f, 0x10, 0xf1, 0xd2, 0xb3])
my_tweak = bytes([0xeb, 0xeb, 0xeb, 0xeb, 0xeb, 0xeb, 0xeb, 0xeb, 0xeb, 0xeb, 0xeb, 0xeb, 0xeb, 0xeb, 0xeb, 0xeb])
my_message = bytes([0xa5, 0x86, 0x67, 0x48, 0x29, 0x0a, 0xeb, 0xcc, 0xac, 0x8d, 0x6e, 0x4f])
real_ciphertext = bytes([0x06, 0xbf, 0x6d, 0x3f, 0x16, 0xaf, 0x53, 0x1a, 0x36, 0xf6, 0x60, 0xfd])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_16_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 16-byte key, 16-byte message, 16-byte tweak."""
my_key = bytes([0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12])
my_tweak = bytes([0x87, 0x78, 0x69, 0x5a, 0x4b, 0x3c, 0x2d, 0x1e, 0x0f, 0x00, 0xf1, 0xe2, 0xd3, 0xc4, 0xb5, 0xa6])
my_message = bytes([0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6])
real_ciphertext = bytes([0x56, 0x9d, 0x7b, 0xfd, 0x86, 0x94, 0x39, 0x54, 0xcb, 0x8f, 0x8d, 0x1d, 0x01, 0x9c, 0x48, 0xc2])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_20_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 16-byte key, 20-byte message, 16-byte tweak."""
my_key = bytes([0x3d, 0x1e, 0xff, 0xe0, 0xc1, 0xa2, 0x83, 0x64, 0x44, 0x25, 0x06, 0xe7, 0xc8, 0xa9, 0x8a, 0x6b])
my_tweak = bytes([0xa3, 0xa3, 0xa3, 0xa3, 0xa3, 0xa3, 0xa3, 0xa3, 0xa3, 0xa3, 0xa3, 0xa3, 0xa3, 0xa3, 0xa3, 0xa3])
my_message = bytes([0x65, 0x46, 0x27, 0x08, 0xe9, 0xca, 0xab, 0x8c, 0x6c, 0x4d, 0x2e, 0x0f, 0xf0, 0xd1, 0xb2, 0x93, 0x73, 0x54, 0x35, 0x16])
real_ciphertext = bytes([0xaa, 0x64, 0x84, 0x8a, 0x37, 0x8a, 0xc5, 0xe4, 0x1a, 0xfe, 0x7d, 0x77, 0x0e, 0x31, 0x70, 0xb4, 0x96, 0xc2, 0x01, 0x81])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_24_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 16-byte key, 24-byte message, 16-byte tweak."""
my_key = bytes([0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca])
my_tweak = bytes([0x3f, 0x30, 0x21, 0x12, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e])
my_message = bytes([0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e])
real_ciphertext = bytes([0xad, 0x38, 0x82, 0xeb, 0x73, 0x41, 0xa2, 0x85, 0x12, 0x24, 0x25, 0x98, 0x8a, 0x56, 0x7e, 0xbe, 0x66, 0x8c, 0xf9, 0xfc, 0x07, 0x07, 0xb6, 0x3f])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_34_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 16-byte key, 34-byte message, 16-byte tweak."""
my_key = bytes([0x7f, 0x78, 0x71, 0x6a, 0x63, 0x5c, 0x55, 0x4e, 0x47, 0x40, 0x39, 0x32, 0x2b, 0x24, 0x1d, 0x16])
my_tweak = bytes([0xe5, 0xa6, 0x67, 0x28, 0xe8, 0xa9, 0x6a, 0x2b, 0xeb, 0xac, 0x6d, 0x2e, 0xee, 0xaf, 0x70, 0x31])
my_message = bytes([0x75, 0x6e, 0x67, 0x60, 0x59, 0x52, 0x4b, 0x44, 0x3d, 0x36, 0x2f, 0x28, 0x21, 0x1a, 0x13, 0x0c, 0x05, 0xfe, 0xf7, 0xf0, 0xe9, 0xe2, 0xdb, 0xd4, 0xcd, 0xc6, 0xbf, 0xb8, 0xb1, 0xaa, 0xa3, 0x9c, 0x94, 0x8d])
real_ciphertext = bytes([0xa8, 0x87, 0x6e, 0xca, 0xeb, 0x7b, 0x0b, 0x79, 0x01, 0x04, 0x78, 0x72, 0x42, 0xf4, 0xbf, 0xd9, 0xd6, 0xd1, 0x92, 0xd1, 0xc9, 0xc5, 0x1b, 0x08, 0xad, 0xb5, 0x09, 0xf6, 0x53, 0x47, 0x3e, 0xff, 0x49, 0x2f])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_35_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 16-byte key, 35-byte message, 16-byte tweak."""
my_key = bytes([0x76, 0x67, 0x58, 0x49, 0x3a, 0x2b, 0x1c, 0x0d, 0xfe, 0xef, 0xe0, 0xd1, 0xc2, 0xb3, 0xa4, 0x95])
my_tweak = bytes([0xdc, 0x5d, 0xdd, 0x5e, 0xde, 0x5f, 0xdf, 0x60, 0xe0, 0x61, 0xe1, 0x62, 0xe2, 0x63, 0xe3, 0x64])
my_message = bytes([0x0d, 0xfe, 0xef, 0xe0, 0xd1, 0xc2, 0xb3, 0xa4, 0x95, 0x86, 0x77, 0x68, 0x59, 0x4a, 0x3b, 0x2c, 0x1c, 0x0d, 0xfe, 0xef, 0xe0, 0xd1, 0xc2, 0xb3, 0xa4, 0x95, 0x86, 0x77, 0x68, 0x59, 0x4a, 0x3b, 0x2b, 0x1c, 0x0d])
real_ciphertext = bytes([0x49, 0xe2, 0xc9, 0x88, 0x6b, 0x05, 0x88, 0xed, 0x6c, 0x01, 0x0a, 0x8d, 0x62, 0xbd, 0xad, 0x88, 0x3f, 0xe2, 0xda, 0x78, 0xa0, 0x1f, 0x45, 0x8d, 0xf0, 0x86, 0x21, 0x1a, 0x16, 0xb2, 0xcd, 0xec, 0x0b, 0x6d, 0xe5])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_47_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 16-byte key, 47-byte message, 16-byte tweak."""
my_key = bytes([0x9a, 0x9a, 0x9a, 0x9a, 0x9a, 0x9a, 0x9a, 0x9a, 0x9a, 0x9a, 0x9a, 0x9a, 0x9a, 0x9a, 0x9a, 0x9a])
my_tweak = bytes([0x00, 0xf9, 0xf2, 0xeb, 0xe4, 0xdd, 0xd6, 0xcf, 0xc8, 0xc1, 0xba, 0xb3, 0xac, 0xa5, 0x9e, 0x97])
my_message = bytes([0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd])
real_ciphertext = bytes([0x23, 0x73, 0x99, 0x3e, 0x57, 0x86, 0xb3, 0x14, 0x44, 0xa7, 0x16, 0x21, 0xdf, 0xb8, 0x62, 0x79, 0x46, 0xe8, 0x27, 0x7f, 0x78, 0x3e, 0x2f, 0x8f, 0x01, 0x1b, 0x6b, 0xa4, 0x56, 0xf2, 0xbb, 0xef, 0x29, 0x7e, 0x6b, 0x02, 0xb4, 0xf2, 0x09, 0xf7, 0xdf, 0xf6, 0x30, 0x52, 0xd9, 0xee, 0xba])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_48_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 16-byte key, 48-byte message, 16-byte tweak."""
my_key = bytes([0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2])
my_tweak = bytes([0x67, 0x58, 0x49, 0x3a, 0x2b, 0x1c, 0x0d, 0xfe, 0xef, 0xe0, 0xd1, 0xc2, 0xb3, 0xa4, 0x95, 0x86])
my_message = bytes([0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96])
real_ciphertext = bytes([0x30, 0xf3, 0x5f, 0xd8, 0x1d, 0xc0, 0x4d, 0xaf, 0x78, 0x3b, 0xd8, 0x05, 0x63, 0xe5, 0x56, 0xdb, 0xdc, 0x1a, 0x7a, 0x52, 0xd6, 0x8b, 0x4e, 0x32, 0x19, 0xaf, 0x02, 0x37, 0x6c, 0x47, 0x37, 0x26, 0x53, 0x05, 0x97, 0x81, 0x22, 0x04, 0xf4, 0x1c, 0x03, 0x7f, 0x8d, 0x7e, 0xe0, 0x04, 0xc3, 0x83])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_96_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 16-byte key, 96-byte message, 16-byte tweak."""
my_key = bytes([0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42])
my_tweak = bytes([0xb7, 0xa8, 0x99, 0x8a, 0x7b, 0x6c, 0x5d, 0x4e, 0x3f, 0x30, 0x21, 0x12, 0x03, 0xf4, 0xe5, 0xd6])
my_message = bytes([0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6])
real_ciphertext = bytes([0x1f, 0xe8, 0xe3, 0x3e, 0xf9, 0xfa, 0x6e, 0x72, 0x5a, 0xad, 0xf3, 0xdc, 0x3c, 0x27, 0xb2, 0x77, 0x29, 0x58, 0x47, 0x3b, 0x93, 0xeb, 0x71, 0xfa, 0xe9, 0xda, 0xe1, 0x54, 0x2d, 0x64, 0x43, 0xc8, 0x4e, 0xeb, 0x24, 0x51, 0xd6, 0xca, 0x22, 0x1c, 0x25, 0x64, 0xcb, 0x96, 0x65, 0x4d, 0x79, 0x81, 0xd5, 0x9f, 0x90, 0x55, 0xf1, 0x83, 0xf8, 0xf7, 0x96, 0xb8, 0x1d, 0x5a, 0xca, 0x91, 0x24, 0x26, 0x24, 0x5f, 0x0d, 0x40, 0x06, 0x28, 0x27, 0x75, 0x1a, 0xa8, 0xb8, 0x19, 0xe9, 0x11, 0x44, 0xc8, 0x8f, 0x80, 0x4f, 0xc9, 0xd0, 0x3e, 0xc8, 0xd5, 0x81, 0x5d, 0xe7, 0x4f, 0x95, 0xcc, 0x20, 0x34])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_128_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 16-byte key, 128-byte message, 16-byte tweak."""
my_key = bytes([0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22])
my_tweak = bytes([0x97, 0x88, 0x79, 0x6a, 0x5b, 0x4c, 0x3d, 0x2e, 0x1f, 0x10, 0x01, 0xf2, 0xe3, 0xd4, 0xc5, 0xb6])
my_message = bytes([0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6])
real_ciphertext = bytes([0x0b, 0xd1, 0xb2, 0x2e, 0xe8, 0x34, 0x46, 0x60, 0x56, 0x7d, 0x17, 0xc2, 0x47, 0x60, 0x92, 0xe2, 0x33, 0x64, 0x6d, 0xdf, 0x32, 0xdb, 0x1f, 0x43, 0x4e, 0x67, 0x19, 0x6d, 0x4a, 0xdd, 0x62, 0x0d, 0xd4, 0x6a, 0xd7, 0xe1, 0xbf, 0x56, 0x02, 0x72, 0x5c, 0x73, 0x8e, 0xec, 0xbc, 0xea, 0x81, 0xb6, 0x61, 0x47, 0xf2, 0xa9, 0x29, 0x39, 0xb0, 0x54, 0xce, 0x01, 0x51, 0xb6, 0xa1, 0x49, 0x02, 0x24, 0x53, 0xd8, 0x69, 0xab, 0x3b, 0x90, 0x18, 0x3c, 0xa1, 0x6a, 0x16, 0x94, 0x12, 0xc3, 0xfb, 0x6e, 0x03, 0x7b, 0xe9, 0xa0, 0xc6, 0xfe, 0x5e, 0x2f, 0xbe, 0xcd, 0x75, 0xed, 0x5a, 0x21, 0x92, 0x25, 0x75, 0x6a, 0xa7, 0x68, 0x81, 0xd4, 0xd3, 0x5e, 0x25, 0x09, 0xfe, 0x61, 0x02, 0x7f, 0xfa, 0x80, 0x40, 0x6d, 0xa2, 0x61, 0x3a, 0xf5, 0x42, 0xdc, 0x59, 0xfd, 0x8f, 0x2f, 0xf2, 0xfe, 0xc2, 0xe1])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_256_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 16-byte key, 256-byte message, 16-byte tweak."""
my_key = bytes([0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2])
my_tweak = bytes([0x17, 0x08, 0xf9, 0xea, 0xdb, 0xcc, 0xbd, 0xae, 0x9f, 0x90, 0x81, 0x72, 0x63, 0x54, 0x45, 0x36])
my_message = bytes([0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45])
real_ciphertext = bytes([0x0b, 0xa4, 0x5b, 0xda, 0x50, 0x2f, 0x9e, 0xf1, 0x96, 0x54, 0x2c, 0x8e, 0xf1, 0x2c, 0xbe, 0x9b, 0xf0, 0x73, 0x2e, 0x45, 0x07, 0x5a, 0xec, 0xe7, 0xa3, 0x2c, 0xd1, 0x35, 0x02, 0xbe, 0xda, 0x9b, 0xd3, 0xe2, 0x39, 0xd3, 0x63, 0x4e, 0xa7, 0xb2, 0x9c, 0x5f, 0x6b, 0x13, 0x60, 0xea, 0x87, 0x54, 0xd6, 0x1f, 0x65, 0x06, 0xb1, 0xf7, 0x3f, 0x13, 0x6b, 0xe6, 0x40, 0xd1, 0xa2, 0x64, 0xb0, 0xfe, 0xe8, 0x5b, 0xbc, 0x16, 0xf6, 0x00, 0xcd, 0xc8, 0x85, 0x60, 0x06, 0xa2, 0x92, 0x28, 0xd0, 0x8c, 0xd3, 0xbd, 0x0e, 0xc5, 0x36, 0x22, 0x91, 0x75, 0x65, 0xa9, 0xbb, 0x27, 0x94, 0x90, 0x68, 0xa1, 0xcf, 0xa0, 0x01, 0xc8, 0xfd, 0x3d, 0x0e, 0xd3, 0x8a, 0xb4, 0xd2, 0x7a, 0x7f, 0xae, 0xe1, 0x1a, 0x1e, 0xbb, 0xc0, 0x74, 0x7f, 0xe1, 0x0f, 0x60, 0x5d, 0x59, 0xc3, 0xe6, 0xb6, 0x4c, 0x57, 0xec, 0xac, 0x07, 0x6d, 0x32, 0x4f, 0xba, 0x07, 0x67, 0xc7, 0xff, 0x2f, 0xfe, 0xa4, 0xad, 0x76, 0x30, 0xa4, 0x24, 0xb9, 0x1c, 0x27, 0x9f, 0x4f, 0x03, 0x4f, 0x93, 0x94, 0x20, 0xf4, 0xa7, 0x05, 0xb4, 0x48, 0xb5, 0x4b, 0x07, 0xcf, 0x7e, 0x2e, 0x3e, 0x30, 0xc9, 0xec, 0x6f, 0x1f, 0xa7, 0xea, 0x7c, 0xf2, 0x72, 0x4e, 0x11, 0xc1, 0x86, 0x51, 0x9b, 0xe9, 0xee, 0xd5, 0xa8, 0xfc, 0xfb, 0x92, 0x7e, 0xa6, 0x89, 0x64, 0x95, 0x6f, 0xbf, 0x6d, 0xac, 0x64, 0x26, 0x70, 0x5b, 0xdb, 0xd8, 0x62, 0x2e, 0xe8, 0x63, 0x5a, 0x93, 0xd4, 0xaf, 0x08, 0xc9, 0x05, 0x6a, 0x22, 0x02, 0x18, 0x03, 0xf6, 0x6d, 0xcc, 0xa5, 0xbe, 0xd0, 0xd2, 0x17, 0x4d, 0x69, 0xb0, 0xde, 0x54, 0x16, 0x02, 0xf3, 0xd4, 0xa0, 0xe5, 0x12, 0xb2, 0x83, 0xf1, 0x01, 0x3f, 0x19, 0x09, 0xb2, 0xbd, 0x59, 0x28, 0x5a, 0x02, 0x25])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_398_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 16-byte key, 398-byte message, 16-byte tweak."""
my_key = bytes([0xf3, 0x74, 0xf4, 0x75, 0xf5, 0x76, 0xf6, 0x77, 0xf7, 0x78, 0xf8, 0x79, 0xf9, 0x7a, 0xfa, 0x7b])
my_tweak = bytes([0x59, 0x56, 0x53, 0x50, 0x4d, 0x4a, 0x47, 0x44, 0x41, 0x3e, 0x3b, 0x38, 0x35, 0x32, 0x2f, 0x2c])
my_message = bytes([0xd5, 0x56, 0xd6, 0x57, 0xd7, 0x58, 0xd8, 0x59, 0xd9, 0x5a, 0xda, 0x5b, 0xdb, 0x5c, 0xdc, 0x5d, 0xdd, 0x5e, 0xde, 0x5f, 0xdf, 0x60, 0xe0, 0x61, 0xe1, 0x62, 0xe2, 0x63, 0xe3, 0x64, 0xe4, 0x65, 0xe5, 0x66, 0xe6, 0x67, 0xe7, 0x68, 0xe8, 0x69, 0xe9, 0x6a, 0xea, 0x6b, 0xeb, 0x6c, 0xec, 0x6d, 0xed, 0x6e, 0xee, 0x6f, 0xef, 0x70, 0xf0, 0x71, 0xf1, 0x72, 0xf2, 0x73, 0xf3, 0x74, 0xf4, 0x75, 0xf5, 0x76, 0xf6, 0x77, 0xf7, 0x78, 0xf8, 0x79, 0xf9, 0x7a, 0xfa, 0x7b, 0xfb, 0x7c, 0xfc, 0x7d, 0xfd, 0x7e, 0xfe, 0x7f, 0xff, 0x80, 0x00, 0x81, 0x01, 0x82, 0x02, 0x83, 0x03, 0x84, 0x04, 0x85, 0x05, 0x86, 0x06, 0x87, 0x07, 0x88, 0x08, 0x89, 0x09, 0x8a, 0x0a, 0x8b, 0x0b, 0x8c, 0x0c, 0x8d, 0x0d, 0x8e, 0x0e, 0x8f, 0x0f, 0x90, 0x10, 0x91, 0x11, 0x92, 0x12, 0x93, 0x13, 0x94, 0x14, 0x95, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0xd5, 0x56, 0xd6, 0x57, 0xd7, 0x58, 0xd8, 0x59, 0xd9, 0x5a, 0xda, 0x5b, 0xdb, 0x5c, 0xdc, 0x5d, 0xdd, 0x5e, 0xde, 0x5f, 0xdf, 0x60, 0xe0, 0x61, 0xe1, 0x62, 0xe2, 0x63, 0xe3, 0x64, 0xe4, 0x65, 0xe5, 0x66, 0xe6, 0x67, 0xe7, 0x68, 0xe8, 0x69, 0xe9, 0x6a, 0xea, 0x6b, 0xeb, 0x6c, 0xec, 0x6d, 0xed, 0x6e, 0xee, 0x6f, 0xef, 0x70, 0xf0, 0x71, 0xf1, 0x72, 0xf2, 0x73, 0xf3, 0x74, 0xf4, 0x75, 0xf5, 0x76, 0xf6, 0x77, 0xf7, 0x78, 0xf8, 0x79, 0xf9, 0x7a, 
0xfa, 0x7b, 0xfb, 0x7c, 0xfc, 0x7d, 0xfd, 0x7e, 0xfe, 0x7f, 0xff, 0x80, 0x00, 0x81, 0x01, 0x82, 0x02, 0x83, 0x03, 0x84, 0x04, 0x85, 0x05, 0x86, 0x06, 0x87, 0x07, 0x88, 0x08, 0x89, 0x09, 0x8a, 0x0a, 0x8b, 0x0b, 0x8c, 0x0c, 0x8d, 0x0d, 0x8e, 0x0e, 0x8f, 0x0f, 0x90, 0x10, 0x91, 0x11, 0x92, 0x12, 0x93, 0x13, 0x94, 0x14, 0x95, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c])
real_ciphertext = bytes([0x8c, 0xa8, 0xa8, 0x59, 0x41, 0xb1, 0x9e, 0xc8, 0xe9, 0xff, 0x50, 0x0a, 0xc5, 0x6e, 0xd7, 0x9a, 0xf3, 0x25, 0x71, 0xbd, 0x7e, 0xac, 0xbf, 0xd6, 0xaf, 0x3e, 0x60, 0x20, 0x77, 0x44, 0x6f, 0xd8, 0x6f, 0x32, 0xe9, 0x55, 0x5a, 0xf6, 0xda, 0x9b, 0xfe, 0x3e, 0xc0, 0x2c, 0x8c, 0xde, 0x7d, 0x95, 0xad, 0x36, 0x03, 0xa3, 0xe3, 0xf0, 0x6d, 0xd9, 0x75, 0x64, 0xca, 0xa8, 0x23, 0x67, 0xaf, 0xcd, 0xd0, 0x61, 0xaa, 0xdd, 0x50, 0x22, 0xd3, 0x62, 0xd4, 0xa0, 0x16, 0xf7, 0x20, 0x06, 0xc1, 0x09, 0xef, 0xb4, 0x31, 0xaa, 0xcf, 0xe1, 0xc4, 0x57, 0xb7, 0xff, 0x81, 0xca, 0xff, 0x05, 0xc9, 0x9e, 0x4d, 0x26, 0x9b, 0x2e, 0x01, 0x70, 0xec, 0x11, 0x87, 0x44, 0x00, 0x54, 0x7a, 0x37, 0x0c, 0xf3, 0x1a, 0xa8, 0x8d, 0xc4, 0x49, 0xf7, 0xb6, 0x12, 0xa4, 0xb9, 0x0a, 0x45, 0x84, 0xec, 0x4c, 0x70, 0xa8, 0xfc, 0xf2, 0x7a, 0xaa, 0x38, 0x85, 0xde, 0x05, 0x36, 0x40, 0x22, 0xaf, 0xc7, 0xa8, 0x8e, 0xae, 0xee, 0x8c, 0xb7, 0x26, 0x30, 0xfe, 0xa4, 0x43, 0xf3, 0x4b, 0x7d, 0x19, 0xaf, 0x14, 0xa9, 0xbc, 0x9f, 0xd7, 0x15, 0x80, 0x17, 0x0c, 0x0a, 0x3d, 0x2b, 0xc0, 0xce, 0x69, 0x05, 0x38, 0x26, 0x5f, 0xae, 0x7e, 0x1a, 0x4c, 0x53, 0xc1, 0x1f, 0x0a, 0xdf, 0x16, 0xe3, 0x67, 0xa4, 0x28, 0x8b, 0xaa, 0x07, 0xe5, 0xa2, 0x3e, 0x3a, 0x94, 0x4f, 0x8a, 0xf3, 0xf4, 0xbe, 0x29, 0xac, 0xc9, 0x97, 0x7c, 0x46, 0xfe, 0xfe, 0x48, 0xde, 0x1a, 0x36, 0xa2, 0x5e, 0x78, 0x78, 0x4b, 0xc4, 0xdb, 0x91, 0x6d, 0x4b, 0x63, 0x95, 0xd2, 0x91, 0x12, 0x3d, 0xc5, 0x34, 0x2b, 0x98, 0x58, 0x8e, 0xdd, 0xbe, 0x29, 0x3f, 0x4e, 0x07, 0x9e, 0xae, 0x75, 0x9d, 0xb3, 0x23, 0x35, 0x42, 0x47, 0xca, 0xd3, 0x8f, 0x90, 0x39, 0xd2, 0xf5, 0x28, 0x24, 0xbb, 0xce, 0x71, 0xa3, 0x31, 0xa4, 0xfb, 0xa8, 0xd7, 0xe4, 0x0b, 0xd3, 0xbf, 0x07, 0xa8, 0x37, 0x0e, 0x32, 0x25, 0xa3, 0x41, 0x8c, 0x62, 0xe2, 0xbf, 0x75, 0xac, 0x04, 0x4d, 0x51, 0x97, 0x37, 0xb0, 0x37, 0x7d, 0x21, 0xd8, 0x3f, 0x7d, 0x0d, 0xca, 0xb8, 0xdf, 0x9e, 0x69, 0xe4, 0x27, 0x40, 0x99, 0xd8, 0x13, 0x34, 0xfc, 0x00, 0x0a, 0x7d, 0x63, 0x02, 0x53, 0xf3, 0xf1, 0x9a, 0xf3, 0x46, 0x63, 0x03, 0x04, 
0x7c, 0xfd, 0x59, 0xd9, 0x7c, 0x19, 0x5e, 0x0e, 0x83, 0x0a, 0xf4, 0xb9, 0xbc, 0xb8, 0x8c, 0xf0, 0x17, 0xf3, 0xf1, 0xb4, 0x05, 0x9b, 0x86, 0x25, 0xcc, 0xbe, 0x52, 0x03, 0xdc, 0x32, 0x9b, 0xab, 0xc5, 0x44, 0xfd, 0x91, 0x0f, 0xb2, 0xc6, 0xe8, 0xb5, 0x96, 0x02, 0x0d, 0xfe, 0x2f, 0x10, 0xcf, 0x8d, 0x3b, 0x1d, 0xb7, 0x65, 0x32, 0x85, 0x4f, 0xb5, 0xf3, 0x74, 0xef, 0xf2, 0x24, 0x59, 0x75, 0xae, 0x3a, 0x5d, 0xf1, 0xf7])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_399_tweak_16(self, test_workers):
    """KravatteWBC known-answer test: 16-byte key, 399-byte message, 16-byte tweak."""
my_key = bytes([0xfa, 0xfa, 0xfa, 0xfa, 0xfa, 0xfa, 0xfa, 0xfa, 0xfa, 0xfa, 0xfa, 0xfa, 0xfa, 0xfa, 0xfa, 0xfa])
my_tweak = bytes([0x60, 0x59, 0x52, 0x4b, 0x44, 0x3d, 0x36, 0x2f, 0x28, 0x21, 0x1a, 0x13, 0x0c, 0x05, 0xfe, 0xf7])
my_message = bytes([0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 
0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d, 0x7d])
real_ciphertext = bytes([0xb7, 0xf8, 0x0e, 0x5b, 0x4d, 0x8f, 0x9b, 0xbf, 0x2d, 0x1a, 0x81, 0x95, 0xda, 0xb4, 0xe1, 0x55, 0x89, 0xcd, 0x89, 0x3d, 0x85, 0x60, 0x41, 0x05, 0x70, 0xfb, 0x96, 0x23, 0x86, 0x86, 0xe1, 0xbd, 0x7a, 0x84, 0x33, 0x0a, 0x0d, 0x6b, 0x21, 0x30, 0xa3, 0xc4, 0x6f, 0x38, 0x6a, 0x2b, 0xf2, 0x47, 0xdf, 0xee, 0x1b, 0x8f, 0x9d, 0x50, 0x2e, 0x6b, 0xec, 0xee, 0xc4, 0xba, 0x55, 0x7c, 0x03, 0x73, 0x24, 0xbf, 0x09, 0x2e, 0x4e, 0xbb, 0xc4, 0x7c, 0xdf, 0x04, 0xd2, 0xbe, 0xc4, 0x48, 0x3b, 0xfe, 0x84, 0xfe, 0xef, 0x26, 0x7e, 0xea, 0xc8, 0x3b, 0x90, 0x3c, 0xc2, 0x34, 0x16, 0x47, 0x28, 0x01, 0xeb, 0x52, 0x4a, 0x23, 0x70, 0x97, 0x21, 0x61, 0xbd, 0x62, 0xfd, 0x92, 0xde, 0x17, 0x3a, 0x14, 0x95, 0xb8, 0xe2, 0xa5, 0x21, 0xb2, 0xcb, 0xbc, 0x2b, 0xa9, 0xd4, 0x52, 0x17, 0xe3, 0xbc, 0x8b, 0xd7, 0x3c, 0x57, 0xca, 0x2b, 0x2f, 0xe3, 0x74, 0xe2, 0x58, 0x3d, 0xdf, 0xea, 0xfe, 0x5c, 0xd5, 0x70, 0x06, 0x8d, 0x02, 0x7b, 0xa4, 0x6f, 0x41, 0x9d, 0xf1, 0x6a, 0xa7, 0xbf, 0xaa, 0x94, 0x5a, 0x8c, 0x73, 0xd2, 0x2b, 0xc5, 0x08, 0xaa, 0xb3, 0xa7, 0x93, 0xdb, 0x09, 0x31, 0x51, 0x6c, 0x69, 0x59, 0x3e, 0x79, 0x9e, 0xec, 0x26, 0xdd, 0x4f, 0xe2, 0xd7, 0xe6, 0xc3, 0x31, 0x14, 0x7a, 0x9e, 0x9f, 0xf3, 0xd5, 0x6f, 0x55, 0x98, 0xa4, 0x49, 0x7d, 0x46, 0x62, 0xd1, 0x74, 0x25, 0x4d, 0xaf, 0x7e, 0x95, 0x6d, 0xca, 0x8a, 0xaf, 0xd5, 0xcd, 0x66, 0xd1, 0xa4, 0xf9, 0xfb, 0x1d, 0x08, 0xad, 0xa9, 0xd9, 0xca, 0x0c, 0x94, 0x34, 0xb2, 0x67, 0xfa, 0x1d, 0xcd, 0x1b, 0x98, 0xa0, 0x19, 0x91, 0x4c, 0xa5, 0x9b, 0xce, 0x38, 0xa1, 0xdf, 0x5f, 0x59, 0x0d, 0x30, 0x10, 0x19, 0x2c, 0x22, 0x47, 0xf0, 0x3f, 0xca, 0x37, 0x3f, 0x6a, 0x47, 0xed, 0x9b, 0x38, 0xd7, 0xf5, 0x19, 0x10, 0xd0, 0x91, 0xc4, 0xfa, 0x46, 0x59, 0xda, 0xfc, 0xcf, 0xa5, 0xd7, 0x81, 0x36, 0xc9, 0xa8, 0x7a, 0x2e, 0xcb, 0xa4, 0x79, 0x9a, 0xee, 0xad, 0xb7, 0xb3, 0x1a, 0x32, 0x07, 0xcf, 0x93, 0xc5, 0x09, 0x35, 0x26, 0x23, 0xc8, 0xa1, 0x71, 0x4a, 0xbe, 0x41, 0x78, 0xf9, 0x21, 0xb8, 0xc7, 0x83, 0x6c, 0x18, 0xd2, 0x52, 0x4f, 0x86, 0x52, 0x2d, 0x99, 0x5e, 0xe8, 0x16, 
0x8f, 0xb8, 0x1a, 0xf3, 0xd4, 0xc6, 0xc8, 0x7b, 0x4e, 0x48, 0x5f, 0xaa, 0x94, 0xa9, 0x97, 0x6c, 0x25, 0xec, 0x95, 0xb0, 0x87, 0x77, 0x6d, 0x75, 0x54, 0xee, 0x84, 0x39, 0x73, 0x54, 0xe8, 0x79, 0x80, 0x01, 0xc2, 0x76, 0x27, 0xc6, 0x34, 0x28, 0x66, 0x98, 0x0c, 0xec, 0xbb, 0x74, 0x83, 0x43, 0xd8, 0x51, 0x96, 0xe3, 0x9a, 0xf3, 0xbf, 0x29, 0xa9, 0x2a, 0x54, 0x9e, 0x1b, 0x83, 0x61, 0x7a, 0xe2, 0xbc, 0x4c, 0x3a, 0xa0, 0x50])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext

def test_kravatte_WBC_k_16_msg_400_tweak_16(self, test_workers):
my_key = bytes([0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92])
my_tweak = bytes([0x07, 0xf8, 0xe9, 0xda, 0xcb, 0xbc, 0xad, 0x9e, 0x8f, 0x80, 0x71, 0x62, 0x53, 0x44, 0x35, 0x26])
my_message = bytes([0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 
0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35])
real_ciphertext = bytes([0xe8, 0x50, 0x67, 0x20, 0x6f, 0x83, 0x22, 0xcf, 0x6d, 0x58, 0xd2, 0x99, 0x5c, 0x77, 0xa0, 0x1b, 0x4f, 0x6b, 0x69, 0x98, 0xdc, 0x20, 0xd3, 0xf7, 0xad, 0x9c, 0xf6, 0x00, 0x3c, 0xf5, 0xc7, 0x0a, 0x79, 0x64, 0x09, 0x97, 0x8d, 0x3c, 0xb8, 0x56, 0x5d, 0xcd, 0x6c, 0xd1, 0x12, 0x4e, 0x2b, 0x28, 0x08, 0xcb, 0x99, 0x6a, 0x92, 0x93, 0x07, 0xda, 0x30, 0x73, 0x2f, 0x06, 0x83, 0x87, 0xcb, 0x51, 0x86, 0xf3, 0xaf, 0x3e, 0xc2, 0xf1, 0xcc, 0x7a, 0xaa, 0xb9, 0x94, 0x21, 0x1a, 0xd3, 0x4b, 0x75, 0x98, 0x72, 0x87, 0x83, 0xed, 0x4f, 0x6c, 0x9d, 0xe5, 0x00, 0xf3, 0xa7, 0xb9, 0xd5, 0x7f, 0xdb, 0x25, 0x59, 0xef, 0x3b, 0xb8, 0x41, 0x39, 0xe9, 0x8b, 0xcb, 0xff, 0x93, 0x6e, 0x2b, 0xbb, 0x82, 0xd9, 0xf2, 0x3e, 0xa9, 0x28, 0xd4, 0xbd, 0x62, 0x92, 0x39, 0xd0, 0xfc, 0xe3, 0xd9, 0xb6, 0x0f, 0x59, 0x9d, 0x5f, 0x88, 0x5b, 0x7d, 0x61, 0x87, 0x49, 0x1f, 0x02, 0xb8, 0x52, 0xdb, 0x39, 0x74, 0xec, 0x29, 0x89, 0x18, 0x83, 0xbf, 0x67, 0x60, 0xb6, 0x9b, 0xb4, 0xc6, 0x16, 0x82, 0x9f, 0xf7, 0xfd, 0x7c, 0x14, 0xad, 0xd6, 0x0e, 0x04, 0x19, 0x16, 0x9f, 0x53, 0x49, 0x09, 0x14, 0xf3, 0x9a, 0x89, 0x8d, 0x06, 0x7f, 0x81, 0x0f, 0x57, 0x0f, 0x85, 0x20, 0xf8, 0xfe, 0x45, 0xf0, 0xd0, 0xc4, 0x38, 0x7b, 0xdc, 0xd0, 0x07, 0xd1, 0xb7, 0x34, 0xe1, 0x05, 0x0f, 0xd9, 0xb5, 0x8f, 0xc5, 0xde, 0x49, 0xa0, 0x39, 0xd9, 0x3e, 0xa6, 0x04, 0x10, 0x79, 0x0e, 0x89, 0x07, 0x63, 0x5e, 0x52, 0xd7, 0x4e, 0xd7, 0x8a, 0x9e, 0xb4, 0x6b, 0xea, 0x61, 0x5b, 0x69, 0x16, 0x4b, 0xd0, 0x43, 0x92, 0xc3, 0xcd, 0x80, 0xbe, 0xe7, 0x02, 0x69, 0xa1, 0xda, 0xb7, 0xe4, 0x6b, 0xef, 0x2b, 0x75, 0x71, 0x83, 0x70, 0x50, 0x7d, 0x73, 0xf1, 0x76, 0x72, 0x2f, 0x10, 0xcf, 0xb8, 0xa5, 0x69, 0xdb, 0x7d, 0xf7, 0x85, 0xda, 0xa4, 0xa4, 0xfc, 0x41, 0x7f, 0xb2, 0x19, 0x92, 0xf0, 0xec, 0xcb, 0xac, 0x09, 0x17, 0x2f, 0xcd, 0xab, 0x84, 0x28, 0xd4, 0x57, 0x8e, 0x8e, 0x8c, 0x4f, 0x0f, 0xd1, 0x7c, 0x99, 0xbd, 0xc5, 0xa6, 0x9d, 0x44, 0x8f, 0x5d, 0x18, 0xf7, 0xf4, 0x24, 0x78, 0xe4, 0x74, 0x50, 0x6f, 0x0d, 0x7b, 0x66, 0x57, 0x8d, 0xab, 0xf3, 0xe5, 0x3b, 0x72, 
0x5f, 0xf1, 0xa3, 0x12, 0x68, 0xd0, 0xc6, 0x0d, 0xed, 0xfa, 0x16, 0x27, 0xda, 0x40, 0x4f, 0xbf, 0x07, 0xb7, 0x74, 0x4d, 0x93, 0x12, 0xb7, 0xee, 0x5a, 0x17, 0xa7, 0xbb, 0x4d, 0xad, 0x03, 0x95, 0x94, 0x7e, 0xec, 0x69, 0xb8, 0x7c, 0xed, 0x9a, 0xde, 0x3c, 0x3b, 0x3d, 0x30, 0x58, 0xf2, 0x4e, 0xed, 0x7a, 0x98, 0x2f, 0x32, 0x96, 0x8a, 0xfc, 0xea, 0xe0, 0xd3, 0xca, 0x31, 0x8e, 0xd6, 0x8d, 0x16, 0xf8, 0x30, 0xe8, 0x44, 0x0e, 0x67])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext

def test_kravatte_WBC_k_16_msg_401_tweak_16(self, test_workers):
my_key = bytes([0x98, 0x95, 0x92, 0x8f, 0x8c, 0x89, 0x86, 0x83, 0x80, 0x7d, 0x7a, 0x77, 0x74, 0x71, 0x6e, 0x6b])
my_tweak = bytes([0xfe, 0xdf, 0xc0, 0xa1, 0x82, 0x63, 0x44, 0x25, 0x05, 0xe6, 0xc7, 0xa8, 0x89, 0x6a, 0x4b, 0x2c])
my_message = bytes([0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1a, 0x17, 0x14, 0x11, 0x0e, 0x0b, 0x08, 0x05, 0x02, 0xff, 0xfc, 0xf9, 0xf6, 0xf3, 0xf0, 0xed, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 
0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab])
real_ciphertext = bytes([0xd1, 0x12, 0x65, 0xc7, 0x75, 0x69, 0xce, 0x2e, 0xca, 0x7d, 0xdc, 0x5f, 0xfc, 0x2c, 0x18, 0x09, 0xaf, 0xa5, 0xae, 0xdb, 0x48, 0xfc, 0xf1, 0x0a, 0x13, 0xc0, 0xef, 0xcd, 0x0a, 0xc9, 0xeb, 0x1d, 0xbb, 0x9b, 0xe6, 0xf5, 0x2b, 0x70, 0x1d, 0x99, 0x98, 0x14, 0x75, 0x97, 0x1d, 0xcc, 0xe6, 0xe6, 0x96, 0x59, 0xf8, 0x08, 0xca, 0xfb, 0x39, 0x29, 0xc5, 0xc4, 0x3d, 0x55, 0xdd, 0xed, 0x86, 0x54, 0xad, 0x39, 0x55, 0x73, 0xb4, 0x68, 0xd7, 0x18, 0x82, 0x76, 0x74, 0x1d, 0x0f, 0x56, 0x34, 0x59, 0xef, 0x09, 0x3a, 0xd4, 0xc7, 0x74, 0x92, 0x81, 0x1a, 0x5d, 0x6a, 0x20, 0xbd, 0xf6, 0xa2, 0xa8, 0x9f, 0xf0, 0x2d, 0x16, 0xb5, 0x31, 0xe8, 0x32, 0xc5, 0xdc, 0x78, 0x8b, 0x1d, 0x1f, 0xe5, 0xc0, 0x34, 0x10, 0x3b, 0x95, 0x9f, 0x6a, 0x70, 0x63, 0x98, 0x75, 0x4d, 0x3c, 0xcd, 0x98, 0x8f, 0x8c, 0x90, 0xf3, 0xef, 0x54, 0x39, 0x3f, 0xe1, 0x5b, 0x9b, 0x0c, 0xb9, 0x48, 0xaf, 0xc5, 0x59, 0xd4, 0x6a, 0xa0, 0x2f, 0xb4, 0xd7, 0xd2, 0xb7, 0xf6, 0xfe, 0xe7, 0x45, 0x15, 0x08, 0x6a, 0xda, 0xe1, 0x5f, 0x40, 0x08, 0x0c, 0x50, 0xa6, 0xad, 0x04, 0x5a, 0xd3, 0xf0, 0x0c, 0x76, 0x8a, 0x27, 0xd3, 0x7a, 0x43, 0x63, 0xb6, 0x4d, 0x04, 0xd1, 0x0f, 0x64, 0x81, 0xda, 0x2c, 0x95, 0xbd, 0x6b, 0x1c, 0x4e, 0x70, 0xeb, 0xe2, 0x8d, 0xb7, 0xe4, 0x9a, 0x3a, 0xbf, 0xb6, 0x77, 0x3f, 0xe3, 0xad, 0x4e, 0x6b, 0x54, 0xf6, 0x5d, 0x02, 0xf7, 0xf8, 0x8c, 0xe1, 0x5d, 0x1c, 0x84, 0x3b, 0xc9, 0xdf, 0x85, 0x23, 0xa4, 0xb1, 0x4c, 0xfa, 0x1b, 0x00, 0x6c, 0x5c, 0x12, 0xa6, 0x1e, 0x43, 0x38, 0x88, 0xa1, 0x96, 0x33, 0xe8, 0xa6, 0x01, 0xad, 0x8e, 0x27, 0xdd, 0x7d, 0x9f, 0x3a, 0x9a, 0xbe, 0x6a, 0x2a, 0x43, 0xbd, 0xbc, 0x8c, 0x7d, 0x27, 0xda, 0x1d, 0x53, 0xcd, 0x09, 0x2c, 0x61, 0x7a, 0x13, 0x92, 0x56, 0xf8, 0x92, 0xe1, 0xf6, 0xcb, 0x18, 0xfa, 0xbf, 0xbd, 0x74, 0x37, 0x7a, 0xce, 0x7e, 0x4f, 0xfd, 0x86, 0x2f, 0x70, 0x26, 0xe0, 0xb2, 0x46, 0x08, 0x87, 0x25, 0x8a, 0x68, 0x78, 0xe6, 0x11, 0x9d, 0x84, 0xc8, 0xe7, 0x50, 0xaf, 0x1f, 0x25, 0x54, 0x15, 0x59, 0xfe, 0x70, 0x44, 0x03, 0xc2, 0x6c, 0x6a, 0xd7, 0x7a, 0x79, 0xa4, 0x7a, 0x9c, 0xd3, 
0x8c, 0xa5, 0xd7, 0xbf, 0x43, 0xf6, 0x55, 0x25, 0x90, 0xf1, 0x05, 0xf8, 0xf0, 0xbb, 0xd8, 0x71, 0xc8, 0xe3, 0xbb, 0x4f, 0xc8, 0x7e, 0xd5, 0x52, 0xd4, 0x03, 0xb8, 0x06, 0x94, 0x06, 0xaf, 0x8e, 0x5e, 0xbc, 0xa4, 0x77, 0x69, 0xd0, 0x72, 0x78, 0x6c, 0xb2, 0xd0, 0x73, 0xa1, 0xeb, 0x7f, 0x6b, 0x2a, 0x3f, 0x2e, 0x4f, 0xb6, 0x01, 0x4f, 0x2c, 0x84, 0xb8, 0x40, 0x4c, 0x22, 0x33, 0x31, 0xcd, 0x3c, 0xfa, 0x60, 0x61, 0x5c, 0xcc, 0xb2, 0xb8])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext

def test_kravatte_WBC_k_16_msg_512_tweak_16(self, test_workers):
my_key = bytes([0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2])
my_tweak = bytes([0x17, 0x08, 0xf9, 0xea, 0xdb, 0xcc, 0xbd, 0xae, 0x9f, 0x90, 0x81, 0x72, 0x63, 0x54, 0x45, 0x36])
my_message = bytes([0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 
0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45])
real_ciphertext = bytes([0xbd, 0xb8, 0x41, 0x3c, 0xb4, 0xe7, 0xd3, 0x68, 0xd8, 0x94, 0x13, 0x2a, 0xc7, 0xd7, 0x4a, 0x48, 0x75, 0xc8, 0x29, 0x26, 0x42, 0xa2, 0x41, 0x44, 0x4d, 0x19, 0x88, 0x3a, 0x46, 0x08, 0xeb, 0x31, 0x6c, 0x82, 0x6d, 0x98, 0x88, 0x3c, 0x71, 0x87, 0x4f, 0x9c, 0x0c, 0x05, 0xe2, 0xf1, 0x74, 0x0e, 0x33, 0x00, 0x36, 0xc9, 0x2d, 0x30, 0x57, 0x50, 0x39, 0xf6, 0x7c, 0x2a, 0x1b, 0xae, 0xe6, 0x85, 0x1d, 0xb4, 0x8d, 0xa7, 0xb1, 0x89, 0xde, 0xc5, 0xd8, 0x08, 0x00, 0xde, 0x87, 0xa6, 0x84, 0x89, 0x15, 0x6c, 0x56, 0x2d, 0x44, 0x76, 0xd1, 0x94, 0x64, 0x34, 0x0c, 0x28, 0x48, 0xca, 0xea, 0xfe, 0x0f, 0xdb, 0xc9, 0x2e, 0xd1, 0x57, 0x5f, 0xc9, 0x60, 0x71, 0xf1, 0x3f, 0x1e, 0x1f, 0x8c, 0x32, 0xd2, 0x1f, 0x7c, 0x9a, 0x07, 0xfb, 0xb3, 0xcf, 0x5b, 0x50, 0x65, 0x29, 0x2f, 0x62, 0xe0, 0xc3, 0x1e, 0xea, 0x93, 0x0e, 0x11, 0xbe, 0x69, 0xa7, 0xe0, 0x0f, 0x7e, 0x17, 0xe8, 0xef, 0x18, 0xb2, 0x81, 0xa0, 0x52, 0xfa, 0x53, 0xc0, 0xe1, 0x82, 0xeb, 0x7f, 0xec, 0x6c, 0x0f, 0x78, 0x47, 0xf4, 0xff, 0xef, 0x25, 0x97, 0xc7, 0x63, 0xc6, 0x8d, 0xe7, 0xd4, 0x76, 0x82, 0x93, 0xbb, 0xb9, 0x75, 0x01, 0xe8, 0xf0, 0x87, 0x85, 0x61, 0x59, 0x1e, 0x5c, 0x39, 0xd7, 0x86, 0x85, 0xa1, 0x30, 0x43, 0xb9, 0xfa, 0x54, 0x45, 0x19, 0xb8, 0xae, 0x96, 0xda, 0x8c, 0xa5, 0x60, 0x15, 0x8a, 0x32, 0xe4, 0x75, 0xcc, 0x13, 0xbb, 0x07, 0xe8, 0xa4, 0xc4, 0xc1, 0x75, 0x94, 0xc4, 0xa0, 0xf4, 0x48, 0x72, 0xe9, 0x7f, 0xd3, 0x80, 0x1a, 0x17, 0x23, 0x4c, 0x85, 0xdd, 0x45, 0x99, 0x58, 0x1a, 0x27, 0x35, 0x07, 0xac, 0x32, 0x2f, 0x10, 0xe6, 0x2f, 0xf0, 0xa1, 0x3e, 0x91, 0x9f, 0xa2, 0xf0, 0x8a, 0x8d, 0x4a, 0xec, 0xc6, 0x5b, 0xc5, 0x00, 0x31, 0x8e, 0x5d, 0xa8, 0xf8, 0x84, 0x7a, 0xa1, 0x01, 0xed, 0x7e, 0x0a, 0x17, 0x99, 0x83, 0x37, 0x9e, 0x12, 0xc7, 0x66, 0x30, 0x1a, 0xb6, 0x4a, 0x41, 0x51, 0x2b, 0xbb, 0xc3, 0x80, 0x2b, 0x0e, 0x3b, 0xb5, 0xeb, 0x78, 0xb1, 0xae, 0xec, 0x34, 0xb0, 0x0c, 0x88, 0x58, 0xbd, 0x0b, 0x80, 0xce, 0x30, 0x54, 0x4b, 0x7c, 0x96, 0x84, 0x5d, 0x2a, 0xb3, 0x07, 0xa8, 0x8c, 0x0f, 0x0d, 0xbb, 0x98, 0x6e, 0x8b, 0xfc, 
0x03, 0x33, 0x40, 0x7a, 0x8f, 0x7f, 0xc4, 0x69, 0x38, 0x9a, 0x62, 0xcc, 0x89, 0x3e, 0xdb, 0x16, 0x01, 0x18, 0x2b, 0x02, 0x87, 0x8a, 0x35, 0xba, 0x27, 0xaf, 0x21, 0x6c, 0x90, 0x30, 0x81, 0x9c, 0x93, 0x73, 0xe1, 0x42, 0x59, 0x6a, 0x26, 0xd6, 0x7f, 0x6b, 0x15, 0xf0, 0xb8, 0x8f, 0x50, 0x23, 0xfc, 0x46, 0xd1, 0x78, 0x7e, 0xf3, 0xee, 0x7e, 0xa6, 0xea, 0x88, 0xe0, 0xce, 0xa7, 0xaa, 0x58, 0x02, 0xb9, 0x46, 0x81, 0x7c, 0xfe, 0xd4, 0x83, 0x40, 0xe2, 0x9e, 0x8a, 0xa7, 0xc2, 0x1a, 0x7b, 0x30, 0x85, 0x76, 0x0b, 0xef, 0x11, 0x86, 0xc2, 0x3a, 0xc0, 0x6f, 0xdb, 0xa1, 0x5e, 0xb5, 0xbc, 0x58, 0x67, 0x17, 0x11, 0xc9, 0x4f, 0x14, 0xa2, 0xae, 0x7d, 0x55, 0x48, 0xbc, 0x88, 0xb6, 0xc1, 0x22, 0x4a, 0xec, 0x75, 0x23, 0xf5, 0x38, 0x1d, 0x1e, 0xab, 0x8f, 0x0c, 0x42, 0x0f, 0x32, 0x57, 0xf0, 0x6c, 0x9d, 0x53, 0xb3, 0x9f, 0xdf, 0x4f, 0xe6, 0xd8, 0x31, 0xb2, 0x16, 0x04, 0xed, 0xa7, 0xf5, 0x32, 0x50, 0xfc, 0xee, 0x51, 0xf0, 0xb9, 0xae, 0xce, 0x7a, 0x40, 0xb6, 0x2a, 0xeb, 0x28, 0xb5, 0xcf, 0x14, 0x0e, 0x91, 0x8f, 0xcc, 0x77, 0x25, 0xef, 0xf7, 0x12, 0x37, 0x1f, 0xca, 0x98, 0x6e, 0xee, 0x76, 0x5e, 0x22, 0x21, 0x3c])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext

def test_kravatte_WBC_k_16_msg_598_tweak_16(self, test_workers):
my_key = bytes([0x0b, 0x8c, 0x0c, 0x8d, 0x0d, 0x8e, 0x0e, 0x8f, 0x0f, 0x90, 0x10, 0x91, 0x11, 0x92, 0x12, 0x93])
my_tweak = bytes([0x71, 0x6e, 0x6b, 0x68, 0x65, 0x62, 0x5f, 0x5c, 0x59, 0x56, 0x53, 0x50, 0x4d, 0x4a, 0x47, 0x44])
my_message = bytes([0xb5, 0x36, 0xb6, 0x37, 0xb7, 0x38, 0xb8, 0x39, 0xb9, 0x3a, 0xba, 0x3b, 0xbb, 0x3c, 0xbc, 0x3d, 0xbd, 0x3e, 0xbe, 0x3f, 0xbf, 0x40, 0xc0, 0x41, 0xc1, 0x42, 0xc2, 0x43, 0xc3, 0x44, 0xc4, 0x45, 0xc5, 0x46, 0xc6, 0x47, 0xc7, 0x48, 0xc8, 0x49, 0xc9, 0x4a, 0xca, 0x4b, 0xcb, 0x4c, 0xcc, 0x4d, 0xcd, 0x4e, 0xce, 0x4f, 0xcf, 0x50, 0xd0, 0x51, 0xd1, 0x52, 0xd2, 0x53, 0xd3, 0x54, 0xd4, 0x55, 0xd5, 0x56, 0xd6, 0x57, 0xd7, 0x58, 0xd8, 0x59, 0xd9, 0x5a, 0xda, 0x5b, 0xdb, 0x5c, 0xdc, 0x5d, 0xdd, 0x5e, 0xde, 0x5f, 0xdf, 0x60, 0xe0, 0x61, 0xe1, 0x62, 0xe2, 0x63, 0xe3, 0x64, 0xe4, 0x65, 0xe5, 0x66, 0xe6, 0x67, 0xe7, 0x68, 0xe8, 0x69, 0xe9, 0x6a, 0xea, 0x6b, 0xeb, 0x6c, 0xec, 0x6d, 0xed, 0x6e, 0xee, 0x6f, 0xef, 0x70, 0xf0, 0x71, 0xf1, 0x72, 0xf2, 0x73, 0xf3, 0x74, 0xf4, 0x75, 0xf5, 0x76, 0xf6, 0x77, 0xf7, 0x78, 0xf8, 0x79, 0xf9, 0x7a, 0xfa, 0x7b, 0xfb, 0x7c, 0xfc, 0x7d, 0xfd, 0x7e, 0xfe, 0x7f, 0xff, 0x80, 0x00, 0x81, 0x01, 0x82, 0x02, 0x83, 0x03, 0x84, 0x04, 0x85, 0x05, 0x86, 0x06, 0x87, 0x07, 0x88, 0x08, 0x89, 0x09, 0x8a, 0x0a, 0x8b, 0x0b, 0x8c, 0x0c, 0x8d, 0x0d, 0x8e, 0x0e, 0x8f, 0x0f, 0x90, 0x10, 0x91, 0x11, 0x92, 0x12, 0x93, 0x13, 0x94, 0x14, 0x95, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0xb5, 0x36, 0xb6, 0x37, 0xb7, 0x38, 0xb8, 0x39, 0xb9, 0x3a, 0xba, 0x3b, 0xbb, 0x3c, 0xbc, 0x3d, 0xbd, 0x3e, 0xbe, 0x3f, 0xbf, 0x40, 0xc0, 0x41, 0xc1, 0x42, 0xc2, 0x43, 0xc3, 0x44, 0xc4, 0x45, 0xc5, 0x46, 0xc6, 0x47, 0xc7, 0x48, 0xc8, 0x49, 0xc9, 0x4a, 0xca, 0x4b, 0xcb, 0x4c, 0xcc, 0x4d, 0xcd, 0x4e, 0xce, 0x4f, 0xcf, 0x50, 0xd0, 0x51, 0xd1, 0x52, 0xd2, 0x53, 0xd3, 0x54, 0xd4, 0x55, 0xd5, 0x56, 0xd6, 0x57, 0xd7, 0x58, 0xd8, 0x59, 0xd9, 0x5a, 
0xda, 0x5b, 0xdb, 0x5c, 0xdc, 0x5d, 0xdd, 0x5e, 0xde, 0x5f, 0xdf, 0x60, 0xe0, 0x61, 0xe1, 0x62, 0xe2, 0x63, 0xe3, 0x64, 0xe4, 0x65, 0xe5, 0x66, 0xe6, 0x67, 0xe7, 0x68, 0xe8, 0x69, 0xe9, 0x6a, 0xea, 0x6b, 0xeb, 0x6c, 0xec, 0x6d, 0xed, 0x6e, 0xee, 0x6f, 0xef, 0x70, 0xf0, 0x71, 0xf1, 0x72, 0xf2, 0x73, 0xf3, 0x74, 0xf4, 0x75, 0xf5, 0x76, 0xf6, 0x77, 0xf7, 0x78, 0xf8, 0x79, 0xf9, 0x7a, 0xfa, 0x7b, 0xfb, 0x7c, 0xfc, 0x7d, 0xfd, 0x7e, 0xfe, 0x7f, 0xff, 0x80, 0x00, 0x81, 0x01, 0x82, 0x02, 0x83, 0x03, 0x84, 0x04, 0x85, 0x05, 0x86, 0x06, 0x87, 0x07, 0x88, 0x08, 0x89, 0x09, 0x8a, 0x0a, 0x8b, 0x0b, 0x8c, 0x0c, 0x8d, 0x0d, 0x8e, 0x0e, 0x8f, 0x0f, 0x90, 0x10, 0x91, 0x11, 0x92, 0x12, 0x93, 0x13, 0x94, 0x14, 0x95, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0xb5, 0x36, 0xb6, 0x37, 0xb7, 0x38, 0xb8, 0x39, 0xb9, 0x3a, 0xba, 0x3b, 0xbb, 0x3c, 0xbc, 0x3d, 0xbd, 0x3e, 0xbe, 0x3f, 0xbf, 0x40, 0xc0, 0x41, 0xc1, 0x42, 0xc2, 0x43, 0xc3, 0x44, 0xc4, 0x45, 0xc5, 0x46, 0xc6, 0x47, 0xc7, 0x48, 0xc8, 0x49, 0xc9, 0x4a, 0xca, 0x4b, 0xcb, 0x4c, 0xcc, 0x4d, 0xcd, 0x4e, 0xce, 0x4f, 0xcf, 0x50, 0xd0, 0x51, 0xd1, 0x52, 0xd2, 0x53, 0xd3, 0x54, 0xd4, 0x55, 0xd5, 0x56, 0xd6, 0x57, 0xd7, 0x58, 0xd8, 0x59, 0xd9, 0x5a, 0xda, 0x5b, 0xdb, 0x5c, 0xdc, 0x5d, 0xdd, 0x5e, 0xde, 0x5f, 0xdf, 0x60])
real_ciphertext = bytes([0x29, 0x4e, 0x70, 0xf7, 0x55, 0xaf, 0x24, 0x15, 0x34, 0x2a, 0x0b, 0x33, 0x37, 0x51, 0x44, 0xcb, 0x97, 0x97, 0xdf, 0xf8, 0x33, 0x13, 0x25, 0x0b, 0x61, 0x68, 0x66, 0x59, 0x91, 0xb5, 0x04, 0xe7, 0x53, 0xf6, 0xc5, 0x01, 0x2f, 0xbe, 0xd4, 0x8a, 0x97, 0xd2, 0x88, 0x5d, 0xda, 0xb9, 0x23, 0x1e, 0x66, 0x70, 0x50, 0x9a, 0x88, 0x41, 0x1e, 0xa2, 0x2d, 0xb6, 0x32, 0x8d, 0x98, 0x7c, 0x8c, 0x06, 0xbf, 0xfa, 0x5a, 0xf1, 0xf7, 0xdc, 0xe6, 0xd9, 0xe3, 0x7d, 0x0d, 0xc3, 0x39, 0xb5, 0x1b, 0x70, 0x08, 0x87, 0xec, 0x74, 0x2b, 0xaf, 0xc2, 0x85, 0xe7, 0xe4, 0x1d, 0xf8, 0x52, 0x5c, 0xbc, 0xfb, 0x13, 0x7f, 0x50, 0x57, 0x74, 0xb4, 0x3e, 0x97, 0x71, 0x2d, 0x8c, 0x2a, 0x93, 0xc7, 0xc0, 0x8f, 0x9d, 0xda, 0x85, 0x60, 0xc6, 0xde, 0xd4, 0x0c, 0x6e, 0xbd, 0xdf, 0x1d, 0xe2, 0x2b, 0xce, 0x94, 0xd6, 0x41, 0xdf, 0x0f, 0xcb, 0x67, 0xf3, 0x76, 0xbe, 0x5f, 0xb2, 0x14, 0x87, 0x41, 0x9f, 0x6c, 0xc6, 0x3d, 0xb5, 0xd9, 0x8c, 0x78, 0xd4, 0x5a, 0xf3, 0xe9, 0xc9, 0xdb, 0x0d, 0x76, 0x96, 0xdd, 0x91, 0x96, 0x78, 0xad, 0x35, 0xcc, 0x16, 0x60, 0x98, 0x03, 0x1c, 0x80, 0x22, 0xa8, 0x8c, 0x2a, 0x7a, 0xf7, 0xc3, 0x46, 0x2a, 0xce, 0xd2, 0x00, 0xec, 0x6d, 0xb6, 0x0c, 0x3d, 0xb6, 0x2f, 0xd4, 0x93, 0xe8, 0x2f, 0x4a, 0x70, 0x4b, 0x0b, 0x0b, 0xaa, 0x99, 0xb6, 0xd6, 0x7f, 0x20, 0xe1, 0x68, 0xd5, 0xe3, 0xc2, 0x34, 0x8c, 0x32, 0x37, 0xfd, 0x44, 0x15, 0xb1, 0xbe, 0xc9, 0x7c, 0x1d, 0x27, 0xe2, 0x96, 0xa9, 0x86, 0x4e, 0x59, 0x9c, 0x24, 0x3d, 0x18, 0xb5, 0x3e, 0xd9, 0x69, 0xc9, 0xb2, 0x8b, 0x67, 0xa9, 0x7c, 0x37, 0x5e, 0x45, 0xb9, 0x7a, 0xd0, 0x41, 0x0c, 0x37, 0xe5, 0x8f, 0xb1, 0x29, 0xa6, 0x3a, 0x0e, 0xa3, 0xe5, 0x94, 0xa8, 0xbd, 0x10, 0xa7, 0x74, 0xbc, 0xe0, 0x28, 0x6b, 0x64, 0xa4, 0xbe, 0x74, 0x10, 0x2c, 0x4d, 0x4e, 0x0a, 0xdf, 0xa9, 0x8a, 0x2d, 0xf4, 0x85, 0xe1, 0x5b, 0x53, 0x7c, 0xc5, 0x5e, 0x72, 0x0a, 0xe5, 0xc1, 0x53, 0x37, 0x6c, 0x69, 0x7e, 0xfc, 0x9b, 0x46, 0xda, 0x67, 0x0d, 0xc0, 0x23, 0xce, 0x78, 0x97, 0x4a, 0x9f, 0x39, 0xec, 0xe0, 0xc9, 0xe2, 0xcd, 0xc4, 0x4a, 0x4a, 0xb1, 0x0e, 0xc4, 0x7d, 0x14, 
0xb9, 0x78, 0xaa, 0x11, 0x1e, 0xe3, 0x20, 0x18, 0x8a, 0x79, 0x7b, 0x07, 0x71, 0xb5, 0x15, 0x6a, 0x37, 0xdc, 0x20, 0xbb, 0xcb, 0xfb, 0x98, 0x22, 0xbd, 0x54, 0xcd, 0x5e, 0x7d, 0xde, 0xdd, 0x6f, 0x07, 0x97, 0x36, 0xf5, 0x2a, 0x84, 0x39, 0xbf, 0xdb, 0xf5, 0x05, 0xe4, 0xd8, 0x21, 0xfc, 0x58, 0x98, 0xc4, 0xbe, 0x61, 0x9e, 0x70, 0xdd, 0xd1, 0xb8, 0x49, 0x36, 0x02, 0xfb, 0x27, 0x3e, 0xae, 0xaa, 0x14, 0x94, 0x92, 0x03, 0xec, 0xd6, 0x3d, 0x2c, 0xa1, 0x09, 0x14, 0x6b, 0x4e, 0xae, 0x5e, 0xf1, 0xf8, 0xd1, 0x61, 0x2d, 0x56, 0x85, 0x36, 0xd1, 0x84, 0xa4, 0xb3, 0x6b, 0x42, 0x0d, 0x0f, 0x93, 0x7c, 0x35, 0xb5, 0x9d, 0x62, 0x73, 0x48, 0x6e, 0xc2, 0x1d, 0x8a, 0xb7, 0x49, 0x5a, 0x44, 0xad, 0x3b, 0x7c, 0x7e, 0x69, 0x0f, 0x6f, 0x3c, 0xdb, 0xc0, 0xb5, 0xbb, 0x37, 0x44, 0x1a, 0x09, 0x5c, 0x68, 0x1a, 0xba, 0x16, 0x0f, 0x16, 0x97, 0x62, 0x7e, 0x7d, 0xe9, 0x63, 0x99, 0x32, 0xa1, 0xfa, 0x91, 0xf0, 0xe2, 0x8d, 0x88, 0xeb, 0xc5, 0xdb, 0xeb, 0xfe, 0x78, 0x05, 0xb3, 0xa1, 0x30, 0x65, 0x30, 0xff, 0xff, 0xab, 0x86, 0x66, 0xcb, 0x0b, 0x8b, 0x9a, 0x12, 0xf2, 0x02, 0xe3, 0x8a, 0x28, 0xa1, 0x29, 0xf8, 0x92, 0xcb, 0x98, 0x8b, 0x3a, 0xa0, 0xa3, 0xe1, 0xe6, 0x64, 0x34, 0xcd, 0xee, 0x35, 0xba, 0x91, 0x37, 0x88, 0x79, 0x3d, 0xc4, 0xc8, 0xf8, 0x68, 0xde, 0x86, 0x52, 0xe1, 0x84, 0xce, 0x92, 0x47, 0xbe, 0x99, 0xa5, 0xbe, 0xd0, 0xc4, 0xac, 0x9e, 0xad, 0x24, 0x42, 0xbe, 0x8d, 0x5c, 0x55, 0x52, 0x07, 0xd9, 0x45, 0x03, 0x37, 0x0a, 0xe4, 0x58, 0x41, 0x66, 0xbd, 0x78, 0xf6, 0x4a, 0x4a, 0x18, 0x89, 0x28, 0x84, 0x70, 0x2e, 0xdc, 0x6b, 0xb8, 0x69, 0x5f, 0x07, 0x89, 0xd7, 0x53, 0xab, 0x06, 0xee, 0x17, 0xeb, 0x0d, 0xa3, 0x1a, 0xa2, 0xb3, 0x8f])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext

def test_kravatte_WBC_k_16_msg_599_tweak_16(self, test_workers):
my_key = bytes([0x02, 0x02, 0x02, 0x02, 0x02, 0x02, 0x02, 0x02, 0x02, 0x02, 0x02, 0x02, 0x02, 0x02, 0x02, 0x02])
my_tweak = bytes([0x68, 0x61, 0x5a, 0x53, 0x4c, 0x45, 0x3e, 0x37, 0x30, 0x29, 0x22, 0x1b, 0x14, 0x0d, 0x06, 0xff])
my_message = bytes([0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 
0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d, 0x4d])
real_ciphertext = bytes([0xe8, 0x33, 0xab, 0xf5, 0x83, 0xc2, 0x32, 0x77, 0x87, 0xdf, 0xd1, 0x5f, 0xee, 0xd8, 0xdd, 0x1d, 0x56, 0x84, 0x7b, 0x0a, 0x15, 0xc4, 0x2b, 0x91, 0x4a, 0x3d, 0xaa, 0x14, 0x66, 0x83, 0xdf, 0xfa, 0x7c, 0x5d, 0x1e, 0xdd, 0x8d, 0xab, 0x97, 0x23, 0xbb, 0x2e, 0x5d, 0xec, 0x5f, 0xa7, 0xf3, 0xdc, 0xb6, 0xbc, 0x2d, 0xfd, 0x55, 0x43, 0x2f, 0x18, 0xb0, 0x52, 0x9e, 0xf6, 0xdc, 0xe7, 0x87, 0xe5, 0x23, 0x87, 0xcf, 0x77, 0xcd, 0x33, 0x65, 0x3a, 0x7e, 0x82, 0xf8, 0xf6, 0x37, 0x10, 0x17, 0xef, 0x84, 0x82, 0xd9, 0x11, 0xf9, 0xfb, 0xe4, 0x3f, 0x20, 0xc7, 0x0d, 0x53, 0xe2, 0xc8, 0x40, 0xcf, 0x02, 0xa1, 0x32, 0xb9, 0x52, 0xc6, 0x5e, 0xb8, 0xca, 0x8b, 0x56, 0x57, 0x0f, 0x51, 0xeb, 0x95, 0xe9, 0x10, 0xc8, 0x44, 0x64, 0xcb, 0x0f, 0xea, 0xd8, 0x15, 0x86, 0xd3, 0x30, 0xb2, 0x75, 0xd4, 0x7d, 0x3f, 0xb9, 0xa5, 0x8d, 0x84, 0xd1, 0x04, 0x90, 0xdc, 0x8d, 0xd3, 0xdf, 0x87, 0x74, 0x74, 0x7a, 0x75, 0x2d, 0x22, 0x4d, 0x38, 0x86, 0x5d, 0x80, 0x23, 0x23, 0xd0, 0xce, 0xef, 0x38, 0xf9, 0x2a, 0xaa, 0x50, 0xe1, 0x68, 0x16, 0xbe, 0x60, 0xaf, 0x00, 0x2d, 0x27, 0xac, 0xe3, 0xf0, 0xee, 0x18, 0xb1, 0xb7, 0xc3, 0x77, 0x60, 0x9a, 0x6a, 0xb7, 0x31, 0xdc, 0x5f, 0x41, 0xa1, 0x24, 0xdb, 0xb1, 0x2a, 0xcb, 0xf6, 0xb0, 0xf8, 0x09, 0xae, 0x08, 0x3e, 0x74, 0x35, 0x30, 0x85, 0xa7, 0x45, 0xbc, 0x7e, 0x53, 0xfb, 0x1e, 0x6d, 0x8d, 0xbf, 0x36, 0x5a, 0xdd, 0xac, 0xff, 0x8a, 0x8e, 0xea, 0xf4, 0x28, 0x18, 0x5c, 0x3d, 0x91, 0xc6, 0xff, 0xee, 0x87, 0x5f, 0x3a, 0x6c, 0xa4, 0xe0, 0xb7, 0xfb, 0x74, 0x1d, 0x6d, 0xe8, 0x4f, 0x9e, 0xa2, 0xb3, 0xe6, 0x31, 0xe3, 0x8c, 0xe3, 0x23, 0xe1, 0x13, 0x61, 0xea, 0xc8, 0xe4, 0x52, 0x7e, 0xf1, 0xd1, 0x0f, 0xb3, 0x27, 0x2b, 0xc1, 0x44, 0x74, 0x82, 0xb9, 0x33, 0x11, 0x78, 0x04, 0x4c, 0xa6, 0x3c, 0xc5, 0x62, 0x40, 0x77, 0xa5, 0xce, 0x8e, 0x12, 0xef, 0xa3, 0x41, 0xcb, 0x78, 0xf7, 0x14, 0x27, 0x65, 0x5d, 0x14, 0x0b, 0x8e, 0xb2, 0xe4, 0xf8, 0x60, 0x9c, 0xa1, 0x31, 0x4c, 0xbd, 0x2a, 0xff, 0x3c, 0x3a, 0xb0, 0x75, 0x21, 0xf9, 0x48, 0xa0, 0xc9, 0x00, 0xe6, 0x4f, 0x9c, 0x8c, 0x16, 0x1c, 
0x2e, 0x61, 0x30, 0x5e, 0xb4, 0xfb, 0xa9, 0x36, 0x1a, 0x2a, 0x0a, 0x8c, 0xb4, 0x55, 0xe0, 0x43, 0xce, 0x91, 0x78, 0xa1, 0x2d, 0xe4, 0x87, 0x06, 0x74, 0xd6, 0x45, 0xfa, 0x7c, 0x3f, 0xe2, 0xdd, 0xd0, 0x48, 0x5b, 0x02, 0x5c, 0xd1, 0x4a, 0x03, 0xd1, 0x09, 0xc7, 0x89, 0x84, 0x08, 0x92, 0xf1, 0x5b, 0x9b, 0xed, 0x3c, 0xd1, 0xd0, 0x3f, 0x2f, 0x35, 0xb6, 0x3f, 0xa4, 0x59, 0x9c, 0x62, 0x86, 0xb5, 0x2a, 0x7a, 0x75, 0xf5, 0xc6, 0x91, 0x65, 0xa9, 0xb9, 0xa1, 0x97, 0x72, 0x03, 0x97, 0xb1, 0x6d, 0x7e, 0x3a, 0xc7, 0x2c, 0x81, 0x62, 0x53, 0x6c, 0xdc, 0x19, 0x2b, 0x2a, 0x24, 0x71, 0x67, 0x45, 0xa8, 0xad, 0x9f, 0xe0, 0xa3, 0x1e, 0x3e, 0xce, 0x20, 0x3a, 0xa6, 0x38, 0x37, 0xf7, 0xaf, 0x39, 0x85, 0x02, 0xdd, 0x85, 0x94, 0x30, 0x8f, 0xff, 0x1a, 0x87, 0xe7, 0x7c, 0xf8, 0x33, 0x50, 0x57, 0x28, 0xa1, 0x87, 0x81, 0x63, 0x77, 0xdc, 0x55, 0x93, 0x9b, 0x0f, 0xe2, 0x5e, 0x6d, 0x5e, 0x11, 0xbe, 0x1a, 0x49, 0x32, 0x4a, 0xfe, 0xe1, 0x22, 0x09, 0xb8, 0xed, 0x38, 0x2a, 0x78, 0x98, 0x25, 0x40, 0x94, 0xfd, 0xb1, 0x73, 0x31, 0x15, 0x3f, 0x47, 0xae, 0xa8, 0x0c, 0xda, 0x75, 0x6f, 0x43, 0x72, 0xd3, 0x2d, 0x5c, 0x12, 0x04, 0x6a, 0x63, 0x32, 0x82, 0xa1, 0x2b, 0xff, 0xa8, 0x33, 0x67, 0x1f, 0x32, 0x53, 0x2e, 0x1b, 0x3b, 0x26, 0x1f, 0xcb, 0xdd, 0x28, 0x17, 0x15, 0xd6, 0x68, 0x6d, 0xac, 0x84, 0x21, 0x56, 0x24, 0x44, 0xd0, 0x30, 0xde, 0xac, 0x54, 0x14, 0xdb, 0x61, 0xd3, 0xba, 0x70, 0x46, 0xac, 0x7c, 0x52, 0xfe, 0x58, 0xce, 0x17, 0x72, 0xf7, 0xbc, 0x86, 0x08, 0x00, 0x21, 0x86, 0xd6, 0xd4, 0xd2, 0xd5, 0x3b, 0xdb, 0xf6, 0xb1, 0x78, 0x56, 0x0e, 0x12, 0x3d, 0x65, 0x91, 0x49, 0x4d, 0xc9, 0x98, 0xf2, 0x8c, 0x43, 0x20, 0x84, 0x97, 0xa1, 0xa7, 0xed])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_600_tweak_16(self, test_workers):
my_key = bytes([0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a])
my_tweak = bytes([0x7f, 0x70, 0x61, 0x52, 0x43, 0x34, 0x25, 0x16, 0x07, 0xf8, 0xe9, 0xda, 0xcb, 0xbc, 0xad, 0x9e])
my_message = bytes([0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 
0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae])
real_ciphertext = bytes([0x4d, 0x3a, 0x53, 0x55, 0xe3, 0xa4, 0x4b, 0xb4, 0xbf, 0xb4, 0x58, 0x4c, 0xe1, 0xd5, 0xd6, 0xee, 0xc0, 0xf8, 0x90, 0x60, 0x90, 0x10, 0xd4, 0x6e, 0xae, 0xf8, 0x32, 0x1a, 0x55, 0x37, 0xae, 0x68, 0xad, 0xd4, 0x5b, 0x61, 0x3e, 0x41, 0x25, 0x2b, 0x20, 0x8d, 0x56, 0x83, 0x28, 0xbc, 0x44, 0x0d, 0xc2, 0x00, 0x2f, 0x80, 0xa5, 0xd1, 0xe3, 0x47, 0xe6, 0x8b, 0x2a, 0x5e, 0xfa, 0x13, 0x00, 0x88, 0x3e, 0x83, 0x46, 0x61, 0x36, 0x5c, 0x1f, 0xc9, 0xc6, 0x71, 0x29, 0xba, 0x3e, 0x06, 0xe1, 0x31, 0xe9, 0xe4, 0x3a, 0x35, 0x65, 0x54, 0x27, 0xa5, 0xab, 0x59, 0xf5, 0x5d, 0xd7, 0xf0, 0xd0, 0x36, 0x60, 0xae, 0x7b, 0x1b, 0x48, 0xa1, 0xd9, 0xec, 0xa7, 0xf0, 0xc1, 0x3b, 0x7b, 0x62, 0x80, 0x51, 0xe3, 0x55, 0x99, 0xce, 0x7c, 0xf7, 0x06, 0x1f, 0x19, 0x03, 0x11, 0x38, 0x5e, 0x5f, 0xe5, 0xfe, 0x1e, 0x15, 0xaa, 0x98, 0x2c, 0xd9, 0xef, 0x13, 0x63, 0x57, 0xbd, 0x82, 0x6b, 0x76, 0x8b, 0x42, 0xcd, 0x9d, 0xc9, 0x8c, 0xba, 0xd8, 0xd5, 0x7e, 0x9f, 0xe8, 0x03, 0xc9, 0x20, 0xb0, 0xba, 0x7f, 0x9f, 0x2e, 0xc5, 0xb5, 0x2e, 0x33, 0xf0, 0x16, 0x76, 0xad, 0x04, 0x6b, 0xa1, 0xaf, 0x9f, 0xa8, 0x2f, 0x14, 0xae, 0x25, 0x1c, 0xb0, 0x98, 0x4a, 0x47, 0xed, 0x44, 0x1c, 0x22, 0x62, 0x15, 0x1a, 0x1a, 0xc2, 0x00, 0xc7, 0x64, 0xaa, 0x71, 0xe6, 0x79, 0xdf, 0xb5, 0x8e, 0x7d, 0x2c, 0xf1, 0xf3, 0xc8, 0x21, 0x05, 0x92, 0x1a, 0xd0, 0xbe, 0xfb, 0x56, 0xa6, 0x1a, 0x10, 0xd2, 0x98, 0x08, 0x8c, 0xde, 0x93, 0xda, 0xb5, 0x8f, 0x08, 0x96, 0x6e, 0x96, 0xae, 0x87, 0x72, 0xfa, 0x17, 0x43, 0x74, 0xab, 0xdd, 0xc2, 0xcf, 0x5a, 0x9d, 0x6d, 0x21, 0xcb, 0x4f, 0x59, 0x6f, 0x91, 0x82, 0xe3, 0xa6, 0xb7, 0x8d, 0x3c, 0x51, 0xec, 0xbf, 0x5b, 0xfe, 0x7a, 0x8b, 0x0c, 0x70, 0x64, 0x93, 0x97, 0xf1, 0x1c, 0xa4, 0x90, 0x33, 0xde, 0x35, 0x05, 0xdf, 0xd4, 0x8c, 0x05, 0xe5, 0xdf, 0xf6, 0x1e, 0x95, 0x86, 0x74, 0xbd, 0x2c, 0x98, 0x47, 0x9c, 0xc1, 0x63, 0xc6, 0xe2, 0xac, 0xbb, 0xd8, 0x47, 0x9a, 0xb9, 0x49, 0x30, 0xf8, 0xc3, 0xdc, 0x8a, 0x42, 0x3e, 0xca, 0x21, 0x2a, 0x4d, 0x2a, 0xfa, 0x30, 0x4b, 0x58, 0x24, 0xad, 0xea, 0xbd, 0xa5, 0x2d, 0x37, 
0x82, 0xc0, 0x0f, 0x42, 0x16, 0x84, 0x89, 0x24, 0x25, 0xf2, 0xfa, 0x22, 0x2e, 0xf1, 0x69, 0x91, 0x2a, 0xd3, 0x3b, 0xac, 0x13, 0x23, 0x5c, 0x10, 0x25, 0xf1, 0xd8, 0x03, 0x41, 0x99, 0x17, 0xa3, 0xf4, 0x63, 0x92, 0x14, 0x56, 0xc7, 0xf4, 0x92, 0x51, 0xc7, 0xe9, 0x48, 0xe7, 0xb4, 0x4c, 0xc8, 0xd3, 0x3e, 0x73, 0x50, 0x44, 0xf0, 0x06, 0x6c, 0x3e, 0xbc, 0x0c, 0x64, 0x19, 0xf5, 0x17, 0x98, 0x5b, 0x17, 0xb0, 0x17, 0x16, 0x18, 0x2e, 0xbf, 0xc6, 0xe9, 0x18, 0xd0, 0xba, 0x9d, 0x1c, 0x65, 0x7d, 0x1d, 0x4b, 0x9d, 0x49, 0xdd, 0xc4, 0xfb, 0xf0, 0xfd, 0x25, 0x86, 0x1e, 0x35, 0x83, 0x7e, 0x6c, 0x3d, 0x2d, 0x4a, 0x36, 0x6d, 0xe5, 0x82, 0x1d, 0x29, 0x07, 0x23, 0x52, 0xc6, 0xd5, 0x1a, 0x9c, 0x90, 0xc7, 0x14, 0x65, 0x5f, 0x4a, 0x23, 0x7d, 0xb5, 0x49, 0xc9, 0x63, 0x17, 0x4e, 0x71, 0x17, 0x24, 0xee, 0x0f, 0xe4, 0xa5, 0xa3, 0xb3, 0xd1, 0x84, 0xdd, 0x9f, 0x9d, 0x2f, 0xa9, 0xef, 0x78, 0xf9, 0x97, 0x10, 0x8e, 0x7e, 0x60, 0x0e, 0x23, 0x0f, 0x22, 0xe5, 0xfb, 0x55, 0x11, 0x90, 0x0e, 0x8a, 0x68, 0x65, 0xd1, 0x87, 0xb2, 0xa7, 0xc3, 0x8f, 0x48, 0xea, 0xc7, 0x02, 0xac, 0xa2, 0xfc, 0x0c, 0xb4, 0xc8, 0x85, 0xb1, 0xa7, 0x24, 0x3d, 0x97, 0xae, 0xbb, 0x39, 0x91, 0x6e, 0xe9, 0x72, 0xaf, 0x95, 0xbf, 0xc2, 0x03, 0xe9, 0xba, 0xf9, 0x08, 0x29, 0x3b, 0x37, 0x0c, 0x68, 0x3c, 0x22, 0xc0, 0x96, 0x1f, 0xab, 0xe5, 0x86, 0x93, 0x12, 0x38, 0xc5, 0xc4, 0x29, 0x05, 0x04, 0xb9, 0x04, 0x74, 0x52, 0xf7, 0x00, 0x97, 0x56, 0x7b, 0x68, 0xbd, 0x4f, 0x32, 0xcc, 0x0a, 0x36, 0x98, 0xe0, 0x90, 0xa4, 0x48, 0x24, 0xc0, 0x98, 0x6a, 0xe5, 0xb8, 0x2a, 0xce, 0x01, 0xa3, 0x0d, 0x4c, 0x1b, 0xab, 0x90, 0x1e, 0x47, 0x5d, 0x8b, 0x5c, 0xe0, 0x63, 0xe8, 0xb8, 0x63, 0x72, 0x95])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_601_tweak_16(self, test_workers):
my_key = bytes([0x20, 0x1d, 0x1a, 0x17, 0x14, 0x11, 0x0e, 0x0b, 0x08, 0x05, 0x02, 0xff, 0xfc, 0xf9, 0xf6, 0xf3])
my_tweak = bytes([0x86, 0x67, 0x48, 0x29, 0x0a, 0xeb, 0xcc, 0xad, 0x8d, 0x6e, 0x4f, 0x30, 0x11, 0xf2, 0xd3, 0xb4])
my_message = bytes([0xad, 0xaa, 0xa7, 0xa4, 0xa1, 0x9e, 0x9b, 0x98, 0x95, 0x92, 0x8f, 0x8c, 0x89, 0x86, 0x83, 0x80, 0x7d, 0x7a, 0x77, 0x74, 0x71, 0x6e, 0x6b, 0x68, 0x65, 0x62, 0x5f, 0x5c, 0x59, 0x56, 0x53, 0x50, 0x4d, 0x4a, 0x47, 0x44, 0x41, 0x3e, 0x3b, 0x38, 0x35, 0x32, 0x2f, 0x2c, 0x29, 0x26, 0x23, 0x20, 0x1d, 0x1a, 0x17, 0x14, 0x11, 0x0e, 0x0b, 0x08, 0x05, 0x02, 0xff, 0xfc, 0xf9, 0xf6, 0xf3, 0xf0, 0xec, 0xe9, 0xe6, 0xe3, 0xe0, 0xdd, 0xda, 0xd7, 0xd4, 0xd1, 0xce, 0xcb, 0xc8, 0xc5, 0xc2, 0xbf, 0xbc, 0xb9, 0xb6, 0xb3, 0xb0, 0xad, 0xaa, 0xa7, 0xa4, 0xa1, 0x9e, 0x9b, 0x98, 0x95, 0x92, 0x8f, 0x8c, 0x89, 0x86, 0x83, 0x80, 0x7d, 0x7a, 0x77, 0x74, 0x71, 0x6e, 0x6b, 0x68, 0x65, 0x62, 0x5f, 0x5c, 0x59, 0x56, 0x53, 0x50, 0x4d, 0x4a, 0x47, 0x44, 0x41, 0x3e, 0x3b, 0x38, 0x35, 0x32, 0x2f, 0x2b, 0x28, 0x25, 0x22, 0x1f, 0x1c, 0x19, 0x16, 0x13, 0x10, 0x0d, 0x0a, 0x07, 0x04, 0x01, 0xfe, 0xfb, 0xf8, 0xf5, 0xf2, 0xef, 0xec, 0xe9, 0xe6, 0xe3, 0xe0, 0xdd, 0xda, 0xd7, 0xd4, 0xd1, 0xce, 0xcb, 0xc8, 0xc5, 0xc2, 0xbf, 0xbc, 0xb9, 0xb6, 0xb3, 0xb0, 0xad, 0xaa, 0xa7, 0xa4, 0xa1, 0x9e, 0x9b, 0x98, 0x95, 0x92, 0x8f, 0x8c, 0x89, 0x86, 0x83, 0x80, 0x7d, 0x7a, 0x77, 0x74, 0x71, 0x6e, 0x6a, 0x67, 0x64, 0x61, 0x5e, 0x5b, 0x58, 0x55, 0x52, 0x4f, 0x4c, 0x49, 0x46, 0x43, 0x40, 0x3d, 0x3a, 0x37, 0x34, 0x31, 0x2e, 0x2b, 0x28, 0x25, 0x22, 0x1f, 0x1c, 0x19, 0x16, 0x13, 0x10, 0x0d, 0x0a, 0x07, 0x04, 0x01, 0xfe, 0xfb, 0xf8, 0xf5, 0xf2, 0xef, 0xec, 0xe9, 0xe6, 0xe3, 0xe0, 0xdd, 0xda, 0xd7, 0xd4, 0xd1, 0xce, 0xcb, 0xc8, 0xc5, 0xc2, 0xbf, 0xbc, 0xb9, 0xb6, 0xb3, 0xb0, 0xad, 0xad, 0xaa, 0xa7, 0xa4, 0xa1, 0x9e, 0x9b, 0x98, 0x95, 0x92, 0x8f, 0x8c, 0x89, 0x86, 0x83, 0x80, 0x7d, 0x7a, 0x77, 0x74, 0x71, 0x6e, 0x6b, 0x68, 0x65, 0x62, 0x5f, 0x5c, 0x59, 0x56, 0x53, 0x50, 0x4d, 0x4a, 0x47, 0x44, 0x41, 0x3e, 0x3b, 0x38, 0x35, 0x32, 0x2f, 0x2c, 0x29, 0x26, 0x23, 0x20, 0x1d, 0x1a, 0x17, 0x14, 0x11, 0x0e, 0x0b, 0x08, 0x05, 0x02, 0xff, 0xfc, 0xf9, 0xf6, 0xf3, 0xf0, 0xec, 0xe9, 0xe6, 0xe3, 0xe0, 0xdd, 0xda, 0xd7, 0xd4, 0xd1, 
0xce, 0xcb, 0xc8, 0xc5, 0xc2, 0xbf, 0xbc, 0xb9, 0xb6, 0xb3, 0xb0, 0xad, 0xaa, 0xa7, 0xa4, 0xa1, 0x9e, 0x9b, 0x98, 0x95, 0x92, 0x8f, 0x8c, 0x89, 0x86, 0x83, 0x80, 0x7d, 0x7a, 0x77, 0x74, 0x71, 0x6e, 0x6b, 0x68, 0x65, 0x62, 0x5f, 0x5c, 0x59, 0x56, 0x53, 0x50, 0x4d, 0x4a, 0x47, 0x44, 0x41, 0x3e, 0x3b, 0x38, 0x35, 0x32, 0x2f, 0x2b, 0x28, 0x25, 0x22, 0x1f, 0x1c, 0x19, 0x16, 0x13, 0x10, 0x0d, 0x0a, 0x07, 0x04, 0x01, 0xfe, 0xfb, 0xf8, 0xf5, 0xf2, 0xef, 0xec, 0xe9, 0xe6, 0xe3, 0xe0, 0xdd, 0xda, 0xd7, 0xd4, 0xd1, 0xce, 0xcb, 0xc8, 0xc5, 0xc2, 0xbf, 0xbc, 0xb9, 0xb6, 0xb3, 0xb0, 0xad, 0xaa, 0xa7, 0xa4, 0xa1, 0x9e, 0x9b, 0x98, 0x95, 0x92, 0x8f, 0x8c, 0x89, 0x86, 0x83, 0x80, 0x7d, 0x7a, 0x77, 0x74, 0x71, 0x6e, 0x6a, 0x67, 0x64, 0x61, 0x5e, 0x5b, 0x58, 0x55, 0x52, 0x4f, 0x4c, 0x49, 0x46, 0x43, 0x40, 0x3d, 0x3a, 0x37, 0x34, 0x31, 0x2e, 0x2b, 0x28, 0x25, 0x22, 0x1f, 0x1c, 0x19, 0x16, 0x13, 0x10, 0x0d, 0x0a, 0x07, 0x04, 0x01, 0xfe, 0xfb, 0xf8, 0xf5, 0xf2, 0xef, 0xec, 0xe9, 0xe6, 0xe3, 0xe0, 0xdd, 0xda, 0xd7, 0xd4, 0xd1, 0xce, 0xcb, 0xc8, 0xc5, 0xc2, 0xbf, 0xbc, 0xb9, 0xb6, 0xb3, 0xb0, 0xad, 0xad, 0xaa, 0xa7, 0xa4, 0xa1, 0x9e, 0x9b, 0x98, 0x95, 0x92, 0x8f, 0x8c, 0x89, 0x86, 0x83, 0x80, 0x7d, 0x7a, 0x77, 0x74, 0x71, 0x6e, 0x6b, 0x68, 0x65, 0x62, 0x5f, 0x5c, 0x59, 0x56, 0x53, 0x50, 0x4d, 0x4a, 0x47, 0x44, 0x41, 0x3e, 0x3b, 0x38, 0x35, 0x32, 0x2f, 0x2c, 0x29, 0x26, 0x23, 0x20, 0x1d, 0x1a, 0x17, 0x14, 0x11, 0x0e, 0x0b, 0x08, 0x05, 0x02, 0xff, 0xfc, 0xf9, 0xf6, 0xf3, 0xf0, 0xec, 0xe9, 0xe6, 0xe3, 0xe0, 0xdd, 0xda, 0xd7, 0xd4, 0xd1, 0xce, 0xcb, 0xc8, 0xc5, 0xc2, 0xbf, 0xbc, 0xb9, 0xb6, 0xb3, 0xb0, 0xad, 0xaa, 0xa7, 0xa4])
real_ciphertext = bytes([0x5b, 0x58, 0xd9, 0x4a, 0x17, 0xcf, 0x3b, 0x02, 0x56, 0xad, 0x94, 0x1c, 0x8c, 0x59, 0xa9, 0xa8, 0x6b, 0x50, 0xf1, 0xc3, 0x9d, 0x66, 0xd6, 0x36, 0x54, 0xa1, 0x94, 0x35, 0x8b, 0x77, 0x60, 0x34, 0xee, 0x35, 0x4d, 0x6f, 0x7b, 0xb0, 0x22, 0x25, 0x17, 0xc4, 0x57, 0x93, 0x58, 0x43, 0xcc, 0xa7, 0x47, 0xe1, 0xba, 0xdf, 0x0f, 0x9c, 0x9a, 0x82, 0x43, 0x7c, 0xfd, 0xd7, 0xd4, 0x7e, 0x28, 0xe5, 0xc0, 0x8a, 0x79, 0xbd, 0x37, 0x71, 0x88, 0x04, 0x8a, 0xb1, 0x13, 0x72, 0xfa, 0xb2, 0x63, 0x72, 0x44, 0xf4, 0x5e, 0xd5, 0x8a, 0xc2, 0x76, 0x86, 0x84, 0xba, 0x12, 0x97, 0x40, 0xd4, 0x8d, 0xef, 0x3d, 0x20, 0x28, 0x01, 0x66, 0xa4, 0x19, 0x85, 0xa3, 0x6f, 0xef, 0x1c, 0xe5, 0xa1, 0x37, 0xdb, 0xe9, 0x0f, 0x63, 0xc8, 0x41, 0xa6, 0x10, 0x4b, 0x34, 0x63, 0xbd, 0x50, 0xc0, 0x77, 0x87, 0x6d, 0x79, 0xe7, 0x35, 0x6f, 0xa0, 0x36, 0xa8, 0x2b, 0x1a, 0x8d, 0xe5, 0xb4, 0xb2, 0xae, 0xb2, 0xc1, 0x8c, 0xa3, 0x55, 0xcd, 0x6d, 0x50, 0x46, 0x8e, 0xae, 0x97, 0xcd, 0xb7, 0x42, 0xa6, 0xfc, 0x09, 0xcd, 0xd1, 0x0d, 0x39, 0x6b, 0x90, 0x3d, 0x64, 0x4a, 0x61, 0x7a, 0x8c, 0x9a, 0x1e, 0x2c, 0x67, 0xa3, 0x96, 0xf8, 0x48, 0xe2, 0x56, 0x55, 0x60, 0xbe, 0x07, 0xb3, 0x20, 0x0b, 0x74, 0x1a, 0x1d, 0x9e, 0x00, 0xd6, 0x5a, 0xa5, 0xeb, 0x2b, 0xa1, 0x5c, 0xc4, 0x38, 0x62, 0xf3, 0xc5, 0xad, 0x22, 0x41, 0xc9, 0xbe, 0x39, 0xbd, 0x25, 0xf5, 0x6d, 0x8d, 0x9b, 0x3f, 0x09, 0xad, 0x18, 0x08, 0x6e, 0xf6, 0xf4, 0x55, 0xf8, 0x03, 0xf7, 0x3b, 0x2f, 0x3c, 0x7f, 0x2d, 0x37, 0x97, 0x32, 0x0a, 0xae, 0xc2, 0x83, 0x4d, 0xd1, 0xb8, 0x4e, 0xa0, 0xc4, 0x57, 0x4b, 0xa3, 0x27, 0xf6, 0x71, 0x78, 0x0a, 0xc4, 0x2d, 0x72, 0xb6, 0x9c, 0x02, 0x46, 0x2f, 0xb8, 0xa5, 0x19, 0x60, 0x88, 0x22, 0xbd, 0xa3, 0x01, 0x09, 0xec, 0x3d, 0xd1, 0xc5, 0x4a, 0x73, 0x92, 0xfa, 0xc0, 0x58, 0x38, 0xc0, 0x0d, 0xbf, 0x7b, 0x82, 0x6d, 0xbd, 0x5d, 0x51, 0x0e, 0xa4, 0xef, 0xbd, 0x9a, 0x3b, 0x6a, 0x49, 0x0b, 0x16, 0x06, 0xc1, 0x2b, 0xa0, 0x8a, 0xdd, 0x2a, 0x81, 0xbe, 0xfd, 0xb9, 0xba, 0x54, 0xc1, 0xce, 0x73, 0xad, 0xe7, 0xf0, 0x0f, 0x16, 0xc5, 0x69, 0xbb, 0x0e, 
0xcf, 0x83, 0x53, 0xb8, 0x4d, 0x0c, 0xdd, 0x57, 0x5b, 0x85, 0x0b, 0x71, 0xaf, 0x15, 0xe6, 0x85, 0x98, 0xc9, 0xf9, 0x5f, 0xf7, 0xb0, 0xd8, 0xe4, 0xa2, 0xcd, 0x88, 0xe3, 0x33, 0x28, 0xcc, 0xb2, 0x04, 0x73, 0x7c, 0xa2, 0x6a, 0x94, 0x33, 0xeb, 0xfb, 0x26, 0x32, 0xf4, 0x5b, 0x44, 0x17, 0x55, 0x8f, 0x60, 0xb2, 0x48, 0xe0, 0x15, 0x4c, 0xc0, 0xc5, 0x39, 0x9e, 0x64, 0xe3, 0x24, 0x3f, 0xfd, 0xa9, 0x32, 0x57, 0x0d, 0x5a, 0x28, 0x2c, 0x68, 0x36, 0x27, 0x26, 0x2f, 0x14, 0xc2, 0xa0, 0x3e, 0x5a, 0x8a, 0xe7, 0x2b, 0xe1, 0x8c, 0xdb, 0xe6, 0x8c, 0xd4, 0x2f, 0xef, 0x32, 0xdc, 0x61, 0xfc, 0x09, 0xec, 0xeb, 0xbc, 0xb7, 0x28, 0xdd, 0xfb, 0xfd, 0x7b, 0xdf, 0x56, 0x52, 0x36, 0x21, 0xcc, 0x2c, 0xa7, 0xde, 0x4f, 0xcd, 0x33, 0xaa, 0x40, 0x9f, 0x61, 0x64, 0x43, 0xe4, 0xb9, 0x49, 0x0b, 0x67, 0xfe, 0x97, 0xb8, 0x1f, 0x7b, 0x57, 0x45, 0x92, 0x53, 0x93, 0x2b, 0x1c, 0xbe, 0x13, 0x66, 0x81, 0x13, 0x8f, 0xdc, 0xe5, 0x41, 0x26, 0xa3, 0xee, 0x4b, 0x38, 0x24, 0x98, 0x98, 0x7c, 0xf0, 0xaf, 0xf3, 0xf9, 0xec, 0x02, 0xd8, 0xf0, 0x7c, 0x98, 0x64, 0x86, 0x91, 0xc1, 0xe1, 0x3a, 0xf2, 0x42, 0xe3, 0xf4, 0x47, 0x81, 0x04, 0xde, 0xbd, 0x41, 0x21, 0x2f, 0x14, 0x55, 0x86, 0x60, 0x93, 0x5d, 0x3c, 0x9f, 0x5b, 0xf1, 0x4c, 0xac, 0x1a, 0xd3, 0x7e, 0x44, 0xcf, 0x19, 0x59, 0xc0, 0xbe, 0x8e, 0xd4, 0x74, 0x41, 0xd8, 0xaf, 0x47, 0xeb, 0x27, 0xce, 0xc6, 0xb4, 0xfb, 0xc1, 0xfc, 0x10, 0x52, 0x24, 0x15, 0xc7, 0x18, 0x21, 0x7b, 0x56, 0xc5, 0x75, 0x4e, 0x23, 0x38, 0x2b, 0x89, 0x09, 0x9d, 0x5b, 0xbc, 0xad, 0x0d, 0xe2, 0x5e, 0x16, 0xef, 0x26, 0x47, 0x24, 0x3e, 0x88, 0x95, 0xa2, 0x8d, 0x65, 0xde, 0xe6, 0x26, 0x06, 0x4c, 0x20, 0x39, 0xac, 0xd8, 0xf0, 0xb1, 0x92, 0x0a, 0xcd])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_1024_tweak_16(self, test_workers):
my_key = bytes([0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2])
my_tweak = bytes([0x17, 0x08, 0xf9, 0xea, 0xdb, 0xcc, 0xbd, 0xae, 0x9f, 0x90, 0x81, 0x72, 0x63, 0x54, 0x45, 0x36])
my_message = bytes([0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 
0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 
0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 
0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45])
real_ciphertext = bytes([0x97, 0x8f, 0x40, 0xad, 0x49, 0xd2, 0x7c, 0xcb, 0x71, 0xd1, 0x02, 0xd7, 0xac, 0x5c, 0x5d, 0x4b, 0x5d, 0x8b, 0xd7, 0x9a, 0x23, 0x56, 0x13, 0x5b, 0xaa, 0x93, 0x6e, 0x11, 0xe8, 0x71, 0x34, 0x88, 0x16, 0x22, 0x8c, 0xbb, 0x25, 0x7c, 0x69, 0x91, 0xed, 0x34, 0x10, 0x3f, 0x83, 0xf4, 0x9f, 0xeb, 0x31, 0x11, 0xef, 0xcf, 0xc4, 0x06, 0x50, 0x4b, 0x53, 0xb6, 0xf5, 0x32, 0xb6, 0x0b, 0x89, 0x07, 0xd0, 0x44, 0xc3, 0x47, 0x70, 0xec, 0xcb, 0x21, 0x73, 0xa3, 0x73, 0x3d, 0xc6, 0x47, 0x2f, 0x4e, 0xb6, 0xf3, 0xf1, 0x05, 0xd7, 0x14, 0xa7, 0x2d, 0x2c, 0xba, 0xa1, 0x30, 0xad, 0x64, 0x66, 0xce, 0xcd, 0x02, 0xc0, 0x6e, 0xa6, 0x61, 0x24, 0xa8, 0x18, 0xc8, 0xdf, 0x45, 0xf4, 0x7b, 0x44, 0x66, 0x36, 0x38, 0xf8, 0x1b, 0xc1, 0x61, 0xfa, 0x76, 0xf1, 0x7e, 0x3a, 0x4e, 0x97, 0x52, 0x9f, 0xa6, 0x2c, 0xcd, 0xf9, 0xac, 0x19, 0xc5, 0x01, 0xf5, 0x16, 0xf6, 0x4d, 0xc0, 0xce, 0xd9, 0x66, 0x4d, 0x74, 0xd5, 0xd4, 0x05, 0x47, 0x15, 0x70, 0xab, 0x2d, 0x42, 0x45, 0x8c, 0xfa, 0x12, 0xf9, 0x9c, 0x59, 0x37, 0xf1, 0xaa, 0xba, 0xa6, 0x93, 0x6f, 0x02, 0xf2, 0xc7, 0xe3, 0x7c, 0xb8, 0x12, 0xbc, 0x7e, 0x59, 0x32, 0x22, 0x04, 0x99, 0x1e, 0x6d, 0x4c, 0x1c, 0x7d, 0xe6, 0xb4, 0xb8, 0x0e, 0x09, 0xec, 0x71, 0x40, 0xc2, 0x3e, 0x39, 0x21, 0xf8, 0x4d, 0xc8, 0xaf, 0x29, 0x0b, 0x09, 0x38, 0x1a, 0x05, 0x43, 0x40, 0x13, 0x92, 0x30, 0x37, 0x0c, 0x59, 0xa1, 0x7f, 0xc8, 0x66, 0x9a, 0x41, 0x3c, 0x0c, 0x51, 0xca, 0x7e, 0xa6, 0x2f, 0x15, 0x97, 0x40, 0x20, 0xf7, 0xf3, 0xaa, 0x53, 0xd7, 0xab, 0x81, 0x5d, 0x25, 0x7c, 0x95, 0xad, 0xc8, 0x76, 0x85, 0x2c, 0x6e, 0x31, 0x13, 0x66, 0x51, 0xf0, 0xc5, 0xc6, 0x3c, 0xd1, 0x65, 0x7e, 0x36, 0xb6, 0x78, 0xdd, 0x9b, 0x02, 0x98, 0x29, 0xd3, 0xe7, 0x4b, 0x9d, 0x1c, 0x6c, 0x2c, 0xb1, 0x68, 0x7d, 0x1e, 0xf3, 0xb8, 0xe2, 0x43, 0x40, 0x92, 0x04, 0xd2, 0x61, 0x46, 0x12, 0xd8, 0x1b, 0x70, 0x01, 0x67, 0xab, 0x93, 0x75, 0xab, 0x4f, 0x50, 0x68, 0xc9, 0x34, 0xf3, 0xba, 0x14, 0x7e, 0xa2, 0xe9, 0xfb, 0x7d, 0x38, 0x21, 0xcb, 0x67, 0x37, 0xf4, 0x31, 0x77, 0x9f, 0xb9, 0xe1, 0x1d, 0x41, 0x43, 0xf4, 
0x26, 0xa0, 0x07, 0x45, 0xef, 0x17, 0xfa, 0x42, 0x03, 0x5b, 0xc1, 0x44, 0x27, 0x53, 0x93, 0x47, 0x26, 0x7c, 0x39, 0x8d, 0x85, 0x78, 0x47, 0xcd, 0x45, 0xb8, 0xe4, 0x12, 0x9c, 0x03, 0xba, 0x86, 0x56, 0xc3, 0x37, 0xac, 0x96, 0xd7, 0xbe, 0xeb, 0x59, 0x53, 0x07, 0x37, 0xf9, 0xca, 0x94, 0x0d, 0x05, 0xe4, 0x8f, 0x9f, 0x73, 0x80, 0x32, 0x7b, 0x27, 0x58, 0x2c, 0xe2, 0x4b, 0x05, 0x35, 0xc4, 0x05, 0x98, 0x15, 0x06, 0xcb, 0xad, 0x3b, 0x49, 0xb4, 0x56, 0xb4, 0x64, 0xf5, 0xa9, 0xe4, 0x63, 0x0d, 0x42, 0x66, 0x91, 0x68, 0xe0, 0xf7, 0x49, 0xf6, 0xb6, 0xb8, 0x4c, 0x70, 0xee, 0xa9, 0x23, 0x62, 0x0a, 0xee, 0x0c, 0xe5, 0x9e, 0x33, 0x87, 0x4f, 0x37, 0xa9, 0x4b, 0x98, 0x4b, 0x5c, 0x67, 0x60, 0xab, 0xc1, 0x67, 0x1e, 0x40, 0xea, 0x94, 0xa8, 0xcc, 0x66, 0x0a, 0xaa, 0x26, 0x69, 0x43, 0x3b, 0xd0, 0xe1, 0x48, 0xf3, 0xbc, 0xf1, 0x69, 0x50, 0xe4, 0xa6, 0x21, 0x10, 0x9d, 0x4a, 0x6f, 0x71, 0xb4, 0x92, 0xee, 0x06, 0x33, 0xd3, 0x33, 0xba, 0xdf, 0x37, 0xd2, 0x07, 0x54, 0x4a, 0xbd, 0xe3, 0x65, 0xe7, 0x5b, 0x4d, 0x5d, 0x9e, 0xb4, 0x18, 0xd1, 0x66, 0xcc, 0x39, 0x6b, 0xb9, 0x55, 0x75, 0xee, 0x81, 0x35, 0x80, 0x51, 0x94, 0xdf, 0x3a, 0xa2, 0x62, 0x87, 0x1c, 0xdc, 0xe7, 0x98, 0x22, 0x33, 0xc0, 0xb8, 0x84, 0x82, 0x7d, 0xab, 0x45, 0x6e, 0xb9, 0xa2, 0xca, 0xe6, 0x82, 0x09, 0x82, 0xad, 0x9c, 0x67, 0x56, 0xd0, 0xb9, 0xa0, 0x2d, 0xa9, 0xeb, 0xec, 0x30, 0x74, 0x93, 0x6a, 0x5c, 0x49, 0xaa, 0x45, 0x40, 0x10, 0x2b, 0x84, 0x20, 0xe4, 0x17, 0x1c, 0x93, 0xa3, 0x29, 0x71, 0xe9, 0xdc, 0x2e, 0x68, 0x60, 0x70, 0x0a, 0x7a, 0x14, 0x13, 0xd1, 0x34, 0xcb, 0x7b, 0x68, 0x4e, 0xb4, 0xa8, 0x35, 0xf3, 0x99, 0x75, 0x75, 0x55, 0xa4, 0xae, 0x9e, 0xaf, 0x08, 0xff, 0x4a, 0x05, 0xbf, 0x4c, 0x82, 0xe7, 0x60, 0xc3, 0x2a, 0x9e, 0x62, 0x09, 0x3e, 0x3e, 0x25, 0xce, 0x16, 0xab, 0x84, 0x41, 0xfc, 0x15, 0x12, 0x75, 0x91, 0xab, 0xaa, 0x36, 0x10, 0x82, 0xdd, 0x64, 0x26, 0x54, 0x72, 0x5c, 0x96, 0xf0, 0x44, 0xc0, 0x5f, 0xe0, 0x9d, 0xdc, 0xe7, 0x82, 0x91, 0x56, 0x8b, 0xe0, 0x72, 0x5f, 0x7f, 0x80, 0xe3, 0x15, 0x89, 0x08, 0x78, 0x80, 0x0b, 0xc5, 0xf4, 
0x03, 0xed, 0xb1, 0xb9, 0xbb, 0x73, 0x47, 0x42, 0x6d, 0x64, 0x56, 0x74, 0x98, 0xff, 0x58, 0x07, 0x9d, 0x34, 0xa3, 0xd7, 0x72, 0xd3, 0x2d, 0xc4, 0x3f, 0x8e, 0xfa, 0xcb, 0xdc, 0xc3, 0x45, 0xe0, 0x7b, 0x56, 0x61, 0xfd, 0xe9, 0xdc, 0xb8, 0xd3, 0xa7, 0x6d, 0xc2, 0x13, 0x7e, 0x01, 0xe4, 0xf0, 0xab, 0x57, 0x30, 0x28, 0xde, 0x87, 0xd5, 0x92, 0xd3, 0x5f, 0xae, 0xcd, 0x35, 0x85, 0xe4, 0x71, 0xaf, 0xcc, 0x78, 0x21, 0x55, 0xb4, 0x07, 0xec, 0x11, 0x21, 0xe4, 0x37, 0xd6, 0xea, 0x14, 0xbd, 0xc1, 0xb0, 0x94, 0x6a, 0x1b, 0x94, 0xba, 0x6a, 0x43, 0x67, 0x42, 0x56, 0xbd, 0x83, 0x87, 0x39, 0x55, 0x8e, 0x7d, 0x99, 0x30, 0x38, 0xea, 0xcc, 0x23, 0x73, 0x3b, 0xf0, 0x80, 0x4a, 0x32, 0x9d, 0x8c, 0x0f, 0x2b, 0xe3, 0xf2, 0x75, 0xa1, 0xf6, 0xbd, 0xbf, 0x87, 0x9a, 0x1b, 0x22, 0xea, 0xcb, 0x82, 0xca, 0x00, 0x0a, 0xb8, 0xbc, 0x78, 0x75, 0xc3, 0x3c, 0xce, 0xb9, 0xc2, 0xa0, 0x2e, 0xb8, 0x26, 0x22, 0xf3, 0x00, 0x3d, 0x74, 0x5d, 0x52, 0x6a, 0x50, 0xc5, 0xfa, 0x33, 0x11, 0x7a, 0xe4, 0x1b, 0x2d, 0xe0, 0x5f, 0xe0, 0xb6, 0x29, 0xc6, 0xe9, 0x16, 0x11, 0xbf, 0x85, 0xf2, 0x1a, 0xf4, 0x3b, 0x8b, 0x9a, 0xcf, 0x33, 0x41, 0x37, 0x2a, 0x2d, 0xaa, 0x75, 0xaa, 0x1f, 0x8a, 0x53, 0xe2, 0x7b, 0xe0, 0x17, 0x3c, 0xa0, 0x29, 0xc5, 0x10, 0xfa, 0xd8, 0xcf, 0x32, 0x33, 0x07, 0x10, 0x42, 0x72, 0xb0, 0x8c, 0x9a, 0xe4, 0x89, 0x30, 0x25, 0x20, 0xae, 0x6e, 0x81, 0xc6, 0x73, 0xfc, 0xee, 0x2c, 0x30, 0xec, 0x85, 0xcf, 0x7e, 0x07, 0x21, 0xaa, 0x90, 0x41, 0xc6, 0x9d, 0x8f, 0x5b, 0x9c, 0x59, 0xcc, 0xb0, 0xfc, 0x20, 0x23, 0xed, 0xf9, 0x13, 0x35, 0x0d, 0xe3, 0x6c, 0x50, 0xd4, 0x6f, 0x39, 0x00, 0x87, 0xb1, 0x60, 0xf3, 0x4f, 0x85, 0x63, 0x45, 0x80, 0x7f, 0x03, 0x80, 0xa7, 0x34, 0xe4, 0xf5, 0x84, 0x7e, 0x71, 0x1b, 0x69, 0xc1, 0x88, 0xa9, 0x98, 0x94, 0x3c, 0x98, 0x6c, 0x0d, 0x4a, 0x33, 0xab, 0x38, 0xb5, 0xab, 0x56, 0x61, 0x07, 0x6d, 0xc5, 0x70, 0x83, 0xd7, 0xe0, 0x5a, 0xba, 0x5c, 0x9c, 0x40, 0xdd, 0x41, 0x59, 0xe9, 0x34, 0xcf, 0x41, 0x1d, 0x2c, 0x28, 0xa3, 0x40, 0xa9, 0x2b, 0x1b, 0x1a, 0xa7, 0x15, 0x25, 0x00, 0x90, 0xb5, 0x70, 0xea, 0xcb, 
0xfd, 0x7e, 0x88, 0xaa, 0x75, 0xad, 0xaa, 0xc8, 0x91, 0x0f, 0xce, 0xd9, 0x01, 0x71, 0x2d, 0x70, 0x9a, 0x10, 0x50, 0xd3, 0xa5, 0x49, 0x01, 0xbc, 0xaa, 0xea, 0x2a, 0x11, 0x10])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_1198_tweak_16(self, test_workers):
my_key = bytes([0x13, 0x94, 0x14, 0x95, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b])
my_tweak = bytes([0x79, 0x76, 0x73, 0x70, 0x6d, 0x6a, 0x67, 0x64, 0x61, 0x5e, 0x5b, 0x58, 0x55, 0x52, 0x4f, 0x4c])
my_message = bytes([0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec, 0x6c, 0xed, 0x6d, 0xee, 0x6e, 0xef, 0x6f, 0xf0, 0x70, 0xf1, 0x71, 0xf2, 0x72, 0xf3, 0x73, 0xf4, 0x74, 0xf5, 0x75, 0xf6, 0x76, 0xf7, 0x77, 0xf8, 0x78, 0xf9, 0x79, 0xfa, 0x7a, 0xfb, 0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03, 0x83, 0x04, 0x84, 0x05, 0x85, 0x06, 0x86, 0x07, 0x87, 0x08, 0x88, 0x09, 0x89, 0x0a, 0x8a, 0x0b, 0x8b, 0x0c, 0x8c, 0x0d, 0x8d, 0x0e, 0x8e, 0x0f, 0x8f, 0x10, 0x90, 0x11, 0x91, 0x12, 0x92, 0x13, 0x93, 0x14, 0x94, 0x15, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 
0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec, 0x6c, 0xed, 0x6d, 0xee, 0x6e, 0xef, 0x6f, 0xf0, 0x70, 0xf1, 0x71, 0xf2, 0x72, 0xf3, 0x73, 0xf4, 0x74, 0xf5, 0x75, 0xf6, 0x76, 0xf7, 0x77, 0xf8, 0x78, 0xf9, 0x79, 0xfa, 0x7a, 0xfb, 0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03, 0x83, 0x04, 0x84, 0x05, 0x85, 0x06, 0x86, 0x07, 0x87, 0x08, 0x88, 0x09, 0x89, 0x0a, 0x8a, 0x0b, 0x8b, 0x0c, 0x8c, 0x0d, 0x8d, 0x0e, 0x8e, 0x0f, 0x8f, 0x10, 0x90, 0x11, 0x91, 0x12, 0x92, 0x13, 0x93, 0x14, 0x94, 0x15, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 
0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec, 0x6c, 0xed, 0x6d, 0xee, 0x6e, 0xef, 0x6f, 0xf0, 0x70, 0xf1, 0x71, 0xf2, 0x72, 0xf3, 0x73, 0xf4, 0x74, 0xf5, 0x75, 0xf6, 0x76, 0xf7, 0x77, 0xf8, 0x78, 0xf9, 0x79, 0xfa, 0x7a, 0xfb, 0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03, 0x83, 0x04, 0x84, 0x05, 0x85, 0x06, 0x86, 0x07, 0x87, 0x08, 0x88, 0x09, 0x89, 0x0a, 0x8a, 0x0b, 0x8b, 0x0c, 0x8c, 0x0d, 0x8d, 0x0e, 0x8e, 0x0f, 0x8f, 0x10, 0x90, 0x11, 0x91, 0x12, 0x92, 0x13, 0x93, 0x14, 0x94, 0x15, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec, 0x6c, 0xed, 0x6d, 0xee, 0x6e, 0xef, 0x6f, 0xf0, 0x70, 0xf1, 0x71, 0xf2, 0x72, 0xf3, 0x73, 0xf4, 0x74, 0xf5, 0x75, 0xf6, 0x76, 0xf7, 0x77, 0xf8, 0x78, 0xf9, 0x79, 0xfa, 0x7a, 0xfb, 0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03, 0x83, 0x04, 0x84, 0x05, 0x85, 0x06, 0x86, 0x07, 
0x87, 0x08, 0x88, 0x09, 0x89, 0x0a, 0x8a, 0x0b, 0x8b, 0x0c, 0x8c, 0x0d, 0x8d, 0x0e, 0x8e, 0x0f, 0x8f, 0x10, 0x90, 0x11, 0x91, 0x12, 0x92, 0x13, 0x93, 0x14, 0x94, 0x15, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec])
real_ciphertext = bytes([0xca, 0x78, 0x21, 0xd4, 0xd7, 0xc5, 0x32, 0xb9, 0xa6, 0x69, 0x6c, 0xe5, 0x21, 0x17, 0x87, 0x60, 0xc4, 0xf8, 0x94, 0x6e, 0x64, 0x68, 0xf7, 0x8b, 0x80, 0x27, 0xba, 0x89, 0xcd, 0x6a, 0x9b, 0xb2, 0xc3, 0xd9, 0x07, 0xa9, 0xc3, 0xe4, 0x60, 0xf6, 0x35, 0xbd, 0x49, 0xf4, 0x3f, 0x7c, 0x65, 0x83, 0x36, 0x4f, 0x61, 0x61, 0x99, 0x51, 0xc6, 0xb9, 0xa8, 0x9f, 0x7b, 0x5f, 0x50, 0x0c, 0x6e, 0xcb, 0x66, 0xc5, 0x8f, 0x79, 0x6c, 0xb8, 0x85, 0x2d, 0x54, 0x60, 0xc6, 0x5f, 0x38, 0xcd, 0x8d, 0x33, 0xcc, 0x27, 0x3d, 0x61, 0xf2, 0xd4, 0x70, 0x1f, 0x86, 0x2f, 0xc5, 0xd5, 0x21, 0x09, 0x2e, 0x95, 0xe9, 0x7c, 0x27, 0xe1, 0x9f, 0xfb, 0xe0, 0x8b, 0x81, 0x9c, 0xab, 0x82, 0x43, 0x42, 0x4c, 0xfd, 0xf7, 0xf2, 0x2e, 0x0b, 0x6d, 0x36, 0x4a, 0xf4, 0xb0, 0x9d, 0x68, 0x65, 0x1d, 0x96, 0xdf, 0x0a, 0xa8, 0x2e, 0xc3, 0xed, 0xd4, 0xb8, 0xb8, 0xc2, 0xc5, 0x74, 0x4b, 0xef, 0xa9, 0x6b, 0xde, 0x6d, 0x5b, 0x25, 0x84, 0xfb, 0x8b, 0x31, 0x4a, 0x8d, 0x77, 0xdf, 0xca, 0x5a, 0xfb, 0x45, 0x7e, 0xf2, 0x58, 0x64, 0x86, 0xc3, 0x41, 0x1b, 0xe1, 0x18, 0xe5, 0xe6, 0x15, 0x3a, 0x34, 0x1e, 0xb6, 0x94, 0x08, 0x50, 0x62, 0xdb, 0x3f, 0x0d, 0xbe, 0x19, 0xd8, 0x8f, 0xe2, 0x9b, 0xda, 0x19, 0xd1, 0x6d, 0xcb, 0x70, 0xbe, 0xd5, 0xce, 0xea, 0xe3, 0x98, 0x5d, 0x1f, 0xb2, 0xf4, 0x24, 0x64, 0x01, 0xec, 0x8f, 0x6b, 0xca, 0xe5, 0xd8, 0xcf, 0x01, 0xa0, 0x03, 0x1c, 0xc2, 0x6b, 0xd4, 0x88, 0xb7, 0xb1, 0x8b, 0x83, 0xc9, 0x22, 0x17, 0xae, 0xca, 0xa9, 0x01, 0x69, 0x39, 0x92, 0x87, 0x95, 0x63, 0xa5, 0xab, 0xe9, 0x0b, 0x3a, 0xaa, 0xc5, 0x28, 0xd9, 0x77, 0xfb, 0x38, 0x19, 0x72, 0x01, 0x2c, 0xb5, 0x56, 0x9a, 0xbf, 0xc4, 0x1c, 0x79, 0x51, 0x07, 0x2f, 0xe6, 0x7b, 0x5c, 0x5a, 0xd4, 0x25, 0xab, 0x2f, 0xdb, 0x18, 0xd7, 0x82, 0x84, 0xaa, 0x01, 0xd0, 0xe0, 0xe1, 0x6b, 0xaf, 0x89, 0x42, 0xe4, 0x0a, 0x53, 0x11, 0x7c, 0xd8, 0x5c, 0x02, 0x99, 0x63, 0xa1, 0x54, 0x99, 0xf9, 0xa7, 0x16, 0xf5, 0x8c, 0x70, 0x5f, 0x13, 0xfb, 0x9c, 0x59, 0xef, 0x3b, 0x3d, 0xd8, 0x4e, 0xad, 0x6a, 0xc6, 0xec, 0x17, 0x24, 0x16, 0x0f, 0x8d, 0x0e, 0x8e, 0x73, 0x92, 
0xe7, 0xb8, 0x46, 0x29, 0xb1, 0xc3, 0x81, 0x75, 0x76, 0xc4, 0x37, 0x07, 0x79, 0xba, 0xcc, 0x41, 0xb6, 0x86, 0xe1, 0xf4, 0xfd, 0x99, 0x07, 0x29, 0x75, 0xb6, 0x4a, 0x53, 0xa8, 0xc9, 0x7d, 0x63, 0xbf, 0xf6, 0x58, 0xbd, 0xf8, 0x03, 0xef, 0x01, 0xf3, 0x39, 0xb0, 0xae, 0xf4, 0x62, 0x48, 0x30, 0x66, 0x68, 0x72, 0x5e, 0x22, 0x95, 0x87, 0x84, 0x3e, 0xf0, 0x63, 0x10, 0xfc, 0x0d, 0x4d, 0xcc, 0x26, 0x0a, 0x42, 0xff, 0x76, 0xe2, 0xde, 0xe1, 0x81, 0xab, 0xa1, 0x0e, 0x58, 0xef, 0xd4, 0x52, 0x6d, 0x91, 0xc4, 0xf7, 0x26, 0x9c, 0x7a, 0x45, 0x9c, 0x36, 0xd8, 0x50, 0xf8, 0x24, 0x17, 0x4c, 0x0a, 0xfc, 0xe3, 0xce, 0x9a, 0x6d, 0xbb, 0xfd, 0x33, 0x32, 0x73, 0x18, 0xc4, 0x97, 0xd7, 0x11, 0x22, 0x37, 0x00, 0xa0, 0xe4, 0xd5, 0x0f, 0xca, 0x36, 0xaa, 0xf7, 0x9c, 0x87, 0x6e, 0xf7, 0x5c, 0x2c, 0xd9, 0x00, 0xcf, 0xe0, 0xe9, 0xf5, 0x1a, 0xe1, 0xf3, 0xe5, 0xef, 0x93, 0x12, 0xb8, 0x22, 0x84, 0xd3, 0x8f, 0x0e, 0x34, 0x61, 0x73, 0x32, 0xc7, 0x96, 0x1b, 0x49, 0xa9, 0xa7, 0xaf, 0x88, 0x29, 0x62, 0xb0, 0xe5, 0x8c, 0x01, 0x27, 0x00, 0x25, 0xc0, 0x0e, 0x32, 0xc1, 0xe7, 0x5c, 0xb0, 0xeb, 0x9a, 0xdb, 0x61, 0xad, 0x9a, 0x9d, 0xa2, 0x51, 0x63, 0x80, 0x30, 0xfe, 0x0a, 0xf7, 0x86, 0xee, 0x58, 0xd0, 0x7c, 0xd4, 0xba, 0xd1, 0x49, 0x6d, 0x8f, 0x91, 0x8a, 0xeb, 0x24, 0x22, 0xbc, 0x2f, 0x64, 0x10, 0xd2, 0xfa, 0xb5, 0x7d, 0x68, 0xfa, 0x81, 0xa2, 0x5b, 0x04, 0xe2, 0x77, 0x2a, 0xe3, 0x6e, 0xa5, 0x89, 0x89, 0x5c, 0x70, 0x35, 0xd7, 0x5b, 0x96, 0xf6, 0x19, 0xe1, 0xa3, 0x8a, 0x69, 0x58, 0x1d, 0x3f, 0x9b, 0x26, 0xe5, 0x1d, 0xb4, 0x9c, 0x20, 0xd1, 0x85, 0x3d, 0x73, 0x6f, 0x9c, 0xf6, 0xb0, 0xb6, 0x04, 0x6a, 0x44, 0xb8, 0x88, 0xdd, 0x14, 0xb7, 0x0a, 0x2b, 0xf6, 0xb3, 0xef, 0x76, 0xf4, 0x99, 0x98, 0x39, 0x4d, 0x4a, 0x18, 0xd3, 0x5a, 0x1a, 0xa3, 0xb9, 0xbe, 0x3f, 0xb7, 0x98, 0x36, 0x9a, 0xe4, 0x59, 0x38, 0x65, 0x44, 0x8d, 0xc0, 0x8d, 0xc5, 0xdc, 0xef, 0xd3, 0x99, 0xe0, 0xe6, 0xda, 0x6a, 0x56, 0xd0, 0xe5, 0x70, 0xcf, 0x56, 0x83, 0x76, 0x64, 0xb3, 0x85, 0x07, 0x9a, 0x22, 0xb6, 0xff, 0x8c, 0xee, 0x56, 0xb5, 0xaa, 0xd2, 0xee, 0x4a, 
0xc8, 0x35, 0x2b, 0xda, 0x46, 0xe6, 0x95, 0x5c, 0x03, 0x35, 0xbd, 0xc5, 0x31, 0xe9, 0x1f, 0x5d, 0xa7, 0x30, 0x00, 0xe3, 0x40, 0x6f, 0x96, 0xb7, 0x2d, 0x80, 0x34, 0x41, 0xd3, 0xa5, 0x46, 0x75, 0xe8, 0x50, 0x48, 0xa2, 0xbc, 0x49, 0xb3, 0xba, 0x32, 0x78, 0x96, 0xfe, 0x14, 0xb6, 0x92, 0x64, 0x1e, 0x07, 0x59, 0x59, 0xae, 0xac, 0xb4, 0x85, 0x54, 0x9c, 0xf1, 0xa3, 0xb7, 0x3b, 0xb7, 0x94, 0x4c, 0xba, 0xb0, 0x2b, 0xa1, 0xb6, 0x60, 0x9d, 0x21, 0x32, 0xba, 0x66, 0x78, 0x5f, 0x84, 0x7b, 0x14, 0x24, 0xbd, 0xb1, 0xe5, 0xa6, 0x59, 0x17, 0x9b, 0xb5, 0x62, 0xfc, 0x1b, 0xd0, 0x63, 0xb0, 0x83, 0x6d, 0xb8, 0x26, 0xe4, 0x3a, 0x0d, 0xfe, 0x29, 0x7c, 0xf8, 0xfe, 0x88, 0x22, 0x64, 0x1e, 0x3f, 0xb9, 0x3d, 0x66, 0x6e, 0x2e, 0x07, 0xd4, 0xfd, 0x17, 0xcd, 0x46, 0x01, 0xa2, 0x78, 0x1c, 0xef, 0xb9, 0x37, 0x14, 0xec, 0xfa, 0x1c, 0x22, 0x17, 0x8e, 0x63, 0xfa, 0x7a, 0x24, 0xf6, 0x03, 0x50, 0x79, 0x45, 0xf7, 0xfd, 0xfe, 0xdc, 0x23, 0x2f, 0xc2, 0xe9, 0x8e, 0xb9, 0x9f, 0xb7, 0x4b, 0xd7, 0x17, 0xa1, 0xee, 0xe7, 0x6b, 0x29, 0x83, 0x54, 0xe6, 0xc4, 0x39, 0x57, 0x83, 0xdc, 0x57, 0x34, 0x75, 0x15, 0x7a, 0x5a, 0x60, 0xda, 0xfd, 0x83, 0x08, 0x50, 0xf5, 0xa8, 0x16, 0xa2, 0xc7, 0xc3, 0xd5, 0x5f, 0xa5, 0xdf, 0xec, 0x46, 0x11, 0x55, 0xf4, 0xaf, 0xf4, 0x38, 0xd8, 0x01, 0xb2, 0xb1, 0xb9, 0x50, 0x03, 0xf7, 0xdc, 0x5e, 0xbc, 0x15, 0xc1, 0x04, 0x61, 0x43, 0x93, 0xaa, 0x47, 0x7c, 0x28, 0x1b, 0x19, 0x06, 0x43, 0x43, 0x3f, 0x2e, 0x90, 0xcf, 0xf5, 0x71, 0x2d, 0x75, 0x27, 0x81, 0x8d, 0x13, 0x7d, 0xd1, 0xe6, 0xe3, 0xc8, 0x60, 0xe8, 0xe0, 0x8d, 0x93, 0xc4, 0xbe, 0xaa, 0x39, 0x08, 0x21, 0xe0, 0x39, 0x7e, 0x81, 0x43, 0x20, 0x7a, 0x42, 0xd8, 0x19, 0x82, 0x3f, 0xfe, 0xb1, 0xf6, 0x1b, 0x0b, 0x98, 0x2a, 0x4c, 0x7c, 0x9a, 0xe4, 0x4b, 0x97, 0x78, 0x41, 0xc0, 0xe3, 0xaa, 0xd8, 0x82, 0x3a, 0x64, 0x86, 0x6b, 0x02, 0x14, 0xf8, 0xf1, 0xe7, 0x75, 0x9f, 0x46, 0xd4, 0x54, 0x8c, 0xa5, 0x85, 0xa5, 0xbd, 0xb0, 0xda, 0x1c, 0x3a, 0x43, 0x80, 0x99, 0xff, 0x29, 0x1f, 0xef, 0xaf, 0xe0, 0x2e, 0x97, 0x4b, 0x8a, 0xe1, 0xf7, 0x67, 0x3b, 0x9c, 0x54, 
0x3c, 0x3f, 0xf4, 0x85, 0xd6, 0xe0, 0xf2, 0x37, 0xb3, 0x85, 0x4f, 0xaf, 0x1a, 0x5e, 0x63, 0xb9, 0x65, 0x93, 0x21, 0xda, 0x4b, 0x80, 0xfe, 0x1b, 0x6b, 0xa8, 0x42, 0xbc, 0x0b, 0xbb, 0xd1, 0xce, 0xc8, 0x5b, 0x31, 0x46, 0x22, 0x23, 0x3b, 0x3e, 0xe3, 0x4b, 0x09, 0xda, 0xbe, 0xe7, 0x1d, 0x6d, 0x3b, 0x64, 0x5e, 0xcc, 0x7d, 0x72, 0x69, 0xb1, 0xfa, 0x7a, 0xc6, 0xcf, 0x9b, 0x9e, 0x24, 0x18, 0x96, 0x59, 0xb7, 0x07, 0xd2, 0xe0, 0xed, 0xac, 0xf0, 0x41, 0x1e, 0xa5, 0x11, 0xe3, 0x95, 0xfe, 0x95, 0xca, 0x8d, 0x82, 0x88, 0x79, 0x29, 0x36, 0x1c, 0x8d, 0x95, 0xa8, 0x04, 0x1f, 0x20, 0x24, 0x72, 0x5b, 0xd0, 0xc0, 0x03, 0xe8, 0x69, 0x6f, 0x07, 0x0e, 0x24, 0xe4, 0xed, 0xbb, 0xae, 0xa3, 0xdd, 0xc7, 0xca, 0xd5, 0xf3, 0xc7, 0x7e, 0x38, 0x63, 0x4c, 0x3b, 0xf8, 0x6d, 0x76, 0xd6, 0xdc, 0x14, 0x9f, 0xee, 0x75, 0x4c, 0x9e, 0xb1, 0xd1, 0xeb, 0xbf, 0x4f, 0xe9, 0x6a, 0x86, 0x78, 0x64, 0x4e, 0x58, 0xec, 0xb5, 0x81, 0x77, 0x2a, 0x43, 0xf5, 0x70, 0x16, 0x6b, 0x12, 0x2f, 0xf6, 0x6e, 0x10, 0xd1, 0x36, 0x07, 0x7d, 0x36, 0xf1, 0xf9, 0xc4, 0x87, 0x43, 0x19, 0x03, 0x3a, 0xca, 0xe5, 0x04, 0x3e, 0x69, 0x04, 0xee, 0x82, 0xa7, 0xc6, 0x4b, 0x7f, 0xbd, 0xc9, 0x2a, 0x8f, 0x4c, 0x69, 0xfb, 0x1e, 0xa4, 0x58, 0xe6, 0x86, 0x30, 0x10, 0xce, 0xa7, 0x86])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_1199_tweak_16(self, test_workers):
my_key = bytes([0x1a, 0x1a, 0x1a, 0x1a, 0x1a, 0x1a, 0x1a, 0x1a, 0x1a, 0x1a, 0x1a, 0x1a, 0x1a, 0x1a, 0x1a, 0x1a])
my_tweak = bytes([0x80, 0x79, 0x72, 0x6b, 0x64, 0x5d, 0x56, 0x4f, 0x48, 0x41, 0x3a, 0x33, 0x2c, 0x25, 0x1e, 0x17])
my_message = bytes([0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 
0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 
0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 
0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd, 0xbd])
real_ciphertext = bytes([0x95, 0xa2, 0x16, 0x80, 0x54, 0x16, 0xf3, 0xe5, 0x83, 0x05, 0x50, 0x16, 0xf2, 0x03, 0xec, 0xbb, 0x7b, 0xb9, 0x53, 0x24, 0x5f, 0x74, 0x64, 0xdf, 0x0a, 0xa1, 0x91, 0x33, 0x18, 0xed, 0xa5, 0x7d, 0x23, 0x1b, 0xb8, 0x06, 0xc7, 0x6a, 0x4a, 0x7b, 0xcd, 0x5b, 0x85, 0x8b, 0x8a, 0x60, 0x95, 0x46, 0x73, 0xb4, 0xdd, 0x9e, 0xb8, 0xc5, 0xe4, 0xae, 0xd4, 0xd2, 0x69, 0x9d, 0xd5, 0x74, 0xdb, 0x5e, 0x5e, 0x74, 0x2b, 0x78, 0x6b, 0xdb, 0xb4, 0xf8, 0xed, 0xd4, 0x94, 0x07, 0xea, 0xf5, 0xf6, 0x90, 0x5b, 0x20, 0x7e, 0x83, 0x47, 0xb6, 0xcb, 0xb7, 0x93, 0xf6, 0x0b, 0x34, 0x6e, 0xfe, 0x59, 0xd4, 0x71, 0x58, 0x73, 0x36, 0x11, 0x54, 0xfc, 0x55, 0x8b, 0x8f, 0xd1, 0x58, 0xcb, 0x96, 0x36, 0xe9, 0x82, 0x93, 0x83, 0x65, 0xc6, 0x05, 0x57, 0x89, 0xf2, 0x00, 0x93, 0x89, 0x1c, 0xa4, 0x45, 0x40, 0xcc, 0x1a, 0x4a, 0xdd, 0xc7, 0xdb, 0x9d, 0x8b, 0x39, 0x82, 0x1b, 0xf9, 0xfd, 0x7a, 0xbe, 0xc2, 0xfa, 0x48, 0xc7, 0x61, 0xbe, 0x33, 0xe3, 0xe7, 0x29, 0x45, 0x43, 0xb4, 0x64, 0xb6, 0xb0, 0x9a, 0xe0, 0x44, 0xa0, 0x5a, 0xe0, 0x35, 0x4a, 0xdb, 0x74, 0xfa, 0x2c, 0x36, 0xed, 0xe4, 0xb4, 0xa3, 0x76, 0x15, 0x46, 0xb0, 0x6d, 0x02, 0x4c, 0x78, 0x48, 0x94, 0x35, 0xf5, 0x7c, 0xfa, 0x3d, 0xd5, 0x17, 0xae, 0x5e, 0xab, 0x18, 0xf3, 0xb7, 0x62, 0x52, 0xcf, 0xd2, 0x6e, 0x35, 0x5d, 0xd6, 0xd5, 0x95, 0xcf, 0x2f, 0xf1, 0x7e, 0x1f, 0x55, 0x73, 0x6b, 0x4f, 0x65, 0x6b, 0x84, 0xf8, 0x73, 0xc8, 0xcc, 0x06, 0xca, 0xca, 0x83, 0x31, 0x66, 0x85, 0x74, 0x0c, 0xe8, 0x62, 0x86, 0xcf, 0xfd, 0x11, 0x1d, 0xf4, 0x7f, 0x54, 0x21, 0xa5, 0xde, 0x9b, 0xe0, 0xcb, 0x5d, 0x9d, 0x8b, 0xe0, 0x52, 0x2b, 0xee, 0xef, 0x26, 0x08, 0x7d, 0x8b, 0x94, 0x21, 0x6f, 0xe8, 0x3b, 0x90, 0x25, 0x5c, 0xb8, 0x82, 0xc2, 0xb8, 0x53, 0xd2, 0x99, 0xf2, 0x1a, 0xf2, 0x46, 0x13, 0x98, 0xfc, 0x14, 0xda, 0xdb, 0x99, 0x32, 0xc9, 0x11, 0xa2, 0x05, 0xf8, 0x60, 0x93, 0x8b, 0x35, 0xd8, 0xb2, 0x2c, 0x61, 0x76, 0xa1, 0x03, 0x24, 0xaa, 0x5c, 0x89, 0x4f, 0x3d, 0xc2, 0xc0, 0xc0, 0xb9, 0x46, 0xac, 0xa7, 0x3e, 0xcc, 0x07, 0xd2, 0xca, 0xf4, 0xc2, 0xac, 0x48, 0x15, 0xe7, 
0xe4, 0x0a, 0x1c, 0xc3, 0x6e, 0x32, 0xb5, 0x04, 0x7a, 0x2b, 0x2f, 0x6f, 0x6f, 0xc3, 0x7b, 0x54, 0xd4, 0x34, 0xb6, 0x43, 0xfa, 0xff, 0x22, 0xe5, 0x59, 0xd4, 0xc0, 0x89, 0xeb, 0xc7, 0xc4, 0xce, 0xb2, 0xbe, 0x95, 0x89, 0xb8, 0xa2, 0x6c, 0xcf, 0x25, 0x2f, 0x93, 0x24, 0x8e, 0x0e, 0x05, 0x1a, 0xc0, 0x12, 0x7b, 0xfc, 0x47, 0xf9, 0x2f, 0x50, 0xce, 0xde, 0x27, 0x8a, 0xbf, 0xe3, 0x91, 0x57, 0xcd, 0x51, 0xb5, 0x26, 0x48, 0xc6, 0xc5, 0xb0, 0x54, 0x90, 0xbb, 0x85, 0x7e, 0x10, 0x8f, 0xd9, 0xea, 0x3c, 0x6b, 0x5c, 0x4d, 0xc7, 0x6f, 0x6d, 0x3d, 0x0d, 0x16, 0x14, 0x81, 0x73, 0xe2, 0xa6, 0xeb, 0x80, 0xcf, 0xa9, 0x30, 0xf8, 0x31, 0xc4, 0xad, 0xfb, 0x25, 0xf8, 0xe7, 0x7c, 0xed, 0x77, 0x11, 0x5f, 0x37, 0x19, 0x94, 0x0f, 0xd2, 0x4a, 0xcc, 0xac, 0xf0, 0xf4, 0xec, 0xdd, 0x29, 0xee, 0x27, 0x01, 0x09, 0x2c, 0xae, 0xc7, 0xb0, 0xed, 0x94, 0x70, 0x2f, 0xa1, 0x07, 0x88, 0x4d, 0x3c, 0x4e, 0x84, 0x17, 0x12, 0xbb, 0xbf, 0xba, 0xba, 0x35, 0x10, 0x5f, 0x46, 0x6a, 0xba, 0x50, 0x30, 0x05, 0xa1, 0x82, 0x5c, 0xb7, 0xb8, 0x57, 0x1b, 0x9f, 0x3f, 0x6c, 0x9e, 0x58, 0x8a, 0x9e, 0x12, 0x78, 0xa7, 0x40, 0xe4, 0xd1, 0xa8, 0x31, 0x67, 0x98, 0x9d, 0x7f, 0x1a, 0xd4, 0xbb, 0xf9, 0x3b, 0x3a, 0xa1, 0xfc, 0x4b, 0x89, 0xb5, 0x82, 0xd8, 0x1a, 0x7d, 0x04, 0x7b, 0x36, 0x2f, 0x3b, 0x6c, 0x32, 0x8a, 0x71, 0xd3, 0x28, 0x48, 0x1f, 0x64, 0x74, 0x95, 0x4c, 0xda, 0xba, 0x76, 0x46, 0xfc, 0x97, 0xf7, 0xbb, 0x3e, 0x96, 0xff, 0xb8, 0xed, 0x59, 0x03, 0x29, 0x6f, 0x97, 0x00, 0x13, 0xe2, 0x82, 0xda, 0x0b, 0x3c, 0x65, 0xf6, 0x34, 0x53, 0x4f, 0xb4, 0x68, 0x8b, 0x99, 0x20, 0xcb, 0x12, 0xf1, 0x20, 0x29, 0x6e, 0xf4, 0x43, 0xd5, 0x56, 0xdc, 0x6c, 0xd2, 0x2b, 0xb5, 0x65, 0x2b, 0x32, 0x93, 0x38, 0x88, 0x3b, 0xb5, 0xe0, 0xc3, 0xe9, 0x56, 0x76, 0x8d, 0x8c, 0xf6, 0xd1, 0x25, 0x9c, 0xc1, 0xef, 0xfe, 0xc2, 0x43, 0x63, 0x73, 0xa4, 0xd5, 0x39, 0xae, 0x63, 0x93, 0x18, 0x52, 0x81, 0xf3, 0xa9, 0xe2, 0x21, 0x22, 0x4c, 0xc5, 0x56, 0x60, 0xee, 0xfb, 0x56, 0x63, 0xdc, 0xff, 0x71, 0x3a, 0x2e, 0x8f, 0xd9, 0x45, 0xea, 0x6e, 0x3d, 0xcc, 0xce, 0xb3, 0x37, 0xae, 
0x36, 0xe7, 0x8c, 0x9b, 0x74, 0xcc, 0x0e, 0x9a, 0x3e, 0xe3, 0xc8, 0xb7, 0x6e, 0x51, 0x3b, 0x02, 0x90, 0x39, 0xf5, 0x95, 0x79, 0x8e, 0xce, 0x45, 0xcd, 0xfe, 0x55, 0xd4, 0x25, 0xaf, 0xc3, 0x2e, 0xd1, 0x90, 0x42, 0x8f, 0x04, 0xcc, 0x6d, 0xec, 0xfe, 0xad, 0xf5, 0x4c, 0x8e, 0x75, 0x76, 0x3a, 0x77, 0x1f, 0x06, 0x0c, 0x00, 0x67, 0x83, 0xa8, 0x5a, 0x33, 0x0c, 0x91, 0xa3, 0x54, 0xc1, 0x22, 0x65, 0xa0, 0x34, 0x73, 0x04, 0xe9, 0x98, 0x9d, 0x52, 0xc3, 0x78, 0x9e, 0x9b, 0xcc, 0x26, 0x85, 0xf1, 0x1b, 0xf6, 0xb7, 0x81, 0x12, 0x8c, 0xb6, 0x12, 0x34, 0xd4, 0x25, 0x52, 0x17, 0x67, 0x4b, 0x5c, 0x4f, 0x33, 0x85, 0x9b, 0x7c, 0x59, 0xba, 0xaa, 0x58, 0xae, 0x07, 0x04, 0xd1, 0x5e, 0x9d, 0x46, 0x77, 0xb9, 0xd2, 0xc7, 0xce, 0xb0, 0xc6, 0x5a, 0xfd, 0xf4, 0xfe, 0x8f, 0x8d, 0x1b, 0xda, 0x1b, 0x3d, 0x0e, 0xca, 0xe7, 0x80, 0x43, 0xdf, 0xf5, 0x71, 0xd6, 0x03, 0x74, 0x44, 0x4c, 0x3e, 0x3f, 0xdd, 0xb3, 0x32, 0x17, 0xd5, 0x62, 0x9d, 0xee, 0x9d, 0xaa, 0xf7, 0xee, 0xcb, 0x4e, 0xab, 0xb9, 0x22, 0x74, 0xf4, 0x16, 0x1f, 0xb6, 0xf0, 0xbb, 0x69, 0x52, 0xfb, 0xac, 0xe3, 0xc8, 0x95, 0x9d, 0xa0, 0x6e, 0x4a, 0x0f, 0xdc, 0x2f, 0x44, 0xd8, 0xcb, 0xde, 0x05, 0x51, 0x2f, 0x36, 0x1e, 0xab, 0x1e, 0xfb, 0x9e, 0xcd, 0xf4, 0x81, 0xd4, 0xf9, 0x8e, 0x74, 0xf6, 0x39, 0x24, 0x8b, 0x93, 0x43, 0xa8, 0xd5, 0x22, 0xe9, 0x69, 0x69, 0x90, 0x6b, 0x36, 0x6b, 0x70, 0x9d, 0x9a, 0x8b, 0xff, 0xd9, 0xa8, 0xa6, 0x78, 0xe0, 0x19, 0xc7, 0xcf, 0xa5, 0x5b, 0xec, 0x4d, 0x60, 0xb0, 0x09, 0x26, 0x7b, 0xce, 0xbd, 0xc9, 0x25, 0xc7, 0x04, 0x33, 0x78, 0xaf, 0xaf, 0xf3, 0xb1, 0x3f, 0x02, 0x25, 0x7b, 0x70, 0x1d, 0xd3, 0xcd, 0x3d, 0xba, 0x03, 0x13, 0x84, 0x06, 0x55, 0x97, 0xd9, 0x67, 0x3e, 0xf4, 0x33, 0x21, 0x8c, 0xfb, 0xbd, 0x81, 0x03, 0x06, 0x19, 0x7d, 0xbb, 0x28, 0x46, 0xb6, 0xbd, 0xd1, 0x46, 0x23, 0xc3, 0x7f, 0xb0, 0xdb, 0x87, 0xff, 0xe0, 0xf3, 0x37, 0x83, 0x04, 0x0f, 0x46, 0xbf, 0x3b, 0xa2, 0x28, 0xe0, 0xf4, 0xd7, 0xb0, 0xda, 0x30, 0x7d, 0x86, 0x84, 0x2f, 0xf3, 0x64, 0xa3, 0x5b, 0xfd, 0x52, 0xc6, 0x60, 0xa8, 0xc3, 0xff, 0x1a, 0x98, 0x88, 0x5b, 
0xb1, 0x50, 0x8e, 0xfe, 0xb2, 0x2d, 0xec, 0xd9, 0xbe, 0x08, 0xb4, 0xdc, 0x73, 0x75, 0xa7, 0xb5, 0x97, 0x51, 0x32, 0x67, 0x2f, 0x85, 0xd5, 0x52, 0xc7, 0xf5, 0xc7, 0x7d, 0x02, 0x3b, 0xd7, 0x7d, 0xd2, 0xca, 0x8d, 0xa5, 0x25, 0xe0, 0xe0, 0x4f, 0xce, 0x24, 0xf8, 0x34, 0x30, 0x22, 0xb1, 0x07, 0xa6, 0xbe, 0xab, 0xb3, 0x65, 0xa5, 0xb9, 0x43, 0x5f, 0x34, 0x1b, 0x06, 0x19, 0xc8, 0x8a, 0xca, 0xe1, 0x85, 0xe0, 0xb8, 0x40, 0xf2, 0xb7, 0x57, 0x3e, 0xab, 0xe4, 0xa9, 0xc1, 0xec, 0xb2, 0x0e, 0x77, 0xeb, 0x3b, 0x67, 0xb2, 0x2c, 0xd5, 0x51, 0x3d, 0x64, 0xc2, 0x10, 0xc5, 0x2a, 0xcc, 0xaa, 0x93, 0x26, 0x11, 0xbb, 0xe1, 0x07, 0x29, 0x80, 0x8c, 0x07, 0x65, 0x16, 0xcf, 0xa4, 0xb9, 0xc9, 0x37, 0x9a, 0x9b, 0x65, 0x58, 0x93, 0x67, 0x1e, 0x61, 0x9f, 0x92, 0x2a, 0x18, 0x24, 0xb4, 0x1a, 0x70, 0x9e, 0x94, 0x4a, 0x18, 0x5e, 0x82, 0x27, 0xad, 0x25, 0xbb, 0x1a, 0x91, 0x4f, 0xc4, 0x90, 0xa6, 0x76, 0xdc, 0x8c, 0xa3, 0x7d, 0xfd, 0x39, 0xfe, 0x94, 0x8d, 0x28, 0xcc, 0xaa, 0x44, 0x86, 0x03, 0x42, 0xdc, 0xa9, 0x83, 0xea, 0x99, 0x50, 0x50, 0xa9, 0x59, 0x94, 0x89, 0xbc, 0xe1, 0xb0, 0xa4, 0xf0, 0x78, 0xd3, 0xf6, 0xab, 0x3d, 0x7f, 0x86, 0xa2, 0xdc, 0x2a, 0xbc, 0x44, 0x6b, 0xc9, 0x9a, 0x23, 0xaf, 0xe1, 0x9d, 0xa9, 0x1f, 0x14, 0xea, 0x8c, 0x92, 0xee])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_1200_tweak_16(self, test_workers):
my_key = bytes([0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72])
my_tweak = bytes([0xe7, 0xd8, 0xc9, 0xba, 0xab, 0x9c, 0x8d, 0x7e, 0x6f, 0x60, 0x51, 0x42, 0x33, 0x24, 0x15, 0x06])
my_message = bytes([0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 
0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 
0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 
0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15])
real_ciphertext = bytes([0x00, 0x0c, 0x27, 0x17, 0x64, 0xb7, 0x66, 0xf4, 0xb8, 0xe1, 0x0a, 0xe5, 0x66, 0x6e, 0xa0, 0x40, 0xd5, 0xae, 0xfd, 0xdb, 0x21, 0x28, 0x1c, 0xb8, 0x32, 0x2f, 0x01, 0x3d, 0x50, 0x52, 0x08, 0x7e, 0x45, 0x8d, 0x57, 0xce, 0x66, 0x82, 0xa5, 0x61, 0x9e, 0x18, 0x2e, 0xf6, 0x59, 0x99, 0x0c, 0xdc, 0xcd, 0xbf, 0x2c, 0xb1, 0x8d, 0xee, 0x6c, 0xf3, 0x72, 0xf2, 0xc3, 0x4d, 0x21, 0xf2, 0x1a, 0xed, 0x51, 0x52, 0x8c, 0x11, 0x22, 0xc4, 0x6a, 0x85, 0xd2, 0xc1, 0x76, 0xc0, 0x58, 0x75, 0x79, 0x57, 0xb2, 0x69, 0x22, 0xf6, 0x7d, 0xfa, 0x2b, 0x80, 0x4c, 0xbb, 0x23, 0x26, 0xd3, 0xcf, 0x59, 0x9a, 0x94, 0x31, 0xaf, 0xbb, 0x0d, 0x1b, 0x9a, 0xed, 0x7a, 0x70, 0x68, 0xf4, 0xa9, 0x40, 0xf5, 0xd3, 0xf8, 0x19, 0x07, 0xc3, 0xc6, 0xa0, 0x18, 0x1c, 0xb9, 0x4a, 0x9d, 0xe0, 0x87, 0x38, 0x0e, 0x1a, 0xe4, 0x31, 0x62, 0x8e, 0x33, 0x06, 0xe3, 0x8e, 0xec, 0xe9, 0xf7, 0x31, 0x27, 0xf1, 0x8c, 0x8c, 0xac, 0xd3, 0xe4, 0xed, 0xfd, 0x1a, 0xcc, 0x38, 0x28, 0x43, 0xcc, 0xdb, 0x98, 0xea, 0xd8, 0x91, 0xdf, 0xf7, 0xb6, 0x4b, 0x37, 0xb9, 0x02, 0xa5, 0x4c, 0xd4, 0xbc, 0x8c, 0x67, 0x14, 0xcc, 0x2e, 0x96, 0x24, 0x27, 0x5f, 0x5a, 0x6b, 0xbe, 0x3b, 0x17, 0xd9, 0x13, 0x69, 0xc5, 0x54, 0x42, 0xb4, 0x37, 0xfa, 0xcf, 0x5f, 0x29, 0xdd, 0x68, 0xf3, 0xfc, 0x73, 0x8c, 0x38, 0x80, 0x08, 0x3b, 0xa8, 0xfe, 0x3c, 0x67, 0xaa, 0x6b, 0x4c, 0x8e, 0xa9, 0xb5, 0x7e, 0xd2, 0x7b, 0x44, 0x95, 0xcf, 0x93, 0x13, 0x8a, 0xf7, 0x96, 0xa9, 0x9f, 0x5c, 0xe8, 0xb0, 0x05, 0x07, 0xcd, 0x64, 0xd1, 0x3b, 0x40, 0x67, 0x81, 0x31, 0x7b, 0xe6, 0x9e, 0xf1, 0x16, 0x8b, 0x3f, 0x48, 0xc9, 0x6d, 0x4d, 0xd1, 0xcc, 0xcb, 0x16, 0x8f, 0x51, 0xe8, 0x8c, 0xc2, 0xa4, 0x2a, 0xd4, 0x9e, 0x55, 0x90, 0x93, 0xf2, 0xe7, 0xab, 0x5e, 0x0a, 0x1c, 0x4b, 0xdc, 0x36, 0xf6, 0xc3, 0xac, 0xc2, 0x86, 0xeb, 0xe9, 0x0c, 0xa8, 0x6d, 0x24, 0xda, 0x93, 0xb5, 0xa1, 0x8c, 0x14, 0x06, 0xdc, 0xe0, 0xa6, 0x0b, 0x88, 0x71, 0x62, 0x8f, 0xed, 0x27, 0x6a, 0xe1, 0x2d, 0xf4, 0x2c, 0x77, 0x37, 0xad, 0x61, 0x0c, 0xa2, 0x98, 0xd1, 0x07, 0xb8, 0xae, 0xcb, 0xce, 0xef, 0x71, 0x9a, 0x0c, 
0xd2, 0xd3, 0x6b, 0x28, 0xbd, 0x25, 0xe3, 0x80, 0x75, 0x19, 0xb2, 0xec, 0xde, 0xb1, 0x2b, 0xaf, 0x22, 0x50, 0xd5, 0x53, 0x02, 0x49, 0xce, 0x0e, 0x3a, 0x2d, 0x82, 0xe0, 0x17, 0x37, 0x46, 0x21, 0xdb, 0xd6, 0x84, 0xde, 0xa4, 0xc0, 0x24, 0x6a, 0xc3, 0x27, 0xf5, 0x97, 0xd7, 0xb4, 0x4f, 0x14, 0x14, 0x72, 0x31, 0xa3, 0x46, 0x41, 0xfd, 0x02, 0xfc, 0x0a, 0xeb, 0xe0, 0xb4, 0x23, 0x1e, 0x27, 0x71, 0x66, 0x26, 0x38, 0x2d, 0x35, 0x61, 0xcc, 0x2a, 0xf4, 0x6f, 0x66, 0x34, 0x28, 0xcb, 0xe6, 0x5b, 0x65, 0x91, 0x21, 0xc7, 0x90, 0xdf, 0xe0, 0xfa, 0x71, 0x27, 0xc0, 0x70, 0xca, 0xe8, 0x40, 0x3b, 0xa0, 0xdb, 0xeb, 0xcd, 0x14, 0x7d, 0x90, 0xe2, 0xbc, 0x1e, 0xca, 0xfa, 0xca, 0x6c, 0xd2, 0xe0, 0x53, 0xa6, 0x60, 0xdd, 0xef, 0x25, 0x3a, 0x44, 0xad, 0x9e, 0x49, 0x38, 0x5e, 0xe6, 0x11, 0xc3, 0x7b, 0xf5, 0x85, 0x3c, 0x1b, 0xa2, 0x11, 0x44, 0x03, 0x68, 0xc6, 0x24, 0x10, 0x7d, 0x23, 0x94, 0x45, 0x89, 0xab, 0x1a, 0x36, 0x10, 0x84, 0x02, 0xc3, 0xba, 0xc0, 0xdf, 0xae, 0xdf, 0x64, 0xab, 0x99, 0xe5, 0xd0, 0xc3, 0xdd, 0xba, 0xb3, 0xca, 0x1d, 0x2d, 0x5e, 0xe7, 0x89, 0x0d, 0x4f, 0x3c, 0xb2, 0xab, 0x8b, 0x86, 0xd1, 0x89, 0x98, 0x21, 0x00, 0x7d, 0x88, 0xa3, 0xfd, 0xdf, 0x70, 0x32, 0x86, 0xb2, 0xcf, 0x1f, 0x4a, 0xd7, 0xcf, 0xba, 0x6b, 0xab, 0x7a, 0xc7, 0xbb, 0x17, 0x00, 0xf3, 0x65, 0x36, 0x1d, 0x6a, 0x55, 0x19, 0xd6, 0x6a, 0x2b, 0x1d, 0x8e, 0x91, 0x0c, 0x6e, 0xcb, 0x18, 0x18, 0xf6, 0x7b, 0x54, 0x85, 0xd1, 0xd8, 0xb6, 0x25, 0x4d, 0xc0, 0x1a, 0x81, 0x9e, 0x1e, 0xe3, 0xa8, 0xa6, 0xdf, 0x21, 0xea, 0x8d, 0x04, 0xbe, 0x26, 0x8a, 0x98, 0x04, 0xcd, 0x40, 0x88, 0x52, 0x48, 0x08, 0x93, 0x73, 0xa1, 0xd3, 0xfc, 0x0a, 0xfa, 0x20, 0x74, 0x4b, 0xe5, 0x1f, 0xde, 0x28, 0x75, 0xd2, 0xc4, 0x9b, 0x71, 0x37, 0x91, 0x60, 0x92, 0x9f, 0x80, 0x25, 0x62, 0x0d, 0x5e, 0xe4, 0x4d, 0x1f, 0x47, 0x0e, 0x8a, 0x68, 0xd9, 0xf9, 0x2e, 0xb5, 0x87, 0xe9, 0xfc, 0x85, 0xd0, 0x3c, 0xd9, 0x15, 0x24, 0x71, 0x41, 0x59, 0xc1, 0x84, 0x41, 0xf0, 0xd8, 0x2d, 0x5a, 0x78, 0x12, 0xd2, 0xef, 0x32, 0x4d, 0xd3, 0x7b, 0x3a, 0x2c, 0x36, 0xf1, 0xed, 0xc6, 0x9f, 
0xec, 0xdf, 0x41, 0x76, 0x1c, 0x4b, 0x8a, 0x6a, 0xf4, 0xb7, 0x9c, 0x81, 0x1d, 0x77, 0x42, 0x0f, 0xa1, 0x6e, 0x45, 0x5f, 0x33, 0xa5, 0x4c, 0x13, 0x74, 0x79, 0xa7, 0x03, 0xef, 0xa6, 0x80, 0x09, 0x3b, 0xbd, 0x97, 0x33, 0x41, 0x5b, 0xf1, 0x78, 0x3d, 0x12, 0x40, 0x5a, 0xbd, 0xb8, 0x1c, 0x50, 0xb9, 0x2b, 0x50, 0x3f, 0x3b, 0xe5, 0xd5, 0x38, 0x4c, 0xf5, 0xec, 0x9f, 0x8b, 0x3b, 0x7f, 0xe0, 0x86, 0xbc, 0x61, 0x5e, 0xd6, 0x49, 0x8e, 0x0f, 0xe5, 0x4c, 0xbb, 0x0c, 0xe5, 0x99, 0x25, 0xc5, 0x4f, 0x2a, 0x28, 0x51, 0xb2, 0x60, 0x75, 0x8b, 0x97, 0xad, 0x60, 0x7d, 0xa8, 0xbd, 0x5e, 0x8c, 0xc9, 0x22, 0xe6, 0x50, 0xdd, 0xb5, 0x3b, 0x41, 0x2d, 0xb1, 0x88, 0xf6, 0x5a, 0x14, 0xd1, 0x2d, 0x71, 0xcf, 0x7c, 0x3f, 0xc0, 0x41, 0x19, 0x5f, 0x9a, 0xa7, 0x03, 0xf8, 0x6d, 0x23, 0x6b, 0x15, 0x32, 0x5c, 0xa2, 0x25, 0xf8, 0xd2, 0xe0, 0x7b, 0xbd, 0x0b, 0x00, 0x0c, 0x95, 0x2b, 0xc2, 0x52, 0x37, 0x01, 0x54, 0x05, 0xe2, 0x6a, 0xdc, 0xf3, 0xa2, 0xcd, 0x25, 0x85, 0xe6, 0xce, 0xf9, 0xc9, 0x8b, 0x38, 0x8b, 0x7f, 0xa4, 0xc9, 0x93, 0x8c, 0x8a, 0x5e, 0xb1, 0x74, 0x58, 0x54, 0xab, 0xd4, 0x81, 0x5b, 0x99, 0x5d, 0xa0, 0xd6, 0x9a, 0xa6, 0x36, 0xb9, 0x1a, 0xf3, 0xa6, 0x0b, 0x37, 0xc2, 0xe6, 0x44, 0x6b, 0x68, 0x5b, 0xcc, 0x40, 0x6e, 0x5c, 0x0d, 0x20, 0x84, 0x8c, 0xcf, 0x34, 0xdb, 0x38, 0x48, 0x91, 0xcf, 0x4f, 0xb7, 0x4e, 0x52, 0x96, 0xac, 0x97, 0x4e, 0x92, 0x40, 0x92, 0x98, 0x9c, 0x7e, 0xa6, 0x08, 0x07, 0x18, 0xb0, 0x91, 0x5e, 0x9b, 0x16, 0x17, 0xe9, 0xe0, 0xc7, 0x07, 0x6f, 0xd9, 0xdc, 0xa3, 0xbf, 0x33, 0x79, 0x09, 0x55, 0x30, 0x7d, 0xc8, 0xa6, 0x6f, 0x29, 0xda, 0x28, 0x2f, 0x16, 0x89, 0x35, 0x0d, 0x20, 0x3b, 0xed, 0xc1, 0xb4, 0xd2, 0xc3, 0xbe, 0x4b, 0x72, 0xf8, 0xf4, 0xe5, 0x7a, 0xea, 0x38, 0xd6, 0x90, 0x8d, 0x87, 0x92, 0x5e, 0xe4, 0xa7, 0xb7, 0x5e, 0x0d, 0x7d, 0x74, 0xee, 0xb0, 0x06, 0x1c, 0x6d, 0xc5, 0x6a, 0x1e, 0xf6, 0x50, 0x7d, 0x91, 0x14, 0xde, 0x49, 0x39, 0xdc, 0x16, 0x2b, 0xd6, 0x9d, 0x19, 0x73, 0x95, 0xf8, 0xde, 0xa6, 0xe5, 0x9e, 0x8f, 0x21, 0x68, 0xc2, 0x74, 0x83, 0x70, 0xba, 0x73, 0x4d, 0x43, 0x64, 0x04, 
0x2d, 0x45, 0x24, 0x5f, 0x01, 0xaf, 0x1d, 0x1b, 0x84, 0x76, 0x5a, 0x72, 0x90, 0x5b, 0xb2, 0x20, 0x30, 0xc2, 0xc4, 0x16, 0x71, 0xee, 0xc9, 0x60, 0x0a, 0xdc, 0xdd, 0x27, 0x61, 0xc1, 0xe9, 0x88, 0xb3, 0xcf, 0xb9, 0xdd, 0x2c, 0x46, 0x47, 0x6b, 0xe0, 0xc8, 0x1e, 0x20, 0xe5, 0x90, 0xa2, 0xbe, 0x25, 0x19, 0x10, 0x5f, 0xe6, 0x70, 0xd2, 0xe6, 0x8f, 0x97, 0xa3, 0x59, 0x79, 0x59, 0x74, 0x0f, 0x97, 0x56, 0x73, 0xe2, 0x32, 0x80, 0x2b, 0x42, 0xb7, 0x83, 0x59, 0x4b, 0x3d, 0x6e, 0x04, 0x16, 0x49, 0xc4, 0xd3, 0x41, 0x91, 0xee, 0x3b, 0x14, 0x06, 0x5a, 0x66, 0x6a, 0x00, 0xeb, 0xf7, 0x8f, 0xd0, 0x06, 0x63, 0xe5, 0x22, 0x8f, 0xed, 0x95, 0xd5, 0x0b, 0xca, 0x69, 0x3c, 0xdc, 0xb5, 0x6a, 0xc8, 0xa1, 0x7e, 0xba, 0x05, 0x5c, 0x44, 0xa5, 0x38, 0x26, 0x59, 0xa1, 0xe3, 0x98, 0xf4, 0x18, 0x15, 0x54, 0x6a, 0x8d, 0x2a, 0xcf, 0x94, 0x0d, 0x04, 0xf8, 0x77, 0x86, 0xd7, 0x4c, 0x68, 0x76, 0xa4, 0xaf, 0xab, 0x85, 0xcc, 0x5c, 0x95, 0xb8, 0xb9, 0xf0, 0x2a, 0xde, 0x17, 0x58, 0x4d, 0x3a, 0x04, 0x9a, 0x0a, 0x9d, 0x51, 0x23, 0xbb, 0x9f, 0xfa, 0x99, 0xf4, 0x95, 0xca, 0x61, 0x82, 0x2a, 0x64, 0xcc, 0x88, 0xe4, 0x8d, 0x4a, 0xae, 0x36, 0xd1, 0xf0, 0x72, 0x07, 0x41, 0xaf, 0x3b, 0x99, 0x50, 0x61, 0x9c, 0x43, 0x9d, 0x00, 0x46, 0x54, 0x51, 0xb2, 0xf2, 0x56, 0x3c])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_1201_tweak_16(self, test_workers):
my_key = bytes([0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b])
my_tweak = bytes([0xde, 0xbf, 0xa0, 0x81, 0x62, 0x43, 0x24, 0x05, 0xe5, 0xc6, 0xa7, 0x88, 0x69, 0x4a, 0x2b, 0x0c])
my_message = bytes([0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1a, 0x17, 0x14, 0x11, 0x0e, 0x0b, 0x08, 0x05, 0x02, 0xff, 0xfc, 0xf9, 0xf6, 0xf3, 0xf0, 0xed, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 
0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1a, 0x17, 0x14, 0x11, 0x0e, 0x0b, 0x08, 0x05, 0x02, 0xff, 0xfc, 0xf9, 0xf6, 0xf3, 0xf0, 0xed, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 
0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1a, 0x17, 0x14, 0x11, 0x0e, 0x0b, 0x08, 0x05, 0x02, 0xff, 0xfc, 0xf9, 0xf6, 0xf3, 0xf0, 0xed, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1a, 0x17, 0x14, 0x11, 0x0e, 0x0b, 0x08, 0x05, 0x02, 0xff, 0xfc, 0xf9, 0xf6, 0xf3, 0xf0, 0xed, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 
0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b])
real_ciphertext = bytes([0x74, 0x71, 0xe1, 0xa8, 0xa9, 0x03, 0xbe, 0xea, 0xa8, 0xb0, 0xac, 0xe0, 0x6f, 0x2a, 0xe6, 0xcb, 0x55, 0xd1, 0xc4, 0x38, 0x2d, 0x1b, 0x29, 0x12, 0x8c, 0x00, 0xf8, 0xa4, 0x78, 0x12, 0x3a, 0x85, 0x30, 0x6e, 0xb4, 0x49, 0x38, 0xc4, 0xe2, 0xe1, 0x12, 0x5a, 0x6d, 0x70, 0x80, 0x36, 0x80, 0x83, 0xae, 0xba, 0x6c, 0xa8, 0x8f, 0x75, 0x31, 0x67, 0x45, 0xb2, 0xbf, 0x7d, 0x36, 0x89, 0xbe, 0x6f, 0x9c, 0xf3, 0x33, 0x4d, 0x3a, 0x2d, 0xa1, 0x7f, 0x56, 0xbf, 0x93, 0x96, 0x5b, 0xc0, 0xd9, 0xfd, 0x12, 0x05, 0x1f, 0x1f, 0x69, 0x42, 0x54, 0xf2, 0xa6, 0x76, 0x28, 0xd8, 0x17, 0x8c, 0xc9, 0x29, 0xe8, 0xb6, 0xd6, 0x9c, 0x64, 0xa8, 0xf7, 0x2b, 0x58, 0xbe, 0xa3, 0x7e, 0xbb, 0x05, 0xf3, 0xfa, 0x3b, 0xeb, 0xf1, 0xbb, 0xaf, 0x4c, 0x8b, 0x8f, 0xc1, 0x51, 0x86, 0x73, 0x83, 0xa5, 0x10, 0x16, 0x5d, 0x96, 0x30, 0x8a, 0xa0, 0xb0, 0x2f, 0x87, 0xd8, 0x86, 0x8d, 0xa6, 0x94, 0xe6, 0xd8, 0x69, 0x1d, 0xd9, 0x48, 0x7e, 0xd1, 0x41, 0x2f, 0x3a, 0x69, 0xeb, 0x35, 0x61, 0x3a, 0x74, 0xe6, 0x39, 0x43, 0x98, 0x80, 0x78, 0xfd, 0x29, 0xec, 0x3f, 0xc7, 0xa0, 0xab, 0xf2, 0xf5, 0xeb, 0x18, 0x21, 0x42, 0x51, 0x5f, 0x85, 0x07, 0x20, 0xa6, 0xb3, 0xb0, 0x5c, 0x84, 0x04, 0x1c, 0xb2, 0xf1, 0xfe, 0xda, 0x9c, 0x4b, 0x56, 0x3b, 0x3d, 0x17, 0xc9, 0x72, 0x9a, 0x00, 0x6f, 0xab, 0xdf, 0x9e, 0x5c, 0xaf, 0xe0, 0x1e, 0xc1, 0xe9, 0x0e, 0x8e, 0xc0, 0x94, 0xab, 0x2d, 0xb1, 0x40, 0xde, 0x8e, 0xcb, 0xb4, 0x26, 0x03, 0x15, 0x92, 0x77, 0x6b, 0xdc, 0x68, 0x06, 0xa4, 0xe1, 0xa1, 0xf2, 0x4c, 0xe9, 0xc4, 0xd3, 0x8e, 0xfc, 0x1f, 0x62, 0x37, 0x06, 0xfe, 0xae, 0xd8, 0x64, 0x59, 0xe2, 0xee, 0x0d, 0x41, 0xe8, 0xea, 0x96, 0x29, 0x41, 0x87, 0xd7, 0x4a, 0x19, 0xf7, 0xd3, 0x1d, 0xf9, 0x15, 0x60, 0x10, 0x5c, 0xa2, 0xc5, 0xaf, 0x0e, 0x65, 0x3e, 0x13, 0xfc, 0x0d, 0xb0, 0x85, 0x94, 0xfd, 0x3e, 0xd3, 0xab, 0x13, 0x09, 0x9e, 0x1a, 0x0e, 0x3d, 0x43, 0x55, 0x95, 0xe0, 0xa6, 0xbe, 0x01, 0xd9, 0x93, 0xde, 0xb1, 0x7b, 0xd9, 0xdc, 0xe4, 0x48, 0x1d, 0xd8, 0x8b, 0xb7, 0x5a, 0x9b, 0xbd, 0x9f, 0xf2, 0xff, 0x01, 0x29, 0xac, 0x30, 0x74, 0x7a, 0xaf, 
0xed, 0x31, 0xa5, 0xcc, 0x43, 0x8a, 0xa5, 0x81, 0xc2, 0xdc, 0x79, 0x9b, 0xfe, 0x8b, 0x99, 0x9f, 0x87, 0x69, 0x5a, 0xc8, 0x11, 0xf7, 0x33, 0xc9, 0xdb, 0x1f, 0x5a, 0xe7, 0x82, 0x5e, 0x1e, 0x88, 0xed, 0x6f, 0x93, 0x60, 0xb0, 0xd0, 0xbd, 0x8d, 0xbc, 0xe8, 0x37, 0xe3, 0x45, 0x0d, 0x12, 0x6a, 0x21, 0xed, 0x59, 0x77, 0x36, 0x11, 0x16, 0x9f, 0x61, 0x66, 0x65, 0xfc, 0xf1, 0xfe, 0x37, 0x9d, 0x0b, 0xcb, 0x59, 0x34, 0x01, 0xa3, 0xe0, 0x72, 0xce, 0x32, 0x47, 0xe0, 0x2e, 0x3c, 0xec, 0x24, 0x11, 0x8d, 0xc7, 0x0b, 0x75, 0x44, 0x05, 0xf4, 0xd9, 0x5f, 0xc8, 0xd7, 0x00, 0xff, 0x02, 0xaa, 0x76, 0x4b, 0xb6, 0xcd, 0x35, 0x18, 0xb3, 0xee, 0x4e, 0xbc, 0xef, 0x83, 0x95, 0xa2, 0x26, 0x16, 0xa4, 0xa5, 0xe1, 0x56, 0xa0, 0x40, 0x33, 0x3d, 0x4b, 0xa4, 0x90, 0x02, 0xd0, 0x72, 0x8e, 0xd7, 0xc2, 0x0e, 0x66, 0x60, 0xea, 0xda, 0x5a, 0x7d, 0x40, 0x96, 0xc5, 0x00, 0xc3, 0xfb, 0xcf, 0x9a, 0x1e, 0x7d, 0xab, 0xad, 0x75, 0x19, 0xa1, 0xb9, 0x5c, 0x88, 0xd9, 0xda, 0xf3, 0xc4, 0xc7, 0x5b, 0x0e, 0x18, 0x80, 0x15, 0xb6, 0x0e, 0xa2, 0x66, 0x50, 0x9e, 0x2e, 0x94, 0x56, 0xc8, 0x50, 0xbd, 0xc4, 0x42, 0x21, 0x15, 0x20, 0x92, 0xae, 0xa5, 0x80, 0xe7, 0x95, 0xcf, 0x86, 0x0b, 0xac, 0xd2, 0x75, 0x9e, 0x76, 0x40, 0x13, 0x3c, 0x7b, 0xc2, 0xff, 0x7c, 0x03, 0x9e, 0xa7, 0xa8, 0x69, 0xdd, 0x2f, 0xda, 0xd3, 0x4d, 0xb2, 0xcb, 0x07, 0x29, 0x87, 0xad, 0x63, 0xf5, 0xd8, 0x4d, 0xfb, 0xda, 0x03, 0x4b, 0x10, 0xd1, 0x30, 0xa5, 0x48, 0x2c, 0xa7, 0xfc, 0xff, 0xac, 0xda, 0x1c, 0xf7, 0x0b, 0x34, 0x46, 0x4b, 0xc3, 0x56, 0x60, 0x8b, 0x42, 0xcd, 0x09, 0x0c, 0xc3, 0xdd, 0xca, 0xbf, 0x03, 0xda, 0xa2, 0xbd, 0x2c, 0x90, 0xc5, 0x6e, 0xf2, 0x20, 0xac, 0x9b, 0x75, 0xe4, 0x38, 0xf7, 0xd5, 0x3e, 0x86, 0xdd, 0x45, 0x4c, 0x86, 0xd2, 0x8d, 0xc5, 0xd0, 0x38, 0x95, 0x64, 0x1f, 0xac, 0x71, 0x94, 0x2a, 0xc5, 0xb7, 0x60, 0x7f, 0xe2, 0xfb, 0x28, 0x56, 0xfb, 0xac, 0x04, 0xa8, 0xef, 0xe5, 0x5c, 0x56, 0x4a, 0xe7, 0x5c, 0xb7, 0x08, 0xca, 0xff, 0x1d, 0xc2, 0x1c, 0x66, 0xc2, 0x9d, 0x4f, 0x88, 0xac, 0x99, 0x4c, 0xb8, 0xb8, 0xff, 0xc7, 0x71, 0x87, 0xde, 0x3c, 0x5f, 
0x90, 0x07, 0xea, 0x37, 0x3a, 0x73, 0x8e, 0x1b, 0xe1, 0x97, 0xc8, 0xaa, 0x84, 0x1e, 0x6c, 0xad, 0x4c, 0x02, 0x97, 0x80, 0x25, 0x3c, 0x59, 0x44, 0x83, 0x72, 0x61, 0xea, 0xce, 0xd2, 0x31, 0x38, 0x9a, 0x07, 0x51, 0x87, 0x0d, 0x33, 0x95, 0xd7, 0xd7, 0x83, 0x61, 0x6c, 0x94, 0x2a, 0x61, 0x3d, 0x9f, 0xb4, 0xb1, 0x7c, 0x0c, 0x0c, 0x6f, 0x23, 0xf1, 0x21, 0x60, 0x7c, 0x0a, 0x78, 0xfc, 0x13, 0xde, 0xc7, 0xa7, 0x6d, 0x65, 0x90, 0x08, 0xb0, 0x31, 0xb5, 0x21, 0xa7, 0x06, 0xac, 0x64, 0x7f, 0xe4, 0x4c, 0xd2, 0x99, 0x97, 0x32, 0xd6, 0x92, 0xd0, 0x2e, 0x11, 0x03, 0x78, 0x53, 0x9c, 0x2e, 0xda, 0x55, 0xda, 0x60, 0x68, 0xf0, 0xe2, 0xdb, 0xa9, 0x88, 0xfc, 0x5d, 0x63, 0x7a, 0x18, 0xcd, 0xe1, 0x64, 0x29, 0xc6, 0x32, 0xbc, 0x41, 0xab, 0x9e, 0xc9, 0xe2, 0xd2, 0xb2, 0x43, 0x12, 0x51, 0xbb, 0x9d, 0xd3, 0x6e, 0xa6, 0xb3, 0x2d, 0xed, 0x23, 0x06, 0x86, 0x77, 0x85, 0x72, 0x3a, 0x7e, 0x2f, 0xaf, 0xcb, 0x3d, 0x98, 0xfa, 0xb3, 0x9e, 0xa8, 0x98, 0x6f, 0xb2, 0xa5, 0x2c, 0x9d, 0xc1, 0xe3, 0x8e, 0x8b, 0x3f, 0xc2, 0x80, 0x65, 0x8e, 0x93, 0xe5, 0xb5, 0xfb, 0xa4, 0x68, 0xca, 0x3c, 0x50, 0x02, 0x0c, 0x36, 0x6d, 0x3b, 0x12, 0xa9, 0xe8, 0xce, 0xa9, 0x56, 0x80, 0xc6, 0x9e, 0x64, 0x5d, 0xe3, 0x95, 0xa2, 0x2a, 0x2d, 0x33, 0xeb, 0xbe, 0x48, 0xed, 0x8f, 0x1a, 0x10, 0xe4, 0xbf, 0x14, 0x48, 0x69, 0x06, 0x8f, 0xb3, 0x39, 0x65, 0x43, 0xdb, 0xf6, 0x27, 0xe6, 0x19, 0x27, 0xb1, 0x34, 0x6f, 0xff, 0xf6, 0x32, 0x44, 0xb4, 0xc3, 0x3c, 0xeb, 0x6e, 0xcb, 0x2c, 0xac, 0x9b, 0x77, 0xc3, 0x75, 0x8c, 0x4a, 0x95, 0xfb, 0xbf, 0x01, 0xc8, 0x98, 0x76, 0x4b, 0x17, 0x29, 0x5d, 0x31, 0x38, 0xaf, 0x35, 0x08, 0x8f, 0xda, 0x9d, 0x37, 0x8a, 0xc2, 0xfc, 0xb6, 0x5d, 0x2b, 0xb4, 0xcf, 0xf5, 0x9e, 0x4d, 0x44, 0x61, 0x27, 0xf5, 0xaa, 0x7b, 0xe4, 0x25, 0x28, 0x39, 0x4b, 0x25, 0x4f, 0x8a, 0x05, 0x23, 0xc7, 0x21, 0x9f, 0x44, 0xa1, 0xd1, 0x02, 0x52, 0xd8, 0xb6, 0x54, 0x4a, 0x59, 0x52, 0x7a, 0x64, 0x8a, 0x5a, 0x19, 0xce, 0x4a, 0x2d, 0x28, 0x98, 0x81, 0x44, 0xde, 0xa9, 0x9e, 0x42, 0x7d, 0xe1, 0xe6, 0x34, 0x0c, 0x7d, 0xa9, 0x33, 0x77, 0x0d, 0x96, 0x05, 
0x33, 0xad, 0x99, 0x12, 0x8a, 0xd3, 0xc5, 0xf4, 0x1c, 0x2e, 0x91, 0x76, 0x98, 0xb7, 0x0f, 0xff, 0x0a, 0x04, 0xa9, 0x86, 0xb3, 0xd4, 0xce, 0xe3, 0xbb, 0x68, 0xe9, 0xfc, 0xea, 0xf2, 0xa6, 0x43, 0x68, 0xec, 0xe1, 0x6d, 0xa5, 0x96, 0x64, 0x94, 0x10, 0xb7, 0xce, 0x9a, 0x3c, 0x36, 0x54, 0x69, 0x97, 0x6b, 0xf6, 0x70, 0xc2, 0x3f, 0x52, 0x3b, 0x4d, 0x45, 0x66, 0xf8, 0x37, 0x28, 0x68, 0x98, 0xbc, 0x39, 0xa8, 0x4a, 0xa9, 0xc8, 0x83, 0x93, 0xee, 0xf8, 0x99, 0x1c, 0x52, 0xea, 0x41, 0x4d, 0x27, 0x0c, 0xa6, 0x84, 0x2a, 0x42, 0xef, 0xa6, 0xf2, 0x0a, 0x70, 0xc8, 0x64, 0xca, 0xbe, 0x1d, 0xbe, 0xfb, 0x12, 0x5a, 0x80, 0xa8, 0x4c, 0x6c, 0x12, 0xef, 0x78, 0x8f, 0xc5, 0xa0, 0x18, 0x94, 0x30, 0x05, 0xb9, 0x1d, 0xa5, 0x79, 0x86, 0xb4, 0x1c, 0xe3, 0x1a, 0xfb, 0xb7, 0x9a, 0xa1, 0x46, 0xa0, 0xa3, 0x09, 0xd8, 0x95, 0x97, 0x75, 0x33, 0x04, 0xbe, 0x7d, 0xc9, 0x8b, 0xbd, 0xa8, 0xf1, 0x84, 0x02, 0x08, 0xd9, 0xce, 0x1a, 0xb6, 0x4c, 0x74, 0x64, 0x0a, 0x0c, 0x8b, 0x69, 0xbb, 0x55, 0x99, 0x16, 0x69, 0x71, 0xc8, 0xc1, 0xf8, 0xe9, 0x79, 0xda, 0x50, 0x03, 0xb0, 0xba, 0x54, 0xf6, 0x22, 0x2a, 0x91, 0x2c, 0xe0, 0x49, 0x72, 0x93, 0x8b, 0x74, 0x87, 0x90, 0x78, 0xa7, 0x5d, 0x96, 0x27, 0x30, 0x58, 0x49, 0x5d, 0x95, 0xbf, 0xff, 0x6c, 0xd9, 0x6a, 0x85, 0x9c, 0xbb])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_16_msg_3398_tweak_16(self, test_workers):
my_key = bytes([0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03])
my_tweak = bytes([0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4])
my_message = bytes([0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec, 0x6c, 0xed, 0x6d, 0xee, 0x6e, 0xef, 0x6f, 0xf0, 0x70, 0xf1, 0x71, 0xf2, 0x72, 0xf3, 0x73, 0xf4, 0x74, 0xf5, 0x75, 0xf6, 0x76, 0xf7, 0x77, 0xf8, 0x78, 0xf9, 0x79, 0xfa, 0x7a, 0xfb, 0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03, 0x83, 0x04, 0x84, 0x05, 0x85, 0x06, 0x86, 0x07, 0x87, 0x08, 0x88, 0x09, 0x89, 0x0a, 0x8a, 0x0b, 0x8b, 0x0c, 0x8c, 0x0d, 0x8d, 0x0e, 0x8e, 0x0f, 0x8f, 0x10, 0x90, 0x11, 0x91, 0x12, 0x92, 0x13, 0x93, 0x14, 0x94, 0x15, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 
0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec, 0x6c, 0xed, 0x6d, 0xee, 0x6e, 0xef, 0x6f, 0xf0, 0x70, 0xf1, 0x71, 0xf2, 0x72, 0xf3, 0x73, 0xf4, 0x74, 0xf5, 0x75, 0xf6, 0x76, 0xf7, 0x77, 0xf8, 0x78, 0xf9, 0x79, 0xfa, 0x7a, 0xfb, 0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03, 0x83, 0x04, 0x84, 0x05, 0x85, 0x06, 0x86, 0x07, 0x87, 0x08, 0x88, 0x09, 0x89, 0x0a, 0x8a, 0x0b, 0x8b, 0x0c, 0x8c, 0x0d, 0x8d, 0x0e, 0x8e, 0x0f, 0x8f, 0x10, 0x90, 0x11, 0x91, 0x12, 0x92, 0x13, 0x93, 0x14, 0x94, 0x15, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 
0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec, 0x6c, 0xed, 0x6d, 0xee, 0x6e, 0xef, 0x6f, 0xf0, 0x70, 0xf1, 0x71, 0xf2, 0x72, 0xf3, 0x73, 0xf4, 0x74, 0xf5, 0x75, 0xf6, 0x76, 0xf7, 0x77, 0xf8, 0x78, 0xf9, 0x79, 0xfa, 0x7a, 0xfb, 0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03, 0x83, 0x04, 0x84, 0x05, 0x85, 0x06, 0x86, 0x07, 0x87, 0x08, 0x88, 0x09, 0x89, 0x0a, 0x8a, 0x0b, 0x8b, 0x0c, 0x8c, 0x0d, 0x8d, 0x0e, 0x8e, 0x0f, 0x8f, 0x10, 0x90, 0x11, 0x91, 0x12, 0x92, 0x13, 0x93, 0x14, 0x94, 0x15, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec, 0x6c, 0xed, 0x6d, 0xee, 0x6e, 0xef, 0x6f, 0xf0, 0x70, 0xf1, 0x71, 0xf2, 0x72, 0xf3, 0x73, 0xf4, 0x74, 0xf5, 0x75, 0xf6, 0x76, 0xf7, 0x77, 0xf8, 0x78, 0xf9, 0x79, 0xfa, 0x7a, 0xfb, 0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03, 0x83, 0x04, 0x84, 0x05, 0x85, 0x06, 0x86, 0x07, 
0x87, 0x08, 0x88, 0x09, 0x89, 0x0a, 0x8a, 0x0b, 0x8b, 0x0c, 0x8c, 0x0d, 0x8d, 0x0e, 0x8e, 0x0f, 0x8f, 0x10, 0x90, 0x11, 0x91, 0x12, 0x92, 0x13, 0x93, 0x14, 0x94, 0x15, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec, 0x6c, 0xed, 0x6d, 0xee, 0x6e, 0xef, 0x6f, 0xf0, 0x70, 0xf1, 0x71, 0xf2, 0x72, 0xf3, 0x73, 0xf4, 0x74, 0xf5, 0x75, 0xf6, 0x76, 0xf7, 0x77, 0xf8, 0x78, 0xf9, 0x79, 0xfa, 0x7a, 0xfb, 0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03, 0x83, 0x04, 0x84, 0x05, 0x85, 0x06, 0x86, 0x07, 0x87, 0x08, 0x88, 0x09, 0x89, 0x0a, 0x8a, 0x0b, 0x8b, 0x0c, 0x8c, 0x0d, 0x8d, 0x0e, 0x8e, 0x0f, 0x8f, 0x10, 0x90, 0x11, 0x91, 0x12, 0x92, 0x13, 0x93, 0x14, 0x94, 0x15, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 
0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec, 0x6c, 0xed, 0x6d, 0xee, 0x6e, 0xef, 0x6f, 0xf0, 0x70, 0xf1, 0x71, 0xf2, 0x72, 0xf3, 0x73, 0xf4, 0x74, 0xf5, 0x75, 0xf6, 0x76, 0xf7, 0x77, 0xf8, 0x78, 0xf9, 0x79, 0xfa, 0x7a, 0xfb, 0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03, 0x83, 0x04, 0x84, 0x05, 0x85, 0x06, 0x86, 0x07, 0x87, 0x08, 0x88, 0x09, 0x89, 0x0a, 0x8a, 0x0b, 0x8b, 0x0c, 0x8c, 0x0d, 0x8d, 0x0e, 0x8e, 0x0f, 0x8f, 0x10, 0x90, 0x11, 0x91, 0x12, 0x92, 0x13, 0x93, 0x14, 0x94, 0x15, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 
0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec, 0x6c, 0xed, 0x6d, 0xee, 0x6e, 0xef, 0x6f, 0xf0, 0x70, 0xf1, 0x71, 0xf2, 0x72, 0xf3, 0x73, 0xf4, 0x74, 0xf5, 0x75, 0xf6, 0x76, 0xf7, 0x77, 0xf8, 0x78, 0xf9, 0x79, 0xfa, 0x7a, 0xfb, 0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03, 0x83, 0x04, 0x84, 0x05, 0x85, 0x06, 0x86, 0x07, 0x87, 0x08, 0x88, 0x09, 0x89, 0x0a, 0x8a, 0x0b, 0x8b, 0x0c, 0x8c, 0x0d, 0x8d, 0x0e, 0x8e, 0x0f, 0x8f, 0x10, 0x90, 0x11, 0x91, 0x12, 0x92, 0x13, 0x93, 0x14, 0x94, 0x15, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec, 0x6c, 0xed, 0x6d, 0xee, 0x6e, 0xef, 0x6f, 0xf0, 0x70, 0xf1, 0x71, 0xf2, 0x72, 0xf3, 0x73, 0xf4, 0x74, 0xf5, 0x75, 0xf6, 0x76, 0xf7, 0x77, 0xf8, 0x78, 0xf9, 0x79, 0xfa, 0x7a, 
0xfb, 0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03, 0x83, 0x04, 0x84, 0x05, 0x85, 0x06, 0x86, 0x07, 0x87, 0x08, 0x88, 0x09, 0x89, 0x0a, 0x8a, 0x0b, 0x8b, 0x0c, 0x8c, 0x0d, 0x8d, 0x0e, 0x8e, 0x0f, 0x8f, 0x10, 0x90, 0x11, 0x91, 0x12, 0x92, 0x13, 0x93, 0x14, 0x94, 0x15, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec, 0x6c, 0xed, 0x6d, 0xee, 0x6e, 0xef, 0x6f, 0xf0, 0x70, 0xf1, 0x71, 0xf2, 0x72, 0xf3, 0x73, 0xf4, 0x74, 0xf5, 0x75, 0xf6, 0x76, 0xf7, 0x77, 0xf8, 0x78, 0xf9, 0x79, 0xfa, 0x7a, 0xfb, 0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03, 0x83, 0x04, 0x84, 0x05, 0x85, 0x06, 0x86, 0x07, 0x87, 0x08, 0x88, 0x09, 0x89, 0x0a, 0x8a, 0x0b, 0x8b, 0x0c, 0x8c, 0x0d, 0x8d, 0x0e, 0x8e, 0x0f, 0x8f, 0x10, 0x90, 0x11, 0x91, 0x12, 0x92, 0x13, 0x93, 0x14, 0x94, 0x15, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 
0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec, 0x6c, 0xed, 0x6d, 0xee, 0x6e, 0xef, 0x6f, 0xf0, 0x70, 0xf1, 0x71, 0xf2, 0x72, 0xf3, 0x73, 0xf4, 0x74, 0xf5, 0x75, 0xf6, 0x76, 0xf7, 0x77, 0xf8, 0x78, 0xf9, 0x79, 0xfa, 0x7a, 0xfb, 0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03, 0x83, 0x04, 0x84, 0x05, 0x85, 0x06, 0x86, 0x07, 0x87, 0x08, 0x88, 0x09, 0x89, 0x0a, 0x8a, 0x0b, 0x8b, 0x0c, 0x8c, 0x0d, 0x8d, 0x0e, 0x8e, 0x0f, 0x8f, 0x10, 0x90, 0x11, 0x91, 0x12, 0x92, 0x13, 0x93, 0x14, 0x94, 0x15, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 
0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec, 0x6c, 0xed, 0x6d, 0xee, 0x6e, 0xef, 0x6f, 0xf0, 0x70, 0xf1, 0x71, 0xf2, 0x72, 0xf3, 0x73, 0xf4, 0x74, 0xf5, 0x75, 0xf6, 0x76, 0xf7, 0x77, 0xf8, 0x78, 0xf9, 0x79, 0xfa, 0x7a, 0xfb, 0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03, 0x83, 0x04, 0x84, 0x05, 0x85, 0x06, 0x86, 0x07, 0x87, 0x08, 0x88, 0x09, 0x89, 0x0a, 0x8a, 0x0b, 0x8b, 0x0c, 0x8c, 0x0d, 0x8d, 0x0e, 0x8e, 0x0f, 0x8f, 0x10, 0x90, 0x11, 0x91, 0x12, 0x92, 0x13, 0x93, 0x14, 0x94, 0x15, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec, 0x6c, 0xed, 0x6d, 0xee, 
0x6e, 0xef, 0x6f, 0xf0, 0x70, 0xf1, 0x71, 0xf2, 0x72, 0xf3, 0x73, 0xf4, 0x74, 0xf5, 0x75, 0xf6, 0x76, 0xf7, 0x77, 0xf8, 0x78, 0xf9, 0x79, 0xfa, 0x7a, 0xfb, 0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03, 0x83, 0x04, 0x84, 0x05, 0x85, 0x06, 0x86, 0x07, 0x87, 0x08, 0x88, 0x09, 0x89, 0x0a, 0x8a, 0x0b, 0x8b, 0x0c, 0x8c, 0x0d, 0x8d, 0x0e, 0x8e, 0x0f, 0x8f, 0x10, 0x90, 0x11, 0x91, 0x12, 0x92, 0x13, 0x93, 0x14, 0x94, 0x15, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8, 0x38, 0xb9, 0x39, 0xba, 0x3a, 0xbb, 0x3b, 0xbc, 0x3c, 0xbd, 0x3d, 0xbe, 0x3e, 0xbf, 0x3f, 0xc0, 0x40, 0xc1, 0x41, 0xc2, 0x42, 0xc3, 0x43, 0xc4, 0x44, 0xc5, 0x45, 0xc6, 0x46, 0xc7, 0x47, 0xc8, 0x48, 0xc9, 0x49, 0xca, 0x4a, 0xcb, 0x4b, 0xcc, 0x4c, 0xcd, 0x4d, 0xce, 0x4e, 0xcf, 0x4f, 0xd0, 0x50, 0xd1, 0x51, 0xd2, 0x52, 0xd3, 0x53, 0xd4, 0x54, 0xd5, 0x55, 0xd6, 0x56, 0xd7, 0x57, 0xd8, 0x58, 0xd9, 0x59, 0xda, 0x5a, 0xdb, 0x5b, 0xdc, 0x5c, 0xdd, 0x5d, 0xde, 0x5e, 0xdf, 0x5f, 0xe0, 0x60, 0xe1, 0x61, 0xe2, 0x62, 0xe3, 0x63, 0xe4, 0x64, 0xe5, 0x65, 0xe6, 0x66, 0xe7, 0x67, 0xe8, 0x68, 0xe9, 0x69, 0xea, 0x6a, 0xeb, 0x6b, 0xec, 0x6c, 0xed, 0x6d, 0xee, 0x6e, 0xef, 0x6f, 0xf0, 0x70, 0xf1, 0x71, 0xf2, 0x72, 0xf3, 0x73, 0xf4, 0x74, 0xf5, 0x75, 0xf6, 0x76, 0xf7, 0x77, 0xf8, 0x78, 0xf9, 0x79, 0xfa, 0x7a, 0xfb, 0x7b, 0xfc, 0x7c, 0xfd, 0x7d, 0xfe, 0x7e, 0xff, 0x7f, 0x00, 0x80, 0x01, 0x81, 0x02, 0x82, 0x03, 0x83, 0x04, 0x84, 0x05, 0x85, 0x06, 0x86, 0x07, 0x87, 0x08, 0x88, 0x09, 0x89, 0x0a, 0x8a, 0x0b, 0x8b, 0x0c, 0x8c, 0x0d, 0x8d, 0x0e, 0x8e, 0x0f, 0x8f, 0x10, 0x90, 0x11, 0x91, 0x12, 0x92, 0x13, 0x93, 0x14, 0x94, 
0x15, 0x15, 0x96, 0x16, 0x97, 0x17, 0x98, 0x18, 0x99, 0x19, 0x9a, 0x1a, 0x9b, 0x1b, 0x9c, 0x1c, 0x9d, 0x1d, 0x9e, 0x1e, 0x9f, 0x1f, 0xa0, 0x20, 0xa1, 0x21, 0xa2, 0x22, 0xa3, 0x23, 0xa4, 0x24, 0xa5, 0x25, 0xa6, 0x26, 0xa7, 0x27, 0xa8, 0x28, 0xa9, 0x29, 0xaa, 0x2a, 0xab, 0x2b, 0xac, 0x2c, 0xad, 0x2d, 0xae, 0x2e, 0xaf, 0x2f, 0xb0, 0x30, 0xb1, 0x31, 0xb2, 0x32, 0xb3, 0x33, 0xb4, 0x34, 0xb5, 0x35, 0xb6, 0x36, 0xb7, 0x37, 0xb8])
real_ciphertext = bytes([0x4b, 0x04, 0xbe, 0xca, 0x34, 0x10, 0xe2, 0x86, 0xbf, 0x24, 0x48, 0x39, 0xd0, 0x45, 0x23, 0x6b, 0x33, 0x6d, 0xc9, 0x4e, 0xb5, 0xc4, 0x74, 0x7e, 0x8d, 0xfa, 0xac, 0x07, 0x84, 0x1b, 0xb7, 0x13, 0x1c, 0x8a, 0x90, 0x2c, 0x37, 0xbd, 0x1f, 0xf1, 0x9c, 0x58, 0x8d, 0x63, 0xa6, 0x97, 0xc6, 0x4c, 0x17, 0x59, 0x69, 0xb4, 0xa3, 0xa7, 0x4c, 0xd3, 0xe7, 0x7a, 0x62, 0xb4, 0xf1, 0xce, 0x66, 0x31, 0x6e, 0x71, 0xce, 0xb2, 0x6f, 0x5d, 0x67, 0x21, 0xcd, 0x61, 0xe5, 0x1f, 0x53, 0x07, 0x76, 0x9b, 0x9b, 0x80, 0x37, 0xcf, 0x75, 0xac, 0x83, 0x7e, 0x26, 0xfc, 0x65, 0xc6, 0x80, 0x00, 0xb7, 0xd1, 0xb4, 0xb1, 0xcc, 0xc4, 0xf7, 0xda, 0xc9, 0x9f, 0x56, 0xd9, 0x1d, 0xd5, 0x60, 0xbb, 0x47, 0x71, 0xe8, 0xc7, 0x6a, 0xbe, 0x6e, 0xdb, 0x59, 0x2b, 0xe7, 0x73, 0x23, 0x73, 0xfe, 0xf4, 0xd7, 0x5a, 0xb2, 0x1e, 0x84, 0x23, 0xf8, 0xfd, 0xfa, 0xb8, 0x12, 0x1d, 0x07, 0xeb, 0x09, 0x85, 0x21, 0xdc, 0x11, 0x7d, 0x3a, 0x45, 0xc4, 0x2f, 0x46, 0x8a, 0xc0, 0x06, 0x72, 0xd9, 0x82, 0x3e, 0x05, 0xc4, 0xc9, 0xbb, 0x6d, 0x9e, 0xb2, 0x8e, 0xc3, 0x83, 0x8d, 0xd0, 0x17, 0x47, 0xa6, 0x02, 0x89, 0x05, 0xcc, 0xbd, 0xce, 0x2f, 0xf1, 0xab, 0xda, 0xe6, 0x5d, 0xa0, 0x48, 0x3a, 0xca, 0x19, 0x86, 0x54, 0x50, 0xdd, 0xa9, 0xb0, 0xb3, 0x25, 0xe8, 0x0d, 0x2e, 0x30, 0x87, 0x6f, 0x27, 0x41, 0x21, 0xa8, 0xb4, 0x46, 0xd6, 0x95, 0xb2, 0x2d, 0x29, 0x2a, 0xe4, 0x0c, 0x60, 0x7b, 0x61, 0xa7, 0x8f, 0x05, 0x3c, 0x36, 0x02, 0xf5, 0xcc, 0xc9, 0x9a, 0xf5, 0xfe, 0x51, 0xcc, 0x22, 0x90, 0x22, 0x9a, 0x66, 0xac, 0x35, 0x9e, 0x08, 0x26, 0xbd, 0xaf, 0x0b, 0xe6, 0x19, 0xee, 0x0d, 0x0f, 0x55, 0x48, 0x1c, 0xd3, 0xc6, 0x88, 0x54, 0xe8, 0x41, 0x2a, 0x86, 0x2e, 0x5c, 0x97, 0x25, 0x93, 0xad, 0x80, 0xf1, 0x09, 0xa6, 0x6c, 0x88, 0x97, 0x20, 0x3e, 0x56, 0x6e, 0x58, 0xe2, 0x43, 0xc0, 0x48, 0xa4, 0xcb, 0xd6, 0xd2, 0xce, 0x3a, 0xc0, 0x9a, 0x77, 0x94, 0x0a, 0xb8, 0x9e, 0x41, 0x81, 0x82, 0xc5, 0xde, 0xaf, 0x4f, 0xe7, 0x8e, 0x6a, 0xfc, 0x7a, 0xa9, 0xc4, 0xc5, 0xf8, 0xee, 0x1a, 0x6c, 0x37, 0x41, 0x0b, 0xf0, 0x56, 0xb0, 0xef, 0xa9, 0x71, 0x05, 0x48, 
0x6c, 0x6c, 0xe4, 0x57, 0xa6, 0x60, 0xcc, 0x39, 0x71, 0x33, 0x68, 0x15, 0x35, 0xbd, 0x8f, 0x09, 0xab, 0x87, 0xe6, 0x98, 0x8b, 0x89, 0xd2, 0xff, 0x68, 0xb5, 0xbb, 0xe0, 0xf9, 0x12, 0x7b, 0xab, 0xb6, 0xa6, 0x24, 0x53, 0xbf, 0x15, 0x51, 0xf7, 0x4d, 0x33, 0x94, 0xfd, 0x80, 0x35, 0xa4, 0x95, 0xf0, 0x0c, 0xbd, 0xab, 0x45, 0x78, 0x51, 0xc8, 0xdd, 0x3d, 0x21, 0xa7, 0x74, 0x60, 0xfd, 0x4c, 0xf9, 0x6b, 0x33, 0xfa, 0x00, 0x91, 0x40, 0xf6, 0xc3, 0xef, 0xea, 0x50, 0x0b, 0x8e, 0x70, 0xd2, 0x09, 0x4e, 0x2b, 0xf1, 0xd7, 0x92, 0xb7, 0x76, 0x61, 0x57, 0xcd, 0x94, 0xa5, 0x68, 0xe2, 0x95, 0x0c, 0xe0, 0x2c, 0x93, 0x20, 0x05, 0x15, 0x8d, 0x7d, 0x3e, 0x22, 0xf1, 0xa7, 0x49, 0x2f, 0x04, 0x49, 0xf9, 0x47, 0x61, 0xe3, 0x19, 0x97, 0x22, 0x79, 0x36, 0x30, 0xb7, 0x9a, 0x2a, 0xe4, 0xde, 0x15, 0x23, 0x32, 0xd4, 0x7a, 0xca, 0x1f, 0x20, 0x9b, 0x7c, 0xdf, 0x08, 0x13, 0xd4, 0x37, 0xc6, 0x80, 0xa0, 0x79, 0x50, 0x1c, 0x58, 0x88, 0xb7, 0xa7, 0x92, 0xb2, 0x4d, 0x8f, 0xa7, 0x5b, 0x1c, 0xd0, 0xc2, 0x3e, 0x72, 0xf2, 0xec, 0x9a, 0x34, 0x4f, 0x05, 0xea, 0xe1, 0x3b, 0x3b, 0x78, 0xb0, 0x02, 0x59, 0x08, 0x0b, 0x4e, 0xc2, 0x78, 0xc5, 0x5e, 0xcd, 0xee, 0xdd, 0x5e, 0x28, 0x86, 0x2d, 0xbe, 0x20, 0xb4, 0x32, 0x47, 0xbb, 0xb1, 0x2c, 0x8a, 0x90, 0x5b, 0x73, 0x5b, 0x65, 0xad, 0x04, 0x54, 0x01, 0xcc, 0xbf, 0x6a, 0x0f, 0x01, 0x63, 0x97, 0xc3, 0xb6, 0x72, 0x99, 0xac, 0x6c, 0x06, 0x7a, 0x82, 0x4e, 0x4a, 0xe8, 0xa5, 0x7b, 0x1f, 0xea, 0x7b, 0x8f, 0x27, 0xbc, 0x5a, 0xec, 0xf5, 0xde, 0xf8, 0xdc, 0x26, 0x47, 0x98, 0x97, 0x79, 0x05, 0x85, 0x0f, 0xa0, 0x00, 0xa1, 0x4e, 0xa1, 0xb6, 0xca, 0xd8, 0x21, 0xfa, 0xdc, 0xf3, 0xf2, 0xef, 0xb9, 0x9a, 0x59, 0x70, 0xdd, 0x62, 0x17, 0x22, 0x4d, 0x1d, 0xe0, 0x4f, 0x5c, 0xdb, 0x87, 0x01, 0xb2, 0xb2, 0x47, 0x32, 0x15, 0x7a, 0xba, 0x7b, 0x80, 0xce, 0x7b, 0x94, 0xde, 0xd3, 0x08, 0xc2, 0xdd, 0x1b, 0x39, 0x9d, 0xbf, 0xa6, 0xc8, 0xc3, 0xe2, 0xa1, 0x98, 0x24, 0xfd, 0xce, 0x36, 0x1c, 0x27, 0x6c, 0xd6, 0x68, 0x24, 0x1c, 0x03, 0x6e, 0x09, 0xf5, 0x0e, 0xc2, 0x9a, 0x6b, 0xf6, 0xe8, 0x86, 0x34, 0xe6, 0x7d, 
0xb0, 0xa4, 0x0f, 0xa9, 0x65, 0xbf, 0x1f, 0x18, 0xcb, 0x02, 0x73, 0x26, 0xc5, 0xb0, 0x4d, 0x29, 0x6b, 0x13, 0xa3, 0x53, 0xde, 0xed, 0xf2, 0x8b, 0xdb, 0x8a, 0x35, 0x5c, 0xca, 0x88, 0x8b, 0xc3, 0xe3, 0x12, 0xc2, 0x35, 0xcd, 0x54, 0x44, 0xa3, 0xaf, 0x37, 0x66, 0x9a, 0x04, 0x23, 0xa8, 0x75, 0x07, 0x48, 0x4a, 0xb4, 0x82, 0x7e, 0xf5, 0xaa, 0x6d, 0xca, 0x09, 0xc9, 0x03, 0x88, 0x49, 0x6a, 0xb1, 0x06, 0x20, 0x7f, 0xe7, 0x3d, 0x87, 0xbb, 0x6a, 0x43, 0x2a, 0x2d, 0x28, 0xc0, 0x00, 0x7a, 0x45, 0x8a, 0x57, 0x8c, 0xef, 0xcd, 0x09, 0x4e, 0x0c, 0x76, 0x93, 0xfa, 0xdb, 0xbc, 0xbc, 0xa8, 0x84, 0xe4, 0xc8, 0x41, 0x3f, 0xa8, 0xa0, 0x52, 0x27, 0xa4, 0x6d, 0x87, 0xf6, 0xe4, 0xb0, 0x6f, 0x27, 0x4c, 0x7b, 0xf2, 0x0c, 0x74, 0x7d, 0x17, 0x71, 0xe6, 0x15, 0x21, 0xac, 0x9f, 0xca, 0x6b, 0xdd, 0xa4, 0x3d, 0x5b, 0x88, 0xc9, 0xcc, 0x18, 0x9b, 0x1e, 0x25, 0x44, 0x56, 0x1e, 0x5f, 0x9a, 0x83, 0x68, 0xc6, 0xfc, 0x45, 0x71, 0x7e, 0x16, 0x7f, 0x0c, 0xa9, 0x1f, 0x51, 0x63, 0x9a, 0x93, 0x94, 0x8f, 0x1c, 0x35, 0x23, 0x9a, 0x23, 0x67, 0x4c, 0x74, 0xed, 0x0c, 0x00, 0xe5, 0xbb, 0x5c, 0x4c, 0xee, 0x28, 0x6b, 0x00, 0xa1, 0x54, 0xff, 0x9b, 0xa9, 0x72, 0x25, 0x03, 0xc3, 0x66, 0x38, 0x75, 0x43, 0xbd, 0xb3, 0x80, 0x12, 0x90, 0xac, 0xd6, 0x70, 0xab, 0x2d, 0x72, 0x11, 0x6e, 0xcf, 0xe7, 0x5b, 0x7c, 0x8c, 0xe4, 0xf4, 0x47, 0x6f, 0xe6, 0xc9, 0x9f, 0x93, 0xeb, 0xee, 0xfc, 0xbf, 0xe2, 0x55, 0xe0, 0x04, 0xda, 0x7c, 0xc8, 0xc2, 0x0a, 0x0b, 0xea, 0xb1, 0xe1, 0xc2, 0xf7, 0x2d, 0x49, 0x9f, 0xec, 0x3f, 0x9b, 0xd8, 0x59, 0xbd, 0x99, 0xb0, 0x87, 0x6d, 0x9f, 0xb7, 0x86, 0xf2, 0x66, 0xe1, 0xfd, 0xd6, 0xbf, 0xc7, 0x37, 0x69, 0xbf, 0xae, 0x58, 0x01, 0x43, 0xf2, 0x23, 0xaa, 0x69, 0x84, 0x0b, 0xf1, 0x69, 0xa0, 0x12, 0xd3, 0xe0, 0x06, 0x27, 0x82, 0xdc, 0xc1, 0xf7, 0x6f, 0xfe, 0xb4, 0x58, 0x27, 0x4e, 0x9e, 0xfe, 0xb9, 0xdb, 0xa8, 0x95, 0x3f, 0x02, 0x0a, 0x3c, 0x9a, 0x2b, 0x71, 0x0f, 0xed, 0x22, 0x01, 0x44, 0x76, 0x13, 0x89, 0xb6, 0x66, 0xd8, 0xb3, 0xdb, 0xca, 0x8d, 0xb2, 0xd6, 0x69, 0x97, 0x62, 0x95, 0x43, 0x63, 0x74, 0xdb, 0x8e, 0x15, 
0x9d, 0x7a, 0x20, 0x27, 0x1c, 0x37, 0x0a, 0xa8, 0x40, 0x38, 0x57, 0xe0, 0x0e, 0x06, 0x86, 0x75, 0x07, 0x94, 0x8b, 0x42, 0xba, 0xf6, 0xb7, 0x8f, 0x1a, 0xbf, 0x01, 0x2c, 0x5b, 0xe2, 0x3a, 0x3b, 0xad, 0x0c, 0x12, 0xbc, 0x77, 0x8c, 0x94, 0x66, 0xfe, 0xda, 0x95, 0x56, 0xc6, 0xf2, 0xff, 0x70, 0xab, 0x1d, 0xcf, 0xda, 0x22, 0xbb, 0x0e, 0x63, 0xe1, 0x81, 0x88, 0x70, 0x71, 0x97, 0xa8, 0x67, 0x43, 0x28, 0x42, 0x50, 0x7a, 0xbf, 0xf9, 0x63, 0x50, 0xe9, 0x0e, 0x4d, 0x4e, 0x4b, 0x41, 0x17, 0x64, 0xe1, 0x5e, 0x66, 0x12, 0xe8, 0x2b, 0x27, 0xd9, 0x83, 0x74, 0x17, 0x4e, 0xfd, 0x11, 0x96, 0xfd, 0x50, 0x21, 0x75, 0x0d, 0x7e, 0xe8, 0xaa, 0xc0, 0xb7, 0x93, 0x82, 0x13, 0x60, 0x05, 0xd0, 0x19, 0x01, 0x7a, 0x87, 0x6c, 0xbf, 0x70, 0x24, 0x2d, 0xb6, 0xeb, 0x7a, 0x76, 0x2c, 0xc8, 0x3b, 0x10, 0xd6, 0x1c, 0xa2, 0x0e, 0x4f, 0x38, 0x0c, 0x29, 0x56, 0xa0, 0xe6, 0xbd, 0x30, 0x81, 0xb0, 0xaa, 0x34, 0x24, 0x37, 0xe9, 0x91, 0xe4, 0x58, 0x0b, 0xa9, 0x7d, 0xfe, 0x42, 0xf5, 0x1a, 0x3d, 0x8b, 0xa1, 0xd7, 0x34, 0x97, 0x2d, 0xeb, 0xe7, 0xb5, 0xd9, 0x69, 0x0b, 0x52, 0x5c, 0xe0, 0xba, 0xcf, 0x83, 0xa0, 0xc3, 0x8e, 0x41, 0x91, 0x78, 0xca, 0xd4, 0xf3, 0xe4, 0x02, 0x4c, 0x04, 0x0e, 0xcc, 0x4e, 0x71, 0x61, 0x92, 0x3a, 0x4f, 0x61, 0x8f, 0xb8, 0xfa, 0xb2, 0xbc, 0x94, 0xef, 0x46, 0xe9, 0x55, 0x74, 0x51, 0xd4, 0xad, 0xe1, 0x58, 0x99, 0x5e, 0x5b, 0x29, 0xcd, 0xc9, 0xd9, 0x63, 0x8e, 0x98, 0x1f, 0xe0, 0xf1, 0x41, 0x1c, 0xba, 0x4c, 0x2c, 0x76, 0x79, 0xf1, 0xfe, 0xc7, 0x61, 0x11, 0xed, 0x3f, 0x43, 0xe8, 0x1e, 0xa8, 0x0c, 0xec, 0xff, 0x43, 0xcd, 0x3e, 0x7f, 0xc0, 0xe5, 0x6f, 0xb6, 0xef, 0xf3, 0x25, 0x3d, 0x3a, 0xac, 0x9f, 0x39, 0xe5, 0x42, 0x95, 0xeb, 0x59, 0xcf, 0xfa, 0x03, 0x13, 0x33, 0x5e, 0x5b, 0x40, 0x5f, 0x3c, 0xed, 0x6b, 0xeb, 0x44, 0xf2, 0x59, 0x0b, 0xea, 0xf4, 0x1d, 0xda, 0xd7, 0x94, 0x77, 0xef, 0xe9, 0x13, 0x00, 0x23, 0x0f, 0x08, 0x48, 0x11, 0x05, 0xd4, 0x47, 0xb5, 0xcf, 0x65, 0x75, 0x5c, 0x14, 0xd7, 0x9e, 0x51, 0x74, 0xba, 0xcf, 0x91, 0x74, 0x66, 0xa4, 0xe2, 0x8a, 0x04, 0x4c, 0xdb, 0x2d, 0x4b, 0x25, 0x70, 0xa9, 
0x2b, 0xe1, 0x0b, 0x4c, 0x90, 0xcf, 0xb1, 0x09, 0x6f, 0xac, 0x96, 0x82, 0xeb, 0x66, 0xec, 0x19, 0xf4, 0xf8, 0xb8, 0xc2, 0x00, 0x2c, 0xc5, 0xdc, 0x6c, 0xa4, 0x01, 0xad, 0x8b, 0x10, 0xba, 0x4d, 0x7b, 0xac, 0x00, 0xa8, 0x35, 0xc7, 0xab, 0x3f, 0x08, 0x0c, 0x8a, 0xce, 0x54, 0xc8, 0x2e, 0xfb, 0x7b, 0x90, 0x51, 0x0a, 0xcd, 0x31, 0xbe, 0x7f, 0xa0, 0x47, 0xd7, 0xf9, 0xe6, 0x78, 0x7d, 0x44, 0x10, 0x94, 0x7b, 0x1a, 0xf4, 0x43, 0x80, 0x81, 0x3f, 0xa2, 0xa7, 0x41, 0xf8, 0x82, 0x64, 0x44, 0x08, 0x84, 0x1a, 0x41, 0xc0, 0x96, 0xf5, 0x7d, 0x80, 0x5d, 0xb5, 0x7a, 0x36, 0x2b, 0x9d, 0xec, 0x9c, 0x06, 0xf1, 0x39, 0x0a, 0xf1, 0x14, 0x47, 0x27, 0x5a, 0x1e, 0xd8, 0xee, 0x17, 0xa7, 0xce, 0x76, 0xf8, 0x34, 0x39, 0x38, 0xf6, 0x91, 0xb6, 0x37, 0xcc, 0xfa, 0x98, 0xd4, 0xd5, 0xa4, 0x60, 0xde, 0xc7, 0x6a, 0x38, 0x84, 0xde, 0x4c, 0x36, 0xbb, 0x35, 0x67, 0xba, 0xaf, 0x75, 0x01, 0x8e, 0xb2, 0xda, 0x7d, 0x72, 0x0d, 0xb9, 0xe4, 0xc3, 0xd8, 0x0d, 0xf7, 0x66, 0x0b, 0xac, 0x86, 0x2c, 0x51, 0x44, 0xc1, 0xec, 0x1b, 0xbb, 0x96, 0x00, 0x6e, 0xee, 0xe1, 0xc7, 0x7e, 0xde, 0xa3, 0x99, 0xea, 0xcd, 0x54, 0x3f, 0x05, 0x78, 0x1a, 0xca, 0x7c, 0x3c, 0x68, 0x6a, 0xd9, 0xba, 0x39, 0x4b, 0x82, 0x3d, 0xf5, 0xcd, 0xc9, 0xe1, 0x6f, 0x7c, 0x9f, 0x0e, 0x25, 0xe5, 0xf2, 0x1a, 0xcc, 0xad, 0x76, 0x78, 0x2e, 0x7c, 0x26, 0xf7, 0x5c, 0x40, 0xdb, 0x2d, 0x62, 0xf6, 0x37, 0x1e, 0x6d, 0xa1, 0x68, 0x02, 0x89, 0x51, 0x98, 0x8f, 0x8b, 0x59, 0x31, 0xb9, 0x3e, 0x6f, 0x46, 0xb3, 0xc5, 0x70, 0xc3, 0x71, 0x62, 0x93, 0x93, 0x7e, 0xd9, 0x7d, 0xcc, 0xc9, 0xf9, 0x77, 0xd0, 0x26, 0x3a, 0x99, 0x16, 0x91, 0xae, 0xbe, 0x55, 0x22, 0x8a, 0x67, 0xc6, 0x4d, 0x0d, 0x60, 0x1c, 0xfd, 0xf6, 0xa9, 0x16, 0xfd, 0x2b, 0x80, 0xbc, 0x70, 0x05, 0xaf, 0xa8, 0x9c, 0xec, 0x06, 0xf3, 0x60, 0xa2, 0x2b, 0xa7, 0xf2, 0x45, 0x82, 0x7f, 0xb3, 0x44, 0x65, 0xee, 0x2d, 0xc5, 0x90, 0x2b, 0x37, 0x6f, 0xf4, 0x8b, 0xd2, 0x9f, 0x92, 0x90, 0xce, 0x40, 0x8c, 0x83, 0xbf, 0x93, 0x1c, 0x2e, 0x22, 0x7e, 0xbf, 0x36, 0x32, 0x12, 0x17, 0x6a, 0xb4, 0xe5, 0xc4, 0xab, 0x3c, 0x99, 0xdc, 0x8e, 
0x60, 0x05, 0x42, 0xae, 0x7e, 0xdd, 0xf3, 0x42, 0xef, 0x90, 0xd6, 0x0a, 0x03, 0x87, 0x48, 0xa7, 0x89, 0xf4, 0xec, 0x74, 0x01, 0x2b, 0xbd, 0x24, 0x2e, 0x5b, 0x06, 0xb9, 0xe4, 0x32, 0xb6, 0x23, 0x65, 0xbd, 0x45, 0xf7, 0xb3, 0x88, 0x77, 0xa1, 0x71, 0xb2, 0x14, 0xe3, 0x46, 0x3b, 0xb6, 0x4f, 0x55, 0xd8, 0xcc, 0xe4, 0x6f, 0xb8, 0xa6, 0x22, 0x98, 0x94, 0x70, 0x01, 0x8c, 0xdb, 0xf7, 0xaa, 0xe5, 0xe4, 0x95, 0x53, 0x78, 0x62, 0xcc, 0x67, 0x59, 0x5c, 0x91, 0x8a, 0xb5, 0x6a, 0x72, 0xb2, 0x1c, 0xcc, 0xdc, 0xc8, 0x37, 0x4d, 0x21, 0xc6, 0x71, 0xcb, 0x51, 0xef, 0x07, 0xf6, 0xcb, 0xa2, 0x85, 0xd7, 0xff, 0xaa, 0x60, 0xbe, 0x07, 0x63, 0x9e, 0x44, 0xc8, 0x1c, 0x71, 0x40, 0xe2, 0xd2, 0xbd, 0x29, 0xcf, 0xc0, 0x0f, 0xb9, 0x00, 0xce, 0x78, 0x08, 0x5b, 0xe4, 0x5e, 0xfd, 0x14, 0x0d, 0x33, 0xf6, 0xc5, 0x5a, 0xec, 0x83, 0xe0, 0xcf, 0xdc, 0x73, 0xcf, 0xfc, 0x68, 0x7c, 0xe8, 0x4c, 0x23, 0x5f, 0xbc, 0xba, 0xc0, 0x99, 0x4b, 0x2a, 0xd4, 0x27, 0xe9, 0x9b, 0x27, 0xae, 0x94, 0xfe, 0x4d, 0xc4, 0xc7, 0xcc, 0x7c, 0x3e, 0xc2, 0x3b, 0xb8, 0xfe, 0xf1, 0xc1, 0xeb, 0xe8, 0x4c, 0xe1, 0x5b, 0x0c, 0x98, 0x43, 0x9b, 0x73, 0x17, 0xa4, 0x1d, 0x36, 0x15, 0x47, 0xcf, 0x68, 0x62, 0x96, 0x4f, 0xd4, 0x01, 0xca, 0x80, 0xac, 0x4f, 0xa9, 0x5a, 0xe6, 0x82, 0x71, 0xde, 0x20, 0xe8, 0x22, 0x75, 0xa1, 0x68, 0x28, 0xca, 0x07, 0xf7, 0xe2, 0x61, 0x86, 0xca, 0xcb, 0xa6, 0x3a, 0xd5, 0x98, 0x57, 0xdc, 0x0d, 0xa0, 0x02, 0xe1, 0x5f, 0xa2, 0xfb, 0x57, 0x78, 0x0e, 0x5f, 0x99, 0x1f, 0x63, 0x61, 0x59, 0x6e, 0x86, 0xaf, 0xc5, 0xf7, 0x20, 0xb2, 0xc9, 0x8b, 0x26, 0x0b, 0x08, 0xc8, 0x74, 0x03, 0x4d, 0x6d, 0xdb, 0xaf, 0x6f, 0x3d, 0x25, 0xdd, 0x1e, 0x54, 0xf8, 0x0a, 0x79, 0xa6, 0x3f, 0x6b, 0x4a, 0x70, 0x0d, 0x6f, 0x97, 0xed, 0x18, 0x9f, 0xc1, 0x97, 0x9f, 0x82, 0x03, 0x02, 0x1f, 0x7c, 0x57, 0x06, 0x49, 0x3e, 0x03, 0x96, 0x03, 0x3f, 0x97, 0x74, 0xb8, 0xd4, 0xe2, 0xf3, 0xeb, 0xcf, 0x7d, 0x1a, 0xad, 0x58, 0x3c, 0x75, 0xe8, 0xd2, 0x00, 0x06, 0x53, 0x1b, 0x31, 0xbd, 0xea, 0x40, 0xeb, 0xd2, 0x5b, 0x70, 0x1d, 0xd7, 0x73, 0xec, 0x7b, 0x03, 0x9e, 0x24, 
0xe2, 0x7f, 0x89, 0xa6, 0x45, 0x5d, 0xb0, 0x69, 0xbd, 0xfb, 0x44, 0x19, 0x9b, 0xd2, 0x70, 0x92, 0x05, 0xa3, 0x27, 0x33, 0xe4, 0x9e, 0x24, 0xec, 0x59, 0x36, 0x01, 0x8d, 0xdb, 0xa9, 0x30, 0xe6, 0x1e, 0x75, 0xf2, 0x2f, 0xc4, 0x48, 0x7d, 0x75, 0xfd, 0xac, 0xf2, 0xc5, 0xee, 0xd7, 0x85, 0x7f, 0xaf, 0x16, 0xde, 0x89, 0xaa, 0x8d, 0x30, 0x3f, 0xf8, 0x49, 0x7e, 0x3d, 0x16, 0xb8, 0xf1, 0x52, 0xfb, 0x87, 0x30, 0xe5, 0x61, 0x7b, 0x47, 0xf7, 0x87, 0xd3, 0x5d, 0x82, 0x10, 0x94, 0x28, 0x24, 0xd7, 0xd6, 0xc5, 0x83, 0xbb, 0xdd, 0x78, 0x98, 0xe2, 0x4c, 0xf1, 0x77, 0xb8, 0x1b, 0xdf, 0x96, 0x3e, 0xc8, 0xf2, 0xc0, 0xcf, 0x63, 0x5c, 0xb5, 0x25, 0xfc, 0x52, 0xdb, 0x94, 0x8a, 0x18, 0x6d, 0xdf, 0xff, 0xe9, 0xc6, 0x30, 0xcb, 0xe1, 0x3a, 0xd5, 0x60, 0x9a, 0x58, 0xf3, 0x5c, 0x6d, 0x8c, 0xdf, 0x8a, 0x6b, 0x0e, 0x83, 0xf8, 0x1a, 0xaa, 0x20, 0xcc, 0xec, 0xa3, 0x59, 0x04, 0x1c, 0x47, 0xa3, 0x4e, 0x84, 0x6d, 0xd1, 0xfe, 0x30, 0xec, 0x10, 0x75, 0x71, 0x5a, 0x56, 0xfb, 0x4e, 0xcf, 0xa6, 0x78, 0xef, 0x93, 0xf7, 0xc7, 0x5a, 0x59, 0x34, 0x90, 0x5f, 0x77, 0xc1, 0x4d, 0xd4, 0x1b, 0x72, 0x25, 0xf6, 0xb2, 0xca, 0xe3, 0xdc, 0x5e, 0x7f, 0x30, 0xa2, 0x4b, 0x2b, 0x1d, 0x5a, 0x49, 0xde, 0xa4, 0xbc, 0x36, 0xd2, 0xcd, 0x95, 0x9e, 0x96, 0xa3, 0x03, 0xc3, 0x20, 0xcd, 0x54, 0x46, 0x96, 0x8d, 0xc3, 0x70, 0xe9, 0xdf, 0xd2, 0x28, 0x11, 0x5d, 0x66, 0x95, 0x1a, 0x38, 0xf4, 0xce, 0x80, 0x07, 0xa8, 0x73, 0x83, 0x2b, 0x4e, 0x7c, 0xde, 0xa5, 0x1d, 0x16, 0xa5, 0xf0, 0xff, 0x4e, 0xee, 0x51, 0xfe, 0x73, 0x9f, 0xe1, 0x91, 0xd7, 0x5d, 0x08, 0x3b, 0xe2, 0x88, 0x54, 0xf9, 0xd5, 0x63, 0x17, 0x9a, 0xcc, 0xc4, 0x75, 0x9e, 0x85, 0x6d, 0x11, 0xde, 0xb3, 0x43, 0x6d, 0xee, 0xbd, 0x50, 0x4b, 0xcf, 0xee, 0x8f, 0x02, 0xe9, 0x8d, 0xa5, 0xd5, 0xd2, 0x06, 0x31, 0x0c, 0x41, 0x51, 0xd7, 0x7b, 0xef, 0xe5, 0x00, 0x28, 0x1e, 0x55, 0x7b, 0xbb, 0x17, 0x45, 0x81, 0x2e, 0x79, 0xe6, 0xa0, 0x02, 0x0e, 0x14, 0xee, 0x6d, 0x93, 0x4e, 0x3c, 0xf3, 0xb2, 0x49, 0x52, 0x36, 0x29, 0xdd, 0xc8, 0xf5, 0x94, 0x9f, 0xae, 0x19, 0x4e, 0x4a, 0xba, 0xdd, 0x14, 0x30, 0x13, 
0x90, 0xc6, 0x02, 0xd3, 0x06, 0x78, 0x6e, 0x5f, 0xd3, 0x14, 0x31, 0xf7, 0x84, 0x0d, 0x56, 0xc4, 0x79, 0xf9, 0x82, 0x01, 0x88, 0xf2, 0xe7, 0x9b, 0x68, 0xef, 0x16, 0x28, 0xdb, 0x2a, 0x35, 0xb9, 0x4a, 0xd4, 0x41, 0x93, 0x75, 0x6f, 0xf8, 0xfb, 0xfe, 0x2f, 0xe7, 0x95, 0x12, 0xd5, 0x0a, 0xec, 0xe3, 0xa9, 0x70, 0x18, 0x1b, 0xd1, 0x03, 0x77, 0x73, 0xf0, 0x52, 0xec, 0x5d, 0x01, 0x55, 0x0c, 0xc7, 0xa7, 0x88, 0xd2, 0xc0, 0x0e, 0x3e, 0xd3, 0xae, 0xa2, 0x59, 0x86, 0xc9, 0x8f, 0xf1, 0x7d, 0x93, 0x21, 0x36, 0xb4, 0x0a, 0xc3, 0x8d, 0xef, 0xde, 0x39, 0x4d, 0xf8, 0x46, 0xf6, 0x7e, 0x00, 0x56, 0x4e, 0xb9, 0x9a, 0x4d, 0xe5, 0x3c, 0xa8, 0x5a, 0x75, 0x9e, 0x51, 0xaa, 0xca, 0xe6, 0x15, 0x57, 0x45, 0x12, 0x80, 0x1e, 0xf0, 0x25, 0x96, 0xc0, 0x8f, 0xd0, 0x8d, 0x6f, 0x34, 0x71, 0xa8, 0xb6, 0x8f, 0xc0, 0x73, 0x63, 0x19, 0x41, 0xd3, 0xb3, 0x25, 0x48, 0x37, 0x1f, 0x2e, 0xc6, 0x5f, 0xce, 0x97, 0x81, 0x90, 0xbb, 0x7f, 0xe8, 0x6f, 0x23, 0xbf, 0x8b, 0x33, 0xe6, 0x85, 0xd8, 0x99, 0xff, 0x7e, 0xe9, 0xe6, 0x33, 0xbb, 0x23, 0x1d, 0xe0, 0xbf, 0x35, 0x74, 0xf4, 0xfb, 0x71, 0xb6, 0xe9, 0xac, 0xe4, 0x09, 0xe5, 0x8e, 0x9a, 0x7a, 0xe6, 0xf3, 0x72, 0x46, 0x92, 0x81, 0x8d, 0xf6, 0x20, 0x1d, 0x61, 0x44, 0x88, 0x83, 0x53, 0x03, 0xe5, 0xf7, 0xdb, 0x4f, 0x9e, 0x72, 0x17, 0xe4, 0x76, 0xca, 0x54, 0x75, 0x72, 0xec, 0x11, 0x31, 0x3d, 0xc9, 0x43, 0x63, 0x52, 0xce, 0x04, 0xc0, 0x8f, 0x67, 0xcd, 0xa2, 0xe4, 0x73, 0xf6, 0x3b, 0xc8, 0xce, 0x6b, 0x74, 0x19, 0x6c, 0x96, 0x62, 0x42, 0x21, 0x2a, 0xe6, 0x28, 0x71, 0xe8, 0xe7, 0x11, 0xd1, 0xf6, 0x96, 0x91, 0x73, 0xa6, 0x5e, 0xb2, 0x22, 0x99, 0x1d, 0xe9, 0x5c, 0xf5, 0x0f, 0xc7, 0x85, 0xb9, 0x1d, 0x52, 0x73, 0x08, 0x69, 0x5a, 0x01, 0x9d, 0xeb, 0x9d, 0x6f, 0x04, 0xfa, 0x84, 0x03, 0x03, 0xfc, 0xdc, 0xfb, 0x29, 0xad, 0xec, 0x4a, 0x12, 0x61, 0x42, 0xf3, 0x40, 0x48, 0x3f, 0x8c, 0x5f, 0xbe, 0x3c, 0xc5, 0xec, 0x30, 0x04, 0x26, 0x9d, 0x06, 0x46, 0x27, 0x73, 0x9f, 0x73, 0xeb, 0x96, 0xd8, 0x98, 0x93, 0xfa, 0x5c, 0xb7, 0x16, 0x38, 0x71, 0xeb, 0x92, 0x23, 0xc6, 0xce, 0xab, 0x89, 0xe5, 0x71, 
0xcf, 0x1f, 0x63, 0x3d, 0xf9, 0xcd, 0xad, 0xe5, 0xae, 0x1e, 0xa6, 0x0c, 0x0b, 0xf2, 0xee, 0x22, 0x86, 0xfb, 0xe9, 0xfe, 0xf9, 0x73, 0xac, 0x79, 0x5d, 0x8b, 0xa7, 0xfc, 0x1f, 0xf0, 0x6b, 0x54, 0x81, 0x4e, 0x91, 0x0b, 0x55, 0x67, 0x42, 0x8a, 0xa6, 0x24, 0x7f, 0x94, 0x31, 0x70, 0x17, 0xef, 0xad, 0x92, 0xf6, 0xf0, 0xa2, 0x20, 0x3e, 0xed, 0xe2, 0x72, 0xf0, 0x66, 0xec, 0x25, 0x73, 0x0e, 0x96, 0x4c, 0x55, 0x6c, 0x7e, 0xa8, 0xe0, 0xc1, 0x1c, 0xe2, 0xea, 0x6d, 0x8f, 0xf2, 0xa4, 0x44, 0x02, 0x7e, 0xf6, 0x16, 0x05, 0x47, 0xcf, 0xa8, 0xa8, 0xf5, 0xa7, 0xe5, 0x97, 0x41, 0x2e, 0x3c, 0x97, 0xd3, 0xa3, 0x74, 0x91, 0x3b, 0x8f, 0x9e, 0x12, 0x16, 0xb3, 0x2d, 0x71, 0x07, 0x21, 0x10, 0x74, 0x18, 0xfe, 0xd0, 0x91, 0x95, 0x8d, 0x40, 0x82, 0xa4, 0xfa, 0x98, 0x04, 0xf8, 0x12, 0xe5, 0xa9, 0x94, 0x25, 0x86, 0x07, 0x98, 0xc7, 0xc7, 0xfe, 0x75, 0xc0, 0x6d, 0xb9, 0x2e, 0xb4, 0xe7, 0x69, 0x88, 0x46, 0x93, 0x16, 0xa8, 0xb5, 0x01, 0x59, 0xff, 0x4e, 0x7d, 0x88, 0x2c, 0xec, 0x89, 0xa5, 0x50, 0x03, 0x89, 0x7e, 0x6d, 0xc9, 0x9b, 0xe0, 0xd4, 0xa4, 0x5e, 0xff, 0xae, 0xf2, 0x97, 0xb9, 0xcc, 0x8a, 0xbd, 0x81, 0x98, 0x37, 0xdc, 0x87, 0x42, 0x4d, 0x98, 0x17, 0x7b, 0x12, 0x13, 0xe9, 0xd4, 0x1d, 0x71, 0x00, 0x83, 0x98, 0x8c, 0xac, 0x01, 0x8c, 0x36, 0xb2, 0xef, 0xc9, 0xa3, 0xee, 0xb8, 0x48, 0x7a, 0x7e, 0x84, 0x2a, 0x24, 0xb5, 0xaa, 0xb7, 0x07, 0x5a, 0x35, 0x13, 0x8b, 0xc2, 0xa1, 0x7b, 0xb5, 0xbf, 0xdd, 0xc5, 0x05, 0x90, 0xe8, 0x5c, 0x18, 0xb9, 0x68, 0x05, 0xf5, 0xa5, 0xbd, 0x4a, 0x4c, 0xb1, 0x5a, 0xf6, 0x79, 0x7e, 0x92, 0x98, 0x07, 0x19, 0x9b, 0x7b, 0x77, 0x9c, 0x46, 0xd5, 0x4a, 0xc8, 0xf7, 0x78, 0x7f, 0x33, 0x0e, 0xcb, 0x92, 0x2c, 0x71, 0x80, 0x24, 0x97, 0xe9, 0x00, 0xc9, 0x0f, 0xef, 0x37, 0x61, 0xb2, 0x63, 0xc4, 0xc3, 0x16, 0xcb, 0x67, 0xf7, 0x22, 0x4e, 0x55, 0x8e, 0x66, 0x03, 0x03, 0x8a, 0xab, 0xbe, 0xea, 0x04, 0x8a, 0x3d, 0x00, 0x22, 0x0e, 0x95, 0x7e, 0x9d, 0x52, 0x51, 0x59, 0x84, 0xea, 0x5c, 0xa5, 0x6a, 0x10, 0xe3, 0x6b, 0xb3, 0xb8, 0x8d, 0xf0, 0x83, 0x96, 0xde, 0x10, 0x22, 0x1c, 0xce, 0x2a, 0x37, 0x7f, 
0xf2, 0xb4, 0x7a, 0x55, 0xce, 0x8e, 0x35, 0xfb, 0x58, 0x4b, 0xd1, 0xff, 0x03, 0xa9, 0xee, 0xbb, 0x6b, 0xe9, 0xa8, 0x97, 0xd4, 0x76, 0xcf, 0x6f, 0xc9, 0x80, 0x67, 0x6a, 0xd0, 0xa4, 0x16, 0x4d, 0x5e, 0xe7, 0x95, 0xe5, 0x0f, 0xa6, 0xf3, 0x91, 0xee, 0x97, 0x3d, 0xc7, 0x32, 0x80, 0x08, 0x0c, 0xce, 0x6b, 0x6c, 0xc7, 0x73, 0xf6, 0x59, 0x87, 0x95, 0x1d, 0x9b, 0x0d, 0x8c, 0x39, 0x94, 0xfd, 0x54, 0x4d, 0xf2, 0x71, 0x9a, 0xfd, 0xcf, 0x30, 0xc0, 0xfb, 0x8f, 0xc3, 0xfa, 0x6e, 0xb5, 0x51, 0xf8, 0x2f, 0xa5, 0x89, 0xc3, 0x23, 0xc4, 0x67, 0x2e, 0xd2, 0x77, 0xf7, 0x05, 0x59, 0x86, 0x50, 0xd5, 0x32, 0xc9, 0x01, 0xaa, 0x3c, 0xb5, 0x6f, 0xf7, 0x8a, 0xc3, 0x75, 0x96, 0xbc, 0x88, 0x54, 0x9f, 0xf3, 0x28, 0x8e, 0xa7, 0x30, 0x53, 0x54, 0xd1, 0xca, 0x4e, 0x9c, 0xc5, 0x31, 0x64, 0xc0, 0x59, 0xc7, 0x5a, 0xb3, 0xae, 0x8a, 0x39, 0x76, 0xbd, 0x12, 0x7b, 0x53, 0xc6, 0x1c, 0x4d, 0x5e, 0xda, 0xa3, 0x32, 0x48, 0x77, 0x77, 0xf9, 0x58, 0xb0, 0xfc, 0x64, 0x20, 0xc8, 0xd1, 0xa1, 0x96, 0x51, 0xeb, 0x43, 0x85, 0xf1, 0x26, 0x7a, 0x5d, 0xc7, 0xfd, 0x9a, 0x5b, 0xc3, 0x3b, 0xd6, 0x76, 0x9b, 0xe4, 0x55, 0x6f, 0x75, 0xe4, 0x36, 0x1e, 0xf8, 0x37, 0x0c, 0xbf, 0xb1, 0xf6, 0xac, 0x3a, 0x56, 0x67, 0xea, 0xcb, 0x8b, 0x5b, 0x36, 0xf8, 0x51, 0x42, 0x01, 0x42, 0x18, 0xdc, 0x91, 0xd5, 0x00, 0x5b, 0x61, 0x59, 0x73, 0xb6, 0xeb, 0x92, 0xf2, 0x35, 0xaf, 0xbc, 0xec, 0xd6, 0x9e, 0x64, 0xe1, 0x86, 0x73, 0x56, 0xac, 0xc3, 0x02, 0x99, 0xad, 0x2b, 0x62, 0x4a, 0xbf, 0x3b, 0x87, 0x14, 0xb7, 0x64, 0x04, 0x8c, 0x51, 0xcb, 0x16, 0xf0, 0x11, 0xcd, 0xa2, 0xc8, 0xa1, 0xc9, 0x9f, 0x6b, 0xcb, 0xa5, 0xe2, 0xe1, 0x8d, 0xc2, 0xbc, 0x82, 0x38, 0x4b, 0x6a, 0x01, 0x6d, 0x6c, 0xf3, 0x2a, 0x5b, 0x8f, 0x2d, 0x85, 0x51, 0x62, 0x55, 0x21, 0x52, 0x48, 0xd6, 0x06, 0x53, 0xc3, 0x8b, 0x8f, 0x0f, 0x74, 0x86, 0x3d, 0xf4, 0xf7, 0x47, 0x0e, 0xae, 0x62, 0x95, 0xa6, 0xb5, 0xca, 0x27, 0xf6, 0x00, 0x13, 0xbe, 0x77, 0x1a, 0x03, 0xb5, 0xe4, 0x32, 0x9e, 0x6a, 0xe4, 0xff, 0x70, 0xb5, 0x41, 0x9a, 0xa0, 0x42, 0x4f, 0xf6, 0x87, 0x56, 0xf4, 0x05, 0xf3, 0x60, 0x2f, 0xa1, 
0x7c, 0x01, 0x3f, 0xcf, 0xe3, 0x55, 0x15, 0xbb, 0x19, 0x84, 0x12, 0x97, 0xbd, 0x4b, 0x94, 0xd4, 0x55, 0x47, 0xd7, 0x8d, 0x45, 0x3f, 0x5a, 0x30, 0x9a, 0xc6, 0xb7, 0xef, 0xa2, 0xa3, 0xec, 0xb3, 0x39, 0x66, 0x5a, 0xc0, 0x9c, 0xdc, 0xcb, 0x4e, 0x5e, 0x7e, 0x7b, 0x91, 0x0f, 0x72, 0x15, 0x50, 0x71, 0x0f, 0x02, 0x62, 0x01, 0x19, 0x5c, 0xa0, 0x44, 0xa1, 0xa9, 0x25, 0x4f, 0x21, 0x6c, 0x09, 0x03, 0xa7, 0x3c, 0xa2, 0x9f, 0xec, 0x9c, 0x4c])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext

def test_kravatte_WBC_k_16_msg_3399_tweak_16(self, test_workers):
my_key = bytes([0x72, 0x72, 0x72, 0x72, 0x72, 0x72, 0x72, 0x72, 0x72, 0x72, 0x72, 0x72, 0x72, 0x72, 0x72, 0x72])
my_tweak = bytes([0xd8, 0xd1, 0xca, 0xc3, 0xbc, 0xb5, 0xae, 0xa7, 0xa0, 0x99, 0x92, 0x8b, 0x84, 0x7d, 0x76, 0x6f])
my_message = bytes([0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 
0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 
0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 
0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 
0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 
0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 
0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 
0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 
0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 
0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 
0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad, 0xad])
real_ciphertext = bytes([0x53, 0xa3, 0xc2, 0x1c, 0xdc, 0x8f, 0x55, 0x8a, 0x0a, 0xd8, 0x5d, 0xde, 0xdb, 0x57, 0x8a, 0x72, 0x91, 0x25, 0x74, 0xc0, 0x6c, 0x0c, 0x44, 0x59, 0x77, 0x2d, 0x5f, 0x24, 0xa5, 0x29, 0x69, 0x84, 0x2f, 0x06, 0x8a, 0xb1, 0xcc, 0xfe, 0x26, 0xd2, 0x71, 0x4c, 0xc8, 0xfa, 0x05, 0xd7, 0x65, 0x8a, 0x84, 0xf4, 0x4a, 0x6c, 0x48, 0xc7, 0xb4, 0x44, 0x95, 0xb2, 0x86, 0xfc, 0xe8, 0x08, 0x29, 0xc1, 0xa1, 0x99, 0x05, 0x17, 0xaa, 0xdc, 0xba, 0x88, 0x65, 0x23, 0x05, 0x93, 0x66, 0xac, 0xe6, 0x18, 0x8e, 0xc7, 0x94, 0x0c, 0x1b, 0x45, 0x57, 0xe1, 0xa2, 0xf5, 0xfd, 0x95, 0x1a, 0x9f, 0x3f, 0x20, 0xfc, 0xf0, 0x89, 0xa9, 0xdd, 0x56, 0xa9, 0x8e, 0xc1, 0x9f, 0x6d, 0xf4, 0x4b, 0x13, 0xbe, 0x2c, 0x56, 0x4f, 0x33, 0x3d, 0xc2, 0x06, 0x0f, 0xef, 0x4a, 0xd1, 0xa4, 0xf6, 0xd6, 0x57, 0x5b, 0xf0, 0x4b, 0x29, 0x62, 0x3c, 0x7b, 0x01, 0x90, 0xed, 0x79, 0xfe, 0x3a, 0x4b, 0xc4, 0xc1, 0xe6, 0x9a, 0xad, 0x28, 0x5a, 0x9d, 0x65, 0x86, 0x98, 0xf6, 0xff, 0x74, 0x21, 0xd0, 0xc7, 0xe4, 0x44, 0x13, 0xb5, 0xe7, 0x2e, 0x68, 0xdc, 0x25, 0x7a, 0x49, 0x13, 0x74, 0x12, 0xe7, 0xe1, 0x4c, 0xa2, 0xd9, 0x6e, 0xe0, 0x34, 0xb4, 0x40, 0xd7, 0x53, 0x76, 0x2b, 0x4e, 0x17, 0x05, 0x55, 0x37, 0x07, 0x4a, 0x89, 0x85, 0x8b, 0xbd, 0xec, 0xba, 0xe3, 0x26, 0x62, 0xd6, 0x6c, 0x0e, 0xcd, 0xe1, 0x17, 0x9a, 0xd0, 0x67, 0x5f, 0x09, 0xe4, 0x77, 0xf3, 0x85, 0x09, 0x8f, 0x2d, 0x85, 0x1b, 0xe4, 0x5d, 0x6e, 0x87, 0xba, 0xde, 0x2c, 0xeb, 0x76, 0xab, 0x18, 0x44, 0x02, 0x05, 0x74, 0xa0, 0xfe, 0x5d, 0xab, 0xfb, 0xfe, 0x0c, 0x13, 0xa4, 0xff, 0x5a, 0x26, 0xed, 0xa5, 0x4f, 0x23, 0x41, 0xec, 0x9a, 0x32, 0xcd, 0xd7, 0x7f, 0x9d, 0x9f, 0x12, 0xe9, 0x33, 0x5a, 0xd3, 0x61, 0xd4, 0x89, 0xd6, 0xa5, 0x6a, 0xea, 0x47, 0x17, 0x57, 0x73, 0xe8, 0xaf, 0xc3, 0xab, 0xca, 0x5c, 0x0b, 0x76, 0xa0, 0x9c, 0x02, 0x15, 0x08, 0xab, 0x65, 0x92, 0xd4, 0x17, 0xc7, 0xe3, 0xa2, 0x92, 0x4c, 0x5d, 0xb2, 0xcb, 0x9e, 0x66, 0x8d, 0x15, 0xa3, 0xe9, 0xd2, 0x25, 0xff, 0x3a, 0x85, 0xf0, 0x1a, 0xf3, 0x4b, 0x1a, 0x3a, 0x24, 0xeb, 0xc6, 0x7f, 0x8c, 0x96, 0xb2, 0xcf, 0x24, 
0xf0, 0x47, 0x11, 0xf7, 0xc1, 0x79, 0x32, 0x24, 0xe7, 0x63, 0x62, 0x58, 0x44, 0x12, 0xcb, 0x46, 0x59, 0xae, 0x1e, 0xf3, 0xf3, 0x97, 0xc5, 0x40, 0x1b, 0xc5, 0xfa, 0xed, 0x05, 0x23, 0x76, 0x56, 0xed, 0xd7, 0x2f, 0x85, 0xd3, 0x33, 0x65, 0xd0, 0x09, 0x28, 0x9a, 0x76, 0x05, 0x9b, 0xc8, 0x1f, 0x8b, 0x5d, 0xb5, 0xfe, 0xc6, 0xcb, 0xd7, 0x5d, 0xa8, 0xdb, 0x1b, 0x67, 0x3e, 0x75, 0xc8, 0x3c, 0x81, 0xd5, 0x3e, 0xb7, 0x84, 0x84, 0xf7, 0xb4, 0x90, 0xfb, 0x48, 0x97, 0xb5, 0xd2, 0x2b, 0x2b, 0xd5, 0x91, 0x9e, 0x2e, 0x28, 0x88, 0x12, 0xb2, 0xd1, 0x9a, 0x9c, 0x37, 0x5b, 0xed, 0x09, 0x4c, 0x92, 0x62, 0xc2, 0xcb, 0x04, 0x81, 0x84, 0x3f, 0x9e, 0x74, 0xfd, 0xf8, 0x65, 0x7c, 0x0e, 0x65, 0xb0, 0x3a, 0xd6, 0xa1, 0xf0, 0xf8, 0xa2, 0x33, 0xc0, 0x06, 0x47, 0xf0, 0x73, 0x0c, 0x21, 0x77, 0xf4, 0x80, 0x25, 0x03, 0x56, 0x58, 0x9d, 0xc3, 0x94, 0x0b, 0x57, 0x8b, 0x2c, 0xd0, 0x55, 0x2d, 0x78, 0x5b, 0x48, 0x46, 0xcf, 0x9c, 0x03, 0xa2, 0x62, 0x52, 0xd7, 0x7c, 0x92, 0xac, 0x0c, 0x13, 0x00, 0xda, 0x73, 0x35, 0x3b, 0x6b, 0x70, 0xb5, 0xab, 0x38, 0x6f, 0x4c, 0xf1, 0xc0, 0x7d, 0xac, 0xe2, 0x4e, 0x68, 0xdc, 0x82, 0x68, 0x44, 0xce, 0xcf, 0x68, 0xa5, 0xe8, 0xda, 0x6d, 0x6e, 0xa0, 0x2e, 0x2a, 0x5c, 0x48, 0x89, 0x4a, 0xcf, 0x03, 0xa9, 0xde, 0x07, 0x97, 0xda, 0xa3, 0x57, 0x4d, 0xa7, 0x21, 0x0e, 0x35, 0x11, 0x22, 0x3f, 0x62, 0xfa, 0xb6, 0xaa, 0x01, 0x23, 0x29, 0xc6, 0x9e, 0x8e, 0xaa, 0xe0, 0x92, 0x55, 0x9a, 0x76, 0xf2, 0xd5, 0xbc, 0xc1, 0x86, 0x5b, 0x29, 0x0a, 0x55, 0x3d, 0x2c, 0x2b, 0xaa, 0x75, 0x7e, 0xfe, 0x29, 0x87, 0x4f, 0xda, 0x41, 0xe4, 0x11, 0x6b, 0xc5, 0xa4, 0xb9, 0xa7, 0x80, 0x11, 0xc5, 0x47, 0xe8, 0xfd, 0x86, 0x23, 0xb5, 0x65, 0xea, 0x8e, 0xdc, 0x60, 0xc0, 0xb5, 0x4e, 0xf4, 0x84, 0x19, 0xa0, 0xb1, 0x83, 0xc4, 0x57, 0x80, 0x48, 0x8a, 0xf7, 0x72, 0xd5, 0x32, 0x99, 0xa5, 0xce, 0x5d, 0x96, 0x7f, 0xbe, 0x5a, 0x7f, 0xda, 0x0e, 0x58, 0xd4, 0x8f, 0xbe, 0xb1, 0xfd, 0x19, 0xde, 0x7c, 0xbb, 0x5a, 0x59, 0x98, 0x18, 0x54, 0x84, 0xda, 0x8a, 0x91, 0x35, 0xd2, 0xa7, 0xd0, 0x34, 0x48, 0x1e, 0xb1, 0x51, 0x0b, 0xa5, 0x4b, 
0x08, 0x93, 0x4e, 0x2e, 0x7b, 0x97, 0xc0, 0xd0, 0x5a, 0xd8, 0xe3, 0x1c, 0x16, 0xe8, 0xef, 0x72, 0x70, 0x16, 0xeb, 0x09, 0x17, 0x8e, 0xe4, 0x6e, 0xd2, 0x16, 0xec, 0x16, 0x2e, 0x05, 0xbb, 0xed, 0x7c, 0x91, 0x73, 0x35, 0xef, 0xb6, 0xf6, 0x64, 0x81, 0x1a, 0x78, 0x18, 0x59, 0x98, 0x91, 0xb3, 0x9a, 0x6a, 0x73, 0x34, 0x6d, 0xfe, 0x49, 0x79, 0xe3, 0xe0, 0xae, 0xf7, 0x74, 0xe4, 0xd0, 0x5f, 0xae, 0xf3, 0x8c, 0x72, 0x8d, 0xf8, 0x14, 0xb0, 0x67, 0xf3, 0xbf, 0x45, 0x1b, 0x7f, 0xdf, 0xb2, 0xbf, 0xe0, 0x0b, 0xdd, 0x70, 0xd9, 0xe7, 0xbb, 0xc3, 0xd2, 0xc4, 0xab, 0xfa, 0xa8, 0x21, 0xab, 0x33, 0x92, 0x61, 0x4a, 0x4e, 0x27, 0x4c, 0x7d, 0x0e, 0xd5, 0x63, 0xdd, 0xd7, 0xcc, 0x40, 0x62, 0xe2, 0x8e, 0x6a, 0x49, 0xf0, 0xa7, 0xa5, 0xbc, 0x7f, 0xac, 0xb7, 0xc6, 0xc5, 0x1d, 0x4c, 0xaa, 0x7f, 0xb6, 0x78, 0xc2, 0x48, 0x88, 0xf5, 0x77, 0x63, 0x1e, 0x14, 0xa6, 0xd1, 0x9a, 0x88, 0x34, 0xd9, 0x0f, 0xb9, 0x9a, 0xc7, 0xf8, 0x52, 0x7b, 0x88, 0x92, 0x05, 0x0c, 0x36, 0xcf, 0xce, 0xc0, 0x05, 0x6b, 0xe0, 0x9f, 0x02, 0x68, 0x48, 0x8a, 0x01, 0xdf, 0x48, 0xbf, 0x87, 0xd9, 0xed, 0x66, 0x0d, 0xce, 0x24, 0x9b, 0x32, 0x9a, 0x5d, 0xe8, 0xe1, 0x37, 0x6c, 0x47, 0x58, 0x85, 0x8b, 0x43, 0x98, 0x9d, 0x8a, 0x09, 0xd2, 0xe3, 0x0b, 0xfb, 0xcf, 0xc4, 0x75, 0xc6, 0x1e, 0x6a, 0xb8, 0x77, 0x6a, 0xa4, 0x49, 0x1d, 0x94, 0x51, 0x83, 0x5b, 0x25, 0xf7, 0xab, 0x7f, 0xb4, 0x2b, 0x3f, 0xa1, 0x7b, 0xba, 0x20, 0x73, 0xe8, 0x0e, 0xb0, 0x72, 0xa3, 0x8e, 0x53, 0x54, 0xaa, 0xca, 0xfd, 0x40, 0x07, 0xc4, 0x7f, 0xfc, 0x74, 0xcf, 0xae, 0xc1, 0xf6, 0xf2, 0xfd, 0xee, 0xfe, 0xbb, 0x99, 0x30, 0x11, 0xa1, 0x35, 0x97, 0xa2, 0xe0, 0xbd, 0x46, 0x27, 0xac, 0x80, 0x3a, 0xfb, 0xe9, 0x6c, 0xc0, 0xec, 0x1e, 0xf8, 0x74, 0x2c, 0xb7, 0x3f, 0xef, 0xcd, 0x06, 0x81, 0xf1, 0x0a, 0x7b, 0xa5, 0xc3, 0x6b, 0x64, 0xe8, 0x14, 0x6a, 0x0c, 0xf2, 0x9b, 0x14, 0xc9, 0x5e, 0x7f, 0xec, 0xf1, 0xe4, 0xab, 0xa9, 0xc2, 0x44, 0xbf, 0x06, 0x0b, 0xe1, 0xd4, 0x3b, 0x53, 0x50, 0x6b, 0xeb, 0xf7, 0xc6, 0x0d, 0x8f, 0xf1, 0xde, 0xf2, 0xf8, 0x7b, 0x88, 0x7d, 0x0b, 0x98, 0x6c, 0xf2, 0x32, 
0x14, 0x99, 0xcc, 0xcf, 0x88, 0xd8, 0x32, 0x76, 0x1e, 0xdb, 0x13, 0x13, 0x30, 0x09, 0x1f, 0x1c, 0xba, 0xa9, 0x81, 0xcf, 0x30, 0xb5, 0x8c, 0xcc, 0xf3, 0xd6, 0x6b, 0xae, 0xc0, 0x61, 0x2d, 0x87, 0xc1, 0xac, 0xca, 0x85, 0x9b, 0xe8, 0x9c, 0x02, 0x49, 0xa1, 0xde, 0x1b, 0xf1, 0xbf, 0x34, 0xcd, 0x75, 0x1e, 0x28, 0x2b, 0x90, 0x73, 0xfe, 0x22, 0xc7, 0xba, 0xd2, 0x9c, 0x04, 0xe5, 0x3c, 0x8d, 0x39, 0x98, 0x6a, 0x1e, 0x6e, 0x9f, 0xd8, 0x4c, 0x96, 0x2f, 0x25, 0x81, 0xd6, 0x70, 0xef, 0x5b, 0x05, 0xce, 0x10, 0xe8, 0x8b, 0x41, 0xe0, 0xb3, 0xbb, 0x07, 0xc1, 0x55, 0x28, 0xfd, 0x03, 0x6d, 0xb5, 0xee, 0x19, 0xbf, 0xf9, 0xdc, 0x0e, 0xcc, 0x7e, 0x86, 0x93, 0x5d, 0xde, 0xd3, 0xfb, 0xe5, 0xaf, 0x55, 0xa8, 0x5b, 0x94, 0x7d, 0x76, 0xb3, 0x6c, 0x78, 0x56, 0xe8, 0x00, 0x9d, 0x1c, 0xd1, 0xaf, 0x17, 0x0c, 0x8e, 0x29, 0x82, 0x68, 0xfd, 0x72, 0xb2, 0x87, 0x0d, 0x7f, 0x02, 0x7f, 0x9e, 0xa9, 0x21, 0xc5, 0x31, 0xaa, 0x4c, 0xcc, 0x6a, 0xac, 0x1e, 0x96, 0x7d, 0x90, 0xa6, 0x94, 0xbe, 0x1c, 0xe4, 0x89, 0x59, 0xce, 0xe7, 0xe2, 0x9b, 0x75, 0xca, 0xef, 0xed, 0xcf, 0xbb, 0xbe, 0xb6, 0xbe, 0xac, 0x53, 0x8d, 0x67, 0x8b, 0xf4, 0x53, 0x02, 0x5a, 0xe7, 0xb2, 0xf9, 0xde, 0x20, 0x2e, 0x81, 0x71, 0xa6, 0x0a, 0x74, 0x81, 0x94, 0xe0, 0x73, 0x60, 0x81, 0x62, 0xfe, 0x70, 0x8c, 0xc6, 0xb5, 0x8c, 0x0f, 0x1c, 0x29, 0x71, 0x95, 0xab, 0xc5, 0x0a, 0x7b, 0x71, 0x3d, 0x11, 0xef, 0xfa, 0xfd, 0x98, 0x07, 0x12, 0x89, 0xed, 0x60, 0x25, 0xa0, 0x37, 0x62, 0xea, 0xac, 0x13, 0xd1, 0xae, 0xfa, 0x50, 0x19, 0x08, 0x01, 0x88, 0x68, 0x60, 0x14, 0xb6, 0x46, 0x26, 0xa3, 0x4b, 0xfa, 0x4e, 0xa6, 0xc4, 0xc2, 0x44, 0x58, 0x9b, 0xfe, 0xbd, 0x2c, 0xc1, 0x60, 0xbf, 0x9f, 0x0c, 0x17, 0xa3, 0xb6, 0x45, 0x73, 0xa4, 0x01, 0x63, 0x93, 0xfd, 0x53, 0x4a, 0x72, 0x79, 0x16, 0xe5, 0xca, 0x24, 0x45, 0xb3, 0x46, 0xbb, 0xcd, 0xee, 0x76, 0xc0, 0xe5, 0x03, 0x52, 0x98, 0xf4, 0xb8, 0x6b, 0x0e, 0xbf, 0xe5, 0x3d, 0x78, 0xad, 0xcc, 0x70, 0x21, 0x63, 0x5a, 0x07, 0x9e, 0x2e, 0x7b, 0xe8, 0x79, 0xd5, 0xad, 0xd1, 0x65, 0x6d, 0x66, 0x63, 0x45, 0x7f, 0xf3, 0xc1, 0xef, 0x5e, 
0x8e, 0xe1, 0x37, 0xbf, 0x27, 0x0d, 0xc5, 0x4e, 0xd1, 0xd0, 0xc0, 0x46, 0xb5, 0x10, 0xf9, 0xb1, 0x02, 0x53, 0x8d, 0xd6, 0xfe, 0xd8, 0x71, 0x83, 0x04, 0x50, 0xe2, 0x20, 0x2e, 0x44, 0x94, 0x47, 0xa6, 0x04, 0x6f, 0x49, 0xe2, 0x20, 0x40, 0xb8, 0x96, 0x6d, 0x7e, 0x70, 0x9f, 0xed, 0x34, 0x73, 0x14, 0xa1, 0x00, 0xf9, 0x06, 0x0d, 0x76, 0xdc, 0x83, 0x15, 0x84, 0x7a, 0xb4, 0xa7, 0x01, 0xed, 0x7b, 0x25, 0x95, 0x2f, 0x54, 0x07, 0x3b, 0x15, 0x9d, 0x5d, 0xb1, 0x40, 0xdc, 0xa6, 0x34, 0x35, 0x98, 0x93, 0xa7, 0x0a, 0x78, 0x33, 0x3f, 0x7c, 0x92, 0x2d, 0x59, 0x80, 0x78, 0x5f, 0x2e, 0x50, 0xa0, 0xcf, 0xf8, 0xe2, 0xc6, 0x96, 0xd2, 0x11, 0xe0, 0x36, 0x43, 0x2b, 0x82, 0xd8, 0x6c, 0x30, 0xa1, 0x2e, 0xf2, 0x95, 0x7e, 0xc3, 0xc9, 0x0d, 0x75, 0x41, 0x4e, 0x8b, 0x74, 0x4f, 0x9b, 0xdd, 0x69, 0x74, 0x7e, 0x32, 0xf1, 0x6b, 0x63, 0x39, 0x5e, 0x08, 0x5c, 0x04, 0xe7, 0x19, 0xeb, 0x5c, 0x58, 0x76, 0x10, 0xf6, 0x67, 0x98, 0x71, 0x2c, 0x3a, 0x79, 0x02, 0xe7, 0x1a, 0x06, 0xbd, 0x3e, 0xdf, 0xc8, 0xa0, 0xd0, 0x5f, 0x00, 0x9b, 0xb5, 0x2d, 0x1a, 0xf1, 0xd4, 0x63, 0x6e, 0x60, 0x5c, 0x4c, 0xbd, 0x9e, 0x37, 0x5e, 0x09, 0x41, 0x9f, 0x70, 0x75, 0x6c, 0xaf, 0x83, 0x20, 0x5c, 0x1f, 0x20, 0x1a, 0x81, 0xee, 0xc4, 0xd5, 0x9f, 0xfb, 0x33, 0x2d, 0xba, 0x53, 0xe4, 0xf7, 0x58, 0x29, 0x14, 0xf4, 0x90, 0xde, 0xbb, 0xc3, 0x42, 0x55, 0xa6, 0x0c, 0xa6, 0x78, 0x56, 0xf6, 0x9c, 0xba, 0x4b, 0xac, 0xbe, 0x55, 0x2c, 0xf2, 0xf9, 0x57, 0x31, 0xb2, 0x41, 0x25, 0x14, 0xcb, 0x5f, 0x07, 0x1a, 0xa2, 0x90, 0x03, 0x84, 0xdd, 0xf2, 0x73, 0x28, 0xa1, 0x5b, 0x0b, 0xc1, 0x4b, 0x47, 0x01, 0x11, 0xee, 0x36, 0x1e, 0x09, 0x97, 0x74, 0xc4, 0xfc, 0x94, 0x09, 0xb7, 0x5a, 0x8d, 0x7d, 0x13, 0x4f, 0xf9, 0x57, 0x6c, 0x11, 0xa8, 0xbc, 0x89, 0x47, 0x1f, 0x35, 0x45, 0xd7, 0xcd, 0x75, 0x1f, 0x3e, 0x77, 0x97, 0xbd, 0x5f, 0x60, 0xde, 0x73, 0xdf, 0xd3, 0x21, 0x53, 0x19, 0x86, 0x8a, 0xac, 0x79, 0x68, 0xf1, 0xda, 0x4b, 0x40, 0xc8, 0x1e, 0x98, 0xbd, 0x4d, 0x4d, 0xf6, 0x89, 0x40, 0xf6, 0xf6, 0xef, 0xf2, 0x5e, 0xa6, 0xf0, 0x72, 0x0e, 0x3d, 0xef, 0x63, 0x83, 0x51, 
0xc0, 0x69, 0x38, 0x40, 0xec, 0x14, 0x33, 0x0e, 0xe3, 0x8e, 0x38, 0x19, 0x2f, 0xe1, 0x62, 0x01, 0xba, 0xea, 0x92, 0x31, 0x01, 0x25, 0x80, 0x7e, 0xa4, 0xf4, 0xba, 0xd7, 0xc3, 0x8d, 0xca, 0xa4, 0x66, 0x49, 0x35, 0xbe, 0x6c, 0x1d, 0x62, 0x12, 0x60, 0xfd, 0xe2, 0xe5, 0x94, 0xe3, 0x60, 0x3e, 0xb4, 0xaa, 0x57, 0x24, 0x5f, 0x57, 0x70, 0x18, 0xc5, 0x98, 0xf1, 0x1c, 0xd3, 0x81, 0xbf, 0x80, 0x1a, 0xcc, 0x90, 0xb1, 0x1e, 0x40, 0x2f, 0x7d, 0x0f, 0xeb, 0x6d, 0x35, 0x70, 0xdd, 0x72, 0x21, 0x63, 0x31, 0xe6, 0xb8, 0x8c, 0xfa, 0x42, 0xfb, 0x00, 0x4b, 0xd6, 0xc3, 0xdd, 0x00, 0x37, 0x9d, 0x08, 0x0f, 0xa4, 0xbc, 0xf9, 0x32, 0x69, 0x85, 0xc6, 0xc0, 0xbb, 0x94, 0x2c, 0x1f, 0x51, 0x0f, 0x1b, 0xaf, 0x22, 0x2d, 0x4d, 0xa7, 0xa2, 0x30, 0x02, 0xf3, 0x9e, 0x18, 0x3f, 0x00, 0x16, 0x04, 0x2d, 0xec, 0x0d, 0xd1, 0x5b, 0xcc, 0xe3, 0x7b, 0xd6, 0x72, 0xf5, 0x59, 0x7c, 0x86, 0xf5, 0xc6, 0xf4, 0x0e, 0xa6, 0x80, 0xdb, 0xdd, 0x35, 0x31, 0x65, 0xcf, 0x56, 0xe7, 0x94, 0x4d, 0xa6, 0x57, 0x56, 0x4d, 0xe9, 0x0c, 0xeb, 0xbc, 0x55, 0xbc, 0x7a, 0xa7, 0x26, 0x26, 0x7b, 0xe2, 0x57, 0xb9, 0xa2, 0x8a, 0x3c, 0xc7, 0x64, 0x09, 0x25, 0xe1, 0x2d, 0x76, 0x20, 0xb8, 0x70, 0xdf, 0xca, 0xb8, 0x79, 0x9c, 0xab, 0x7e, 0xad, 0x74, 0xf2, 0x59, 0x9d, 0xcb, 0xd9, 0xd2, 0x83, 0xac, 0x3b, 0x3f, 0xf3, 0xd7, 0x35, 0x43, 0x17, 0xef, 0x5d, 0xcc, 0x64, 0x8b, 0x28, 0xb9, 0xfe, 0xd1, 0x98, 0xd4, 0x4d, 0xa8, 0xab, 0x89, 0x43, 0xa8, 0x00, 0x5e, 0x9d, 0x48, 0x06, 0x55, 0x70, 0x0d, 0x59, 0xe6, 0xd3, 0x84, 0xe0, 0x55, 0x9f, 0x29, 0x19, 0xad, 0x2f, 0x58, 0x02, 0xb8, 0x84, 0xac, 0x7e, 0x09, 0xfe, 0x14, 0x50, 0xe9, 0x1e, 0x5e, 0x53, 0x58, 0x23, 0xcb, 0x3a, 0x06, 0x0d, 0x98, 0xf9, 0x0a, 0x91, 0xf7, 0x66, 0xfd, 0x4c, 0xf1, 0xea, 0x09, 0x73, 0xe9, 0xff, 0x7a, 0x35, 0x36, 0xb9, 0x12, 0x39, 0x4a, 0x3e, 0x32, 0x08, 0x12, 0x7c, 0x56, 0xe6, 0x70, 0x01, 0x8d, 0x02, 0x44, 0x58, 0xda, 0x6c, 0x4f, 0x3f, 0xa7, 0x9d, 0x17, 0x6c, 0xdd, 0x9e, 0x54, 0xcb, 0x49, 0x67, 0xa3, 0x6b, 0x10, 0x5e, 0xcf, 0x2a, 0x00, 0x14, 0x2b, 0xbc, 0xff, 0x0e, 0xee, 0x37, 0xdb, 0xfd, 
0xdf, 0xa1, 0xa9, 0x8d, 0x0f, 0x10, 0x96, 0x7b, 0xe5, 0x18, 0xf9, 0x46, 0xd0, 0x51, 0xa5, 0x2a, 0x1e, 0xd6, 0xdf, 0xf3, 0x46, 0x99, 0x34, 0xbb, 0x5b, 0x7e, 0xba, 0x70, 0xb6, 0x72, 0x54, 0x84, 0x0e, 0xe4, 0xb4, 0x79, 0x18, 0xd4, 0x87, 0x0d, 0x5a, 0x36, 0x8c, 0x2a, 0x97, 0xa9, 0xbc, 0x87, 0x33, 0xd7, 0x06, 0x3a, 0x62, 0x22, 0xeb, 0xeb, 0x59, 0xde, 0x5f, 0x32, 0x5f, 0x86, 0x91, 0x4b, 0x87, 0xc2, 0xfc, 0x7a, 0xf3, 0x0a, 0x0a, 0x60, 0x4e, 0x7a, 0xea, 0x0c, 0xe0, 0xc5, 0xac, 0x5f, 0x63, 0x50, 0xfa, 0x2a, 0x41, 0x8b, 0x08, 0x23, 0x13, 0x83, 0x5c, 0xd3, 0x76, 0x92, 0xf0, 0xaf, 0xe2, 0x83, 0x0b, 0xe5, 0x9b, 0x00, 0x21, 0xb3, 0x3f, 0x7b, 0xa5, 0x07, 0xa4, 0x33, 0x39, 0x03, 0x6b, 0xc4, 0xd7, 0x32, 0x83, 0xc6, 0xc6, 0xa7, 0xe8, 0x41, 0x90, 0xac, 0x53, 0x12, 0x07, 0x94, 0xb0, 0xae, 0x33, 0x8b, 0x7c, 0x34, 0x1b, 0x41, 0x26, 0x67, 0x2a, 0x3b, 0xf4, 0xa7, 0x75, 0xb4, 0xec, 0x3d, 0x6f, 0x26, 0x4a, 0xce, 0x89, 0x77, 0xfc, 0x89, 0xc4, 0x45, 0x5a, 0x41, 0xeb, 0x5b, 0x27, 0x18, 0xad, 0x60, 0x07, 0x19, 0x5b, 0xd8, 0x47, 0xd8, 0x50, 0xe9, 0x60, 0xb9, 0x80, 0x0e, 0xff, 0x54, 0xa8, 0xe1, 0x07, 0x9b, 0x65, 0x3d, 0x3b, 0x83, 0x53, 0xe9, 0x83, 0xcf, 0x4f, 0xd3, 0xdb, 0x90, 0x47, 0x7b, 0xda, 0x70, 0x6b, 0xcd, 0x5b, 0x0f, 0x17, 0x36, 0xcb, 0xb5, 0x2c, 0x43, 0xbe, 0xd4, 0xb2, 0xf3, 0xed, 0x2c, 0x97, 0xbc, 0x2a, 0xcc, 0xf1, 0xb6, 0xdc, 0x0d, 0x7e, 0xdf, 0xab, 0x99, 0xe7, 0x6f, 0xbd, 0xd8, 0x1d, 0x71, 0xe2, 0x8b, 0xeb, 0x70, 0x2a, 0x56, 0xd2, 0xdf, 0x67, 0xcc, 0xc0, 0x3b, 0x92, 0x3b, 0x5c, 0x1d, 0x4b, 0x03, 0x1e, 0xcc, 0xf5, 0x0f, 0x6b, 0x0b, 0xad, 0x9e, 0x5d, 0x35, 0x87, 0x4a, 0xf7, 0x13, 0x84, 0x45, 0x5a, 0xed, 0xef, 0xe8, 0x80, 0xcd, 0x8e, 0xad, 0x90, 0x3a, 0x89, 0xb5, 0x41, 0xae, 0x08, 0xf9, 0x9a, 0xbd, 0x41, 0x7b, 0x55, 0x5f, 0x3d, 0xfb, 0x8d, 0xca, 0xe0, 0x9a, 0xc5, 0xb8, 0xb2, 0xe9, 0xe1, 0x83, 0x60, 0x8c, 0x53, 0x08, 0x9c, 0x58, 0x6b, 0x94, 0xf3, 0xa7, 0x46, 0x28, 0xc6, 0x70, 0x22, 0xbf, 0x5d, 0xe7, 0x00, 0x0d, 0x39, 0x9a, 0xd0, 0xf3, 0xd4, 0x08, 0xff, 0xdb, 0x3f, 0x50, 0xb0, 0xbd, 0x7f, 
0x41, 0x8d, 0x4d, 0xcb, 0x2e, 0x1b, 0xc1, 0x77, 0x86, 0x2f, 0x41, 0xc9, 0xdc, 0xdd, 0xd3, 0xd7, 0x81, 0x74, 0x5d, 0x6e, 0xa9, 0xf4, 0x81, 0x1d, 0x8b, 0x39, 0xf4, 0xfa, 0x14, 0x64, 0x21, 0x82, 0xe5, 0x54, 0xd5, 0x49, 0xaf, 0xc7, 0x88, 0xea, 0x04, 0x05, 0x00, 0x56, 0xa7, 0xb5, 0x39, 0x69, 0x0d, 0x76, 0xe2, 0x1e, 0x63, 0x0b, 0xf0, 0x6e, 0xc6, 0x9e, 0x31, 0x3d, 0xab, 0x81, 0x35, 0x24, 0x04, 0xf5, 0x10, 0xcb, 0x08, 0x9c, 0x8c, 0x8b, 0x93, 0xa5, 0xf2, 0x8d, 0x23, 0x1d, 0xf2, 0x31, 0xc8, 0x5b, 0xa1, 0x37, 0x74, 0x64, 0x90, 0xfd, 0x5e, 0xfa, 0xd3, 0xf3, 0xe7, 0x60, 0x90, 0xde, 0xb8, 0x4b, 0x3c, 0x40, 0xd0, 0x21, 0x68, 0xae, 0xb9, 0x67, 0xd6, 0x3a, 0x2a, 0x75, 0x01, 0xb0, 0x75, 0x93, 0x0b, 0x6d, 0xb0, 0x08, 0xd5, 0xa5, 0x5c, 0x63, 0x24, 0x56, 0x15, 0x90, 0x2f, 0x72, 0xea, 0x65, 0x38, 0x3c, 0x3e, 0xe9, 0x7f, 0xb9, 0x29, 0x37, 0xb7, 0x61, 0xa5, 0x3f, 0x04, 0x0e, 0x43, 0xe8, 0x43, 0xaf, 0x61, 0x7c, 0x55, 0x40, 0x68, 0x8c, 0x33, 0x04, 0x16, 0x1b, 0xb6, 0x28, 0xde, 0xfe, 0x53, 0x30, 0xa9, 0xdd, 0x43, 0xd0, 0xfc, 0x91, 0x28, 0x91, 0x9c, 0x4c, 0xd6, 0x16, 0x27, 0x95, 0x27, 0xc2, 0x2e, 0x7a, 0xb2, 0x48, 0xed, 0x0a, 0xce, 0xeb, 0x0b, 0x82, 0xfd, 0x8b, 0x50, 0x28, 0x53, 0xf9, 0xa9, 0x65, 0x3c, 0xab, 0xb4, 0xc1, 0x4a, 0x57, 0xe6, 0xa3, 0x52, 0x94, 0xb8, 0x6a, 0xb9, 0x86, 0xfe, 0x34, 0xe1, 0x0b, 0xed, 0x41, 0xaf, 0x1a, 0x1e, 0x0f, 0xa1, 0x35, 0x0a, 0x84, 0xa2, 0x7e, 0xe0, 0xff, 0x3a, 0xec, 0x1c, 0xbf, 0x70, 0xc5, 0xb0, 0x50, 0x97, 0x8b, 0xdd, 0xcb, 0x2c, 0xe0, 0xe1, 0x6a, 0x25, 0xc7, 0x13, 0xeb, 0xe3, 0x2f, 0xc7, 0xdc, 0x7b, 0x48, 0xfa, 0x28, 0x59, 0x88, 0xeb, 0x4e, 0xba, 0x9b, 0x9c, 0x68, 0xee, 0x06, 0x93, 0x53, 0xbb, 0x93, 0x0e, 0xf1, 0x39, 0xfb, 0x47, 0x30, 0x5d, 0x15, 0x5f, 0x54, 0x2d, 0xc0, 0x22, 0x01, 0x4e, 0xe7, 0xd5, 0x64, 0x38, 0xe5, 0x22, 0xd7, 0x9f, 0xba, 0x5b, 0x79, 0xe0, 0xec, 0xd0, 0xe6, 0x94, 0xb0, 0xd6, 0xf3, 0x0d, 0x4e, 0x28, 0x35, 0x01, 0x56, 0xf2, 0x07, 0x19, 0x45, 0xc1, 0x2d, 0x67, 0x63, 0xbd, 0xd2, 0x42, 0x46, 0x13, 0x02, 0x3c, 0x65, 0x68, 0x30, 0x8c, 0x5a, 0xfa, 
0xaa, 0xe9, 0x5c, 0x51, 0x03, 0x31, 0x37, 0xbd, 0xec, 0xc2, 0x86, 0x62, 0x8d, 0xbb, 0x55, 0x78, 0x2f, 0x2c, 0xd6, 0x1f, 0x09, 0xdd, 0x63, 0xe9, 0xc6, 0x4e, 0x5d, 0x44, 0x75, 0x3b, 0x0d, 0x6b, 0x0e, 0x12, 0xb8, 0x90, 0x79, 0x25, 0x23, 0x5c, 0xe6, 0xa7, 0xc1, 0xcc, 0x9c, 0x02, 0x9a, 0xf3, 0x5f, 0x90, 0x16, 0xb9, 0x7c, 0xbe, 0x21, 0x92, 0xf5, 0x68, 0xc9, 0x4b, 0x5a, 0xc7, 0x5f, 0x39, 0xd5, 0x79, 0xb7, 0x53, 0x06, 0x42, 0x95, 0x41, 0x9c, 0x9d, 0xd6, 0xf1, 0xc5, 0x6d, 0x63, 0xfd, 0xac, 0xf2, 0x13, 0xd3, 0x3b, 0x73, 0x29, 0x4c, 0x74, 0x55, 0x75, 0xd2, 0xb3, 0x8d, 0x94, 0x6e, 0xac, 0x9a, 0xef, 0xb9, 0x81, 0x38, 0x21, 0x76, 0x56, 0x9f, 0xb2, 0x83, 0xba, 0x8f, 0xad, 0xe3, 0x61, 0x8e, 0x55, 0xa7, 0xef, 0xb6, 0xfd, 0xfd, 0xe1, 0xc4, 0x0d, 0xd6, 0xb1, 0x79, 0x22, 0x13, 0x4b, 0x7a, 0xc8, 0xd6, 0x46, 0x17, 0xfc, 0x35, 0xdf, 0x38, 0x79, 0x94, 0xf8, 0x42, 0xe3, 0xb8, 0x8b, 0x32, 0x95, 0xaa, 0x52, 0xda, 0x73, 0x92, 0xb6, 0xd2, 0x06, 0x46, 0x04, 0xf2, 0x9d, 0xd9, 0xa6, 0x30, 0x00, 0x88, 0x3b, 0x74, 0x0c, 0x95, 0xe7, 0x86, 0x74, 0x07, 0x23, 0x80, 0x93, 0x01, 0xb7, 0x86, 0x45, 0xf8, 0x5a, 0x7c, 0x12, 0xda, 0x6e, 0x0d, 0x09, 0x1d, 0xdf, 0xb2, 0x18, 0xee, 0x0e, 0xec, 0x51, 0xc2, 0x13, 0xaa, 0x4c, 0x82, 0xae, 0xdd, 0xf8, 0x06, 0xcc, 0x59, 0x86, 0xa2, 0xcf, 0x8a, 0x32, 0x28, 0x81, 0xe8, 0x19, 0x85, 0xb5, 0xb9, 0x2f, 0x2e, 0x6d, 0x8d, 0x80, 0xf2, 0x32, 0x34, 0x11, 0x3e, 0xf2, 0xae, 0x82, 0xca, 0x8b, 0x52, 0x9a, 0xe0, 0x8e, 0xf5, 0x6d, 0x3c, 0x3a, 0x94, 0x41, 0xdf, 0x14, 0xe2, 0xd0, 0xfc, 0x0c, 0x28, 0x2e, 0xdd, 0x84, 0xec, 0x71, 0x8b, 0x66, 0x35, 0xb6, 0x61, 0xf9, 0x49, 0x20, 0x2c, 0xc5, 0xdc, 0xa9, 0x6f, 0x5c, 0xb4, 0xf9, 0x87, 0x78, 0x6d, 0x71, 0xe0, 0x29, 0x0b, 0xa5, 0x70, 0xad, 0xfe, 0x8b, 0x75, 0x7a, 0x0f, 0xc5, 0x47, 0x24, 0x2a, 0x5f, 0x76, 0xf6, 0x72, 0x9a, 0xc5, 0x14, 0x35, 0x75, 0x79, 0x38, 0x3a, 0x5f, 0x9e, 0x99, 0x92, 0x26, 0x16, 0xfc, 0xef, 0x1f, 0x15, 0x1b, 0x60, 0x79, 0x48, 0x25, 0x1b, 0x1a, 0xba, 0x26, 0xa9, 0xc4, 0x5c, 0x10, 0xd0, 0x78, 0xd8, 0x09, 0x39, 0x46, 0x7e, 0xd3, 
0xf3, 0x18, 0x23, 0xf5, 0x46, 0xb8, 0x35, 0x8f, 0xba, 0x5c, 0xdc, 0x3d, 0x21, 0xdc, 0x5b, 0xaa, 0xfb, 0xb2, 0xe6, 0xe6, 0x72, 0x8c, 0x41, 0xdf, 0xec, 0x7a, 0x82, 0x93, 0x99, 0x2d, 0xab, 0x16, 0xfd, 0x47, 0xc2, 0x8f, 0x3e, 0x3b, 0x80, 0x26, 0x0f, 0xe2, 0xbd, 0x78, 0x03, 0x3f, 0x36, 0x99, 0xa5, 0x0f, 0xe6, 0x48, 0x53, 0x49, 0xad, 0x06, 0x88, 0x17, 0x8c, 0x72, 0xdf, 0x25, 0x5e, 0x07, 0x9b, 0xbd, 0xdb, 0x54, 0xe5, 0x5a, 0xec, 0xce, 0x4c, 0x1d, 0xc4, 0x44, 0x05, 0x10, 0xe5, 0x29, 0x37, 0x05, 0xe1, 0x30, 0x56, 0xcb, 0xb9, 0x22, 0x48, 0x80, 0x3d, 0xd9, 0x48, 0x6d, 0xd7, 0x40, 0x00, 0xa6, 0x4c, 0x62, 0xa2, 0x1e, 0x7f, 0x33, 0xc2, 0x3f, 0x97, 0xee, 0x8c, 0x72, 0xb9, 0x3c, 0x39, 0x8d, 0x35, 0x7c, 0xef, 0x6f, 0x1d, 0x64, 0x1b, 0x16, 0x23, 0xfe, 0xe8, 0x10, 0xe1, 0x55, 0xe9, 0xd3, 0x3e, 0xbb, 0x83, 0x39, 0x9c, 0x69, 0xcb, 0x13, 0x9e, 0x32, 0x3b, 0x7f, 0x6a, 0x53, 0x7b, 0x32, 0xee, 0x0b, 0x10, 0xc9, 0x02, 0x86, 0x01, 0x5d, 0x77, 0x9c, 0xae, 0x9f, 0xfc, 0x6a, 0x8d, 0xf3, 0xf2, 0x91, 0x8f, 0xcb, 0x57, 0xa8, 0xc0, 0x25, 0x6e, 0x4b, 0x2d, 0xef, 0x8f, 0x42, 0x75, 0x8f, 0xb2, 0x57, 0xa9, 0x54, 0x45, 0x19, 0x12, 0xcd, 0x99, 0xfe, 0x37, 0x3e, 0xd4, 0x4d, 0x95, 0x89, 0x0c, 0xc0, 0xae, 0x5a, 0x9e, 0xce, 0x08, 0xcc, 0x2f, 0x5f, 0x48, 0x22, 0x8d, 0x55, 0x72, 0xf3, 0xea, 0x8d, 0x06, 0x80, 0x5e, 0xa7, 0xb7, 0xc9, 0x19, 0xe5, 0xef, 0x14, 0x12, 0x3d, 0xb6, 0x1b, 0x31, 0x2d, 0x3f, 0xb0, 0x7b, 0x5e, 0x7b, 0x5d, 0x99, 0x2c, 0xe9, 0xd9, 0xf5, 0xb3, 0x86, 0x68, 0x0a, 0xa1, 0x5e, 0xab, 0xb8, 0xec, 0x7b, 0x0c, 0x3a, 0x3e, 0xf8, 0xe0, 0xcc, 0xb6, 0x91, 0xdb, 0x5b, 0xf4, 0xe6, 0x72, 0xc0, 0xf9, 0x15, 0xd2, 0x51, 0x40, 0xc1, 0x54, 0xe4, 0x76, 0x17, 0x39, 0xd1, 0x8a, 0xeb, 0xf2, 0x6c, 0x94, 0xf0, 0xc2, 0x0d, 0x7f, 0x88, 0xc1, 0x89, 0x08, 0xdd, 0xc3, 0x80, 0x8d, 0xa5, 0xf4, 0x4f, 0x08, 0xd8, 0x5a, 0xdf, 0xf9, 0x1a, 0xa0, 0x58, 0x98, 0x1b, 0x3e, 0x80, 0xf2, 0x84, 0x7f, 0xf2, 0x51, 0x46, 0x3f, 0xd9, 0xbc, 0x2b, 0xe8, 0xc8, 0xcb, 0x44, 0x80, 0x7d, 0x0d, 0xeb, 0x52, 0xf3, 0xbb, 0xa8, 0xb7, 0xd7, 0xea, 0x9e, 
0xef, 0x25, 0xb3, 0x03, 0x03, 0x74, 0x7c, 0x0b, 0xfa, 0xa0, 0x28, 0xe5, 0x0c, 0x60, 0x9a, 0xb3, 0x79, 0xb1, 0x14, 0xc0, 0xd2, 0x7f, 0xf4, 0x0a, 0x4a, 0xdb, 0x57, 0x55, 0xeb, 0x7e, 0x26, 0x21, 0x6a, 0x39, 0x3a, 0x44, 0x04, 0x83, 0x01, 0xc3, 0xa2, 0x67, 0x6a, 0xa4, 0x8e, 0x9b, 0xd6, 0x1b, 0xa4, 0xdf, 0x3b, 0x9e, 0xca, 0x63, 0xd6, 0x04, 0xe6, 0x5d, 0xb6, 0x10, 0x0a, 0x12, 0x07, 0xaf, 0x55, 0x74, 0xe4, 0x5b, 0x29, 0xbc, 0xcb, 0x40, 0xec])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext

def test_kravatte_WBC_k_16_msg_3400_tweak_16(self, test_workers):
my_key = bytes([0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a])
my_tweak = bytes([0x0f, 0x00, 0xf1, 0xe2, 0xd3, 0xc4, 0xb5, 0xa6, 0x97, 0x88, 0x79, 0x6a, 0x5b, 0x4c, 0x3d, 0x2e])
my_message = bytes([0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 
0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 
0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 
0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 
0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 
0x07, 0x06, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 
0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 
0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 
0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 
0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 
0x85, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e])
real_ciphertext = bytes([0x38, 0x01, 0xeb, 0xd8, 0xd4, 0x6d, 0x23, 0xd8, 0x8f, 0x98, 0x3a, 0x0e, 0xc3, 0x9c, 0xb4, 0x79, 0x9a, 0x80, 0x5c, 0x28, 0x89, 0x5a, 0x1d, 0x4b, 0xfb, 0xbb, 0x4b, 0x53, 0x94, 0xc6, 0x13, 0x46, 0x92, 0x05, 0x11, 0x80, 0xf5, 0x74, 0x59, 0xb1, 0x01, 0x43, 0xe8, 0x04, 0x47, 0x47, 0x24, 0x25, 0x46, 0xc5, 0xa3, 0x68, 0xc2, 0x76, 0x55, 0x26, 0xe7, 0x2f, 0xac, 0x56, 0x44, 0x44, 0x66, 0xb9, 0x78, 0x9a, 0x0f, 0x0b, 0xc6, 0x9a, 0xe2, 0x2d, 0xa1, 0x4a, 0x4a, 0x87, 0x59, 0x83, 0x9d, 0xc6, 0x2a, 0x9f, 0x0d, 0xa8, 0xfa, 0x46, 0x20, 0x80, 0x57, 0x11, 0x2e, 0xda, 0x16, 0x76, 0x6f, 0x24, 0x7b, 0xfc, 0x17, 0x8a, 0x5e, 0xc5, 0x61, 0x30, 0x95, 0x2b, 0x95, 0x5d, 0x6a, 0x40, 0x25, 0xf4, 0x90, 0xde, 0x7f, 0x2f, 0xbc, 0xed, 0x4d, 0xb5, 0xbf, 0xb4, 0xf7, 0xa5, 0x2b, 0x15, 0x7e, 0xbf, 0xe2, 0x02, 0x6c, 0x56, 0xb8, 0xf8, 0xef, 0xf5, 0x56, 0xc4, 0xdd, 0x57, 0xfe, 0x3a, 0x0e, 0xb2, 0x7f, 0xa1, 0xd0, 0x9a, 0x4a, 0xa4, 0x4d, 0xce, 0x15, 0xc1, 0x6f, 0x37, 0x0e, 0xb0, 0xcf, 0x1f, 0xba, 0x6c, 0x57, 0xd4, 0xca, 0x94, 0x0f, 0x30, 0x6c, 0xdf, 0x22, 0x74, 0xb8, 0x8d, 0xba, 0xb1, 0x00, 0x04, 0x85, 0xe1, 0xb4, 0x34, 0x36, 0xa9, 0xa8, 0xbc, 0xb5, 0x5d, 0x25, 0x6c, 0x20, 0x48, 0x31, 0x37, 0x1b, 0x75, 0x7c, 0x2a, 0x3a, 0x4c, 0xc7, 0xe1, 0xc5, 0x2e, 0x77, 0xdc, 0x26, 0x33, 0x4a, 0xc4, 0x90, 0xa8, 0xfe, 0xc8, 0xe8, 0x39, 0x8e, 0xf7, 0x2f, 0xea, 0x58, 0xa2, 0x44, 0x5d, 0x43, 0xe8, 0x21, 0xa2, 0x12, 0xae, 0xf6, 0xce, 0x9f, 0xa7, 0x83, 0x32, 0x8c, 0xf3, 0x86, 0xb8, 0x96, 0x96, 0x4b, 0xbd, 0x7d, 0x89, 0x6c, 0xe2, 0x5b, 0x07, 0x20, 0x02, 0x28, 0x49, 0x58, 0xfb, 0xf7, 0x5e, 0xae, 0xb1, 0xc0, 0xc3, 0x17, 0x71, 0xe1, 0xbc, 0xdc, 0xdf, 0xc9, 0xb5, 0x6b, 0x52, 0xd8, 0x42, 0xe0, 0x19, 0x14, 0x69, 0x77, 0x77, 0xa0, 0x2b, 0x3b, 0xd6, 0xd1, 0xae, 0x94, 0xeb, 0x3f, 0xb5, 0x4f, 0x7f, 0xe9, 0x5f, 0xd1, 0x0b, 0x77, 0xb8, 0x89, 0x5c, 0x03, 0x4d, 0xa5, 0x96, 0xf1, 0x10, 0xd5, 0x79, 0xd3, 0xba, 0x67, 0x6e, 0x25, 0x94, 0x34, 0xbd, 0x85, 0xe0, 0x66, 0x83, 0x29, 0xc3, 0x44, 0x91, 0x13, 0xe0, 0x7e, 0x4c, 0x67, 
0xb6, 0x5e, 0x15, 0xb2, 0x63, 0x20, 0xaf, 0xee, 0xfe, 0x76, 0x47, 0x36, 0x01, 0xf7, 0xc0, 0xa7, 0x39, 0x89, 0xc8, 0x93, 0xf2, 0xbb, 0x49, 0x87, 0xab, 0x99, 0xbc, 0xe1, 0xcf, 0x71, 0x96, 0xfa, 0xcc, 0x39, 0xf7, 0x32, 0xe5, 0x58, 0x64, 0x71, 0x14, 0x8f, 0x92, 0x0c, 0xfd, 0x31, 0xef, 0x60, 0xdc, 0x38, 0xbd, 0x89, 0x4e, 0xa5, 0x81, 0xdc, 0x4d, 0x96, 0x0d, 0x66, 0xf9, 0x9d, 0x8c, 0xcc, 0x55, 0x26, 0xa3, 0x20, 0x34, 0xe4, 0xbd, 0x3b, 0x69, 0xa8, 0x81, 0x86, 0x34, 0xd6, 0x98, 0x79, 0x79, 0x80, 0xe8, 0x15, 0xae, 0x99, 0x6d, 0x6b, 0xc7, 0x07, 0x2a, 0x77, 0x71, 0x8d, 0xbd, 0x61, 0x6f, 0x40, 0x97, 0x46, 0x37, 0x08, 0xd9, 0xbf, 0x4a, 0x85, 0xf8, 0xb4, 0x97, 0x81, 0x65, 0x39, 0x8d, 0xbd, 0x7c, 0x3b, 0x59, 0x1a, 0xe0, 0xac, 0x65, 0x9d, 0x65, 0xf4, 0xdf, 0xd9, 0xb7, 0x57, 0xf2, 0x96, 0x2b, 0xaa, 0xac, 0xc3, 0x56, 0x6a, 0x84, 0xb0, 0xe8, 0xfb, 0xf4, 0x7d, 0x0d, 0x61, 0x1d, 0xb3, 0xc2, 0x95, 0x2e, 0x11, 0xcc, 0x58, 0xe0, 0x68, 0x52, 0xe9, 0x28, 0x60, 0x6c, 0x1a, 0x29, 0x84, 0xf4, 0x0c, 0x76, 0x64, 0x79, 0x2c, 0xcf, 0xb4, 0x95, 0xfc, 0x3f, 0x13, 0x75, 0x77, 0x6d, 0x86, 0xdf, 0x2e, 0x30, 0xa8, 0xc3, 0xeb, 0x3f, 0x0e, 0xf6, 0x6f, 0xec, 0xb9, 0x69, 0x53, 0xb7, 0x46, 0xf6, 0x61, 0xe8, 0xe2, 0x11, 0x36, 0x50, 0xec, 0x26, 0x21, 0x0f, 0xab, 0xde, 0xee, 0x02, 0xa9, 0xf6, 0xac, 0x9e, 0x96, 0xff, 0x13, 0x34, 0xf7, 0xc6, 0x97, 0xa1, 0x0a, 0x12, 0x9c, 0x5f, 0xf8, 0x64, 0x23, 0x74, 0x21, 0xec, 0x29, 0xc8, 0xc4, 0xd6, 0x08, 0x92, 0x83, 0xfd, 0x16, 0x40, 0x36, 0x3b, 0x3c, 0xea, 0x2d, 0x85, 0x5d, 0x2e, 0xb1, 0xce, 0x92, 0x91, 0x36, 0x9d, 0x9b, 0xac, 0x1a, 0xa0, 0x35, 0xfd, 0xe7, 0x94, 0xf7, 0xab, 0x91, 0xe2, 0xea, 0x56, 0x5e, 0x32, 0x36, 0x7f, 0x1a, 0x6d, 0x5d, 0x9f, 0x53, 0xa2, 0x9a, 0x91, 0x88, 0x73, 0xa7, 0xcd, 0x44, 0x6b, 0x6d, 0xd6, 0xde, 0xfa, 0x30, 0xa9, 0x7d, 0x16, 0xe4, 0xbe, 0xd6, 0xba, 0x46, 0x06, 0xe4, 0x18, 0xd3, 0xe7, 0xdb, 0x4a, 0x3b, 0x3e, 0xd4, 0xfe, 0xe2, 0x05, 0xcd, 0x5a, 0x19, 0xb7, 0x18, 0x05, 0x3f, 0x5c, 0xd9, 0x46, 0x34, 0xeb, 0x11, 0xa9, 0x04, 0xad, 0xdc, 0xab, 0x32, 0x0d, 
0x56, 0x6f, 0x6f, 0x5a, 0xbf, 0x5e, 0xd0, 0x26, 0x44, 0xa2, 0xe4, 0x00, 0x3e, 0xd2, 0x3c, 0xf1, 0xbc, 0x08, 0xd3, 0xe9, 0x07, 0x50, 0x6e, 0x24, 0xd9, 0xbb, 0xb3, 0x17, 0x79, 0xe4, 0x8a, 0x88, 0x82, 0xe6, 0x51, 0xdc, 0xf6, 0x3b, 0x91, 0x0e, 0x80, 0x5a, 0x05, 0xfd, 0xc5, 0x0f, 0xea, 0xd4, 0xc8, 0x39, 0x05, 0x33, 0xe2, 0x36, 0x91, 0x83, 0xc8, 0x87, 0x55, 0x5b, 0x97, 0xcb, 0xf4, 0x1c, 0x5a, 0xe6, 0x8c, 0x75, 0x84, 0xc2, 0xa5, 0x0c, 0x3e, 0xf1, 0xa2, 0x02, 0x3d, 0xf7, 0xff, 0x6b, 0x5d, 0xa9, 0x74, 0x9c, 0xae, 0x73, 0xbe, 0x06, 0x99, 0xc8, 0xcf, 0x8d, 0x11, 0x69, 0xc4, 0x7d, 0x99, 0x99, 0x17, 0x9b, 0x7e, 0x74, 0x46, 0x46, 0xe8, 0x61, 0x50, 0x87, 0x4d, 0x9d, 0x7a, 0x9c, 0xb0, 0x9b, 0x8f, 0x5a, 0x1c, 0x9f, 0xff, 0xbb, 0xf5, 0xaf, 0xc9, 0x75, 0x2f, 0xe3, 0x34, 0xb9, 0xe2, 0x11, 0x7c, 0x43, 0x13, 0xca, 0x02, 0xba, 0x88, 0x32, 0xfb, 0x48, 0x2e, 0x4a, 0x08, 0x7e, 0x67, 0xfb, 0x21, 0xe9, 0xd1, 0xdd, 0xf3, 0x3b, 0xbb, 0x59, 0x83, 0xda, 0x98, 0x02, 0x9d, 0x61, 0x67, 0xfd, 0x6b, 0x7f, 0x96, 0x6d, 0xd0, 0x81, 0x03, 0xab, 0xc1, 0x39, 0x70, 0xe2, 0xbf, 0x57, 0xe7, 0x78, 0x4c, 0xc2, 0x48, 0x7d, 0xcf, 0x9f, 0x74, 0xb1, 0x8b, 0x68, 0x19, 0x56, 0x5b, 0x44, 0x4a, 0xb4, 0x07, 0x37, 0x9a, 0x26, 0x5c, 0x5d, 0x1f, 0xfc, 0x00, 0xb5, 0x86, 0x08, 0xd0, 0x4e, 0xbd, 0xbe, 0x8b, 0x4a, 0x18, 0xe4, 0x9a, 0x17, 0x39, 0x2c, 0xa0, 0x30, 0x9a, 0x53, 0xb5, 0xf8, 0xfc, 0xd3, 0xd5, 0x0c, 0x2b, 0x4e, 0x7d, 0xb1, 0xfb, 0x1f, 0x52, 0x2f, 0x6d, 0x85, 0x6d, 0x5a, 0xe1, 0xdd, 0x47, 0x51, 0x3e, 0x02, 0x94, 0xfb, 0x19, 0xe3, 0x8e, 0xc7, 0x8c, 0x25, 0x36, 0xab, 0x21, 0x5c, 0x10, 0xdb, 0xcf, 0xd6, 0x61, 0x1b, 0x89, 0x94, 0x3f, 0x2f, 0x88, 0x6b, 0x16, 0x30, 0x2f, 0xd3, 0x2c, 0x6d, 0xbb, 0x0f, 0x4c, 0x48, 0x48, 0x66, 0x04, 0x5a, 0x05, 0x7c, 0x55, 0x2b, 0x5b, 0x1e, 0x3d, 0x9d, 0xc0, 0x52, 0xeb, 0xf3, 0x50, 0x22, 0xc6, 0xc4, 0x91, 0x9a, 0xf4, 0x90, 0x3a, 0x62, 0x80, 0xac, 0xcb, 0xab, 0xb8, 0x49, 0x35, 0x5d, 0xce, 0x24, 0xe4, 0x8a, 0xd3, 0x36, 0x62, 0x34, 0x11, 0x00, 0x76, 0x40, 0xb4, 0x72, 0x0a, 0x3e, 0x60, 0xa7, 0x66, 
0x9b, 0x84, 0xf0, 0xe5, 0x65, 0x63, 0x3a, 0x32, 0x34, 0x83, 0xb9, 0xbf, 0xa6, 0x56, 0x33, 0x88, 0x8f, 0x43, 0x62, 0x91, 0x3e, 0x08, 0x86, 0xe3, 0xf8, 0x20, 0xf1, 0x7c, 0x32, 0x90, 0xb8, 0xba, 0x36, 0x79, 0x4b, 0x76, 0x4c, 0x52, 0x1f, 0x5c, 0xda, 0xe8, 0x9d, 0x64, 0x06, 0x5c, 0x33, 0xec, 0x00, 0x60, 0xba, 0x9f, 0xdf, 0x2d, 0x19, 0x21, 0xcf, 0xdc, 0x8e, 0xd3, 0x00, 0x90, 0x00, 0x90, 0xdd, 0x5b, 0x30, 0x26, 0x96, 0xb9, 0x7f, 0x8d, 0x02, 0x76, 0x52, 0x40, 0x78, 0x24, 0x65, 0x7b, 0x4e, 0xcb, 0x9a, 0x19, 0x79, 0x15, 0x43, 0x2b, 0xc3, 0x4b, 0x36, 0x73, 0x27, 0xca, 0xaf, 0x55, 0xee, 0xee, 0x41, 0xb3, 0xbe, 0x69, 0x2a, 0x88, 0x03, 0x3e, 0xf1, 0xac, 0xc0, 0x9e, 0x41, 0x37, 0x1a, 0xd8, 0xaa, 0x32, 0x44, 0x42, 0x81, 0x4e, 0x67, 0x75, 0x50, 0xf6, 0x74, 0xb6, 0x4f, 0x67, 0x73, 0xe8, 0xae, 0x12, 0x4c, 0xfc, 0xb3, 0xbf, 0x5f, 0x96, 0x9b, 0xf1, 0x41, 0x31, 0x4a, 0x59, 0x3a, 0xa1, 0xf2, 0xd2, 0xf4, 0x82, 0xb1, 0xc8, 0x13, 0xc7, 0x29, 0xae, 0x28, 0xae, 0x6e, 0xdf, 0x8e, 0xf4, 0x1b, 0x9f, 0x07, 0x1e, 0x15, 0x65, 0x6e, 0x9c, 0xc8, 0xd6, 0xe8, 0x7a, 0xee, 0x1e, 0x7e, 0x92, 0xdb, 0xb5, 0x84, 0x1a, 0xbf, 0x3c, 0x3e, 0xf2, 0xf1, 0xe4, 0x59, 0x5a, 0xbf, 0x19, 0xc9, 0x14, 0xd8, 0xf4, 0x71, 0x1c, 0x88, 0x36, 0x95, 0x55, 0x8d, 0x92, 0x84, 0xc4, 0xec, 0x4c, 0x48, 0x28, 0x5c, 0x9c, 0xb5, 0xcb, 0x55, 0x6d, 0xdf, 0xa6, 0x60, 0xbd, 0x19, 0x44, 0x71, 0xff, 0x3d, 0x74, 0x0f, 0x19, 0xd1, 0x4a, 0x5f, 0x88, 0xe2, 0x49, 0xcc, 0xc2, 0x99, 0x94, 0xdc, 0xbc, 0x4a, 0x99, 0xa7, 0x58, 0x2b, 0xab, 0x29, 0xa1, 0x25, 0x16, 0xe7, 0x10, 0xb8, 0x45, 0x75, 0x6b, 0xd7, 0x5e, 0x73, 0x2d, 0xf6, 0x0f, 0xf8, 0x15, 0x5f, 0xfb, 0x10, 0x39, 0x05, 0xad, 0xfc, 0x45, 0x86, 0x68, 0x6b, 0x59, 0x4d, 0x26, 0xf1, 0x2b, 0xc8, 0xfe, 0xe4, 0xc0, 0x78, 0xef, 0xd7, 0x2d, 0xfd, 0x5a, 0xb9, 0x96, 0x69, 0x68, 0x50, 0x4d, 0x8c, 0x0c, 0xed, 0x25, 0x20, 0xc0, 0xd7, 0xce, 0x87, 0xdb, 0x1d, 0x1e, 0x68, 0xae, 0x9c, 0x07, 0x42, 0xbd, 0x01, 0x3e, 0x4f, 0x68, 0xce, 0x16, 0x0b, 0xc1, 0xb4, 0x5e, 0x1c, 0xda, 0xfa, 0x9e, 0xcd, 0x89, 0x8f, 0x9b, 0xf0, 
0x3c, 0x6a, 0xd1, 0x83, 0x34, 0x3f, 0x4f, 0xf1, 0xf7, 0x9a, 0xfa, 0x3c, 0x92, 0xf2, 0x71, 0xfc, 0xa1, 0x67, 0x96, 0x35, 0x35, 0x59, 0x31, 0xa8, 0x49, 0xa4, 0xb5, 0x3b, 0x22, 0x50, 0x4d, 0x3b, 0x82, 0xe1, 0x59, 0xdd, 0x9b, 0x8a, 0xbc, 0x3a, 0xeb, 0x56, 0x5c, 0xe5, 0x4b, 0x29, 0x19, 0x9a, 0xef, 0x88, 0x68, 0x30, 0xf1, 0xf6, 0xa5, 0x0e, 0xb9, 0xef, 0x6f, 0x53, 0x83, 0xa0, 0x73, 0xc3, 0x2d, 0xcb, 0xa8, 0x52, 0x27, 0xb2, 0xf4, 0x23, 0xf9, 0x6c, 0x9a, 0x5e, 0x04, 0x85, 0x44, 0x30, 0x92, 0xfc, 0x10, 0x84, 0x69, 0x8e, 0xfe, 0xa8, 0x93, 0xcb, 0x0e, 0x63, 0xbb, 0x86, 0xc1, 0x6e, 0x17, 0x9a, 0xbf, 0x22, 0xce, 0x01, 0xdb, 0x08, 0x77, 0x24, 0xfe, 0xa0, 0x60, 0x07, 0x5c, 0x48, 0x2d, 0x69, 0x40, 0x22, 0x28, 0x3b, 0xca, 0x67, 0x0e, 0xfb, 0x84, 0x55, 0x66, 0x01, 0x3d, 0xf7, 0x40, 0x8c, 0xa8, 0x06, 0x6b, 0x39, 0x97, 0x37, 0x9c, 0x90, 0x2e, 0x69, 0x10, 0xbe, 0xe8, 0xef, 0x69, 0x20, 0x75, 0x5b, 0xa6, 0x36, 0x9d, 0x1e, 0x79, 0xe2, 0xe1, 0x09, 0x32, 0x08, 0x4e, 0xd8, 0x72, 0x4e, 0x36, 0x74, 0x4d, 0x29, 0x34, 0xf2, 0xec, 0x4c, 0xbd, 0x26, 0x4e, 0x87, 0xaa, 0xaf, 0x9c, 0xb9, 0x14, 0xf5, 0xd1, 0xd6, 0xe4, 0x4b, 0x3f, 0xbc, 0x0f, 0x65, 0x09, 0x13, 0x74, 0x2d, 0x2b, 0xef, 0x35, 0xb0, 0x62, 0xb6, 0x99, 0x98, 0x06, 0x0f, 0x77, 0x54, 0x6c, 0x93, 0x1f, 0x3e, 0x7b, 0xb9, 0x8d, 0xec, 0x31, 0x7d, 0xc3, 0x89, 0x27, 0xaa, 0xe8, 0xad, 0xbe, 0xe3, 0x8b, 0xa2, 0x2a, 0xb9, 0xa7, 0x19, 0x88, 0xd1, 0x37, 0xee, 0x24, 0x47, 0xc4, 0xb3, 0x14, 0xbd, 0xa1, 0x53, 0xf2, 0xd8, 0xff, 0x8e, 0x40, 0xe2, 0x36, 0x36, 0x30, 0xaa, 0xab, 0x84, 0x2f, 0xaf, 0xd7, 0xc5, 0x3f, 0x85, 0xd2, 0x81, 0x57, 0x35, 0xa5, 0x90, 0xa5, 0xec, 0x45, 0xa1, 0x34, 0xec, 0xed, 0x64, 0xf4, 0xb5, 0xd6, 0x73, 0x2b, 0xb6, 0xa9, 0x32, 0xe6, 0x82, 0xf5, 0x20, 0xdd, 0x2c, 0xe3, 0x0c, 0x2e, 0x47, 0x70, 0x10, 0x99, 0x13, 0x10, 0x20, 0x8e, 0xba, 0x75, 0x75, 0x91, 0x04, 0x1c, 0x72, 0xe7, 0x81, 0xfa, 0x7b, 0xae, 0xd3, 0x4a, 0x1a, 0x39, 0x99, 0x57, 0xe4, 0xd2, 0x43, 0xfd, 0x21, 0x08, 0x49, 0xd4, 0xf7, 0xb9, 0x49, 0x21, 0x61, 0x1e, 0xd9, 0x18, 0x81, 0xb4, 
0x7c, 0x98, 0x41, 0x13, 0xea, 0x78, 0xcd, 0x20, 0xc5, 0xb3, 0x9e, 0x9f, 0xe4, 0x7d, 0x37, 0x1b, 0x77, 0x8a, 0x48, 0x1e, 0xaf, 0x30, 0x86, 0x2f, 0x3c, 0x07, 0x11, 0x21, 0xd7, 0x17, 0x99, 0x38, 0x90, 0xad, 0x4d, 0x2d, 0x4a, 0x25, 0x72, 0xd2, 0x3b, 0xe6, 0x86, 0x67, 0xf6, 0xf7, 0x60, 0x78, 0xfe, 0x94, 0xeb, 0xcb, 0x31, 0x10, 0x5c, 0xac, 0xee, 0x56, 0x95, 0xb2, 0x66, 0xb9, 0x11, 0x4b, 0x02, 0x7a, 0x1e, 0x49, 0x11, 0xb6, 0x87, 0xfd, 0x9c, 0x63, 0x13, 0x0f, 0xf9, 0x45, 0x14, 0x85, 0xc7, 0x3f, 0xfa, 0xc5, 0xab, 0xb6, 0x5e, 0xdc, 0x87, 0xd5, 0xad, 0xcc, 0x0a, 0x38, 0x44, 0x67, 0x60, 0xd9, 0xbf, 0x61, 0xc5, 0xd2, 0x92, 0xf4, 0xa5, 0x56, 0xfb, 0x85, 0xc7, 0x35, 0xdf, 0xfb, 0xad, 0x75, 0xed, 0xc0, 0x8f, 0x5e, 0x93, 0x56, 0x79, 0x82, 0x49, 0x4e, 0x10, 0x0c, 0x42, 0xb9, 0x6f, 0xec, 0x7d, 0xbd, 0x6d, 0x54, 0xab, 0x40, 0x45, 0x23, 0xf5, 0xfb, 0x30, 0xb0, 0x40, 0xc6, 0xd3, 0xc5, 0xc9, 0x4f, 0xef, 0xad, 0x1c, 0xd0, 0x58, 0xb7, 0x04, 0xcb, 0xbf, 0x5b, 0x26, 0xe3, 0x12, 0xe2, 0x4e, 0x22, 0xfe, 0xae, 0x05, 0xae, 0x8e, 0x2d, 0x86, 0xda, 0xd5, 0xfd, 0x36, 0xd0, 0x59, 0xba, 0xcf, 0x32, 0xa3, 0x7e, 0x04, 0x0f, 0x2d, 0xcc, 0x49, 0xcd, 0xe4, 0xa3, 0xa3, 0x5a, 0xfc, 0xf8, 0x68, 0x73, 0x1c, 0xe5, 0x7c, 0x6e, 0x8f, 0x4a, 0x1f, 0xf3, 0xe6, 0x53, 0x7b, 0x84, 0x43, 0x4e, 0x30, 0x6d, 0xfc, 0x59, 0x22, 0x6b, 0x3d, 0x76, 0x87, 0x6f, 0x68, 0xb6, 0x54, 0x1d, 0xd0, 0x82, 0x34, 0x66, 0xf1, 0x0d, 0x6f, 0xa3, 0x0c, 0x14, 0xb6, 0x89, 0x9d, 0xff, 0x4c, 0x43, 0x8b, 0xa6, 0xf8, 0x2b, 0x2e, 0x4d, 0x32, 0x31, 0x17, 0x96, 0xa8, 0xda, 0x7b, 0xf1, 0x8b, 0xf0, 0xd6, 0x89, 0xba, 0xaf, 0x8c, 0xef, 0xd8, 0x08, 0x98, 0x4d, 0x61, 0xb4, 0x20, 0x87, 0xeb, 0xbb, 0x40, 0x66, 0x58, 0x49, 0x65, 0xc4, 0x0a, 0x93, 0x37, 0x92, 0x27, 0x4b, 0x98, 0xf4, 0x89, 0xdd, 0x07, 0x48, 0xad, 0x15, 0x9d, 0xb1, 0xb4, 0xb7, 0xf3, 0x8a, 0xc1, 0xb6, 0x2c, 0x45, 0x0c, 0xeb, 0x60, 0xf5, 0x2a, 0xc0, 0x45, 0x18, 0xe7, 0x96, 0x8b, 0xc3, 0x31, 0xf5, 0x97, 0xce, 0x53, 0xcb, 0x35, 0xe1, 0x59, 0x0c, 0x73, 0xc2, 0xcc, 0x20, 0x12, 0x46, 0xf8, 0x53, 0x91, 
0x55, 0xc1, 0xff, 0xab, 0x5f, 0x4a, 0x96, 0xe7, 0xf9, 0x03, 0x97, 0x01, 0xbd, 0xaf, 0x70, 0xe1, 0xff, 0xc0, 0xc8, 0xde, 0x66, 0x94, 0x76, 0xe9, 0x26, 0x5f, 0xfc, 0xa5, 0x0e, 0x4f, 0xa4, 0xb9, 0x78, 0x7c, 0x8e, 0xad, 0xa2, 0x34, 0x43, 0x53, 0x3d, 0x81, 0x91, 0xf4, 0xe9, 0x06, 0x51, 0x73, 0x37, 0xed, 0x56, 0x19, 0xcf, 0xcf, 0xa9, 0xc1, 0xab, 0xf7, 0x94, 0x37, 0x64, 0xa2, 0x23, 0xc5, 0x95, 0x9d, 0xd5, 0x69, 0xf8, 0x20, 0x0e, 0x1f, 0x03, 0xb6, 0x5c, 0xb1, 0xd2, 0xe1, 0xa4, 0xbc, 0x98, 0xdc, 0x16, 0x34, 0x5f, 0x98, 0xd6, 0x7d, 0xaa, 0x87, 0xbc, 0xd7, 0xfb, 0x0c, 0xfa, 0x36, 0x91, 0xfa, 0x3e, 0xad, 0x1c, 0xbb, 0xc5, 0x35, 0x24, 0x07, 0x91, 0x79, 0x48, 0x5b, 0xe7, 0xd9, 0x96, 0xe1, 0x81, 0xdc, 0x8a, 0x9f, 0x17, 0xf4, 0x9c, 0x89, 0xee, 0xdf, 0x5e, 0x24, 0x12, 0xc0, 0xd1, 0x65, 0x9c, 0x16, 0x3e, 0x07, 0xc0, 0x4b, 0x80, 0x8b, 0x96, 0x19, 0xb2, 0x13, 0x95, 0xb0, 0x75, 0x9a, 0x0c, 0xe6, 0x14, 0x27, 0xc5, 0xfc, 0x02, 0x6e, 0xfa, 0x9e, 0xcc, 0x7a, 0x2b, 0x98, 0xf3, 0xfc, 0x3d, 0x5d, 0x53, 0xc1, 0x77, 0x14, 0x59, 0xf6, 0x88, 0x76, 0x26, 0xa8, 0x44, 0x72, 0x18, 0x24, 0xff, 0x63, 0x5c, 0x72, 0x53, 0x52, 0x01, 0xfe, 0xaa, 0x7f, 0x5e, 0xf9, 0xf6, 0xfe, 0x4a, 0x9a, 0x9d, 0xf6, 0x9d, 0x46, 0xc8, 0xa9, 0x7e, 0xae, 0x86, 0x49, 0x2f, 0x46, 0x1e, 0x3c, 0x08, 0x87, 0x59, 0xe9, 0xa1, 0x6a, 0x0c, 0x79, 0xc6, 0xbe, 0xc8, 0x32, 0xde, 0x0d, 0xc2, 0xc7, 0xd8, 0x2f, 0xbf, 0x4e, 0x96, 0x7d, 0x09, 0x87, 0x35, 0xcc, 0xd8, 0x6f, 0x98, 0xe8, 0xfc, 0xd6, 0x70, 0xe0, 0x20, 0x08, 0x6f, 0xb5, 0x91, 0x16, 0x6f, 0x85, 0x53, 0xa4, 0x9c, 0x83, 0x99, 0x16, 0x81, 0xdf, 0xdd, 0x32, 0x57, 0x97, 0x69, 0x09, 0xa7, 0x0b, 0xad, 0x6f, 0x9b, 0xb0, 0x35, 0x5a, 0xe2, 0xe6, 0xc2, 0xf4, 0xc9, 0xef, 0x84, 0xee, 0x9a, 0x35, 0xbe, 0x40, 0x55, 0x3d, 0x31, 0x75, 0xfb, 0xec, 0x21, 0xe6, 0x13, 0x25, 0xd2, 0xaf, 0x9d, 0xce, 0xe1, 0x28, 0xe2, 0x8a, 0xcb, 0x83, 0xbc, 0x00, 0xe5, 0x65, 0x2d, 0x96, 0x59, 0x12, 0x93, 0xd7, 0x58, 0xeb, 0xda, 0x2c, 0xad, 0x62, 0x99, 0xb6, 0x82, 0x7a, 0x46, 0x94, 0x4c, 0x73, 0x24, 0x82, 0x61, 0xf0, 0xec, 
0x32, 0x87, 0x82, 0xb1, 0xe2, 0x0f, 0x7a, 0x81, 0x1a, 0x84, 0xef, 0xb1, 0xfe, 0x62, 0x1b, 0x61, 0x39, 0x5f, 0xad, 0x19, 0xd1, 0x9d, 0x0b, 0xed, 0x67, 0x15, 0x3c, 0xa0, 0x8c, 0x7a, 0x28, 0x57, 0xe3, 0xdc, 0xac, 0xb0, 0x2d, 0x66, 0x58, 0xce, 0xc6, 0xec, 0x7e, 0x66, 0xfc, 0xcf, 0x12, 0x8f, 0x33, 0x80, 0xa6, 0xfb, 0xc9, 0xb3, 0xfe, 0xd9, 0x62, 0x78, 0x94, 0x33, 0x74, 0xc6, 0xcd, 0x58, 0xfa, 0x39, 0xed, 0x6e, 0x61, 0x3c, 0x70, 0x32, 0xe5, 0x1b, 0x6f, 0x70, 0x79, 0x62, 0x41, 0x6b, 0x07, 0x99, 0xae, 0xab, 0xd8, 0x7a, 0x4c, 0x6c, 0x14, 0xf7, 0x53, 0x92, 0xd6, 0x77, 0xbb, 0x92, 0xec, 0xa2, 0xfc, 0x92, 0x12, 0x05, 0xce, 0x59, 0x82, 0xff, 0xd0, 0x3a, 0x88, 0x00, 0x34, 0x75, 0xec, 0x56, 0x4c, 0x34, 0x19, 0x53, 0x17, 0xc7, 0x82, 0x4e, 0x22, 0x9f, 0x10, 0x04, 0x92, 0x80, 0xf7, 0x02, 0xd9, 0x9a, 0x6b, 0x1b, 0xa8, 0xc8, 0xb8, 0x8b, 0x07, 0xfe, 0xa2, 0xf8, 0x74, 0x85, 0x99, 0x53, 0x05, 0x0d, 0xdf, 0x7e, 0x28, 0xbe, 0x23, 0x66, 0x6e, 0x81, 0x06, 0xb0, 0x6f, 0x95, 0x80, 0x2b, 0x6d, 0xab, 0xad, 0x52, 0xde, 0x85, 0xee, 0x88, 0x4c, 0x37, 0x27, 0x04, 0x81, 0xaf, 0xc9, 0x91, 0x5a, 0xa0, 0x97, 0xfd, 0x2b, 0x64, 0x7b, 0x00, 0x5d, 0xd6, 0x91, 0xb3, 0xa9, 0x8e, 0xcf, 0xa0, 0xdd, 0x7e, 0xe1, 0x70, 0x4f, 0xf9, 0x2c, 0x73, 0xb1, 0xe5, 0xb9, 0x5d, 0x99, 0x87, 0xd3, 0xcf, 0x7a, 0x71, 0x91, 0xa2, 0x3c, 0xce, 0x7a, 0xb0, 0xc4, 0x4a, 0xa5, 0x04, 0x84, 0x96, 0x0c, 0x4e, 0x33, 0x60, 0x0d, 0xc8, 0xce, 0x7e, 0x8f, 0xba, 0xdb, 0x5e, 0x79, 0xd8, 0xc5, 0xdc, 0x34, 0xa2, 0x00, 0x41, 0x3f, 0x8d, 0x46, 0xb6, 0x90, 0x45, 0x83, 0x51, 0x76, 0x06, 0xab, 0x64, 0x30, 0x8e, 0x82, 0xf2, 0x32, 0x7a, 0x7a, 0x9b, 0x7c, 0x19, 0x2e, 0xe1, 0xeb, 0xc9, 0xb4, 0x50, 0x1f, 0xae, 0x85, 0x11, 0x71, 0x22, 0xec, 0x67, 0x57, 0x31, 0x45, 0x0c, 0x09, 0x6c, 0x3f, 0xb2, 0x6a, 0xab, 0x87, 0xb9, 0xe5, 0x65, 0xdd, 0xab, 0x9f, 0x31, 0x02, 0x99, 0x14, 0xed, 0x9f, 0x86, 0x7b, 0xe2, 0xdc, 0xc1, 0xf7, 0x5e, 0x0e, 0xf2, 0x18, 0x52, 0x91, 0xe6, 0x8b, 0x19, 0xda, 0x15, 0x50, 0x7d, 0xeb, 0x20, 0x69, 0xaf, 0x6c, 0x6b, 0x2f, 0x13, 0xe4, 0x50, 0xf4, 
0x2c, 0x43, 0x4d, 0x22, 0x4b, 0xc5, 0x87, 0x0d, 0x12, 0xe0, 0x52, 0x4f, 0x9b, 0x22, 0x21, 0x41, 0x47, 0x18, 0x2a, 0x2b, 0xca, 0x6b, 0xe7, 0xd7, 0xb3, 0x2d, 0x45, 0x29, 0xff, 0x01, 0xc6, 0x1b, 0x19, 0xec, 0xf3, 0xda, 0x61, 0x78, 0x1e, 0x17, 0xfa, 0x4f, 0xd9, 0xff, 0x27, 0x22, 0xf5, 0x82, 0x46, 0x14, 0x13, 0x87, 0x79, 0x09, 0x32, 0xae, 0xfc, 0x34, 0x71, 0x9d, 0x14, 0x73, 0x89, 0xe7, 0x47, 0x93, 0xda, 0x36, 0xad, 0xb9, 0x7d, 0xe0, 0x39, 0x6c, 0xc2, 0xfa, 0xd8, 0xa7, 0xbd, 0x1e, 0x7b, 0xbf, 0xb8, 0xa2, 0x48, 0x4a, 0x61, 0xf4, 0x61, 0x79, 0x81, 0x92, 0x7b, 0xe3, 0x22, 0xbf, 0xda, 0x2d, 0x25, 0x37, 0xbe, 0x29, 0xab, 0x61, 0x20, 0x21, 0x69, 0x65, 0x6f, 0x3f, 0x00, 0x01, 0x89, 0x78, 0x21, 0xf9, 0x69, 0x3f, 0x37, 0x51, 0xd2, 0xef, 0x81, 0x52, 0x35, 0x35, 0x94, 0x6f, 0x70, 0x2b, 0x1f, 0xbe, 0x13, 0x8f, 0x25, 0x26, 0xf5, 0xb6, 0xb1, 0x69, 0xbc, 0x91, 0xc7, 0xb7, 0xce, 0x48, 0x78, 0xc3, 0x55, 0xb3, 0x6f, 0xab, 0x04, 0xcc, 0x7d, 0xe9, 0x09, 0x67, 0x25, 0x12, 0x7f, 0xaf, 0xe5, 0xbd, 0xe4, 0xc4, 0xa2, 0x3f, 0x23, 0xd7, 0x78, 0x89, 0xd9, 0x30, 0xcc, 0x54, 0x3d, 0xc1, 0xa6, 0x81, 0x89, 0x5c, 0x1f, 0x0c, 0xbc, 0xd4, 0x8d, 0xe1, 0xb8, 0x6a, 0x62, 0x5e, 0x86, 0xf9, 0xad, 0x11, 0x17, 0x22, 0xa8, 0xdc, 0x0f, 0xa2, 0x3e, 0x2b, 0x68, 0x7a, 0x9b, 0xe9, 0xa0, 0xf6, 0x3f, 0x7b, 0xfe, 0x1a, 0xf5, 0x2b, 0x96, 0x88, 0xe4, 0x9d, 0x8e, 0x9e, 0xa4, 0x97, 0xa9, 0x94, 0x09, 0xf2, 0x94, 0x3c, 0xc9, 0xea, 0x34, 0x8f, 0xde, 0x8d, 0xcf, 0xb8, 0x53, 0x99, 0x12, 0x55, 0x68, 0xed, 0xba, 0x26, 0x95, 0xd8, 0x37, 0x40, 0xef, 0x02, 0xe3, 0x8b, 0x11, 0x52, 0x24, 0xa7, 0xb7, 0xea, 0x72, 0xd8, 0x70, 0x47, 0x83, 0xb2, 0x53, 0x83, 0x71, 0x0c, 0x8d, 0x1a, 0xd1, 0x3c, 0x2b, 0x75, 0x7a, 0xea, 0x2a, 0x74, 0x74, 0xb6, 0x13, 0xa0, 0x56, 0x5a, 0xeb, 0xbe, 0x0a, 0x89, 0xe6, 0xc8, 0x7e, 0x98, 0x76, 0x5a, 0x44, 0x0f, 0x52, 0x9d, 0x74, 0x6d, 0xa2, 0xf1, 0x06, 0x2e, 0x24, 0x2f, 0xcd, 0x34, 0x0e, 0x22, 0xe8, 0xd6, 0x2e, 0xfa, 0xd3, 0x67, 0x41, 0x1a, 0xa9, 0x0f, 0xf1, 0x9c, 0xfe, 0xdf, 0xc0, 0xc9, 0x4f, 0x76, 0xc1, 0x72, 0xab, 
0x04, 0x64, 0x47, 0x48, 0x88, 0xf5, 0xa5, 0x6f, 0x6e, 0x33, 0x13, 0x90, 0xa8, 0x29, 0x4d, 0x29, 0x49, 0x3d, 0x16, 0x79, 0x1f, 0x88, 0x27, 0x0a, 0xf3, 0x25, 0xfa, 0x7b, 0x0c, 0xb8, 0x1b, 0xf8, 0x98, 0x4f, 0xd4, 0xe5, 0xa8, 0x10, 0xe7, 0x98, 0xf1, 0x0e, 0xd0, 0x85, 0xa9, 0x0e, 0xd8, 0xf8, 0x66, 0xee, 0x34, 0xd6, 0xa0, 0x13, 0x55, 0xcd, 0x92, 0xb4, 0x12, 0xc6, 0x97, 0xdc, 0xf2, 0xbf, 0x6e, 0x1f, 0x00, 0xa8, 0x0a, 0x6f, 0xe9, 0x00, 0xdc, 0xda, 0xd6, 0x95, 0xa6, 0x97, 0xdd, 0x9d, 0x59, 0x4f, 0x40, 0x82, 0x24, 0x47, 0x99, 0x49, 0x9c, 0xd3, 0xd3, 0x8b, 0x40, 0xd4, 0xb9, 0x31, 0xa2, 0xf7, 0x2a, 0x48, 0xbf, 0x38, 0x93, 0x90, 0x41, 0x8a, 0x45, 0x3c, 0xd5, 0x2f, 0xbc, 0x67, 0x53, 0xaa, 0x0f, 0xa0, 0xa3, 0xe9, 0x37, 0x6f, 0x2e, 0xa4, 0x27, 0x01, 0x9c, 0x40, 0x6a, 0x26, 0xc2, 0x84, 0x30, 0x4d, 0x8f, 0x09, 0x98, 0xd8, 0x04, 0x87, 0x9f, 0xa3, 0x5f, 0x31, 0x6e, 0x15, 0x3d, 0xec, 0xc6, 0x88, 0x2d, 0x7d, 0xfc, 0x9c, 0x11, 0x7f, 0x6e, 0xf2, 0x3c, 0xbf, 0xde, 0x3a, 0xd1, 0xeb, 0x8f, 0x01, 0x3a, 0xb5, 0x1f, 0x26, 0x71, 0x18, 0xf4, 0x13, 0x4d, 0x84, 0xe0, 0x1c, 0x26, 0xcf, 0x6e, 0xb3, 0xad, 0x64, 0xcb, 0x1d, 0xb1, 0xec, 0x5f, 0xa0, 0x7f, 0x4b, 0xcd, 0x79, 0x65, 0x7d, 0x37, 0xa6, 0x2c, 0x9b, 0x99, 0xff, 0xe5, 0xce, 0x26, 0xda, 0x8c, 0x5b, 0xfb, 0x0e, 0xbf, 0x21, 0xd7, 0xea, 0x53, 0xc2, 0x2f, 0x86, 0x70, 0x60, 0x0e, 0xae, 0xdc, 0xb1, 0x05, 0xb7, 0xe0, 0x7b, 0xcd, 0x38, 0x73, 0x73, 0x89, 0x4f, 0xc9, 0x37, 0x94, 0x20, 0x79, 0x3e, 0x6d, 0xf1, 0xab, 0x36, 0x01, 0x2c, 0x99, 0xa2, 0x55, 0x6b, 0x73, 0x92, 0x42, 0x82, 0x9e, 0x07, 0xd3, 0xfa, 0xd0, 0xd1, 0x7c, 0x1d, 0x05, 0x19, 0xf4, 0x3d, 0xb5, 0x60, 0x7e, 0xc5, 0x39, 0xb3, 0xb0, 0x51, 0x1b, 0x16, 0x8d, 0xa1, 0x55, 0xaf, 0x64, 0x01, 0x2c, 0xef, 0x01, 0x92, 0x86, 0x8e, 0xa1, 0x73, 0xe9, 0x3c, 0x8c, 0x65, 0xfd, 0x6e, 0x81, 0xb3, 0xc7, 0xaf, 0xbc, 0xa1, 0x4e, 0xa1, 0xde, 0x83, 0x7a, 0xe8, 0xad, 0xa6, 0x33, 0x35, 0xa8, 0x78, 0xd1, 0x2d, 0x04, 0x37, 0x92, 0x39, 0xf6, 0x72, 0x8c, 0x4b, 0x3d, 0x8f, 0x48, 0x7a, 0x94, 0xd5, 0x5f, 0x6d, 0x0f, 0x52, 0x7d, 
0x87, 0xe6, 0x34, 0x47, 0x5e, 0x47, 0x39, 0xda, 0x6e, 0xbe, 0x6f, 0x91, 0x74, 0x61, 0x72, 0xcf, 0x03, 0xb4, 0x02, 0xcc, 0x6a, 0xca, 0x12, 0xa8, 0x20, 0x31, 0x9c, 0x2a, 0x23, 0xbc, 0x7b, 0x61, 0x68, 0xc6, 0xd9, 0x7c, 0xf8, 0xa9, 0xf7, 0xea, 0xf7, 0xf5, 0xef, 0x0c, 0x08, 0x58, 0xb4, 0x3f, 0xe6, 0x04, 0x37, 0xa5, 0x13, 0x74, 0xc1, 0x9d, 0xf8, 0x55, 0x73, 0xcd, 0x54, 0x09, 0x82, 0x08, 0x39, 0x6f, 0xaf, 0xb3, 0xde, 0xad, 0x70, 0xa1, 0x60, 0x53])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext

def test_kravatte_WBC_k_16_msg_3401_tweak_16(self, test_workers):
my_key = bytes([0xb0, 0xad, 0xaa, 0xa7, 0xa4, 0xa1, 0x9e, 0x9b, 0x98, 0x95, 0x92, 0x8f, 0x8c, 0x89, 0x86, 0x83])
my_tweak = bytes([0x16, 0xf7, 0xd8, 0xb9, 0x9a, 0x7b, 0x5c, 0x3d, 0x1d, 0xfe, 0xdf, 0xc0, 0xa1, 0x82, 0x63, 0x44])
my_message = bytes([0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9d, 0x9a, 0x97, 0x94, 0x91, 0x8e, 0x8b, 0x88, 0x85, 0x82, 0x7f, 0x7c, 0x79, 0x76, 0x73, 0x70, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9d, 0x9a, 0x97, 0x94, 0x91, 0x8e, 0x8b, 0x88, 0x85, 0x82, 0x7f, 0x7c, 0x79, 0x76, 0x73, 0x70, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 
0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9d, 0x9a, 0x97, 0x94, 0x91, 0x8e, 0x8b, 0x88, 0x85, 0x82, 0x7f, 0x7c, 0x79, 0x76, 0x73, 0x70, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 
0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9d, 0x9a, 0x97, 0x94, 0x91, 0x8e, 0x8b, 0x88, 0x85, 0x82, 0x7f, 0x7c, 0x79, 0x76, 0x73, 0x70, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 
0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9d, 0x9a, 0x97, 0x94, 0x91, 0x8e, 0x8b, 0x88, 0x85, 0x82, 0x7f, 0x7c, 0x79, 0x76, 0x73, 0x70, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9d, 
0x9a, 0x97, 0x94, 0x91, 0x8e, 0x8b, 0x88, 0x85, 0x82, 0x7f, 0x7c, 0x79, 0x76, 0x73, 0x70, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9d, 0x9a, 0x97, 0x94, 0x91, 0x8e, 0x8b, 0x88, 0x85, 0x82, 0x7f, 0x7c, 0x79, 0x76, 0x73, 0x70, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 
0xb2, 0xaf, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9d, 0x9a, 0x97, 0x94, 0x91, 0x8e, 0x8b, 0x88, 0x85, 0x82, 0x7f, 0x7c, 0x79, 0x76, 0x73, 0x70, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 
0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9d, 0x9a, 0x97, 0x94, 0x91, 0x8e, 0x8b, 0x88, 0x85, 0x82, 0x7f, 0x7c, 0x79, 0x76, 0x73, 0x70, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 
0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9d, 0x9a, 0x97, 0x94, 0x91, 0x8e, 0x8b, 0x88, 0x85, 0x82, 0x7f, 0x7c, 0x79, 0x76, 0x73, 0x70, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9d, 0x9a, 0x97, 0x94, 0x91, 0x8e, 0x8b, 0x88, 0x85, 0x82, 0x7f, 0x7c, 0x79, 0x76, 0x73, 0x70, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 
0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9d, 0x9a, 0x97, 0x94, 0x91, 0x8e, 0x8b, 0x88, 0x85, 0x82, 0x7f, 0x7c, 0x79, 0x76, 0x73, 0x70, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 
0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9d, 0x9a, 0x97, 0x94, 0x91, 0x8e, 0x8b, 0x88, 0x85, 0x82, 0x7f, 0x7c, 0x79, 0x76, 0x73, 0x70, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xea, 0xe7, 0xe4, 0xe1, 0xde, 0xdb, 0xd8, 0xd5, 0xd2, 0xcf, 0xcc, 0xc9, 0xc6, 0xc3, 0xc0, 0xbd, 0xba, 0xb7, 0xb4, 0xb1, 0xae, 0xab, 0xa8, 0xa5, 0xa2, 0x9f, 0x9c, 0x99, 0x96, 0x93, 0x90, 0x8d, 0x8a, 0x87, 0x84, 0x81, 0x7e, 0x7b, 0x78, 0x75, 0x72, 0x6f, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54, 0x51, 0x4e, 0x4b, 0x48, 0x45, 0x42, 0x3f, 0x3c, 0x39, 0x36, 0x33, 0x30, 
0x2d, 0x2d, 0x2a, 0x27, 0x24, 0x21, 0x1e, 0x1b, 0x18, 0x15, 0x12, 0x0f, 0x0c, 0x09, 0x06, 0x03, 0x00, 0xfd, 0xfa, 0xf7, 0xf4, 0xf1, 0xee, 0xeb, 0xe8, 0xe5, 0xe2, 0xdf, 0xdc, 0xd9, 0xd6, 0xd3, 0xd0, 0xcd, 0xca, 0xc7, 0xc4, 0xc1, 0xbe, 0xbb, 0xb8, 0xb5, 0xb2, 0xaf, 0xac, 0xa9, 0xa6, 0xa3, 0xa0, 0x9d, 0x9a, 0x97, 0x94, 0x91, 0x8e, 0x8b, 0x88, 0x85, 0x82, 0x7f, 0x7c, 0x79, 0x76, 0x73, 0x70, 0x6c, 0x69, 0x66, 0x63, 0x60, 0x5d, 0x5a, 0x57, 0x54])
real_ciphertext = bytes([0xc4, 0x60, 0xa2, 0x83, 0x5d, 0x65, 0x9c, 0x14, 0x9b, 0x14, 0x95, 0x2b, 0xe1, 0x6f, 0x2f, 0xa0, 0x23, 0xef, 0x5e, 0xf8, 0x5f, 0xf8, 0x9f, 0xe4, 0x65, 0x9f, 0xa8, 0xce, 0x1c, 0x1c, 0xe0, 0x70, 0xbf, 0xe2, 0x12, 0xba, 0xb8, 0x20, 0x3d, 0x93, 0xce, 0x5f, 0x75, 0x44, 0x79, 0x9e, 0xb0, 0x11, 0x9e, 0x2b, 0x1c, 0x88, 0x2d, 0x2b, 0x35, 0xc3, 0x88, 0xd0, 0xb8, 0x55, 0xbb, 0xe9, 0xa6, 0x04, 0xf6, 0x8b, 0x2f, 0x7e, 0x2d, 0x66, 0xb9, 0xdc, 0xf5, 0x9b, 0x7d, 0x36, 0xdf, 0xa2, 0x2a, 0xc1, 0x6b, 0x40, 0xc0, 0xda, 0x2a, 0x8a, 0x30, 0x6a, 0x28, 0x7d, 0x56, 0xcf, 0x2e, 0xd1, 0x00, 0xe9, 0x71, 0x3a, 0x2d, 0xbe, 0x1d, 0xf7, 0x55, 0xc8, 0xaf, 0xf0, 0xdf, 0x2c, 0x34, 0x5a, 0xc1, 0xdd, 0x78, 0xff, 0x0a, 0xec, 0x7d, 0xb6, 0x1f, 0x59, 0xb0, 0x3a, 0x44, 0x37, 0x10, 0xaa, 0x53, 0x8c, 0xdc, 0x28, 0xfe, 0xb1, 0xd2, 0x18, 0x69, 0x82, 0x3e, 0x7f, 0x28, 0xaf, 0x07, 0xe6, 0x27, 0xfb, 0x15, 0x1f, 0x84, 0xc8, 0xaf, 0xcd, 0x99, 0xe7, 0x97, 0xec, 0x0c, 0xd6, 0xbb, 0x14, 0x1a, 0xe2, 0xad, 0x16, 0x76, 0xcc, 0x0e, 0x71, 0x6d, 0xb6, 0xe1, 0xe5, 0x39, 0xac, 0xa9, 0x06, 0x23, 0x6c, 0x9a, 0x61, 0x4f, 0x17, 0xa7, 0x80, 0xee, 0x9d, 0x68, 0x76, 0x81, 0x02, 0x66, 0x43, 0x99, 0x82, 0xb6, 0xf3, 0x80, 0xaf, 0x3a, 0x4d, 0xf2, 0xc7, 0x7d, 0x3d, 0x88, 0x04, 0x8e, 0x91, 0x90, 0x3d, 0xb2, 0xa2, 0x88, 0x40, 0xdc, 0x1d, 0xc2, 0x2f, 0x9a, 0x15, 0x97, 0xe1, 0x0f, 0xf3, 0x61, 0x44, 0xfa, 0x4c, 0x4c, 0xe6, 0x0a, 0x50, 0xa6, 0xa5, 0xc5, 0xe2, 0x56, 0x40, 0xf4, 0x30, 0xd0, 0xed, 0xb3, 0x88, 0x9e, 0xc7, 0xff, 0x17, 0xfb, 0x29, 0x03, 0x0a, 0x10, 0x79, 0x24, 0x63, 0x62, 0xfd, 0xa0, 0x7c, 0x96, 0x90, 0x22, 0xa3, 0x49, 0xc6, 0xce, 0x84, 0x01, 0x06, 0x72, 0xd0, 0x13, 0x74, 0x17, 0xce, 0x13, 0xc9, 0x71, 0x23, 0x63, 0x68, 0x91, 0x6e, 0x36, 0x09, 0x19, 0xc1, 0x62, 0xa3, 0xc2, 0xb0, 0x58, 0x1e, 0xff, 0xdb, 0x78, 0xba, 0x77, 0x33, 0xf6, 0xbc, 0xe5, 0xde, 0xff, 0x83, 0xf4, 0xa7, 0x7d, 0xd6, 0xe8, 0x5c, 0x91, 0xcc, 0xdf, 0x4f, 0x02, 0x75, 0x30, 0x3f, 0x90, 0x40, 0xee, 0x85, 0xc6, 0x65, 0xab, 0x43, 0xe4, 0x1c, 0x88, 
0x12, 0xf1, 0xdb, 0x0a, 0xdd, 0x41, 0x2b, 0xa9, 0xb1, 0x63, 0xd0, 0x99, 0x44, 0xbe, 0x83, 0xcc, 0x1e, 0x3f, 0x9b, 0x11, 0x1f, 0x27, 0x79, 0xcc, 0x8a, 0xb0, 0x01, 0x22, 0x6a, 0x02, 0x2c, 0x46, 0x7d, 0x78, 0xcd, 0x37, 0xa4, 0x70, 0x98, 0x4d, 0x05, 0x04, 0x6e, 0x8f, 0x08, 0xe7, 0x77, 0xfb, 0xdd, 0x2b, 0x2e, 0x69, 0x82, 0xbc, 0x8a, 0x20, 0x20, 0xe0, 0x3f, 0x42, 0x64, 0xe7, 0xaa, 0x66, 0x34, 0x45, 0x41, 0xdf, 0x69, 0xca, 0x35, 0x01, 0xe8, 0xac, 0x6f, 0x63, 0x8b, 0x64, 0x7d, 0x38, 0x86, 0x5b, 0x00, 0x80, 0x9a, 0x5f, 0x6f, 0x23, 0x27, 0x67, 0x28, 0x14, 0x98, 0x4c, 0xce, 0x3d, 0xac, 0xe6, 0x00, 0xea, 0x4f, 0x32, 0x9f, 0x1c, 0x24, 0x96, 0x36, 0xb9, 0x05, 0xc4, 0x73, 0xeb, 0xd0, 0xf1, 0x5e, 0x1a, 0xfe, 0x11, 0x05, 0x60, 0x56, 0xea, 0x9d, 0x1f, 0x8d, 0xf1, 0xd2, 0x33, 0x25, 0xb4, 0x8b, 0xa3, 0x78, 0xc5, 0x4a, 0xb5, 0x0a, 0x0b, 0xc9, 0xee, 0x4c, 0x1a, 0x70, 0x18, 0x52, 0x83, 0xde, 0x52, 0xa0, 0x63, 0x34, 0x7c, 0x6e, 0x6c, 0x30, 0xe3, 0x1c, 0xee, 0x19, 0xee, 0x07, 0xa4, 0xd6, 0x29, 0xff, 0x75, 0xae, 0x0f, 0x86, 0x4b, 0x78, 0x9a, 0x5d, 0xbb, 0x80, 0x33, 0xe3, 0xd1, 0xad, 0x5d, 0x42, 0xcc, 0x51, 0x06, 0x54, 0x66, 0x13, 0x13, 0x95, 0x8f, 0x92, 0xdf, 0xbb, 0x8b, 0xcb, 0x79, 0x40, 0x72, 0x20, 0x12, 0x8a, 0x81, 0x77, 0x91, 0x3f, 0xe2, 0x41, 0x87, 0xd3, 0x66, 0xbb, 0x0e, 0x2e, 0x67, 0x1b, 0x97, 0xb1, 0x0f, 0x46, 0x0a, 0x5c, 0x5f, 0x9a, 0xf1, 0x11, 0x95, 0xbc, 0xa6, 0xe0, 0x74, 0x3a, 0x39, 0xdf, 0xec, 0xaa, 0x02, 0x2c, 0x43, 0x90, 0xf3, 0x20, 0xb4, 0xd5, 0x5b, 0x53, 0x78, 0x38, 0xe5, 0xf4, 0x08, 0xd6, 0x66, 0xd2, 0xd7, 0x20, 0x75, 0xf0, 0x09, 0xf9, 0xb4, 0xc2, 0x41, 0x18, 0xcc, 0xa2, 0x48, 0xa3, 0xc3, 0xfc, 0x2c, 0xd9, 0x6c, 0x02, 0x4c, 0x51, 0xd6, 0x3c, 0xc5, 0x5c, 0x43, 0x4c, 0xef, 0x93, 0x85, 0x9d, 0xb3, 0x47, 0xd1, 0x9d, 0x20, 0xa2, 0x1d, 0x78, 0xaf, 0x47, 0x36, 0x2a, 0x6b, 0xd0, 0x36, 0x48, 0x0b, 0x3c, 0xb5, 0xd3, 0xaa, 0x5b, 0x58, 0xeb, 0x70, 0x8a, 0x8b, 0xe5, 0x3b, 0x50, 0x59, 0x0e, 0xd0, 0x4a, 0x5d, 0x88, 0x7a, 0x66, 0x9b, 0x16, 0xe0, 0x29, 0x93, 0x5e, 0x5e, 0xe0, 0xb2, 0x3d, 
0x03, 0x19, 0xea, 0x04, 0x99, 0xdc, 0x22, 0x92, 0x9f, 0x2e, 0x67, 0x7c, 0xc0, 0xb7, 0xcc, 0x5f, 0x08, 0x63, 0x2a, 0x3c, 0x53, 0xc3, 0xe3, 0x66, 0x34, 0xc1, 0x97, 0x91, 0xe2, 0x45, 0xf9, 0x89, 0xce, 0x01, 0xb1, 0x36, 0x1e, 0xc5, 0x62, 0x6e, 0x1a, 0x5a, 0xce, 0x92, 0xdb, 0x4f, 0xf3, 0xef, 0xe4, 0x48, 0xce, 0xf5, 0xb9, 0xdd, 0x00, 0x50, 0x17, 0xed, 0xeb, 0xec, 0xb0, 0xb0, 0xc9, 0xe1, 0x62, 0x04, 0xf7, 0x79, 0xf2, 0x0d, 0x5d, 0x6d, 0x0e, 0xdb, 0xe4, 0xc4, 0x6d, 0x10, 0x25, 0x6a, 0x73, 0x62, 0xc6, 0xda, 0x77, 0xc4, 0xf3, 0x4d, 0x56, 0xf7, 0x06, 0xe1, 0xc7, 0xa9, 0xcd, 0x20, 0xb4, 0xae, 0x2f, 0x4a, 0xe4, 0xb5, 0x30, 0x8c, 0xd0, 0xb0, 0xf1, 0x56, 0x1e, 0xbb, 0x7a, 0x84, 0xbc, 0x02, 0xbc, 0x72, 0x42, 0xba, 0x3a, 0x51, 0xc3, 0xb9, 0x14, 0x7e, 0xdc, 0x38, 0x2e, 0x21, 0x34, 0x42, 0x4b, 0x0a, 0x98, 0x24, 0xe3, 0xfa, 0x8f, 0x89, 0xce, 0xaf, 0x9c, 0xda, 0x5f, 0xcc, 0x07, 0xa8, 0x9e, 0xa2, 0xca, 0x06, 0xfd, 0xf5, 0x99, 0xfc, 0xa3, 0x0c, 0x26, 0x11, 0xed, 0x8d, 0x6f, 0xf4, 0x96, 0xa7, 0x93, 0xb1, 0x54, 0x10, 0x7c, 0x44, 0x1b, 0xb9, 0xcb, 0xbf, 0x58, 0xab, 0x57, 0x64, 0xe9, 0x84, 0x5f, 0x23, 0x92, 0x04, 0x24, 0xe9, 0xa2, 0x65, 0x55, 0xe1, 0xfa, 0xeb, 0x07, 0x4e, 0x25, 0xc8, 0x9c, 0x90, 0x4f, 0x32, 0x5e, 0xc7, 0xbc, 0x68, 0x5b, 0x99, 0xea, 0xd9, 0x5b, 0x0c, 0x8c, 0x1a, 0xb8, 0x1d, 0xd4, 0xa8, 0xac, 0xcb, 0x26, 0xac, 0x87, 0xe6, 0x71, 0xa9, 0xc2, 0xaf, 0x5b, 0x6c, 0xab, 0xff, 0xdf, 0xfa, 0x02, 0x4f, 0xb1, 0xe7, 0x88, 0x21, 0xd6, 0xaa, 0xcd, 0xfb, 0xe4, 0x3c, 0x9e, 0x08, 0xc5, 0xa8, 0xa4, 0x53, 0x45, 0x27, 0x96, 0x75, 0x9f, 0x72, 0xd4, 0xd6, 0xe9, 0x0e, 0xac, 0xeb, 0xc8, 0x45, 0xd2, 0xc3, 0x99, 0x32, 0x68, 0x0c, 0xaa, 0x05, 0x22, 0x05, 0x9a, 0xa7, 0xbd, 0xe8, 0x9a, 0x45, 0xf2, 0xdc, 0x4a, 0xf6, 0xb9, 0x97, 0xa4, 0x70, 0x9b, 0x3c, 0x17, 0x55, 0x9d, 0x01, 0xbb, 0x70, 0xcb, 0x24, 0x1b, 0x41, 0x47, 0x26, 0x8d, 0x28, 0x61, 0x59, 0xdb, 0xd0, 0xcf, 0x05, 0x23, 0x94, 0x56, 0x73, 0xf7, 0xf8, 0x45, 0x36, 0xf7, 0x74, 0x47, 0x8d, 0xb1, 0x07, 0xe4, 0x89, 0x5b, 0xdc, 0xec, 0x59, 0x12, 0xca, 0x74, 
0x54, 0xb9, 0x09, 0x74, 0x9e, 0x4c, 0x11, 0x98, 0x20, 0xa1, 0xb1, 0x60, 0xa6, 0xdb, 0x3e, 0xe6, 0x39, 0x4b, 0x06, 0xde, 0x05, 0x6e, 0xb6, 0x2c, 0x40, 0xff, 0x6b, 0xb0, 0x88, 0x91, 0x0e, 0x60, 0x63, 0x82, 0xe3, 0x85, 0x97, 0x2f, 0xcf, 0x6b, 0x96, 0x43, 0xb9, 0xb0, 0xe1, 0xdf, 0xe2, 0x1e, 0x65, 0x39, 0x35, 0x91, 0x2d, 0xc9, 0x7a, 0x60, 0xd9, 0x48, 0x84, 0x7b, 0x5e, 0x9a, 0x91, 0x2a, 0x87, 0x56, 0xee, 0x0a, 0x1d, 0xbd, 0x4d, 0xd9, 0xb7, 0x6c, 0xaf, 0x40, 0xa9, 0x2f, 0x86, 0xeb, 0xaa, 0x37, 0xeb, 0xa2, 0x11, 0x2e, 0xe0, 0x1e, 0xe4, 0xd8, 0xb1, 0x79, 0x37, 0x39, 0x36, 0x31, 0x63, 0x0c, 0xa2, 0xc5, 0x5c, 0x80, 0x9c, 0x73, 0x08, 0x21, 0xd7, 0xb4, 0x93, 0x4b, 0x8d, 0xdb, 0x45, 0x09, 0x4d, 0x22, 0x96, 0x77, 0xc8, 0x26, 0x95, 0xe4, 0x5a, 0xc2, 0x1b, 0xce, 0x90, 0x0d, 0xe2, 0x2b, 0x16, 0xd7, 0x9c, 0x3e, 0xaa, 0x33, 0x01, 0x4c, 0x2c, 0x40, 0xcd, 0xe7, 0x3b, 0x20, 0xa6, 0x4a, 0xbe, 0xf0, 0xd8, 0xe9, 0x75, 0x49, 0x81, 0xfd, 0x85, 0x90, 0xe1, 0xea, 0x7c, 0x25, 0xb4, 0x48, 0xab, 0xd2, 0x8c, 0x0a, 0x76, 0x02, 0x2d, 0xa1, 0xc5, 0x73, 0xeb, 0x6d, 0x56, 0xec, 0x9c, 0x18, 0x03, 0x8e, 0x06, 0xe6, 0x8d, 0xed, 0xd5, 0xe9, 0x58, 0x3f, 0x59, 0xe0, 0xed, 0x1e, 0xce, 0x2d, 0xd8, 0x45, 0x6a, 0xd6, 0xbd, 0x95, 0xe0, 0x7c, 0x1b, 0x51, 0xd4, 0x9f, 0x8d, 0xda, 0x1b, 0xa0, 0x30, 0x98, 0x93, 0xfd, 0xc1, 0x64, 0xef, 0xf9, 0x62, 0x7d, 0xa9, 0x5d, 0xb2, 0x45, 0x27, 0xd1, 0x4d, 0xa3, 0x63, 0xaa, 0xa9, 0x10, 0xee, 0x25, 0x51, 0x6a, 0x91, 0x62, 0xde, 0x94, 0x3a, 0xef, 0x8c, 0xdb, 0x9b, 0x71, 0x45, 0xd3, 0x8b, 0x39, 0xbe, 0xea, 0xe8, 0x68, 0x00, 0x5c, 0xd5, 0xbe, 0xef, 0x41, 0xf3, 0x47, 0x01, 0x6a, 0xc7, 0x6a, 0x13, 0xad, 0xba, 0x0f, 0xeb, 0xc0, 0x13, 0x86, 0x42, 0xa3, 0xc5, 0x47, 0x4d, 0x39, 0x3f, 0xc7, 0xea, 0xd4, 0x34, 0xf8, 0xb1, 0x48, 0xf8, 0x46, 0x63, 0x85, 0x60, 0x49, 0x60, 0x56, 0x06, 0xe0, 0xcc, 0x36, 0xc1, 0x3f, 0x9d, 0x85, 0xa1, 0xfc, 0xf5, 0x3e, 0x56, 0x63, 0xd0, 0x60, 0xcd, 0x7a, 0x27, 0xcb, 0xec, 0xb4, 0x7f, 0xec, 0x9f, 0xa9, 0x6b, 0xd7, 0x8e, 0xeb, 0x3d, 0x21, 0xdc, 0x26, 0x80, 0x90, 0x17, 
0x29, 0x7a, 0xc4, 0xa3, 0x28, 0xb6, 0x16, 0xb0, 0xe4, 0x9f, 0xe3, 0x0e, 0x54, 0xbc, 0xb3, 0x1f, 0x68, 0xcd, 0xa8, 0x82, 0x16, 0xa8, 0xe8, 0x28, 0xd2, 0x8a, 0xde, 0x03, 0xe1, 0x0d, 0x18, 0x00, 0xc6, 0x1d, 0x48, 0xd8, 0x82, 0x26, 0x5e, 0x9f, 0xe9, 0x37, 0x53, 0x40, 0xa1, 0xc0, 0xb5, 0x88, 0xfb, 0x6e, 0xc0, 0x40, 0x7c, 0xf5, 0xb4, 0x46, 0x84, 0x1c, 0xff, 0x9d, 0x20, 0xb4, 0x28, 0xfd, 0x92, 0xf4, 0x3e, 0xeb, 0xbc, 0xc5, 0x87, 0xc8, 0x0c, 0xef, 0xdf, 0x4c, 0xec, 0x58, 0xe6, 0xe2, 0xc0, 0x5b, 0xa8, 0x3a, 0xe5, 0x6e, 0xd9, 0x2e, 0x76, 0x12, 0x74, 0x43, 0xcc, 0x74, 0x4e, 0xed, 0xe9, 0x77, 0x10, 0x78, 0x81, 0xad, 0xb9, 0xdd, 0x7f, 0x58, 0x01, 0x85, 0xbf, 0xba, 0xa4, 0x06, 0x9a, 0x59, 0x85, 0x8e, 0xb9, 0x36, 0x49, 0x02, 0x37, 0x75, 0x2b, 0xa7, 0x4c, 0x63, 0xd9, 0x43, 0x2a, 0x93, 0x7d, 0x2a, 0x54, 0x21, 0x2b, 0xdb, 0x7e, 0x15, 0x4a, 0xc4, 0x8d, 0xf7, 0xa7, 0x4e, 0xa3, 0x7e, 0xf3, 0x72, 0xdb, 0xbe, 0xf3, 0xcc, 0x75, 0x8c, 0xba, 0x83, 0xf4, 0x6c, 0x27, 0xfe, 0x01, 0xa6, 0xcc, 0xae, 0xf8, 0x4f, 0x18, 0x48, 0xae, 0x09, 0x40, 0x5f, 0x3d, 0x3e, 0x10, 0xa3, 0xdf, 0x6c, 0x1c, 0x87, 0x6f, 0x2c, 0x14, 0x24, 0x73, 0xb0, 0x35, 0x05, 0xdd, 0x73, 0xa6, 0x68, 0x33, 0xc4, 0x2e, 0xb4, 0xfe, 0xa8, 0xfc, 0xba, 0x0a, 0xc2, 0x97, 0x7a, 0x12, 0xfb, 0xb9, 0x0d, 0x54, 0xc5, 0x5f, 0xb8, 0x90, 0x50, 0xda, 0x37, 0xa8, 0xd4, 0x59, 0xfe, 0x6d, 0x03, 0x6c, 0xe6, 0x86, 0xa1, 0x1b, 0x1e, 0x9b, 0x65, 0xcd, 0x53, 0x6b, 0xac, 0x94, 0x03, 0x84, 0x79, 0xaf, 0x47, 0xc5, 0xff, 0x9c, 0x74, 0x62, 0xd0, 0x7a, 0xfe, 0xb0, 0x34, 0xd2, 0x7e, 0xb5, 0x89, 0x29, 0xb4, 0x50, 0x34, 0x1a, 0x6a, 0xf8, 0x56, 0x12, 0x26, 0x63, 0x91, 0xdb, 0xc0, 0xdd, 0x98, 0xac, 0x5c, 0x68, 0x8e, 0xfc, 0x2a, 0x85, 0x33, 0xbd, 0xfd, 0x5e, 0x75, 0x73, 0x02, 0x0c, 0xcc, 0x4b, 0xa7, 0xc0, 0xd2, 0x71, 0xf9, 0xb1, 0x1b, 0x65, 0x07, 0xda, 0x83, 0xa2, 0x91, 0xb6, 0x2f, 0x33, 0x88, 0x47, 0x0c, 0x4c, 0xa3, 0xa4, 0x5a, 0xa4, 0x2f, 0x81, 0xf2, 0xb5, 0x9b, 0x99, 0x66, 0x5f, 0xc4, 0x01, 0x7e, 0x8b, 0xf7, 0xe0, 0xfb, 0xff, 0x24, 0xd5, 0x11, 0xf3, 0xa5, 0x4f, 
0x73, 0x57, 0xe2, 0x3f, 0x2a, 0xc1, 0x55, 0xff, 0x3b, 0xfa, 0xf8, 0x45, 0x82, 0x71, 0xb7, 0x3c, 0x34, 0x5d, 0xae, 0x9a, 0xbc, 0x9c, 0x70, 0x84, 0x66, 0xe7, 0x0a, 0x51, 0x1a, 0x3a, 0x1c, 0x37, 0x57, 0xeb, 0x68, 0x30, 0xf0, 0x4e, 0x07, 0x51, 0x09, 0x77, 0x24, 0x55, 0x1d, 0x68, 0x3d, 0x84, 0xfa, 0x35, 0x4e, 0x4c, 0x76, 0x0d, 0x63, 0xa9, 0x1e, 0xb6, 0x2b, 0xb7, 0x15, 0x63, 0x29, 0x6a, 0xd4, 0xe1, 0x41, 0x9e, 0x46, 0x75, 0x09, 0x4f, 0x18, 0xd0, 0x3f, 0xa2, 0xd3, 0x1c, 0xc3, 0xcd, 0xcd, 0x6b, 0xf8, 0x91, 0x6c, 0x02, 0x59, 0xaf, 0x53, 0xec, 0xc1, 0x6e, 0xf0, 0xd0, 0x06, 0xc6, 0x7e, 0x3b, 0x6b, 0x35, 0xb8, 0xe4, 0xcf, 0x11, 0x99, 0x49, 0x2d, 0x18, 0x8d, 0xac, 0xad, 0xa8, 0x8e, 0x1e, 0x60, 0x45, 0x19, 0xd8, 0x60, 0x3c, 0xd4, 0x82, 0xf5, 0x7a, 0xd9, 0xb6, 0x02, 0x95, 0x47, 0xdc, 0x0b, 0xc0, 0xb0, 0xcc, 0xa0, 0x4a, 0x5f, 0xdb, 0xdd, 0x69, 0xea, 0x8b, 0x64, 0xae, 0xbe, 0x42, 0x8d, 0xb3, 0x24, 0xcf, 0x19, 0x8e, 0x21, 0x3f, 0xae, 0xbe, 0x28, 0x07, 0x21, 0x5b, 0x33, 0x84, 0xbe, 0xf6, 0xe4, 0x21, 0xa0, 0xe3, 0xe1, 0x52, 0xbf, 0x2f, 0x46, 0x9d, 0xb3, 0x0e, 0x18, 0x4b, 0x1b, 0x76, 0xe0, 0x3c, 0x1a, 0xc0, 0x31, 0xc7, 0x21, 0x1e, 0x5e, 0x2d, 0xa3, 0xa2, 0x90, 0xe3, 0x3e, 0x93, 0x88, 0x5b, 0xa5, 0xd8, 0x73, 0xfd, 0xa3, 0x8c, 0xd8, 0xc8, 0x0c, 0x53, 0x08, 0xf7, 0x79, 0xe9, 0x16, 0x71, 0x2a, 0x1e, 0xcf, 0x78, 0x97, 0x7e, 0x13, 0x04, 0xc6, 0x3f, 0xb7, 0x33, 0xf4, 0x30, 0x20, 0x08, 0x30, 0x71, 0x35, 0xd2, 0x7a, 0x50, 0xdc, 0x74, 0xe5, 0xc7, 0xbd, 0xc7, 0xe2, 0xcd, 0x27, 0xaf, 0xd2, 0x40, 0x4b, 0x27, 0x03, 0xa4, 0xa2, 0x15, 0x40, 0x61, 0x66, 0xe3, 0x6b, 0xb6, 0x81, 0x3a, 0xe8, 0xa8, 0xe8, 0x4f, 0x24, 0xf4, 0xc0, 0x15, 0x2f, 0x58, 0x0a, 0xa2, 0x96, 0x6f, 0xcd, 0x2a, 0xb9, 0x45, 0x26, 0x52, 0x69, 0x4c, 0x7a, 0x4f, 0x66, 0x07, 0x83, 0x4d, 0x66, 0xb2, 0x5c, 0x1e, 0x9a, 0x54, 0xa6, 0xca, 0x9b, 0x7d, 0xfa, 0x1b, 0xf7, 0xed, 0x9c, 0xf9, 0x04, 0x34, 0x64, 0x45, 0x0c, 0xcf, 0xa8, 0x2b, 0x65, 0xac, 0xcf, 0xaa, 0x19, 0x08, 0xcd, 0x46, 0xec, 0x9e, 0xb5, 0x94, 0xa0, 0x0c, 0x56, 0x92, 0x52, 0xc4, 0x5c, 
0xca, 0xc5, 0x7a, 0xae, 0xd9, 0xb9, 0x59, 0xf4, 0xb9, 0x02, 0x7b, 0xcd, 0xe2, 0xcc, 0x17, 0x8a, 0x79, 0x89, 0x82, 0x3d, 0xe2, 0x6f, 0x83, 0x1a, 0x76, 0x7b, 0x31, 0xca, 0xc4, 0x51, 0x76, 0xdd, 0x33, 0x5a, 0x66, 0x4b, 0xf4, 0xd0, 0xf7, 0xcb, 0x73, 0x02, 0xf6, 0x38, 0xc1, 0x90, 0x69, 0x4f, 0x42, 0x68, 0x09, 0xa0, 0x95, 0x32, 0xef, 0x6a, 0x57, 0x84, 0x0e, 0x0b, 0x12, 0x0e, 0xd2, 0x50, 0x4e, 0x35, 0xb5, 0xb1, 0x93, 0xce, 0x86, 0x8e, 0xfa, 0xfc, 0x29, 0x73, 0x8e, 0x21, 0x77, 0x13, 0xdb, 0x5e, 0xa3, 0xba, 0xba, 0x39, 0xe0, 0xdc, 0xb1, 0xcc, 0xb4, 0x9a, 0xa1, 0xa8, 0x27, 0x33, 0x7b, 0xbd, 0x99, 0x17, 0x12, 0xe5, 0xed, 0x9b, 0xe6, 0xda, 0x6e, 0xc3, 0xd9, 0xf1, 0x75, 0x89, 0xc6, 0x28, 0xe7, 0x27, 0x36, 0x5e, 0xad, 0x67, 0xbc, 0x08, 0x70, 0xc2, 0xbb, 0x1a, 0x1a, 0x35, 0xfb, 0xf3, 0x76, 0x2d, 0xe6, 0xe3, 0xab, 0x80, 0x7d, 0x5e, 0xb4, 0x43, 0x74, 0xdc, 0x31, 0xc6, 0x33, 0x6a, 0xca, 0xb4, 0xf1, 0xb0, 0xfe, 0xa7, 0x3e, 0xec, 0x45, 0x5a, 0xde, 0x94, 0xc1, 0x84, 0x1c, 0xe9, 0xa4, 0x9f, 0x67, 0xef, 0xa1, 0xee, 0x7f, 0x1c, 0x65, 0xa7, 0x19, 0x78, 0x8d, 0x10, 0x64, 0xb0, 0x2b, 0x7b, 0x55, 0x80, 0x43, 0x72, 0xaf, 0xd9, 0x8b, 0xc5, 0x41, 0xe0, 0x62, 0x63, 0xe4, 0x49, 0x00, 0xac, 0x19, 0x35, 0xdb, 0xaa, 0x42, 0x25, 0x2a, 0xed, 0x6d, 0x57, 0x21, 0xa2, 0xa4, 0x4a, 0xf1, 0x11, 0x08, 0x33, 0x15, 0x8c, 0xcc, 0xcf, 0x3e, 0x32, 0xf4, 0xa2, 0x61, 0x9c, 0x6b, 0x50, 0x07, 0x6f, 0x90, 0xd7, 0x3e, 0xc3, 0xa8, 0x06, 0xf4, 0x51, 0x46, 0xfa, 0x87, 0xad, 0x79, 0x17, 0xb9, 0xb9, 0xe9, 0x2e, 0x56, 0xb3, 0x45, 0xcf, 0x25, 0x27, 0xdd, 0x4d, 0xb7, 0xf4, 0x0f, 0x00, 0xa0, 0xb3, 0x3b, 0x96, 0x75, 0x3c, 0xda, 0x19, 0x44, 0x23, 0xd7, 0xd1, 0x4b, 0x60, 0x51, 0x25, 0xb6, 0xdd, 0x00, 0x42, 0x76, 0x77, 0xe7, 0x72, 0x33, 0xe9, 0x63, 0x59, 0x09, 0xc0, 0x2e, 0x3c, 0x53, 0xb1, 0x7e, 0x5d, 0x8d, 0x7d, 0x2c, 0xaf, 0xfc, 0x21, 0xb5, 0x3c, 0xff, 0xd8, 0xf4, 0x9a, 0xf9, 0xdd, 0x27, 0xf4, 0x9a, 0xd2, 0x8d, 0xea, 0x0a, 0x3d, 0x91, 0x94, 0xdc, 0x1f, 0xf9, 0xf3, 0xd0, 0xab, 0x1f, 0xc8, 0x46, 0x10, 0x1e, 0x61, 0x22, 0xb1, 0x0e, 
0xd4, 0xc3, 0xb3, 0xbb, 0x0a, 0x48, 0x63, 0x76, 0xd9, 0xe8, 0x2b, 0xc8, 0x66, 0x1c, 0x48, 0x73, 0x09, 0xaf, 0x28, 0xdc, 0x7b, 0xa6, 0x4c, 0xc4, 0x79, 0xb0, 0x1b, 0x0c, 0x52, 0x6e, 0x32, 0x79, 0xab, 0x84, 0x49, 0x72, 0x7f, 0xc2, 0x38, 0xcb, 0xf6, 0x99, 0x8f, 0x9f, 0x41, 0x65, 0xc5, 0x9d, 0xf5, 0xc8, 0xff, 0x7f, 0x3e, 0x57, 0x7e, 0xc2, 0xa8, 0xcb, 0x07, 0x3a, 0xe8, 0x30, 0x05, 0x88, 0x93, 0x3d, 0x92, 0x59, 0xf8, 0xde, 0x6d, 0x9f, 0xe5, 0x9d, 0xde, 0x47, 0x64, 0x67, 0x1f, 0x37, 0xe7, 0xfd, 0x41, 0x04, 0x05, 0xf7, 0xd7, 0xac, 0xe1, 0xf4, 0x12, 0x7a, 0x2a, 0xda, 0xd9, 0xb9, 0x31, 0x13, 0xe9, 0x7f, 0x8c, 0xeb, 0x4b, 0xeb, 0x8b, 0x80, 0x40, 0x75, 0xed, 0x19, 0x92, 0xcd, 0xf6, 0x34, 0xdf, 0x67, 0xc5, 0x4a, 0xb7, 0xb6, 0xe5, 0x12, 0xef, 0x25, 0xa0, 0xc5, 0x35, 0x4b, 0xf9, 0xfc, 0x4e, 0xed, 0x6d, 0x7c, 0x9e, 0xd0, 0x80, 0xb0, 0x54, 0x13, 0x9d, 0x18, 0x76, 0x25, 0x13, 0x1e, 0xdc, 0xa4, 0xdd, 0x7c, 0x82, 0x83, 0x88, 0x9a, 0x23, 0xd8, 0xd9, 0xc5, 0x61, 0x56, 0x3c, 0x29, 0xb1, 0x53, 0x58, 0xdb, 0x17, 0x4f, 0xe2, 0x35, 0x0a, 0xbd, 0x8c, 0xe5, 0x16, 0x22, 0x36, 0xef, 0x42, 0x97, 0x32, 0x6c, 0xa7, 0x86, 0x18, 0x0a, 0x35, 0x97, 0xfd, 0x1b, 0xd2, 0x59, 0xc2, 0x0e, 0x51, 0x95, 0xd9, 0xce, 0x47, 0x57, 0x07, 0xbb, 0xe9, 0xbf, 0x76, 0x31, 0x99, 0x77, 0x00, 0x66, 0x4d, 0xe4, 0x44, 0x94, 0xf0, 0xed, 0x0e, 0xdb, 0xc2, 0xdc, 0x2c, 0xa5, 0x10, 0x5d, 0xe4, 0x4f, 0xa7, 0xf8, 0xa1, 0x09, 0xb4, 0xec, 0x31, 0x05, 0x20, 0xf6, 0x5a, 0x20, 0x4b, 0x74, 0xe6, 0xad, 0xad, 0x34, 0x23, 0x81, 0xd2, 0x8d, 0x83, 0xc7, 0x0b, 0xde, 0xbc, 0x63, 0x3f, 0xb5, 0x1e, 0x96, 0x1e, 0x6e, 0x92, 0xb7, 0x56, 0x05, 0x06, 0x17, 0x61, 0x2b, 0xa1, 0xcd, 0x6d, 0x1b, 0xa6, 0x95, 0xf0, 0xd0, 0xdf, 0xae, 0xe1, 0x2a, 0x95, 0xc7, 0xde, 0x65, 0x4c, 0xc5, 0xb5, 0xc9, 0x73, 0x10, 0x05, 0xb3, 0x09, 0x4a, 0x13, 0x36, 0xc6, 0xe6, 0x82, 0xbe, 0x2a, 0x1e, 0xf3, 0x87, 0x53, 0xec, 0x30, 0x5a, 0x84, 0x20, 0x9f, 0x29, 0xdf, 0x8b, 0xd5, 0x26, 0xfa, 0x82, 0x60, 0xb5, 0x5c, 0x75, 0x0b, 0xdb, 0x68, 0x27, 0xf1, 0x62, 0xc5, 0x84, 0xf3, 0x01, 0xd8, 
0x48, 0x30, 0xa8, 0xd1, 0x68, 0x8c, 0x6c, 0x10, 0x21, 0x33, 0x55, 0x9a, 0xc1, 0xc2, 0x0f, 0x2f, 0x05, 0x08, 0x42, 0xd7, 0x7e, 0xb5, 0x74, 0x87, 0x2d, 0x0e, 0x19, 0x34, 0x00, 0xbb, 0x70, 0x9f, 0x4e, 0x4f, 0x3d, 0x15, 0x37, 0xb6, 0x98, 0x92, 0x03, 0xb5, 0x76, 0x9d, 0x90, 0x14, 0x91, 0x8a, 0x19, 0xc7, 0x32, 0xc2, 0x88, 0xcd, 0x9d, 0xa7, 0x95, 0xb7, 0xa9, 0x5a, 0x37, 0xe0, 0x65, 0x41, 0x7b, 0xbb, 0x5e, 0x43, 0x09, 0x14, 0xfc, 0xc5, 0xef, 0xaf, 0x15, 0x9c, 0x29, 0xa4, 0x3c, 0x97, 0xde, 0x39, 0x41, 0x63, 0x6e, 0xb7, 0xc4, 0xd8, 0x13, 0xbf, 0x37, 0x38, 0xfe, 0x06, 0xd6, 0xe8, 0xfd, 0xce, 0xe4, 0x0b, 0x5f, 0xdb, 0xe3, 0x6d, 0xda, 0x42, 0xac, 0x6d, 0x2a, 0x3e, 0x85, 0x3d, 0x2d, 0xdd, 0x55, 0x23, 0xc5, 0x3f, 0xd5, 0xc0, 0xd7, 0xa5, 0xbd, 0xe6, 0x6d, 0x90, 0x1f, 0xe1, 0x4e, 0xd2, 0x5c, 0xa8, 0x0b, 0xcb, 0x71, 0x59, 0xb0, 0xe5, 0x0a, 0xa8, 0x60, 0x7b, 0xd9, 0x27, 0x41, 0x11, 0x7f, 0x60, 0x1c, 0x9e, 0x4f, 0x37, 0x06, 0x1d, 0x66, 0x04, 0xec, 0x41, 0xd0, 0x9a, 0x20, 0x3d, 0xc9, 0x8f, 0x22, 0xc6, 0x38, 0xa2, 0x04, 0x0e, 0x64, 0xa9, 0x66, 0x35, 0x07, 0x17, 0x42, 0xd0, 0x66, 0x31, 0xb4, 0x5f, 0x3e, 0x81, 0x5b, 0x92, 0xe5, 0xbe, 0xfd, 0xe4, 0xbe, 0xaa, 0xe0, 0xb2, 0xbb, 0xc9, 0x23, 0x57, 0xaf, 0xd3, 0xc6, 0x68, 0x82, 0x8d, 0x02, 0xde, 0xaf, 0xcc, 0xf6, 0xd4, 0x87, 0x27, 0xc2, 0x04, 0x89, 0x2d, 0xa5, 0x66, 0xfb, 0x62, 0x8b, 0x72, 0x77, 0xb0, 0xda, 0xf4, 0x30, 0xe8, 0x63, 0x1d, 0xc6, 0x79, 0x00, 0xef, 0x36, 0xf3, 0x4a, 0x68, 0x84, 0x82, 0xec, 0xec, 0x2b, 0x70, 0x64, 0xc3, 0xed, 0xe4, 0x1a, 0xf5, 0xe2, 0xc3, 0x52, 0x45, 0x57, 0x20, 0xbf, 0xc1, 0x36, 0x17, 0x77, 0x9e, 0x8e, 0xfe, 0xfd, 0x34, 0xce, 0x7b, 0xa7, 0xfe, 0x21, 0x80, 0xad, 0xa6, 0x01, 0xcd, 0xa1, 0xd8, 0xf3, 0x2a, 0xe1, 0x44, 0x7e, 0x94, 0x38, 0x60, 0x5c, 0x01, 0xad, 0x1f, 0x29, 0x41, 0xbf, 0x64, 0x3b, 0x39, 0x04, 0x5f, 0x9e, 0x17, 0xff, 0x4b, 0xd5, 0x22, 0xc2, 0xce, 0x93, 0xba, 0xf1, 0xf4, 0x91, 0x64, 0xe0, 0xb5, 0x0b, 0x02, 0xf3, 0x0d, 0xb7, 0xf5, 0x8b, 0xac, 0xdc, 0xa4, 0xce, 0x75, 0x1e, 0x5d, 0x36, 0xf7, 0xae, 0x5e, 0x12, 
0xa9, 0xa6, 0xcb, 0x35, 0x5d, 0x77, 0x27, 0xeb, 0xab, 0x71, 0x46, 0x17, 0x7d, 0xe3, 0xf1, 0xce, 0x74, 0xaa, 0x32, 0x12, 0x10, 0x1d, 0xd6, 0xcc, 0x75, 0x8e, 0xcd, 0xc3, 0x67, 0x53, 0x72, 0x04, 0x20, 0x70, 0x80, 0x0c, 0x24, 0x91, 0x29, 0xe4, 0x4d, 0x62, 0xae, 0xc4, 0x3b, 0x4a, 0x95, 0x67, 0x50, 0x72, 0x98, 0x09, 0x72, 0x79, 0x04, 0xe8, 0x7e, 0x77, 0xab, 0x80, 0x12, 0xf8, 0x1e, 0x43, 0x55, 0x9a, 0x10, 0xc3, 0xed, 0xa0, 0x8c, 0x3d, 0x90, 0x36, 0xb1, 0xc3, 0xc7, 0x45, 0x8c, 0x6b, 0x57, 0xb0, 0xfa, 0x03, 0xe1, 0xe3, 0x8c, 0x59, 0x06, 0x78, 0x36, 0x06, 0x45, 0x2c, 0x23, 0x0a, 0xec, 0x83, 0xfa, 0xbe, 0x3e, 0xd9, 0xaa, 0x75, 0x50, 0xce, 0xb7, 0x4a, 0x29, 0x80, 0xda, 0xb3, 0x80, 0xf7, 0x7b, 0xbd, 0x1a, 0xb7, 0xd9, 0xfd, 0xb2, 0x46, 0x34, 0xb1, 0x2a, 0x64, 0xd7, 0x4b, 0x2a, 0x4a, 0x04, 0xdc, 0xd4, 0x31, 0xe5, 0x5d, 0x5d, 0x8c, 0xd3, 0xa4, 0x67, 0xf7, 0xcb, 0xe9, 0xd7, 0x1c, 0x2f, 0xef, 0xa6, 0x51, 0x27, 0xec, 0xfb, 0x1d, 0x6f, 0x45, 0x8e, 0x01, 0xb8, 0xb9, 0x5d, 0x58, 0xb8, 0x8a, 0x8e, 0xbc, 0xc8, 0x2c, 0x1d, 0x6b, 0xe5, 0x44, 0x0b, 0xab, 0x13, 0x38, 0xcd, 0x10, 0x28, 0x61, 0x30, 0x94, 0x03, 0x54, 0xde, 0x88, 0xdb, 0x11, 0x6a, 0xe4, 0x03, 0x3d, 0x0f, 0x09, 0xb8, 0x23, 0x23, 0xc0, 0x54, 0x11, 0x52, 0x1f, 0x22, 0xe7, 0x35, 0x5a, 0x27, 0x6f, 0x9b, 0xab, 0xcb, 0xf0, 0x67, 0xef, 0xb8, 0xd6, 0xad, 0x01, 0xab, 0xcc, 0x8f, 0xb5, 0xe2, 0xee, 0xd2, 0x65, 0x5d, 0x74, 0xc0, 0xd5, 0x1c, 0x3b, 0x54, 0xf1, 0xef, 0x72, 0x80, 0x19, 0x99, 0x9d, 0xf3, 0x0c, 0xaf, 0x1d, 0x89, 0x39, 0xfb, 0x85, 0xd8, 0xd1, 0x78, 0x13, 0x4f, 0x1d, 0x70, 0xac, 0x62, 0xa0, 0x20, 0xcd, 0xed, 0xfb, 0xf0, 0x85, 0xbd, 0xa3, 0x8b, 0x95, 0x6c, 0x25, 0x10, 0xce, 0x65, 0xc4, 0x74, 0x3a, 0xd0, 0xe2, 0x74, 0x49, 0x53, 0x81, 0x4d, 0x92, 0x8d, 0xce, 0x7e, 0x46, 0x89, 0xaa, 0xc6, 0x69, 0xad, 0x80, 0x07, 0xd5, 0x0e, 0x6d, 0x5f, 0x02, 0xe3, 0xed, 0x4d, 0xb8, 0xd0, 0x98, 0xeb, 0x98, 0x7b, 0x76, 0xa0, 0xff, 0x83, 0x37, 0x9d, 0x8c, 0x2e, 0xf0, 0x80, 0x58, 0x23, 0xb8, 0x8c, 0x2e, 0x07, 0x3b, 0x52, 0x94, 0xcb, 0xc2, 0x9b, 0x82, 0xae, 
0xc2, 0xe3, 0x47, 0x27, 0x49, 0xa8, 0xf0, 0x58, 0xe1, 0xbf, 0x0f, 0x83, 0x33, 0x79, 0x05, 0xbb, 0xdd, 0xd2, 0x00, 0x0c, 0x8e, 0x76, 0x19, 0xe3, 0x76, 0xc7, 0x70, 0xca, 0x4d, 0x47, 0x92, 0xe9, 0x78, 0x0a, 0xae, 0x8c, 0x77, 0xfe, 0xd3, 0x14, 0x96, 0xe4, 0x03, 0xb1, 0xd4, 0x68, 0x5b, 0x8c, 0x7b, 0xad, 0x15, 0x73, 0xdc, 0x85, 0xe8, 0xb0, 0xd3, 0x69, 0x67, 0xa6, 0x13, 0xdc, 0xd7, 0xea, 0xd8, 0x4c, 0x0f, 0x54, 0xea, 0xe9, 0x87, 0xa6, 0x57, 0x07, 0x99])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext

def test_kravatte_WBC_k_32_msg_64_tweak_4(self, test_workers):
my_key = bytes([0x9d, 0x7e, 0x5f, 0x40, 0x21, 0x02, 0xe3, 0xc4, 0xa4, 0x85, 0x66, 0x47, 0x28, 0x09, 0xea, 0xcb, 0xab, 0x8c, 0x6d, 0x4e, 0x2f, 0x10, 0xf1, 0xd2, 0xb2, 0x93, 0x74, 0x55, 0x36, 0x17, 0xf8, 0xd9])
my_tweak = bytes([0x67, 0x67, 0x67, 0x67])
my_message = bytes([0x61, 0x42, 0x23, 0x04, 0xe5, 0xc6, 0xa7, 0x88, 0x68, 0x49, 0x2a, 0x0b, 0xec, 0xcd, 0xae, 0x8f, 0x6f, 0x50, 0x31, 0x12, 0xf3, 0xd4, 0xb5, 0x96, 0x76, 0x57, 0x38, 0x19, 0xfa, 0xdb, 0xbc, 0x9d, 0x7d, 0x5e, 0x3f, 0x20, 0x01, 0xe2, 0xc3, 0xa4, 0x84, 0x65, 0x46, 0x27, 0x08, 0xe9, 0xca, 0xab, 0x8b, 0x6c, 0x4d, 0x2e, 0x0f, 0xf0, 0xd1, 0xb2, 0x92, 0x73, 0x54, 0x35, 0x16, 0xf7, 0xd8, 0xb9])
real_ciphertext = bytes([0x15, 0x09, 0x95, 0x6c, 0xb7, 0x0f, 0x71, 0xe7, 0x19, 0x5d, 0x83, 0x34, 0x44, 0xb8, 0xcd, 0x2a, 0x51, 0x62, 0x0c, 0x39, 0x40, 0x7f, 0xa7, 0xeb, 0xe9, 0xf0, 0xe1, 0xae, 0xba, 0xbf, 0x9f, 0x82, 0x57, 0x8a, 0x1b, 0x20, 0x57, 0xfe, 0x97, 0xe9, 0xcf, 0xe4, 0x1b, 0x46, 0xd7, 0x15, 0x14, 0xca, 0x8c, 0x53, 0xc7, 0x34, 0x87, 0xac, 0x5a, 0xd5, 0xee, 0x45, 0xa0, 0xbc, 0xcd, 0x81, 0x76, 0x3c])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext

def test_kravatte_WBC_k_32_msg_64_tweak_8(self, test_workers):
my_key = bytes([0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a])
my_tweak = bytes([0x07, 0xf8, 0xe9, 0xda, 0xcb, 0xbc, 0xad, 0x9e])
my_message = bytes([0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42, 0x41, 0x40, 0x3f, 0x3e])
real_ciphertext = bytes([0x4c, 0x37, 0x6d, 0x66, 0x5f, 0xf7, 0xac, 0x93, 0x3f, 0xda, 0xd0, 0xcf, 0x8c, 0xae, 0x98, 0x3f, 0xd5, 0xd4, 0xce, 0x40, 0x62, 0x80, 0x5a, 0xab, 0x38, 0xa0, 0x94, 0x56, 0x4b, 0xe3, 0x15, 0x44, 0x34, 0x0c, 0xa2, 0xb4, 0xb4, 0x07, 0x96, 0xb0, 0xc8, 0x88, 0x4f, 0xde, 0x4b, 0x44, 0xb2, 0xf8, 0xf5, 0xa3, 0x49, 0xdb, 0x62, 0xd9, 0xc5, 0x50, 0xe0, 0x8c, 0x2a, 0x32, 0x4d, 0x51, 0xa6, 0xd6])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext

def test_kravatte_WBC_k_32_msg_64_tweak_18(self, test_workers):
my_key = bytes([0xff, 0xf8, 0xf1, 0xea, 0xe3, 0xdc, 0xd5, 0xce, 0xc7, 0xc0, 0xb9, 0xb2, 0xab, 0xa4, 0x9d, 0x96, 0x8f, 0x88, 0x81, 0x7a, 0x73, 0x6c, 0x65, 0x5e, 0x57, 0x50, 0x49, 0x42, 0x3b, 0x34, 0x2d, 0x26])
my_tweak = bytes([0x97, 0x58, 0x19, 0xda, 0x9a, 0x5b, 0x1c, 0xdd, 0x9d, 0x5e, 0x1f, 0xe0, 0xa0, 0x61, 0x22, 0xe3, 0xa3, 0x64])
my_message = bytes([0xc3, 0xbc, 0xb5, 0xae, 0xa7, 0xa0, 0x99, 0x92, 0x8b, 0x84, 0x7d, 0x76, 0x6f, 0x68, 0x61, 0x5a, 0x53, 0x4c, 0x45, 0x3e, 0x37, 0x30, 0x29, 0x22, 0x1b, 0x14, 0x0d, 0x06, 0xff, 0xf8, 0xf1, 0xea, 0xe2, 0xdb, 0xd4, 0xcd, 0xc6, 0xbf, 0xb8, 0xb1, 0xaa, 0xa3, 0x9c, 0x95, 0x8e, 0x87, 0x80, 0x79, 0x72, 0x6b, 0x64, 0x5d, 0x56, 0x4f, 0x48, 0x41, 0x3a, 0x33, 0x2c, 0x25, 0x1e, 0x17, 0x10, 0x09])
real_ciphertext = bytes([0x3a, 0x37, 0x42, 0x3c, 0xcc, 0x63, 0x02, 0xbb, 0x06, 0x17, 0x38, 0x76, 0x55, 0xee, 0xda, 0xa1, 0x96, 0x6b, 0xc4, 0xa6, 0x53, 0xec, 0x53, 0x05, 0x40, 0xd2, 0xb1, 0x24, 0xcc, 0x71, 0x7e, 0x78, 0x48, 0x51, 0x16, 0x57, 0x79, 0xd5, 0x9b, 0x93, 0xaa, 0xc0, 0xb8, 0x30, 0x45, 0xdc, 0x4e, 0x83, 0x40, 0x1b, 0x68, 0x73, 0xe7, 0x48, 0x5e, 0x51, 0x23, 0x2a, 0x2f, 0x4c, 0x5c, 0x8e, 0xbd, 0x8d])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext

def test_kravatte_WBC_k_32_msg_64_tweak_24(self, test_workers):
my_key = bytes([0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a])
my_tweak = bytes([0x87, 0x78, 0x69, 0x5a, 0x4b, 0x3c, 0x2d, 0x1e, 0x0f, 0x00, 0xf1, 0xe2, 0xd3, 0xc4, 0xb5, 0xa6, 0x96, 0x87, 0x78, 0x69, 0x5a, 0x4b, 0x3c, 0x2d])
my_message = bytes([0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae])
real_ciphertext = bytes([0x06, 0x0a, 0x44, 0x4d, 0xd8, 0xa2, 0x89, 0xb1, 0x91, 0xfe, 0x87, 0x64, 0x46, 0x53, 0xe5, 0xf4, 0xaa, 0xa1, 0xf0, 0xdf, 0x81, 0x96, 0xc6, 0x48, 0x9d, 0x7a, 0x4a, 0x05, 0xb0, 0xfa, 0xea, 0xc3, 0x88, 0x2e, 0x06, 0x0e, 0xf5, 0xe7, 0x3f, 0xae, 0xa3, 0x3c, 0xb0, 0x5f, 0x72, 0x20, 0x5b, 0x07, 0x94, 0x5a, 0x0c, 0xd7, 0x18, 0x2a, 0xbe, 0x57, 0x67, 0xb8, 0x4d, 0xda, 0x22, 0x89, 0x2f, 0xb3])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext

def test_kravatte_WBC_k_32_msg_64_tweak_32(self, test_workers):
my_key = bytes([0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46, 0x45, 0x44, 0x43, 0x42])
my_tweak = bytes([0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf5, 0xe6, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf5])
my_message = bytes([0x25, 0x24, 0x23, 0x22, 0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6])
real_ciphertext = bytes([0xc2, 0x15, 0xa7, 0x2e, 0x26, 0xd5, 0x85, 0xea, 0x87, 0x9c, 0x1e, 0xb4, 0x6e, 0x97, 0x5c, 0x81, 0xa5, 0xa9, 0x30, 0xcc, 0x56, 0x09, 0x14, 0x45, 0x5d, 0x87, 0x57, 0x7b, 0x3b, 0x4c, 0xdd, 0x0b, 0x03, 0xe9, 0xce, 0x28, 0xaf, 0x02, 0x88, 0x0d, 0x63, 0x60, 0xf1, 0x6d, 0xb5, 0xe2, 0x91, 0xd3, 0xf7, 0x2a, 0x1b, 0x97, 0x30, 0xe5, 0xa8, 0x24, 0x16, 0x6d, 0xd0, 0x6a, 0x25, 0xc4, 0x33, 0xc4])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext

def test_kravatte_WBC_k_32_msg_64_tweak_64(self, test_workers):
my_key = bytes([0x41, 0x40, 0x3f, 0x3e, 0x3d, 0x3c, 0x3b, 0x3a, 0x39, 0x38, 0x37, 0x36, 0x35, 0x34, 0x33, 0x32, 0x31, 0x30, 0x2f, 0x2e, 0x2d, 0x2c, 0x2b, 0x2a, 0x29, 0x28, 0x27, 0x26, 0x25, 0x24, 0x23, 0x22])
my_tweak = bytes([0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf5, 0xe6, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf5, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13])
my_message = bytes([0x05, 0x04, 0x03, 0x02, 0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6])
real_ciphertext = bytes([0xc6, 0x64, 0x9c, 0xbd, 0xe9, 0x90, 0x96, 0xff, 0x68, 0x01, 0xa2, 0x8f, 0x17, 0x21, 0xf7, 0xc7, 0xef, 0xa4, 0x86, 0x54, 0x78, 0xd1, 0xdc, 0xaa, 0x92, 0xfd, 0xcf, 0xc7, 0x87, 0xe6, 0xa5, 0x63, 0x8c, 0x1c, 0x95, 0x53, 0xbc, 0xe9, 0xdf, 0x88, 0x59, 0xa6, 0x2e, 0x8a, 0x58, 0xb7, 0x5e, 0x2a, 0xf8, 0x97, 0xa0, 0x9d, 0x2b, 0x07, 0x86, 0x9c, 0xbc, 0xd1, 0xa6, 0xb1, 0x7e, 0xba, 0x01, 0xaf])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_32_msg_64_tweak_96(self, test_workers):
my_key = bytes([0x21, 0x20, 0x1f, 0x1e, 0x1d, 0x1c, 0x1b, 0x1a, 0x19, 0x18, 0x17, 0x16, 0x15, 0x14, 0x13, 0x12, 0x11, 0x10, 0x0f, 0x0e, 0x0d, 0x0c, 0x0b, 0x0a, 0x09, 0x08, 0x07, 0x06, 0x05, 0x04, 0x03, 0x02])
my_tweak = bytes([0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf5, 0xe6, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf5, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x12, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31])
my_message = bytes([0xe5, 0xe4, 0xe3, 0xe2, 0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6])
real_ciphertext = bytes([0xe1, 0xa1, 0xc2, 0x49, 0xb5, 0x3c, 0x41, 0x28, 0xe1, 0x9c, 0xb1, 0x8e, 0x55, 0x15, 0x66, 0xa0, 0xe1, 0x54, 0xb6, 0x2c, 0x72, 0xd3, 0xad, 0xfb, 0x61, 0x7f, 0x30, 0x1b, 0x8b, 0x35, 0x33, 0x5f, 0x7d, 0x7c, 0x04, 0x4e, 0x2d, 0xa3, 0x94, 0x72, 0x72, 0x44, 0xba, 0x32, 0x49, 0x7c, 0x1e, 0x7b, 0xd7, 0x48, 0x94, 0xe7, 0x12, 0xd1, 0x9f, 0x1d, 0xb3, 0xaa, 0x71, 0x13, 0x51, 0xc7, 0x29, 0x46])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_32_msg_64_tweak_128(self, test_workers):
my_key = bytes([0x01, 0x00, 0xff, 0xfe, 0xfd, 0xfc, 0xfb, 0xfa, 0xf9, 0xf8, 0xf7, 0xf6, 0xf5, 0xf4, 0xf3, 0xf2, 0xf1, 0xf0, 0xef, 0xee, 0xed, 0xec, 0xeb, 0xea, 0xe9, 0xe8, 0xe7, 0xe6, 0xe5, 0xe4, 0xe3, 0xe2])
my_tweak = bytes([0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf5, 0xe6, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf5, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x12, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x21, 0x12, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x30, 0x21, 0x12, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f])
my_message = bytes([0xc5, 0xc4, 0xc3, 0xc2, 0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86])
real_ciphertext = bytes([0xdd, 0xf2, 0xc4, 0xc5, 0x95, 0x9a, 0xe2, 0x45, 0xe0, 0xab, 0x11, 0xc5, 0x85, 0x9d, 0x46, 0x51, 0xde, 0x8d, 0xeb, 0x78, 0xc5, 0xf9, 0x2d, 0xd3, 0x78, 0x63, 0x19, 0x2c, 0xfa, 0x72, 0x9e, 0x36, 0x83, 0xc8, 0x09, 0x90, 0xe1, 0xac, 0x06, 0x9c, 0xbe, 0x25, 0x9d, 0x7d, 0x8b, 0x08, 0x32, 0x13, 0x8c, 0xb9, 0xcb, 0xe3, 0x0b, 0xd4, 0xb7, 0x07, 0x54, 0xe0, 0xae, 0xb1, 0x8b, 0x37, 0xa2, 0x89])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_32_msg_64_tweak_160(self, test_workers):
my_key = bytes([0xe1, 0xe0, 0xdf, 0xde, 0xdd, 0xdc, 0xdb, 0xda, 0xd9, 0xd8, 0xd7, 0xd6, 0xd5, 0xd4, 0xd3, 0xd2, 0xd1, 0xd0, 0xcf, 0xce, 0xcd, 0xcc, 0xcb, 0xca, 0xc9, 0xc8, 0xc7, 0xc6, 0xc5, 0xc4, 0xc3, 0xc2])
my_tweak = bytes([0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf5, 0xe6, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf5, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x12, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x21, 0x12, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x30, 0x21, 0x12, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x3f, 0x30, 0x21, 0x12, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4e, 0x3f, 0x30, 0x21, 0x12, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d])
my_message = bytes([0xa5, 0xa4, 0xa3, 0xa2, 0xa1, 0xa0, 0x9f, 0x9e, 0x9d, 0x9c, 0x9b, 0x9a, 0x99, 0x98, 0x97, 0x96, 0x95, 0x94, 0x93, 0x92, 0x91, 0x90, 0x8f, 0x8e, 0x8d, 0x8c, 0x8b, 0x8a, 0x89, 0x88, 0x87, 0x86, 0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66])
real_ciphertext = bytes([0xdf, 0xfd, 0xd9, 0xff, 0x61, 0x46, 0x2f, 0xa7, 0x81, 0x7e, 0xd1, 0xcf, 0x2b, 0xf4, 0x74, 0x6e, 0x96, 0x7f, 0x58, 0x55, 0x5c, 0xf4, 0xfc, 0xae, 0xda, 0x46, 0xe2, 0x98, 0x6e, 0x6f, 0x39, 0x55, 0x52, 0x19, 0xd3, 0x9d, 0xaf, 0x80, 0xf4, 0xb7, 0x64, 0x60, 0xa6, 0xdf, 0xa4, 0xd3, 0x3b, 0x82, 0x05, 0xa1, 0x7c, 0xbc, 0xa2, 0x66, 0x6f, 0x9d, 0x88, 0x7b, 0x7a, 0x37, 0x31, 0x6b, 0x7c, 0x1c])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
def test_kravatte_WBC_k_32_msg_64_tweak_192(self, test_workers):
my_key = bytes([0xc1, 0xc0, 0xbf, 0xbe, 0xbd, 0xbc, 0xbb, 0xba, 0xb9, 0xb8, 0xb7, 0xb6, 0xb5, 0xb4, 0xb3, 0xb2, 0xb1, 0xb0, 0xaf, 0xae, 0xad, 0xac, 0xab, 0xaa, 0xa9, 0xa8, 0xa7, 0xa6, 0xa5, 0xa4, 0xa3, 0xa2])
my_tweak = bytes([0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf5, 0xe6, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf5, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x04, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x13, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x22, 0x12, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x31, 0x21, 0x12, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x40, 0x30, 0x21, 0x12, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4f, 0x3f, 0x30, 0x21, 0x12, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5e, 0x4e, 0x3f, 0x30, 0x21, 0x12, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6d, 0x5d, 0x4e, 0x3f, 0x30, 0x21, 0x12, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b, 0x7c, 0x6c, 0x5d, 0x4e, 0x3f, 0x30, 0x21, 0x12, 0x03, 0xf4, 0xe5, 0xd6, 0xc7, 0xb8, 0xa9, 0x9a, 0x8b])
my_message = bytes([0x85, 0x84, 0x83, 0x82, 0x81, 0x80, 0x7f, 0x7e, 0x7d, 0x7c, 0x7b, 0x7a, 0x79, 0x78, 0x77, 0x76, 0x75, 0x74, 0x73, 0x72, 0x71, 0x70, 0x6f, 0x6e, 0x6d, 0x6c, 0x6b, 0x6a, 0x69, 0x68, 0x67, 0x66, 0x65, 0x64, 0x63, 0x62, 0x61, 0x60, 0x5f, 0x5e, 0x5d, 0x5c, 0x5b, 0x5a, 0x59, 0x58, 0x57, 0x56, 0x55, 0x54, 0x53, 0x52, 0x51, 0x50, 0x4f, 0x4e, 0x4d, 0x4c, 0x4b, 0x4a, 0x49, 0x48, 0x47, 0x46])
real_ciphertext = bytes([0x28, 0x7b, 0xd4, 0x83, 0xbf, 0xf5, 0xb0, 0x5a, 0xc5, 0xf8, 0x50, 0xc6, 0x5a, 0xe3, 0xc6, 0x99, 0xca, 0x17, 0x06, 0xcd, 0x37, 0x0a, 0x59, 0x1f, 0xe3, 0xb3, 0x1a, 0x7a, 0x90, 0xe5, 0x5c, 0x07, 0xdd, 0x56, 0x87, 0xee, 0x8e, 0xc9, 0x46, 0xf6, 0xa9, 0x91, 0xb6, 0xba, 0x4e, 0x6f, 0x69, 0x0a, 0x47, 0xba, 0x37, 0x52, 0x52, 0x29, 0xaf, 0xe7, 0x05, 0xe6, 0x4b, 0x61, 0xbc, 0x03, 0xe4, 0x5d])
kravatte_wbc = KravatteWBC(len(my_message), my_tweak, my_key, workers=test_workers)
ciphertext = kravatte_wbc.encrypt(my_message)
plaintext = kravatte_wbc.decrypt(ciphertext)
assert real_ciphertext == ciphertext
assert my_message == plaintext
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Date : 2017-12-15 16:03:42
# @Author : jimmy (jimmywangheng@qq.com)
# @Link : http://sdcs.sysu.edu.cn
# @Version : $Id$
import os
import numpy as np
import time
import datetime
import random
import multiprocessing
import math
from itertools import groupby
from utils import Triple, getRel
import torch
import torch.autograd as autograd
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
from sklearn.metrics.pairwise import pairwise_distances
from projection import *
USE_CUDA = torch.cuda.is_available()
if USE_CUDA:
longTensor = torch.cuda.LongTensor
floatTensor = torch.cuda.FloatTensor
else:
longTensor = torch.LongTensor
floatTensor = torch.FloatTensor
def isHit10(triple, tree, cal_embedding, tripleDict, isTail):
# If isTail == True, evaluate the prediction of tail entity
if isTail == True:
k = 0
wrongCount = 0
while wrongCount < 10:
            k += 15  # widen the search by 15 nearest neighbors each round
            tail_dist, tail_ind = tree.query(cal_embedding, k=k)
for elem in tail_ind[0][k - 15: k]:
if triple.t == elem:
return True
elif (triple.h, elem, triple.r) in tripleDict:
continue
else:
wrongCount += 1
if wrongCount > 9:
return False
# If isTail == False, evaluate the prediction of head entity
else:
k = 0
wrongCount = 0
while wrongCount < 10:
            k += 15  # widen the search by 15 nearest neighbors each round
            head_dist, head_ind = tree.query(cal_embedding, k=k)
for elem in head_ind[0][k - 15: k]:
if triple.h == elem:
return True
elif (elem, triple.t, triple.r) in tripleDict:
continue
else:
wrongCount += 1
if wrongCount > 9:
return False
# Find the rank of the ground truth tail in the distance array.
# If (head, num, rel) is in tripleDict (a known true triple),
# skip it without counting.
def argwhereTail(head, tail, rel, array, tripleDict):
wrongAnswer = 0
for num in array:
if num == tail:
return wrongAnswer
elif (head, num, rel) in tripleDict:
continue
else:
wrongAnswer += 1
return wrongAnswer
# Find the rank of the ground truth head in the distance array.
# If (num, tail, rel) is in tripleDict (a known true triple),
# skip it without counting.
def argwhereHead(head, tail, rel, array, tripleDict):
wrongAnswer = 0
for num in array:
if num == head:
return wrongAnswer
elif (num, tail, rel) in tripleDict:
continue
else:
wrongAnswer += 1
return wrongAnswer
def pairwise_L1_distances(A, B):
dist = torch.sum(torch.abs(A.unsqueeze(1) - B.unsqueeze(0)), dim=2)
return dist
def pairwise_L2_distances(A, B):
AA = torch.sum(A ** 2, dim=1).unsqueeze(1)
BB = torch.sum(B ** 2, dim=1).unsqueeze(0)
dist = torch.mm(A, torch.transpose(B, 0, 1))
dist *= -2
dist += AA
dist += BB
return dist
def evaluation_transE_helper(testList, tripleDict, ent_embeddings,
rel_embeddings, L1_flag, filter, head=0):
    # embeddings are numpy arrays
headList = [triple.h for triple in testList]
tailList = [triple.t for triple in testList]
relList = [triple.r for triple in testList]
h_e = ent_embeddings[headList]
t_e = ent_embeddings[tailList]
r_e = rel_embeddings[relList]
# Evaluate the prediction of only head entities
if head == 1:
c_h_e = t_e - r_e
if L1_flag == True:
dist = pairwise_distances(c_h_e, ent_embeddings, metric='manhattan')
else:
dist = pairwise_distances(c_h_e, ent_embeddings, metric='euclidean')
rankArrayHead = np.argsort(dist, axis=1)
# Don't check whether it is false negative
if filter == False:
rankListHead = [int(np.argwhere(elem[1]==elem[0])) for elem in zip(headList, rankArrayHead)]
# Check whether it is false negative
else:
rankListHead = [argwhereHead(elem[0], elem[1], elem[2], elem[3], tripleDict)
for elem in zip(headList, tailList, relList, rankArrayHead)]
isHit10ListHead = [x for x in rankListHead if x < 10]
totalRank = sum(rankListHead)
hit10Count = len(isHit10ListHead)
tripleCount = len(rankListHead)
# Evaluate the prediction of only tail entities
elif head == 2:
c_t_e = h_e + r_e
if L1_flag == True:
dist = pairwise_distances(c_t_e, ent_embeddings, metric='manhattan')
else:
dist = pairwise_distances(c_t_e, ent_embeddings, metric='euclidean')
rankArrayTail = np.argsort(dist, axis=1)
if filter == False:
rankListTail = [int(np.argwhere(elem[1]==elem[0])) for elem in zip(tailList, rankArrayTail)]
else:
rankListTail = [argwhereTail(elem[0], elem[1], elem[2], elem[3], tripleDict)
for elem in zip(headList, tailList, relList, rankArrayTail)]
isHit10ListTail = [x for x in rankListTail if x < 10]
totalRank = sum(rankListTail)
hit10Count = len(isHit10ListTail)
tripleCount = len(rankListTail)
# Evaluate the prediction of both head and tail entities
else:
c_t_e = h_e + r_e
c_h_e = t_e - r_e
if L1_flag == True:
dist = pairwise_distances(c_t_e, ent_embeddings, metric='manhattan')
else:
dist = pairwise_distances(c_t_e, ent_embeddings, metric='euclidean')
rankArrayTail = np.argsort(dist, axis=1)
if filter == False:
rankListTail = [int(np.argwhere(elem[1]==elem[0])) for elem in zip(tailList, rankArrayTail)]
else:
rankListTail = [argwhereTail(elem[0], elem[1], elem[2], elem[3], tripleDict)
for elem in zip(headList, tailList, relList, rankArrayTail)]
isHit10ListTail = [x for x in rankListTail if x < 10]
if L1_flag == True:
dist = pairwise_distances(c_h_e, ent_embeddings, metric='manhattan')
else:
dist = pairwise_distances(c_h_e, ent_embeddings, metric='euclidean')
rankArrayHead = np.argsort(dist, axis=1)
if filter == False:
rankListHead = [int(np.argwhere(elem[1]==elem[0])) for elem in zip(headList, rankArrayHead)]
else:
rankListHead = [argwhereHead(elem[0], elem[1], elem[2], elem[3], tripleDict)
for elem in zip(headList, tailList, relList, rankArrayHead)]
isHit10ListHead = [x for x in rankListHead if x < 10]
totalRank = sum(rankListTail) + sum(rankListHead)
hit10Count = len(isHit10ListTail) + len(isHit10ListHead)
tripleCount = len(rankListTail) + len(rankListHead)
return hit10Count, totalRank, tripleCount
class MyProcessTransE(multiprocessing.Process):
def __init__(self, L, tripleDict, ent_embeddings,
rel_embeddings, L1_flag, filter, queue=None, head=0):
super(MyProcessTransE, self).__init__()
self.L = L
self.queue = queue
self.tripleDict = tripleDict
self.ent_embeddings = ent_embeddings
self.rel_embeddings = rel_embeddings
self.L1_flag = L1_flag
self.filter = filter
self.head = head
def run(self):
while True:
testList = self.queue.get()
try:
self.process_data(testList, self.tripleDict, self.ent_embeddings, self.rel_embeddings,
self.L1_flag, self.filter, self.L, self.head)
            except Exception:
                # Retry once after a short pause; a second failure propagates
time.sleep(5)
self.process_data(testList, self.tripleDict, self.ent_embeddings, self.rel_embeddings,
self.L1_flag, self.filter, self.L, self.head)
self.queue.task_done()
def process_data(self, testList, tripleDict, ent_embeddings, rel_embeddings,
L1_flag, filter, L, head):
hit10Count, totalRank, tripleCount = evaluation_transE_helper(testList, tripleDict, ent_embeddings,
rel_embeddings, L1_flag, filter, head)
L.append((hit10Count, totalRank, tripleCount))
# Use multiprocessing to speed up evaluation
def evaluation_transE(testList, tripleDict, ent_embeddings, rel_embeddings,
L1_flag, filter, k=0, num_processes=multiprocessing.cpu_count(), head=0):
    # embeddings are numpy arrays
    # Optionally evaluate a random subset of k test triples;
    # if k exceeds the test set size, sample with replacement
    if k > len(testList):
        testList = random.choices(testList, k=k)
    elif k > 0:
        testList = random.sample(testList, k=k)
# Split the testList into #num_processes parts
len_split = math.ceil(len(testList) / num_processes)
testListSplit = [testList[i : i + len_split] for i in range(0, len(testList), len_split)]
with multiprocessing.Manager() as manager:
# Create a public writable list to store the result
L = manager.list()
queue = multiprocessing.JoinableQueue()
workerList = []
for i in range(num_processes):
worker = MyProcessTransE(L, tripleDict, ent_embeddings, rel_embeddings,
L1_flag, filter, queue=queue, head=head)
workerList.append(worker)
worker.daemon = True
worker.start()
for subList in testListSplit:
queue.put(subList)
queue.join()
resultList = list(L)
# Terminate the worker after execution, to avoid memory leaking
for worker in workerList:
worker.terminate()
if head == 1 or head == 2:
hit10 = sum([elem[0] for elem in resultList]) / len(testList)
meanrank = sum([elem[1] for elem in resultList]) / len(testList)
else:
hit10 = sum([elem[0] for elem in resultList]) / (2 * len(testList))
meanrank = sum([elem[1] for elem in resultList]) / (2 * len(testList))
print('Meanrank: %.6f' % meanrank)
print('Hit@10: %.6f' % hit10)
return hit10, meanrank
def evaluation_transH_helper(testList, tripleDict, ent_embeddings,
rel_embeddings, norm_embeddings, L1_flag, filter, head=0):
    # embeddings are plain torch tensors (not autograd Variables)
    # All triples in testList are assumed to share the same relation
headList = [triple.h for triple in testList]
tailList = [triple.t for triple in testList]
relList = [triple.r for triple in testList]
h_e = ent_embeddings[headList]
t_e = ent_embeddings[tailList]
r_e = rel_embeddings[relList]
this_rel = relList[0]
this_norm_emb = norm_embeddings[this_rel]
this_proj_all_e = projection_transH_pytorch(ent_embeddings, this_norm_emb)
this_proj_all_e = this_proj_all_e.cpu().numpy()
if head == 1:
proj_t_e = projection_transH_pytorch(t_e, this_norm_emb)
c_h_e = proj_t_e - r_e
c_h_e = c_h_e.cpu().numpy()
if L1_flag == True:
dist = pairwise_distances(c_h_e, this_proj_all_e, metric='manhattan')
else:
dist = pairwise_distances(c_h_e, this_proj_all_e, metric='euclidean')
rankArrayHead = np.argsort(dist, axis=1)
if filter == False:
rankListHead = [int(np.argwhere(elem[1]==elem[0])) for elem in zip(headList, rankArrayHead)]
else:
rankListHead = [argwhereHead(elem[0], elem[1], elem[2], elem[3], tripleDict)
for elem in zip(headList, tailList, relList, rankArrayHead)]
isHit10ListHead = [x for x in rankListHead if x < 10]
totalRank = sum(rankListHead)
hit10Count = len(isHit10ListHead)
tripleCount = len(rankListHead)
elif head == 2:
proj_h_e = projection_transH_pytorch(h_e, this_norm_emb)
c_t_e = proj_h_e + r_e
c_t_e = c_t_e.cpu().numpy()
if L1_flag == True:
dist = pairwise_distances(c_t_e, this_proj_all_e, metric='manhattan')
else:
dist = pairwise_distances(c_t_e, this_proj_all_e, metric='euclidean')
rankArrayTail = np.argsort(dist, axis=1)
if filter == False:
rankListTail = [int(np.argwhere(elem[1]==elem[0])) for elem in zip(tailList, rankArrayTail)]
else:
rankListTail = [argwhereTail(elem[0], elem[1], elem[2], elem[3], tripleDict)
for elem in zip(headList, tailList, relList, rankArrayTail)]
isHit10ListTail = [x for x in rankListTail if x < 10]
totalRank = sum(rankListTail)
hit10Count = len(isHit10ListTail)
tripleCount = len(rankListTail)
else:
proj_h_e = projection_transH_pytorch(h_e, this_norm_emb)
c_t_e = proj_h_e + r_e
proj_t_e = projection_transH_pytorch(t_e, this_norm_emb)
c_h_e = proj_t_e - r_e
c_t_e = c_t_e.cpu().numpy()
c_h_e = c_h_e.cpu().numpy()
if L1_flag == True:
dist = pairwise_distances(c_t_e, this_proj_all_e, metric='manhattan')
else:
dist = pairwise_distances(c_t_e, this_proj_all_e, metric='euclidean')
rankArrayTail = np.argsort(dist, axis=1)
if filter == False:
rankListTail = [int(np.argwhere(elem[1]==elem[0])) for elem in zip(tailList, rankArrayTail)]
else:
rankListTail = [argwhereTail(elem[0], elem[1], elem[2], elem[3], tripleDict)
for elem in zip(headList, tailList, relList, rankArrayTail)]
isHit10ListTail = [x for x in rankListTail if x < 10]
if L1_flag == True:
dist = pairwise_distances(c_h_e, this_proj_all_e, metric='manhattan')
else:
dist = pairwise_distances(c_h_e, this_proj_all_e, metric='euclidean')
rankArrayHead = np.argsort(dist, axis=1)
if filter == False:
rankListHead = [int(np.argwhere(elem[1]==elem[0])) for elem in zip(headList, rankArrayHead)]
else:
rankListHead = [argwhereHead(elem[0], elem[1], elem[2], elem[3], tripleDict)
for elem in zip(headList, tailList, relList, rankArrayHead)]
isHit10ListHead = [x for x in rankListHead if x < 10]
totalRank = sum(rankListTail) + sum(rankListHead)
hit10Count = len(isHit10ListTail) + len(isHit10ListHead)
tripleCount = len(rankListTail) + len(rankListHead)
return hit10Count, totalRank, tripleCount
class MyProcessTransH(multiprocessing.Process):
def __init__(self, L, tripleDict, ent_embeddings,
rel_embeddings, norm_embeddings, L1_flag, filter, queue=None, head=0):
super(MyProcessTransH, self).__init__()
self.L = L
self.queue = queue
self.tripleDict = tripleDict
self.ent_embeddings = ent_embeddings
self.rel_embeddings = rel_embeddings
self.norm_embeddings = norm_embeddings
self.L1_flag = L1_flag
self.filter = filter
self.head = head
def run(self):
while True:
testList = self.queue.get()
try:
self.process_data(testList, self.tripleDict, self.ent_embeddings, self.rel_embeddings,
self.norm_embeddings, self.L1_flag, self.filter, self.L, self.head)
            except Exception:
                # Retry once after a short pause; a second failure propagates
time.sleep(5)
self.process_data(testList, self.tripleDict, self.ent_embeddings, self.rel_embeddings,
self.norm_embeddings, self.L1_flag, self.filter, self.L, self.head)
self.queue.task_done()
def process_data(self, testList, tripleDict, ent_embeddings, rel_embeddings,
norm_embeddings, L1_flag, filter, L, head):
hit10Count, totalRank, tripleCount = evaluation_transH_helper(testList, tripleDict, ent_embeddings,
rel_embeddings, norm_embeddings, L1_flag, filter, head)
L.append((hit10Count, totalRank, tripleCount))
def evaluation_transH(testList, tripleDict, ent_embeddings, rel_embeddings,
norm_embeddings, L1_flag, filter, k=0, num_processes=multiprocessing.cpu_count(), head=0):
    # embeddings are plain torch tensors (not autograd Variables)
    # Optionally evaluate a random subset of k test triples;
    # if k exceeds the test set size, sample with replacement
    if k > len(testList):
        testList = random.choices(testList, k=k)
    elif k > 0:
        testList = random.sample(testList, k=k)
# Split the testList according to the relation
testList.sort(key=lambda x: (x.r, x.h, x.t))
grouped = [(k, list(g)) for k, g in groupby(testList, key=getRel)]
ent_embeddings = ent_embeddings.cpu()
rel_embeddings = rel_embeddings.cpu()
norm_embeddings = norm_embeddings.cpu()
with multiprocessing.Manager() as manager:
L = manager.list()
queue = multiprocessing.JoinableQueue()
workerList = []
for i in range(num_processes):
worker = MyProcessTransH(L, tripleDict, ent_embeddings, rel_embeddings,
norm_embeddings, L1_flag, filter, queue=queue, head=head)
workerList.append(worker)
worker.daemon = True
worker.start()
for k, subList in grouped:
queue.put(subList)
queue.join()
resultList = list(L)
for worker in workerList:
worker.terminate()
if head == 1 or head == 2:
hit10 = sum([elem[0] for elem in resultList]) / len(testList)
meanrank = sum([elem[1] for elem in resultList]) / len(testList)
else:
hit10 = sum([elem[0] for elem in resultList]) / (2 * len(testList))
meanrank = sum([elem[1] for elem in resultList]) / (2 * len(testList))
print('Meanrank: %.6f' % meanrank)
print('Hit@10: %.6f' % hit10)
return hit10, meanrank
def evaluation_transR_helper(testList, tripleDict, ent_embeddings,
rel_embeddings, proj_embeddings, L1_flag, filter, head):
    # embeddings are plain torch tensors (not autograd Variables)
    # All triples in testList are assumed to share the same relation
headList = [triple.h for triple in testList]
tailList = [triple.t for triple in testList]
relList = [triple.r for triple in testList]
h_e = ent_embeddings[headList]
t_e = ent_embeddings[tailList]
r_e = rel_embeddings[relList]
this_rel = relList[0]
this_proj_emb = proj_embeddings[[this_rel]]
this_proj_all_e = projection_transR_pytorch(ent_embeddings, this_proj_emb)
this_proj_all_e = this_proj_all_e.cpu().numpy()
if head == 1:
proj_t_e = projection_transR_pytorch(t_e, this_proj_emb)
c_h_e = proj_t_e - r_e
c_h_e = c_h_e.cpu().numpy()
if L1_flag == True:
dist = pairwise_distances(c_h_e, this_proj_all_e, metric='manhattan')
else:
dist = pairwise_distances(c_h_e, this_proj_all_e, metric='euclidean')
rankArrayHead = np.argsort(dist, axis=1)
if filter == False:
rankListHead = [int(np.argwhere(elem[1]==elem[0])) for elem in zip(headList, rankArrayHead)]
else:
rankListHead = [argwhereHead(elem[0], elem[1], elem[2], elem[3], tripleDict)
for elem in zip(headList, tailList, relList, rankArrayHead)]
isHit10ListHead = [x for x in rankListHead if x < 10]
totalRank = sum(rankListHead)
hit10Count = len(isHit10ListHead)
tripleCount = len(rankListHead)
elif head == 2:
proj_h_e = projection_transR_pytorch(h_e, this_proj_emb)
c_t_e = proj_h_e + r_e
c_t_e = c_t_e.cpu().numpy()
if L1_flag == True:
dist = pairwise_distances(c_t_e, this_proj_all_e, metric='manhattan')
else:
dist = pairwise_distances(c_t_e, this_proj_all_e, metric='euclidean')
rankArrayTail = np.argsort(dist, axis=1)
if filter == False:
rankListTail = [int(np.argwhere(elem[1]==elem[0])) for elem in zip(tailList, rankArrayTail)]
else:
rankListTail = [argwhereTail(elem[0], elem[1], elem[2], elem[3], tripleDict)
for elem in zip(headList, tailList, relList, rankArrayTail)]
isHit10ListTail = [x for x in rankListTail if x < 10]
totalRank = sum(rankListTail)
hit10Count = len(isHit10ListTail)
tripleCount = len(rankListTail)
else:
proj_h_e = projection_transR_pytorch(h_e, this_proj_emb)
c_t_e = proj_h_e + r_e
proj_t_e = projection_transR_pytorch(t_e, this_proj_emb)
c_h_e = proj_t_e - r_e
c_t_e = c_t_e.cpu().numpy()
c_h_e = c_h_e.cpu().numpy()
if L1_flag == True:
dist = pairwise_distances(c_t_e, this_proj_all_e, metric='manhattan')
else:
dist = pairwise_distances(c_t_e, this_proj_all_e, metric='euclidean')
rankArrayTail = np.argsort(dist, axis=1)
if filter == False:
rankListTail = [int(np.argwhere(elem[1]==elem[0])) for elem in zip(tailList, rankArrayTail)]
else:
rankListTail = [argwhereTail(elem[0], elem[1], elem[2], elem[3], tripleDict)
for elem in zip(headList, tailList, relList, rankArrayTail)]
isHit10ListTail = [x for x in rankListTail if x < 10]
if L1_flag == True:
dist = pairwise_distances(c_h_e, this_proj_all_e, metric='manhattan')
else:
dist = pairwise_distances(c_h_e, this_proj_all_e, metric='euclidean')
rankArrayHead = np.argsort(dist, axis=1)
if filter == False:
rankListHead = [int(np.argwhere(elem[1]==elem[0])) for elem in zip(headList, rankArrayHead)]
else:
rankListHead = [argwhereHead(elem[0], elem[1], elem[2], elem[3], tripleDict)
for elem in zip(headList, tailList, relList, rankArrayHead)]
isHit10ListHead = [x for x in rankListHead if x < 10]
totalRank = sum(rankListTail) + sum(rankListHead)
hit10Count = len(isHit10ListTail) + len(isHit10ListHead)
tripleCount = len(rankListTail) + len(rankListHead)
return hit10Count, totalRank, tripleCount
class MyProcessTransR(multiprocessing.Process):
def __init__(self, L, tripleDict, ent_embeddings,
rel_embeddings, proj_embeddings, L1_flag, filter, queue=None, head=0):
super(MyProcessTransR, self).__init__()
self.L = L
self.queue = queue
self.tripleDict = tripleDict
self.ent_embeddings = ent_embeddings
self.rel_embeddings = rel_embeddings
self.proj_embeddings = proj_embeddings
self.L1_flag = L1_flag
self.filter = filter
self.head = head
def run(self):
while True:
testList = self.queue.get()
try:
self.process_data(testList, self.tripleDict, self.ent_embeddings, self.rel_embeddings,
self.proj_embeddings, self.L1_flag, self.filter, self.L, self.head)
            except Exception:
                # Retry once after a short pause; a second failure propagates
time.sleep(5)
self.process_data(testList, self.tripleDict, self.ent_embeddings, self.rel_embeddings,
self.proj_embeddings, self.L1_flag, self.filter, self.L, self.head)
self.queue.task_done()
def process_data(self, testList, tripleDict, ent_embeddings, rel_embeddings,
proj_embeddings, L1_flag, filter, L, head):
hit10Count, totalRank, tripleCount = evaluation_transR_helper(testList, tripleDict, ent_embeddings,
rel_embeddings, proj_embeddings, L1_flag, filter, head=head)
L.append((hit10Count, totalRank, tripleCount))
def evaluation_transR(testList, tripleDict, ent_embeddings, rel_embeddings,
proj_embeddings, L1_flag, filter, k=0, num_processes=multiprocessing.cpu_count(), head=0):
    # embeddings are plain torch tensors (not autograd Variables)
    # Optionally evaluate a random subset of k test triples;
    # if k exceeds the test set size, sample with replacement
    if k > len(testList):
        testList = random.choices(testList, k=k)
    elif k > 0:
        testList = random.sample(testList, k=k)
# Split the testList according to the relation
testList.sort(key=lambda x: (x.r, x.h, x.t))
grouped = [(k, list(g)) for k, g in groupby(testList, key=getRel)]
ent_embeddings = ent_embeddings.cpu()
rel_embeddings = rel_embeddings.cpu()
proj_embeddings = proj_embeddings.cpu()
with multiprocessing.Manager() as manager:
L = manager.list()
queue = multiprocessing.JoinableQueue()
workerList = []
for i in range(num_processes):
worker = MyProcessTransR(L, tripleDict, ent_embeddings, rel_embeddings,
proj_embeddings, L1_flag, filter, queue=queue, head=head)
workerList.append(worker)
worker.daemon = True
worker.start()
for k, subList in grouped:
queue.put(subList)
queue.join()
resultList = list(L)
for worker in workerList:
worker.terminate()
if head == 1 or head == 2:
hit10 = sum([elem[0] for elem in resultList]) / len(testList)
meanrank = sum([elem[1] for elem in resultList]) / len(testList)
else:
hit10 = sum([elem[0] for elem in resultList]) / (2 * len(testList))
meanrank = sum([elem[1] for elem in resultList]) / (2 * len(testList))
print('Meanrank: %.6f' % meanrank)
print('Hit@10: %.6f' % hit10)
return hit10, meanrank
def evaluation_transD_helper(testList, tripleDict, ent_embeddings,
rel_embeddings, ent_proj_embeddings, rel_proj_embeddings, L1_flag, filter, head=0):
    # embeddings are plain torch tensors (not autograd Variables)
    # All triples in testList are assumed to share the same relation
headList = [triple.h for triple in testList]
tailList = [triple.t for triple in testList]
relList = [triple.r for triple in testList]
h_e = ent_embeddings[headList]
t_e = ent_embeddings[tailList]
r_e = rel_embeddings[relList]
head_proj_emb = ent_proj_embeddings[headList]
tail_proj_emb = ent_proj_embeddings[tailList]
this_rel = relList[0]
this_rel_proj_emb = rel_proj_embeddings[[this_rel]]
this_proj_all_e = projection_transD_pytorch_samesize(ent_embeddings, ent_proj_embeddings, this_rel_proj_emb)
this_proj_all_e = this_proj_all_e.cpu().numpy()
if head == 1:
proj_t_e = projection_transD_pytorch_samesize(t_e, tail_proj_emb, this_rel_proj_emb)
c_h_e = proj_t_e - r_e
c_h_e = c_h_e.cpu().numpy()
if L1_flag == True:
dist = pairwise_distances(c_h_e, this_proj_all_e, metric='manhattan')
else:
dist = pairwise_distances(c_h_e, this_proj_all_e, metric='euclidean')
rankArrayHead = np.argsort(dist, axis=1)
if filter == False:
rankListHead = [int(np.argwhere(elem[1]==elem[0])) for elem in zip(headList, rankArrayHead)]
else:
rankListHead = [argwhereHead(elem[0], elem[1], elem[2], elem[3], tripleDict)
for elem in zip(headList, tailList, relList, rankArrayHead)]
isHit10ListHead = [x for x in rankListHead if x < 10]
totalRank = sum(rankListHead)
hit10Count = len(isHit10ListHead)
tripleCount = len(rankListHead)
elif head == 2:
proj_h_e = projection_transD_pytorch_samesize(h_e, head_proj_emb, this_rel_proj_emb)
c_t_e = proj_h_e + r_e
c_t_e = c_t_e.cpu().numpy()
if L1_flag == True:
dist = pairwise_distances(c_t_e, this_proj_all_e, metric='manhattan')
else:
dist = pairwise_distances(c_t_e, this_proj_all_e, metric='euclidean')
rankArrayTail = np.argsort(dist, axis=1)
if filter == False:
rankListTail = [int(np.argwhere(elem[1]==elem[0])) for elem in zip(tailList, rankArrayTail)]
else:
rankListTail = [argwhereTail(elem[0], elem[1], elem[2], elem[3], tripleDict)
for elem in zip(headList, tailList, relList, rankArrayTail)]
isHit10ListTail = [x for x in rankListTail if x < 10]
totalRank = sum(rankListTail)
hit10Count = len(isHit10ListTail)
tripleCount = len(rankListTail)
else:
proj_h_e = projection_transD_pytorch_samesize(h_e, head_proj_emb, this_rel_proj_emb)
c_t_e = proj_h_e + r_e
proj_t_e = projection_transD_pytorch_samesize(t_e, tail_proj_emb, this_rel_proj_emb)
c_h_e = proj_t_e - r_e
c_t_e = c_t_e.cpu().numpy()
c_h_e = c_h_e.cpu().numpy()
if L1_flag == True:
dist = pairwise_distances(c_t_e, this_proj_all_e, metric='manhattan')
else:
dist = pairwise_distances(c_t_e, this_proj_all_e, metric='euclidean')
rankArrayTail = np.argsort(dist, axis=1)
if filter == False:
rankListTail = [int(np.argwhere(elem[1]==elem[0])) for elem in zip(tailList, rankArrayTail)]
else:
rankListTail = [argwhereTail(elem[0], elem[1], elem[2], elem[3], tripleDict)
for elem in zip(headList, tailList, relList, rankArrayTail)]
isHit10ListTail = [x for x in rankListTail if x < 10]
if L1_flag == True:
dist = pairwise_distances(c_h_e, this_proj_all_e, metric='manhattan')
else:
dist = pairwise_distances(c_h_e, this_proj_all_e, metric='euclidean')
rankArrayHead = np.argsort(dist, axis=1)
if filter == False:
rankListHead = [int(np.argwhere(elem[1]==elem[0])) for elem in zip(headList, rankArrayHead)]
else:
rankListHead = [argwhereHead(elem[0], elem[1], elem[2], elem[3], tripleDict)
for elem in zip(headList, tailList, relList, rankArrayHead)]
isHit10ListHead = [x for x in rankListHead if x < 10]
totalRank = sum(rankListTail) + sum(rankListHead)
hit10Count = len(isHit10ListTail) + len(isHit10ListHead)
tripleCount = len(rankListTail) + len(rankListHead)
return hit10Count, totalRank, tripleCount
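The raw (unfiltered) ranking step above boils down to: sort all candidate entities by distance, then find the position of the true entity. A minimal pure-Python sketch of that idea, with no NumPy or torch (`raw_rank` is an illustrative name, not part of the original module):

```python
def raw_rank(true_id, distances):
    """Rank of the true entity: the position of true_id once all candidate
    entities are sorted by distance (mirrors np.argsort + np.argwhere above)."""
    order = sorted(range(len(distances)), key=lambda i: distances[i])
    return order.index(true_id)

# Candidate entity distances for one test triple; entity 2 is the true answer.
dists = [0.9, 0.4, 0.1, 0.7]
print(raw_rank(2, dists))  # entity 2 is the closest candidate, so rank 0
```

A rank below 10 contributes to Hit@10, exactly as in the `isHit10List*` comprehensions above.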
class MyProcessTransD(multiprocessing.Process):
    def __init__(self, L, tripleDict, ent_embeddings,
            rel_embeddings, ent_proj_embeddings, rel_proj_embeddings, L1_flag, filter, queue=None, head=0):
        super(MyProcessTransD, self).__init__()
        self.L = L
        self.queue = queue
        self.tripleDict = tripleDict
        self.ent_embeddings = ent_embeddings
        self.rel_embeddings = rel_embeddings
        self.ent_proj_embeddings = ent_proj_embeddings
        self.rel_proj_embeddings = rel_proj_embeddings
        self.L1_flag = L1_flag
        self.filter = filter
        self.head = head

    def run(self):
        while True:
            testList = self.queue.get()
            try:
                self.process_data(testList, self.tripleDict, self.ent_embeddings, self.rel_embeddings,
                    self.ent_proj_embeddings, self.rel_proj_embeddings, self.L1_flag, self.filter, self.L, self.head)
            except:
                # On a transient failure, wait briefly and retry the chunk once
                time.sleep(5)
                self.process_data(testList, self.tripleDict, self.ent_embeddings, self.rel_embeddings,
                    self.ent_proj_embeddings, self.rel_proj_embeddings, self.L1_flag, self.filter, self.L, self.head)
            self.queue.task_done()

    def process_data(self, testList, tripleDict, ent_embeddings, rel_embeddings,
            ent_proj_embeddings, rel_proj_embeddings, L1_flag, filter, L, head):
        hit10Count, totalRank, tripleCount = evaluation_transD_helper(testList, tripleDict, ent_embeddings,
            rel_embeddings, ent_proj_embeddings, rel_proj_embeddings, L1_flag, filter, head=head)
        L.append((hit10Count, totalRank, tripleCount))
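`MyProcessTransD` implements a classic producer/consumer loop: workers block on `queue.get()`, process a chunk, and call `task_done()` so that `queue.join()` in the driver can detect completion. The same pattern, sketched with threads and a standard `queue.Queue` so it is easy to run standalone (the worker body here just sums a chunk; all names are illustrative):

```python
import queue
import threading

def worker(task_queue, results):
    # Same get / process / task_done loop as MyProcessTransD.run, with
    # threads standing in for processes to keep the sketch self-contained.
    while True:
        chunk = task_queue.get()
        results.append(sum(chunk))
        task_queue.task_done()

task_queue = queue.Queue()
results = []
for _ in range(4):
    t = threading.Thread(target=worker, args=(task_queue, results), daemon=True)
    t.start()

for chunk in [[1, 2], [3, 4], [5]]:
    task_queue.put(chunk)
task_queue.join()  # blocks until task_done() has been called for every item

print(sorted(results))  # [3, 5, 7] -- later reduced like resultList above
```

Daemon workers never exit their loop; like the processes above, they are simply discarded (`terminate()`d) once `join()` returns.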
def evaluation_transD(testList, tripleDict, ent_embeddings, rel_embeddings,
        ent_proj_embeddings, rel_proj_embeddings, L1_flag, filter, k=0, num_processes=multiprocessing.cpu_count(), head=0):
    # embeddings are torch tensor like (No Variable!)

    if k > len(testList):
        testList = random.choices(testList, k=k)
    elif k > 0:
        testList = random.sample(testList, k=k)

    # Split the testList according to the relation
    testList.sort(key=lambda x: (x.r, x.h, x.t))
    grouped = [(k, list(g)) for k, g in groupby(testList, key=getRel)]

    ent_embeddings = ent_embeddings.cpu()
    rel_embeddings = rel_embeddings.cpu()
    ent_proj_embeddings = ent_proj_embeddings.cpu()
    rel_proj_embeddings = rel_proj_embeddings.cpu()

    with multiprocessing.Manager() as manager:
        L = manager.list()
        queue = multiprocessing.JoinableQueue()
        workerList = []
        for i in range(num_processes):
            worker = MyProcessTransD(L, tripleDict, ent_embeddings, rel_embeddings,
                                     ent_proj_embeddings, rel_proj_embeddings, L1_flag, filter, queue=queue, head=head)
            workerList.append(worker)
            worker.daemon = True
            worker.start()

        for k, subList in grouped:
            queue.put(subList)
        queue.join()

        resultList = list(L)
        for worker in workerList:
            worker.terminate()

    if head == 1 or head == 2:
        hit10 = sum([elem[0] for elem in resultList]) / len(testList)
        meanrank = sum([elem[1] for elem in resultList]) / len(testList)
    else:
        hit10 = sum([elem[0] for elem in resultList]) / (2 * len(testList))
        meanrank = sum([elem[1] for elem in resultList]) / (2 * len(testList))

    print('Meanrank: %.6f' % meanrank)
    print('Hit@10: %.6f' % hit10)

    return hit10, meanrank
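The driver micro-averages the per-relation partial results: each worker appends a `(hit10Count, totalRank, tripleCount)` tuple, and the final Hit@10 and mean rank divide the pooled sums by the number of ranking tasks (one per triple for `head` 1 or 2, two per triple otherwise). A small worked example with made-up numbers:

```python
# Two workers contributed partial counts for their relation chunks.
resultList = [(8, 120, 10), (3, 90, 5)]
testList_len = 15  # head == 1 or head == 2: one ranking task per test triple

hit10 = sum(elem[0] for elem in resultList) / testList_len
meanrank = sum(elem[1] for elem in resultList) / testList_len
print(hit10, meanrank)  # 11/15 hits and a pooled mean rank of 210/15 = 14.0
```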
# ---- tests/test_tree.py (danielhgasparin/algorithms-python, MIT) ----
import unittest
from algorithms.tree import Tree, Node


class TestTree(unittest.TestCase):

    def test_tree_traverse_breadth_first(self):
        def make_node_counter():
            count = 1

            def node_counter(node):
                nonlocal count
                node.value = str(count) + "_" + str(node.value)
                count += 1
            return node_counter

        tree = Tree()
        tree.root = Node(1)
        tree.root.add(2)
        tree.root.add(3)
        tree.root.add(4)
        tree.root.children[0].add(5)
        tree.root.children[0].add(6)
        tree.root.children[2].add(7)

        tree.traverse_breadth_first(make_node_counter())

        self.assertEqual(tree.root.value, "1_1")
        self.assertEqual(tree.root.children[0].value, "2_2")
        self.assertEqual(tree.root.children[1].value, "3_3")
        self.assertEqual(tree.root.children[2].value, "4_4")
        self.assertEqual(tree.root.children[0].children[0].value, "5_5")
        self.assertEqual(tree.root.children[0].children[1].value, "6_6")
        self.assertEqual(tree.root.children[2].children[0].value, "7_7")

    def test_tree_traverse_depth_first(self):
        def make_node_counter():
            count = 1

            def node_counter(node):
                nonlocal count
                node.value = str(count) + "_" + str(node.value)
                count += 1
            return node_counter

        tree = Tree()
        tree.root = Node(1)
        tree.root.add(2)
        tree.root.add(3)
        tree.root.add(4)
        tree.root.children[0].add(5)
        tree.root.children[0].add(6)
        tree.root.children[2].add(7)

        tree.traverse_depth_first(make_node_counter())

        self.assertEqual(tree.root.value, "1_1")
        self.assertEqual(tree.root.children[0].value, "2_2")
        self.assertEqual(tree.root.children[1].value, "5_3")
        self.assertEqual(tree.root.children[2].value, "6_4")
        self.assertEqual(tree.root.children[0].children[0].value, "3_5")
        self.assertEqual(tree.root.children[0].children[1].value, "4_6")
        self.assertEqual(tree.root.children[2].children[0].value, "7_7")
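The tests above exercise a `Tree`/`Node` API from `algorithms.tree` that is not shown in this excerpt. A minimal implementation consistent with what the tests assume (level-order breadth-first, pre-order depth-first); this is a sketch of the expected shape, not the repository's actual module:

```python
from collections import deque

class Node:
    def __init__(self, value):
        self.value = value
        self.children = []

    def add(self, value):
        # Append a new child node holding `value`, matching tree.root.add(...)
        self.children.append(Node(value))

class Tree:
    def __init__(self):
        self.root = None

    def traverse_breadth_first(self, visit):
        # Level order: visit a node, then enqueue its children.
        q = deque([self.root])
        while q:
            node = q.popleft()
            visit(node)
            q.extend(node.children)

    def traverse_depth_first(self, visit):
        # Pre-order: push children reversed so the leftmost is popped first.
        stack = [self.root]
        while stack:
            node = stack.pop()
            visit(node)
            stack.extend(reversed(node.children))

tree = Tree()
tree.root = Node(1)
tree.root.add(2)
tree.root.add(3)
order = []
tree.traverse_breadth_first(lambda n: order.append(n.value))
print(order)  # [1, 2, 3]
```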
# ---- sdk/compute/azure-mgmt-vmwarecloudsimple/azure/mgmt/vmwarecloudsimple/operations/_dedicated_cloud_nodes_operations.py
# ---- (GabrielHobold/azure-sdk-for-python, MIT) ----
# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
import uuid
from msrest.pipeline import ClientRawResponse
from msrest.polling import LROPoller, NoPolling
from msrestazure.polling.arm_polling import ARMPolling
from .. import models
class DedicatedCloudNodesOperations(object):
    """DedicatedCloudNodesOperations operations.

    You should not instantiate directly this class, but create a Client instance that will create it for you and attach it as attribute.

    :param client: Client for service requests.
    :param config: Configuration of service client.
    :param serializer: An object model serializer.
    :param deserializer: An object model deserializer.
    :ivar api_version: Client API version. Constant value: "2019-04-01".
    """

    models = models

    def __init__(self, client, config, serializer, deserializer):

        self._client = client
        self._serialize = serializer
        self._deserialize = deserializer
        self.api_version = "2019-04-01"

        self.config = config
    def list_by_subscription(
            self, filter=None, top=None, skip_token=None, custom_headers=None, raw=False, **operation_config):
        """Implements list of dedicated cloud nodes within subscription method.

        Returns list of dedicated cloud nodes within subscription.

        :param filter: The filter to apply on the list operation
        :type filter: str
        :param top: The maximum number of record sets to return
        :type top: int
        :param skip_token: to be used by nextLink implementation
        :type skip_token: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: An iterator like instance of DedicatedCloudNode
        :rtype:
         ~azure.mgmt.vmwarecloudsimple.models.DedicatedCloudNodePaged[~azure.mgmt.vmwarecloudsimple.models.DedicatedCloudNode]
        :raises:
         :class:`CSRPErrorException<azure.mgmt.vmwarecloudsimple.models.CSRPErrorException>`
        """
        def prepare_request(next_link=None):
            if not next_link:
                # Construct URL
                url = self.list_by_subscription.metadata['url']
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
                }
                url = self._client.format_url(url, **path_format_arguments)

                # Construct parameters
                query_parameters = {}
                query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
                if filter is not None:
                    query_parameters['$filter'] = self._serialize.query("filter", filter, 'str')
                if top is not None:
                    query_parameters['$top'] = self._serialize.query("top", top, 'int')
                if skip_token is not None:
                    query_parameters['$skipToken'] = self._serialize.query("skip_token", skip_token, 'str')

            else:
                url = next_link
                query_parameters = {}

            # Construct headers
            header_parameters = {}
            header_parameters['Accept'] = 'application/json'
            if self.config.generate_client_request_id:
                header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
            if custom_headers:
                header_parameters.update(custom_headers)
            if self.config.accept_language is not None:
                header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

            # Construct and send request
            request = self._client.get(url, query_parameters, header_parameters)
            return request

        def internal_paging(next_link=None):
            request = prepare_request(next_link)

            response = self._client.send(request, stream=False, **operation_config)

            if response.status_code not in [200]:
                raise models.CSRPErrorException(self._deserialize, response)

            return response

        # Deserialize response
        header_dict = None
        if raw:
            header_dict = {}
        deserialized = models.DedicatedCloudNodePaged(internal_paging, self._deserialize.dependencies, header_dict)

        return deserialized
    list_by_subscription.metadata = {'url': '/subscriptions/{subscriptionId}/providers/Microsoft.VMwareCloudSimple/dedicatedCloudNodes'}
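Paging here is driven by a continuation link: `internal_paging` re-issues the request with `next_link` until the service stops returning one. A self-contained sketch of that loop against a fake page store (`PAGES`, `fetch`, and `iterate_all` are illustrative names, not part of the SDK):

```python
# Hypothetical pages keyed by continuation token; each entry is (items, next_link).
PAGES = {
    None: (["node-a", "node-b"], "page2"),
    "page2": (["node-c"], None),
}

def fetch(next_link=None):
    # Stand-in for prepare_request + self._client.send in internal_paging.
    return PAGES[next_link]

def iterate_all(fetch_page):
    next_link = None
    while True:
        items, next_link = fetch_page(next_link)
        for item in items:
            yield item
        if next_link is None:
            return

print(list(iterate_all(fetch)))  # ['node-a', 'node-b', 'node-c']
```

The SDK's `DedicatedCloudNodePaged` wraps exactly this loop behind a Python iterator, deserializing each response as it goes.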
    def list_by_resource_group(
            self, resource_group_name, filter=None, top=None, skip_token=None, custom_headers=None, raw=False, **operation_config):
        """Implements list of dedicated cloud nodes within RG method.

        Returns list of dedicated cloud nodes within resource group.

        :param resource_group_name: The name of the resource group
        :type resource_group_name: str
        :param filter: The filter to apply on the list operation
        :type filter: str
        :param top: The maximum number of record sets to return
        :type top: int
        :param skip_token: to be used by nextLink implementation
        :type skip_token: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: An iterator like instance of DedicatedCloudNode
        :rtype:
         ~azure.mgmt.vmwarecloudsimple.models.DedicatedCloudNodePaged[~azure.mgmt.vmwarecloudsimple.models.DedicatedCloudNode]
        :raises:
         :class:`CSRPErrorException<azure.mgmt.vmwarecloudsimple.models.CSRPErrorException>`
        """
        def prepare_request(next_link=None):
            if not next_link:
                # Construct URL
                url = self.list_by_resource_group.metadata['url']
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
                    'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str')
                }
                url = self._client.format_url(url, **path_format_arguments)

                # Construct parameters
                query_parameters = {}
                query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
                if filter is not None:
                    query_parameters['$filter'] = self._serialize.query("filter", filter, 'str')
                if top is not None:
                    query_parameters['$top'] = self._serialize.query("top", top, 'int')
                if skip_token is not None:
                    query_parameters['$skipToken'] = self._serialize.query("skip_token", skip_token, 'str')

            else:
                url = next_link
                query_parameters = {}

            # Construct headers
            header_parameters = {}
            header_parameters['Accept'] = 'application/json'
            if self.config.generate_client_request_id:
                header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
            if custom_headers:
                header_parameters.update(custom_headers)
            if self.config.accept_language is not None:
                header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

            # Construct and send request
            request = self._client.get(url, query_parameters, header_parameters)
            return request

        def internal_paging(next_link=None):
            request = prepare_request(next_link)

            response = self._client.send(request, stream=False, **operation_config)

            if response.status_code not in [200]:
                raise models.CSRPErrorException(self._deserialize, response)

            return response

        # Deserialize response
        header_dict = None
        if raw:
            header_dict = {}
        deserialized = models.DedicatedCloudNodePaged(internal_paging, self._deserialize.dependencies, header_dict)

        return deserialized
    list_by_resource_group.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.VMwareCloudSimple/dedicatedCloudNodes'}
    def get(
            self, resource_group_name, dedicated_cloud_node_name, custom_headers=None, raw=False, **operation_config):
        """Implements dedicated cloud node GET method.

        Returns dedicated cloud node.

        :param resource_group_name: The name of the resource group
        :type resource_group_name: str
        :param dedicated_cloud_node_name: dedicated cloud node name
        :type dedicated_cloud_node_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: DedicatedCloudNode or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.vmwarecloudsimple.models.DedicatedCloudNode or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`CSRPErrorException<azure.mgmt.vmwarecloudsimple.models.CSRPErrorException>`
        """
        # Construct URL
        url = self.get.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'dedicatedCloudNodeName': self._serialize.url("dedicated_cloud_node_name", dedicated_cloud_node_name, 'str', pattern=r'^[-a-zA-Z0-9]+$')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.get(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.CSRPErrorException(self._deserialize, response)

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('DedicatedCloudNode', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.VMwareCloudSimple/dedicatedCloudNodes/{dedicatedCloudNodeName}'}
    def _create_or_update_initial(
            self, resource_group_name, dedicated_cloud_node_name, dedicated_cloud_node_request, custom_headers=None, raw=False, **operation_config):
        # Construct URL
        url = self.create_or_update.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'dedicatedCloudNodeName': self._serialize.url("dedicated_cloud_node_name", dedicated_cloud_node_name, 'str', pattern=r'^[-a-zA-Z0-9]+$')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        header_parameters['Referer'] = self._serialize.header("self.config.referer", self.config.referer, 'str')
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(dedicated_cloud_node_request, 'DedicatedCloudNode')

        # Construct and send request
        request = self._client.put(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.CSRPErrorException(self._deserialize, response)

        deserialized = None
        header_dict = {}

        if response.status_code == 200:
            deserialized = self._deserialize('DedicatedCloudNode', response)
            header_dict = {
                'Azure-AsyncOperation': 'str',
                'Location': 'str',
                'Retry-After': 'int',
            }

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            client_raw_response.add_headers(header_dict)
            return client_raw_response

        return deserialized
    def create_or_update(
            self, resource_group_name, dedicated_cloud_node_name, dedicated_cloud_node_request, custom_headers=None, raw=False, polling=True, **operation_config):
        """Implements dedicated cloud node PUT method.

        Returns dedicated cloud node by its name.

        :param resource_group_name: The name of the resource group
        :type resource_group_name: str
        :param dedicated_cloud_node_name: dedicated cloud node name
        :type dedicated_cloud_node_name: str
        :param dedicated_cloud_node_request: Create Dedicated Cloud Node
         request
        :type dedicated_cloud_node_request:
         ~azure.mgmt.vmwarecloudsimple.models.DedicatedCloudNode
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns DedicatedCloudNode or
         ClientRawResponse<DedicatedCloudNode> if raw==True
        :rtype:
         ~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.vmwarecloudsimple.models.DedicatedCloudNode]
         or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.vmwarecloudsimple.models.DedicatedCloudNode]]
        :raises:
         :class:`CSRPErrorException<azure.mgmt.vmwarecloudsimple.models.CSRPErrorException>`
        """
        raw_result = self._create_or_update_initial(
            resource_group_name=resource_group_name,
            dedicated_cloud_node_name=dedicated_cloud_node_name,
            dedicated_cloud_node_request=dedicated_cloud_node_request,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            header_dict = {
                'Azure-AsyncOperation': 'str',
                'Location': 'str',
                'Retry-After': 'int',
            }
            deserialized = self._deserialize('DedicatedCloudNode', response)

            if raw:
                client_raw_response = ClientRawResponse(deserialized, response)
                client_raw_response.add_headers(header_dict)
                return client_raw_response

            return deserialized

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    create_or_update.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.VMwareCloudSimple/dedicatedCloudNodes/{dedicatedCloudNodeName}'}
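`create_or_update` returns an `LROPoller`, which repeatedly re-queries the long-running operation's status until it reaches a terminal state and then deserializes the final resource. A simplified, dependency-free sketch of that polling loop (real pollers such as `ARMPolling` also sleep between polls and honor `Retry-After`; all names here are illustrative):

```python
import itertools

def poll_until_done(get_status, get_result, max_polls=10):
    # Re-check the operation status until it succeeds, then fetch the result.
    for _ in range(max_polls):
        if get_status() == "Succeeded":
            return get_result()
    raise TimeoutError("operation did not complete within max_polls checks")

# Fake operation that reports InProgress twice, then Succeeded forever.
statuses = itertools.chain(["InProgress", "InProgress"], itertools.repeat("Succeeded"))
result = poll_until_done(lambda: next(statuses), lambda: {"name": "node1"})
print(result)  # {'name': 'node1'}
```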
    def delete(
            self, resource_group_name, dedicated_cloud_node_name, custom_headers=None, raw=False, **operation_config):
        """Implements dedicated cloud node DELETE method.

        Delete dedicated cloud node.

        :param resource_group_name: The name of the resource group
        :type resource_group_name: str
        :param dedicated_cloud_node_name: dedicated cloud node name
        :type dedicated_cloud_node_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: None or ClientRawResponse if raw=true
        :rtype: None or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`CSRPErrorException<azure.mgmt.vmwarecloudsimple.models.CSRPErrorException>`
        """
        # Construct URL
        url = self.delete.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'dedicatedCloudNodeName': self._serialize.url("dedicated_cloud_node_name", dedicated_cloud_node_name, 'str', pattern=r'^[-a-zA-Z0-9]+$')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.delete(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [204]:
            raise models.CSRPErrorException(self._deserialize, response)

        if raw:
            client_raw_response = ClientRawResponse(None, response)
            client_raw_response.add_headers({
                'Content-Type': 'str',
            })
            return client_raw_response
    delete.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.VMwareCloudSimple/dedicatedCloudNodes/{dedicatedCloudNodeName}'}
    def update(
            self, resource_group_name, dedicated_cloud_node_name, tags=None, custom_headers=None, raw=False, **operation_config):
        """Implements dedicated cloud node PATCH method.

        Patches dedicated node properties.

        :param resource_group_name: The name of the resource group
        :type resource_group_name: str
        :param dedicated_cloud_node_name: dedicated cloud node name
        :type dedicated_cloud_node_name: str
        :param tags: The tags key:value pairs
        :type tags: dict[str, str]
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: DedicatedCloudNode or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.vmwarecloudsimple.models.DedicatedCloudNode or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`CSRPErrorException<azure.mgmt.vmwarecloudsimple.models.CSRPErrorException>`
        """
        dedicated_cloud_node_request = models.PatchPayload(tags=tags)

        # Construct URL
        url = self.update.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'dedicatedCloudNodeName': self._serialize.url("dedicated_cloud_node_name", dedicated_cloud_node_name, 'str', pattern=r'^[-a-zA-Z0-9]+$')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(dedicated_cloud_node_request, 'PatchPayload')

        # Construct and send request
        request = self._client.patch(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.CSRPErrorException(self._deserialize, response)

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('DedicatedCloudNode', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    update.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.VMwareCloudSimple/dedicatedCloudNodes/{dedicatedCloudNodeName}'}
# ---- parallel_clustering/run_exp_2.py (jeshi96/google-research, Apache-2.0) ----
import os
import sys
import signal
import time
import subprocess


def signal_handler(sig, frame):
  # Exit cleanly on Ctrl-C instead of leaving partial output files around.
  print("bye")
  sys.exit(0)

signal.signal(signal.SIGINT, signal_handler)


def shellGetOutput(str1):
  process = subprocess.Popen(str1, shell=True, stdout=subprocess.PIPE,
                             stderr=subprocess.PIPE)
  output, err = process.communicate()
  output = output.decode()
  err = err.decode()
  if (len(err) > 0):
    print(str1 + "\n" + output + err)
  return output


def appendToFile(out, filename):
  with open(filename, "a+") as out_file:
    out_file.writelines(out)
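`shellGetOutput` and `appendToFile` together implement a capture-and-append pattern: run a command, then append its output to a per-configuration log file. A self-contained sketch of the same pattern using `subprocess.run` and a temporary directory (the command and the `exp.out` path are placeholders, not files the experiment actually writes):

```python
import os
import subprocess
import tempfile

def append_to_file(text, filename):
    # Same behavior as appendToFile above: open in append mode and write.
    with open(filename, "a+") as out_file:
        out_file.write(text)

# Capture a command's stdout (echo stands in for the bazel invocation).
out = subprocess.run(["echo", "run 1 done"], capture_output=True, text=True).stdout

path = os.path.join(tempfile.mkdtemp(), "exp.out")
append_to_file(out, path)
append_to_file("run 2 done\n", path)  # repeated runs accumulate in one log

with open(path) as f:
    print(f.read())
```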
def run_2_corr():
  programs = ["ParallelCorrelationClusterer"]
  programs_pres = ["pc"]
  files = ["friendster_h"]  # "dblp_h", "lj_h", "orkut_h", "friendster_h"
  pres = ["friendster"]  # "dblp", "lj", "orkut", "friendster"
  async_sync = ["true"]
  refines = ["true"]
  moves = ["NBHR_MOVE"]
  moves_pres = ["nbhr"]
  resolutions = [x * (1.0 / 100.0) for x in range(1, 100)]
  num_workers = [96]  # [1, 2, 4, 8, 16, 30, 60]
  read_dir = "/home/jeshi/snap/"
  write_dir = "/home/jeshi/clustering_out_exp2/"
  for prog_idx, prog in enumerate(programs):
    for file_idx, filename in enumerate(files):
      for r in reversed(resolutions):
        for ref_idx, ref in enumerate(refines):
          for asy_idx, asy in enumerate(async_sync):
            for move_idx, move in enumerate(moves):
              for nw in num_workers:
                if True:
                  out_filename = write_dir + programs_pres[prog_idx] + "_" + pres[file_idx] + "_" + str(r) + "_" + asy + "_" + ref + "_" + moves_pres[move_idx] + "_" + str(nw) + ".out"
                  ss = ("NUM_THREADS=" + str(nw) + " timeout 6h bazel run //clustering:cluster-in-memory_main -- --"
                        "input_graph=" + read_dir + filename + " --clusterer_name=" + prog + " "
                        " --clusterer_config='correlation_clusterer_config"
                        " {resolution: " + str(r) + ", subclustering_method: NONE_SUBCLUSTERING, "
                        "clustering_moves_method: LOUVAIN , preclustering_method: NONE_PRECLUSTERING, "
                        "refine: " + ref + ", async: " + asy + ", move_method: " + move + "}' --input_communities"
                        "='" + read_dir + "com-" + pres[file_idx] + ".top5000.cmty.txt'")
                  out = shellGetOutput(ss)
                  appendToFile(out, out_filename)
def run_2_mod():
programs = ["ParallelModularityClusterer"]
programs_pres = ["pm"]
files = ["friendster_h"]#"dblp_h", "lj_h","orkut_h","friendster_h"]
pres = ["friendster"]#"dblp","lj","orkut","friendster"]
async_sync = ["true"]
refines = ["true"]
moves = ["NBHR_MOVE"]
moves_pres = ["nbhr"]
resolutions = [0.02 * ((1 + 1.0 / 5.0) ** x) for x in range(0, 101)]
num_workers = [96]#[1, 2, 4, 8, 16, 30, 60]
read_dir = "/home/jeshi/snap/"
write_dir = "/home/jeshi/clustering_out_exp2/"
for prog_idx, prog in enumerate(programs):
for file_idx, filename in enumerate(files):
for r in reversed(resolutions):
for ref_idx, ref in enumerate(refines):
for asy_idx, asy in enumerate(async_sync):
for move_idx, move in enumerate(moves):
for nw in num_workers:
if True:
out_filename = write_dir + programs_pres[prog_idx] + "_" + pres[file_idx] + "_" + str(r) + "_" + asy + "_" + ref + "_" + moves_pres[move_idx]+"_" + str(nw) + ".out"
ss = ("NUM_THREADS="+str(nw)+" timeout 6h bazel run //clustering:cluster-in-memory_main -- --"
"input_graph=" + read_dir + filename + " --clusterer_name=" + prog + " "
" --clusterer_config='correlation_clusterer_config"
" {resolution: " + str(r) + ", subclustering_method: NONE_SUBCLUSTERING, "
"clustering_moves_method: LOUVAIN , preclustering_method: NONE_PRECLUSTERING, "
"refine: "+ref+", async: "+asy+", move_method: "+move+"}' --input_communities"
"='" + read_dir + "com-" + pres[file_idx] + ".top5000.cmty.txt'")
out = shellGetOutput(ss)
appendToFile(out, out_filename)
def run_2_corr_seq():
programs = ["CorrelationClusterer"]
programs_pres = ["c"]
files = ["friendster_h"]#"dblp_h", "lj_h","orkut_h","friendster_h"]
pres = ["friendster"]#"dblp","lj","orkut","friendster"]
async_sync = ["true"]
refines = ["true"]
moves = ["NBHR_MOVE"]
moves_pres = ["nbhr"]
resolutions = [x * (1.0/100.0) for x in range(1, 100)]
num_workers = [96]#[1, 2, 4, 8, 16, 30, 60]
read_dir = "/home/jeshi/snap/"
write_dir = "/home/jeshi/clustering_out_exp2/"
for prog_idx, prog in enumerate(programs):
for file_idx, filename in enumerate(files):
for r in reversed(resolutions):
for ref_idx, ref in enumerate(refines):
for asy_idx, asy in enumerate(async_sync):
for move_idx, move in enumerate(moves):
for nw in num_workers:
if True:
out_filename = write_dir + programs_pres[prog_idx] + "_" + pres[file_idx] + "_" + str(r) + "_" + asy + "_" + ref + "_" + moves_pres[move_idx]+"_" + str(nw) + ".out"
ss = ("NUM_THREADS="+str(nw)+" timeout 6h bazel run //clustering:cluster-in-memory_main -- --"
"input_graph=" + read_dir + filename + " --clusterer_name=" + prog + " "
" --clusterer_config='correlation_clusterer_config"
" {resolution: " + str(r) + ", subclustering_method: NONE_SUBCLUSTERING, "
"clustering_moves_method: LOUVAIN , preclustering_method: NONE_PRECLUSTERING, "
"refine: "+ref+", async: "+asy+", move_method: "+move+"}' --input_communities"
"='" + read_dir + "com-" + pres[file_idx] + ".top5000.cmty.txt'")
out = shellGetOutput(ss)
appendToFile(out, out_filename)
def run_2_mod_seq():
programs = ["ModularityClusterer"]
programs_pres = ["m"]
files = ["friendster_h"]#"dblp_h", "lj_h","orkut_h","friendster_h"]
pres = ["friendster"]#"dblp","lj","orkut","friendster"]
async_sync = ["true"]
refines = ["true"]
moves = ["NBHR_MOVE"]
moves_pres = ["nbhr"]
resolutions = [0.02 * ((1 + 1.0 / 5.0) ** x) for x in range(0, 101)]
num_workers = [96]#[1, 2, 4, 8, 16, 30, 60]
read_dir = "/home/jeshi/snap/"
write_dir = "/home/jeshi/clustering_out_exp2/"
for prog_idx, prog in enumerate(programs):
for file_idx, filename in enumerate(files):
for r in reversed(resolutions):
for ref_idx, ref in enumerate(refines):
for asy_idx, asy in enumerate(async_sync):
for move_idx, move in enumerate(moves):
for nw in num_workers:
if True:
out_filename = write_dir + programs_pres[prog_idx] + "_" + pres[file_idx] + "_" + str(r) + "_" + asy + "_" + ref + "_" + moves_pres[move_idx]+"_" + str(nw) + ".out"
ss = ("NUM_THREADS="+str(nw)+" timeout 6h bazel run //clustering:cluster-in-memory_main -- --"
"input_graph=" + read_dir + filename + " --clusterer_name=" + prog + " "
" --clusterer_config='correlation_clusterer_config"
" {resolution: " + str(r) + ", subclustering_method: NONE_SUBCLUSTERING, "
"clustering_moves_method: LOUVAIN , preclustering_method: NONE_PRECLUSTERING, "
"refine: "+ref+", async: "+asy+", move_method: "+move+"}' --input_communities"
"='" + read_dir + "com-" + pres[file_idx] + ".top5000.cmty.txt'")
out = shellGetOutput(ss)
appendToFile(out, out_filename)
def run_2_corr_seq_ai():
programs = ["CorrelationClusterer"]
programs_pres = ["c"]
files = ["friendster_h"]#"dblp_h", "lj_h","orkut_h","friendster_h"]
pres = ["friendster"]#"dblp","lj","orkut","friendster"]
async_sync = ["true"]
refines = ["true"]
moves = ["NBHR_MOVE"]
moves_pres = ["nbhr"]
resolutions = [x * (1.0/100.0) for x in range(1, 100)]
num_workers = [96]#[1, 2, 4, 8, 16, 30, 60]
read_dir = "/home/jeshi/snap/"
write_dir = "/home/jeshi/clustering_out_exp2ai/"
for prog_idx, prog in enumerate(programs):
for file_idx, filename in enumerate(files):
for r in reversed(resolutions):
for ref_idx, ref in enumerate(refines):
for asy_idx, asy in enumerate(async_sync):
for move_idx, move in enumerate(moves):
for nw in num_workers:
if True:
out_filename = write_dir + programs_pres[prog_idx] + "_" + pres[file_idx] + "_" + str(r) + "_" + asy + "_" + ref + "_" + moves_pres[move_idx]+"_" + str(nw) + ".out"
ss = ("NUM_THREADS="+str(nw)+" timeout 6h bazel run //clustering:cluster-in-memory_main -- --"
"input_graph=" + read_dir + filename + " --clusterer_name=" + prog + " "
" --clusterer_config='correlation_clusterer_config"
" {resolution: " + str(r) + ", subclustering_method: NONE_SUBCLUSTERING, "
"clustering_moves_method: LOUVAIN , preclustering_method: NONE_PRECLUSTERING, "
"refine: "+ref+", async: "+asy+", move_method: "+move+", all_iter: true}' --input_communities"
"='" + read_dir + "com-" + pres[file_idx] + ".top5000.cmty.txt'")
out = shellGetOutput(ss)
appendToFile(out, out_filename)
def run_2_mod_seq_ai():
programs = ["ModularityClusterer"]
programs_pres = ["m"]
files = ["friendster_h"]#"dblp_h", "lj_h","orkut_h","friendster_h"]
pres = ["friendster"]#"dblp","lj","orkut","friendster"]
async_sync = ["true"]
refines = ["true"]
moves = ["NBHR_MOVE"]
moves_pres = ["nbhr"]
resolutions = [0.02 * ((1 + 1.0 / 5.0) ** x) for x in range(0, 101)]
num_workers = [96]#[1, 2, 4, 8, 16, 30, 60]
read_dir = "/home/jeshi/snap/"
write_dir = "/home/jeshi/clustering_out_exp2ai/"
for prog_idx, prog in enumerate(programs):
for file_idx, filename in enumerate(files):
for r in reversed(resolutions):
for ref_idx, ref in enumerate(refines):
for asy_idx, asy in enumerate(async_sync):
for move_idx, move in enumerate(moves):
for nw in num_workers:
if True:
out_filename = write_dir + programs_pres[prog_idx] + "_" + pres[file_idx] + "_" + str(r) + "_" + asy + "_" + ref + "_" + moves_pres[move_idx]+"_" + str(nw) + ".out"
ss = ("NUM_THREADS="+str(nw)+" timeout 6h bazel run //clustering:cluster-in-memory_main -- --"
"input_graph=" + read_dir + filename + " --clusterer_name=" + prog + " "
" --clusterer_config='correlation_clusterer_config"
" {resolution: " + str(r) + ", subclustering_method: NONE_SUBCLUSTERING, "
"clustering_moves_method: LOUVAIN , preclustering_method: NONE_PRECLUSTERING, "
"refine: "+ref+", async: "+asy+", move_method: "+move+", all_iter: true}' --input_communities"
"='" + read_dir + "com-" + pres[file_idx] + ".top5000.cmty.txt'")
out = shellGetOutput(ss)
appendToFile(out, out_filename)
def large_rerun_2():
run_2_corr()
run_2_mod()
run_2_mod_seq()
run_2_mod_seq_ai()
run_2_corr_seq()
run_2_corr_seq_ai()
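# For reference, the two resolution sweeps used above, restated with their
# endpoints. A sketch only: the names corr_grid/mod_grid are illustrative and
# mirror the list comprehensions used for the correlation and modularity runs.

```python
# Linear grid for correlation clustering: 0.01, 0.02, ..., 0.99.
corr_grid = [x * (1.0 / 100.0) for x in range(1, 100)]
# Geometric grid for modularity clustering: 0.02 * 1.2**x for x = 0..100.
mod_grid = [0.02 * ((1 + 1.0 / 5.0) ** x) for x in range(0, 101)]
```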
# packages/gtmcore/gtmcore/workflows/tests/test_gitworkflow_ut.py
# Repo: gigabackup/gigantum-client @ 70fe6b39b87b1c56351f2b4c551b6f1693813e4f
# License: MIT
from typing import Optional
import pytest
import mock
import responses
import tempfile
import shutil
import subprocess
import os
from pkg_resources import resource_filename
import time
import glob
from mock import patch
from collections import namedtuple
from gtmcore.configuration.utils import call_subprocess
from gtmcore.gitlib import RepoLocation
from gtmcore.inventory.inventory import InventoryManager, InventoryException
from gtmcore.workflows import LabbookWorkflow, DatasetWorkflow, MergeOverride
from gtmcore.workflows.gitlab import GitLabException
from gtmcore.workflows.gitworkflows_utils import create_remote_gitlab_repo
from gtmcore.fixtures import (helper_create_remote_repo as _MOCK_create_remote_repo, mock_labbook_lfs_disabled,
mock_config_file)
from gtmcore.inventory.branching import BranchManager, MergeError
from gtmcore.dispatcher import Dispatcher
import gtmcore.dispatcher.dataset_jobs
from gtmcore.dataset.manifest import Manifest
from gtmcore.fixtures.datasets import helper_append_file
from gtmcore.dataset.io.manager import IOManager
def _mock_fetch(self, remote):
assert isinstance(remote, str)
pass
class TestGitWorkflowsMethods(object):
@responses.activate
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_publish__simple(self, mock_labbook_lfs_disabled):
"""Test a simple publish and ensuring master is active branch. """
responses.add(responses.GET, 'https://test.repo.gigantum.com/backup', status=404)
username = 'test'
lb = mock_labbook_lfs_disabled[2]
bm = BranchManager(lb, username)
bm.create_branch('test-local-only')
assert bm.branches_remote == []
assert bm.branches_local == ['master', 'test-local-only']
wf = LabbookWorkflow(lb)
# Test you can only publish on master.
with pytest.raises(GitLabException):
wf.publish(username=username)
assert wf.remote is None
# Once we return to master branch, then we can publish.
bm.workon_branch(bm.workspace_branch)
wf.publish(username=username)
assert os.path.exists(wf.remote)
# Assert that publish only pushes up the master branch.
assert bm.branches_local == ['master', 'test-local-only']
assert bm.branches_remote == ['master']
@responses.activate
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_publish__cannot_overwrite(self, mock_labbook_lfs_disabled):
""" Test cannot publish a project already published. """
responses.add(responses.GET, 'https://test.repo.gigantum.com/backup', status=404)
username = 'test'
lb = mock_labbook_lfs_disabled[2]
wf = LabbookWorkflow(lb)
wf.publish(username=username)
with pytest.raises(GitLabException):
wf.publish(username=username)
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_publish__publish_then_import_with_another_user(self, mock_labbook_lfs_disabled, mock_config_file):
""" Test cannot publish a project already published. """
username = 'test'
lb = mock_labbook_lfs_disabled[2]
wf = LabbookWorkflow(lb)
wf.publish(username=username)
other_user = 'other-test-user'
remote = RepoLocation(wf.remote, other_user)
wf_other = LabbookWorkflow.import_from_remote(remote, username=other_user)
lb_other = wf_other.repository
assert lb_other.root_dir != lb.root_dir
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_publish__publish_then_import_then_sync(self, mock_labbook_lfs_disabled, mock_config_file):
""" Test cannot publish a project already published. """
username = 'test'
lb = mock_labbook_lfs_disabled[2]
wf = LabbookWorkflow(lb)
wf.publish(username=username)
other_user = 'other-test-user'
remote = RepoLocation(wf.remote, other_user)
wf_other = LabbookWorkflow.import_from_remote(remote, username=other_user)
with open(os.path.join(wf_other.repository.root_dir, 'testfile'), 'w') as f: f.write('filedata')
wf_other.repository.sweep_uncommitted_changes()
wf_other.sync(username=other_user)
commit_hash = wf_other.repository.git.commit_hash
assert wf.repository.git.commit_hash != commit_hash
wf.sync(username=username)
assert len(wf.repository.git.commit_hash) == len(commit_hash)
assert wf.repository.git.commit_hash == commit_hash
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_import_from_remote__nominal(self, mock_labbook_lfs_disabled, mock_config_file):
""" test import_from_remote method """
username = 'testuser'
lb = mock_labbook_lfs_disabled[2]
wf = LabbookWorkflow(lb)
wf.publish(username=username)
other_user = 'other-test-user2'
remote = RepoLocation(wf.remote, other_user)
wf_other = LabbookWorkflow.import_from_remote(remote, username=other_user)
# The remotes must be the same, because it's the same remote repo
assert wf_other.remote == remote.remote_location
# The actual path on disk will be different, though
assert wf_other.repository != wf.repository
# Check imported into namespace of original owner (testuser)
assert f'{other_user}/{username}/labbooks/labbook1' in wf_other.repository.root_dir
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_import_from_remote__dataset(self, mock_labbook_lfs_disabled, mock_config_file):
""" test importing a published dataset """
username = 'testuser'
lb = mock_labbook_lfs_disabled[2]
im = InventoryManager()
ds = im.create_dataset(username, username, 'test-ds', storage_type='gigantum_object_v1')
wf = DatasetWorkflow(ds)
wf.publish(username=username)
other_user = 'other-test-user2'
remote = RepoLocation(wf.remote, other_user)
wf_other = DatasetWorkflow.import_from_remote(remote, username=other_user)
# The remotes must be the same, because it's the same remote repo
assert wf_other.remote == remote.remote_location
# The actual path on disk will be different, though
assert wf_other.repository != wf.repository
# Check imported into namespace of original owner (testuser)
assert f'{other_user}/{username}/datasets/test-ds' in wf_other.repository.root_dir
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_import_from_remote__linked_dataset(self, mock_labbook_lfs_disabled, mock_config_file):
""" test importing a project with a linked dataset"""
def dispatcher_mock(self, function_ref, kwargs, metadata):
assert kwargs['logged_in_username'] == 'other-test-user2'
assert kwargs['dataset_owner'] == 'testuser'
assert kwargs['dataset_name'] == 'test-ds'
# Stop patching so job gets scheduled for real
dispatcher_patch.stop()
# Call same method as in mutation
d = Dispatcher()
res = d.dispatch_task(gtmcore.dispatcher.dataset_jobs.check_and_import_dataset,
kwargs=kwargs, metadata=metadata)
return res
username = 'testuser'
lb = mock_labbook_lfs_disabled[2]
im = InventoryManager()
ds = im.create_dataset(username, username, 'test-ds', storage_type='gigantum_object_v1')
# Publish dataset
dataset_wf = DatasetWorkflow(ds)
dataset_wf.publish(username=username)
# Link to project
im.link_dataset_to_labbook(dataset_wf.remote, username, username, lb, username)
# Publish project
labbook_wf = LabbookWorkflow(lb)
labbook_wf.publish(username=username)
# Patch dispatch_task so you can inject the mocked config file
dispatcher_patch = patch.object(Dispatcher, 'dispatch_task', dispatcher_mock)
dispatcher_patch.start()
# Import project, triggering an auto-import of the dataset
other_user = 'other-test-user2'
remote = RepoLocation(labbook_wf.remote, other_user)
wf_other = LabbookWorkflow.import_from_remote(remote, username=other_user)
# The remotes must be the same, because it's the same remote repo
assert wf_other.remote == remote.remote_location
# The actual path on disk will be different, though
assert wf_other.repository != labbook_wf.repository
# Check imported into namespace of original owner (testuser)
assert f'{other_user}/{username}/labbooks/labbook1' in wf_other.repository.root_dir
cnt = 0
while cnt < 20:
try:
im_other_user = InventoryManager()
ds = im_other_user.load_dataset(other_user, username, 'test-ds')
break
except InventoryException:
cnt += 1
time.sleep(1)
assert cnt < 20
assert ds.name == 'test-ds'
assert ds.namespace == username
assert mock_config_file[1] in ds.root_dir
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_sync__linked_dataset(self, mock_labbook_lfs_disabled, mock_config_file):
""" test syncing a project that pulls in a linked dataset"""
def dispatcher_mock(self, function_ref, kwargs, metadata):
assert kwargs['logged_in_username'] == 'other-test-user2'
assert kwargs['dataset_owner'] == 'testuser'
assert kwargs['dataset_name'] == 'test-ds'
# Stop patching so job gets scheduled for real
dispatcher_patch.stop()
# Call same method as in mutation
d = Dispatcher()
res = d.dispatch_task(gtmcore.dispatcher.dataset_jobs.check_and_import_dataset,
kwargs=kwargs, metadata=metadata)
return res
username = 'testuser'
lb = mock_labbook_lfs_disabled[2]
im = InventoryManager()
ds = im.create_dataset(username, username, 'test-ds', storage_type='gigantum_object_v1')
# Publish dataset
dataset_wf = DatasetWorkflow(ds)
dataset_wf.publish(username=username)
# Publish project
labbook_wf = LabbookWorkflow(lb)
labbook_wf.publish(username=username)
# Import project
other_user = 'other-test-user2'
remote = RepoLocation(labbook_wf.remote, other_user)
wf_other = LabbookWorkflow.import_from_remote(remote, username=other_user)
# The remotes must be the same, because it's the same remote repo
assert wf_other.remote == remote.remote_location
assert wf_other.repository != labbook_wf.repository
assert f'{other_user}/{username}/labbooks/labbook1' in wf_other.repository.root_dir
with pytest.raises(InventoryException):
im_other_user = InventoryManager()
ds = im_other_user.load_dataset(other_user, username, 'test-ds')
# Link to project
im.link_dataset_to_labbook(dataset_wf.remote, username, username, lb, username)
# Sync project with linked dataset
labbook_wf.sync(username=username)
# Patch dispatch_task so you can inject the mocked config file
dispatcher_patch = patch.object(Dispatcher, 'dispatch_task', dispatcher_mock)
dispatcher_patch.start()
# Sync on the other end, get the dataset!
wf_other.sync(username=other_user)
cnt = 0
while cnt < 20:
try:
im_other_user = InventoryManager()
ds = im_other_user.load_dataset(other_user, username, 'test-ds')
break
except InventoryException:
cnt += 1
time.sleep(1)
assert cnt < 20
assert ds.name == 'test-ds'
assert ds.namespace == username
assert mock_config_file[1] in ds.root_dir
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_checkout__linked_dataset(self, mock_labbook_lfs_disabled, mock_config_file):
""" test checking out a branch in a project that pulls in a linked dataset"""
def dispatcher_mock(self, function_ref, kwargs, metadata):
assert kwargs['logged_in_username'] == 'other-test-user2'
assert kwargs['dataset_owner'] == 'testuser'
assert kwargs['dataset_name'] == 'test-ds'
# Stop patching so job gets scheduled for real
dispatcher_patch.stop()
# Call same method as in mutation
d = Dispatcher()
res = d.dispatch_task(gtmcore.dispatcher.dataset_jobs.check_and_import_dataset,
kwargs=kwargs, metadata=metadata)
return res
username = 'testuser'
lb = mock_labbook_lfs_disabled[2]
im = InventoryManager()
ds = im.create_dataset(username, username, 'test-ds', storage_type='gigantum_object_v1')
# Publish dataset
dataset_wf = DatasetWorkflow(ds)
dataset_wf.publish(username=username)
# Publish project
labbook_wf = LabbookWorkflow(lb)
labbook_wf.publish(username=username)
# Switch branches
labbook_wf.labbook.checkout_branch(branch_name="dataset-branch", new=True)
# Link to project
im.link_dataset_to_labbook(dataset_wf.remote, username, username, labbook_wf.labbook, username)
# Publish branch
labbook_wf.sync(username=username)
# Import project
other_user = 'other-test-user2'
remote = RepoLocation(labbook_wf.remote, other_user)
wf_other = LabbookWorkflow.import_from_remote(remote, username=other_user)
# The remotes must be the same, because it's the same remote repo
assert wf_other.remote == remote.remote_location
assert wf_other.repository != labbook_wf.repository
assert f'{other_user}/{username}/labbooks/labbook1' in wf_other.repository.root_dir
with pytest.raises(InventoryException):
im_other_user = InventoryManager()
ds = im_other_user.load_dataset(other_user, username, 'test-ds')
# Patch dispatch_task so you can inject the mocked config file
dispatcher_patch = patch.object(Dispatcher, 'dispatch_task', dispatcher_mock)
dispatcher_patch.start()
# Checkout the branch
assert wf_other.labbook.active_branch == "master"
wf_other.checkout(username=other_user, branch_name="dataset-branch")
cnt = 0
while cnt < 20:
try:
im_other_user = InventoryManager()
ds = im_other_user.load_dataset(other_user, username, 'test-ds')
break
except InventoryException:
cnt += 1
time.sleep(1)
assert cnt < 20
assert ds.name == 'test-ds'
assert ds.namespace == username
assert mock_config_file[1] in ds.root_dir
assert wf_other.labbook.active_branch == "dataset-branch"
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_sync___simple_push_to_master(self, mock_labbook_lfs_disabled, mock_config_file):
""" test import_from_remote method """
username = 'test'
lb = mock_labbook_lfs_disabled[2]
wf = LabbookWorkflow(lb)
wf.publish(username=username)
fpath = os.path.join(lb.root_dir, 'input', 'testfile')
with open(fpath, 'w') as f: f.write('filedata')
lb.sweep_uncommitted_changes()
wf.sync(username=username)
# Check hash on remote - make sure it matches local.
remote_hash = call_subprocess('git log -n 1 --oneline'.split(), cwd=wf.remote).split()[0]
assert remote_hash in lb.git.commit_hash
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_sync___push_up_new_branch(self, mock_labbook_lfs_disabled, mock_config_file):
""" test import_from_remote method """
username = 'test'
lb = mock_labbook_lfs_disabled[2]
wf = LabbookWorkflow(lb)
wf.publish(username=username)
bm = BranchManager(lb, username='test')
bm.create_branch('new-branch-to-push')
assert 'new-branch-to-push' not in bm.branches_remote
wf.sync('test')
assert 'new-branch-to-push' in bm.branches_remote
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_sync___push_up_then_sync(self, mock_labbook_lfs_disabled, mock_config_file):
""" test import_from_remote method """
username = 'test'
lb = mock_labbook_lfs_disabled[2]
wf = LabbookWorkflow(lb)
wf.publish(username=username)
bm = BranchManager(lb, username='test')
bm.create_branch('new-branch-to-push')
wf.sync('test')
# Make some change locally and commit, then sync.
fpath = os.path.join(lb.root_dir, 'input', 'testfile')
with open(fpath, 'w') as f: f.write('filedata')
lb.sweep_uncommitted_changes()
wf.sync(username=username)
@responses.activate
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_sync___detect_merge_conflict(self, mock_labbook_lfs_disabled, mock_config_file):
""" test import_from_remote method """
responses.add(responses.GET, 'https://test.repo.gigantum.com/backup', status=404)
username = 'test'
lb = mock_labbook_lfs_disabled[2]
wf = LabbookWorkflow(lb)
wf.publish(username=username)
bm = BranchManager(lb, username='test')
bm.create_branch('test-conflict-branch')
fpath1 = os.path.join(lb.root_dir, 'input', 'testfile1')
with open(fpath1, 'w') as f:
f.write('filedata1')
fpath2 = os.path.join(lb.root_dir, 'input', 'testfile2')
with open(fpath2, 'w') as f:
f.write('filedata2')
lb.sweep_uncommitted_changes()
wf.sync('test')
other_user = 'other-test-user2'
remote = RepoLocation(wf.remote, other_user)
wf_other = LabbookWorkflow.import_from_remote(remote, username=other_user)
bm_other = BranchManager(wf_other.labbook, username=other_user)
bm_other.workon_branch('test-conflict-branch')
with open(os.path.join(wf_other.labbook.root_dir, 'input', 'testfile1'), 'w') as f:
f.write('conflicting-change-other-user1')
with open(os.path.join(wf_other.labbook.root_dir, 'input', 'testfile2'), 'w') as f:
f.write('conflicting-change-other-user2')
wf_other.labbook.sweep_uncommitted_changes()
wf_other.sync(username=username)
with open(fpath1, 'w') as f:
f.write('conflicting-change-original-user')
with open(fpath2, 'w') as f:
f.write('conflicting-change-original-user')
wf.labbook.sweep_uncommitted_changes()
h = wf.labbook.git.commit_hash
with pytest.raises(MergeError):
n = wf.sync(username=username)
assert h == wf.labbook.git.commit_hash
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_sync___override_merge_conflict_theirs(self, mock_labbook_lfs_disabled, mock_config_file):
""" test sync, with override in case of merge conflict. """
""" test import_from_remote method """
username = 'test'
lb = mock_labbook_lfs_disabled[2]
wf = LabbookWorkflow(lb)
wf.publish(username=username)
bm = BranchManager(lb, username='test')
bm.create_branch('test-conflict-branch')
fpath = os.path.join(lb.root_dir, 'input', 'testfile')
with open(fpath, 'w') as f: f.write('filedata')
lb.sweep_uncommitted_changes()
wf.sync('test')
other_user = 'other-test-user2'
remote = RepoLocation(wf.remote, other_user)
wf_other = LabbookWorkflow.import_from_remote(remote, username=other_user)
bm_other = BranchManager(wf_other.labbook, username=other_user)
bm_other.workon_branch('test-conflict-branch')
with open(os.path.join(wf_other.labbook.root_dir, 'input', 'testfile'), 'w') as f:
f.write('conflicting-change-other-user')
wf_other.labbook.sweep_uncommitted_changes()
wf_other.sync(username=username)
assert wf_other.repository.root_dir != wf.repository.root_dir
fpath = os.path.join(wf.labbook.root_dir, 'input', 'testfile')
with open(fpath, 'w') as f: f.write('conflicting-change-original-user')
wf.labbook.sweep_uncommitted_changes()
assert wf.labbook.is_repo_clean
h = wf.labbook.git.commit_hash
n = wf.sync(username=username, override=MergeOverride.THEIRS)
assert h != wf.labbook.git.commit_hash
flines = open(os.path.join(wf.labbook.root_dir, 'input', 'testfile')).read()
assert 'conflicting-change-other-user' == flines
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_sync___override_merge_conflict_ours(self, mock_labbook_lfs_disabled, mock_config_file):
""" test sync, with override in case of merge conflict. """
username = 'test'
lb = mock_labbook_lfs_disabled[2]
wf = LabbookWorkflow(lb)
wf.publish(username=username)
bm = BranchManager(lb, username='test')
bm.create_branch('test-conflict-branch')
fpath = os.path.join(lb.root_dir, 'input', 'testfile')
with open(fpath, 'w') as f: f.write('filedata')
lb.sweep_uncommitted_changes()
wf.sync('test')
other_user = 'other-test-user2'
remote = RepoLocation(wf.remote, other_user)
wf_other = LabbookWorkflow.import_from_remote(remote, username=other_user)
bm_other = BranchManager(wf_other.labbook, username=other_user)
bm_other.workon_branch('test-conflict-branch')
with open(os.path.join(wf_other.labbook.root_dir, 'input', 'testfile'), 'w') as f:
f.write('conflicting-change-other-user')
wf_other.labbook.sweep_uncommitted_changes()
wf_other.sync(username=username)
fpath = os.path.join(wf.labbook.root_dir, 'input', 'testfile')
with open(fpath, 'w') as f: f.write('conflicting-change-original-user')
wf.labbook.sweep_uncommitted_changes()
n = wf.sync(username=username, override=MergeOverride.OURS)
flines = open(os.path.join(wf.labbook.root_dir, 'input', 'testfile')).read()
assert 'conflicting-change-original-user' == flines
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_reset__no_op(self, mock_labbook_lfs_disabled, mock_config_file):
""" test reset performs no operation when there's nothing to do """
username = 'test'
lb = mock_labbook_lfs_disabled[2]
wf = LabbookWorkflow(lb)
wf.reset(username=username)
wf.publish(username=username)
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_reset__reset_local_change_same_owner(self, mock_labbook_lfs_disabled):
""" test reset performs no operation when there's nothing to do """
username = 'test'
lb = mock_labbook_lfs_disabled[2]
wf = LabbookWorkflow(lb)
wf.publish(username=username)
commit_to_check = lb.git.commit_hash
# Make some change locally and commit
fpath = os.path.join(lb.root_dir, 'input', 'testfile')
with open(fpath, 'w') as f:
f.write('filedata')
lb.sweep_uncommitted_changes()
assert lb.git.commit_hash != commit_to_check
# Make an UNTRACKED change locally, make sure it gets cleared up
untracked_file = os.path.join(lb.root_dir, 'output', 'untracked-file')
with open(untracked_file, 'w') as f:
f.write('untracked data')
# Do a reset and make sure state resets appropriately
wf.reset(username=username)
assert lb.git.commit_hash == commit_to_check
assert not os.path.exists(fpath)
assert not os.path.exists(untracked_file)
remote_hash = call_subprocess('git log -n 1 --oneline'.split(), cwd=wf.remote).split()[0]
assert remote_hash in lb.git.commit_hash
def test_migrate_old_schema_1_project(self, mock_config_file):
""" Test migrating a very old schema 1/gm.workspace LabBook """
p = resource_filename('gtmcore', 'workflows')
p2 = os.path.join(p, 'tests', 'snappy.zip')
with tempfile.TemporaryDirectory() as td:
call_subprocess(f"unzip {p2} -d {td}".split(), cwd=td)
temp_lb_path = os.path.join(td, 'snappy')
# Tests backwards compatibility (test.zip is a very old schema 1 LabBook)
lb = InventoryManager().load_labbook_from_directory(temp_lb_path)
wf = LabbookWorkflow(lb)
wf.labbook.remove_remote()
wf.migrate()
# Test that current branch is as appropriate
assert wf.labbook.active_branch == 'master'
# Test that there is an activity record indicate migration
assert any(['Migrate schema to 2' in c['message'] for c in wf.labbook.git.log()[:5]])
# Test schema has successfully rolled to 2
assert wf.labbook.schema == 2
# Test that untracked space exists (if we add something to untracked space)
assert wf.labbook.is_repo_clean
with open(os.path.join(lb.root_dir, 'output/untracked', 'untracked-file'), 'wb') as fb:
fb.write(b'cat' * 100)
assert wf.labbook.is_repo_clean
def test_migrate_no_op_on_new_labbook(self, mock_labbook_lfs_disabled):
username = 'test'
lb = mock_labbook_lfs_disabled[2]
wf = LabbookWorkflow(lb)
h1 = wf.labbook.git.commit_hash
wf.migrate()
# No change of git status after no-op migration
assert h1 == wf.labbook.git.commit_hash
def test_should_migrate_on_old_project(self, mock_config_file):
p = resource_filename('gtmcore', 'workflows')
p2 = os.path.join(p, 'tests', 'lb-to-migrate-197b6a.zip')
with tempfile.TemporaryDirectory() as tempdir:
lbp = shutil.copyfile(p2, os.path.join(tempdir, 'lb-to-migrate.zip'))
subprocess.run(f'unzip lb-to-migrate.zip'.split(),
check=True, cwd=tempdir)
im = InventoryManager()
lb = im.load_labbook_from_directory(os.path.join(tempdir, 'lb-to-migrate'))
wf = LabbookWorkflow(lb)
assert wf.should_migrate() is True
wf.migrate()
assert wf.labbook.active_branch == 'master'
assert wf.should_migrate() is False
wf.labbook.git.checkout('gm.workspace')
assert wf.should_migrate() is False
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_publish__dataset(self, mock_config_file):
def update_feedback(msg: str, has_failures: Optional[bool] = None, failure_detail: Optional[str] = None,
percent_complete: Optional[float] = None):
"""Method to update the job's metadata and provide feedback to the UI"""
assert has_failures is None or has_failures is False
assert failure_detail is None
def dispatch_query_mock(self, job_key):
JobStatus = namedtuple("JobStatus", ['status', 'meta'])
return JobStatus(status='finished', meta={'completed_bytes': '500'})
def dispatch_mock(self, method_reference, kwargs, metadata, persist):
return "afakejobkey"
username = 'test'
im = InventoryManager()
ds = im.create_dataset(username, username, 'dataset-1', 'gigantum_object_v1')
m = Manifest(ds, username)
wf = DatasetWorkflow(ds)
# Put a file into the dataset that needs to be pushed
helper_append_file(m.cache_mgr.cache_root, m.dataset_revision, "test1.txt", "asdfadfsdf")
m.sweep_all_changes()
iom = IOManager(ds, m)
assert len(glob.glob(f'{iom.push_dir}/*')) == 1
with patch.object(Dispatcher, 'dispatch_task', dispatch_mock):
with patch.object(Dispatcher, 'query_task', dispatch_query_mock):
wf.publish(username=username, feedback_callback=update_feedback)
assert os.path.exists(wf.remote)
assert len(glob.glob(f'{iom.push_dir}/*')) == 0
@mock.patch('gtmcore.workflows.gitworkflows_utils.create_remote_gitlab_repo', new=_MOCK_create_remote_repo)
def test_sync__dataset(self, mock_config_file):
def update_feedback(msg: str, has_failures: Optional[bool] = None, failure_detail: Optional[str] = None,
percent_complete: Optional[float] = None):
"""Method to update the job's metadata and provide feedback to the UI"""
assert has_failures is None or has_failures is False
assert failure_detail is None
def dispatch_query_mock(self, job_key):
JobStatus = namedtuple("JobStatus", ['status', 'meta'])
return JobStatus(status='finished', meta={'completed_bytes': '100'})
def dispatch_mock(self, method_reference, kwargs, metadata, persist):
return "afakejobkey"
username = 'test'
im = InventoryManager()
ds = im.create_dataset(username, username, 'dataset-1', 'gigantum_object_v1')
m = Manifest(ds, username)
wf = DatasetWorkflow(ds)
iom = IOManager(ds, m)
assert len(glob.glob(f'{iom.push_dir}/*')) == 0
wf.publish(username=username, feedback_callback=update_feedback)
# Put a file into the dataset that needs to be pushed
helper_append_file(m.cache_mgr.cache_root, m.dataset_revision, "test1.txt", "asdfadfsdf")
m.sweep_all_changes()
assert len(glob.glob(f'{iom.push_dir}/*')) == 1
with patch.object(Dispatcher, 'dispatch_task', dispatch_mock):
with patch.object(Dispatcher, 'query_task', dispatch_query_mock):
wf.sync(username=username, feedback_callback=update_feedback)
assert os.path.exists(wf.remote)
assert len(glob.glob(f'{iom.push_dir}/*')) == 0
@responses.activate
@mock.patch('gtmcore.gitlib.git_fs_shim.GitFilesystemShimmed.fetch', new=_mock_fetch)
def test_create_remote_gitlab_repo(self, mock_config_file):
responses.add(responses.POST, 'https://test.gigantum.com/api/v1/',
json={'data': {'additionalCredentials': {'gitServiceToken': 'afaketoken'}}}, status=200)
responses.add(responses.GET, 'https://test.repo.gigantum.com/api/v4/projects/test%2Flabbook-1',
status=404)
responses.add(responses.POST, 'https://test.repo.gigantum.com/api/v4/projects', status=201)
username = 'test'
im = InventoryManager()
lb = im.create_labbook(username, username, 'labbook-1')
with pytest.raises(ValueError):
create_remote_gitlab_repo(lb, username, 'private', 'afakeaccesstoken', None)
create_remote_gitlab_repo(lb, username, 'private', 'afakeaccesstoken', "afakeidtoken")
assert lb.remote == 'https://test@test.repo.gigantum.com/test/labbook-1.git/'
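As an aside on the mocked URLs above: the GET request targets `projects/test%2Flabbook-1` because GitLab's v4 API accepts a URL-encoded "namespace/project" string in place of a numeric project id, so the slash itself must be percent-encoded. That convention can be reproduced with the standard library (an illustrative sketch — `gitlab_project_api_path` is a hypothetical helper, not part of gtmcore):

```python
from urllib.parse import quote


def gitlab_project_api_path(namespace: str, project: str) -> str:
    """Build the /projects/<id> path segment GitLab's v4 API expects.

    safe="" forces the slash in "namespace/project" to be encoded as %2F,
    which is what distinguishes the path-style id from a URL path.
    """
    return quote(f"{namespace}/{project}", safe="")


print(gitlab_project_api_path("test", "labbook-1"))  # test%2Flabbook-1
```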
""" Panoptes - AWS
Responsible for importing the analysis functions for AWS.
"""
from panoptes.aws import analysis
from panoptes.aws import attached
from panoptes.aws import authentication
from panoptes.aws import exceptions
from panoptes.aws import output
from panoptes.aws import whitelist
if __name__ == "__main__":
pass
from keras.models import Model
from keras.layers import (Input, Conv2D, Conv3D)
def create_image_super_resolution_model_2d(input_image_size,
convolution_kernel_sizes=[(9, 9), (1, 1), (5, 5)],
number_of_filters=(64, 32)
):
"""
2-D implementation of the image super resolution deep learning architecture.
Creates a Keras model of the image super resolution framework based on the paper available here:
https://arxiv.org/pdf/1501.00092
This particular implementation is based on the following python
implementation:
https://github.com/titu1994/Image-Super-Resolution
Arguments
---------
input_image_size : tuple of length 3
Used for specifying the input tensor shape. The shape (or dimension) of
that tensor is the image dimensions followed by the number of channels
(e.g., red, green, and blue).
convolution_kernel_sizes : list of 2-d tuples
Specifies the kernel size at each convolution layer. Default values are
the same as given in the original paper. The length of kernel size list
must be 1 greater than the tuple length of the number of filters.
number_of_filters : tuple
Contains the number of filters for each convolutional layer. Default values
are the same as given in the original paper.
Returns
-------
Keras model
A 2-D Keras model defining the network.
Example
-------
>>> model = create_image_super_resolution_model_2d((128, 128, 1))
>>> model.summary()
"""
number_of_convolution_layers = len(convolution_kernel_sizes)
if len(number_of_filters) != (number_of_convolution_layers - 1):
raise ValueError("The length of the number of filters must be 1 less than the length of the convolution vector size")
inputs = Input(shape = input_image_size)
outputs = inputs
for i in range(number_of_convolution_layers - 1):
outputs = Conv2D(filters=number_of_filters[i],
kernel_size=convolution_kernel_sizes[i],
activation='relu',
padding='same')(outputs)
number_of_channels = input_image_size[-1]
outputs = Conv2D(filters=number_of_channels,
kernel_size=convolution_kernel_sizes[-1],
activation='relu',
padding='same')(outputs)
sr_model = Model(inputs=inputs, outputs=outputs)
return(sr_model)
def create_image_super_resolution_model_3d(input_image_size,
convolution_kernel_sizes=[(9, 9, 9), (1, 1, 1), (5, 5, 5)],
number_of_filters=(64, 32)
):
"""
3-D implementation of the image super resolution deep learning architecture.
Creates a Keras model of the image super resolution framework based on the paper available here:
https://arxiv.org/pdf/1501.00092
This particular implementation is based on the following python
implementation:
https://github.com/titu1994/Image-Super-Resolution
Arguments
---------
input_image_size : tuple of length 4
Used for specifying the input tensor shape. The shape (or dimension) of
that tensor is the image dimensions followed by the number of channels
(e.g., red, green, and blue).
convolution_kernel_sizes : list of 3-d tuples
Specifies the kernel size at each convolution layer. Default values are
the same as given in the original paper. The length of kernel size list
must be 1 greater than the tuple length of the number of filters.
number_of_filters : tuple
Contains the number of filters for each convolutional layer. Default values
are the same as given in the original paper.
Returns
-------
Keras model
A 3-D Keras model defining the network.
Example
-------
>>> model = create_image_super_resolution_model_3d((128, 128, 128, 1))
>>> model.summary()
"""
number_of_convolution_layers = len(convolution_kernel_sizes)
if len(number_of_filters) != (number_of_convolution_layers - 1):
raise ValueError("The length of the number of filters must be 1 less than the length of the convolution vector size")
inputs = Input(shape = input_image_size)
outputs = inputs
for i in range(number_of_convolution_layers - 1):
outputs = Conv3D(filters=number_of_filters[i],
kernel_size=convolution_kernel_sizes[i],
activation='relu',
padding='same')(outputs)
number_of_channels = input_image_size[-1]
outputs = Conv3D(filters=number_of_channels,
kernel_size=convolution_kernel_sizes[-1],
activation='relu',
padding='same')(outputs)
sr_model = Model(inputs=inputs, outputs=outputs)
return(sr_model)
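Because every layer above uses `padding='same'` with stride 1, the spatial dimensions never change and the model's size is determined entirely by the kernel sizes and filter counts. The trainable-parameter count can be verified in plain Python (a stdlib sketch; `srcnn_param_count` is a hypothetical helper mirroring the 2-D function's defaults, not part of this module):

```python
def srcnn_param_count(channels,
                      kernel_sizes=((9, 9), (1, 1), (5, 5)),
                      n_filters=(64, 32)):
    """Count trainable parameters of the SRCNN-style 2-D stack above.

    Each Conv2D layer with kernel (kh, kw), in_ch input channels and out_ch
    output channels contributes kh * kw * in_ch * out_ch weights plus
    out_ch biases (Keras Conv2D defaults to use_bias=True).
    """
    in_chs = (channels,) + tuple(n_filters)    # input channels per layer
    out_chs = tuple(n_filters) + (channels,)   # output channels per layer
    total = 0
    for (kh, kw), ic, oc in zip(kernel_sizes, in_chs, out_chs):
        total += kh * kw * ic * oc + oc
    return total


# Default single-channel network:
# (9*9*1*64 + 64) + (1*1*64*32 + 32) + (5*5*32*1 + 1)
print(srcnn_param_count(1))  # 8129
```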
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['EndpointServicebusQueueArgs', 'EndpointServicebusQueue']
@pulumi.input_type
class EndpointServicebusQueueArgs:
def __init__(__self__, *,
connection_string: pulumi.Input[str],
iothub_name: pulumi.Input[str],
resource_group_name: pulumi.Input[str],
name: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a EndpointServicebusQueue resource.
:param pulumi.Input[str] connection_string: The connection string for the endpoint.
:param pulumi.Input[str] name: The name of the endpoint. The name must be unique across endpoint types. The following names are reserved: `events`, `operationsMonitoringEvents`, `fileNotifications` and `$default`.
"""
pulumi.set(__self__, "connection_string", connection_string)
pulumi.set(__self__, "iothub_name", iothub_name)
pulumi.set(__self__, "resource_group_name", resource_group_name)
if name is not None:
pulumi.set(__self__, "name", name)
@property
@pulumi.getter(name="connectionString")
def connection_string(self) -> pulumi.Input[str]:
"""
The connection string for the endpoint.
"""
return pulumi.get(self, "connection_string")
@connection_string.setter
def connection_string(self, value: pulumi.Input[str]):
pulumi.set(self, "connection_string", value)
@property
@pulumi.getter(name="iothubName")
def iothub_name(self) -> pulumi.Input[str]:
return pulumi.get(self, "iothub_name")
@iothub_name.setter
def iothub_name(self, value: pulumi.Input[str]):
pulumi.set(self, "iothub_name", value)
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> pulumi.Input[str]:
return pulumi.get(self, "resource_group_name")
@resource_group_name.setter
def resource_group_name(self, value: pulumi.Input[str]):
pulumi.set(self, "resource_group_name", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the endpoint. The name must be unique across endpoint types. The following names are reserved: `events`, `operationsMonitoringEvents`, `fileNotifications` and `$default`.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@pulumi.input_type
class _EndpointServicebusQueueState:
def __init__(__self__, *,
connection_string: Optional[pulumi.Input[str]] = None,
iothub_name: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering EndpointServicebusQueue resources.
:param pulumi.Input[str] connection_string: The connection string for the endpoint.
:param pulumi.Input[str] name: The name of the endpoint. The name must be unique across endpoint types. The following names are reserved: `events`, `operationsMonitoringEvents`, `fileNotifications` and `$default`.
"""
if connection_string is not None:
pulumi.set(__self__, "connection_string", connection_string)
if iothub_name is not None:
pulumi.set(__self__, "iothub_name", iothub_name)
if name is not None:
pulumi.set(__self__, "name", name)
if resource_group_name is not None:
pulumi.set(__self__, "resource_group_name", resource_group_name)
@property
@pulumi.getter(name="connectionString")
def connection_string(self) -> Optional[pulumi.Input[str]]:
"""
The connection string for the endpoint.
"""
return pulumi.get(self, "connection_string")
@connection_string.setter
def connection_string(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "connection_string", value)
@property
@pulumi.getter(name="iothubName")
def iothub_name(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "iothub_name")
@iothub_name.setter
def iothub_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "iothub_name", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the endpoint. The name must be unique across endpoint types. The following names are reserved: `events`, `operationsMonitoringEvents`, `fileNotifications` and `$default`.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "resource_group_name")
@resource_group_name.setter
def resource_group_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource_group_name", value)
class EndpointServicebusQueue(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
connection_string: Optional[pulumi.Input[str]] = None,
iothub_name: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Manages an IotHub ServiceBus Queue Endpoint
> **NOTE:** Endpoints can be defined either directly on the `iot.IoTHub` resource, or using the `azurerm_iothub_endpoint_*` resources - but the two ways of defining the endpoints cannot be used together. If both are used against the same IoTHub, spurious changes will occur. Also, defining a `azurerm_iothub_endpoint_*` resource and another endpoint of a different type directly on the `iot.IoTHub` resource is not supported.
## Example Usage
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_namespace = azure.servicebus.Namespace("exampleNamespace",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
sku="Standard")
example_queue = azure.servicebus.Queue("exampleQueue",
resource_group_name=example_resource_group.name,
namespace_name=example_namespace.name,
enable_partitioning=True)
example_queue_authorization_rule = azure.servicebus.QueueAuthorizationRule("exampleQueueAuthorizationRule",
namespace_name=example_namespace.name,
queue_name=example_queue.name,
resource_group_name=example_resource_group.name,
listen=False,
send=True,
manage=False)
example_io_t_hub = azure.iot.IoTHub("exampleIoTHub",
resource_group_name=example_resource_group.name,
location=example_resource_group.location,
sku=azure.iot.IoTHubSkuArgs(
name="B1",
tier="Basic",
capacity=1,
),
tags={
"purpose": "example",
})
example_endpoint_servicebus_queue = azure.iot.EndpointServicebusQueue("exampleEndpointServicebusQueue",
resource_group_name=example_resource_group.name,
iothub_name=example_io_t_hub.name,
connection_string=example_queue_authorization_rule.primary_connection_string)
```
## Import
IoTHub ServiceBus Queue Endpoint can be imported using the `resource id`, e.g.
```sh
$ pulumi import azure:iot/endpointServicebusQueue:EndpointServicebusQueue servicebus_queue1 /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/mygroup1/providers/Microsoft.Devices/IotHubs/hub1/Endpoints/servicebusqueue_endpoint1
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] connection_string: The connection string for the endpoint.
:param pulumi.Input[str] name: The name of the endpoint. The name must be unique across endpoint types. The following names are reserved: `events`, `operationsMonitoringEvents`, `fileNotifications` and `$default`.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: EndpointServicebusQueueArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Manages an IotHub ServiceBus Queue Endpoint
> **NOTE:** Endpoints can be defined either directly on the `iot.IoTHub` resource, or using the `azurerm_iothub_endpoint_*` resources - but the two ways of defining the endpoints cannot be used together. If both are used against the same IoTHub, spurious changes will occur. Also, defining a `azurerm_iothub_endpoint_*` resource and another endpoint of a different type directly on the `iot.IoTHub` resource is not supported.
## Example Usage
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_namespace = azure.servicebus.Namespace("exampleNamespace",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
sku="Standard")
example_queue = azure.servicebus.Queue("exampleQueue",
resource_group_name=example_resource_group.name,
namespace_name=example_namespace.name,
enable_partitioning=True)
example_queue_authorization_rule = azure.servicebus.QueueAuthorizationRule("exampleQueueAuthorizationRule",
namespace_name=example_namespace.name,
queue_name=example_queue.name,
resource_group_name=example_resource_group.name,
listen=False,
send=True,
manage=False)
example_io_t_hub = azure.iot.IoTHub("exampleIoTHub",
resource_group_name=example_resource_group.name,
location=example_resource_group.location,
sku=azure.iot.IoTHubSkuArgs(
name="B1",
tier="Basic",
capacity=1,
),
tags={
"purpose": "example",
})
example_endpoint_servicebus_queue = azure.iot.EndpointServicebusQueue("exampleEndpointServicebusQueue",
resource_group_name=example_resource_group.name,
iothub_name=example_io_t_hub.name,
connection_string=example_queue_authorization_rule.primary_connection_string)
```
## Import
IoTHub ServiceBus Queue Endpoint can be imported using the `resource id`, e.g.
```sh
$ pulumi import azure:iot/endpointServicebusQueue:EndpointServicebusQueue servicebus_queue1 /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/mygroup1/providers/Microsoft.Devices/IotHubs/hub1/Endpoints/servicebusqueue_endpoint1
```
:param str resource_name: The name of the resource.
:param EndpointServicebusQueueArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(EndpointServicebusQueueArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
connection_string: Optional[pulumi.Input[str]] = None,
iothub_name: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = EndpointServicebusQueueArgs.__new__(EndpointServicebusQueueArgs)
if connection_string is None and not opts.urn:
raise TypeError("Missing required property 'connection_string'")
__props__.__dict__["connection_string"] = connection_string
if iothub_name is None and not opts.urn:
raise TypeError("Missing required property 'iothub_name'")
__props__.__dict__["iothub_name"] = iothub_name
__props__.__dict__["name"] = name
if resource_group_name is None and not opts.urn:
raise TypeError("Missing required property 'resource_group_name'")
__props__.__dict__["resource_group_name"] = resource_group_name
super(EndpointServicebusQueue, __self__).__init__(
'azure:iot/endpointServicebusQueue:EndpointServicebusQueue',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
connection_string: Optional[pulumi.Input[str]] = None,
iothub_name: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None) -> 'EndpointServicebusQueue':
"""
Get an existing EndpointServicebusQueue resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] connection_string: The connection string for the endpoint.
:param pulumi.Input[str] name: The name of the endpoint. The name must be unique across endpoint types. The following names are reserved: `events`, `operationsMonitoringEvents`, `fileNotifications` and `$default`.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _EndpointServicebusQueueState.__new__(_EndpointServicebusQueueState)
__props__.__dict__["connection_string"] = connection_string
__props__.__dict__["iothub_name"] = iothub_name
__props__.__dict__["name"] = name
__props__.__dict__["resource_group_name"] = resource_group_name
return EndpointServicebusQueue(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="connectionString")
def connection_string(self) -> pulumi.Output[str]:
"""
The connection string for the endpoint.
"""
return pulumi.get(self, "connection_string")
@property
@pulumi.getter(name="iothubName")
def iothub_name(self) -> pulumi.Output[str]:
return pulumi.get(self, "iothub_name")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
The name of the endpoint. The name must be unique across endpoint types. The following names are reserved: `events`, `operationsMonitoringEvents`, `fileNotifications` and `$default`.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> pulumi.Output[str]:
return pulumi.get(self, "resource_group_name")
import typing
import pytest
from dagster import (
DagsterInvalidDefinitionError,
DagsterTypeCheckDidNotPass,
Dict,
InputDefinition,
OutputDefinition,
execute_solid,
lambda_solid,
usable_as_dagster_type,
)
def test_basic_python_dictionary_output():
@lambda_solid(output_def=OutputDefinition(dict))
def emit_dict():
return {"key": "value"}
assert execute_solid(emit_dict).output_value() == {"key": "value"}
def test_basic_python_dictionary_input():
@lambda_solid(input_defs=[InputDefinition("data", dict)], output_def=OutputDefinition(str))
def input_dict(data):
return data["key"]
assert (
execute_solid(input_dict, input_values={"data": {"key": "value"}}).output_value() == "value"
)
def test_basic_python_dictionary_wrong():
@lambda_solid(output_def=OutputDefinition(dict))
def emit_dict():
return 1
with pytest.raises(DagsterTypeCheckDidNotPass):
execute_solid(emit_dict)
def test_basic_python_dictionary_input_wrong():
@lambda_solid(input_defs=[InputDefinition("data", dict)], output_def=OutputDefinition(str))
def input_dict(data):
return data["key"]
with pytest.raises(DagsterTypeCheckDidNotPass):
execute_solid(input_dict, input_values={"data": 123})
def test_execute_dict_from_config():
@lambda_solid(input_defs=[InputDefinition("data", dict)], output_def=OutputDefinition(str))
def input_dict(data):
return data["key"]
assert (
execute_solid(
input_dict,
run_config={"solids": {"input_dict": {"inputs": {"data": {"key": "in_config"}}}}},
).output_value()
== "in_config"
)
def test_dagster_dictionary_output():
@lambda_solid(output_def=OutputDefinition(dict))
def emit_dict():
return {"key": "value"}
assert execute_solid(emit_dict).output_value() == {"key": "value"}
def test_basic_dagster_dictionary_input():
@lambda_solid(input_defs=[InputDefinition("data", Dict)], output_def=OutputDefinition(str))
def input_dict(data):
return data["key"]
assert (
execute_solid(input_dict, input_values={"data": {"key": "value"}}).output_value() == "value"
)
def test_basic_typing_dictionary_output():
@lambda_solid(output_def=OutputDefinition(typing.Dict))
def emit_dict():
return {"key": "value"}
    assert execute_solid(emit_dict).output_value() == {"key": "value"}


def test_basic_typing_dictionary_input():
    @lambda_solid(
        input_defs=[InputDefinition("data", typing.Dict)], output_def=OutputDefinition(str)
    )
    def input_dict(data):
        return data["key"]

    assert (
        execute_solid(input_dict, input_values={"data": {"key": "value"}}).output_value() == "value"
    )


def test_basic_closed_typing_dictionary_wrong():
    @lambda_solid(output_def=OutputDefinition(typing.Dict[int, int]))
    def emit_dict():
        return 1

    with pytest.raises(DagsterTypeCheckDidNotPass):
        execute_solid(emit_dict)


def test_basic_closed_typing_dictionary_output():
    @lambda_solid(output_def=OutputDefinition(typing.Dict[str, str]))
    def emit_dict():
        return {"key": "value"}

    assert execute_solid(emit_dict).output_value() == {"key": "value"}
    assert emit_dict.output_defs[0].dagster_type.key == "TypedPythonDict.String.String"
    assert emit_dict.output_defs[0].dagster_type.key_type.unique_name == "String"
    assert emit_dict.output_defs[0].dagster_type.value_type.unique_name == "String"


def test_basic_closed_typing_dictionary_input():
    @lambda_solid(
        input_defs=[InputDefinition("data", typing.Dict[str, str])],
        output_def=OutputDefinition(str),
    )
    def input_dict(data):
        return data["key"]

    assert (
        execute_solid(input_dict, input_values={"data": {"key": "value"}}).output_value() == "value"
    )


def test_basic_closed_typing_dictionary_key_wrong():
    @lambda_solid(output_def=OutputDefinition(typing.Dict[str, str]))
    def emit_dict():
        return {1: "foo"}

    with pytest.raises(DagsterTypeCheckDidNotPass):
        execute_solid(emit_dict)


def test_basic_closed_typing_dictionary_value_wrong():
    @lambda_solid(output_def=OutputDefinition(typing.Dict[str, str]))
    def emit_dict():
        return {"foo": 1}

    with pytest.raises(DagsterTypeCheckDidNotPass):
        execute_solid(emit_dict)


def test_complicated_dictionary_typing_pass():
    @lambda_solid(output_def=OutputDefinition(typing.Dict[str, typing.List[typing.Dict[int, int]]]))
    def emit_dict():
        return {"foo": [{1: 1, 2: 2}]}

    assert execute_solid(emit_dict).output_value() == {"foo": [{1: 1, 2: 2}]}


def test_complicated_dictionary_typing_fail():
    @lambda_solid(output_def=OutputDefinition(typing.Dict[str, typing.List[typing.Dict[int, int]]]))
    def emit_dict():
        return {"foo": [{1: 1, "2": 2}]}

    with pytest.raises(DagsterTypeCheckDidNotPass):
        execute_solid(emit_dict)


def test_dict_type_loader():
    test_input = {"hello": 5, "goodbye": 42}

    @lambda_solid(input_defs=[InputDefinition("dict_input", dagster_type=typing.Dict[str, int])])
    def emit_dict(dict_input):
        return dict_input

    result = execute_solid(
        emit_dict,
        run_config={"solids": {"emit_dict": {"inputs": {"dict_input": test_input}}}},
    )
    assert result.success
    assert result.output_value() == test_input


def test_dict_type_loader_typing_fail():
    @usable_as_dagster_type
    class CustomType(str):
        pass

    test_input = {"hello": "foo", "goodbye": "bar"}

    @lambda_solid(
        input_defs=[InputDefinition("dict_input", dagster_type=typing.Dict[str, CustomType])]
    )
    def emit_dict(dict_input):
        return dict_input

    with pytest.raises(
        DagsterInvalidDefinitionError,
        match="Input 'dict_input' of solid 'emit_dict' has no upstream output, "
        "no default value, and no dagster type loader.",
    ):
        execute_solid(
            emit_dict,
            run_config={"solids": {"emit_dict": {"inputs": {"dict_input": test_input}}}},
        )


def test_dict_type_loader_inner_type_mismatch():
    test_input = {"hello": "foo", "goodbye": "bar"}

    @lambda_solid(input_defs=[InputDefinition("dict_input", dagster_type=typing.Dict[str, int])])
    def emit_dict(dict_input):
        return dict_input

    # TODO: change this depending on the resolution of
    # https://github.com/dagster-io/dagster/issues/3180
    with pytest.raises(DagsterTypeCheckDidNotPass):
        execute_solid(
            emit_dict,
            run_config={"solids": {"emit_dict": {"inputs": {"dict_input": test_input}}}},
        )
# --- tests/functional/saltenv/ops/test_func_remove_version.py (eitrtechnologies/saltenv, Apache-2.0) ---
from pathlib import Path
async def test_func_remove_version_exists(mock_hub, hub, tmp_path):
    """
    SCENARIO #1:
    - The version exists within LOCAL_VERSIONS
    """
    # Link the function to the mock_hub
    mock_hub.saltenv.ops.remove_version = hub.saltenv.ops.remove_version

    # Create the new valid versions to include in LOCAL_VERSIONS
    mocked_versions_dir = tmp_path / "versions"
    mocked_versions_dir.mkdir()
    valid_path_3001 = mocked_versions_dir / "salt-3001"
    valid_path_3001.write_text("valid")
    valid_path_3004 = mocked_versions_dir / "salt-3004"
    valid_path_3004.write_text("valid")

    # Add the two valid versions to LOCAL_VERSIONS
    mock_hub.saltenv.ops.LOCAL_VERSIONS = {
        "3001": Path(valid_path_3001),
        "3004": Path(valid_path_3004),
    }

    # Set the expected LOCAL_VERSIONS value after remove_version is called.
    # This needs to be created before remove_version because otherwise
    # the Path object creation would fail due to the file having been deleted.
    expected_local_versions = {
        "3001": Path(valid_path_3001),
        "3004": Path(valid_path_3004),
    }

    # Call remove_version with a version that is present in LOCAL_VERSIONS
    ret = await mock_hub.saltenv.ops.remove_version("3004")
    assert ret == True

    # Confirm that the specified version had its file removed
    assert valid_path_3004.exists() == False

    # Confirm that the LOCAL_VERSIONS list is unchanged.
    assert expected_local_versions == mock_hub.saltenv.ops.LOCAL_VERSIONS


async def test_func_remove_version_does_not_exist(mock_hub, hub, tmp_path):
    """
    SCENARIO #2:
    - The version does not exist within LOCAL_VERSIONS
    """
    # Link the function to the mock_hub
    mock_hub.saltenv.ops.remove_version = hub.saltenv.ops.remove_version

    # Create the new valid versions to include in LOCAL_VERSIONS
    mocked_versions_dir = tmp_path / "versions"
    mocked_versions_dir.mkdir()
    valid_path_3001 = mocked_versions_dir / "salt-3001"
    valid_path_3001.write_text("valid")
    valid_path_3004 = mocked_versions_dir / "salt-3004"
    valid_path_3004.write_text("valid")

    # Add the two valid versions to LOCAL_VERSIONS
    mock_hub.saltenv.ops.LOCAL_VERSIONS = {
        "3001": Path(valid_path_3001),
        "3004": Path(valid_path_3004),
    }

    # Set the expected LOCAL_VERSIONS value after remove_version is called.
    expected_local_versions = {
        "3001": Path(valid_path_3001),
        "3004": Path(valid_path_3004),
    }

    # Call remove_version with a version that is not present in LOCAL_VERSIONS
    ret = await mock_hub.saltenv.ops.remove_version("3003")
    assert ret == True

    # Confirm that the two valid version files still exist and were unaffected by
    # the remove_version call
    assert valid_path_3001.exists() == True
    assert valid_path_3004.exists() == True

    # Confirm that the LOCAL_VERSIONS list is unchanged.
    assert expected_local_versions == mock_hub.saltenv.ops.LOCAL_VERSIONS


async def test_func_remove_version_empty_local_versions(mock_hub, hub):
    """
    SCENARIO #3:
    - LOCAL_VERSIONS is empty
    """
    # Link the function to the mock_hub
    mock_hub.saltenv.ops.remove_version = hub.saltenv.ops.remove_version

    # Set LOCAL_VERSIONS as empty
    mock_hub.saltenv.ops.LOCAL_VERSIONS = {}

    # Set the expected LOCAL_VERSIONS value after remove_version is called.
    expected_local_versions = {}

    # Call remove_version with a version that is not present in LOCAL_VERSIONS
    ret = await mock_hub.saltenv.ops.remove_version("3003")
    assert ret == True

    # Confirm that the LOCAL_VERSIONS list is unchanged.
    assert expected_local_versions == mock_hub.saltenv.ops.LOCAL_VERSIONS
# --- onehot/__init__.py (kmedian/onehot, MIT) ---
from .onehotdummy_class import OneHotDummy
from .mapping_to_colname import mapping_to_colname
# --- nighres/cortex/__init__.py (marcobarilari/nighres, Apache-2.0) ---
from nighres.cortex.cruise_cortex_extraction import cruise_cortex_extraction
# --- tests/test_radius.py (jaywonchung/mabwiser, Apache-2.0) ---
# -*- coding: utf-8 -*-
import numpy as np
from mabwiser.mab import LearningPolicy, NeighborhoodPolicy
from tests.test_base import BaseTest
class RadiusTest(BaseTest):
    def test_greedy0_r2(self):
        arms, mab = self.predict(
            arms=[1, 2, 3, 4],
            decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
            rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
            learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
            neighborhood_policy=NeighborhoodPolicy.Radius(2),
            context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                             [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                             [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                             [0, 2, 1, 0, 0]],
            contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
            seed=123456,
            num_run=1,
            is_predict=True)

        self.assertListEqual(arms, [3, 1])
    def test_greedy0_r2_single_test(self):
        arms, mab = self.predict(
            arms=[1, 2, 3, 4],
            decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
            rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
            learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
            neighborhood_policy=NeighborhoodPolicy.Radius(2),
            context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                             [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                             [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                             [0, 2, 1, 0, 0]],
            contexts=[[0, 1, 2, 3, 5]],
            seed=123456,
            num_run=1,
            is_predict=True)

        self.assertEqual(arms, 3)
    def test_greedy0_r2_single_list(self):
        arms, mab = self.predict(
            arms=[1, 2, 3, 4],
            decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
            rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
            learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
            neighborhood_policy=NeighborhoodPolicy.Radius(2),
            context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                             [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                             [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                             [0, 2, 1, 0, 0]],
            contexts=[[0, 1, 2, 3, 5]],
            seed=123456,
            num_run=1,
            is_predict=True)

        self.assertEqual(arms, 3)
    def test_greedy0_r2_exps(self):
        exps, mab = self.predict(
            arms=[1, 2, 3, 4],
            decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
            rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
            learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
            neighborhood_policy=NeighborhoodPolicy.Radius(2),
            context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                             [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                             [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                             [0, 2, 1, 0, 0]],
            contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
            seed=123456,
            num_run=1,
            is_predict=False)

        self.assertDictEqual(exps[0], {1: 0.0, 2: 0.0, 3: 0.5, 4: 0})
        self.assertDictEqual(exps[1], {1: 1.0, 2: 0.0, 3: 1.0, 4: 0})
    def test_greedy0_r5(self):
        arms, mab = self.predict(
            arms=[1, 2, 3, 4],
            decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
            rewards=[0, 0, 0, 1, 1, 0, 0, 1, 1, 1],
            learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
            neighborhood_policy=NeighborhoodPolicy.Radius(5),
            context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                             [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                             [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                             [0, 2, 1, 0, 0]],
            contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
            seed=123456,
            num_run=1,
            is_predict=True)

        self.assertListEqual(arms, [2, 2])
    def test_greedy1_r2(self):
        arms, mab = self.predict(
            arms=[1, 2, 3, 4],
            decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
            rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
            learning_policy=LearningPolicy.EpsilonGreedy(epsilon=1.0),
            neighborhood_policy=NeighborhoodPolicy.Radius(2),
            context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                             [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                             [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                             [0, 2, 1, 0, 0]],
            contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
            seed=123456,
            num_run=1,
            is_predict=True)

        self.assertListEqual(arms, [2, 1])
    def test_thompson_r2(self):
        arms, mab = self.predict(
            arms=[1, 2, 3, 4],
            decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
            rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
            learning_policy=LearningPolicy.ThompsonSampling(),
            neighborhood_policy=NeighborhoodPolicy.Radius(2),
            context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                             [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                             [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                             [0, 2, 1, 0, 0]],
            contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
            seed=123456,
            num_run=1,
            is_predict=True)

        self.assertListEqual(arms, [2, 1])
    def test_ucb_r2(self):
        arms, mab = self.predict(
            arms=[1, 2, 3, 4],
            decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
            rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
            learning_policy=LearningPolicy.UCB1(alpha=1),
            neighborhood_policy=NeighborhoodPolicy.Radius(2),
            context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                             [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                             [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                             [0, 2, 1, 0, 0]],
            contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
            seed=123456,
            num_run=1,
            is_predict=True)

        self.assertListEqual(arms, [3, 3])
    def test_softmax_r2(self):
        arms, mab = self.predict(
            arms=[1, 2, 3, 4],
            decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
            rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
            learning_policy=LearningPolicy.Softmax(tau=1),
            neighborhood_policy=NeighborhoodPolicy.Radius(2),
            context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                             [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                             [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                             [0, 2, 1, 0, 0]],
            contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
            seed=123456,
            num_run=1,
            is_predict=True)

        self.assertListEqual(arms, [4, 1])
    def test_no_neighbors(self):
        arms, mab = self.predict(
            arms=[1, 2, 3, 4],
            decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
            rewards=[0, 0, 0, 0, 0, 0, 0, 1, 1, 1],
            learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
            neighborhood_policy=NeighborhoodPolicy.Radius(.1),
            context_history=[[10, 10, 10, 10, 10], [10, 10, 10, 10, 10], [10, 10, 10, 10, 10],
                             [10, 10, 10, 10, 10], [10, 10, 10, 10, 10], [10, 10, 10, 10, 10],
                             [10, 10, 10, 10, 10], [10, 10, 10, 10, 10], [10, 10, 10, 10, 10],
                             [10, 10, 10, 10, 10]],
            contexts=[[0, 0, 0, 0, 0], [0, 0, 0, 0, 0], [.01, .01, .01, .01, .01],
                      [0, 0, 0, 0, 0], [0, 0, 0, 0, 0]],
            seed=123456,
            num_run=1,
            is_predict=True)

        self.assertListEqual(arms, [3, 1, 1, 4, 1])

        arms, mab = self.predict(
            arms=[1, 2, 3, 4],
            decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
            rewards=[0, 0, 0, 0, 0, 0, 0, 1, 1, 1],
            learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
            neighborhood_policy=NeighborhoodPolicy.Radius(.1),
            context_history=[[10, 10, 10, 10, 10], [10, 10, 10, 10, 10], [10, 10, 10, 10, 10],
                             [10, 10, 10, 10, 10], [10, 10, 10, 10, 10], [10, 10, 10, 10, 10],
                             [10, 10, 10, 10, 10], [10, 10, 10, 10, 10], [10, 10, 10, 10, 10],
                             [10, 10, 10, 10, 10]],
            contexts=[[0, 0, 0, 0, 0], [0, 0, 0, 0, 0], [.01, .01, .01, .01, .01],
                      [0, 0, 0, 0, 0], [0, 0, 0, 0, 0]],
            seed=7,
            num_run=1,
            is_predict=True)

        self.assertListEqual(arms, [2, 4, 1, 2, 2])
    def test_no_neighbors_expectations(self):
        exp, mab = self.predict(
            arms=[1, 2, 3, 4],
            decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
            rewards=[0, 0, 0, 0, 0, 0, 0, 1, 1, 1],
            learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
            neighborhood_policy=NeighborhoodPolicy.Radius(.1),
            context_history=[[10, 10, 10, 10, 10], [10, 10, 10, 10, 10], [10, 10, 10, 10, 10],
                             [10, 10, 10, 10, 10], [10, 10, 10, 10, 10], [10, 10, 10, 10, 10],
                             [10, 10, 10, 10, 10], [10, 10, 10, 10, 10], [10, 10, 10, 10, 10],
                             [10, 10, 10, 10, 10]],
            contexts=[[0, 0, 0, 0, 0], [0, 0, 0, 0, 0], [.01, .01, .01, .01, .01],
                      [0, 0, 0, 0, 0], [0, 0, 0, 0, 0]],
            seed=123456,
            num_run=1,
            is_predict=False)

        for index, row in enumerate(exp):
            for key in row.keys():
                self.assertIs(np.nan, row[key])

        exp, mab = self.predict(
            arms=[1, 2, 3, 4],
            decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
            rewards=[0, 0, 0, 0, 0, 0, 0, 1, 1, 1],
            learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
            neighborhood_policy=NeighborhoodPolicy.Radius(.1),
            context_history=[[10, 10, 10, 10, 10], [10, 10, 10, 10, 10], [10, 10, 10, 10, 10],
                             [10, 10, 10, 10, 10], [10, 10, 10, 10, 10], [10, 10, 10, 10, 10],
                             [10, 10, 10, 10, 10], [10, 10, 10, 10, 10], [10, 10, 10, 10, 10],
                             [10, 10, 10, 10, 10]],
            contexts=[[0, 0, 0, 0, 0], [0, 0, 0, 0, 0], [.01, .01, .01, .01, .01],
                      [0, 0, 0, 0, 0], [0, 0, 0, 0, 0]],
            seed=7,
            num_run=1,
            is_predict=False)

        for index, row in enumerate(exp):
            for key in row.keys():
                self.assertIs(np.nan, row[key])
    def test_partial_fit_greedy0_r2(self):
        arms, mab = self.predict(
            arms=[1, 2, 3, 4],
            decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
            rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
            learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
            neighborhood_policy=NeighborhoodPolicy.Radius(2),
            context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                             [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                             [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                             [0, 2, 1, 0, 0]],
            contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
            seed=123456,
            num_run=1,
            is_predict=True)

        self.assertListEqual(arms, [3, 1])

        self.assertEqual(len(mab._imp.decisions), 10)
        self.assertEqual(len(mab._imp.rewards), 10)
        self.assertEqual(len(mab._imp.contexts), 10)
        self.assertEqual(np.ndim(mab._imp.decisions), 1)

        decisions2 = [1, 2, 3]
        rewards2 = [1, 1, 1]
        context_history2 = [[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0]]
        mab.partial_fit(decisions2, rewards2, context_history2)

        self.assertEqual(len(mab._imp.decisions), 13)
        self.assertEqual(len(mab._imp.rewards), 13)
        self.assertEqual(len(mab._imp.contexts), 13)
        self.assertEqual(np.ndim(mab._imp.decisions), 1)
    def test_partial_fit_thompson_thresholds(self):
        arm_to_threshold = {1: 1, 2: 5, 3: 2, 4: 3}

        def binarize(arm, reward):
            return reward >= arm_to_threshold[arm]

        arms, mab = self.predict(
            arms=[1, 2, 3, 4],
            decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
            rewards=[0, 1, 7, 0, 1, 9, 0, 2, 6, 11],
            learning_policy=LearningPolicy.ThompsonSampling(binarize),
            neighborhood_policy=NeighborhoodPolicy.Radius(2),
            context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                             [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                             [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                             [0, 2, 1, 0, 0]],
            contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
            seed=123456,
            num_run=1,
            is_predict=True)

        self.assertTrue(mab._imp.lp.is_contextual_binarized)
        self.assertListEqual(arms, [2, 1])

        self.assertEqual(len(mab._imp.decisions), 10)
        self.assertEqual(len(mab._imp.rewards), 10)
        self.assertEqual(len(mab._imp.contexts), 10)
        self.assertEqual(np.ndim(mab._imp.decisions), 1)
        self.assertTrue(mab._imp.rewards.all() in [0, 1])

        decisions2 = [1, 2, 3]
        rewards2 = [11, 1, 6]
        context_history2 = [[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0]]
        mab.partial_fit(decisions2, rewards2, context_history2)

        self.assertEqual(len(mab._imp.decisions), 13)
        self.assertEqual(len(mab._imp.rewards), 13)
        self.assertEqual(len(mab._imp.contexts), 13)
        self.assertEqual(np.ndim(mab._imp.decisions), 1)
        self.assertTrue(mab._imp.rewards.all() in [0, 1])
    def test_fit_twice_thompson_thresholds(self):
        arm_to_threshold = {1: 1, 2: 5, 3: 2, 4: 3}

        def binarize(arm, reward):
            return reward >= arm_to_threshold[arm]

        arms, mab = self.predict(
            arms=[1, 2, 3, 4],
            decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
            rewards=[0, 1, 7, 0, 1, 9, 0, 2, 6, 11],
            learning_policy=LearningPolicy.ThompsonSampling(binarize),
            neighborhood_policy=NeighborhoodPolicy.Radius(2),
            context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                             [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                             [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                             [0, 2, 1, 0, 0]],
            contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
            seed=123456,
            num_run=1,
            is_predict=True)

        self.assertTrue(mab._imp.lp.is_contextual_binarized)
        self.assertListEqual(arms, [2, 1])

        self.assertEqual(len(mab._imp.decisions), 10)
        self.assertEqual(len(mab._imp.rewards), 10)
        self.assertEqual(len(mab._imp.contexts), 10)
        self.assertEqual(np.ndim(mab._imp.decisions), 1)
        self.assertTrue(mab._imp.rewards.all() in [0, 1])

        decisions2 = [1, 2, 3]
        rewards2 = [11, 1, 6]
        context_history2 = [[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0]]
        mab.fit(decisions2, rewards2, context_history2)

        self.assertEqual(len(mab._imp.decisions), 3)
        self.assertEqual(len(mab._imp.rewards), 3)
        self.assertEqual(len(mab._imp.contexts), 3)
        self.assertEqual(np.ndim(mab._imp.decisions), 1)
        self.assertTrue(mab._imp.rewards.all() in [0, 1])
    def test_add_arm(self):
        arms, mab = self.predict(
            arms=[1, 2, 3, 4],
            decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
            rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
            learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
            neighborhood_policy=NeighborhoodPolicy.Radius(2),
            context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                             [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                             [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                             [0, 2, 1, 0, 0]],
            contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
            seed=123456,
            num_run=1,
            is_predict=True)

        mab.add_arm(5)
        self.assertTrue(5 in mab.arms)
        self.assertTrue(5 in mab._imp.arms)
        self.assertTrue(5 in mab._imp.lp.arms)
        self.assertTrue(5 in mab._imp.lp.arm_to_expectation.keys())
    def test_greedy0_no_nhood_predict_random(self):
        # 2nd, 3rd arms have bad rewards and should not be selected
        # Use small neighborhood size to force Radius to no nhood
        arms, mab = self.predict(
            arms=[1, 2, 3],
            decisions=[1, 1, 1, 2, 2, 2],
            rewards=[10, 10, 10, -10, -10, -10],
            learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
            neighborhood_policy=NeighborhoodPolicy.Radius(0.00001),
            context_history=[[1, 1, 2, 3, 5], [1, 2, 1, 1, 1], [0, 0, 1, 0, 0],
                             [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0]],
            contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
            seed=123456,
            num_run=2,
            is_predict=True)

        # 3rd arm was never seen but picked up by random neighborhood in both tests
        self.assertListEqual(arms[0], [3, 1])
        self.assertListEqual(arms[1], [1, 3])
    def test_greedy0_no_nhood_predict_weighted(self):
        # 2nd, 3rd arms have bad rewards and should not be selected
        # Use small neighborhood size to force Radius to no nhoods
        arms, mab = self.predict(
            arms=[1, 2, 3],
            decisions=[1, 1, 1, 2, 2, 2],
            rewards=[10, 10, 10, -10, -10, -10],
            learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
            neighborhood_policy=NeighborhoodPolicy.Radius(
                0.00001, no_nhood_prob_of_arm=[0, 0.8, 0.2]),
            context_history=[[1, 1, 2, 3, 5], [1, 2, 1, 1, 1], [0, 0, 1, 0, 0],
                             [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0]],
            contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
            seed=45676,
            num_run=2,
            is_predict=True)

        # 2nd arm is weighted highly but 3rd is picked too
        self.assertListEqual(arms[0], [2, 2])
        self.assertListEqual(arms[1], [3, 3])
    def test_greedy0_no_nhood_expectation_nan(self):
        # 2nd, 3rd arms have bad rewards and should not be selected
        # Use small neighborhood size to force Radius to no nhoods
        arms, mab = self.predict(
            arms=[1, 2, 3],
            decisions=[1, 1, 1, 2, 2, 2],
            rewards=[10, 10, 10, -10, -10, -10],
            learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
            neighborhood_policy=NeighborhoodPolicy.Radius(0.00001),
            context_history=[[1, 1, 2, 3, 5], [1, 2, 1, 1, 1], [0, 0, 1, 0, 0],
                             [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0]],
            contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
            seed=123456,
            num_run=1,
            is_predict=False)

        # When there are no neighborhoods, expectations will be nan
        self.assertDictEqual(arms[0], {1: np.nan, 2: np.nan, 3: np.nan})
        self.assertDictEqual(arms[1], {1: np.nan, 2: np.nan, 3: np.nan})
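The Radius tests above all exercise the same decision rule: gather the training rows whose contexts fall within a Euclidean radius of the query context, pick the best arm from those rewards, and fall back to a random (optionally weighted via `no_nhood_prob_of_arm`) arm when the neighborhood is empty. A compact stand-alone sketch of that rule, inferred from the observed behavior rather than taken from MABWiser's implementation:

```python
import numpy as np


def radius_predict(contexts, decisions, rewards, query, arms, radius,
                   no_nhood_prob_of_arm=None, rng=None):
    """Greedy arm choice within a Euclidean-radius neighborhood of `query`."""
    rng = rng or np.random.default_rng(0)
    dist = np.linalg.norm(np.asarray(contexts, dtype=float) - np.asarray(query, dtype=float),
                          axis=1)
    mask = dist <= radius
    if not mask.any():
        # No neighborhood: (possibly weighted) random fallback over all arms
        return int(rng.choice(arms, p=no_nhood_prob_of_arm))
    # Mean reward per arm inside the neighborhood; unseen arms score -inf
    means = {
        arm: np.mean([r for d, r, m in zip(decisions, rewards, mask) if m and d == arm]
                     or [-np.inf])
        for arm in arms
    }
    return max(means, key=means.get)
```

With the tiny radius of `test_greedy0_no_nhood_predict_random`, every query misses the history and the sketch takes the random-fallback branch, which is why those tests can return the never-observed third arm.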
# --- src/secml/array/tests/test_c_array_sysoverloads.py (zangobot/secml, Apache-2.0) ---
from secml.array.tests import CArrayTestCases
import numpy as np
import scipy.sparse as scs
import operator as op
from itertools import product
from secml.array import CArray
from secml.array.c_dense import CDense
from secml.array.c_sparse import CSparse
class TestCArraySystemOverloads(CArrayTestCases):
"""Unit test for CArray SYSTEM OVERLOADS methods."""
def test_operators_array_vs_array_broadcast(self):
"""Test for mathematical operators array vs array with broadcast."""
operators = [op.add, op.sub]
expected_result = [CSparse, CDense, CDense, CDense]
items = [(self.array_sparse_sym, self.row_sparse),
(self.array_sparse_sym, self.row_dense),
(self.array_dense_sym, self.row_sparse),
(self.array_dense_sym, self.row_dense)]
self._test_operator_cycle(operators, items, expected_result)
operators = [op.mul]
expected_result = [CSparse, CSparse, CSparse, CDense]
items = [(self.array_sparse_sym, self.row_sparse),
(self.array_sparse_sym, self.row_dense),
(self.array_dense_sym, self.row_sparse),
(self.array_dense_sym, self.row_dense)]
self._test_operator_cycle(operators, items, expected_result)
operators = [op.truediv, op.floordiv]
expected_result = [CDense, CDense, CDense, CDense]
items = [(self.array_sparse_sym, self.row_sparse),
(self.array_sparse_sym, self.row_dense),
(self.array_dense_sym, self.row_sparse),
(self.array_dense_sym, self.row_dense)]
with self.logger.catch_warnings():
# For 0 / 0 divisions
self.logger.filterwarnings(
action='ignore',
message="divide by zero encountered in true_divide",
category=RuntimeWarning)
self._test_operator_cycle(operators, items, expected_result)
operators = [op.pow, CArray.pow]
expected_result = [CDense, CDense]
items = [(self.array_dense_sym, self.row_sparse),
(self.array_dense_sym, self.row_dense)]
self._test_operator_cycle(operators, items, expected_result)
# Sparse array ** array is not supported
with self.assertRaises(TypeError):
self.array_sparse ** self.row_sparse
with self.assertRaises(TypeError):
self.array_sparse ** self.row_dense
with self.assertRaises(TypeError):
self.array_sparse.pow(self.row_sparse)
with self.assertRaises(TypeError):
self.array_sparse.pow(self.row_dense)
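The expected-type table above follows scipy's own conventions for mixed sparse/dense arithmetic: `+` and `-` with any dense operand densify the result, while elementwise multiplication stays sparse as long as one operand is sparse. A quick stand-alone check with scipy itself, independent of secml's CArray wrappers (behavior verified against recent scipy versions; details can vary across releases):

```python
import numpy as np
import scipy.sparse as scs

sp = scs.csr_matrix(np.eye(3))   # sparse 3x3 identity
dn = np.ones((1, 3))             # broadcastable dense row

assert scs.issparse(sp.multiply(dn))   # sparse .multiply(dense) stays sparse
assert not scs.issparse(sp + dn)       # sparse + dense densifies
```

This is exactly why the test expects `CSparse` for `op.mul` whenever either operand is sparse, but `CDense` for `op.add`/`op.sub` unless both operands are sparse.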
    def test_operators_array_vs_array(self):
        """Test for mathematical operators array vs array."""
        operators = [op.add, op.sub]
        expected_result = [CSparse, CDense, CDense, CDense]
        items = [(self.array_sparse, self.array_sparse),
                 (self.array_sparse, self.array_dense),
                 (self.array_dense, self.array_sparse),
                 (self.array_dense, self.array_dense)]
        self._test_operator_cycle(operators, items, expected_result)

        operators = [op.mul]
        expected_result = [CSparse, CSparse, CSparse, CDense]
        items = [(self.array_sparse, self.array_sparse),
                 (self.array_sparse, self.array_dense),
                 (self.array_dense, self.array_sparse),
                 (self.array_dense, self.array_dense)]
        self._test_operator_cycle(operators, items, expected_result)

        operators = [op.truediv, op.floordiv]
        expected_result = [CDense, CDense, CDense, CDense]
        items = [(self.array_sparse, self.array_sparse),
                 (self.array_sparse, self.array_dense),
                 (self.array_dense, self.array_sparse),
                 (self.array_dense, self.array_dense)]
        with self.logger.catch_warnings():
            # For 0 / 0 divisions
            self.logger.filterwarnings(
                action='ignore',
                message="invalid value encountered in true_divide",
                category=RuntimeWarning)
            self._test_operator_cycle(operators, items, expected_result)

        operators = [op.pow, CArray.pow]
        expected_result = [CDense, CDense]
        items = [(self.array_dense, self.array_sparse),
                 (self.array_dense, self.array_dense)]
        self._test_operator_cycle(operators, items, expected_result)

        # Sparse array ** array is not supported
        with self.assertRaises(TypeError):
            self.array_sparse ** self.array_sparse
        with self.assertRaises(TypeError):
            self.array_sparse ** self.array_dense
        with self.assertRaises(TypeError):
            self.array_sparse.pow(self.array_sparse)
        with self.assertRaises(TypeError):
            self.array_sparse.pow(self.array_dense)
    def test_operators_array(self):
        """Test for mathematical operators for single array."""
        # abs()
        self.logger.info("Checking abs() operator...")
        s_abs = abs(self.array_sparse)
        d_abs = abs(self.array_dense)
        # Check if method returned correct datatypes
        self.assertIsInstance(s_abs._data, CSparse)
        self.assertIsInstance(d_abs._data, CDense)
        # Check if we have the same output in all cases
        self.assertTrue(self._test_multiple_eq([s_abs, d_abs]))

        # array.abs()
        self.logger.info("Checking .abs() method...")
        s_abs = self.array_sparse.abs()
        d_abs = self.array_dense.abs()
        # Check if method returned correct datatypes
        self.assertIsInstance(s_abs._data, CSparse)
        self.assertIsInstance(d_abs._data, CDense)
        # Check if we have the same output in all cases
        self.assertTrue(self._test_multiple_eq([s_abs, d_abs]))

        # Negative
        self.logger.info("Checking negative operator...")
        s_abs = -self.array_sparse
        d_abs = -self.array_dense
        # Check if method returned correct datatypes
        self.assertIsInstance(s_abs._data, CSparse)
        self.assertIsInstance(d_abs._data, CDense)
        # Check if we have the same output in all cases
        self.assertTrue(self._test_multiple_eq([s_abs, d_abs]))

    def test_operators_array_vs_scalar(self):
        """Test for mathematical operators array vs scalar."""
        test_scalars = [
            2, np.ravel(2)[0], 2.0, np.ravel(2.0)[0], np.float32(2.0)]
        test_z_scalars = [
            0, np.ravel(0)[0], 0.0, np.ravel(0.0)[0], np.float32(0.0)]

        # DENSE ARRAY +,* NONZERO SCALAR, NONZERO SCALAR +,* DENSE ARRAY
        # sparse array + nonzero scalar is not supported (and vice versa)
        operators = [op.add, op.mul]
        expected_result = [CDense] * 10
        items = list(product([self.array_dense], test_scalars)) + \
            list(product(test_scalars, [self.array_dense]))
        self._test_operator_cycle(operators, items, expected_result)

        # ARRAY +,* ZERO SCALAR, ZERO SCALAR +,* ARRAY
        operators = [op.add, op.mul]
        expected_result = [CDense] * 10 + [CSparse] * 10
        items = list(product([self.array_dense], test_z_scalars)) + \
            list(product(test_z_scalars, [self.array_dense])) + \
            list(product([self.array_sparse], test_z_scalars)) + \
            list(product(test_z_scalars, [self.array_sparse]))
        self._test_operator_cycle(operators, items, expected_result)

        # DENSE ARRAY - NONZERO SCALAR
        # sparse array - nonzero scalar is not supported (and vice versa)
        operators = [op.sub]
        expected_result = [CDense] * 5
        items = list(product([self.array_dense], test_scalars))
        self._test_operator_cycle(operators, items, expected_result)

        # NONZERO SCALAR - DENSE ARRAY
        operators = [op.sub]
        expected_result = [CDense] * 5
        items = list(product(test_scalars, [self.array_dense]))
        self._test_operator_cycle(operators, items, expected_result)

        # ARRAY - ZERO SCALAR
        operators = [op.sub]
        expected_result = [CDense] * 5 + [CSparse] * 5
        items = list(product([self.array_dense], test_z_scalars)) + \
            list(product([self.array_sparse], test_z_scalars))
        self._test_operator_cycle(operators, items, expected_result)

        # ZERO SCALAR - ARRAY
        operators = [op.sub]
        expected_result = [CDense] * 5 + [CSparse] * 5
        items = list(product(test_z_scalars, [self.array_dense])) + \
            list(product(test_z_scalars, [self.array_sparse]))
        self._test_operator_cycle(operators, items, expected_result)

        # ARRAY * NONZERO SCALAR, NONZERO SCALAR * ARRAY
        operators = [op.mul]
        expected_result = [CDense] * 10 + [CSparse] * 10
        items = list(product([self.array_dense], test_scalars)) + \
            list(product(test_scalars, [self.array_dense])) + \
            list(product([self.array_sparse], test_scalars)) + \
            list(product(test_scalars, [self.array_sparse]))
        self._test_operator_cycle(operators, items, expected_result)

        # ARRAY * ZERO SCALAR, ZERO SCALAR * ARRAY
        operators = [op.mul]
        expected_result = [CDense] * 10 + [CSparse] * 10
        items = list(product([self.array_dense], test_z_scalars)) + \
            list(product(test_z_scalars, [self.array_dense])) + \
            list(product([self.array_sparse], test_z_scalars)) + \
            list(product(test_z_scalars, [self.array_sparse]))
        self._test_operator_cycle(operators, items, expected_result)

        # ARRAY / NONZERO SCALAR
        operators = [op.truediv, op.floordiv]
        expected_result = [CDense] * 5 + [CSparse] * 5
        items = list(product([self.array_dense], test_scalars)) + \
            list(product([self.array_sparse], test_scalars))
        self._test_operator_cycle(operators, items, expected_result)

        # NONZERO SCALAR / DENSE ARRAY
        # nonzero scalar / sparse array is not supported
        operators = [op.truediv, op.floordiv]
        expected_result = [CDense] * 5
        items = list(product(test_scalars, [self.array_dense]))
        with self.logger.catch_warnings():
            # we are dividing using arrays having zeros
            self.logger.filterwarnings(
                action='ignore',
                message="divide by zero encountered in true_divide",
                category=RuntimeWarning)
            self.logger.filterwarnings(
                action='ignore',
                message="divide by zero encountered in divide",
                category=RuntimeWarning)
            self._test_operator_cycle(operators, items, expected_result)

        # ZERO SCALAR / DENSE ARRAY
        # zero scalar / sparse array is not supported
        operators = [op.truediv, op.floordiv]
        expected_result = [CDense] * 5
        items = list(product(test_z_scalars, [self.array_dense]))
        with self.logger.catch_warnings():
            # we are dividing a zero scalar by something
            self.logger.filterwarnings(
                action='ignore',
                message="divide by zero encountered in true_divide",
                category=RuntimeWarning)
            # For 0 / 0 divisions
            self.logger.filterwarnings(
                action='ignore',
                message="invalid value encountered in true_divide",
                category=RuntimeWarning)
            self.logger.filterwarnings(
                action='ignore',
                message="invalid value encountered in divide",
                category=RuntimeWarning)
            self._test_operator_cycle(operators, items, expected_result)

        # ARRAY ** NONZERO SCALAR
        operators = [op.pow, CArray.pow]
        expected_result = [CDense] * 5 + [CSparse] * 5
        items = list(product([self.array_dense], test_scalars)) + \
            list(product([self.array_sparse], test_scalars))
        self._test_operator_cycle(operators, items, expected_result)

        # NONZERO SCALAR ** DENSE ARRAY
        # nonzero scalar ** sparse array is not supported
        operators = [op.pow]
        expected_result = [CDense] * 5
        items = list(product(test_scalars, [self.array_dense]))
        self._test_operator_cycle(operators, items, expected_result)

        # DENSE ARRAY ** ZERO SCALAR
        # sparse array ** zero scalar is not supported
        operators = [op.pow, CArray.pow]
        expected_result = [CDense] * 5
        items = list(product([self.array_dense], test_z_scalars))
        self._test_operator_cycle(operators, items, expected_result)

        # ZERO SCALAR ** DENSE ARRAY
        # zero scalar ** sparse array is not supported
        operators = [op.pow]
        expected_result = [CDense] * 5
        items = list(product(test_z_scalars, [self.array_dense]))
        self._test_operator_cycle(operators, items, expected_result)

        # NONZERO SCALAR +,- SPARSE ARRAY NOT SUPPORTED (AND VICE VERSA)
        items = list(product([self.array_sparse], test_scalars)) + \
            list(product(test_scalars, [self.array_sparse]))
        operators = [op.add, op.sub]
        self._test_operator_notimplemented(operators, items)

        # ZERO SCALAR / SPARSE ARRAY NOT SUPPORTED
        # NONZERO SCALAR / SPARSE ARRAY NOT SUPPORTED
        items = list(product(test_scalars, [self.array_sparse])) + \
            list(product(test_z_scalars, [self.array_sparse]))
        operators = [op.truediv, op.floordiv]
        self._test_operator_notimplemented(operators, items)

        # NONZERO SCALAR ** SPARSE ARRAY NOT SUPPORTED
        # ZERO SCALAR ** SPARSE ARRAY NOT SUPPORTED
        items = list(product(test_scalars, [self.array_sparse])) + \
            list(product(test_z_scalars, [self.array_sparse]))
        operators = [op.pow]
        self._test_operator_notimplemented(operators, items)

        # SPARSE ARRAY ** ZERO SCALAR NOT SUPPORTED
        items = list(product([self.array_sparse], test_z_scalars))
        operators = [op.pow]
        self._test_operator_notimplemented(operators, items)

        # TODO: ARRAY / ZERO SCALAR TEST (SEE #353)

    def test_operators_array_vs_unsupported(self):
        """Test for mathematical operators array vs unsupported types."""

        def test_unsupported(x):
            operators = [op.add, op.sub, op.mul,
                         op.truediv, op.floordiv, op.pow]
            for operator in operators:
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} dense vs '{:}'".format(
                        operator.__name__, type(x).__name__))
                    operator(self.array_dense, x)
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} sparse vs '{:}'".format(
                        operator.__name__, type(x).__name__))
                    operator(self.array_sparse, x)
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} dense vect vs '{:}'".format(
                        operator.__name__, type(x).__name__))
                    operator(self.row_flat_dense, x)
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} sparse vect vs '{:}'".format(
                        operator.__name__, type(x).__name__))
                    operator(self.row_sparse, x)

        test_unsupported(np.array([1, 2, 3]))
        test_unsupported(scs.csr_matrix([1, 2, 3]))
        test_unsupported([1, 2, 3])
        test_unsupported((1, 2, 3))
        test_unsupported(set([1, 2, 3]))
        test_unsupported(dict({1: 2}))
        test_unsupported('test')

    def test_operators_unsupported_vs_array(self):
        """Test for mathematical operators unsupported types vs array."""

        def test_unsupported(x):
            operators = [op.add, op.sub, op.mul,
                         op.truediv, op.floordiv, op.pow]
            for operator in operators:
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} '{:}' vs dense".format(
                        operator.__name__, type(x).__name__))
                    operator(x, self.array_dense)
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} '{:}' vs sparse".format(
                        operator.__name__, type(x).__name__))
                    operator(x, self.array_sparse)
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} '{:}' vs dense vect".format(
                        operator.__name__, type(x).__name__))
                    operator(x, self.row_flat_dense)
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} '{:}' vs sparse vect".format(
                        operator.__name__, type(x).__name__))
                    operator(x, self.row_sparse)

        # Numpy/scipy arrays broadcast each of their elements against our
        # array, so the reflected operator never raises: there is NO way
        # of blocking this.
        # test_unsupported(np.array([1, 2, 3]))
        # test_unsupported(scs.csr_matrix([1, 2, 3]))
        test_unsupported([1, 2, 3])
        test_unsupported((1, 2, 3))
        test_unsupported(set([1, 2, 3]))
        test_unsupported(dict({1: 2}))
        test_unsupported('test')

    def test_comparison_array_vs_array(self):
        """Test for comparison operators array vs array."""
        operators = [op.eq, op.lt, op.le, op.gt, op.ge, op.ne]
        expected_result = [CSparse, CDense, CDense, CDense]
        items = [(self.array_sparse, self.array_sparse),
                 (self.array_sparse, self.array_dense),
                 (self.array_dense, self.array_sparse),
                 (self.array_dense, self.array_dense)]
        with self.logger.catch_warnings():
            # Comparing sparse arrays using ==, <= and >= is inefficient
            self.logger.filterwarnings(
                action='ignore',
                message="Comparing sparse matrices using*",
                category=scs.SparseEfficiencyWarning)
            self._test_operator_cycle(operators, items, expected_result)

    def test_comparison_array_vs_array_broadcast(self):
        """Test for comparison operators array vs array with broadcast."""
        operators = [op.eq, op.lt, op.le, op.gt, op.ge, op.ne]
        expected_result = [CSparse, CDense, CDense, CDense]
        items = [(self.array_sparse_sym, self.row_sparse),
                 (self.array_sparse_sym, self.row_dense),
                 (self.array_dense_sym, self.row_sparse),
                 (self.array_dense_sym, self.row_dense)]
        with self.logger.catch_warnings():
            # Comparing sparse arrays using ==, <= and >= is inefficient
            self.logger.filterwarnings(
                action='ignore',
                message="Comparing sparse matrices using*",
                category=scs.SparseEfficiencyWarning)
            self._test_operator_cycle(operators, items, expected_result)

    def test_comparison_array_vs_scalar(self):
        """Test for comparison operators array vs scalar."""
        operators = [op.eq, op.lt, op.le, op.gt, op.ge, op.ne]
        expected_result = [CSparse, CDense, CSparse, CDense]
        items = [(self.array_sparse, 2),
                 (self.array_dense, 2),
                 (self.array_sparse, np.ravel(2)[0]),
                 (self.array_dense, np.ravel(2)[0])]
        with self.logger.catch_warnings():
            # Comparing a sparse matrix with a scalar greater than zero
            # using < or <= is inefficient.
            # Comparing a sparse matrix with a nonzero scalar
            # using != is inefficient.
            self.logger.filterwarnings(
                action='ignore',
                message="Comparing a sparse matrix*",
                category=scs.SparseEfficiencyWarning)
            self._test_operator_cycle(operators, items, expected_result)

    def test_comparison_array_vs_unsupported(self):
        """Test for comparison operators array vs unsupported types."""

        def test_unsupported_arrays(x):
            for operator in [op.eq, op.lt, op.le, op.gt, op.ge, op.ne]:
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} dense vs '{:}'".format(
                        operator.__name__, type(x).__name__))
                    operator(self.array_dense, x)
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} sparse vs '{:}'".format(
                        operator.__name__, type(x).__name__))
                    operator(self.array_sparse, x)
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} dense vect vs '{:}'".format(
                        operator.__name__, type(x).__name__))
                    operator(self.row_flat_dense, x)
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} sparse vect vs '{:}'".format(
                        operator.__name__, type(x).__name__))
                    operator(self.row_sparse, x)

        def test_unsupported(x):
            for operator in [op.lt, op.le, op.gt, op.ge]:
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} dense vs '{:}'".format(
                        operator.__name__, type(x).__name__))
                    operator(self.array_dense, x)
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} sparse vs '{:}'".format(
                        operator.__name__, type(x).__name__))
                    operator(self.array_sparse, x)
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} dense vect vs '{:}'".format(
                        operator.__name__, type(x).__name__))
                    operator(self.row_flat_dense, x)
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} sparse vect vs '{:}'".format(
                        operator.__name__, type(x).__name__))
                    operator(self.row_sparse, x)

        def test_false(x):
            self.logger.info("Testing {:} dense vs '{:}'".format(
                op.eq.__name__, type(x).__name__))
            self.assertFalse(op.eq(self.array_dense, x))
            self.logger.info("Testing {:} sparse vs '{:}'".format(
                op.eq.__name__, type(x).__name__))
            self.assertFalse(op.eq(self.array_sparse, x))
            self.logger.info("Testing {:} dense vect vs '{:}'".format(
                op.eq.__name__, type(x).__name__))
            self.assertFalse(op.eq(self.row_flat_dense, x))
            self.logger.info("Testing {:} sparse vect vs '{:}'".format(
                op.eq.__name__, type(x).__name__))
            self.assertFalse(op.eq(self.row_sparse, x))

        def test_true(x):
            self.logger.info("Testing {:} dense vs '{:}'".format(
                op.ne.__name__, type(x).__name__))
            self.assertTrue(op.ne(self.array_dense, x))
            self.logger.info("Testing {:} sparse vs '{:}'".format(
                op.ne.__name__, type(x).__name__))
            self.assertTrue(op.ne(self.array_sparse, x))
            self.logger.info("Testing {:} dense vect vs '{:}'".format(
                op.ne.__name__, type(x).__name__))
            self.assertTrue(op.ne(self.row_flat_dense, x))
            self.logger.info("Testing {:} sparse vect vs '{:}'".format(
                op.ne.__name__, type(x).__name__))
            self.assertTrue(op.ne(self.row_sparse, x))

        test_unsupported_arrays(np.array([1, 2, 3]))
        test_unsupported_arrays(scs.csr_matrix([1, 2, 3]))
        test_unsupported([1, 2, 3])
        test_unsupported((1, 2, 3))
        test_unsupported(set([1, 2, 3]))
        test_unsupported(dict({1: 2}))
        test_unsupported('test')
        test_false([1, 2, 3])
        test_false((1, 2, 3))
        test_false(set([1, 2, 3]))
        test_false(dict({1: 2}))
        test_false('test')
        test_true([1, 2, 3])
        test_true((1, 2, 3))
        test_true(set([1, 2, 3]))
        test_true(dict({1: 2}))
        test_true('test')

    def test_operators_comparison_vs_array(self):
        """Test for comparison operators unsupported types vs array."""

        def test_unsupported(x):
            for operator in [op.lt, op.le, op.gt, op.ge]:
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} '{:}' vs dense".format(
                        operator.__name__, type(x).__name__))
                    operator(x, self.array_dense)
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} '{:}' vs sparse".format(
                        operator.__name__, type(x).__name__))
                    operator(x, self.array_sparse)
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} '{:}' vs dense vect".format(
                        operator.__name__, type(x).__name__))
                    operator(x, self.row_flat_dense)
                with self.assertRaises(TypeError):
                    self.logger.info("Testing {:} '{:}' vs sparse vect".format(
                        operator.__name__, type(x).__name__))
                    operator(x, self.row_sparse)

        def test_false(x):
            self.logger.info("Testing {:} '{:}' vs dense".format(
                op.eq.__name__, type(x).__name__))
            self.assertFalse(op.eq(x, self.array_dense))
            self.logger.info("Testing {:} '{:}' vs sparse".format(
                op.eq.__name__, type(x).__name__))
            self.assertFalse(op.eq(x, self.array_sparse))
            self.logger.info("Testing {:} '{:}' vs dense vect".format(
                op.eq.__name__, type(x).__name__))
            self.assertFalse(op.eq(x, self.row_flat_dense))
            self.logger.info("Testing {:} '{:}' vs sparse vect".format(
                op.eq.__name__, type(x).__name__))
            self.assertFalse(op.eq(x, self.row_sparse))

        def test_true(x):
            self.logger.info("Testing {:} '{:}' vs dense".format(
                op.ne.__name__, type(x).__name__))
            self.assertTrue(op.ne(x, self.array_dense))
            self.logger.info("Testing {:} '{:}' vs sparse".format(
                op.ne.__name__, type(x).__name__))
            self.assertTrue(op.ne(x, self.array_sparse))
            self.logger.info("Testing {:} '{:}' vs dense vect".format(
                op.ne.__name__, type(x).__name__))
            self.assertTrue(op.ne(x, self.row_flat_dense))
            self.logger.info("Testing {:} '{:}' vs sparse vect".format(
                op.ne.__name__, type(x).__name__))
            self.assertTrue(op.ne(x, self.row_sparse))

        # Numpy/scipy arrays broadcast each of their elements against our
        # array, so the reflected operator never raises: there is NO way
        # of blocking this.
        # test_unsupported(np.array([1, 2, 3]))
        # test_unsupported(scs.csr_matrix([1, 2, 3]))
        test_unsupported([1, 2, 3])
        test_unsupported((1, 2, 3))
        test_unsupported(set([1, 2, 3]))
        test_unsupported(dict({1: 2}))
        test_unsupported('test')
        test_false([1, 2, 3])
        test_false((1, 2, 3))
        test_false(set([1, 2, 3]))
        test_false(dict({1: 2}))
        test_false('test')
        test_true([1, 2, 3])
        test_true((1, 2, 3))
        test_true(set([1, 2, 3]))
        test_true(dict({1: 2}))
        test_true('test')

    def test_bool_operators(self):
        """Test for boolean operators (`and`, `or`) between CArrays."""
        a = CArray([1, 2, 3])
        b = CArray([1, 1, 1])

        d = (a < 2)
        c = (b == 1)

        self.logger.info("C -> " + str(c))
        self.logger.info("D -> " + str(d))
        self.logger.info("C logical_and D -> " + str(c.logical_and(d)))
        self.logger.info("D logical_and C -> " + str(d.logical_and(c)))

        # The truth value of a multi-element array is ambiguous: `and`/`or`
        # must raise, while `logical_and` (above) works element-wise
        with self.assertRaises(ValueError):
            print(d and c)
        with self.assertRaises(ValueError):
            print(c and d)
        with self.assertRaises(ValueError):
            print(d or c)
        with self.assertRaises(ValueError):
            print(c or d)

        # Single-element arrays have an unambiguous truth value
        a = CArray(True)
        b = CArray(False)

        self.assertFalse(a and b)
        self.assertFalse(b and a)
        self.assertTrue(a or b)
        self.assertTrue(b or a)

    def test_iteration(self):
        """Unittest for CArray __iter__."""
        self.logger.info("Unittest for CArray __iter__")

        res = []
        for elem_id, elem in enumerate(self.array_dense):
            res.append(elem)
            self.assertEqual(self.array_dense.ravel()[elem_id].item(), elem)
        # Check if all array elements have been returned
        self.assertEqual(self.array_dense.size, len(res))

        res = []
        for elem_id, elem in enumerate(self.array_sparse):
            res.append(elem)
            self.assertEqual(self.array_sparse.ravel()[elem_id].item(), elem)
        # Check if all array elements have been returned
        self.assertEqual(self.array_sparse.size, len(res))

        res = []
        for elem_id, elem in enumerate(self.row_flat_dense):
            res.append(elem)
            self.assertEqual(self.row_flat_dense[elem_id].item(), elem)
        # Check if all array elements have been returned
        self.assertEqual(self.row_flat_dense.size, len(res))

        res = []
        for elem_id, elem in enumerate(self.row_dense):
            res.append(elem)
            self.assertEqual(self.row_dense[elem_id].item(), elem)
        # Check if all array elements have been returned
        self.assertEqual(self.row_dense.size, len(res))

        res = []
        for elem_id, elem in enumerate(self.row_sparse):
            res.append(elem)
            self.assertEqual(self.row_sparse[elem_id].item(), elem)
        # Check if all array elements have been returned
        self.assertEqual(self.row_sparse.size, len(res))


if __name__ == '__main__':
    CArrayTestCases.main()
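The commented-out numpy/scipy cases in the "unsupported vs array" tests hinge on Python's reflected-operator protocol: a TypeError is only raised when *both* operands return `NotImplemented`, and ndarrays never do (they broadcast instead). The following toy class is an illustrative sketch of that protocol, not part of the CArray test suite; `Vec` is a hypothetical stand-in:

```python
import operator as op


class Vec:
    """Toy stand-in for an array wrapper that rejects unsupported operands."""

    def __init__(self, data):
        self.data = list(data)

    def __add__(self, other):
        if isinstance(other, (int, float)):
            return Vec(x + other for x in self.data)
        # Returning NotImplemented lets Python try other.__radd__(self);
        # if that also fails, Python raises TypeError for us.
        return NotImplemented

    def __radd__(self, other):  # invoked for `scalar + vec`
        return self.__add__(other)


v = Vec([1, 2, 3])
assert (v + 2).data == [3, 4, 5]
assert (2 + v).data == [3, 4, 5]

# A plain list is rejected: both __add__ and list.__radd__ fail -> TypeError
try:
    op.add(v, [1, 2, 3])
    raise AssertionError("expected TypeError")
except TypeError:
    pass
```

A numpy array on the left, by contrast, handles the operation itself via broadcasting before `Vec.__radd__` is ever consulted, which is why those cases cannot be blocked.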
# ---------------------------------------------------------------------------
# hintapi/parameters/field_functions.py
# Repo: abersheeran/hintapi @ 6684d165b958c44c9a47184a7a75bb74383fdd3a
# License: Apache-2.0
# ---------------------------------------------------------------------------
from __future__ import annotations
from typing import Any, Callable, Optional, TypeVar

from pydantic.fields import NoArgAnyCallable, Undefined

from ..utils import is_coroutine_callable
from .fields import (
    BodyInfo,
    CookieInfo,
    DependInfo,
    HeaderInfo,
    PathInfo,
    QueryInfo,
    RequestInfo,
)

__all__ = ["Path", "Query", "Header", "Cookie", "Body", "Request", "Depends"]

T = TypeVar("T")

def Path(
    default: Any = Undefined,
    *,
    default_factory: Optional[NoArgAnyCallable] = None,
    alias: Optional[str] = None,
    title: Optional[str] = None,
    description: Optional[str] = None,
    exclusive: bool = False,
    **extra: Any,
) -> Any:
    """
    Used to provide extra information about a field.

    :param default: since this is replacing the field's default, its first argument is used
      to set the default; use ellipsis (``...``) to indicate the field is required
    :param default_factory: callable that will be called when a default value is needed for this field.
      If both `default` and `default_factory` are set, an error is raised.
    :param alias: the public name of the field
    :param title: can be any string, used in the schema
    :param description: can be any string, used in the schema
    :param exclusive: decide whether this field receives all parameters
    :param **extra: any additional keyword arguments will be added as is to the schema
    """
    field_info = PathInfo(
        default,
        default_factory=default_factory,
        alias=alias,
        title=title,
        description=description,
        exclusive=exclusive,
        **extra,
    )
    field_info._validate()
    return field_info

def Query(
    default: Any = Undefined,
    *,
    default_factory: Optional[NoArgAnyCallable] = None,
    alias: Optional[str] = None,
    title: Optional[str] = None,
    description: Optional[str] = None,
    exclusive: bool = False,
    **extra: Any,
) -> Any:
    """
    Used to provide extra information about a field.

    :param default: since this is replacing the field's default, its first argument is used
      to set the default; use ellipsis (``...``) to indicate the field is required
    :param default_factory: callable that will be called when a default value is needed for this field.
      If both `default` and `default_factory` are set, an error is raised.
    :param alias: the public name of the field
    :param title: can be any string, used in the schema
    :param description: can be any string, used in the schema
    :param exclusive: decide whether this field receives all parameters
    :param **extra: any additional keyword arguments will be added as is to the schema
    """
    field_info = QueryInfo(
        default,
        default_factory=default_factory,
        alias=alias,
        title=title,
        description=description,
        exclusive=exclusive,
        **extra,
    )
    field_info._validate()
    return field_info

def Header(
    default: Any = Undefined,
    *,
    default_factory: Optional[NoArgAnyCallable] = None,
    alias: Optional[str] = None,
    title: Optional[str] = None,
    description: Optional[str] = None,
    exclusive: bool = False,
    **extra: Any,
) -> Any:
    """
    Used to provide extra information about a field.

    :param default: since this is replacing the field's default, its first argument is used
      to set the default; use ellipsis (``...``) to indicate the field is required
    :param default_factory: callable that will be called when a default value is needed for this field.
      If both `default` and `default_factory` are set, an error is raised.
    :param alias: the public name of the field
    :param title: can be any string, used in the schema
    :param description: can be any string, used in the schema
    :param exclusive: decide whether this field receives all parameters
    :param **extra: any additional keyword arguments will be added as is to the schema
    """
    field_info = HeaderInfo(
        default,
        default_factory=default_factory,
        alias=alias,
        title=title,
        description=description,
        exclusive=exclusive,
        **extra,
    )
    field_info._validate()
    return field_info

def Cookie(
    default: Any = Undefined,
    *,
    default_factory: Optional[NoArgAnyCallable] = None,
    alias: Optional[str] = None,
    title: Optional[str] = None,
    description: Optional[str] = None,
    exclusive: bool = False,
    **extra: Any,
) -> Any:
    """
    Used to provide extra information about a field.

    :param default: since this is replacing the field's default, its first argument is used
      to set the default; use ellipsis (``...``) to indicate the field is required
    :param default_factory: callable that will be called when a default value is needed for this field.
      If both `default` and `default_factory` are set, an error is raised.
    :param alias: the public name of the field
    :param title: can be any string, used in the schema
    :param description: can be any string, used in the schema
    :param exclusive: decide whether this field receives all parameters
    :param **extra: any additional keyword arguments will be added as is to the schema
    """
    field_info = CookieInfo(
        default,
        default_factory=default_factory,
        alias=alias,
        title=title,
        description=description,
        exclusive=exclusive,
        **extra,
    )
    field_info._validate()
    return field_info

def Body(
    default: Any = Undefined,
    *,
    default_factory: Optional[NoArgAnyCallable] = None,
    alias: Optional[str] = None,
    title: Optional[str] = None,
    description: Optional[str] = None,
    exclusive: bool = False,
    **extra: Any,
) -> Any:
    """
    Used to provide extra information about a field.

    :param default: since this is replacing the field's default, its first argument is used
      to set the default; use ellipsis (``...``) to indicate the field is required
    :param default_factory: callable that will be called when a default value is needed for this field.
      If both `default` and `default_factory` are set, an error is raised.
    :param alias: the public name of the field
    :param title: can be any string, used in the schema
    :param description: can be any string, used in the schema
    :param exclusive: decide whether this field receives all parameters
    :param **extra: any additional keyword arguments will be added as is to the schema
    """
    field_info = BodyInfo(
        default,
        default_factory=default_factory,
        alias=alias,
        title=title,
        description=description,
        exclusive=exclusive,
        **extra,
    )
    field_info._validate()
    return field_info

def Request(
    default: Any = Undefined,
    *,
    default_factory: Optional[NoArgAnyCallable] = None,
    alias: Optional[str] = None,
) -> Any:
    """
    Used to provide extra information about a field.

    :param default: since this is replacing the field's default, its first argument is used
      to set the default; use ellipsis (``...``) to indicate the field is required
    :param default_factory: callable that will be called when a default value is needed for this field.
      If both `default` and `default_factory` are set, an error is raised.
    :param alias: the public name of the field
    """
    if default is not Undefined and default_factory is not None:
        raise ValueError("cannot specify both default and default_factory")

    return RequestInfo(default, default_factory=default_factory, alias=alias)


def Depends(call: Callable, *, to_async: bool = False) -> Any:
    """
    Used to declare a dependency that will be resolved for a parameter.

    :param call: callable that will be called when a dependency is needed for this field
    :param to_async: whether to convert the callable to async by threadpool
    """
    if is_coroutine_callable(call) and to_async:
        raise TypeError("`to_async` can only work with non-coroutine functions")

    return DependInfo(call, to_async=to_async)
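The guard in `Depends` rejects wrapping an already-async callable with a threadpool. As a hedged, self-contained sketch of that contract (the names `depends`, `get_db`, and `get_cache` are illustrative only, not part of hintapi):

```python
import inspect


def _is_coroutine_callable(call):
    # Simplified version of the `is_coroutine_callable` helper imported
    # above: a coroutine function, or an object whose __call__ is one.
    if inspect.iscoroutinefunction(call):
        return True
    return inspect.iscoroutinefunction(getattr(call, "__call__", None))


def depends(call, *, to_async=False):
    # Mirrors the guard in `Depends`: converting an already-async callable
    # to async via a threadpool makes no sense, so reject it up front.
    if _is_coroutine_callable(call) and to_async:
        raise TypeError("`to_async` can only work with non-coroutine functions")
    return ("depend", call, to_async)


def get_db():           # plain function: may legitimately set to_async
    return "db"


async def get_cache():  # coroutine function: must not set to_async
    return "cache"


assert depends(get_db, to_async=True) == ("depend", get_db, True)
try:
    depends(get_cache, to_async=True)
    raise AssertionError("expected TypeError")
except TypeError:
    pass
```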
# ---------------------------------------------------------------------------
# segm_benchmark/data/transforms/__init__.py
# Repo: anotherTK/segmentation.pytorch @ 36b6b412ee5561745fd9a67e4b6e28c0b9f58d68
# License: MIT
# ---------------------------------------------------------------------------
from .transforms import RandomHorizontalFlip
from .transforms import ToTensor
from .transforms import Normalize
from .build import build_transforms
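The names re-exported here form a typical segmentation transform pipeline. The sketch below is NOT the implementation from `.transforms`; it only illustrates, under that assumption, the calling convention a `Compose` of (image, target) transforms usually follows:

```python
class Compose:
    """Minimal sketch of a transform pipeline: apply each transform in order.

    Each transform takes (image, target) and returns the transformed pair,
    so transforms like a horizontal flip can keep masks in sync with images.
    """

    def __init__(self, transforms):
        self.transforms = list(transforms)

    def __call__(self, image, target=None):
        for t in self.transforms:
            image, target = t(image, target)
        return image, target


# Toy transforms standing in for the real ones (illustrative only)
def add_one(image, target):
    return image + 1, target


def double(image, target):
    return image * 2, target


pipeline = Compose([add_one, double])
assert pipeline(3) == (8, None)  # (3 + 1) * 2 = 8, target untouched
```

The real `build_transforms` would typically construct such a `Compose` from a config, choosing flips and normalization per train/test split.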